
A non-linear manifold alignment approach to robot learning from demonstrations


dc.contributor.author Makondo, Ndivhuwo
dc.contributor.author Hiratsuka, M
dc.contributor.author Rosman, Benjamin S
dc.contributor.author Hasegawa, O
dc.date.accessioned 2018-05-23T06:57:45Z
dc.date.available 2018-05-23T06:57:45Z
dc.date.issued 2018-04
dc.identifier.citation Makondo, N. et al. 2018. A non-linear manifold alignment approach to robot learning from demonstrations. Journal of Robotics and Mechatronics, vol. 30(2): 265-281 en_US
dc.identifier.issn 0915-3942
dc.identifier.issn 1883-8049
dc.identifier.uri https://www.fujipress.jp/jrm/rb/robot003000020265/
dc.identifier.uri https://doi.org/10.20965/jrm.2018.p0265
dc.identifier.uri http://hdl.handle.net/10204/10229
dc.description Published in Journal of Robotics and Mechatronics, vol. 30(2): 265-281 en_US
dc.description.abstract The number and variety of robots operating in real-world environments are growing, as are the skills they are expected to acquire. To this end, we present an approach that allows users without robotics expertise to easily teach a skill to a robot whose kinematics may differ from those of a human and are unknown. This paper proposes a method that enables robots with unknown kinematics to learn skills from demonstrations. Our method requires a motion trajectory obtained from human demonstrations via a vision-based system, which is then projected onto a corresponding human skeletal model. The kinematic mapping between the robot and the human model is learned using Local Procrustes Analysis, a manifold alignment technique, which enables the demonstrated trajectory to be transferred from the human model to the robot. Finally, the transferred trajectory is encoded as a parameterized motion skill using Dynamic Movement Primitives, allowing it to be generalized to different situations. Experiments in simulation on the PR2 and Meka robots show that our method correctly imitates various skills demonstrated by a human, and an analysis of the transfer of the acquired skills between the two robots is provided. en_US
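For illustration only, the sketch below shows the ordinary (global) Procrustes alignment that Local Procrustes Analysis applies piecewise to local neighbourhoods of corresponding human-model and robot configurations; it is not the authors' implementation, and the function names and synthetic data are illustrative assumptions. The paper additionally encodes the transferred trajectory with Dynamic Movement Primitives, which this sketch does not cover.

```python
# Minimal sketch (assumed, not the paper's code): fit a single Procrustes map
# between corresponding human-model samples and robot samples, then use it to
# transfer a demonstrated point. Local Procrustes Analysis would fit such maps
# on local clusters instead of one global map.
import numpy as np

def procrustes_map(X_human, Y_robot):
    """Fit rotation R, scale s and translation t so that s * R @ x + t
    approximates the corresponding robot sample y (least squares)."""
    mu_x, mu_y = X_human.mean(axis=0), Y_robot.mean(axis=0)
    Xc, Yc = X_human - mu_x, Y_robot - mu_y
    U, S, Vt = np.linalg.svd(Xc.T @ Yc)
    R = (U @ Vt).T                    # optimal orthogonal map (column form)
    s = S.sum() / (Xc ** 2).sum()     # optimal uniform scale
    t = mu_y - s * R @ mu_x           # translation aligning the centroids
    return R, s, t

def transfer(x, R, s, t):
    """Map a point from the human-model space into the robot's space."""
    return s * R @ x + t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))                       # human-model samples
    R_true = np.linalg.qr(rng.normal(size=(3, 3)))[0]  # ground-truth map
    Y = 0.8 * X @ R_true.T + np.array([0.1, -0.2, 0.3])
    R, s, t = procrustes_map(X, Y)
    print(np.allclose(transfer(X[0], R, s, t), Y[0], atol=1e-6))  # True
```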
dc.language.iso en en_US
dc.publisher Fuji Technology Press en_US
dc.relation.ispartofseries Worklist;20914
dc.subject Learning from demonstrations en_US
dc.subject Knowledge transfer en_US
dc.title A non-linear manifold alignment approach to robot learning from demonstrations en_US
dc.type Article en_US
dc.identifier.apacitation Makondo, N., Hiratsuka, M., Rosman, B. S., & Hasegawa, O. (2018). A non-linear manifold alignment approach to robot learning from demonstrations. http://hdl.handle.net/10204/10229 en_ZA
dc.identifier.chicagocitation Makondo, Ndivhuwo, M Hiratsuka, Benjamin S Rosman, and O Hasegawa. "A non-linear manifold alignment approach to robot learning from demonstrations." (2018) http://hdl.handle.net/10204/10229 en_ZA
dc.identifier.vancouvercitation Makondo N, Hiratsuka M, Rosman BS, Hasegawa O. A non-linear manifold alignment approach to robot learning from demonstrations. 2018; http://hdl.handle.net/10204/10229. en_ZA
dc.identifier.ris TY - Article AU - Makondo, Ndivhuwo AU - Hiratsuka, M AU - Rosman, Benjamin S AU - Hasegawa, O AB - The number and variety of robots active in real-world environments are growing, as well as the skills they are expected to acquire, and to this end we present an approach for non-robotics-expert users to be able to easily teach a skill to a robot with potentially different, but unknown, kinematics from humans. This paper proposes a method that enables robots with unknown kinematics to learn skills from demonstrations. Our proposed method requires a motion trajectory obtained from human demonstrations via a vision-based system, which is then projected onto a corresponding human skeletal model. The kinematics mapping between the robot and the human model is learned by employing Local Procrustes Analysis, a manifold alignment technique which enables the transfer of the demonstrated trajectory from the human model to the robot. Finally, the transferred trajectory is encoded onto a parameterized motion skill, using Dynamic Movement Primitives, allowing it to be generalized to different situations. Experiments in simulation on the PR2 and Meka robots show that our method is able to correctly imitate various skills demonstrated by a human, and an analysis of the transfer of the acquired skills between the two robots is provided. DA - 2018-04 DB - ResearchSpace DP - CSIR KW - Learning from demonstrations KW - Knowledge transfer LK - https://researchspace.csir.co.za PY - 2018 SM - 0915-3942 SM - 1883-8049 T1 - A non-linear manifold alignment approach to robot learning from demonstrations TI - A non-linear manifold alignment approach to robot learning from demonstrations UR - http://hdl.handle.net/10204/10229 ER - en_ZA

