Title: ToF camera ego-motion estimation
Authors: Ratshidaho, T; Tapamo, JR; Claassens, J; Govender, Natasha
Type: Conference Presentation
Date issued: October 2012
Conference: 4th CSIR Biennial Conference: Real problems relevant solutions, CSIR, Pretoria, 9-10 October 2012
Citation: Ratshidaho, T, Tapamo, JR, Claassens, J and Govender, N. ToF camera ego-motion estimation. 4th CSIR Biennial Conference: Real problems relevant solutions, CSIR, Pretoria, 9-10 October 2012
URI: http://hdl.handle.net/10204/6266
Language: English
Keywords: Ego-motion; Time-of-Flight; ToF; Autonomous robots; Iterative Closest Point; ICP

Abstract: We present three approaches for ego-motion estimation using Time-of-Flight (ToF) camera data. Ego-motion is the process of estimating a camera's pose (position and orientation) relative to some initial pose from the camera's image sequence, and estimating it facilitates the localisation of the robot. The ToF camera is characterised by a number of error models. In the first two approaches, Iterative Closest Point (ICP) is applied to consecutive range images from the ToF camera to estimate the relative pose transform used for ego-motion estimation; we implemented two variants of ICP, namely point-to-point and point-to-plane. The third is a feature-based approach that detects and tracks SIFT features on the amplitude images and uses their corresponding 3D points to estimate the relative transformation. The approaches were evaluated against ground-truth data provided by a motion capture system (Vicon). The SIFT-based ego-motion estimation was found to run faster than the ICP-based methods.
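To make the point-to-point step concrete, the following is a minimal sketch (not code from the presentation) of the closed-form SVD (Kabsch) solution for the rigid transform that best aligns matched 3D point pairs; the same solver applies to the 3D correspondences produced by the feature-based approach. Function and variable names are illustrative assumptions.

```python
# Minimal sketch, NOT the authors' code: closed-form SVD (Kabsch) estimate of the
# rigid transform (R, t) that best aligns matched 3D point pairs. This is the core
# update in point-to-point ICP and also fits 3D correspondences from matched features.
import numpy as np

def rigid_transform_3d(src, dst):
    """Least-squares rigid transform mapping src (N x 3) onto dst (N x 3)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Usage: chaining such relative transforms over consecutive frames yields ego-motion.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(200, 3))
    a = np.deg2rad(8.0)
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    t_true = np.array([0.10, -0.05, 0.02])
    R_est, t_est = rigid_transform_3d(src, src @ R_true.T + t_true)
    print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```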
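The point-to-plane variant minimises the error along the destination surface normals rather than between matched points. Below is a minimal sketch of a single point-to-plane update under the usual small-angle linearisation; it assumes correspondences and normals are already supplied and is not the authors' implementation.

```python
# Minimal sketch, NOT the authors' code: one linearised point-to-plane ICP update.
# The small-angle approximation R ~ I + [w]x turns the objective
#   sum_i ((R p_i + t - q_i) . n_i)^2
# into a 6-parameter linear least-squares problem in (w, t).
import numpy as np

def point_to_plane_step(src, dst, normals):
    """One point-to-plane update for matched points src -> dst with dst normals (all N x 3)."""
    A = np.hstack([np.cross(src, normals), normals])     # N x 6 rows: [(p x n)^T, n^T]
    b = np.einsum('ij,ij->i', dst - src, normals)         # N entries: n . (q - p)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    wx, wy, wz = x[:3]
    t = x[3:]
    R = np.array([[1.0, -wz,  wy],                        # linearised rotation
                  [wz,  1.0, -wx],
                  [-wy, wx,  1.0]])
    U, _, Vt = np.linalg.svd(R)                           # re-orthonormalise; for small
    R = U @ Vt                                            # angles this stays a rotation
    return R, t
```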
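For the feature-based approach, the sketch below illustrates one way SIFT keypoints on consecutive amplitude images could be matched and mapped to 3D points via the range data; the resulting pairs can then be passed to a rigid-transform solver such as the SVD sketch above. The (H, W, 3) point-cloud layout and all names are assumptions, not the authors' pipeline.

```python
# Illustrative sketch only, NOT the authors' pipeline: match SIFT features on
# consecutive ToF amplitude images and collect the corresponding 3D points.
import cv2
import numpy as np

def matched_3d_points(amp0, amp1, xyz0, xyz1, ratio=0.75):
    """amp0/amp1: 8-bit amplitude images; xyz0/xyz1: H x W x 3 point clouds from range data."""
    sift = cv2.SIFT_create()
    kp0, des0 = sift.detectAndCompute(amp0, None)
    kp1, des1 = sift.detectAndCompute(amp1, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des0, des1, k=2)
    src, dst = [], []
    for pair in matches:
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < ratio * n.distance:                        # Lowe's ratio test
            u0, v0 = map(int, kp0[m.queryIdx].pt)
            u1, v1 = map(int, kp1[m.trainIdx].pt)
            p, q = xyz0[v0, u0], xyz1[v1, u1]
            if np.isfinite(p).all() and np.isfinite(q).all():      # drop invalid range returns
                src.append(p)
                dst.append(q)
    return np.asarray(src), np.asarray(dst)
```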