2016 | 26 | 1 | 81-97
Article title

RGB-D terrain perception and dense mapping for legged robots

Full text
Title variants
Publication languages
EN
Abstracts
EN
This paper addresses the issues of unstructured terrain modeling for the purpose of navigation with legged robots. We present an improved elevation grid concept adopted to the specific requirements of a small legged robot with limited perceptual capabilities. We propose an extension of the elevation grid update mechanism by incorporating a formal treatment of the spatial uncertainty. Moreover, this paper presents uncertainty models for a structured light RGB-D sensor and a stereo vision camera used to produce a dense depth map. The model for the uncertainty of the stereo vision camera is based on uncertainty propagation from calibration, through undistortion and rectification algorithms, allowing calculation of the uncertainty of measured 3D point coordinates. The proposed uncertainty models were used for the construction of a terrain elevation map using the Videre Design STOC stereo vision camera and Kinect-like range sensors. We provide experimental verification of the proposed mapping method, and a comparison with another recently published terrain mapping method for walking robots.
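The abstract's two technical ingredients can be sketched in a few lines: propagating stereo disparity noise to depth uncertainty (Z = fb/d, so first-order propagation gives σ_Z = Z²σ_d/(fb)), and fusing a new height measurement into an elevation grid cell weighted by that uncertainty. The sketch below uses a simple per-cell 1-D Kalman update; the function names, parameter values, and this particular fusion form are illustrative assumptions, not the paper's actual formulation.

```python
def stereo_depth_sigma(z, focal_px, baseline_m, sigma_disp_px):
    """First-order propagation of disparity noise to depth.

    From Z = f*b/d it follows that sigma_Z = Z^2 * sigma_d / (f*b),
    i.e. depth uncertainty grows quadratically with range.
    """
    return (z ** 2) * sigma_disp_px / (focal_px * baseline_m)


def fuse_cell(h_old, var_old, h_meas, var_meas):
    """Fuse a new height measurement into one elevation-grid cell.

    A scalar Kalman update: the gain weights the measurement by the
    relative confidence of the stored estimate vs. the new reading.
    """
    k = var_old / (var_old + var_meas)
    h_new = h_old + k * (h_meas - h_old)
    var_new = (1.0 - k) * var_old
    return h_new, var_new


# Example: a point 2 m away seen by a stereo rig with a 500 px focal
# length, 9 cm baseline, and 0.25 px disparity noise (all hypothetical).
sigma_z = stereo_depth_sigma(2.0, 500.0, 0.09, 0.25)
h, var = fuse_cell(0.0, 1.0, 1.0, sigma_z ** 2)
```

Because `var_meas` comes from the sensor model, far-away (noisy) readings move the stored height less than close-range ones, which is the effect a formal treatment of spatial uncertainty buys over naive averaging.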
Year
Volume
26
Issue
1
Pages
81-97
Physical description
Dates
published
2016
received
2014-10-03
revised
2015-04-29
revised
2015-10-04
Authors
  • Institute of Control and Information Engineering, Poznań University of Technology, ul. Piotrowo 3A, 60-965 Poznań, Poland
  • Institute of Control and Information Engineering, Poznań University of Technology, ul. Piotrowo 3A, 60-965 Poznań, Poland
  • Autonomous Systems Lab, ETH Zurich, LEE J 201, Leonhardstrasse 21, 8092 Zurich, Switzerland
  • Autonomous Systems Lab, ETH Zurich, LEE J 201, Leonhardstrasse 21, 8092 Zurich, Switzerland
Bibliography
  • Belter, D., Łabecki, P. and Skrzypczyński, P. (2012). Estimating terrain elevation maps from sparse and uncertain multi-sensor data, IEEE 2012 International Conference on Robotics and Biomimetics, Guangzhou, China, pp. 715-722.
  • Belter, D., Łabecki, P. and Skrzypczyński, P. (n.d.). Adaptive motion planning for autonomous rough terrain traversal with a walking robot, Journal of Field Robotics, (in press).
  • Belter, D., Nowicki, M., Skrzypczyński, P., Walas, K. and Wietrzykowski, J. (2015). Lightweight RGB-D SLAM system for search and rescue robots, in R. Szewczyk, C. Zieliński and M. Kaliczyńska (Eds.), Recent Advances in Automation, Robotics and Measuring Techniques, Advances in Intelligent Systems and Computing, Vol. 351, Springer, Cham, pp. 11-21.
  • Belter, D. and Skrzypczyński, P. (2011a). Integrated motion planning for a hexapod robot walking on rough terrain, 18th IFAC World Congress, Milan, Italy, pp. 6918-6923.
  • Belter, D. and Skrzypczyński, P. (2011b). Rough terrain mapping and classification for foothold selection in a walking robot, Journal of Field Robotics 28(4): 497-528.
  • Belter, D. and Skrzypczyński, P. (2013). Precise self-localization of a walking robot on rough terrain using parallel tracking and mapping, Industrial Robot: An International Journal 40(3): 229-237.
  • Belter, D. and Walas, K. (2014). A compact walking robot-flexible research and development platform, in R. Szewczyk, C. Zieliński and M. Kaliczyńska (Eds.), Recent Advances in Automation, Robotics and Measuring Techniques, Advances in Intelligent Systems and Computing, Vol. 267, Springer, Cham, pp. 343-352.
  • Berger, M., Tagliasacchi, A., Seversky, L., Alliez, P., Levine, J., Sharf, A. and Silva, C. (2014). State of the art in surface reconstruction from point clouds, in S. Lefebvre and M. Spagnuolo (Eds.), Eurographics 2014-State of the Art Reports, The Eurographics Association, Geneve.
  • Bloesch, M., Gehring, C., Fankhauser, P., Hutter, M., Hoepflinger, M.A. and Siegwart, R. (2013). State estimation for legged robots on unstable and slippery terrain, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, pp. 6058-6064.
  • Dey, T.K., Ge, X., Que, Q., Safa, I., Wang, L. and Wang, Y. (2012). Feature-preserving reconstruction of singular surfaces, Computer Graphics Forum 31(5): 1787-1796.
  • Dryanovski, I., Morris, W. and Xiao, J. (2010). Multi-volume occupancy grids: An efficient probabilistic 3D mapping model for micro aerial vehicles, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, pp. 1553-1559.
  • Fankhauser, P., Bloesch, M., Gehring, C., Hutter, M. and Siegwart, R. (2014). Robot-centric elevation mapping with uncertainty estimates, International Conference on Climbing and Walking Robots (CLAWAR), Poznań, Poland, pp. 433-440.
  • Handa, A., Whelan, T., McDonald, J. and Davison, A. (2014). A benchmark for RGB-D visual odometry, 3D reconstruction and SLAM, IEEE International Conference on Robotics and Automation, ICRA, Hong Kong, China, pp. 1524-1531.
  • Hebert, M., Caillas, C., Krotkov, E. and Kweon, I. (1989). Terrain mapping for a roving planetary explorer, Proceedings of the IEEE International Conference on Robotics and Automation, Scottsdale, AZ, USA, pp. 997-1002.
  • Hornung, A., Wurm, K., Bennewitz, M., Stachniss, C. and Burgard, W. (2013). OctoMap: An efficient probabilistic 3D mapping framework based on octrees, Autonomous Robots 34(3): 189-206.
  • Hutter, M., Gehring, C., Bloesch, M., Hoepflinger, M.A., Remy, C.D. and Siegwart, R. (2012). StarlETH: A compliant quadrupedal robot for fast, efficient, and versatile locomotion, International Conference on Climbing and Walking Robots (CLAWAR), Baltimore, MD, USA, pp. 483-490.
  • Kleiner, A. and Dornhege, C. (2007). Real-time localization and elevation mapping within urban search and rescue scenarios, Journal of Field Robotics 24(8-9): 723-745.
  • Khoshelham, K. and Elberink, S. (2012). Accuracy and resolution of kinect depth data for indoor mapping applications, Sensors 12(2): 1437-1454.
  • Kolter, J., Kim, Y. and Ng, A. (2009). Stereo vision and terrain modeling for quadruped robots, Proceedings of the IEEE International Conference on Robotics and Automation, Kobe, Japan, pp. 1557-1564.
  • Konolige, K. (1997). Small vision systems: Hardware and implementation, 8th International Symposium on Robotics Research, Monterey, CA, USA, pp. 111-116.
  • Kweon, I. and Kanade, T. (1992). High-resolution terrain map from multiple sensor data, IEEE Transactions on Pattern Analysis and Machine Intelligence 14(2): 278-292.
  • Łabecki, P. and Belter, D. (2014). RGB-D based mapping method for a legged robot, in K. Tchoń and C. Zieliński (Eds.), Zeszyty Naukowe Politechniki Warszawskiej, Warsaw University of Technology Press, Warsaw, pp. 297-306, (in Polish).
  • Łabecki, P. and Skrzypczyński, P. (2013). Spatial uncertainty assessment in visual terrain perception for a mobile robot, in J. Korbicz and M. Kowal (Eds.), Intelligent Systems in Technical and Medical Diagnostics, Advances in Intelligent Systems and Computing, Vol. 230, Springer-Verlag, Berlin, pp. 357-368.
  • Matthies, L. and Shafer, S. (1987). Error modeling in stereo navigation, International Journal of Robotics and Automation 3(3): 239-248.
  • Nowicki, M. and Skrzypczyński, P. (2013). Combining photometric and depth data for lightweight and robust visual odometry, European Conference on Mobile Robots, Barcelona, Spain, pp. 125-130.
  • Park, J.-H., Shin, Y.-D., Bae, J.-H. and Baeg, M.-H. (2012). Spatial uncertainty model for visual features using a kinect sensor, Sensors 12(7): 8640-8662.
  • Pfaff, P., Triebel, R. and Burgard, W. (2007). An efficient extension to elevation maps for outdoor terrain mapping and loop closing, International Journal of Robotics Research 26(2): 217-230.
  • Plagemann, C., Mischke, S., Prentice, S., Kersting, K., Roy, N. and Burgard, W. (2008). Learning predictive terrain models for legged robot locomotion, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, pp. 3545-3552.
  • Poppinga, J., Birk, A. and Pathak, K. (2010). A characterization of 3D sensors for response robots, in J. Baltes et al. (Eds.), RoboCup 2009, Lecture Notes in Artificial Intelligence, Vol. 5949, Springer, Berlin, pp. 264-275.
  • Rusu, R., Sundaresan, A., Morisset, B., Hauser, K., Agrawal, M., Latombe, J.-C. and Beetz, M. (2009). Leaving flatland: Efficient real-time three-dimensional perception and motion planning, Journal of Field Robotics 26(10): 841-862.
  • Saarinen, J., Andreasson, H., Stoyanov, T. and Lilienthal, A.J. (2013). 3D normal distributions transform occupancy maps: An efficient representation for mapping in dynamic environments, International Journal of Robotics Research 32(14): 1627-1644.
  • Sahabi, H. and Basu, A. (1996). Analysis of error in depth perception with vergence and spatially varying sensing, Computer Vision and Image Understanding 63(3): 447-461.
  • Sharf, A., Lewiner, T., Shklarski, G., Toledo, S. and Cohen-Or, D. (2007). Interactive topology-aware surface reconstruction, ACM Transactions on Graphics 26(3), Article No. 43.
  • Skrzypczyński, P. (2007). Spatial uncertainty management for simultaneous localization and mapping, Proceedings of the IEEE International Conference on Robotics and Automation, Rome, Italy, pp. 4050-4055.
  • Skrzypczyński, P. (2009). Simultaneous localization and mapping: A feature-based probabilistic approach, International Journal of Applied Mathematics and Computer Science 19(4): 575-588, DOI: 10.2478/v10006-009-0045-z.
  • Stelzer, A., Hirschmüller, H. and Görner, M. (2012). Stereo-vision-based navigation of a six-legged walking robot in unknown rough terrain, International Journal of Robotics Research 31(4): 381-402.
  • Szeliski, R. (2011). Computer Vision, Algorithms and Applications, Springer, London.
  • Thrun, S., Burgard, W. and Fox, D. (2005). Probabilistic Robotics (Intelligent Robotics and Autonomous Agents), The MIT Press, Cambridge, MA.
  • Walas, K. and Belter, D. (2011). Supporting locomotive functions of a six-legged walking robot, International Journal of Applied Mathematics and Computer Science 21(2): 363-377, DOI: 10.2478/v10006-011-0027-9.
  • Walas, K. and Nowicki, M. (2014). Terrain classification using Laser Range Finder, 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, pp. 5003-5009.
  • Ye, C. and Borenstein, J. (2004). A novel filter for terrain mapping with laser rangefinders, IEEE Transactions on Robotics and Automation 20(5): 913-921.
  • Yoon, S., Hyung, S., Lee, M., Roh, K., Ahn, S., Gee, A., Bunnun, P., Calway, A. and Mayol-Cuevas, W. (2013). Real-time 3D simultaneous localization and map-building for a dynamic walking humanoid robot, Advanced Robotics 27(10): 759-772.
  • Zucker, M., Bagnell, J., Atkeson, C. and Kuffner, J. (2010). An optimization approach to rough terrain locomotion, IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, pp. 3589-3595.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.bwnjournal-article-amcv26i1p81bwm