    Details
    Presenter(s)
    Display Name
    Yin-Tsung Hwang
    Affiliation
    National Chung Hsing University
    Country
    Taiwan
    Abstract

    This paper presents the key sensing and navigation techniques, together with the edge-AI computing chip design, for an autonomous mover operating in people-rich indoor environments. The mover is equipped with a smart sensing module consisting of a Lidar, a camera, and an ultrasonic array radar. Camera images are analyzed with two-stage CNN models, performing not only object recognition but also pedestrian behavior prediction. A 1-ray Lidar performs SLAM, and the fused sensor data are passed to the navigation module, which is based on a reinforcement learning model. A deep-learning accelerator chip and an accompanying model-mapping tool are also developed. A four-wheeled autonomous mover prototype was built, and preliminary evaluation results are presented.
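    The abstract's pipeline (two-stage camera analysis, Lidar-based ranging, sensor fusion, and an RL-based navigation decision) can be sketched as follows. This is a minimal illustrative skeleton, not the paper's implementation: all function names, thresholds, and data structures here are hypothetical stand-ins for the CNN stages, fusion step, and learned policy described above.

    ```python
    # Hypothetical sketch of the iAMEC processing flow described in the abstract:
    # stage-1 CNN stand-in (object recognition), stage-2 CNN stand-in (pedestrian
    # behavior prediction), camera/Lidar fusion, and an RL-policy stand-in that
    # maps the fused state to a motion command. All names are illustrative.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Detection:
        label: str         # e.g. "pedestrian", "obstacle"
        distance_m: float  # fused range estimate

    def stage1_detect(frame: List[str]) -> List[Detection]:
        """Stage-1 stand-in: recognize objects in a camera frame."""
        return [Detection(label=obj, distance_m=2.0) for obj in frame]

    def stage2_predict_behavior(det: Detection) -> str:
        """Stage-2 stand-in: predict pedestrian behavior."""
        return "crossing" if det.label == "pedestrian" else "static"

    def fuse(detections: List[Detection], lidar_range_m: float) -> List[Detection]:
        """Fuse camera detections with a Lidar range (here: take the minimum)."""
        return [Detection(d.label, min(d.distance_m, lidar_range_m)) for d in detections]

    def rl_policy(state: List[Detection]) -> str:
        """Policy stand-in: yield when a crossing pedestrian is nearby."""
        for det in state:
            if det.distance_m < 3.0 and stage2_predict_behavior(det) == "crossing":
                return "slow_and_yield"
        return "cruise"

    frame = ["pedestrian", "obstacle"]
    state = fuse(stage1_detect(frame), lidar_range_m=1.5)
    print(rl_policy(state))  # prints "slow_and_yield"
    ```

    The two-stage split mirrors the abstract's design choice: recognizing an object first, then reasoning about its likely behavior, lets the navigation policy react to predicted pedestrian motion rather than raw detections alone.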

    Slides
    • iAMEC, an Intelligent Autonomous Mover for Navigation in Indoor People-Rich Environments (PDF)