Details
Presenter(s)
![Yin-Tsung Hwang Headshot](https://confcats-catavault.s3.amazonaws.com/CATAVault/ieeecass/master/files/styles/cc_user_photo/s3/user-pictures/hwang.jpg?h=2eb3a0cf&itok=bMa-NcMX)
Yin-Tsung Hwang
- Affiliation: National Chung Hsing University
- Country: Taiwan
Abstract
This paper presents the key sensing and navigation techniques, together with the edge-AI computing chip design, for an autonomous mover operating in people-rich environments. The mover is equipped with a smart sensing module consisting of a LiDAR, a camera, and an ultrasonic array radar. Camera images are analyzed by two-stage CNN models that perform both object recognition and pedestrian behavior prediction. A 1-ray LiDAR performs SLAM, and the data-fusion results are passed to the navigation module, which is based on a reinforcement learning model. A Deep Learning Architecture acceleration chip design and an accompanying mapping tool are also developed. A 4-wheeled autonomous mover prototype was built, and preliminary evaluation results are presented.
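The two-stage camera pipeline described in the abstract (a first CNN stage for object recognition, a second stage for pedestrian behavior prediction on the detected regions) can be sketched roughly as follows. This is an illustrative outline only, not the paper's implementation: the function names, the `Detection` type, and the placeholder outputs are all assumptions standing in for the actual CNN models.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str                        # e.g. "pedestrian", "vehicle"
    bbox: Tuple[int, int, int, int]   # (x, y, w, h) in pixels

def stage1_detect(frame) -> List[Detection]:
    """Stage 1: object recognition on a full camera frame.

    Placeholder for the first CNN; a real system would run an
    object-detection model here and return its detections.
    """
    return [Detection("pedestrian", (120, 80, 40, 90))]

def stage2_predict_behavior(frame, det: Detection) -> str:
    """Stage 2: behavior prediction for a detected pedestrian.

    Placeholder for the second CNN, which would analyze the cropped
    region (and typically its recent history) to predict intent.
    """
    return "crossing" if det.label == "pedestrian" else "n/a"

def perceive(frame) -> List[Tuple[Detection, str]]:
    """Chain the two stages: detect objects, then classify behavior."""
    detections = stage1_detect(frame)
    return [(d, stage2_predict_behavior(frame, d)) for d in detections]
```

In a full system the list of `(detection, behavior)` pairs produced here would be fused with the LiDAR SLAM output before being handed to the navigation module.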