Details

Hayden Kwok-Hay So

- Affiliation: University of Hong Kong
A case study is presented on applying modern FPGAs to accelerate intelligent, vision-guided crop detection in agricultural field robots. A state-of-the-art YOLOv3 object detection neural network was adapted to detect broccoli and cauliflower in an image dataset collected by autonomous agricultural robots. A baseline floating-point implementation achieved 96% mAP, while an efficient, quantized implementation suitable for FPGA deployment achieved 92% mAP. The proposed FPGA solution attains 136.86 ms inference latency while consuming 12.43 W in a low-latency configuration, and 28.48 frames per second while consuming 17.78 W in a high-throughput configuration. Compared with an embedded GPU implementation of the same task, the FPGA solution was 4.12 times more power-efficient and offers 6.85 times higher throughput, translating to faster and longer operation of a battery-powered field robot.
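The power and timing figures above imply an energy cost per inference (energy = power × time per frame), which is what ultimately determines battery life on a field robot. The sketch below is a back-of-the-envelope check using only the numbers quoted in the abstract; the helper function names are illustrative, not from the work itself.

```python
# Energy per inference implied by the reported figures:
#   energy (J/frame) = power (W) * time per frame (s)
# All input numbers come from the abstract; this is an
# illustrative calculation, not a measured result.

def energy_per_frame_latency(power_w: float, latency_ms: float) -> float:
    """Energy per frame when latency per frame is reported directly."""
    return power_w * (latency_ms / 1000.0)

def energy_per_frame_throughput(power_w: float, fps: float) -> float:
    """Energy per frame when throughput (frames/s) is reported."""
    return power_w / fps

# Low-latency configuration: 136.86 ms at 12.43 W.
low_latency_j = energy_per_frame_latency(12.43, 136.86)

# High-throughput configuration: 28.48 fps at 17.78 W.
high_throughput_j = energy_per_frame_throughput(17.78, 28.48)

print(f"low-latency:     {low_latency_j:.2f} J/frame")
print(f"high-throughput: {high_throughput_j:.2f} J/frame")
```

Note that the high-throughput configuration, despite its higher instantaneous power draw, spends less energy per frame, which is consistent with the claimed power-efficiency advantage over the embedded GPU.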