A Mini Autonomous Car (MAC)

2022 ELE Engineering Design Project (XF09)


Faculty Lab Coordinator

Xavier Fernando

Topic Category

Signal Processing / Communication

Preamble

An autonomous car is a vehicle capable of sensing its environment and operating without human involvement. A human passenger is not required to take control of the vehicle at any time. An autonomous car can go anywhere a traditional car goes and do everything that an experienced human driver does. Self-driving cars combine a variety of sensors to perceive their surroundings, such as thermographic cameras, radar, lidar, sonar, GPS, odometry and inertial measurement units. Advanced control systems interpret the sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage.

Objective

To design a robot car that navigates its environment without bumping into obstacles. The car will use a LiDAR (Light Detection and Ranging) sensor of the kind used in real autonomous cars.

Partial Specifications

The MAC shall be able to move by itself and avoid obstacles.

It shall use the LiDAR in addition to other sensors for this purpose.

The sensory data shall be collected and processed in real time for navigational purposes; a minimal illustration of such a sense-and-decide step is sketched after this list.

The MAC shall also have wireless communication capabilities.
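
As a rough illustration of the real-time processing requirement, the sketch below is plain Python with a made-up sweep format and a placeholder drive command; it is not part of the project hand-out. It shows how one LiDAR sweep, taken as a list of (angle, distance) pairs, might be reduced to a simple avoid/continue decision.

# Minimal sense-and-decide sketch (illustrative only).
# The sweep format and the returned drive commands are assumptions,
# standing in for the group's actual LiDAR and motor-driver code.

SAFE_DISTANCE_M = 0.4    # treat anything closer than this as an obstacle
FRONT_SECTOR_DEG = 30.0  # +/- sector around the heading that matters for driving

def obstacle_ahead(scan):
    """Return True if any point in the front sector is closer than the safe distance."""
    return any(
        abs(angle) <= FRONT_SECTOR_DEG and 0.0 < dist < SAFE_DISTANCE_M
        for angle, dist in scan
    )

def decide(scan):
    """Map one LiDAR sweep to a (hypothetical) drive command."""
    return "TURN" if obstacle_ahead(scan) else "FORWARD"

# Example sweep: clear to the sides, an obstacle 0.3 m dead ahead.
sample_scan = [(-90.0, 2.1), (-10.0, 1.5), (0.0, 0.3), (15.0, 1.8), (90.0, 2.4)]
print(decide(sample_scan))   # -> "TURN"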

Suggested Approach

The car can have a Raspberry Pi at its core. As it drives around, LiDAR sensor data can be transmitted from the Pi to a nearby PC, which uses a SLAM (simultaneous localization and mapping) system to create a virtual 3D replica of the room surrounding the Raspberry Pi. The program can also render a virtual representation of the car in Unity to estimate its position in the room.
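
As a very rough sketch of this data path, the following Python stub (not from the referenced project; the host address, port and the fake sweep generator are assumptions) shows the Pi side sending one sweep per line as JSON over TCP. Conceptually this is the job that ROS and ROS-TCP-Connector perform in the build described below.

# Minimal Pi-side streaming sketch, assuming scan points are available as
# (angle_deg, distance_m) pairs. Host, port and fake_sweep() are assumptions.
import json
import socket
import time

PC_HOST = "192.168.1.50"   # assumed LAN address of the PC running the SLAM/Unity side
PC_PORT = 9000             # assumed port

def fake_sweep():
    """Stand-in for a real LiDAR read: a coarse 360-degree sweep of constant range."""
    return [(angle, 1.0) for angle in range(0, 360, 10)]

def stream_sweeps():
    with socket.create_connection((PC_HOST, PC_PORT)) as sock:
        while True:
            sweep = fake_sweep()                      # replace with real LiDAR data
            line = json.dumps({"t": time.time(), "points": sweep}) + "\n"
            sock.sendall(line.encode("utf-8"))
            time.sleep(0.1)                           # roughly 10 sweeps per second

if __name__ == "__main__":
    stream_sweeps()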

See https://www.tomshardware.com/news/raspberry-pi-car-maps-room-with-unity

This project was built using a Raspberry Pi 4 (4 GB model) and a YDLIDAR X2L LiDAR sensor to detect the 3D space around it. The Linux-based PC used for processing the data is in the same room and runs Ubuntu 20.04.4.

S Lab was kind enough to make the project open source and has shared the code for interested parties on GitHub. According to S Lab, the rotation angle and position data are transmitted to Unity using ROS (Robot Operating System) together with ROS-TCP-Connector 0.7.0, which is also available on GitHub. A demonstration video is also available.
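
For reference, publishing LiDAR sweeps as the standard sensor_msgs/LaserScan topic might look like the sketch below, assuming ROS 1 (e.g. Noetic on the Ubuntu 20.04 machine) and rospy, with placeholder range data; ROS-TCP-Connector can then forward such topics to Unity. This is an illustrative stub, not the S Lab code.

# Minimal ROS 1 (rospy) LaserScan publisher sketch with placeholder ranges
# standing in for real YDLIDAR readings.
import math
import rospy
from sensor_msgs.msg import LaserScan

def publish_scans():
    rospy.init_node("mac_lidar_publisher")
    pub = rospy.Publisher("/scan", LaserScan, queue_size=10)
    rate = rospy.Rate(10)                        # roughly 10 sweeps per second
    while not rospy.is_shutdown():
        msg = LaserScan()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "laser"
        msg.angle_min = -math.pi
        msg.angle_max = math.pi
        msg.angle_increment = math.radians(1.0)  # one reading per degree
        msg.range_min = 0.1
        msg.range_max = 8.0
        msg.ranges = [1.0] * 360                 # placeholder: replace with real readings
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    publish_scans()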

Group Responsibilities

The group is responsible for successfully completing, integrating, and testing the whole project.

Student A Responsibilities

Moving parts of the MAC.

Student B Responsibilities

LiDAR signal acquisition and processing.

Student C Responsibilities

Other sensors and integration.

Student D Responsibilities

Machine learning and optimization.

Course Co-requisites

To ALL EDP Students

Due to the COVID-19 pandemic, in the event the University is not open for in-class/in-lab activities during the Winter term, your EDP topic specifications, requirements, implementations, and assessment methods will be adjusted by your FLCs at their discretion.

 

