Autonomous-car-guide


Table of Contents

  • Overview
  • System Architecture
  • Navigation Algorithm
  • Installation
  • Usage
  • Simulations / Demos
  • Configuration
  • Dependencies
  • Contributing
  • License

Overview


This project implements a cyber-physical system (CPS) for smart mobility using an autonomous RC car. The system uses ArUco marker detection to identify and localize the vehicle, and edge detection via thresholding and contour analysis to prevent collisions with physical boundaries (yellow lane boundaries). It demonstrates real-time control, sensing, and feedback using a cost-effective sensor setup and a well-constructed hardware platform built from high-quality components.

The goal is to provide an accessible, reproducible platform that does not rely on onboard cameras, AI, or machine learning, making it ideal for learning the basics of autonomy.


System Architecture


This CPS architecture integrates physical mobility, embedded computation, and real-time visual perception.

[Figure: system architecture diagram]

RC Car (Client - Raspberry Pi)

  • Computation: Raspberry Pi 5
  • Actuators: Quicrun fusion mini 16 brushless motor Hitec D89MW servo
  • Communication: TCP socket client connecting to the laptop/server
  • Power Supply: 7.4V Li-Po battery with step-down converters for 5V components
  • Software: Python script that listens for and executes incoming commands via GPIO/PWM, using the Adafruit ServoKit and gpiozero libraries (see the sketch below)
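
A minimal sketch of that client loop, assuming a newline-delimited text protocol and a hypothetical port number (check client.py and the Configuration section for the actual values):

  import socket
  from adafruit_servokit import ServoKit

  SERVER_IP = "192.168.x.x"   # replace with your laptop IP
  PORT = 5000                 # hypothetical; use the port defined in client.py
  STEERING_CHANNEL = 0        # PWM channel on the PCA9685

  kit = ServoKit(channels=16)

  with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
      sock.connect((SERVER_IP, PORT))
      buf = b""
      while True:
          data = sock.recv(1024)
          if not data:
              break
          buf += data
          while b"\n" in buf:
              cmd, buf = buf.split(b"\n", 1)
              cmd = cmd.decode().strip()
              if cmd.isdigit():                 # e.g. "90" -> absolute servo angle
                  kit.servo[STEERING_CHANNEL].angle = int(cmd)
              # named commands ("left", "right", ...) would be handled here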

Laptop (Server - Vision & Control Unit)

  • Camera: USB-connected webcam (overhead view)
  • Visual Processing:
    • ArUco marker detection for car localization (OpenCV)
    • Yellow lane boundary detection via HSV thresholding and contour detection (both steps sketched after this list)
  • Control Logic:
    • Automatic lane boundary detection and reactive steering
    • Manual control via keyboard commands
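
A rough sketch of both detection steps, assuming opencv-contrib-python 4.7 or newer (older 4.x releases expose cv2.aruco.detectMarkers as a free function instead) and an assumed marker dictionary; the HSV bounds come from the Configuration section:

  import cv2
  import numpy as np

  cap = cv2.VideoCapture(0)  # overhead USB webcam
  detector = cv2.aruco.ArucoDetector(
      cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),  # assumed dictionary
      cv2.aruco.DetectorParameters())

  ok, frame = cap.read()
  corners, ids, _ = detector.detectMarkers(frame)  # car localization

  # Yellow lane boundaries: HSV threshold, then contour extraction
  hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
  mask = cv2.inRange(hsv, np.array([18, 80, 60]), np.array([40, 255, 255]))
  contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)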

Communication Protocol

  • Type: TCP socket
  • Flow:
    1. Client (Raspberry Pi) connects to server
    2. Server processes camera feed, detects marker and boundaries
    3. Server sends commands (e.g., "left", "90") via TCP
    4. Client parses and executes the command using GPIO/PWM (minimal sketch below)
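
A minimal server-side sketch of this flow (the port number and newline framing are assumptions; the matching client loop is sketched under System Architecture above):

  import socket

  HOST, PORT = "0.0.0.0", 5000   # hypothetical port; check the server script

  with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
      srv.bind((HOST, PORT))
      srv.listen()
      conn, addr = srv.accept()    # 1. client (Raspberry Pi) connects
      # 2. camera feed processed here: marker and boundary detection (omitted)
      conn.sendall(b"left\n")      # 3. commands sent as plain text over TCP
      conn.sendall(b"90\n")
      # 4. the client parses and executes each command via GPIO/PWM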

Navigation Algorithm


The following flowchart illustrates the central navigation logic used by the server. It processes the camera feed, identifies ArUco markers to locate each vehicle, and evaluates nearby lane boundary points (yellow contours) to determine an appropriate steering command.

The system scores candidate points based on their direction and distance relative to the car's heading. Only valid points in front of the vehicle and within a dynamic threshold are considered. A turn command is sent only if the change in angle is significant and within safe limits (an illustrative scoring sketch follows the flowchart).

[Figure: navigation algorithm flowchart]
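
The exact scoring code lives in the server script; a hypothetical illustration of the idea, using the ANGLE_FAVOR constant from the Configuration section, might look like this:

  import math

  ANGLE_FAVOR = 0.4  # weight of angle vs. distance in the point score

  def score_point(car_pos, heading_deg, point, max_dist):
      """Illustrative only; not the repository's exact code."""
      dx, dy = point[0] - car_pos[0], point[1] - car_pos[1]
      dist = math.hypot(dx, dy)
      # angle of the point relative to the car's heading, wrapped to [-180, 180)
      rel = (math.degrees(math.atan2(dy, dx)) - heading_deg + 180) % 360 - 180
      if abs(rel) > 90 or dist > max_dist:
          return None  # only points in front of the car and within the threshold
      # nearer, more head-on boundary points score higher (more urgent to avoid)
      angle_term = 1 - abs(rel) / 90
      dist_term = 1 - dist / max_dist
      return ANGLE_FAVOR * angle_term + (1 - ANGLE_FAVOR) * dist_term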


Installation


On Laptop (Controller / Server):
git clone https://github.com/Cyber-physical-Systems-Lab/AutonomousCarGuide.git
cd Autonomous-car-guide/Server
sudo apt-get update
sudo apt-get install python3-pip
pip3 install -r requirements_server.txt

On Raspberry Pi (RC Car / Client):
git clone https://github.com/Cyber-physical-Systems-Lab/AutonomousCarGuide.git
cd Autonomous-car-guide/Client
sudo apt-get update
sudo apt-get install python3-pip python3-dev i2c-tools
pip3 install -r requirements_client.txt --break-system-packages


Usage


For autonomous mode:

  1. Start the Server on the Laptop: python aruco_edge_detector.py

  2. Start the Client on the Raspberry Pi: python client.py

For manual steering:

  1. Start the Server on the Laptop: python steering.py

  2. Start the Client on the Raspberry Pi: python steering_client.py


Simulations / Demos


  • Basic ArUco marker tracking

  • Edge detection using yellow border tape

  • Automatic edge avoidance behavior

  • Real-time manual control over Wi-Fi

[Video: autonomous vehicle demo]

View from the central computer while running:

[Image: view from the central computer]


Configuration


Parameters are currently defined as constants in the scripts:

Client:

  • USER = 1
  • SERVER_IP = "192.168.x.x" # replace with your laptop IP
  • STEERING_CHANNEL = 0 # PWM channel on PCA9685 to use

Server:

  • ANGLE_THRESHOLD = 1 # Minimum angle difference required before sending a new command
  • LOW_THRESHOLD = 40 # Lower bound of the dynamic distance threshold
  • HIGH_THRESHOLD = 80 # Upper bound of the dynamic distance threshold
  • SCALE = 0.2 # Scaling factor for adjusting the turn intensity
  • SEND_INTERVAL = 0.2 # Minimum interval (in seconds) between messages to the same vehicle
  • WEIGHT = 0.5 # Aggressiveness of the steering
  • ANGLE_FAVOR = 0.4 # How much the point scoring favors angle over distance

Lower and upper limits of the HSV color range:

  • lower_yellow = np.array([18, 80, 60])
  • upper_yellow = np.array([40, 255, 255])

The dynamic_threshold function has hardcoded angle thresholds that can be adjusted to change the desired look-ahead distance.
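
As a rough illustration of the general shape (the actual breakpoints in the script may differ), the threshold could interpolate between LOW_THRESHOLD and HIGH_THRESHOLD based on how far a candidate point sits from the car's heading:

  LOW_THRESHOLD = 40    # lower bound of the look distance
  HIGH_THRESHOLD = 80   # upper bound of the look distance

  def dynamic_threshold(rel_angle_deg):
      # Hypothetical reconstruction: look farther straight ahead,
      # shorter off to the sides; relative angle clamped to [0, 90].
      a = min(abs(rel_angle_deg), 90)
      return HIGH_THRESHOLD - (HIGH_THRESHOLD - LOW_THRESHOLD) * a / 90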


Dependencies


Laptop (Server):

  • Python 3.8
  • OpenCV 4.x
  • NumPy
  • keyboard (for manual control via server)

Raspberry Pi (Client):

  • Python 3.8
  • gpiozero
  • Adafruit ServoKit
  • Adafruit Blinka

Contributing


To contribute:

  1. Fork the repository
  2. Create a branch: git checkout -b feature/my-feature
  3. Make your changes and commit
  4. Push and open a Pull Request

License


This project is released under an open-source license to support education and research in autonomous systems.

Please cite this work appropriately in academic or institutional use.

While the system is intended for controlled environments, users who adapt or extend it are responsible for ensuring that their implementations comply with safety, legal, and ethical standards. Misuse of the system in real-world or safety-critical scenarios is strongly discouraged without proper validation and oversight.
