A comprehensive collection of autonomous robotics projects showcasing pathfinding, computer vision, multi-agent communication, and embedded systems programming using Raspberry Pi and TI Tiva C microcontrollers.
This repository contains autonomous robotics projects developed progressively toward a Capture the Flag robot competition. Each project builds on the previous one, demonstrating incremental development of complex robotic systems: computer vision, sensor integration, path planning, and wireless communication.
- Autonomous Navigation: Develop robots capable of navigating mazes and environments using IR sensors
- Computer Vision: Implement color detection and object tracking using OpenCV
- Multi-Agent Coordination: Create communication protocols for team-based robot coordination
- Embedded Systems: Program TI Tiva C microcontrollers for real-time sensor processing and motor control
- Hardware Integration: Interface Raspberry Pi with microcontrollers via UART for distributed computing
Final Team Project - The culmination of all previous projects, implementing a competitive robot team for Capture the Flag gameplay.
Location: PathFindingAndTracking/FinalProject-CaptureTheFlagTeamCode/
- Offensive Strategy: Autonomous flag capture using color tracking and pathfinding
- Defensive Strategy: Goal protection with aggressive defender behavior
- Team Coordination: Multiple robot roles (offense, defense, goalkeeper)
- Real-time Vision: OpenCV-based color detection for flag identification
- Adaptive Behavior: State-based decision making for game situations
- `CaptureTheFlag_Offense.py` - Offensive robot controller
- `CaptureTheFlag_Goalie.py` - Defensive robot controller
- `actions.py` - Movement primitives and behaviors
- `PWMrobotControl.py` - Low-level motor control via UART
- Python 3
- OpenCV (cv2) for computer vision
- PiCamera for video capture
- UART communication with TI Tiva microcontroller
- PWM motor control
Object Detection and Pursuit - Foundation for visual servoing and target tracking.
Location: PathFindingAndTracking/Project1_Tracking/
- Detect and track colored objects (stress balls) using computer vision
- Implement visual servoing to follow moving targets
- Foundation for flag detection in Capture the Flag game
- HSV color space conversion for robust color detection
- Contour detection and filtering
- Centroid calculation for object localization
- PID-like control for smooth tracking
- `ball_test.py` - Main tracking algorithm
- `robotObjectTracker.py` - Object tracking implementation
- `makeROI.py` - Region of Interest selection tool
Run Instructions:
python3 ball_test.py

Authors: Courtney Banh, Nick Vaughn (October 2018)
Autonomous Exploration - Systematic area coverage using the Boustrophedon (Butler) algorithm.
Location: PathFindingAndTracking/Project2_Coverage/
- Implement systematic coverage algorithm for unknown environments
- Create grid-based map representation
- Integrate IR sensors for obstacle detection
- Combine exploration with color tracking capabilities
- Butler Algorithm: Systematic back-and-forth coverage pattern
- Occupancy Grid Mapping: Matrix-based environment representation
- Sensor Fusion: IR sensors for obstacle detection
- Hybrid Behavior: Switch between exploration and target tracking
- Reverse Path: Return to origin after tracking excursion
- `RobotCoverage.py` - Main coverage algorithm
- `map_class.py` - Grid-based map representation
- `robotTrack.py` - Tracking methods (detectGoal, trackingAlg, reverseMoves)
- `PWMrobotControl.py` - Motor control interface
Run Instructions:
python3 RobotCoverage.py

Authors: Nicholas Vaughn, Courtney Banh
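The back-and-forth sweep over an occupancy grid can be sketched as below. This is an illustration of the boustrophedon pattern over a matrix map, not the `RobotCoverage.py` implementation; the grid values and helper name are assumptions:

```python
# Minimal sketch of a boustrophedon (back-and-forth) coverage sweep over an
# occupancy grid. Cell values: 0 = free, 1 = obstacle.

def boustrophedon_path(grid):
    """Visit every free cell row by row, alternating sweep direction."""
    path = []
    for r, row in enumerate(grid):
        # Even rows sweep left-to-right, odd rows right-to-left.
        cols = range(len(row)) if r % 2 == 0 else range(len(row) - 1, -1, -1)
        for c in cols:
            if grid[r][c] == 0:  # skip cells marked as obstacles
                path.append((r, c))
    return path

grid = [
    [0, 0, 0],
    [0, 1, 0],
    [0, 0, 0],
]
path = boustrophedon_path(grid)
```

A real robot would additionally update the grid from IR readings as it moves, which is where the sensor-fusion and hybrid-behavior pieces above come in.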
Distributed Robotics - Wireless communication protocol for robot team coordination.
Location: PathFindingAndTracking/Project3_MultiAgent/
- Develop frame-based communication protocol
- Implement master-slave role switching
- Coordinate multiple robots wirelessly
- Demonstrate with "Simon Says" game
- ZigBee Wireless: XBee wireless communication modules
- Frame Protocol: Custom message framing for reliable transmission
- FSM Design: Finite state machine for protocol states
- Dynamic Roles: Master/slave role negotiation
- Command Interpretation: Parse and execute remote commands
- `robotCommProtocol.py` - Complete communication protocol implementation
  - `simon()` - Master robot behavior
  - `player()` - Slave robot behavior
  - `performCommand()` - Command execution
Run Instructions:
python3 robotCommProtocol.py

Authors: Nicholas Vaughn, Courtney Banh
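A frame-based protocol of this kind can be sketched as below. The field layout (start byte, length, payload, XOR checksum) is an assumption for illustration; the actual framing lives in `robotCommProtocol.py`:

```python
# Hedged sketch of frame-based message framing: start byte, payload length,
# payload, and a one-byte XOR checksum. The layout is illustrative, not the
# exact format used by robotCommProtocol.py.
START = 0x7E

def encode_frame(payload: bytes) -> bytes:
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([START, len(payload)]) + payload + bytes([checksum])

def decode_frame(frame: bytes):
    """Return the payload, or None if the frame is malformed or corrupted."""
    if len(frame) < 3 or frame[0] != START:
        return None
    length = frame[1]
    if len(frame) != length + 3:
        return None
    payload, checksum = frame[2:-1], frame[-1]
    calc = 0
    for b in payload:
        calc ^= b
    return payload if calc == checksum else None

msg = encode_frame(b"MOVE F")
```

On top of framing like this, the finite state machine would track whether each robot is awaiting a command (slave) or issuing one (master), and role negotiation would swap those states.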
Embedded Systems Programming - Low-level microcontroller code for sensor processing and motor control.
Location: TI Tiva Micro Projects/
Location: TI Tiva Micro Projects/MazeRobot/lab6/
- GPIO interrupt-driven sensor reading
- IR sensor interface (front, left, right)
- PWM motor control
- ADC for analog sensor processing
- UART communication with Raspberry Pi
- TM4C123GH6PM microcontroller (TI Tiva C)
- Three IR distance sensors
- Encoder feedback for odometry
- Motor driver integration
Location: TI Tiva Micro Projects/MoveRobotToXYPoint/
- Coordinate-based navigation
- UART communication protocol
- Real-time motor control
- Modular code architecture
- `main.c` - Main control loop and initialization
- `move.c/h` - Movement primitives and control
- `uart_com.c/h` - UART communication handlers
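The geometry behind coordinate-based navigation is a turn-then-drive computation. The firmware does this in C (`move.c`); the Python sketch below, with an assumed `(x, y, heading)` pose convention, just illustrates the math:

```python
# Sketch of coordinate-based navigation: given the robot's pose and a target
# (x, y), compute the rotation and forward distance to reach it. The pose
# convention here is an assumption for illustration.
import math

def plan_move(pose, target):
    """pose = (x, y, heading_rad); target = (x, y).
    Returns (turn_rad, distance): rotate by turn_rad, then drive forward."""
    x, y, heading = pose
    tx, ty = target
    distance = math.hypot(tx - x, ty - y)
    bearing = math.atan2(ty - y, tx - x)
    # Wrap the turn into [-pi, pi) so the robot takes the short way round.
    turn = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
    return turn, distance

turn, dist = plan_move((0.0, 0.0, 0.0), (1.0, 1.0))
```

Encoder feedback would then close the loop on both the rotation and the driven distance.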
- Microcontroller: TI Tiva C Series TM4C123GH6PM
- IDE: Code Composer Studio (CCS)
- Programming: C/C++
- Interfaces: GPIO, UART, PWM, ADC, Interrupts
- Languages: Python 3, C
- Computer Vision: OpenCV (cv2)
- Hardware Interface:
- picamera (Raspberry Pi Camera Module)
- GPIO control
- UART/Serial communication
- Image Processing:
- NumPy
- imutils
- Development Tools:
- Code Composer Studio (for TI Tiva)
- Python 3.x
- Compute Platforms:
- Raspberry Pi (Python host controller)
- TI Tiva C Series TM4C123GH6PM microcontroller
- Sensors:
- IR distance sensors (front, left, right)
- Raspberry Pi Camera Module
- Wheel encoders
- Actuators:
- DC motors with PWM control
- Motor driver circuits
- Communication:
- UART (Pi ↔ Tiva)
- ZigBee/XBee wireless modules (robot ↔ robot)
- Computer Vision:
- HSV color space filtering
- Contour detection and analysis
- Morphological operations (erosion, dilation)
- Frame differencing
- Path Planning:
- Boustrophedon coverage algorithm
- Reactive navigation
- Occupancy grid mapping
- Control Systems:
- PWM motor control
- Sensor-based reactive control
- State machine architectures
- Communication:
- Custom frame-based protocols
- Master-slave coordination
- UART serial communication
- High-level processing on Raspberry Pi (vision, planning)
- Low-level control on TI Tiva (sensors, motors, real-time)
- UART bridge for seamless communication
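The Pi-to-Tiva bridge boils down to sending small, fixed-format command frames over the serial link. The `L<left>R<right>\n` format below is an assumption for illustration (the real format is defined in `PWMrobotControl.py`); with pyserial, the resulting bytes would be written via `serial.Serial("/dev/ttyS0", 115200).write(frame)`:

```python
# Sketch of building a motor command frame for the Pi -> Tiva UART link.
# The "L<left>R<right>\n" layout is a hypothetical format for illustration.

def pwm_command(left: int, right: int) -> bytes:
    """Build one motor command; duty cycles are clamped to 0-100 percent."""
    left = max(0, min(100, left))
    right = max(0, min(100, right))
    # Fixed-width fields keep parsing on the microcontroller side trivial.
    return f"L{left:03d}R{right:03d}\n".encode("ascii")

frame = pwm_command(75, 120)  # right side clamps to 100
```

Fixed-width ASCII fields are easy to parse byte-by-byte inside a UART receive interrupt on the Tiva, which is one reason this style of frame is common on small microcontrollers.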
- Reusable motion control libraries (`PWMrobotControl.py`)
- Abstracted sensor interfaces
- Behavior-based architecture for complex tasks
Each project builds on previous work:
- Project 1 → Vision system
- Project 2 → Vision + Exploration
- Project 3 → Vision + Exploration + Communication
- Final → All capabilities in competitive environment
- Competitive robotics (Capture the Flag)
- Autonomous navigation in uncertain environments
- Multi-agent coordination
- Embedded Systems Programming (C on microcontrollers)
- Computer Vision (OpenCV, real-time image processing)
- Robotics (kinematics, sensor fusion, control)
- Hardware Integration (UART, GPIO, PWM, ADC)
- Multi-threading and Real-time Systems
- Wireless Communication Protocols
- Algorithm Implementation (pathfinding, coverage, tracking)
- Modular code architecture
- Progressive development methodology
- Hardware abstraction layers
- State machine design patterns
- Object-oriented programming in robotics
- Sensor calibration and characterization
- Integration of multiple subsystems
- Real-time performance optimization
- Debugging hardware-software interfaces
- Team coordination for competitive robotics
Most project folders contain detailed README files:
- PathFindingAndTracking/ReadME.md - Overview of vision-based projects
- Project1_Tracking_README.txt - Color tracking details
- proj2ReadMe.txt - Coverage algorithm explanation
- Project3_README.txt - Communication protocol details
- FinalProject_README.txt - Capture the Flag setup
# Python dependencies
sudo apt-get install python3 python3-opencv
pip3 install picamera imutils numpy
# Hardware requirements
- Raspberry Pi (any model with camera support)
- TI Tiva C Series microcontroller
- Robot chassis with motors
- IR sensors
- Camera module

Each project includes run instructions in its README. General pattern:
cd PathFindingAndTracking/[ProjectFolder]
python3 [MainFile].py

For more detailed information, see:
- PROJECTS.md - Quick reference guide with all project files and commands
- ARCHITECTURE.md - Technical system architecture and design decisions
- Individual project READMEs - Detailed setup instructions in each project folder
- Nicholas Vaughn (EngineerNV)
- Courtney Banh
- John Kim
- Jason Kerins (exploration code reference)
RobotProjects/
├── PathFindingAndTracking/          # Main robotics projects
│   ├── Project1_Tracking/           # Color detection and tracking
│   ├── Project2_Coverage/           # Area coverage and mapping
│   ├── Project3_MultiAgent/         # Wireless communication
│   ├── FinalProject/                # Early CTF development
│   └── FinalProject-CaptureTheFlagTeamCode/  # Competition code
├── TI Tiva Micro Projects/          # Embedded systems projects
│   ├── MazeRobot/                   # Autonomous maze navigation
│   └── MoveRobotToXYPoint/          # Point navigation system
└── README.md                        # This file
Last Updated: November 2025
License: Educational/Portfolio Project