YCP Optical Communications Capstone - Class of 2024
Summer 2023 and Spring 2024 Computer Vision Aided Optical Communications Repository
Based on implementation developed by YCP Optical Communications Capstone - Class of 2023
The purpose of this project is to improve upon the optical wireless communication (OWC) system design by developing a computer vision system that aids in the discovery and maintenance of connections. The system establishes dynamic OWC links among mobile robot devices and maintains them. Communication between devices is performed optically using infrared light from transceivers, which require a direct line of sight (LOS) with other devices. Optical communication allows data to be transmitted at much higher rates than more commonly used forms of communication such as radio frequency. Additionally, optical signals are directionally targeted, which limits an attacker's ability to sniff data or jam communications. To establish communication between two mobile robot devices, a discovery method identifies a robot visually and attempts to open a communication socket. Once connected, a maintenance algorithm keeps the connection reliable for the duration of the communication. Together, these algorithms enhance both the security of the discovery process and the reliability of the maintained link.
The implementation employs TCP to establish reliable connections between robots, ensuring that all data is transmitted without packet loss. It also uses ICMP pings to assist in associating a visually detected robot with its IP address. The Python implementation does not limit the number of robots that can be connected at any given time. Additionally, multi-hop message passing is under development, which would allow robots to communicate with other robots in the network that they cannot physically see.
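The TCP connection setup between two robots can be sketched with Python's standard socket module. The echo handshake and loopback demo below are illustrative assumptions, not the repository's actual protocol or port numbering:

```python
import socket
import threading

def serve_once(server_sock):
    """Accept one incoming connection and echo the received payload back."""
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def connect_and_send(host, port, payload):
    """Open a TCP connection to a peer robot and exchange one message."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)
        return sock.recv(1024)

# Demo over loopback: one "robot" listens, the other connects.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,))
t.start()
reply = connect_and_send("127.0.0.1", port, b"HELLO")
t.join()
server.close()
print(reply)  # b'HELLO'
```

In the real system the client address would come from the vision-assisted discovery step (the ICMP ping association) rather than loopback.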
The config file requires a variety of information:
Runs the object detection model on the USB cameras at video0 and video1. The model is a custom Robot model trained to detect the TurtleBots by the green balls mounted on top of them, and it outputs the locations of detected objects in the frame. Close the object detection window to quit.
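The detector reports object locations within the frame; one common convention is normalized bounding boxes, which can be converted to pixel-space centers for tracking. The box format below is an assumption for illustration, not necessarily this model's actual output format:

```python
def box_centers(detections, frame_width, frame_height):
    """Convert normalized (x_min, y_min, x_max, y_max) boxes to pixel centers.

    Each detection is assumed to use coordinates in [0, 1] relative to the
    frame; the result is a list of (cx, cy) pixel coordinates.
    """
    centers = []
    for (x_min, y_min, x_max, y_max) in detections:
        cx = (x_min + x_max) / 2 * frame_width
        cy = (y_min + y_max) / 2 * frame_height
        centers.append((cx, cy))
    return centers

# A single detection centered in a 640x480 frame.
centers = box_centers([(0.25, 0.25, 0.75, 0.75)], 640, 480)
print(centers)  # [(320.0, 240.0)]
```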
Shows live video from the two USB cameras on video0 and video1, concatenated horizontally in a new window. Press q to quit.
With three USB cameras on video0 through video2, the script captures a frame from each camera and displays the combined 360-degree image in a new window. Press q to quit.
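The side-by-side display in the two- and three-camera scripts above amounts to a horizontal stack of equal-height frames. A minimal NumPy sketch, with synthetic arrays standing in for camera captures (in the scripts themselves the frames would come from cv2.VideoCapture):

```python
import numpy as np

def concat_horizontal(frames):
    """Join equal-height frames left to right into one wide image."""
    heights = {f.shape[0] for f in frames}
    if len(heights) != 1:
        raise ValueError("all frames must share the same height")
    return np.hstack(frames)

# Synthetic 480x640 3-channel frames standing in for video0..video2 captures.
frames = [np.full((480, 640, 3), i, dtype=np.uint8) for i in range(3)]
panorama = concat_horizontal(frames)
print(panorama.shape)  # (480, 1920, 3)
```

OpenCV's cv2.hconcat performs the same operation on captured frames; NumPy is used here only so the sketch runs without camera hardware.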
The chart shows an overview of a generic form of the algorithms and classes used in the project.