Photon Flight : Fiber-Optic Autonomous UAV
Summary: 

The Photon Flight (Autonomous UAV) project focuses on creating a high-performance tethered drone capable of sending two real-time video streams and complete flight telemetry through a single fiber-optic cable. By using a lightweight fiber tether instead of conventional RF links, the platform demonstrates how advanced networking, optical communications, and embedded systems can be merged within an aerial vehicle. Throughout the process, the team gains experience with IP video streaming, low-latency video encoding, network design, embedded Linux integration, and integrating the entire system with a flight controller (MATEK F405). The main objectives are to build a reliable, high-bandwidth communication system, demonstrate consistent fiber-based command and control, and assess the advantages of optical tethering for secure, interference-resistant drone operations. Ultimately, the drone serves as a test platform for new communication technologies in robotics, monitoring, and environments where electromagnetic interference is a concern.

The main goal of this project is to build a tethered drone that can operate dependably in dangerous settings to support search-and-rescue missions. The system uses fiber-optic communication for fast, secure data transfer, includes onboard AI to detect objects and people in real time, and features modular hardware that can be deployed quickly. By integrating these features, the drone aims to improve situational awareness in emergencies, make rescue efforts safer, and offer a valuable tool for finding and aiding people in places that are unsafe or unreachable for rescue teams.

Technical Approach/Methodology: 

The primary goal for the fall quarter was to assemble all drone components and verify their functionality. This included mounting the flight controller, electronic speed controller (ESC), motors, Raspberry Pi, AI camera, and battery on the frame, as well as completing power and signal wiring using a combination of soldered joints and crimped connectors. The team successfully completed this objective, fully integrating the hardware and performing bench tests to confirm all subsystems were operational. ESC and AI camera mounts were 3D printed and installed on the body of the assembly. Additionally, the fiber-optic communication link was assembled, configured, and tested, demonstrating reliable data transmission between the drone and a ground-station prototype. By the end of the quarter, the drone was mechanically and electrically complete and ready for flight testing in the following quarter.

The objectives for the winter quarter focused on enhancing system capabilities and preparing for controlled testing. These objectives were accomplished by demonstrating full integration of the communication system over fiber, completing the physical construction of the drone, and achieving a successful flight with video and control coordinated over the optical link. Iterative refinement of the drone body via additive manufacturing produced an optimized media rack to house the Raspberry Pi and SFP transceiver. The battery mount was similarly iterated to make efficient use of the space on top of the frame and to incorporate a GPS/compass standoff. Several prototypes of a spooling mechanism were explored, resulting in a system mounted on the underside of the drone body. A pretrained object-detection model was deployed on the Raspberry Pi AI Camera, and the team modified the Mission Planner HUD to display live streaming video, leveraging GStreamer and FFmpeg to build the media pipeline. The drone can detect and actively track a human standing roughly 40 feet in front of it.
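A media pipeline of the kind described above can be sketched as follows. This is an illustrative example, not the team's exact configuration: it assumes H.264 encoding on the Raspberry Pi and RTP over UDP to the ground station, with element names and parameters (resolution, bitrate, port) chosen for the sketch rather than taken from the project.

```python
# Sketch of a GStreamer sender pipeline for the drone's Raspberry Pi:
# camera video is H.264-encoded for low latency and streamed as RTP/UDP
# to the ground station. All parameters here are illustrative assumptions.
import shlex
import subprocess


def build_sender_pipeline(host: str, port: int = 5600,
                          width: int = 1280, height: int = 720,
                          fps: int = 30) -> str:
    """Build a gst-launch-1.0 command that encodes camera video to H.264
    and streams it as RTP over UDP to the ground station at host:port."""
    return (
        "gst-launch-1.0 libcamerasrc ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "videoconvert ! "
        "x264enc tune=zerolatency speed-preset=ultrafast bitrate=4000 ! "
        "rtph264pay config-interval=1 pt=96 ! "
        f"udpsink host={host} port={port}"
    )


def launch(host: str) -> subprocess.Popen:
    """Start the pipeline as a child process on the drone."""
    return subprocess.Popen(shlex.split(build_sender_pipeline(host)))


# On the ground-station side, Mission Planner's HUD video source can be
# pointed at a matching receive pipeline, e.g.:
#   udpsrc port=5600 caps="application/x-rtp" ! rtph264depay !
#   avdec_h264 ! videoconvert ! autovideosink
```

Keeping the pipeline as a single string mirrors how GStreamer pipelines are usually prototyped with `gst-launch-1.0` before being ported to application code.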

Outcomes: 

The Photon Flight drone has been fully assembled and has successfully demonstrated end-to-end system functionality. The drone communicates over the fiber-optic cable with a laptop using the MAVLink protocol, interfacing through a ground station hosted on the computer. The camera mounted on the drone frame provides a live video feed at 30 frames per second with 2-4 ms of latency. An object-detection model runs on this feed, and qualitatively the system demonstrates accurate detection of human subjects within the camera's 40-foot range. This detection capability will continue to be refined during testing to support additional object classes and improve overall model accuracy. Flight inputs are provided through an RC controller connected to the computer, and testing has confirmed wired flight operations while switching between autonomous object tracking and manual flight control.
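The core of an object-tracking mode like the one described above is mapping each detection to a steering correction. The sketch below shows one common approach, not the team's implementation: convert the bounding-box center's pixel offset into a yaw-angle error via the camera's field of view, then apply a clamped proportional controller. The frame width, field of view, and gains are assumed values for illustration.

```python
# Illustrative tracking-loop math: steer so the detected target's center
# moves toward the center of the frame. Constants below are assumptions.
from dataclasses import dataclass

FRAME_WIDTH = 1280     # assumed camera frame width, pixels
HFOV_DEG = 66.0        # assumed horizontal field of view, degrees


@dataclass
class BoundingBox:
    """Axis-aligned detection box in pixel coordinates."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float


def yaw_error_deg(box: BoundingBox,
                  frame_width: int = FRAME_WIDTH,
                  hfov_deg: float = HFOV_DEG) -> float:
    """Horizontal angle from the camera axis to the box center.
    Positive means the target is right of center (yaw right)."""
    center_x = (box.x_min + box.x_max) / 2.0
    offset_px = center_x - frame_width / 2.0      # pixels off-center
    return offset_px / frame_width * hfov_deg     # linear FOV approximation


def yaw_rate_command(error_deg: float, kp: float = 0.8,
                     max_rate: float = 45.0) -> float:
    """Proportional yaw-rate command, clamped to a safe limit (deg/s)."""
    return max(-max_rate, min(max_rate, kp * error_deg))
```

Clamping the output keeps a large detection error from commanding an aggressive yaw rate, which matters when the operator may take over manual control at any moment.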

The current drone configuration achieves an approximate flight time of 20 minutes with a dry weight of 1,494 g and an all-up weight (AUW) of 2,274 g. The platform supports a maximum payload capacity of 2,730 g. Each motor is capable of producing approximately 2,467 g of thrust, resulting in a total maximum thrust of 9,868 g across the four-motor system. These performance characteristics provide a thrust-to-weight margin suitable for stable flight while accommodating onboard sensing and processing hardware.
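The thrust-to-weight figures above can be reproduced with a few lines of arithmetic. The masses and per-motor thrust come from the text; the 50%-throttle hover guideline is a general multirotor rule of thumb, not a project specification.

```python
# Reproduce the performance figures quoted above (all masses in grams).
MOTOR_THRUST_G = 2467   # thrust per motor
NUM_MOTORS = 4
DRY_WEIGHT_G = 1494
AUW_G = 2274            # all-up weight (airframe plus battery)

total_thrust_g = MOTOR_THRUST_G * NUM_MOTORS   # 9,868 g across four motors
thrust_to_weight = total_thrust_g / AUW_G      # ratio at all-up weight

# A common rule of thumb is to hover at or below 50% throttle, which
# requires a thrust-to-weight ratio of at least 2; this airframe clears
# that comfortably, leaving margin for payload and aggressive maneuvers.
print(f"total thrust: {total_thrust_g} g")
print(f"thrust-to-weight at AUW: {thrust_to_weight:.2f}")
```

At roughly 4.3:1 at all-up weight, the platform retains a healthy margin even as payload is added toward its rated capacity.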

Course Department: 
EECS
Academic Year: 
2025-2026
Term(s): 
Fall
Winter
Project Category: 
Internal (faculty, staff, TA)
Sponsor/Mentor Name: 
Professor Peter Burke
Project Poster: 
Project Video: