Welcome to the ECE Senior Design Showcase, an event that highlights a combination of innovation and practical engineering solutions. Our students have invested many hours developing projects that address real-world challenges, blending technical expertise with creativity.
This semester, they will present a diverse array of projects, including a 3D-printed prosthetic arm controlled by an adaptive machine learning system, a tracking and recognition system, a robotic caregiver, an automated pet care solution, and a vehicle designed to locate survivors in hazardous environments.
Here are the projects on display:
The Projects
ML Enhanced EMG Prosthetic Arm
Machine Learning Enhanced EMG Prosthetic Arm
Team
Amiyah Stephens CoE
Brandon Lee CoE
Cheyenne Campos EE
Chris Davorson EE
Edward Hernandez EE
Upper-limb prosthetics face critical accessibility barriers, including prohibitive costs, complex calibration processes that frustrate new users, and high abandonment rates. This project addresses these challenges with an open-source, 3D-printed arm design and a machine-learning-enhanced electromyographic (EMG) control system. Using a single MyoWare sensor, we developed an intuitive pressure-based design in which users control hand gestures through bicep contractions. Our personalized machine learning approach, built on a Convolutional Neural Network (CNN), adapts to individual users' muscle patterns, achieving over 80% accuracy on rock-paper-scissors gestures after less than five minutes of calibration. The ~$200 system demonstrates that advanced prosthetic control can be accessible, offering a viable alternative to commercial solutions costing $20,000 or more. This work provides a framework for a system that could increase prosthetic adoption rates among upper-limb amputees.
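Before a CNN can classify gestures, the raw EMG stream has to be cut into fixed-length, normalized windows. The sketch below shows that preprocessing step; the window size, step, and sampling rate are illustrative assumptions, not the team's published parameters.

```python
import numpy as np

def segment_emg(signal, window_size=200, step=100):
    """Slice a 1-D EMG stream into overlapping windows for a CNN.

    window_size and step are assumed values for illustration only.
    """
    windows = []
    for start in range(0, len(signal) - window_size + 1, step):
        w = signal[start:start + window_size].astype(float)
        # Per-window normalization so amplitude differences between
        # users do not dominate the learned features.
        w = (w - w.mean()) / (w.std() + 1e-8)
        windows.append(w)
    return np.stack(windows)  # shape: (n_windows, window_size)

# Simulated 1-second burst of EMG samples at an assumed 1 kHz rate
rng = np.random.default_rng(0)
stream = rng.normal(size=1000)
X = segment_emg(stream)
print(X.shape)  # (9, 200)
```

Each row of `X` would then be fed to the CNN as one training or inference example during the user's calibration session.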
Advisor: Dr. Xuan Liu
Tracking/Recognition System
OmniTrack: Multi-Axis Tracking/Recognition System
Team
Korhan Gegcil CoE
Isbah Kurury EE
Saif Ali CoE
Tunmise Olayiwola EE
OmniTrack, a multi-axis tracking and recognition system, is a stationary tracking platform. It integrates a camera, Raspberry Pi, and software for real-time object recognition and tracking. The system features 180-degree rotation and tilt for enhanced adaptability. Key components include a motorized PTZ (Pan-Tilt-Zoom) camera mount, computer vision algorithms, and embedded control systems. This technology has applications in security, defense, automation, and safety, appealing to industries like military defense, autonomous robotics, and smart surveillance.
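The core of a pan-tilt tracker is a control loop that nudges the servos toward the detected object's center. A minimal proportional-control sketch, with an assumed gain and the 0-180 degree servo range mentioned above:

```python
def pan_tilt_step(bbox_center, frame_size, pan_deg, tilt_deg, gain=0.05):
    """One proportional-control step keeping a tracked object centered.

    bbox_center: (x, y) pixel center of the detected object
    frame_size: (width, height) of the camera frame
    gain is an assumed tuning constant, not the project's actual value.
    Returns updated pan/tilt angles clamped to the 0-180 degree range.
    """
    cx, cy = bbox_center
    w, h = frame_size
    # Pixel error from the frame center, scaled to a small angle correction.
    pan_deg += gain * (cx - w / 2)
    tilt_deg += gain * (cy - h / 2)
    clamp = lambda a: max(0.0, min(180.0, a))
    return clamp(pan_deg), clamp(tilt_deg)

# Object sits right of center: pan increases, tilt stays put.
pan, tilt = pan_tilt_step((400, 240), (640, 480), 90.0, 90.0)
print(pan, tilt)  # 94.0 90.0
```

In the real system this step would run once per frame, with the bounding box supplied by the computer vision pipeline on the Raspberry Pi.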
Advisor: Dr. Xuan Liu
Smart AR Glasses Attachment
Smart AR Glasses Attachment
Team
Jiahao Lin EE
Smartphone speech-to-text apps are very useful for Deaf and hard-of-hearing users, but the ergonomics of holding the phone or looking down at the screen can be awkward, inconvenient, and unsafe outdoors when walking. Consumer-grade AR headsets are currently too heavy and expensive to be feasible.
This project provides a low-cost smart AR glasses attachment that clips onto a normal pair of eyeglasses to display heads-up captions and navigation cues. It features a Seeed XIAO nRF52840 microcontroller, a 0.96" SPI OLED display, a Li-Po battery with USB-C charging, and a small right-angle glass prism that reflects the OLED image into the user's field of view. A custom Android app performs speech-to-text and Google Maps navigation on the phone, then streams compacted text and turn-by-turn directions over Bluetooth Low Energy using a lightweight UART-style protocol. The prototype demonstrates live captions and walking navigation arrows on the glasses while the phone remains in the user's pocket, enabling comfortable hands-free operation. Built from commodity components with a simple optical path, this sub-$100 attachment is a practical, accessible assistive device for hearing-impaired users and a flexible building block for future smart AR features such as notifications, calendar reminders, and context-aware suggestions.
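A UART-style protocol over BLE typically splits text into small frames that fit a notification payload. The sketch below illustrates one plausible framing scheme; the 20-byte limit reflects the default BLE ATT MTU minus overhead, and the one-byte type header is an assumption, since the project's actual protocol is not published.

```python
def frame_caption(text, mtu=20):
    """Split caption text into BLE-sized frames with a 1-byte header.

    Header b"T" marks caption text here; b"N" could mark a navigation
    cue. Both the header scheme and the MTU value are assumptions.
    """
    payload = mtu - 1  # one byte reserved for the type header
    data = text.encode("utf-8")
    return [b"T" + data[i:i + payload] for i in range(0, len(data), payload)]

frames = frame_caption("Turn left in 200 feet onto Main Street")
print(len(frames))  # 2 frames, each at most 20 bytes
```

On the microcontroller side, the firmware would strip the header byte, reassemble the payloads in order, and draw the result on the OLED.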
Advisor: Dr. Xuan Liu
Robotic Care Companion
Robotic Care Companion for the Elderly
Team
Taha Rizvi CoE
Collins Mba CoE
Jason Li EE
The goal of this project is to produce an AI-driven robotic companion. The robot will hold conversations, create alarms, contact people, and perform gestures. The design utilizes motors for movement, an LCD screen for displaying information, a microphone and speakers for communication, and a Wi-Fi-capable microcontroller for information processing. This product will help long-term hospital patients with certain hospital functions, such as contacting nurses or timing medication, while alleviating the loneliness that stems from isolation. The adaptable design and multilayered implementation also give users a range of use cases outside standard medicine.
Advisor: Dr. Xuan Liu
Pet Oriented Smart Home
P.O.S.H.: Pet Oriented Smart Home
Team
Thomas Scaturo EE
Malik Salim CoE
Kevin Flores-Camacho CoE
Hellen Montoya CoE
This project presents a smart, pet-oriented home system designed to simplify and secure pet care. The system features an RFID-enabled door lock, a real-time video monitoring camera, and an automated food dispenser. A motorized door mechanism and integrated dispenser form the core mechanical design, while the electronics rely on RFID sensors, cameras, and IoT modules for seamless operation. The modular software architecture offers local control over access and feeding schedules, minimizing manual oversight. Ensuring reliability remains central to the design. Potential markets include busy pet owners seeking convenience, veterinary clinics requiring supervised access, and kennels optimizing feeding schedules and security.
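The access-control and feeding-schedule logic can be sketched in a few lines. Everything here is illustrative: the tag IDs, feeding times, and window length are placeholders, not values from the team's implementation.

```python
from datetime import time

AUTHORIZED_TAGS = {"A1B2C3D4"}          # hypothetical pet collar tag IDs
FEEDING_TIMES = [time(7, 0), time(18, 0)]  # assumed schedule

def door_should_open(tag_id):
    """Unlock the pet door only for registered RFID tags."""
    return tag_id in AUTHORIZED_TAGS

def is_feeding_due(now, window_minutes=30):
    """True if the current time falls within a feeding window."""
    minutes = now.hour * 60 + now.minute
    for t in FEEDING_TIMES:
        start = t.hour * 60 + t.minute
        if start <= minutes < start + window_minutes:
            return True
    return False

print(door_should_open("A1B2C3D4"), is_feeding_due(time(7, 15)))  # True True
```

In the real system these checks would run on the local controller, keeping access and feeding decisions independent of any cloud service.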
Advisor: Dr. Xuan Liu
Rescue Vehicle
Electric-Powered Rescue Vehicle with Motion Detection/Cameras
Team
Carol Wisly Latif CoE
Jannatul Zenith CoE
The purpose of this project is to build a small prototype vehicle that can be used by emergency teams during fires, building collapses, or earthquakes to look for trapped survivors in spaces that are too dangerous or too tight for people to enter.
The vehicle is driven by an Arduino Uno and two motor drivers that independently control four DC motors, while Bluetooth allows commands to be sent from a phone or laptop. A Raspberry Pi with a camera and onboard sensors provides video streaming and basic sensing of the environment. All electronics are powered from a single 12 V battery using DC-DC converters to supply safe 5 V rails. We tested the system for different motion modes, Bluetooth range, camera performance, and sensor response. The vehicle was able to respond correctly to motion commands, operate reliably over typical indoor distances, stream usable video, and detect nearby motion. This platform offers a low-cost, modular base that can be extended with more advanced autonomous navigation and sensing in future work.
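A teleoperated vehicle like this usually maps short Bluetooth commands to left/right wheel speeds. The command set and PWM values below are assumptions for illustration; the team's actual protocol may differ.

```python
def parse_drive_command(cmd):
    """Map single-character Bluetooth commands to (left, right) wheel speeds.

    Speeds are in the -255..255 PWM range typical of Arduino motor
    drivers. The command letters themselves are an assumed convention.
    """
    table = {
        "F": (200, 200),    # forward
        "B": (-200, -200),  # backward
        "L": (-150, 150),   # spin left
        "R": (150, -150),   # spin right
        "S": (0, 0),        # stop
    }
    return table.get(cmd.upper(), (0, 0))  # unknown input -> safe stop

print(parse_drive_command("F"))  # (200, 200)
```

Defaulting unknown bytes to a stop command is a simple safety choice when the link is noisy or the operator's app misbehaves.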
Advisor: Dr. Xuan Liu
Fire Detection for Server Rooms
Advanced Fire Detection System for Server Rooms
Team
Gabriel Oliveira Drehmer CoE
Alp Hance CoE
With the rapid expansion of data-center infrastructure, the need for efficient computational and environmental management has become increasingly critical as cooling and equipment maintenance represent major operational expenses. This project introduces a low-cost, high-resolution monitoring solution designed to reduce waste and improve equipment reliability by providing appliance-level thermal visibility. Using DHT22 sensor probes, the system captures temperature and humidity data from the exhaust of individual network appliances and sends digital readings to a Raspberry Pi every two seconds, where they are displayed on a web-based administrative dashboard. Administrators can configure device-specific thresholds. When a threshold is exceeded, the system automatically triggers an MLX90640 thermal imaging camera to capture a snapshot and email it to the designated recipient. This $150 solution supports multiple probes and delivers device-specific measurements, offering a level of thermal monitoring not available in traditional rack or room-based commercial systems.
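The per-device threshold check at the heart of the system is straightforward: each polling cycle compares probe readings against dashboard-configured limits and flags the devices that exceed them. The probe names and data shapes below are illustrative assumptions.

```python
def check_thresholds(readings, thresholds):
    """Return the probes whose exhaust temperature exceeds their limit.

    readings: {probe_id: temp_celsius}, polled every two seconds
    thresholds: per-device limits configured on the admin dashboard
    (the structure and names here are assumptions, not the real schema)
    """
    return [pid for pid, temp in readings.items()
            if temp > thresholds.get(pid, float("inf"))]

readings = {"switch-1": 41.5, "router-2": 35.0}
thresholds = {"switch-1": 40.0, "router-2": 45.0}
alerts = check_thresholds(readings, thresholds)
print(alerts)  # ['switch-1']
```

Any probe returned here would trigger the MLX90640 snapshot and the email to the designated recipient described above.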