05/01/2025
By Danielle Fretwell
The Francis College of Engineering, Department of Electrical and Computer Engineering, invites you to attend a Doctoral Dissertation Proposal defense by Flore Stécie Norcéide on: "Neuromorphic Imaging and Processing for Tracking and Evaluation at the Edge (NIPTEE)."
Candidate Name: Flore Stécie Norcéide
Degree: Doctoral
Defense Date: Thursday, May 8, 2025
Time: 11 a.m. - 1 p.m.
Location: Falmouth 203, Center for Advanced Computation and Telecommunications
Committee:
Advisor: Kavitha Chandra, Eng.D., Associate Dean of Undergraduate Affairs, Electrical and Computer Engineering, UMass Lowell
Committee Members:
- Charles Thompson, Ph.D., Professor, Electrical and Computer Engineering, UMass Lowell
- Orlando Arias, Ph.D., Assistant Professor, Electrical and Computer Engineering, UMass Lowell
- Ian Humphrey, Technology Director, Department Lead, Technology Engineering, Raytheon, An RTX Business
Abstract:
Unmanned Aerial Vehicles (UAVs) equipped with electro-optical sensors are increasingly deployed in mission-critical environments for real-time situational awareness. These systems are constrained by size, weight, power, and cost (SWaP-C) requirements and often operate with unreliable network connectivity, adversarial interference, and limited onboard computing resources. The conventional frame-based vision sensors embedded in these systems require high-bandwidth connectivity and often produce redundant information.
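To make the data-volume contrast concrete, the sketch below is an illustrative assumption rather than part of the proposal: it compares the bytes produced by a dense frame stream against a sparse stream of (x, y, t, polarity) events emitted only where brightness changes, using synthetic data and a hypothetical 8-byte packed event format.

```python
import numpy as np

# Hypothetical illustration (not from the proposal): contrast a dense frame
# stream with the sparse (x, y, t, polarity) event stream a neuromorphic
# vision sensor would emit when only a small region of the scene changes.

H, W, FRAMES = 240, 320, 30          # assumed sensor resolution and frame count
rng = np.random.default_rng(0)

frames = np.zeros((FRAMES, H, W), dtype=np.uint8)
frames[:, 100:110, 150:160] = (      # a small 10x10 "target" patch that flickers
    rng.integers(0, 255, (FRAMES, 10, 10), dtype=np.uint8)
)

# Frame-based cost: every pixel is transmitted in every frame.
frame_bytes = frames.nbytes

# Event-based cost: emit an event only where the intensity change exceeds a threshold.
THRESHOLD = 25
events = []
for t in range(1, FRAMES):
    diff = frames[t].astype(np.int16) - frames[t - 1].astype(np.int16)
    ys, xs = np.nonzero(np.abs(diff) > THRESHOLD)
    for y, x in zip(ys, xs):
        polarity = 1 if diff[y, x] > 0 else 0
        events.append((x, y, t, polarity))

event_bytes = len(events) * 8        # assume an 8-byte packed event record
print(f"frames: {frame_bytes} bytes, events: {event_bytes} bytes "
      f"({len(events)} events)")
```

On a mostly static scene, the event stream scales with how much of the image actually changes, which is the property the abstract relies on for bandwidth reduction.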
This research investigates the application of neuromorphic vision sensors (NVS) onboard UAVs for moving target detection, classification, and tracking. Using the asynchronous, event-based output of NVS, a computational framework is designed to augment frame-based methods, reduce data volume, enhance temporal resolution, and improve mission performance under varying lighting and motion conditions. The proposed architecture integrates NVS with lightweight embedded computing systems capable of running real-time vision workloads locally. The contributions of this research include the application of machine learning algorithms for motion segmentation of event data to isolate mission-relevant targets from dynamic backgrounds; the deployment of energy-efficient, low-latency inference algorithms such as spiking neural networks and event-driven filters; and the development of a benchmarking methodology to evaluate SWaP-C trade-offs in hardware and software.
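As one concrete example of the event-driven filters mentioned above, the following sketch shows a common background-activity (spatiotemporal correlation) filter. It is offered as an assumption about the kind of low-latency technique involved, not as the candidate's implementation: an event is kept only if a nearby pixel produced an event within a short time window, which suppresses uncorrelated sensor noise before segmentation or classification.

```python
import numpy as np

# Minimal sketch (an assumption, not the candidate's method) of an event-driven
# background-activity filter: an event at (x, y, t) is kept only if a pixel in
# its 3x3 neighbourhood produced an event within the last DT microseconds.

H, W = 240, 320
DT = 5_000                              # correlation window in microseconds
last_ts = np.full((H, W), -np.inf)      # timestamp of the most recent event per pixel

def filter_event(x, y, t):
    """Return True if the event is supported by recent neighbouring activity."""
    y0, y1 = max(0, y - 1), min(H, y + 2)
    x0, x1 = max(0, x - 1), min(W, x + 2)
    neighbourhood = last_ts[y0:y1, x0:x1]
    supported = np.any(t - neighbourhood <= DT)
    last_ts[y, x] = t                   # update state regardless of the decision
    return supported

# Example stream: the second event has neighbouring support from the first and
# is kept; events with no recent neighbours are dropped.
stream = [(150, 100, 1_000), (151, 100, 2_000), (10, 10, 3_000)]
for x, y, t in stream:
    print((x, y, t), "kept" if filter_event(x, y, t) else "dropped")
```

Because the filter touches only a 3x3 neighbourhood per event, its cost scales with event rate rather than sensor resolution, which is the property that makes this class of filter attractive under SWaP-C constraints.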
The goal is to enable a deployable event-based vision system that allows UAVs to perform target detection and evaluation in compute- and bandwidth-limited settings. Applications of this system include tactical reconnaissance, environmental monitoring, search and rescue, and disaster response.