Apple Vision Pro Tumor Detection System
Combining RGB and near-infrared (NIR) sensors with immersive 3D visualization to identify tumors in augmented reality in real time. Created at the UIUC Electrical and Computer Engineering department.
Project Details
This tumor diagnostic system was created by me and a dedicated team of engineers at the University of Illinois at Urbana-Champaign. It pairs the Apple Vision Pro's advanced sensors with an NVIDIA Jetson Nano to detect tumors more accurately than existing intraoperative techniques. By combining RGB and NIR imaging with machine learning algorithms, the system highlights potential tumor regions in real-time AR overlays.
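To give a feel for the fusion idea, here is a minimal Swift sketch that scores a co-registered RGB/NIR pixel pair with a simple normalized-difference heuristic standing in for the trained model's output. The `FusedPixel` type, `tumorLikelihood` function, and 0.6 threshold are illustrative placeholders, not the production pipeline.

```swift
import Foundation

/// A pixel where the RGB and NIR streams have already been co-registered
/// to the same coordinates. Both channels are normalized to 0...1.
struct FusedPixel {
    let red: Double   // red channel from the RGB camera
    let nir: Double   // intensity from the near-infrared sensor
}

/// Maps a fused pixel to a rough "suspicion" score in 0...1 using a
/// normalized-difference heuristic. In the real pipeline a trained model
/// produces this score; the formula here is only a stand-in.
func tumorLikelihood(_ pixel: FusedPixel) -> Double {
    let denominator = pixel.nir + pixel.red
    guard denominator > 0 else { return 0 }
    let normalizedDifference = (pixel.nir - pixel.red) / denominator
    // Shift [-1, 1] into [0, 1] so the score reads like a probability.
    return (normalizedDifference + 1) / 2
}

// Example: build a binary highlight mask with an illustrative 0.6 threshold.
let frame: [FusedPixel] = [
    FusedPixel(red: 0.82, nir: 0.30),   // visibly bright, NIR-dark tissue
    FusedPixel(red: 0.35, nir: 0.78),   // NIR-bright region worth flagging
]
let highlightMask = frame.map { tumorLikelihood($0) > 0.6 }
print(highlightMask)   // [false, true]
```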
The system was developed in collaboration with leading oncologists to ensure clinical relevance. Our custom algorithms process the sensor data with sub-millisecond latency, enabling seamless AR visualization during surgical procedures.
Initial clinical trials showed 92% detection accuracy for tumors larger than 2 mm, a significant improvement over existing techniques. The system also connects to existing hospital DICOM systems for a seamless clinical workflow.
Key Features
- Real-time tumor detection with <1 ms latency
- Multi-spectral imaging combining RGB and NIR
- 3D tumor visualization with depth mapping (see the sketch after this list)
- HIPAA-compliant data processing
- Surgical navigation integration
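As a rough illustration of how a depth-mapped detection could be rendered as an AR overlay, the RealityKit sketch below places a translucent marker at a world-space position. `makeTumorMarker` and its parameters are hypothetical names; the real system drives this from the detection pipeline rather than hard-coded values.

```swift
import RealityKit
import UIKit

/// Builds a translucent AR marker for one detected region, assuming the
/// detection pipeline already produced a world-space position (from the
/// depth map) and an estimated radius in meters.
func makeTumorMarker(at position: SIMD3<Float>, radius: Float) -> AnchorEntity {
    // Semi-transparent red sphere scaled to the estimated tumor extent.
    let mesh = MeshResource.generateSphere(radius: radius)
    let material = SimpleMaterial(color: UIColor.red.withAlphaComponent(0.4),
                                  isMetallic: false)
    let marker = ModelEntity(mesh: mesh, materials: [material])

    // Anchor the marker at the depth-mapped world position so the overlay
    // stays registered to the patient as the wearer moves around the table.
    let anchor = AnchorEntity(world: position)
    anchor.addChild(marker)
    return anchor
}

// Example: a 5 mm marker 40 cm in front of the world origin; in a RealityView
// or ARView, this anchor would be added to the scene.
let marker = makeTumorMarker(at: SIMD3<Float>(0, 0, -0.4), radius: 0.005)
```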
Project Gallery
- System Design Overview
- Surgeon using the Vision Pro in the OR
- Sensor Housing and Components
Technical Implementation
Design Process and Research Documentation
Technical Challenges & Solutions
- Real-time processing: Developed custom Metal shaders for GPU acceleration of the image-processing pipelines
- Sensor fusion: Created novel algorithms to combine RGB and NIR data with sub-millisecond synchronization (the frame-pairing side is sketched below)
- Privacy compliance: Implemented on-device processing with no PHI leaving the Vision Pro
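The GPU fusion kernel itself is too long to reproduce here, but the Swift sketch below illustrates the CPU-side frame pairing that synchronization depends on: matching each RGB frame to the nearest NIR frame within a tolerance window. `Frame`, `pairFrames`, and the 0.5 ms tolerance are illustrative assumptions, not the shipped code.

```swift
import Foundation

/// One timestamped frame from either sensor stream. The flattened pixel
/// buffer and field names are illustrative assumptions.
struct Frame {
    let timestamp: TimeInterval   // seconds since capture-session start
    let pixels: [Float]
}

/// Pairs each RGB frame with the closest NIR frame captured within the
/// tolerance window, dropping frames with no sufficiently close partner.
/// Both streams are assumed to arrive in timestamp order.
func pairFrames(rgb: [Frame], nir: [Frame],
                tolerance: TimeInterval = 0.0005) -> [(rgb: Frame, nir: Frame)] {
    guard !nir.isEmpty else { return [] }
    var pairs: [(rgb: Frame, nir: Frame)] = []
    var nirIndex = 0
    for rgbFrame in rgb {
        // Walk forward until the next NIR frame would be later than this RGB frame.
        while nirIndex + 1 < nir.count,
              nir[nirIndex + 1].timestamp <= rgbFrame.timestamp {
            nirIndex += 1
        }
        // The best match is either the current NIR frame or the one after it.
        let candidates = [nirIndex, min(nirIndex + 1, nir.count - 1)]
        let best = candidates.min { lhs, rhs in
            abs(nir[lhs].timestamp - rgbFrame.timestamp) <
            abs(nir[rhs].timestamp - rgbFrame.timestamp)
        }!
        if abs(nir[best].timestamp - rgbFrame.timestamp) <= tolerance {
            pairs.append((rgb: rgbFrame, nir: nir[best]))
        }
    }
    return pairs
}
```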
Want to know more?
Interested in the technical details or exploring how these skills could translate to your team?
Contact Me!