Water Analyst Pocket Pro - Microplastic, Heavy Metal, and Inorganics Portable Water Detection System for Kidney Health

Project number
26056
Organization
Kidney ADVANCE Project - NIH/ACABI
Offering
ENGR498-F2025-S2026
Project Goal/Summary: The purpose of this project is to develop a functional, PORTABLE pocket/point-of-care analysis system to detect (1) microplastics, (2) heavy metals (lead, cadmium, arsenic, and mercury), and (3) inorganics (nitrate/nitrite and phosphates) – dangerous, health-threatening contaminants – in water or other similar ingestible fluids. This system will reduce personal and group water risk while serving as a tool for monitoring and protection. Clean Water for a Safe Future – Healthy Body and Kidneys!

Project Background: Increasingly, our water supply is being contaminated with pollutants from industry, waste disposal, and agriculture that are difficult to detect and difficult to remove. Despite attempts at community purification, contaminants slip through. Three classes of agents – (1) small plastic fragments and particles (plastic micro- and nanoparticles), (2) heavy metals (lead, cadmium, arsenic, and mercury), and (3) inorganics (nitrates/nitrites and phosphates) – are becoming prevalent in the environment, creeping into our water supply and finding their way into ingestible water, liquids, and foods. These materials present long-term hazards, not only to the individual but to society and to all higher animal life. Their particularly insidious nature lies in their durability and persistence – in water, in the environment, and in the body once ingested. These agents have been shown to induce a range of health consequences, including elevated cholesterol levels, liver and kidney abnormalities, altered thyroid function, effects on reproductive health, and certain malignancy risks. This project aims to develop a simple, portable, pocket/point-of-care system that may be dispersed in the community – at home and in regional labs – to assess potable water quality. A system of harmful-agent monitoring will provide a present and future view of the state of contamination and will lend itself to corrective actions. The data supplied may also be amassed to support Big Data and artificial intelligence approaches to this environmental and health problem for a safer future.

Requirements:
1. Jump-in review - Define microplastics, heavy metals, and inorganics: what are they, what end-organ biological damage can each induce, and what is their environmental prevalence and distribution in Arizona and around the U.S.?
2. Detection methods - For each contaminant group, define the range of methods that may be used to detect it, along with their sensitivity, ease of use, and cost-effectiveness.
3. Preferred build - Design a small-footprint device (textbook size at most; more like a tablet, cell phone, or voltmeter) with cassettes, integrating with a smartphone and connecting to a display/graphical user interface and the cloud.
4. Test strips/cassettes (for the device) - FOCUS ON PAPER MICROFLUIDIC or CHANNEL MICROFLUIDIC approaches as the preferred DETECTION METHODS. Detection goal: from one order of magnitude below to two orders of magnitude above the MCL (maximum contaminant level) for each contaminant, e.g., 0.0005 to 0.5 mg/L for cadmium (MCL 0.005 mg/L).
5. SMARTPHONE USE - For readout and data collection from strips/cassettes, processing, and streaming to the cloud and an AI TOOL.
6. Microplastics - In particular, the device will detect and size particles. The team can benefit from the work of prior Sr Design teams (2024 and 2025, BOTH award winning) using optics, a small laser, or electrostatic means for particle detection.
7. Report - Alignment/integration with environmental standards: the device/system will produce a standard-form readout and report in accordance with evolving health agency/public health reporting standards (e.g., AZ).
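The detection goal above (one order of magnitude below to two orders of magnitude above each MCL) can be sketched as a quick calculation. The MCL values below are illustrative EPA drinking-water figures in mg/L (the lead value is the EPA action level) and should be verified against current regulations:

```python
# Illustrative MCLs in mg/L (verify against current EPA regulations).
MCLS_MG_PER_L = {
    "lead": 0.015,      # EPA action level, used here as a screening target
    "cadmium": 0.005,
    "arsenic": 0.010,
    "mercury": 0.002,
    "nitrate": 10.0,
}

def detection_range(mcl: float) -> tuple[float, float]:
    """Return (lower, upper) detection bounds: MCL/10 to 100*MCL."""
    return mcl / 10.0, mcl * 100.0

for name, mcl in MCLS_MG_PER_L.items():
    lo, hi = detection_range(mcl)
    print(f"{name}: detect {lo:g} to {hi:g} mg/L (MCL {mcl:g})")
```

For cadmium this gives 0.0005 to 0.5 mg/L, bracketing the 0.005 mg/L MCL.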

Doctor AI - The Smart Patient Exam Room - AI Assisted History, Diagnostics and Patient Motion - New "Digital" Biomarkers for Improved Patient Care

Project number
26055
Organization
ACABI, supported by Craig M. Berge Dean's Fund
Offering
ENGR498-F2025-S2026
Project Background/Scope: The health care encounter – whether in the medical office, clinic, hospital, home, or field – is critical for obtaining vital information to guide and direct diagnosis and the delivery and accuracy of healthcare. Studies have shown that more than 70% of diagnoses stem from the physician or health worker carefully questioning and observing the patient. Sadly, patient encounters today have become shorter, with the physician hampered while performing an exam by the burden of electronic health record (EHR) data entry and computer use. Studies have also shown that many correct diagnoses are made by the doctor using information such as what the patient says and how they say it, how the patient looks and acts, how the patient behaves, how the patient sits, how the patient walks, and other information gained by focused, attentive, one-on-one patient observation and consultation. In routine doctor-patient interactions today, much of this information is not being observed or recorded and is lost!

This project will develop an AI-based tool system to be used in a “Smart Patient Exam Room” or any “exam space” to capture information that physicians often miss, AS WELL AS DEVELOP TOOLS THAT GO BEYOND – TO EXTRACT NEW INFORMATION – creating DIGITAL BIOMARKERS OF DISEASE. This project benefits from and builds upon the work of previous Sr Design teams, who built a basic system to capture sound and image, and from a dedicated room in COM-T allocated to this project by the medical school!

Requirements: I. Hardware – Re-up a kit for sound and visual capture (the team can benefit from the prior Sr Design team's kit). The kit should be both portable (for use in any space) and fixed (for use in the dedicated exam room in the medical school), and include high-fidelity microphones and cameras. II. Software/Computational Tools/AI – The team will build three sets of tools:
1. Voice to text Symptom Frequency Index and related Common/Keyword/other word Index Analysis Tools.
Step 1: Develop a dictionary of diagnostic terms from the medical “review of systems” (symptoms and signs) and from short recordings of patients with specific diseases – e.g., a patient with dyspnea might say, "I have been having a hard time catching my breath recently. I can't walk around the grocery store without stopping to catch my breath..." – all of this forming the analytic lexicon. Step 2: Voice-to-text and word frequency analysis – Using a speech-to-text system such as Whisper, transcribe the recorded audio to text. Analyze all words spoken and record their frequencies. Then compare the patient’s speech to the dictionary to identify all diagnostic terms/keywords, and create a rank-scale symptom and sign frequency index with endpoints such as: number of times a word was used over an entire conversation; % diagnostic term used = (# times keyword used) / (all words); inter-word frequency of specific terms; and other endpoints TBD.
Step 3: Speech semantic and sentiment analysis tools – Develop semantic and sentiment analysis tools to extract meaning and emotional content, yielding additional quantitative verbal biomarkers. Step 4: Standardized report – Display the data, upload it to the EHR or a secure cloud, and support printout. Keywords and endpoints can be graphed and compared across multiple visits to see how often the patient uses the diagnostic words; e.g., a patient with dyspnea reports being “short of breath” 15 times in their first visit and 9, 3, 1, and 0 times in subsequent visits, indicating improvement. Step 5: AI assessment of diagnosis and therapy – Determine a suggested diagnosis and next steps (diagnosis and therapy) by querying open and closed generative AI models (e.g., ChatGPT 5 and Claude vs. Llama).
2. Facial Affect/Mood/Happiness Indicator – Using cameras and facial recognition, the team will design a system to analyze mood and happiness based on facial expressive characteristics, grimace, and related visual expressive signs. AI and machine learning will be utilized to refine the diagnostic readout.
3. Motion analysis tool – The team will use Google MediaPipe to record patient motion, create a visual skeleton of the motion that can be played back, and develop basic quantitative gait information in terms of speed, symmetry, stability, sit-to-stand, and related variables that may be compared serially.
NOTE: For all tools the team will develop: (1) a record-keeping and display system for serial trend analysis, and (2) integration of raw and processed data into storage and recall from an electronic health record. The team will present data on a screen/heads-up display for easy serial analysis by the physician.
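A minimal Python sketch of the Step 1–2 keyword frequency index for tool 1 is shown below; the diagnostic-term dictionary and the transcript are illustrative placeholders, not the project's lexicon:

```python
import re
from collections import Counter

# Placeholder lexicon; the real dictionary would come from the medical
# "review of systems" and patient recordings.
DIAGNOSTIC_TERMS = {"breath", "dizzy", "chest", "pain", "swelling"}

def symptom_frequency_index(transcript: str) -> dict:
    """Count diagnostic keywords and compute the % of diagnostic terms used."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(words)
    keywords = {w: c for w, c in counts.items() if w in DIAGNOSTIC_TERMS}
    return {
        "total_words": len(words),
        "keyword_counts": keywords,                          # times each term used
        "keyword_fraction": sum(keywords.values()) / len(words),
    }

idx = symptom_frequency_index(
    "I have been having a hard time catching my breath recently. "
    "I can't walk around the grocery store without stopping to catch my breath."
)
print(idx["keyword_counts"])   # {'breath': 2}
```

The same per-visit indices could then be stored and graphed serially, as the NOTE above requires.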

Small UAV Doppler Navigation using Honeywell Atlas Automotive Radar

Project number
26054
Organization
Honeywell Aerospace
Offering
ENGR498-F2025-S2026
• Attach the provided ATLAS radar to a student-determined UAV
• Mount the radar to the UAV
• Using the provided ATLAS radar, measure the velocity of the UAV
• Compare a student-determined source of truth to the radar outputs (host velocity determined from radar measurements)
• Determine velocity successfully using the radar over multiple terrains (concrete, pavement, unpaved surfaces; stretch: water)
• Stretch goal: identify what type of surface the UAV is navigating over
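For background, the single-beam Doppler relation the radar exploits can be sketched as follows; the 77 GHz carrier is an assumption typical of automotive radar, not a confirmed ATLAS parameter:

```python
# Radial velocity from Doppler shift: v = f_d * c / (2 * f_c).
C = 299_792_458.0          # speed of light, m/s
F_CARRIER_HZ = 77e9        # assumed automotive-band carrier (check the datasheet)

def radial_velocity(doppler_shift_hz: float) -> float:
    """Radial velocity (m/s) toward the radar for a measured Doppler shift."""
    return doppler_shift_hz * C / (2.0 * F_CARRIER_HZ)

# A UAV moving at ~5 m/s along the beam produces this Doppler shift:
f_d = 2 * 5.0 * F_CARRIER_HZ / C
print(round(radial_velocity(f_d), 3))  # 5.0
```

In practice the radar estimates full host velocity by combining radial measurements across many beam angles; this sketch covers only one radial component.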

Unpowered, High Lift-to-Drag Hypersonic Projectile for Low Altitude Operations (Hyper-Shot)

Project number
26053
Organization
Lockheed Martin
Offering
ENGR498-F2025-S2026
Maneuvering a projectile at hypersonic flight conditions (M = 5 – 8) at low altitude operation introduces many design challenges, such as high heat flux and large mechanical loads. Through modern simulation and design tools, it may be possible to design highly maneuverable hypersonic systems for low altitude operation that can be economically manufactured using modern fabrication technologies, such as additive manufacturing.
The goal of this project is to explore the design space, identify a set of system specifications that optimizes a free-flying (unpowered) system for range and affordable manufacturability, and then design a concept that best achieves those specifications. The project will entail geometry/configuration definition using a CAD system; discipline analyses such as aerodynamics, aerothermal, structural sizing, mass properties, and stability & control; and vehicle sizing to meet mission requirements. The projectile will be launched at speed from an independent source, so onboard propulsion is not required. It can also be assumed that flight control is achieved through aerodynamic surfaces, reaction control jets, or a combination thereof. The requirements for the design include:

• Mach: 5 – 8
• Aerodynamically Stable and Thermally Survivable
• Optimized for Maximum Range and Affordable Manufacturing (Minimum Cost)
• Max length: 1.0 m
• Max Diameter: 0.1 m (including fins)
• Max mass: 10 kg
• Minimum Maneuverability: 15g turn
• Minimum Volume Efficiency: η = V^(2/3) / S_planform ≥ 0.12
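The volume-efficiency requirement can be checked numerically; the geometry values below are illustrative, not a proposed design:

```python
# Volume efficiency: eta = V**(2/3) / S_planform, required to be >= 0.12.
def volume_efficiency(volume_m3: float, planform_area_m2: float) -> float:
    return volume_m3 ** (2.0 / 3.0) / planform_area_m2

# Notional 1.0 m x 0.1 m body: planform ~0.1 m^2, volume ~0.006 m^3.
eta = volume_efficiency(0.006, 0.1)
print(f"eta = {eta:.3f}, meets requirement: {eta >= 0.12}")
```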

To validate the design, students will use lower order aerodynamics and/or CFD analysis to assess aerodynamic forces and moments predicted using engineering methods and will verify mission performance using flight trajectory analysis and optimization with inputs derived from discipline analyses. The team will perform wind tunnel tests at their university with a 3D printed prototype.
Additionally, transient aerothermal analysis of the projectile and assessments of location and thermal requirements for a sensor window will be performed. Pending results of the Preliminary Design Review in the fall semester, the spring semester will either focus on a refined design for a gun launch or perform extended testing to characterize dynamic fin deployment.

AI-Powered Hospital Supply Retrieval System

Project number
26050
Organization
Intelligent Clinical Systems
Offering
ENGR498-F2025-S2026
Background:
More than 5 million U.S. nurses regularly face delays accessing essential equipment in acute-care supply rooms, where every second matters. Even brief retrieval interruptions can jeopardize patient outcomes. A 2024 white paper (Vizzia & Georgia State) found nurses spend up to 60 minutes per shift searching for supplies, costing U.S. hospitals over $14 billion annually in lost productivity. Despite being stocked and labeled, supply rooms often function as chaotic, memory-driven “hunt-and-peck” environments that increase cognitive load, delay care, and contribute to clinician burnout. Across every shift in every hospital, this wasted search time adds up to a massive, systemwide drain on workflow efficiency and clinical responsiveness. Hospital supply rooms remain a technology desert, still largely untouched by the kind of intelligent, connected systems transforming nearly every other corner of healthcare.

Scope:
Our technology is an AI-powered retrieval system that reimagines hospital supply rooms as intelligent, high-performance clinical hubs.
In acute-care settings, where seconds can determine outcomes, our device uses advanced on-device intelligence and ambient computing to deliver hands-free, sub-10-second access to critical supplies. It adapts to both urgent events (such as a code blue) and everyday clinical workflows, turning a passive supply room into an active, responsive part of the care team.
Designed for speed, adaptability, and privacy, the system operates entirely at the edge—meaning all intelligence runs locally, without reliance on cloud connectivity. This ensures ultra-low latency and safeguards sensitive data while enabling reliable performance in even the most demanding clinical environments.
Our work blends cutting-edge AI techniques with efficient embedded-systems engineering, pushing the boundaries of real-time, privacy-preserving AI on resource-constrained hardware. The platform is built to be extensible, opening the door to smarter, faster clinical environments and to broader healthcare and other sensitive domains.



MediBrick: Dissemination and Expansion

Project number
26048
Organization
UA Department of Biomedical Engineering
Offering
ENGR498-F2025-S2026
Overview
MediBrick is an open-source platform developed at the University of Arizona to measure physiologic signals in a classroom setting.
The system consists of modules to measure ECG and impedance, blood pressure and heart sounds, blood oxygenation, temperature, body movement and activity, and air quality. Some modules are fully developed, while others are untested and under development.
This project shall address three aspects of the MediBrick:
- Dissemination path: a fully tested and debugged product that can be replicated by student engineers and researchers worldwide.
- Educational support materials explaining the measurement principles and the medical need addressed by each module.
- Expansion of the system to include:
  - A gas flow sensor to measure lung capacity inexpensively with a state-of-the-art clinical sensor.
  - An electric power controller for passive heating and cooling elements.

The current project home is: https://github.com/uutzinger/BioMedicalSensorBoard

Safety
All systems must meet electrical safety standards for measurements on human subjects.

Dissemination
Modules exist as functioning prototypes; however, their replication has not been attempted. Some modules have passed measurement evaluations and accuracy tests, while others have not yet measured a physiologic signal because their software is still under development. The team shall attempt replication and document it so that it can be repeated by other student engineers. If such instructions already exist, the team shall verify and improve them.

Education Support Materials
Educational support materials need to be developed with clearly stated learning goals. They shall describe the physical principles involved in generating the physiological signal as well as the principle used to measure it. They shall also include an explanation of medically relevant conditions and the purpose of conducting such measurements. If such educational material already exists for a module, it shall be verified and enhanced.

Modules
Modules shall be inexpensive. They shall interface with an ESP microcontroller board that is interchangeable among the measurement modules. Each module must be battery operated and able to charge its battery. It must be able to communicate over a BLE interface. The sensor module's physical dimensions must match those of the other modules. Each module must have one general-purpose button and one simple display for status information.

Flow Sensor
It must be able to measure gas flow of 250 standard liters per minute from the lung and interface with standard clinical intubation equipment. A potential solution is an interface to the SFM3300-D flow sensor from Sensirion.

Actuator / Power Control Module
The module must accept an input voltage of at least 24 V and modulate a current of at least 12 A using a MOSFET power switch. The controller shall be able to invert the current so that a Peltier element can either heat or cool. Self-heating shall be minimal. Power shall be adjusted through pulse-width modulation. The system shall be able to control at least two channels independently.
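As a rough illustration of the pulse-width-modulation requirement, average power into a resistive load follows P = D·V²/R. The element resistance below is an assumed value, and heat/cool inversion is assumed to be handled by an H-bridge:

```python
# Duty cycle needed for a target average Peltier power; sign selects direction.
V_SUPPLY = 24.0     # V, module input voltage
R_PELTIER = 2.0     # ohm, nominal element resistance (assumed value)

def duty_cycle(target_power_w: float) -> tuple[float, str]:
    """Return (duty 0..1, direction) from average power P = D * V^2 / R."""
    d = abs(target_power_w) * R_PELTIER / V_SUPPLY**2
    direction = "cool" if target_power_w < 0 else "heat"
    return min(d, 1.0), direction

print(duty_cycle(72.0))    # (0.25, 'heat')
```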

Documentation
Currently the project is housed on GitHub as a project of the sponsor. The team’s digital products shall be delivered either as a contribution to the existing project or as a new independent project. The project cannot be owned by the team or a team member alone. Other contributors shall be able to enhance and expand the project in the future, while the sponsor maintains an approver role.

Facilities, Equipment, and Supply Requirements
The work shall be conducted either in the Salter laboratory or the Engineering Design Center at the University of Arizona. No measurements on human subjects shall be conducted alone.

Generative ATC/Pilot Conversations for NextGen Avionics Systems

Project number
26047
Organization
Universal Avionics
Offering
ENGR498-F2025-S2026
Air Traffic Control (ATC) and pilot radio communications are essential for flight safety and operational efficiency. For testing avionics based on AI technology that supports aviation communication, there is a need for realistic, generative conversations between ATC and pilots that include not only the textual dialogue but also nuanced details like accent, emotion, and ambient noise.

This project will involve:
1. Reviewing authentic ATC/pilot communication recordings to understand common phrasing, call structures, and communication styles.
2. Annotating raw human audio recordings with metadata specifying accents, emotional tone (e.g., calm, stressed, urgent), and background noise characteristics (e.g., cockpit chatter, static, airport sounds).
3. Designing an AI-powered generative system capable of creating realistic ATC/pilot conversations that simulate believable real-world flight scenarios.
4. Developing methods to ensure variability and authenticity in generated conversations, including simulating diverse global accents and environmental conditions.
5. Building a prototype software tool that can generate and output both text transcripts and synthetic audio based on the annotated metadata.
6. Testing and validating the realism of generated conversations against expert evaluations (e.g., aviation professionals).
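One possible shape for the per-utterance annotation metadata in step 2 is sketched below; all field names, vocabularies, and the sample transcript are placeholders to be agreed upon with the sponsor:

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class UtteranceAnnotation:
    """Placeholder schema for one annotated ATC/pilot utterance."""
    speaker: str                 # "atc" or "pilot"
    transcript: str
    accent: str                  # e.g. "us_general", "uk", "indian"
    emotion: str                 # e.g. "calm", "stressed", "urgent"
    background: list = field(default_factory=list)  # e.g. ["static", "cockpit_chatter"]
    snr_db: float = 0.0          # rough signal-to-noise estimate

ann = UtteranceAnnotation(
    speaker="pilot",
    transcript="Tower, ready for departure.",
    accent="us_general",
    emotion="calm",
    background=["static"],
    snr_db=18.5,
)
print(json.dumps(asdict(ann), indent=2))
```

Serializing annotations as JSON keeps them usable both for training the generative system and for conditioning the synthetic audio output.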

Hypersonic Materials Characterization Apparatus

Project number
26046
Organization
Northrop Grumman
Offering
ENGR498-F2025-S2026
Background: Northrop Grumman Space Systems develops leading-edge aerospace technologies, especially in hypersonic flight. Launch vehicles in this regime encounter environments with ultra-high temperatures, extreme vibrational shock, and large dynamic/structural loads. Because their material temperature may change by thousands of degrees in a few seconds, designs require proper characterization of material strength at high temperatures. However, existing material-property data was generated from long-duration exposure at lower temperatures. This is the impetus for this project: a desire to obtain short-duration-exposure material properties at very high temperatures. This data will be used to assess material capability for hypersonics and drive design improvements.

Scope:
1. Work with Northrop Grumman engineers to understand the needs and desires for the delivered project.
2. Establish and maintain a project management plan, timeline, and budget.
3. Design a Hypersonic Materials Characterization Apparatus that displays sample information, tensile load, and temperature over time for common dog-bone sample geometries.
4. Analyze the design to withstand the applied loads and temperatures.
5. Perform a design-concept trade study on factors such as accuracy, instrumentation, repeatability, cost, manufacturability, scalability, and ease of modeling/testing.
6. Construct a working prototype of the designed apparatus ahead of an in-depth test campaign.
7. As part of the design effort, conduct a Preliminary Design Review (PDR) and Critical Design Review (CDR), which provide periodic updates and reviews by Northrop Grumman subject matter experts (SMEs) and leadership.
8. Demonstrate, via a variety of tests, that all specified design requirements are met by the proposed design.
9. Provide the final product to Northrop Grumman in the form of a working prototype and a final report chronicling design, analysis, and test results.
10. Present the final product at the Northrop Grumman Chandler Design Day fair with other Northrop Grumman-sponsored Arizona capstone teams, such as those from ASU, ASU Polytechnic, and NAU.

Automated Weight Bearing Ultrasound Foot Scanner: Version 3

Project number
26045
Organization
UA Department of Biomedical Engineering
Offering
ENGR498-F2025-S2026
The automated weight bearing 3D ultrasound foot scanner is a compact portable medical device that measures the stiffness of the structures supporting the arch of the foot. The foot scan results will be used clinically to screen patients at risk for impending arch collapse. The device consists of a medical grade platform, a servomotor actuator, an ultrasound probe, an ultrasound gel pad, a computer controller and a graphical user interface.

Automated On-Microscope Bioprinter for Live-Cell Culture and Imaging

Project number
26044
Organization
UA Department of Biomedical Engineering
Offering
ENGR498-F2025-S2026
Performance Requirements
The bioprinter must consistently meet specific performance benchmarks critical for successful cell culture:
• Temperature maintained at 37 ± 0.5°C with uniformity within 1°C
• CO₂ concentration maintained at 5 ± 0.2%
• Sterility maintained without contamination for multi-week experiments
• Fluid dispensing accuracy within ±5% volume
• Positional accuracy of needle placement within ±1 µm
• High-quality fluorescence imaging at 20×–40× magnification without significant degradation; secondary imaging resolution around 10 µm
• Reliability and safe failure mechanisms ensuring continuous uptime during critical experiments
Project constraints
• Budget: approximately $5,000 (part or all of which can be supplemented by sponsor funding)
• Final prototype must be safe and functional.
• The control software and user interface will be developed in Python (a required technology for this project), allowing flexibility in hardware interfacing and data analysis.
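Since Python is required for the control software, a minimal sketch of one temperature-control step for the 37 °C requirement is shown below; `read_temp` and `set_heater` are hypothetical hardware hooks, and only a proportional term is shown for brevity (a real build would likely use full PID with hysteresis and fault handling):

```python
# Proportional hold at the 37 C incubation setpoint.
SETPOINT_C = 37.0
KP = 0.5            # proportional gain, to be tuned on the real hardware

def control_step(read_temp, set_heater):
    """One control step: command heater duty 0..1 from the temperature error."""
    error = SETPOINT_C - read_temp()        # positive when chamber is too cold
    duty = max(0.0, min(1.0, KP * error))   # clamp to valid duty-cycle range
    set_heater(duty)
    return duty

# Simulated check: a 36 C reading yields a modest heating command.
print(control_step(lambda: 36.0, lambda d: None))  # 0.5
```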
