Winner Yale Healthcare Hackathon 2017
Role: Team Leader, Mechanical Engineer
Team: 6 participants (Radiology Resident, Electrical and Software Engineers, Clinical Researchers)
Skills: Concept Generation, User Research, Rapid Prototyping
Tools: Foam Sheets, Hot Glue
At the Yale Healthcare Hackathon 2017, I teamed up with a radiology resident, two undergraduate students with clinical research experience, and two electrical and software engineers from a healthcare software startup. Our group targeted the work-related pain that plagues ultrasound technologists. We focused on improving the positioning of the technologist's body and hand through a redesigned probe and a repositioned vision system. I led my team through analogous and competitor product research and multiple concept generation sessions to identify a concept to pursue, and I oversaw creation of the presentation and business plan while owning design and execution of the physical prototype. We received the Yale School of Medicine Department of Radiology and Biomedical Imaging Award for our ultrasound system redesign.
On the left, team members demonstrate the awkward hand positions the user is forced into while targeting certain anatomy and looking at the ultrasound screen. On the right, they demonstrate the improved grip options of our probe prototype and the use of a Microsoft HoloLens vision system to project the ultrasound image into the technician's field of view. The technician will still need to turn and use the tower system from time to time, but can keep their stance and gaze on the target anatomy during the majority of the procedure.
We interviewed the radiology resident on our team and found that there are three most commonly used controls on the ultrasound tower: gain (or brightness), depth of the image, and capture (photographs and video). I designed a three-button system for the probe. A front button would toggle between gain, depth, and capture modes. The arrow buttons would allow the user to adjust gain and depth, and in capture mode would trigger a photo capture (bottom arrow) or video capture (top arrow). Relocating these controls to the probe removes the reliance on the tower system and allows the technician to focus on the anatomy they are visualizing.
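The interaction logic described above can be sketched as a small state machine. This is a minimal illustrative model, not the firmware we built at the hackathon; the class name, starting values, and return strings are all assumptions made for the sketch.

```python
from enum import Enum

class Mode(Enum):
    GAIN = 0
    DEPTH = 1
    CAPTURE = 2

class ProbeButtons:
    """Hypothetical model of the three-button probe interface."""

    def __init__(self):
        self.mode = Mode.GAIN
        self.gain = 50   # arbitrary starting values for illustration
        self.depth = 50

    def press_front(self):
        # Front button cycles gain -> depth -> capture -> gain
        self.mode = Mode((self.mode.value + 1) % 3)

    def press_up(self):
        # Top arrow: increase the active setting, or start a video capture
        if self.mode is Mode.GAIN:
            self.gain += 1
        elif self.mode is Mode.DEPTH:
            self.depth += 1
        else:
            return "video"

    def press_down(self):
        # Bottom arrow: decrease the active setting, or take a photo
        if self.mode is Mode.GAIN:
            self.gain -= 1
        elif self.mode is Mode.DEPTH:
            self.depth -= 1
        else:
            return "photo"
```

The key design choice is that one toggle button plus two arrows covers all three tower controls, so the technician never has to reach back to the tower for routine adjustments.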