Domain: Research

Themes: Assistive & Accessible Technology

Measuring ultrasound waves to improve touch technology

With the escalating digitisation of the world around us, touchscreens are increasingly replacing buttons and other functional devices that are easy to feel. But touchscreens are not accessible for visually impaired people. PhD student Zak Morgan is measuring ultrasound waves, which will eventually feed into improving technologies that rely on touch.

Ultrasound technology has been used for many years and is best known in the medical field, where it is used to scan different parts of the body. In this context, an ultrasound device transmits sound waves into the body and creates images from the waves that echo back. Because the body is composed mostly of water, the use of ultrasound through water is well established. Far less is known about how to measure and make sense of ultrasound waves travelling through air. Yet being able to do so opens up the possibility of using ultrasound for haptic applications, in which the sound field creates a sensation – such as warmth or a vibration – that people can feel in mid-air.

In the labs at UCL, there are Phased Array Transducers (PAT boards) that enable research with ultrasound. These PAT boards carry more than 250 ultrasound transducers – a little like tiny speakers, except that they emit sound above the human hearing range. PAT boards have mostly been used for creating holograms or levitating small objects, but the technology has uses beyond this. By focusing the ultrasound at specific points, switching it on and off and adjusting its strength, researchers can create sound fields in which people feel vibrations on the skin when they place their hands inside. Tapping into this concept could be useful in future applications for visually impaired users.
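The focusing described above works by delaying each transducer's signal so that all the waves arrive at the chosen point in phase. The sketch below illustrates that idea; the specific parameter values (a 40 kHz carrier, a 16×16 grid with 10 mm pitch) are illustrative assumptions, not the specifications of the PAT boards used in this research.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
FREQUENCY = 40_000.0     # Hz, a common airborne-ultrasound frequency (assumed)
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(focal_point, pitch=0.01, n=16):
    """Return an (n, n) array of phase delays (radians) so that waves
    emitted by every transducer arrive at focal_point in phase."""
    # Transducer positions on a square grid in the z = 0 plane,
    # centred on the origin.
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    xs, ys = np.meshgrid(coords, coords)
    positions = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)

    # Distance from each transducer to the focal point.
    distances = np.linalg.norm(positions - np.array(focal_point), axis=-1)

    # A wave travelling distance d accumulates phase k * d, with
    # wavenumber k = 2π / λ. Emitting each element with phase -k * d
    # (mod 2π) cancels that accumulation, so all arrivals align.
    k = 2 * np.pi / WAVELENGTH
    return (-k * distances) % (2 * np.pi)

# Focus 20 cm above the centre of the board.
phases = focus_phases(focal_point=(0.0, 0.0, 0.2))
```

Because the focal point here lies on the board's central axis, the resulting phase map is symmetric about both axes of the grid; moving the focal point sideways skews the map, steering the focus.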

Phased Array Transducers (PAT boards)

Measuring ultrasound through air

Morgan's PhD therefore focuses on measuring ultrasound waves through the air, with the eventual aim of improving touch technology for everyone. The benefits for visually impaired users could be significant.

“One example of ultrasound haptic feedback that could benefit visually impaired people is in lifts. As the world becomes higher tech, we’re trending away from buttons that you can feel to call a lift; touchscreens are becoming more common instead. This isn’t accessible, as it’s just a flat touchscreen. But with more information about how ultrasound haptic feedback works, you could have a projection of haptic feedback that a user can feel before they touch the screen. That would help visually impaired users check they are in the right place before they select an option on the touchscreen.”

Morgan studied Computer Science at undergraduate and Master’s level at UCL, through which he met his supervisor, Dr Youngjun Cho. Having explored many different technologies through his degrees, he wanted to focus on broadening understanding about ultrasound haptic feedback, and gained funding for this PhD.

Throughout the research period, Morgan will use a variety of techniques to measure ultrasound. One of these is thermography. This works by directing ultrasound at a material: when the material absorbs the sound, it heats up. Morgan then uses a thermal camera to observe how the material's temperature changes, and this temperature change can be linked back to a pressure measurement. The theory is that different ultrasound waves will each have a different impact, and these measurements can eventually be used to create specific shapes or other sensations that people can understand through touch.
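The link between temperature change and pressure can be sketched with a simple lumped model: if a thin absorbing film converts incident ultrasound into heat, the initial rate of temperature rise seen by a thermal camera is proportional to the absorbed acoustic intensity, from which a plane-wave pressure amplitude can be estimated. The model and all the constants below are illustrative assumptions, not Morgan's actual calibration.

```python
import numpy as np

RHO_AIR = 1.2   # kg/m^3, density of air (approximate)
C_AIR = 343.0   # m/s, speed of sound in air (approximate)

def pressure_from_heating(dT_dt, areal_heat_capacity, absorptivity=1.0):
    """Estimate a plane-wave pressure amplitude (Pa) from the initial
    temperature slope dT_dt (K/s) of a thin absorbing film.

    areal_heat_capacity: film heat capacity per unit area, J/(m^2 K)
        (density * specific heat * thickness).
    absorptivity: fraction of incident intensity absorbed (0..1).
    """
    # Lumped model: early on, all heating comes from the sound field
    # (losses to the surroundings neglected), so the incident intensity
    # is I = C_area * dT/dt / absorptivity.
    intensity = areal_heat_capacity * dT_dt / absorptivity
    # Plane-wave relation between intensity and pressure amplitude p:
    # I = p^2 / (2 * rho * c), hence p = sqrt(2 * rho * c * I).
    return np.sqrt(2 * RHO_AIR * C_AIR * intensity)

# Example with assumed numbers: a film heating at 0.5 K/s,
# areal heat capacity 40 J/(m^2 K), perfect absorption.
p = pressure_from_heating(0.5, areal_heat_capacity=40.0)
```

In practice the material constants would have to be calibrated, and heat losses to the air mean only the earliest part of the temperature curve behaves this simply, which is part of what makes the measurement problem hard.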

Three example thermal camera images, with heat displayed in colour.
An example measurement using a thermal camera

However, applying the technology to something users can interact with will be a challenge. Materials can obstruct sound, so any device that uses haptic feedback would need to be built from acoustically transparent materials. Morgan is looking to cinemas for inspiration: cinema speakers typically sit behind the projector screen, which is perforated with tiny holes so that sound can travel through. Touchscreen displays would need the same principle, with the ultrasound device sitting behind a perforated display so that users feel the heat or vibrations it produces as they interact with the screen.

Another challenge Morgan will need to overcome is how skin reflects sound waves. Some of the ultrasound is bounced back when an individual touches the device, and the way people grasp and feel a device using haptic feedback causes a number of further reflections. These reflections alter the person's perception, so Morgan will need to take a range of different measurements to account for reflections from a user's hands.

Next steps

Morgan aims to have a complete set of measurements for multiple techniques during his PhD, which is due to run until September 2025. He would also like to create a system to optimise how ultrasound is generated in the lab environment. His aim is to have the research and development tools created, so that the process of applying this to assistive technologies is quicker and easier in the future.

“My focus is on enabling devices to become better overall,” said Morgan. “It can be tricky to gain funding for technology that is specifically for users with accessibility needs. But if you can make it useful for the general population, you can gain a lot more traction with ideas like this. For example, in a lot of new cars, the in-car control panel is often now a touchscreen. But that’s not great for people who are driving. You might get distracted, and you can't really feel that anything's happened when you interact with a touchscreen. But if you select an option on the touchscreen and you get some kind of feedback, like a noise or a vibration, that could be really helpful. So I think there could be a huge range of uses for the information I will gather through my research.”


Funded by: This work was supported by the Royal Academy of Engineering Chairs in Emerging Technology Scheme (CiET1718/14).