Medical Minutes by John Schieszer
Artificial Vision Device Helping Those with Low Vision
A new portable artificial vision device is now helping people who are blind to read a message on an electronic device, a newspaper article or a menu. The promise of new software technology has moved from the scientists’ bench to helping the blind in a way never before possible. The new device, called OrCam, can recognise text, monetary denominations and faces, and can be programmed to recognise other objects. It includes a miniature camera and earpiece that can be mounted on the right side of any spectacle frame. A cord connects the unit to a pack housing the device’s battery and computer, which can be held in the hand, clipped to a belt or carried in a pocket. The device is activated by tapping it, pressing the trigger button or pointing at a target item, and it takes a picture of whatever the user is eyeing. Using optical character recognition technology, the device then reads aloud any text found in the picture, which is heard only by the user via the earpiece and not by others nearby.
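The capture-recognise-speak flow described above can be sketched in a few lines of Python. This is an illustrative outline only, assuming stub stages; the function names and the sample text are invented for this sketch and are not the OrCam's actual software, which the article does not detail.

```python
# Illustrative sketch of an OCR-to-speech pipeline of the kind the
# article describes. Each stage is a stand-in, not a real device API.

def capture_image(trigger):
    # On the device, a tap, button press or pointing gesture triggers
    # the spectacle-mounted camera; here we simulate a captured frame.
    return {"trigger": trigger, "pixels": "..."}

def recognise_text(image):
    # Optical character recognition would run here on the frame;
    # this stub returns a fixed example string (e.g. a menu line).
    return "Soup of the day: tomato"

def speak(text):
    # Audio goes only to the user's earpiece, not to bystanders;
    # here we simply return the utterance as a string.
    return "[earpiece] " + text

def read_aloud(trigger):
    # Full pipeline: capture, recognise, then read the text aloud.
    image = capture_image(trigger)
    text = recognise_text(image)
    return speak(text)
```

In a real system the recognition stage is the hard part; the pipeline shape itself is simple.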
The OrCam was recently made commercially available and costs approximately US$2,500-3,500. Researchers at the University of California Davis Eye Centre in Sacramento, California, evaluated the usefulness of the OrCam for patients with low vision. The study included 12 legally blind patients with best-corrected visual acuity of 20/200 or worse in their better eye. The researchers found that, after an initial training session, patients performed the tasks better (completing at least nine of the 10 items on the test) when using the portable artificial vision device. In the seven patients who also used other low-vision aids, average scores were higher with the device than with those aids.
Smartphones Aiding Lung Patients in New Ways
Researchers in Seattle have developed SpiroCall, a new health-sensing tool that can accurately measure lung function from anywhere in the world over a simple phone call. It is designed to work with older mobile phones and landlines, not just smartphones. Many people who have asthma, cystic fibrosis or other chronic lung diseases have no way of measuring how well their lungs are functioning outside of a clinic or doctor’s visit.
“We wanted to be able to measure lung function on any type of phone you might encounter around the world: smartphones, dumb phones, landlines and pay phones,” said Shwetak Patel, a professor of computer science and engineering and electrical engineering at the University of Washington in Seattle. “With SpiroCall, you can call a 1-800 number, blow into the phone and use the telephone network to test your lung function.”
In 2012, researchers from the UW’s UbiComp Lab introduced SpiroSmart, which allows individuals to monitor their lung function by blowing into their smartphones. The individual takes a deep breath in and exhales as hard and fast as they can until they can’t exhale any more. The phone’s microphone senses sound and pressure from that exhalation and sends the data to a central server, which uses machine-learning algorithms to convert the data into standard measurements of lung function. Over the last four years, the team has collected data from more than 4,000 patients who have visited clinics in Seattle and Tacoma, as well as in India and Bangladesh, where clinicians have measured lung function using both SpiroSmart and a commercial spirometer. That comparative data has improved the performance of the machine-learning algorithms.
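The server-side step described above, converting exhalation audio into standard lung-function measurements, can be sketched as follows. This is a minimal illustration under loud assumptions: the features (peak amplitude, mean energy) are crude stand-ins, and the regression weights are placeholders with no clinical meaning; the real SpiroSmart/SpiroCall models are machine-learning models trained on the clinic data the article mentions.

```python
# Sketch of mapping exhalation audio to spirometry-style estimates.
# Feature names and coefficients are invented for illustration only.

def extract_features(samples):
    # Stand-ins for the envelope/resonance features a real system
    # might compute from the microphone signal of an exhalation.
    peak = max(abs(s) for s in samples)
    energy = sum(s * s for s in samples) / len(samples)
    return {"peak": peak, "energy": energy}

def estimate_lung_function(features):
    # A trained regression model would be applied here; these
    # placeholder weights map features to FEV1 and FVC in litres.
    fev1 = 0.5 + 2.0 * features["peak"]
    fvc = 0.8 + 2.5 * features["peak"] + 0.1 * features["energy"]
    return {"FEV1_L": round(fev1, 2), "FVC_L": round(fvc, 2)}
```

The design point the article highlights is that this computation runs on a central server, so the phone itself only needs to transmit audio, which is why even landlines can be used.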
The research team also developed a 3-D printed whistle that can be used in conjunction with SpiroCall and that changes pitch as the patient exhales. The whistle is designed to address training challenges by enabling patients to hear what a “good” test sounds like. It may also improve test performance for patients who are very ill and whose vocal cords cannot produce enough sound for the phone’s microphone to pick up.
John Schieszer is an award-winning international journalist and radio and podcast broadcaster of The Medical Minute. He can be reached at firstname.lastname@example.org.