Smartphones have changed a lot since the introduction of the first iPhone in 2007. Today, these indispensable devices dictate how we communicate, how we record and share memories, and much of how we entertain ourselves. One area where these ubiquitous devices have come to show incredible amounts of promise is in medical diagnostics, where they could soon reshape how we detect diabetes, gauge female fertility or even spot skin cancers.
The processing power of today’s smartphones, along with their high-quality cameras and array of ever-improving sensors, makes them well-equipped to take on all kinds of tasks. When it comes to medical care, scientists from all fields are making exciting advances that leverage these capabilities to diagnose different conditions in more expedient ways. Let’s take a look at a few early-stage but highly promising examples.
A smartphone test for female fertility
For couples trying to conceive, knowing when a woman is ovulating can be key to success, and a smartphone-based saliva test could remove a lot of the guesswork. The technology was developed by scientists at Brigham and Women's Hospital and demonstrated in 2018, offering a possible alternative to urine tests or basal body temperature analysis as a way of gauging fertility.
The test consists of a glass slide onto which a saliva sample is deposited. As the fluid dries, it crystallizes into a fern-like structure with distinct patterns, a process known as salivary ferning, which can reveal which stage of the menstrual cycle the subject is in. While at-home kits exist for these tests already, the team wanted to remove the risk of human error by letting a smartphone do the heavy lifting.
The slide is inserted into an optical device that is placed over the smartphone’s camera. An artificial intelligence app then carries out analysis of the sample, with experiments showing that it could identify the ovulation phase of the menstrual cycle with 99 percent accuracy.
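The researchers haven't published the app's internals, but the general idea of scoring a dried-saliva image can be illustrated with a simple, hypothetical edge-density heuristic. Everything here, from the thresholds to the two-class output, is an illustrative assumption, not the actual algorithm:

```python
import numpy as np

def ferning_score(gray, threshold=30):
    """Estimate how 'fern-like' a grayscale slide image looks.

    Crystallized (ovulatory) saliva shows dense branching edges, so a
    crude proxy is the fraction of pixels with strong local intensity
    gradients. The threshold value is purely illustrative.
    """
    gray = np.asarray(gray, dtype=float)
    # Simple finite-difference gradients in x and y, cropped to match.
    gx = np.abs(np.diff(gray, axis=1))[:-1, :]
    gy = np.abs(np.diff(gray, axis=0))[:, :-1]
    edges = np.maximum(gx, gy) > threshold
    return edges.mean()

def classify_phase(gray, cutoff=0.15):
    """Flag the sample as 'ovulatory' when edge density is high."""
    return "ovulatory" if ferning_score(gray) > cutoff else "non-ovulatory"

# Synthetic demo: a featureless sample vs. a high-contrast pattern.
flat = np.full((64, 64), 100.0)
rng = np.random.default_rng(0)
ferned = flat + rng.choice([0.0, 80.0], size=(64, 64))

print(classify_phase(flat))    # → non-ovulatory
print(classify_phase(ferned))  # → ovulatory
```

The real system uses a trained AI model rather than a hand-set cutoff, which is precisely what lets it hit that 99 percent figure.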
Type 2 diabetes
This week we took a look at the results of a promising study in which a smartphone was used to detect type 2 diabetes with impressive accuracy, using nothing but the camera. The approach relies on photoplethysmography (PPG), a technique in which changes in blood volume are detected by shining light onto tissue.
A team from the University of California, San Francisco used this approach, with the help of a smartphone camera and flash, and a deep-learning algorithm trained on 2.6 million PPG recordings, to detect diabetes across three separate cohorts. The technique accurately detected diabetes in around 80 percent of subjects, and proved even more precise when basic patient data, such as body mass index and age, was factored in.
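The deep-learning part of the UCSF work can't be reproduced in a few lines, but the PPG signal it consumes is easy to illustrate: with a fingertip over the camera and flash, each heartbeat changes how much light reaches the sensor, so averaging each frame's brightness traces the pulse. This is a minimal sketch of the principle, not the study's pipeline:

```python
import numpy as np

def ppg_from_frames(frames):
    """Reduce a stack of RGB video frames to a 1-D PPG signal by
    taking the mean red-channel brightness of each frame.

    frames: array of shape (n_frames, height, width, 3).
    """
    frames = np.asarray(frames, dtype=float)
    return frames[:, :, :, 0].mean(axis=(1, 2))

def heart_rate_bpm(signal, fps):
    """Estimate pulse rate from the dominant frequency of the
    mean-centered PPG signal."""
    centered = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
    return 60.0 * freqs[spectrum.argmax()]

# Synthetic demo: a 72 bpm pulse filmed at 30 fps for 10 seconds.
fps, seconds, bpm = 30, 10, 72
t = np.arange(fps * seconds) / fps
pulse = 128 + 5 * np.sin(2 * np.pi * (bpm / 60) * t)
frames = pulse[:, None, None, None] * np.ones((1, 8, 8, 3))

print(round(heart_rate_bpm(ppg_from_frames(frames), fps)))  # → 72
```

The study's contribution was showing that a deep network can pull far subtler information than heart rate, namely vascular changes associated with diabetes, out of this same humble signal.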
Concussions
One way scientists are looking to improve the way we detect and treat concussions is by looking into the eyes, where erratic movements or an inability to track moving objects can be indicative of a brain injury. In 2017 we learned how smartphones could play a role in this, with scientists at the University of Washington developing an app for concussion detection called PupilScreen.
The app uses the smartphone’s flash to stimulate the eye, its camera to record a three-second video, and a deep-learning algorithm to detect changes in the way pupils respond to the light. In a pilot study involving 48 subjects, the team was able to use this approach to diagnose concussions with almost perfect accuracy.
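As a rough illustration of the kind of features such an app might extract from that three-second video, here is a hypothetical sketch measuring constriction amplitude and latency from a pupil-diameter trace. The 10 percent latency criterion is an assumption for demonstration, not PupilScreen's published method:

```python
import numpy as np

def pupil_response_metrics(diameters_mm, fps, flash_frame):
    """Summarize a pupil-diameter trace around a flash stimulus.

    Returns the constriction amplitude (mm) and latency (s): the time
    from the flash until the pupil has closed 10% of the way toward its
    minimum. Both are illustrative features, not PupilScreen's.
    """
    d = np.asarray(diameters_mm, dtype=float)
    baseline = d[:flash_frame].mean()   # pre-flash diameter
    trough = d[flash_frame:].min()      # most-constricted diameter
    amplitude = baseline - trough
    target = baseline - 0.1 * amplitude
    after = d[flash_frame:]
    latency_frames = np.argmax(after <= target)  # first frame at target
    return amplitude, latency_frames / fps

# Synthetic demo: 1 s of steady 6 mm baseline, then an exponential
# constriction toward 4 mm after the flash, all sampled at 30 fps.
fps, flash_frame = 30, 30
t_after = np.arange(60) / fps
trace = np.concatenate([
    np.full(flash_frame, 6.0),
    6.0 - 2.0 * (1 - np.exp(-t_after / 0.4)),
])
amp, latency = pupil_response_metrics(trace, fps, flash_frame)
print(round(amp, 2), round(latency, 3))  # → 1.99 0.067
```

In a concussed patient, features like these shift in subtle ways; the app's deep-learning model picks up on such shifts rather than relying on fixed formulas.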
Pancreatic cancer
Another way smartphones could bring about better health outcomes via the eyes is by picking up the early signs of pancreatic cancer, a disease whose symptoms often don’t appear until it is too late. A smartphone app, also developed at the University of Washington, could serve as a screening tool for the condition by detecting signs of jaundice, one of pancreatic cancer’s early symptoms.
More specifically, the app uses the smartphone’s camera and computer vision algorithms to search for elevated levels of a substance called bilirubin, which leads to the yellowing of the skin and eyes seen in jaundice. It does this by assessing the wavelengths of light that are absorbed by part of the eyeball, and in their testing, the scientists behind it were able to correctly detect these early signs of pancreatic cancer 89.7 percent of the time.
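The underlying principle is that bilirubin absorbs blue light, so a yellowed sclera reflects noticeably less blue than red or green. The ratio below is a loose, hypothetical stand-in for the app's calibrated wavelength analysis:

```python
import numpy as np

def yellowness_index(rgb_pixels):
    """Crude jaundice proxy for a patch of sclera pixels.

    A blue-channel deficit relative to red and green suggests yellowing.
    This formula is illustrative only, not the app's actual metric.
    rgb_pixels: array of shape (n, 3) with values in 0-255.
    """
    p = np.asarray(rgb_pixels, dtype=float)
    r, g, b = p[:, 0].mean(), p[:, 1].mean(), p[:, 2].mean()
    return (r + g - 2 * b) / (r + g + 2 * b + 1e-9)

# Synthetic demo: a near-white sclera patch vs. a yellowed one.
white_sclera = np.full((100, 3), [230.0, 230.0, 225.0])
jaundiced = np.full((100, 3), [230.0, 215.0, 140.0])

print(yellowness_index(white_sclera) < yellowness_index(jaundiced))  # → True
```

In practice the app has to correct for ambient lighting and camera differences, which is where most of the engineering effort goes.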
Skin cancer
Another type of cancer that smartphones may be able to catch earlier is skin cancer. Soon after the iPhone came out, researchers began to ponder how it could be used to help spot early-stage melanomas, and in 2017 we reported on a very exciting advance in the field from scientists at Stanford University.
The technology was powered by an artificial intelligence algorithm that uses deep learning to detect early-stage melanomas, after being trained on more than 100,000 images of skin lesions. The team then ran experiments comparing the system's performance to that of professional dermatologists, and found that it could classify skin cancers at a level comparable to the trained experts.
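Deep-learning classifiers learn their own features from those 100,000-plus images, but the classic ABCD rule dermatologists use gives a feel for the kinds of cues involved. Here is a hypothetical sketch of one such feature, asymmetry. To be clear, this is not how the Stanford system works internally:

```python
import numpy as np

def asymmetry_score(mask):
    """One hand-crafted feature from the dermatological ABCD rule:
    asymmetry. Benign moles tend to be roundly symmetric; melanomas
    often are not. Illustrative only.

    mask: 2-D boolean array, True where the lesion is.
    Returns the fraction of the lesion's footprint that does NOT
    overlap its left-right mirror image (0 = symmetric).
    """
    mask = np.asarray(mask, dtype=bool)
    mirrored = mask[:, ::-1]
    union = np.logical_or(mask, mirrored).sum()
    overlap = np.logical_and(mask, mirrored).sum()
    return 1.0 - overlap / union if union else 0.0

# A centered disc is mirror-symmetric; an off-center blob is not.
yy, xx = np.mgrid[0:64, 0:64]
disc = (yy - 32) ** 2 + (xx - 31.5) ** 2 < 200
blob = (yy - 32) ** 2 + (xx - 12) ** 2 < 200

print(asymmetry_score(disc))  # → 0.0 (perfectly symmetric)
print(asymmetry_score(blob))  # → 1.0 (no overlap with its mirror)
```

The advantage of the deep-learning approach is that it discovers far richer cues than this from raw pixels, with no hand-engineering required.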
HIV and syphilis
Back in 2015 we looked at a highly promising, multifunctional dongle that could be plugged into a smartphone to enable it to detect both HIV and syphilis. Developed at Columbia University, the platform relies on disposable plastic cassettes that are loaded up with reagents that can detect antibodies for both conditions, and needs just a drop of the patient’s blood to do it.
The team carried out field tests in Rwanda, in which it demonstrated that results could be turned around in just 15 minutes. The mobility, ease-of-use and manufacturing cost of the accessory, at just US$34, make this kind of approach a promising healthcare solution in developing regions, where access to these kinds of tests might be lacking.
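How a reader app might turn the cassette's raw optical readings into a result can be sketched with a simple threshold rule against the cassette's built-in controls. The 30 percent cutoff here is purely illustrative, not the dongle's published algorithm:

```python
def interpret_assay(od_sample, od_negative_control, od_positive_control):
    """Turn optical-density readings from an immunoassay zone into a
    reactive/non-reactive call.

    The decision rule (negative control plus 30% of the control span)
    is a hypothetical example, not the Columbia team's method.
    """
    span = od_positive_control - od_negative_control
    cutoff = od_negative_control + 0.3 * span
    return "reactive" if od_sample >= cutoff else "non-reactive"

# Hypothetical readings: strong signal vs. near-background signal.
print(interpret_assay(0.9, 0.1, 1.2))   # → reactive
print(interpret_assay(0.15, 0.1, 1.2))  # → non-reactive
```

Keeping the decision logic in software is part of what keeps the hardware so cheap: the disposable cassette and simple optics do the chemistry, and the phone does the thinking.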