Digital devices permeate our lives. For most of our waking hours, we sport mini-computers strapped to our wrists or tucked into a pocket. We rely on these gadgets for an ever-expanding array of tasks, and without them (or their sufficiently juiced batteries), modern life can come to a screeching halt. It seems almost natural, then, that we ask: How can these digital companions make us healthier — can they warn us of an impending heart attack, say, or predict the early stages of Alzheimer’s disease?
That’s precisely the goal of a new wave of research that seeks to harness the vast amounts of data that are collected passively by personal devices, such as Fitbits, Apple Watches, and smartphones, throughout the day. For example, how often does an individual stand up and move around? How far does she typically walk and for how long? What is her gait like? How often does she leave the house or make phone calls? How frequently does she send text messages? How fast does she type?
To sort through the “big data” that flow from these devices, researchers are turning to different forms of artificial intelligence, or AI, to create a kind of “digital phenotype” that could help monitor patients’ health over time. For example, a team in Boston is leveraging this approach to study patients who have recently undergone treatment for brain cancer. In many forms of the disease, the post-treatment standard of care is fairly straightforward: patients receive an MRI scan every few months to check for complications or recurrence. The Boston researchers are now looking for ways to stratify these patients, using digital phenotyping, for example, to help pinpoint those who require more aggressive interventions. They and others believe that smartphones will become another tool in the clinician’s toolbox, helping to identify acute changes in patient behavior that indicate a decline in health.
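To make the idea of a “digital phenotype” concrete, here is a minimal sketch of how passively logged activity records might be aggregated into a small set of behavioral features. The data, field names, and thresholds below are all illustrative assumptions, not taken from any specific device or from the Boston team’s actual pipeline.

```python
from datetime import datetime
from statistics import mean

# Hypothetical raw records: (timestamp, steps_in_interval) pairs, as a
# smartphone might log passively throughout the day.
records = [
    (datetime(2024, 1, 1, 8), 500),
    (datetime(2024, 1, 1, 12), 1200),
    (datetime(2024, 1, 2, 9), 300),
    (datetime(2024, 1, 2, 18), 2000),
]

def daily_step_totals(records):
    """Aggregate interval step counts into per-day totals."""
    totals = {}
    for ts, steps in records:
        day = ts.date()
        totals[day] = totals.get(day, 0) + steps
    return totals

totals = daily_step_totals(records)

# A toy "digital phenotype": a handful of summary features per user.
# The 1,000-step activity threshold is an arbitrary example value.
phenotype = {
    "mean_daily_steps": mean(totals.values()),
    "active_days": sum(1 for v in totals.values() if v > 1000),
}
print(phenotype)  # {'mean_daily_steps': 2000, 'active_days': 2}
```

In a real system, features like these would be tracked over weeks or months, so that a sudden drop in mobility or communication could flag a possible decline in health.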
A sweeping, NIH-funded consortium spanning 13 universities and non-profit organizations is pursuing similar goals. But first, it is addressing some of the shortcomings of the current generation of commercial wearable devices. These devices gather only a few types of user health data and do not display raw sensor data. Because of these and other limitations, they are not well suited to the kinds of research required to determine how wearables can be leveraged to predict disease and improve health.
Thus, the research team has designed a series of wearable devices that can collect diverse types of sensor data and operate for a full day on a single battery charge. These include a watch-style gadget that decodes hand and arm movements and measures not just heart rate — as most of today’s wearables do — but also heart rate variability; a micro-radar sensor to enable the detection of heart and lung activity without the need for skin contact (via electrodes); and a set of “computational eyeglasses” that provide real-time eye and gaze tracking. The team has also created smartphone apps that can connect wirelessly
to their devices to collect sensor data and generate a digital profile of the user’s health. With these tools, the investigators are tackling a range of health problems, including addiction, cigarette smoking, heart failure, and obesity. However, the approach is broadly applicable and could also help unearth
insights into a range of other conditions. Importantly, the team’s work is open-source, so wearable device developers could leverage this work to build new sensors and apps for their own products.
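The distinction the team draws between heart rate and heart rate variability (HRV) can be illustrated with a short calculation. A common time-domain HRV metric is RMSSD, the root mean square of successive differences between inter-beat (RR) intervals. The interval values below are made-up example numbers, and this is a generic textbook formula, not the consortium’s own algorithm.

```python
import math
from statistics import mean

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals,
    a standard time-domain heart rate variability metric."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative inter-beat (RR) intervals in milliseconds.
rr = [800, 810, 790, 805, 795]

# Average heart rate in beats per minute: 60,000 ms / mean RR interval.
heart_rate_bpm = 60000 / mean(rr)

print(round(heart_rate_bpm))      # 75
print(round(rmssd(rr), 1))        # 14.4
```

Two users could show the same 75 bpm average heart rate while having very different RMSSD values, which is why HRV is a richer signal than heart rate alone.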
For more information about Dr. Arnaout’s research, please contact Partners HealthCare Innovation.