Fostering Industry Innovation with Expert Digital Tools
In late 2022, DPHARM (Disruptive Innovations to Modernize Clinical Research) sat down with Joan Severson, our Chief Innovation Officer, to discuss how sensor fusion expedites clinical research, and how to foster innovation in our industry through the expert design and deployment of digital tools.
Read the interview below as originally published by DPHARM.
What work is Clinical ink leading to disrupt and innovate clinical research?
My team is immediately focused on taking the data from mobile, sensors, and wearable technology and fusing it together to create a holistic picture of the patient. We do this by integrating data from the everyday context of a patient’s life – what we call “patient science” – and fusing it with more traditional clinical data. We’re at the beginning of something that will drastically change the industry. By fusing these wearable, sensor, and mobile data together, we are able to create very rich data models to better prove or more quickly disprove therapy efficacy – as well as understand an intervention’s effect on a patient’s quality of life. As we look to the future, my team is most excited about our work in tying passive data to active assessments, as well as possibilities that open up when working with patient-consented geofencing and GPS data integration.
What is the “patient science” data you’re referring to?
At Clinical ink, our vision is to advance clinical discovery by positioning ourselves at the convergence of data, technology, and patient science. Everyone in our industry uses phrases like patient-centered or patient centricity, but we believe the future of clinical trials is truly dependent on powering patients to become more active participants in science. We have an obligation to develop comprehensive, actionable, patient-centered assessments and measurements that put science back into the patient’s hands – or, as I like to say, on their wrists and in their pockets.
How are you addressing the challenges of integrating data from a patient’s life with clinical-grade data?
We’ve developed processes and platforms that are designed to support the evolving modalities of patient data capture both in the clinic and at home: voice, movement, lifespace, cognition, mood, activity, medication use, and biometrics. Our platform supports ingestion of data from our native applications in clinic and at home, as well as from third-party sources. As much as we would like to see everyone’s data formatted for ingestion, processing, and synchronization through industry data standards and APIs, the reality is we will probably always have cases where we need to do some analysis to bring data together and to create data models that support study operations as well as data analysis processes and pipelines.
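As a rough illustration of the kind of normalization step described above (this is a minimal sketch, not Clinical ink’s actual platform; the source names and field names are assumptions), heterogeneous records from different capture sources can be mapped into one common patient-centric schema before fusion:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Observation:
    """One normalized data point in a common patient-centric schema."""
    patient_id: str
    modality: str      # e.g. "voice", "movement", "mood"
    timestamp: datetime
    value: float

def normalize(record: dict, source: str) -> Observation:
    """Map a raw record from a given source into the common schema.
    The per-source field names here are illustrative assumptions."""
    if source == "native_app":
        # Hypothetical in-house format: epoch-second timestamps
        return Observation(record["patient"], record["modality"],
                           datetime.fromtimestamp(record["ts"], tz=timezone.utc),
                           float(record["value"]))
    if source == "third_party":
        # Hypothetical vendor format: ISO-8601 timestamps, string readings
        return Observation(record["subject_id"], record["measure"],
                           datetime.fromisoformat(record["collected_at"]),
                           float(record["reading"]))
    raise ValueError(f"unknown source: {source}")
```

Once every stream lands in one schema keyed by patient and timestamp, downstream fusion and modeling code no longer needs to know which device or vendor produced each point.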
How do you give an accurately weighted context to something gathered from a patient’s everyday life and put it into the scale used for clinical data?
It’s not necessarily about the weighting of data; it’s about fusing this incredibly rich data together to create an enhanced model of a patient’s condition and its effect on their quality of life. Our data science team works with patients, study sponsors, clinical teams, and our research collaborators to understand what impacts patients’ quality of life and how. At the end of the day, the platform needs to build a model of a patient quite similar to the one a physician builds in their chart each time they are presented with a patient. We do not intend to replace that physician, but we are building an additional model of the patient that lives in digital space.
Can you give an example of sensor fusion?
We’re doing this work every day. You can see it play out brilliantly in our public WATCH-PD work in collaboration with the University of Rochester, Biogen, Takeda, and the Critical Path Institute. In this study, participants used an iPhone and Apple Watch for over 12 months to complete active tasks and contribute passive behavioral data. Tasks included active psychomotor, cognitive, and voice tasks and ePROs (mood, fatigue, ADLs), alongside continuous passive behavioral data streams. These assessments were all aligned with the Unified Parkinson’s Disease Rating Scale (UPDRS). The interdisciplinary work of this study will aid in developing digital biomarkers to better understand Parkinson’s patient burden, so that the industry can target therapies and address quality of life.
How has your advanced technologies and digital biomarkers team deployed sensors and mobile devices for sensor fusion?
The team has been deploying sensors and mobile devices in clinical studies for nearly a decade, with the end goal of creating digital endpoints, or biomarkers. We have never been closer. We’ve partnered with some of the most respected pharmaceutical companies, research institutions, and government organizations to put consumer-grade devices in patients’ hands for research across various indications: Parkinson’s and movement disorders, oncology, COVID and other respiratory conditions, and rare diseases. We are currently conducting studies that tie mobile cognitive assessments to mobile voice capture to better understand and create digital biomarkers for long COVID. We’re excited to share more about this work.
How is sensor fusion technology enabling a better understanding of disease progression?
By coupling sensor fusion technology with disease-focused feature engineering and artificial intelligence, we are able to demonstrate that more frequently acquired, remotely monitored measures yield greater sensitivity to disease progression. We do this by capturing high temporal resolution sensor data via collection of both active mobile assessments – measuring mobility, cognition, and voice – and passive mobility data. From this abundant multivariate data, we have extracted features selective for disease status. We then feed those features into a random forest model and evaluate its classification accuracy in predicting disease status in an independent dataset. In the WATCH-PD study, our model was able to distinguish healthy volunteers from early-stage PD patients with 92% accuracy, providing preliminary support for the use of our platform in generating digital biomarkers of early-stage PD status.
How do you foster an innovative environment at Clinical ink and amongst your team?
By championing an interdisciplinary team with wide-ranging experiences, perspectives, and expertise. Every day, we apply principles from human factors, computer science, clinical practice, neuroscience, multivariate data analysis, user experience design – even aerospace engineering – to concept, develop, and deploy our technology. This creates an environment of constant problem solving and commitment to progress toward a common goal.
You have such a long history with digital tools, such as wearables and other mobile technologies, in clinical trials. What is your perspective on how they’re being used today in clinical research?
Sophisticated digital technologies are sorely underutilized in clinical research. Mobile sensors and wearable technology now afford us a promising class of more objective, performance-related research tools that complement eCOA and ePRO. We can now take advantage of ubiquitous, patient-friendly mobile and wearable technologies. We need to do it more and tap into the power of responsive user experience design that can support patients to enable more diverse participation, create better engagement, reduce anxiety, and build trust.
What is your chief goal or north star as Clinical ink’s Chief Innovation Officer?
To provide robust solutions in the sensors, wearables, and digital biomarkers space. And my standard for that is very high. This is not about throwing a wearable into a trial and hoping for the best; it’s about ensuring we offer the translational, applied research expertise to guarantee success and spark true innovation.
Joan Severson, Chief Innovation Officer, Clinical ink