
How smart data management will allow biopharma to embrace the digital twin


If the value of digital transformation wasn't clear before, the pandemic has been a wake-up call for its critical importance, both in advancing innovation and in navigating the unexpected. Life sciences companies that were already fully digitized have had distinct advantages: they kept their research moving forward during the pandemic, and some pivoted quickly to developing new treatments and vaccines for Covid-19. But the advantages of full digital transformation are even larger and longer-lasting, and they will pave the way for new techniques and capabilities that fundamentally change how science is done.

An increasingly compelling benefit of digital transformation, across numerous industries, is the ability to construct a digital twin: a full in silico replica of a real-life structure, instrument or process. Organizations outside of biopharma, from NASA to GE to Boeing, have been using the concept for years. The aviation industry uses digital twins to model engine operations and predict their function and service needs over time. It's an accurate, cheaper (and of course safer) way of anticipating future events.

For any industry, digital twins are an incredibly efficient way of understanding processes and getting ahead of events outside normal operation. They have been propelled by incredible advances in the Internet of Things (IoT) and Artificial Intelligence (AI), the driving forces behind industrial automation and the smart factory concept. In 2020, the global digital twin market was valued at $3.1 billion, and it is expected to grow to more than $48 billion by 2026. Perhaps not surprisingly, the pandemic spurred an increase in demand for digital twins in the healthcare and pharmaceutical industries.

For the bioprocess industry, the digital twin also holds enormous potential. We've been automating more and more over the years, but the concept of creating a digital replica of processes in the lab, and perhaps, one day, of entire labs or manufacturing plants, is extremely appealing. Since the path from discovery to market still takes up to 15 years and costs billions of dollars, strategies that predict how a drug will perform, the safety issues it may involve, and how it can be scaled for manufacture can significantly improve these processes and reduce the time it takes to reach the market and the patient.

For several years now, the industry has been working toward the digital twin. But there are important differences between jet engines and drug development, particularly for biologics: biological processes are extraordinarily complex, and we don't yet have a full understanding of the countless nuances of human cells. It's difficult to model and simulate what you don't fully know, but we're getting closer every day.

One way we can get there is by collecting and collating data across conditions. Perturbing cellular environments in a multitude of ways, or changing conditions in the processes, instruments and workflows and measuring the outcomes, builds the vast mountain of data needed to begin creating accurate models.
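As a toy illustration of this idea, the sketch below fits a simple predictive model to a table of deliberately varied process conditions and their measured outcomes. The file name, column names, and model choice are all hypothetical assumptions for illustration; this is a minimal sketch, not any specific vendor's pipeline.

```python
# Minimal sketch: learn an outcome from perturbed process conditions.
# All file and column names below are hypothetical examples.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Each row is one experiment: the conditions we varied plus the measured outcome.
runs = pd.read_csv("bioreactor_runs.csv")  # hypothetical dataset
conditions = runs[["ph", "temp_c", "feed_rate_ml_h", "dissolved_o2_pct"]]
outcome = runs["titer_g_l"]  # e.g., product titer in g/L

train_x, test_x, train_y, test_y = train_test_split(
    conditions, outcome, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train_x, train_y)

# Held-out accuracy indicates how well the model generalizes to unseen conditions.
print(f"R^2 on held-out runs: {model.score(test_x, test_y):.2f}")
```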

Making sense of large volumes of data once you have it is no simple feat, particularly if your company isn't fully digital. The reality is that approximately 50% of companies still primarily use Excel or even paper to record data, and even partially digitized companies often keep data in silos, lakes, or warehouses that are not fully integrated.

Therefore, we need to shift away from the old systems of data storage and management toward one that curates data and provides integration and contextualization in a way that's easy to understand. Adopting this type of system, a biopharmaceutical lifecycle management (BPLM) system, not only enriches data collection and analysis but also makes possible the creation of digital twins in biopharma.

The key is that a BPLM creates a comprehensive data backbone across the entire development lifecycle: data is contextualized as it is collected, at the point of generation, across multiple stages and processes. Data from different instruments, workflows, and departments can then be integrated in an efficient and structured manner. Importantly, a substantial source of wasted time and money in R&D, rework due to lost data, is significantly reduced. The final outcome is that data is vastly easier to understand and gain insight from.
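To make "contextualization at the point of generation" concrete, here is a minimal sketch of what a contextualized record might look like: every measurement travels with the metadata that gives it meaning. The field names are illustrative assumptions, not a real BPLM schema.

```python
# Minimal sketch of a contextualized measurement record.
# Field names are illustrative, not a real BPLM schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ContextualizedMeasurement:
    parameter: str        # what was measured, e.g. "titer"
    value: float          # the raw reading
    unit: str             # e.g. "g/L"
    instrument_id: str    # which instrument produced it
    workflow_step: str    # e.g. "fed-batch day 7"
    batch_id: str         # links measurements across stages and departments
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Because context travels with the value, downstream integration becomes a
# simple join on batch_id or workflow_step instead of forensic rework.
sample = ContextualizedMeasurement(
    parameter="titer", value=2.4, unit="g/L",
    instrument_id="HPLC-03", workflow_step="fed-batch day 7",
    batch_id="B-2021-117",
)
print(sample)
```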

Again, an important benefit of a BPLM system is that it makes the digital twin possible: captured data can "train" digital twins and make them valuable predictive and even troubleshooting tools. As we develop ever more sophisticated therapies and technologies, the digital twin will become even more valuable. mRNA vaccine development, for example, is particularly well-suited to the approach, given the complexity of the many phases involved; a digital replica can help optimize development. One clear example is that scientists can tweak the mRNA code in silico as new variants emerge, creating an updated version of the vaccine to serve as a booster.
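Continuing the earlier sketch, once a surrogate model has been trained on captured data it can be queried in silico instead of running a wet-lab experiment. The parameter grid below is a hypothetical example, not a recommended operating range.

```python
# Minimal sketch: query the trained surrogate "twin" across candidate
# conditions in silico. Parameter ranges below are hypothetical.
import numpy as np
import pandas as pd

grid = pd.DataFrame(
    [(ph, temp, feed, do2)
     for ph in np.arange(6.8, 7.3, 0.1)
     for temp in (35.0, 36.5, 37.0)
     for feed in (5.0, 7.5, 10.0)
     for do2 in (30.0, 40.0)],
    columns=["ph", "temp_c", "feed_rate_ml_h", "dissolved_o2_pct"],
)

# "model" is the RandomForestRegressor fitted in the earlier sketch.
grid["predicted_titer"] = model.predict(grid)
print(grid.nlargest(3, "predicted_titer"))  # top candidates to verify in the lab
```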

Another benefit of the digital twin is that it opens the way for smaller biotechs: if a greater portion of the work is done digitally and predictively, the time and cost of bringing a drug to market decrease. In this way, the playing field evens out for promising younger companies that want to compete with the giants, and since some of these smaller companies bake their digital strategy in right from the beginning, it gives them an advantage as they grow. But for a company of any size, the digital twin reduces costs and improves efficiency: reduced work and rework alone can save millions of dollars.

In essence, a BPLM sits at the center of our movement into the future and into biopharma 4.0. By collecting and analyzing data in a totally new way, the BPLM is a "hub" that facilitates integration across systems through the Internet of Things and commoditizes AI and advanced analytics. Partners will need to work together to fully realize this integration: first across lab instruments, then across labs and facilities, and one day across disciplines.

There will almost certainly be rapid acceleration in this digital direction over the next decade, both in industry and in everyday life. We've already enjoyed it: smartphone apps are constantly predicting, and even ordinary cars alert their drivers if they try to change lanes without signaling or get too close to the car ahead. That's a twin, constantly and silently predicting and adjusting. Bioprocess will get there too, starting small and moving toward a digital twin for everything. The caveat is that simulations are by definition not 100% exact, but we can get closer and closer, not only by building physical effects and natural laws into the models, but also by collecting more contextualized data to refine them. The quality of the data will always determine the quality of the simulation. This is true modeling in real time, and it will provide unprecedented predictive power in the years ahead.
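As a final toy sketch, refinement with streaming data can be as simple as an incremental learner updated batch by batch, so the simulation keeps tracking the real process. The data here is simulated, and the model choice is an assumption for illustration only.

```python
# Toy sketch: refine a "twin" incrementally as new contextualized data
# arrives. Data and model choice are illustrative only.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
twin = SGDRegressor(random_state=0)
true_effects = np.array([1.5, -0.5, 2.0, 0.3])  # pretend ground truth

# Each loop iteration stands in for a new day's batch of measurements
# streaming in from the plant floor.
for day in range(30):
    conditions = rng.uniform(0.0, 1.0, size=(20, 4))           # simulated inputs
    outcomes = conditions @ true_effects + rng.normal(0, 0.05, 20)
    twin.partial_fit(conditions, outcomes)                     # refine the twin

# The more good data the twin sees, the closer it tracks the real process.
print("learned effects:", np.round(twin.coef_, 2))
```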



Alberto Pascual

Alberto Pascual holds a PhD in Bioinformatics and has extensive experience in data science and biomedical domains. A computer science graduate, he received his PhD in Bioinformatics from the Autonomous University of Madrid. After a postdoctoral period, he joined the National Center for Biotechnology (CNB), where he led a bioinformatics research and core facility group. In 2003 he co-founded Integromics, a bioinformatics startup that received the Frost & Sullivan European Bioinformatics Project of the Year Award in 2007. The company was acquired by PerkinElmer in 2014, after which he served as Senior Manager of AI and Analytics Innovation. He is currently the Director of Data Science and Analytics at IDBS (Danaher group), where he is responsible for developing and executing IDBS' strategy on AI, data science and analytics to deliver novel products and services to customers in life sciences R&D markets.

