Big Pharma relies on Big Data to combat the research slump
Digitalization is becoming a source of hope: new analyses, networking, and larger amounts of data could give a boost to drug development.
“Data, data everywhere, but no drugs,” lamented Jerry Karabelas, the former head of pharmaceuticals at the Swiss Novartis Group. At that time, around the turn of the century, the pharmaceutical industry had been digging through the flood of data from the booming genomics field more or less in vain. The result was a long period of disillusionment.
Almost two decades later, a new flood of data has hit the pharmaceutical industry. This time, industry representatives are convinced, the data could have a more lasting effect. Vas Narasimhan, Karabelas’s successor and current head of Novartis, speaks of a “digital revolution” and sees the Swiss pharmaceutical giant in the midst of transforming into a “medical and data science company.”
Meanwhile, Roche, a competitor based in Basel, Switzerland, is investing billions in specialized data and analytics companies such as Foundation Medicine, a US-based enterprise, and wants to “supplement pharmaceutical diagnostics with a third axis, data management,” as Roche CEO Severin Schwan phrases it. Tina Lupberger, head of strategy and innovation of Pfizer’s oncology division, is already talking about a paradigm shift for the pharmaceutical industry. “If we don’t make a move, others will,” she warns. And Big Pharma would be pushed into the role of a mere supplier.
So far, the industry has mostly been viewed as a laggard when it comes to digitalization. But the process of catching up has begun. The topic is therefore of the utmost importance, not only at Pfizer, Roche, and Novartis, the leading trio of the pharmaceutical industry, but for all companies in the life sciences. Outside experts put the financial potential at a considerable sum: McKinsey, for example, projects a “100 billion dollar opportunity” for the pharmaceutical sector.
McKinsey estimates that advanced data analytics could deliver EBITDA growth of 45 to 75 percent for the industry over the course of the next ten years. Industry experts at Ernst & Young likewise believe that digital platforms will play a key role in the future success of pharmaceutical companies.
In contrast to the late 1990s, when attention focused almost exclusively on gene data, big data now affects the big pharma players on a much broader front, touching the business model in its entirety.
Levers for higher returns
Many industry representatives hope that Big Data’s commercially most valuable contribution will come in product development. After all, the drug development process is still considered the industry’s main financial bottleneck.
Although the number of new and successful developments has risen significantly, the additional revenues are, on average, no longer sufficient to earn decent returns on research investments. An analysis by McKinsey shows that during the 1990s, revenues from newly developed drugs during their first seven years on the market were triple the cost of development.
Research & Development in the pharmaceutical industry
Currently, that factor stands at just 1.1, which means that revenues are only slightly higher than costs. The return on investment of pharmaceutical research is therefore declining. Big Data, many in the industry anticipate, could provide a decisive lever to reverse this trend.
A variety of approaches are being pursued, both internally by the pharmaceutical companies and by external specialists. Artificial intelligence, for example, could considerably increase the efficiency of routine research activities. Werner Seiz, Director of Research at Sanofi-Aventis Germany, points to the vast number of tissue samples that must be analyzed when testing active substances for possible carcinogenic side effects.
To rule out such side effects, pharmaceutical companies must have tens of thousands of tissue samples examined by their pathologists. Modern image recognition systems can accelerate and improve this work.
JOHNSON & JOHNSON HEAD OF RESEARCH STOFFELS
“We are experiencing a new era of medicine.”
For pharma companies, using search and analysis systems for better access to and more efficient use of existing knowledge has become a key issue. The pharmaceutical company Abbvie, for instance, is currently setting up its own database of scientific literature that lets researchers compile their own specialized literature collections, similar to the music streaming service Spotify. “By offering this service, we want to initiate a new kind of interaction. For instance, between different disciplines, where neurologists and oncologists are researching the same protein but have never shared their relevant information.”
In addition, the database allows users from other departments within Abbvie to “follow” fellow researchers and subscribe to the streams of literature they share. “This will give us the decisive advantage of bringing researchers closer together,” says Lars Greiffenberg, responsible for IT Research and Development and Translational Informatics at Abbvie Deutschland.
Accelerated development of new active compounds
The very existence of employees with such job titles demonstrates the industry’s transition. Translational Informatics is expected to build bridges between early and late research and between external and internal scientists, facilitating more exchange between different groups. This should enable new active substances to be developed faster and more efficiently.
According to Greiffenberg’s assessment, however, it will take a few more years before Abbvie’s “Spotify of technical literature” is sufficiently trained to truly speak of artificial intelligence.
The computer will then suggest relevant literature to each scientist: “the machine knows his fields of interest and preferences, enabling it to filter the 8,000 to 10,000 new scientific publications published every day,” explains Greiffenberg, who holds a Ph.D. in biology.
Abbvie has already benefited from the development of its own database. Its literature analysis has led to the discovery of numerous additional proteins that might be involved in the development of Alzheimer’s disease. These proteins do not yet appear in any of the commercial databases the pharmaceutical industry has access to.
“Whether and to what extent these proteins are involved will now require further investigation. But we are opening a new research chapter since we have approached the literature differently,” said Greiffenberg.
The pharmaceutical manufacturer Merck, which is building a similar system in collaboration with the Eschborn-based analytics company Innoplexus, took a slightly different approach. The aim is not only to give the company’s own scientists quick, easy, and efficient access to internal and external data, but also to use artificial intelligence to raise the use of this data to a whole new level.
Merck intends both to use the system internally and to make it accessible externally. It is therefore housed not in the Pharmaceuticals division but in the Group’s Life Science division, which already acts as one of the leading suppliers for biopharmaceutical research. “In research, we still have insufficient data sharing of this kind,” says Merck CEO Stefan Oschmann. The potential of the project is accordingly enormous.
Innoplexus, founded in 2011 and now employing approximately 250 people, specializes in using structured data pools and artificial intelligence to provide financial service providers and pharmaceutical companies, among others, with both the necessary software and publicly available information. “Our goal is to democratize data and break up data silos within a company and between organizations,” says Innoplexus co-founder Gunjan Bhardwaj.
According to Innoplexus, the company’s software is capable of analyzing more than 27 million publications, one million dissertations, and 375,000 clinical studies. In particular, the Innoplexus CEO believes that preclinical research can be significantly accelerated by employing such systems.
Analysis with artificial intelligence
The listed US company, Medidata, pursues similar goals, although from a different starting point. Medidata is a leading provider of software for clinical trials, with sales of around $600 million and a market capitalization of almost $5 billion.
According to the company, 18 of the 25 leading pharmaceutical companies, as well as numerous smaller pharmaceutical manufacturers, biotech companies, and clinical research organizations (CROs) currently use the company’s systems.
The strategic goal now is to broaden the software and service offering and to generate new insights and solutions for the industry from the aggregated analysis of the study data. “The pharmaceutical industry is still far behind in the adoption of new technologies in the field,” says Medidata CEO Tarek Sherif.
“This offers great opportunities for us.” In just a few years, he is convinced, the company’s revenues could grow into billions. The long-term goal is to become a kind of SAP for the life sciences industry.
Like Big Pharma and many competitors, the US company now also looks beyond the data generated directly in research and clinical trials. Data from clinical practice is becoming increasingly important for external service providers and pharmaceutical companies alike.
This information is increasingly being collected in a form that enables large-scale analysis using artificial intelligence. In addition, more and more health data is available that goes beyond information from classical diagnostics.
Two factors in particular are driving this development: increasingly powerful molecular-biological diagnostic methods, above all in the field of gene sequencing, and growing opportunities for the continuous acquisition of health data with the aid of consumer electronics such as smartphones and wearables.
Hubertus von Baumbach, CEO of Boehringer Ingelheim, sees in these possibilities perhaps the most lasting effect of digitalization. Above all, he believes, this wealth of data will shape drug profiles in the future.
Takeovers generate know-how
Roche is aiming in a similar direction with the purchase of the US-based company Flatiron for $1.9 billion negotiated a few months ago. Flatiron specializes in software solutions for the electronic processing and evaluation of clinical data for cancer therapy, and has established a comprehensive database of patient treatment data in collaboration with various US clinics.
Roche now wants to use this database to enrich its existing knowledge with “real world data”.
In the view of Roche CEO Schwan, this offers the opportunity to use drugs more precisely and to draw conclusions about previously unknown mechanisms of action. “There are connections that could not possibly be seen before.”
The acquisition of Flatiron is flanked by alliances with the consulting firm Accenture and the healthcare division of the US company General Electric (GE). Accenture will help with data integration, and GE will contribute know-how in the field of imaging diagnostics. Three years ago, the Basel-based group paid more than one billion dollars for a majority stake in the US company Foundation Medicine, which specializes in the digital analysis of genome data. Roche now wants to link this with Flatiron’s data sets in order to generate even deeper insights into the interaction between genetic factors and drug therapies in cancer treatment. Schwan believes this could prove an important advantage, particularly in the case of very heterogeneous cancers.
However, the benefit of software-supported or even self-learning data evaluation depends on the quality of the data. “Artificial intelligence will only be able to exploit its strengths if the question underlying this process is formulated very precisely,” said Greiffenberg from Abbvie. “AI needs good preparation. The only way to use a mowing robot sensibly is to prepare the lawn accordingly.”
MERCK CEO STEFAN OSCHMANN
“There’s an outdated way of thinking in the healthcare system – we’re ready to change systems.”
Many pharmaceutical companies are currently involved in initiatives such as IMI, Allotrope, and Pistoia, which work toward processing research data in such a way that it can be used across companies: machine-readable and retrievable. This sounds simple but involves a lot of effort. According to Greiffenberg, such initiatives are very important if the industry is one day to exploit the strengths of artificial intelligence, for example by giving companies access to enough clinical data to train algorithms so that reliable analyses can later be carried out. “It is too expensive for a company alone to build up a sufficient data pool,” said the Abbvie director.
In such cooperations, the companies do not give up proprietary knowledge or intellectual property. Rather, they hand their algorithms over to other companies for training purposes and get them back better trained. “The study data of the individual companies remain protected,” says Greiffenberg.
This article was originally published on Handelsblatt.