ghp June 2015

The healthcare industry in its widest sense is hugely complex: human beings are complex, conditions and their treatment are complex, and the R&D and manufacture of treatments are also hugely complex. The opportunity for improved insight across the industry is therefore strong. In an ideal world we would be making better decisions, quickly, based on all the factors affecting each decision. Imagine if we could remove subjectivity and the need to decide under time pressure, while still being sure the choice is the safest option. Could we be close to a 'Healthcare 4.0' that benefits from the same efficiency and accuracy already widespread under Industry 4.0?

The buzzwords 'big data' and 'Internet of Things' have drawn a lot of attention, and perhaps hype, to predictive analytics as a core use case with new possibilities. Predictive analytics can be applied throughout the healthcare cycle in areas such as: speeding up research trials; simulating care quality; accelerating time to market for new therapies; supporting market access and resource allocation for new therapies; identifying high-risk patients for ACOs (accountable care organisations) and hospitals; reducing hospital readmissions; changing behaviour towards healthier lifestyles; and improving patient experience and financial performance.

There is software that can correlate symptoms and set probability assessments in real time, faster than a human observer can. Other software can combine data about the physics, physiology and genetics of the human body with information about the materials used in medical devices such as pacemakers to anticipate how they will perform over time. The motivation for most of these analyses is consistent: improving patient care while avoiding financial and reimbursement penalties for hospitals.
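The kind of real-time symptom-to-probability correlation described above can be illustrated with a minimal naive-Bayes sketch. Everything below is a toy assumption for illustration only: the conditions, symptoms and probability values are invented, not real clinical data, and the `rank_conditions` helper is hypothetical.

```python
# A minimal naive-Bayes sketch of symptom-to-condition probability scoring.
# All conditions, symptoms and probabilities here are illustrative
# assumptions, not real clinical data.

# P(condition): prior prevalence (hypothetical values)
priors = {"flu": 0.05, "common_cold": 0.20, "allergy": 0.10}

# P(symptom | condition): likelihoods (hypothetical values)
likelihoods = {
    "flu":         {"fever": 0.90, "cough": 0.80, "sneezing": 0.30},
    "common_cold": {"fever": 0.20, "cough": 0.60, "sneezing": 0.70},
    "allergy":     {"fever": 0.02, "cough": 0.30, "sneezing": 0.90},
}

def rank_conditions(observed_symptoms):
    """Return conditions ranked by posterior probability given the symptoms."""
    scores = {}
    for condition, prior in priors.items():
        score = prior
        for symptom in observed_symptoms:
            # Small floor probability for symptoms not listed for a condition
            score *= likelihoods[condition].get(symptom, 0.01)
        scores[condition] = score
    total = sum(scores.values())
    # Normalise so the scores sum to 1 across the candidate conditions
    return sorted(((c, s / total) for c, s in scores.items()),
                  key=lambda pair: pair[1], reverse=True)

ranking = rank_conditions(["fever", "cough"])
print(ranking[0][0])  # -> flu
```

The point of the sketch is speed and consistency: given the same observations, the scoring is deterministic and instantaneous, which is exactly the property the article contrasts with variable human judgement.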
The idea is to predict potential failures and weaknesses in processes and equipment so they can be fixed before they affect patients and efficiency. With such accurate and timely insight, decisions can also be made more quickly, with almost guaranteed outcomes. So what has stopped us maximising these opportunities to date?

Firstly, it comes down to culture. Many people within the healthcare industry have spent years learning their skill and craft, and many believe that an algorithm is no match for a clinician diagnosing a patient; in some cases, at the time of writing, they are correct. Yet two different doctors can and will provide two different opinions, and that variability (particularly where there are complexities or comorbidities) causes healthcare pathways to diffuse and deviate.

Secondly, it comes down to the quality of the data. Hospitals are renowned for their disparate data silos, and the data is often unstructured too, i.e. held as free text with no inherent organisation. Despite the push towards electronic health records, paper records still exist in many cases, and this has prevented the majority of existing predictive analytics models from working effectively.

Even when data are electronic and accessible, nearly all predictive analytics algorithms require a clean dataset and a hypothesis, explicit or implicit: a known signal to look for. Some of these (decision trees and neural networks) can be trained with machine learning, but the data are rarely clean, and by the time you have a statistically significant sample of issues you really have a much bigger problem. Mostly, the issues look like one-offs, so trying to find an 'unknown unknown' signal without generating a lot of false positives is a far more challenging problem.
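One common approach to the 'unknown unknown' problem is unsupervised outlier detection: rather than looking for a named failure signature, flag readings that sit far from the bulk of the data. The sketch below uses a robust z-score based on the median and median absolute deviation (MAD), which tolerates the dirty, outlier-laden data described above better than a mean-based score. The telemetry values and the `robust_outliers` helper are illustrative assumptions, not part of any product mentioned in the article.

```python
import statistics

def robust_outliers(readings, threshold=3.5):
    """Return indices of readings whose robust z-score exceeds threshold."""
    med = statistics.median(readings)
    # MAD: median of absolute deviations from the median
    mad = statistics.median(abs(x - med) for x in readings)
    if mad == 0:
        # All readings (essentially) identical: nothing to flag
        return []
    flagged = []
    for i, x in enumerate(readings):
        # 0.6745 scales the MAD so the score is comparable to a
        # standard z-score for normally distributed data
        z = 0.6745 * (x - med) / mad
        if abs(z) > threshold:
            flagged.append(i)
    return flagged

# e.g. hourly device telemetry (synthetic) with one anomalous spike
telemetry = [72, 74, 71, 73, 75, 72, 74, 140, 73, 72]
print(robust_outliers(telemetry))  # -> [7]
```

Note the trade-off the article describes: the `threshold` parameter directly controls the balance between catching genuine one-off anomalies and drowning the analyst in false positives.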