The Future of Big Data: Deep Learning

Dean Stephens
October 12, 2015

The use of anesthesia eliminated one of humankind’s greatest fears – the pain of surgery. A young dentist, working with a renowned surgeon, made history on October 16, 1846 with the first successful surgical procedure performed under anesthesia.

In the 1870s, healthcare was revolutionized by the germ theory of disease, which led to public health efforts and antiseptic practices. In the 1920s, the discovery of penicillin ushered in the use of antibiotics and propelled forward the use of medication as treatment for disease.

In the 1970s, randomized clinical trials brought about evidence-based medicine.

We are now on the precipice of a new revolution. Data-driven healthcare promises an era of personalized health – personalized information and recommendations, personalized wellness and prevention, personalized medicine and treatments, and personalized care delivery. We will no longer think in terms of “the average patient” but rather the “specific person.”

The space is white hot:

  • Over $4 billion in venture funding went to digital health companies in 2014.
  • Companies receiving the most money are involved in big data analytics.
  • Almost 100 digital health companies were acquired for a combined $20 billion.
  • Apple recently launched Apple Watch and Research Kit.
  • IBM committed $1 billion to its Watson Group.
  • Google Ventures invested 36 percent of its fund in healthcare and life sciences in 2014, making it the company’s largest allocation.
  • Intel bought Basis, and Samsung introduced Simband smartwatch architecture.
  • A recent estimate from Oliver Wyman pegged potential U.S. healthcare savings at 40 percent of our current $3 trillion spend, roughly $1.2 trillion.
  • McKinsey reports $300-$450 billion of that could come from Big Data solutions.

The future of Big Data is Deep Learning. We are going to see more practical applications of Artificial Intelligence, namely Deep Learning: a more sophisticated machine learning approach, based on artificial neural networks, for extracting intelligence from Big Data.

The word is out that the biggest part of Big Data is the unstructured part, and it contains valuable information and learnable patterns. We now have the computing power and are seeing the emergence of technologies to make sense of it.

Deep Learning and other technologies are trying to understand the roughly 80 percent of EMR data that is unstructured and make that information actionable. Deep Learning goes beyond natural language processing (NLP), which is about finding needles in a haystack of unstructured text. NLP is, at its core, entity recognition; to recognize an entity in free text, the entity must be explicitly stated. Deep Learning is about inferring what’s in the information even when it is not explicitly present. NLP can identify a patient with sleep apnea only if “sleep apnea” or a synonym is mentioned in the text. Deep Learning applied to text analysis can infer from the evidence (the patient is a heavy snorer, is obese, has low blood oxygen levels and is taking Provigil) that the patient likely has sleep apnea.
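The sleep apnea contrast above can be sketched in a few lines of code. This is a toy illustration, not any real clinical system: the keyword matcher stands in for NLP-style entity recognition, and a made-up weighted score stands in for a trained model that infers a diagnosis from indirect evidence. All function names, synonyms, and weights are hypothetical.

```python
import re

def nlp_entity_match(note: str) -> bool:
    """Entity recognition: flags sleep apnea only if the condition
    (or a listed synonym) is explicitly mentioned in the text."""
    synonyms = ["sleep apnea", "obstructive sleep apnea", "osa"]
    return any(re.search(rf"\b{re.escape(s)}\b", note.lower())
               for s in synonyms)

def inferred_risk(evidence: dict) -> float:
    """Inference from indirect evidence: a toy weighted score standing
    in for a trained model. The weights are illustrative, not learned."""
    weights = {
        "heavy_snorer": 0.35,
        "obese": 0.25,
        "low_blood_oxygen": 0.25,
        "takes_provigil": 0.15,
    }
    return sum(w for key, w in weights.items() if evidence.get(key))

note = "Patient is a heavy snorer, BMI 36, SpO2 88%, prescribed Provigil."
print(nlp_entity_match(note))  # False: "sleep apnea" never appears verbatim

evidence = {"heavy_snorer": True, "obese": True,
            "low_blood_oxygen": True, "takes_provigil": True}
print(inferred_risk(evidence))  # 1.0: high risk inferred from the evidence
```

The note never states the diagnosis, so the entity matcher misses it, while the evidence-based score flags the patient anyway, which is the gap between recognition and inference the paragraph describes.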

This is powering many of the advances in computer vision, voice recognition and text analysis at companies like Google, Facebook, Microsoft and Baidu. The effects of this technology will change the economics of virtually every industry. Jeremy Howard, CEO of medical diagnostic startup Enlitic and former chief scientist at Kaggle, said in a recent TED Talk that Deep Learning is capable of providing the kinds of services that currently employ 80 percent of the developed world. Similarly, venture capitalist Vinod Khosla believes data-driven technologies like these will replace 80 percent of what doctors do. Marc Andreessen said software was eating the world. I think Deep Learning is starting to nibble at it.

Dean Stephens is the CEO of Talix.