After a great first day at the Deep Learning In Healthcare Summit in Boston, we’re back for day 2 where this morning’s discussions have brought together leading speakers to share their cutting-edge research in the industry and how it’s disrupting healthcare.
Mason Victors, lead data scientist at Recursion Pharmaceuticals, kicked off this morning’s discussion ahead of his talk later in the day. Every year, thousands of drug candidates fail to reach market and are often left in freezers and forgotten. Recursion Pharmaceuticals repurposes these compounds, identifying potential treatments that can reach patients quickly. By combining the best elements of technology with the best elements of science, the ability to answer really complex questions increases exponentially.
This morning’s sessions saw startups in healthcare presenting their latest research, and we first heard from the Director of Machine Learning at Arterys, Daniel Golden.
‘About 6 million adults in the US are experiencing heart failure right now’, so faster and more accurate assessments of their condition could reduce this figure significantly. As the first company with FDA approval for clinical cloud-based deep learning in healthcare, Arterys ‘aim to help clinicians make faster and more accurate assessments of the volume of blood ejected from the heart in one cardiac cycle’, a measure known as the ejection fraction. Arterys are also tackling healthcare costs and access: where a human analyst would typically take between 30 minutes and an hour analysing images to make a diagnosis, Arterys takes an average of 15 seconds to produce a result for one case. We heard about the architecture they are working with and the ‘critical challenge in using data that was used to support clinical care as opposed to machine learning’. In practice this means Arterys are often presented with incomplete images to analyse, so they have trained their model to recognise the missing sections of an image and fill in the blanks. ‘Roughly 20% of the data we work from is complete, so without this model 80% of our data would be redundant.’
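The ejection fraction itself is a simple ratio once the ventricular volumes have been segmented from the images. A minimal sketch (the function name and example volumes are our illustration, not Arterys’ code):

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Fraction of the end-diastolic volume ejected in one cardiac cycle.

    edv_ml: end-diastolic volume (ventricle at its fullest), in mL
    esv_ml: end-systolic volume (ventricle after contraction), in mL
    """
    stroke_volume = edv_ml - esv_ml   # blood ejected in one beat
    return stroke_volume / edv_ml     # usually reported as a percentage

# A healthy left ventricle typically ejects roughly 50-70% of its volume.
print(round(ejection_fraction(120.0, 50.0) * 100, 1))  # 58.3
```

The hard part, of course, is not this arithmetic but segmenting the ventricle accurately enough to measure the two volumes, which is where the deep learning comes in.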
The next startup to present was PulseData, who are trying to overcome the ‘laborious task of building robust data pipelines for inconsistent datasets.’ We heard from CEO and founder Teddy Cha, who said they aim to track ‘transactional, temporal, and ambient data on patients that isn’t sitting right in front of doctors.’ This involves amalgamating historical data from various sources such as insurance claims, Facebook posts, private health visits and more. There are numerous questions to be asked when making accurate medical predictions, and PulseData’s node-based approach eliminates the need to build a bespoke dataset for each one: machines treat data work and features as abstractions, or ‘nodes’. This provides ‘a way to track a calculation where you have a sequence of events that each perform their own calculation, so you don’t need to worry about any of them independently’ each time a new question is asked. Powerfully, nodes can be made dependent on other nodes, allowing PulseData to rapidly assemble data pipelines and swap in new variables each time the question changes, rather than rewriting the pipeline from scratch.
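The idea of nodes that each perform their own calculation and can depend on one another can be sketched as a small dependency graph. This is our illustrative sketch of the general pattern, not PulseData’s actual API:

```python
class Node:
    """A named calculation that may depend on other nodes."""

    def __init__(self, name, func, deps=()):
        self.name, self.func, self.deps = name, func, list(deps)

    def evaluate(self, cache=None):
        cache = {} if cache is None else cache
        if self.name not in cache:
            # Evaluate dependencies first, then this node's own calculation;
            # the cache means each node runs at most once per query.
            args = [d.evaluate(cache) for d in self.deps]
            cache[self.name] = self.func(*args)
        return cache[self.name]

# Reusable source nodes: each performs its own calculation independently.
claims = Node("claims", lambda: [{"patient": 1, "cost": 200},
                                 {"patient": 1, "cost": 450}])
visits = Node("visits", lambda: [{"patient": 1, "date": "2017-05-01"}])

# A new question is just a new node wired onto existing ones --
# no need to rewrite the pipeline from scratch.
total_cost = Node("total_cost", lambda c: sum(r["cost"] for r in c),
                  deps=[claims])
print(total_cost.evaluate())  # 650
```

Because each node abstracts away its own data work, answering a different question means composing a new terminal node over the existing graph rather than rebuilding the whole pipeline.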
Hunter Jackson, Co-Founder and Chief Scientific Officer of Proscia, spoke about their work in changing the way doctors diagnose cancer through the analysis of pathology images. Where many companies ‘are taking a direct AI approach, (Proscia) are taking more of a cancer specific approach’. Billions of slides are analysed every year in the US alone, many of which are hidden away on storage drives where their value is restricted to whatever on-site researchers can uncover. Jackson explained how ‘one slide can produce millions and millions of patches’, which can help answer Proscia’s key questions: ‘who shall we treat, how should we treat them, and did their treatments work?’. One of the key obstacles they previously faced was getting hold of the medical images; however, ‘with the help of clinical partners, we are developing deep learning powered tools that activate those digital slides to address problems in the clinic, create opportunities for translational research and data licensing, and inform disease prognosis and therapeutic plans.’ Jackson also covered the desire to predict the likelihood of recurrence in cancers: bringing pathology into cancer prediction enables more predictive biomarkers and more accurate predictions. We heard about some recent successes in this domain, including identifying metastases in breast and gastric lymph nodes with deep convolutional neural networks, and using deep learning to predict lymph node metastasis from the primary tumour.
Michael Dietz from Waya.AI continued the discussion on medical images and explained how they are working to improve image classification with generative adversarial networks (GANs) to ‘turn your smartphone into a dermatologist, and eventually into a doctor’. A high percentage of the population has access to a smartphone, and having a dermatologist constantly on hand to photograph and diagnose skin conditions could help catch a multitude of conditions early. GANs are one of the most promising areas of deep learning research, and we heard how Waya.AI use them in the real world for unsupervised learning. The models that Dietz and his team are building ‘learn entirely from data so there’s no predisposition or human bias; we’re letting the machine learn entirely from data.’ Waya.AI uses GANs for several different tasks in skin cancer detection, and we saw the improved results obtained when using these as opposed to traditional methods. Although GANs are accurate and efficient, Dietz explained that they are ‘really hard to train and we had to have a hack to nudge them to work, which is an unstable situation’. However, Waya.AI have found a method to overcome this by calculating different distances to make a reasonable and efficient approximation. Through this application of AI in healthcare, the goal is to find the causes and mechanisms of disease and to analyse the patterns that connect everything together.
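The ‘different distances’ remark most plausibly refers to the Wasserstein GAN family, where the discriminator’s saturating probability loss is replaced by an unbounded critic score that approximates the Wasserstein distance between real and generated data. A toy NumPy sketch of the two losses, under that assumption (this is our illustration, not Waya.AI’s code):

```python
import numpy as np

def gan_discriminator_loss(real_scores, fake_scores):
    """Standard GAN loss: scores are squashed into probabilities by a sigmoid.
    When the discriminator becomes confident, gradients vanish, which is one
    source of the training instability Dietz described."""
    p_real = 1.0 / (1.0 + np.exp(-real_scores))
    p_fake = 1.0 / (1.0 + np.exp(-fake_scores))
    return -np.mean(np.log(p_real) + np.log(1.0 - p_fake))

def wgan_critic_loss(real_scores, fake_scores):
    """WGAN critic loss: raw score difference approximating the Wasserstein
    distance, which stays informative even when the two distributions barely
    overlap, smoothing out training."""
    return -(np.mean(real_scores) - np.mean(fake_scores))

rng = np.random.default_rng(0)
real = rng.normal(1.0, 1.0, 1000)   # stand-in scores for real images
fake = rng.normal(-1.0, 1.0, 1000)  # stand-in scores for generated images
print(gan_discriminator_loss(real, fake), wgan_critic_loss(real, fake))
```

In a real WGAN the critic must also be kept (approximately) 1-Lipschitz, via weight clipping or a gradient penalty, for the distance approximation to hold.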
Rounding off the morning of startup showcases, Fabian Schmich, data scientist at Roche, built on the issues raised by Proscia: ‘pathology being a fairly old field, not much has changed; people are using the same old protocols with slide imaging’, so there is a lot of progress to be made. There is an increasing demand for tissue analysis in drug development, and Schmich explained how Roche are improving tissue annotation with deep learning. ‘We now are in the middle of a digital revolution in pathology’, which allows Roche to quantitatively analyse cells across a whole image, a game changer in pathology. The problem with current pathology lies in human error and inconsistency, with technicians hand-drawing their analyses onto low-resolution images. Deep learning overcomes this by segmenting aspects of the images to produce a deeper analysis. Roche ‘take an architecture and convolutionise it by changing the last couple of fully connected layers and add upsample layers’, which results in each image having multiple complex labels rather than being restricted to one. Schmich went on to explain the challenges in data mining these images, and how Roche can tap into infrastructure they already have to leverage data, train deep neural networks, and plug in and test different architectures.
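The ‘convolutionise’ step rests on a standard trick: a fully connected layer over C features is equivalent to a 1×1 convolution applied at every spatial position, so a whole-image classifier becomes a dense, per-location labeller (upsampling layers then restore full resolution). A NumPy sketch of the equivalence, as our illustration rather than Roche’s architecture:

```python
import numpy as np

rng = np.random.default_rng(42)
C, K = 8, 3                    # input feature channels, output classes
W = rng.normal(size=(K, C))    # fully connected weights over C features

def fc(features):
    """Classifier head: (C,) feature vector -> (K,) class scores."""
    return W @ features

def conv1x1(feature_map):
    """Same weights as a 1x1 convolution: (C, H, W) -> (K, H, W) score map.
    The FC layer is applied independently at every spatial position, turning
    a single-label classifier into a per-location labeller."""
    return np.einsum("kc,chw->khw", W, feature_map)

fmap = rng.normal(size=(C, 4, 4))
# The conv's scores at position (2, 3) match applying the FC head there.
assert np.allclose(conv1x1(fmap)[:, 2, 3], fc(fmap[:, 2, 3]))
```

This is why each image can carry multiple, spatially resolved labels instead of one: every position in the feature map gets its own class scores.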
As the discussions continue into the afternoon, we’re looking forward to hearing from Biswaroop (Dusty) Majumdar from IBM Watson Health, who will discuss Empowering the Future of Cognitive Computing in Healthcare, amongst several other healthcare leaders in deep learning.
Couldn't make it to Boston?
If you want to hear more from the Deep Learning Summit and Deep Learning In Healthcare Summit you can register now for on-demand video access.