When we get sick or injured, it's reassuring to know that we can call a doctor or visit a hospital. In the UK, the NHS provides a sense of security in knowing that advice and help are always available: if you're unsure whether you should see a doctor, you can call the non-emergency 111 number or phone your GP to book an appointment. Valuable as this service is, healthcare stands to be vastly improved by the introduction of artificial intelligence: diagnostics will be sharpened through medical imaging, machines will be able to predict the likelihood of disease, and drugs will be developed more efficiently. Cutting wait times and improving the accuracy of treatments with AI advances will save countless lives each year. At the Deep Learning in Healthcare Summit in London this September 20-21 we will explore the deep learning tools & techniques set to revolutionise healthcare applications, medicine & diagnostics. Confirmed speakers include Greame Rimmer, Engineering Manager at Google, Ahmed Serag, Research Scientist at Philips, Sarah Culkin, Strategic Data Lead at NHS England, and many more.
With the vast sums of money being invested in improving healthcare with machines, progress is rapid. Here's a look at the most recent breakthroughs and advances in the space:
From image analysis to data management, artificial intelligence is reshaping healthcare. Darren Schulte from Apixio looks at some real-world examples, along with the advantages and disadvantages. The potential of both artificial intelligence and robotics in healthcare is considerable, and both are increasingly part of our healthcare ecosystem. Apixio is one example: the company focuses on healthcare data science and has been using artificial and augmented intelligence to tackle the overwhelming data problem facing the industry today.
The advent of electronic medical records with large image databases, along with advances in artificial intelligence with deep learning, is offering medical professionals new opportunities to dramatically improve image analysis and disease diagnostics.
Researchers from the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Maryland, and collaborators at the Johns Hopkins School of Medicine, have developed image analysis and machine learning tools to detect age-related macular degeneration. In Nature Medicine, members of the team discuss the potential of such tools to be used clinically and applied to other image-based medical diagnoses as well.
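The APL tools themselves are not public, but the core idea of image-based diagnosis — reducing each scan to a feature vector and learning a decision boundary between "healthy" and "diseased" — can be illustrated with a toy sketch. Everything here is hypothetical: the data is synthetic, and a nearest-centroid classifier stands in for the far more sophisticated models the researchers actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for image-derived feature vectors (e.g. texture
# statistics extracted from retinal scans). Entirely made-up data:
# class 0 is "healthy", class 1 is "AMD-like", separated by a mean shift.
healthy = rng.normal(0.0, 1.0, size=(100, 16))
amd = rng.normal(1.5, 1.0, size=(100, 16))

X = np.vstack([healthy, amd])
y = np.array([0] * 100 + [1] * 100)

# Nearest-centroid classifier: label a scan by whichever class mean
# its feature vector is closest to.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    dists = np.linalg.norm(centroids - x, axis=1)
    return int(np.argmin(dists))

preds = np.array([predict(x) for x in X])
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The real systems replace both steps — handcrafted features and the simple distance rule — with deep networks trained end-to-end on labelled scans, but the pipeline shape (features in, diagnosis out) is the same.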
A new AI bot outperformed primary care professionals in both accuracy and safety on almost all of the tests conducted.
Last week, doctors at London’s Royal College of Physicians witnessed the world's first demonstration of an artificial intelligence (AI) chatbot taking a clinical test. The point of the event was to show how well the chatbot, built by digital healthcare startup Babylon Health, would perform on the MRCGP exam, the Royal College of General Practitioners’ final test for trainee doctors. Over the last five years, general practitioners have averaged a 72% score on the exam, Babylon Health director Dr. Mobasher Butt told the audience before announcing his bot's result. “It got 82%,” he said, as the audience began to clap.
At the moment, most health-related applications of AI are at the research or early trial stage and it is not yet clear how successful they will be in wider healthcare systems. In a recent briefing note, we highlighted several areas of clinical care where AI is thought to have strong potential, such as the analysis of medical images and scans for early signs of disease, or monitoring of patients’ vital signs for indications of deterioration. Some healthcare providers are also testing AI systems to assist with administrative tasks such as scheduling and as a first point of contact for health information and triage. There is hope that AI could help address challenges associated with the ‘care gap’ and ageing populations, and could assist people with chronic disease, disability, and frailty in the home.
Before 2017, gastroenterologist Cheng Chunsheng had to inspect over 1,000 gastroscopy images to search for possible signs of esophageal cancer, a cancer of the food pipe. This painstaking process is no longer needed since the People's Hospital of Nanshan District in Shenzhen, where Cheng works, introduced "Tencent AIMIS", an artificial intelligence (AI)-enabled imaging system released in August last year. "The AI system screens each report and notifies the doctor if further inspection is needed," said Cheng. The system has significantly boosted his efficiency.
Think of your immune response as a giant machine-learning problem, with your body as the computer. Immune cells travel around your body, sampling all sorts of matter they come into contact with, from your own cells to the cells of organisms that definitely shouldn't be there. If immune cells encounter something they know shouldn't be part of your body -- bacteria or a virus, say -- the body scales up the cells that know how to deal with that interloper. If there's a cell that has seen the intruder before and knows how to tackle it, your body rapidly reproduces it thousands of times -- enough to overwhelm the bacteria or virus before it has time to make a home in your body.
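The analogy above can be caricatured in a few lines of code. This is a toy sketch, not real immunology: immune "memory" is a dictionary of cell counts, recognition is an exact key match, and the expansion factor is an invented number.

```python
# Toy model of clonal expansion, loosely following the analogy in the
# text: the body keeps a count of each kind of immune cell, and a
# recognised intruder triggers rapid copying of the matching cell.
immune_cells = {"flu-A": 1, "cold-rhino": 1, "e-coli": 1}  # cell -> count

def encounter(pathogen, cells, expansion_factor=1000):
    """Return how the body responds to meeting a pathogen."""
    if pathogen in cells:
        # A cell has seen this intruder before: reproduce it rapidly.
        cells[pathogen] *= expansion_factor
        return "cleared quickly"
    # No memory of this intruder: a slow primary response, after which
    # one memory cell is kept for next time.
    cells[pathogen] = 1
    return "slow primary response"

print(encounter("flu-A", immune_cells))        # seen before
print(encounter("novel-virus", immune_cells))  # first contact
print(encounter("novel-virus", immune_cells))  # now remembered
```

The point of the caricature is the asymmetry: the first encounter with a new pathogen is slow, but every later encounter is fast, which is exactly the train-once, predict-cheaply shape of a learning system.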