We’re always hearing that applying machine learning in the real world is what businesses need to be doing to avoid being left behind. This, however, can be challenging for companies initially looking to adopt these technologies. How much will it cost? How time-consuming will it be? How do I train my staff? This is where Indico comes in: the start-up focused on making deep learning practical in the enterprise by automating tedious back-office tasks, improving the efficiency of labor-intensive document-based workflows, and extracting valuable insights from unstructured content, including text and images.
In 2017, Tom Wilde joined Indico as CEO as part of the company’s focus on the enterprise customer. Indico was founded in 2013 around deep learning research at Olin College of Engineering, and was already competent at solving unstructured content challenges. Tom’s role focused on helping craft the product strategy and building out the go-to-market strategy, which has been a huge success.
Having come from the enterprise search industry and watched it evolve from its early roots in Boolean and dictionary-based approaches, to semantic approaches, to machine learning, and now to deep learning, Tom was a perfect fit. He explained that ‘deep learning is poised to fundamentally transform things like text analytics and many of the adjacent fields like search, content management, CRM, customer service and others.’
Indico is a firm believer that deep learning should be easily applied and practical in the enterprise, for several reasons. According to recent market studies from the likes of Forrester, Gartner, and BCG, fewer than 25% of companies have begun to deploy AI in a meaningful way, and of those, less than half have been able to quantify its benefit. This is not surprising: AI is a complex field, and when you peel back the onion, you find that many elements are required for a successful deployment and for capturing ROI. The two biggest we have found are access to quality labeled training data, and involvement of the right subject matter experts who can define the existing process and the desired outcome. This will require some significant shifts in the way projects are tackled. Historically, new technologies could be neatly divided between IT-based projects and “consumer enterprise” applications. Now, successful application of AI will require collaboration between the two.
At the Deep Learning Summit in Boston this May 24 - 25, Tom will be presenting on some of Indico’s most recent progress in bringing deep learning and AI to the enterprise, and we asked him a few questions in advance of his presentation.
Ultimately AI will be infused in all software applications, so it won’t be something that small businesses need to consciously choose to deploy. For large enterprises, AI will represent a unique opportunity to build IP around their unique understanding of a particular industry challenge combined with proprietary data that they have. For these companies, having an explicit mandate to understand and take advantage of AI is imperative.
For Indico, our whole world is unstructured content: text, images, audio, and so on. While there is approximately 10X the amount of unstructured content in the enterprise relative to structured content, the investment has been the inverse, primarily because unstructured content has been such a challenging problem that companies have set it aside. Deep learning will unlock these assets and allow companies to capture their full value.
While there are certainly large ethical questions to ultimately solve, the more pressing issue, and a more practical framing, is the question of explainability and validation. What we hear from our customers is that it’s not enough for AI to augment or automate a process; the enterprise needs to understand in some detail how it is being solved. This is generally referred to as model explainability and validation. It extends to highly regulated fields, where companies increasingly need to expose in detail a model’s inputs, training, and accuracy. In the world of unstructured data, we are investing a lot in applications that allow customers to drill into the models to explore trade-offs between precision and recall (often referred to generally as “accuracy”), as well as to see the specific training data that is generating the result in question, so that the training data can be updated as needed. Overall, this also allows the data scientist and the subject matter expert (SME) to collaborate with a shared understanding of the problem framing.
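To make the precision/recall trade-off concrete, here is a minimal sketch showing how moving a classifier’s confidence threshold shifts the balance between the two. The labels and scores are entirely hypothetical illustrations, not Indico data or code:

```python
# Hypothetical example of the precision/recall trade-off behind
# a confidence threshold on a binary classifier's scores.

def precision_recall(labels, scores, threshold):
    """Compute precision and recall when predicting positive for score >= threshold."""
    preds = [s >= threshold for s in scores]
    tp = sum(p and y for p, y in zip(preds, labels))        # true positives
    fp = sum(p and not y for p, y in zip(preds, labels))    # false positives
    fn = sum((not p) and y for p, y in zip(preds, labels))  # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

labels = [1, 1, 1, 0, 0, 1, 0, 0]                     # ground truth from an SME
scores = [0.9, 0.8, 0.7, 0.65, 0.4, 0.35, 0.3, 0.1]   # model confidence scores

# A high threshold favors precision (few false alarms, but misses cases);
# a low threshold favors recall (catches everything, with more false alarms).
print(precision_recall(labels, scores, 0.75))  # (1.0, 0.5)
print(precision_recall(labels, scores, 0.30))  # (~0.571, 1.0)
```

Drilling into where a model sits on this curve, and which training examples drive each error, is the kind of shared view that lets the data scientist and the SME decide together where the threshold should sit for a given business process.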
A number of things. First, I think that within three years it will be incumbent on most SMEs to have a basic understanding of machine learning, the same way everyone had to learn how to use the internet. This understanding will help SMEs identify good use cases and opportunities for applying ML. Second, there will be a large demand for data engineers. These roles will focus on ETL-type tasks: preparing data for machine learning and ensuring the output gets pushed to the right enterprise execution platforms. Finally, there will be a large demand for data scientists, who have actual training in the algorithms behind machine learning.
Honestly, I can’t think of a single industry that won’t be impacted. While we have witnessed over the last 20 years the most dramatic growth in corporate profitability and productivity in the history of business, the next wave will be almost entirely driven by the adoption of AI.
Register before April 6 to guarantee your place with Early Bird Discounted Passes. At the Summit, Tom will be joined by global experts in deep learning with the likes of Daniel Smilkov, Software Engineer, Google Brain, Anoop Deoras, Lead Researcher, Netflix, Sangram Ganguly, Senior Research Scientist, NASA Ames Research Center, Greg Amis, Principal Software Engineer, TripAdvisor, Sergul Aydore, Machine Learning Scientist, Amazon, and many more sharing their cutting-edge work.