By Sophie Curtis on May 21, 2015
To interact with the world’s information, most of us turn to search engines. We do it so often that we tend to think of search engines as the internet. But they’re not: they’re gateways to it, and by definition gateways have fixed parameters.
In enterprise environments, we rely on databases to do much the same thing. We put data into a database, we ask it a question and it returns an answer. The inherent (and often unrecognized) limitations of these technologies have governed how we think about accessing information: what we expect to find, and what we even presume to ask.
Synthos Technologies is a software products company, building entirely new ways of interacting with information and developing a powerful suite of analytics tools built specifically for the big data world. Their goal is to help their customers become more profitable and more effective by providing accurate, real-time awareness on a massive scale, showing customers the connections between their data and the world’s.
We spoke with Scott Lightner, CTO of Synthos Technologies, to hear more about how they're using deep learning to revolutionize their work.
What do you feel are the leading factors enabling recent advancements in deep learning?
There are several – from the availability of inexpensive hardware to the proliferation of data, and as a consequence of all that data, an increase in the number of people and organizations looking to make sense of it. But perhaps the largest contributing factor is the cost of memory. In-memory applications and platforms enable the kind of automatic feature and pattern discovery needed for big data analytics. And, of course, data analytics at the enterprise level is in greater demand than ever.
Large volumes of data will always be accompanied by large sets of features, which increases the computational complexity of mining them. In-memory computing enables the analysis of huge amounts of data, but in-memory at scale has been viewed with some skepticism. We’ve recently received patent protection on our compression approach, which makes in-memory at scale not only feasible but cost-effective as well. So, alongside the rise of in-memory computing, I think we’ll see some genuinely game-changing applications of deep learning in the very near future – from finance and energy to the intelligence community and the federal government.
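The specific compression approach is proprietary, but the general intuition behind keeping huge feature sets in memory can be illustrated: entity feature vectors with millions of dimensions are typically sparse, so storing only the nonzero entries cuts memory dramatically. The sketch below (class and method names are ours, not Synthos's) shows a minimal sparse representation and a dot product over it.

```python
# Illustrative sketch: large feature vectors are usually sparse, so keeping
# only nonzero entries makes in-memory storage of millions of them tractable.

class SparseFeatures:
    """Maps feature index -> value, storing only nonzero entries."""

    def __init__(self, dense):
        self.nonzero = {i: v for i, v in enumerate(dense) if v != 0.0}
        self.size = len(dense)

    def dot(self, other):
        """Sparse dot product: iterate over the smaller nonzero set."""
        a, b = self.nonzero, other.nonzero
        if len(a) > len(b):
            a, b = b, a
        return sum(v * b[i] for i, v in a.items() if i in b)

# Two "dense" vectors that are mostly zeros:
v1 = SparseFeatures([0.0, 3.0, 0.0, 0.0, 2.0])
v2 = SparseFeatures([1.0, 1.0, 0.0, 0.0, 4.0])
print(v1.dot(v2))  # 3.0*1.0 + 2.0*4.0 = 11.0
```

Similarity operations like this dot product are the workhorse of feature-based entity matching, which is why memory cost scales with the number of nonzero features rather than the nominal dimensionality.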
How is Synthos Technologies taking a deep learning approach to product development?
Two ways: entity disambiguation and knowledge discovery.
First, on entity disambiguation, instead of using deep learning models on batch processing tasks, we apply the analytics per transaction. We leverage hundreds of millions of features describing millions of entities. This makes our big-data, feature-based approach much more accurate and, because our platform is in-memory, much faster too. For knowledge discovery, our Finch platform enables deep learning to be applied within transactional environments. By having a platform with embedded transaction analytics that you can apply deep learning models to – on-the-fly, per transaction – we are enabling entirely new capabilities.
One virtue of transactional analytics is that you can change the model without rerunning a big batch process in Hadoop. Different transactions can leverage different deep learning models and parameters; each query can apply a different model. This approach enables new ways of interacting with information. Never before could intelligence like this be gathered so quickly or so holistically.
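The idea that "each query can apply a different model" can be sketched as a simple dispatch: the transaction itself names the model to run, so swapping models never requires a batch reprocess. Everything here (the registry, the model names, `score_transaction`) is our own illustration, not the Finch platform's API.

```python
# Hypothetical sketch of per-transaction model selection: rather than one
# batch model applied offline, each incoming transaction names the model
# (already loaded in memory) that should score it on the fly.

MODELS = {
    "fraud_v2": lambda features: sum(features) > 10,   # stand-in models;
    "churn_v1": lambda features: features[0] > 0.5,    # real ones would be
}                                                      # trained networks

def score_transaction(txn):
    """Look up the model named by this transaction and apply it."""
    model = MODELS[txn["model"]]
    return model(txn["features"])

print(score_transaction({"model": "fraud_v2", "features": [4, 5, 6]}))  # True
print(score_transaction({"model": "churn_v1", "features": [0.2]}))      # False
```

Because model choice is just a field on the transaction, deploying a new model is a registry update rather than a rerun of a Hadoop-style batch job.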
What are the practical applications of your work and what sectors are most likely to be affected?
On the knowledge discovery front, we see applications in finance, where a deep learning approach could allow banks, credit card companies or regulators to detect anomalies that could indicate fraud or risk. In the customer service arena, deep learning approaches will enable organizations to build better, richer customer profiles. This will help them improve customer service and satisfaction, or offer customized products or promotions.
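The fraud use case above boils down to flagging records that deviate from a learned norm. A deep learning model would learn far richer features than this, but a toy z-score detector (entirely our illustration, with made-up numbers) shows the shape of the problem.

```python
# Toy anomaly detection on transaction amounts: flag values whose z-score
# (distance from the mean in standard deviations) exceeds a threshold.
# A deep learning approach replaces this hand-picked statistic with
# learned features, but the goal -- surfacing outliers -- is the same.

from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    m, s = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - m) / s > threshold]

history = [20, 25, 22, 19, 24, 21, 23, 500]  # one suspicious charge
print(flag_anomalies(history))  # [500]
```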
Last, we’ve developed a business intelligence tool on our platform that ingests news data. Users can quickly see the entities trending in news, understand their relationships to other entities and find similar entities in just a few clicks. It takes in a dynamic, streaming feed, and as information about an entity changes, so do a user’s search results. This is proving valuable in environments – like public relations, marketing, advocacy or competitive intelligence – where understanding a changing external landscape is mission critical.
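The "as information changes, so do a user's results" behavior comes from computing over a moving window of the stream rather than a static corpus. The sketch below (a simple sliding-window counter; the class and entity names are ours, not the product's) shows how trending entities shift as new mentions arrive and old ones age out.

```python
# Illustrative sketch: trending entities over a streaming feed. Mentions
# outside a fixed-size sliding window age out, so the "trending" answer
# changes as the stream moves on.

from collections import Counter, deque

class TrendingEntities:
    def __init__(self, window=4):
        self.window = deque(maxlen=window)  # oldest mention falls out when full
        self.counts = Counter()

    def observe(self, entity):
        if len(self.window) == self.window.maxlen:
            self.counts[self.window[0]] -= 1  # decrement the mention about to expire
        self.window.append(entity)
        self.counts[entity] += 1

    def top(self, n=1):
        return [e for e, c in self.counts.most_common(n) if c > 0]

t = TrendingEntities(window=4)
for e in ["acme", "acme", "globex", "acme"]:
    t.observe(e)
print(t.top())  # ['acme']
for e in ["globex", "globex"]:
    t.observe(e)
print(t.top())  # ['globex'] -- older acme mentions have aged out
```

A production system would window by time rather than by count, but the effect is the same: search results track the live feed instead of a snapshot.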
Which industries do you think will be disrupted by deep learning in the future?
Deep learning has the potential to impact industries across the board. Any industry that relies on the swift, accurate, contextual understanding of its information assets can capitalize on deep learning approaches. The question will be how, and how quickly. What we’re seeing right now is that organizations are at varied points on the data needs continuum. Some customers need and want a completely new deep learning-based platform. Some want apps that leverage the best of what deep learning has to offer. And some, like our customers in the intelligence space, want a better knowledge discovery service so that their developers can build applications behind their firewalls. In each case, the need for more sophisticated, more dimensionalized approaches is there. And, as the world creates more and more complex data, that’s not something that’s going away.
The Deep Learning Summit is taking place in Boston on 26-27 May. Places are limited, so register today to avoid disappointment.
Synthos Technologies are sponsoring the Deep Learning Summit. Join the conversation with the event hashtag #reworkDL