Emerging from a niche in the database market, graph technology could actually be the thing to help us make sense of all the AI we’re using to understand the world and our business in it, according to Gartner.
No longer last on the shopping list of new database trends, graph processing will grow 100 per cent annually through 2023, the IT analyst giant forecast.
Graph databases are being used to help analyse network relationships like those in social media or company ownership. But they are not about to stop there, according to Pieter den Hamer, Gartner senior research director.
“The key thing to keep in mind is that graphs are, indeed, everywhere, they are in our brains, for example,” Den Hamer said.
Speaking to the Gartner Data & Analytics Summit, he elaborated that graph technology would go on to underpin a new trend: composite AI.
“It is one of the biggest trends that we’re seeing today in AI,” Den Hamer said. “Because of this growing pervasiveness of this fundamental role of graph, we see that this will lead to composite AI, which is about the notion that graphs provide a common ground for the culmination, or if you like the composition of notable existing and new AI techniques together, they’ll go well beyond the current generation of fully data-driven machine learning.”
Roughly speaking, graph databases work by storing a thing in a node – say, a person or a company – and then describing its relationship to other nodes using an edge, to which a variety of parameters can be attached. They are not just being used by data scientists to solve business problems; they are also useful in the data science process itself, helping teams understand ontologies and augment data integration, Den Hamer said.
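That node-and-edge model can be sketched in a few lines of plain Python. This is a minimal illustration of the property-graph idea only – real graph databases (Neo4j, Amazon Neptune and the like) add indexing, query languages and persistence – and all class and property names here are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A 'thing' in the graph: a person, a company, etc."""
    label: str                              # node type, e.g. "Person"
    properties: dict = field(default_factory=dict)

@dataclass
class Edge:
    """A directed relationship between two nodes, carrying its own parameters."""
    source: Node
    target: Node
    relation: str                           # e.g. "WORKS_FOR"
    properties: dict = field(default_factory=dict)

class Graph:
    def __init__(self):
        self.edges = []

    def connect(self, source, relation, target, **props):
        """Attach an edge, with arbitrary parameters, between two nodes."""
        edge = Edge(source, target, relation, props)
        self.edges.append(edge)
        return edge

    def neighbours(self, node, relation=None):
        """Follow outgoing edges from a node, optionally filtered by relation type."""
        return [e.target for e in self.edges
                if e.source is node and (relation is None or e.relation == relation)]

# Build a tiny employment graph: a person node, a company node, one edge
alice = Node("Person", {"name": "Alice"})
acme = Node("Company", {"name": "Acme Ltd"})
g = Graph()
g.connect(alice, "WORKS_FOR", acme, since=2019)

print([n.properties["name"] for n in g.neighbours(alice, "WORKS_FOR")])  # ['Acme Ltd']
```

The point of the model is that relationships are first-class data: the "WORKS_FOR" edge holds its own `since` parameter, and queries traverse edges rather than joining tables.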
Meanwhile, graph databases often come in handy for data scientists, data engineers and subject matter experts trying to quickly understand how the data is structured, using graph visualisation techniques to start “identifying the likely most relevant features and input variables that are needed for the prediction or the categorisation that they’re working on,” he added.
Graph may also be employed to help in feature engineering, that tricky business of figuring out what is important in the dataset. They could also be used as a basis for new types of neural networks, to help explain the outputs of AI and to uncover the business rules behind data, he said.
The status of graph has increased such that Gartner plugged it as one of the top four analytics technologies that would enable businesses to “adapt to a changing world.”
“Whether it’s building recommendation engines, detection systems or infrastructure monitoring, graph will become key to understand increasingly complex inter-relationships,” keynote presenter and Gartner research director Gareth Herschel said.
Others on Gartner’s data analytics hype list include the “data fabric”, an abstraction layer that lets the user get to the right data quickly, while also controlling security and governance, and generative adversarial networks, the technique of playing one ML model off against another to generate new data.
Herschel also backed OpenAI’s GPT-3, the ML language model that uses deep learning to produce human-like text.
“These natural language generation techniques will enable machines to tell us data stories. Instead of us becoming data literate, they are becoming human literate,” he claimed.
But there are other perspectives on GPT-3, especially when dealing with important topics apparently marginal to mainstream western culture. As one cognitive science PhD student pointed out, the language generator has produced “factually wrong and grossly racist text” on the subject of Ethiopia, for example.
The need for graph databases to help understand AI techniques, or for GPT-3 to communicate the stories within data, raises questions about whether we always need more tools to understand or manage the tools we already have. Maybe it will be turtles all the way up, as well as down. ®