Highlights from Jelani Harper's insideBIGDATA article 'Knowledge Graphs 2.0: High Performance Computing Emerges' featuring Keshav Pingali, CEO of Katana Graph.
There is a need for high performance graph computing…in two ways. One is the volume of data, and the other is time to insight.
Enterprises' growing dependence on knowledge graphs parallels their dependence on artificial intelligence. Businesses need to evaluate massive amounts of data to extract the insights behind critical decisions, and computing on data at that scale takes time, energy, and resources. To get the most out of knowledge graphs, high performance computing is necessary, and scaling is a requirement for high performance computing.
“A scale out solution is essential in some verticals…in fintech, security, identity,” Pingali reflected. “We’re talking about very big graphs, very big topologies, in some cases maybe a trillion edges. And also, lots of property data on nodes and edges.”
The amount of unstructured data enterprises need to process from the cloud, social media, and IoT is expanding exponentially. This onslaught of data will demand investment in knowledge graphs and will shape the future of the graph computing ecosystem. Enterprises that adopt knowledge graphs will have a competitive edge.
Pingali points to a startling observation: “more than half of the world’s data was created in the last two years, but less than 2 percent of it has been analyzed. Some of this data is of course structured data…but a lot of that data is also unstructured and can be viewed usefully as graphs and processed usefully with graph algorithms.”
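To make that point concrete, here is a minimal sketch of what "viewing data as a graph and processing it with graph algorithms" can look like. It is not from the article and does not use Katana Graph's platform; it assumes the open-source NetworkX library and invented entities purely for illustration.

```python
# Illustrative sketch only: model a few extracted facts as a property graph
# and rank entities with a classic graph algorithm (PageRank).
# The library (networkx) and the data are assumptions, not the article's code.
import networkx as nx

g = nx.DiGraph()

# Nodes and edges carry arbitrary property data, as in a property graph.
g.add_node("AcmeCorp", kind="company")
g.add_node("Jane Doe", kind="person")
g.add_node("Widget", kind="product")
g.add_edge("Jane Doe", "AcmeCorp", relation="works_for")
g.add_edge("AcmeCorp", "Widget", relation="manufactures")
g.add_edge("Jane Doe", "Widget", relation="reviewed")

# A standard graph algorithm surfaces the most "central" entities.
scores = nx.pagerank(g)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```

At enterprise scale the same idea applies to graphs with billions or trillions of edges, which is exactly where the high performance, scale-out computing Pingali describes becomes necessary.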
“That is where real intelligence comes in, to figure out what might happen in the future and mitigate any bad things that might happen and ensure you can exploit all the good things that might happen,” Pingali predicted. “This is going to require using lots and lots of knowledge graphs, as well as AI. Knowledge graphs and AI are really made for each other…in a platform where you can quickly spin up those kinds of applications and exploit the enormous amount of data that we all have.”
Speed in detecting abnormalities in data is critical for information-reliant organizations such as financial institutions, medical and pharmaceutical companies, and governments. Pingali notes specific risk scenarios, including “intrusion detection, fraud detection, and anti-money laundering.”
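As a toy illustration of the anti-money-laundering case (an assumption on my part, not the method described in the article), circular money flows can be flagged by searching a transaction graph for cycles:

```python
# Toy sketch: flag circular money flows in a transaction graph.
# The data and the cycle-based heuristic are illustrative assumptions.
import networkx as nx

transfers = [
    ("acct_A", "acct_B", 9_500),
    ("acct_B", "acct_C", 9_400),
    ("acct_C", "acct_A", 9_300),   # closes a suspicious loop
    ("acct_D", "acct_E", 120),
]

g = nx.DiGraph()
for src, dst, amount in transfers:
    g.add_edge(src, dst, amount=amount)

# Chains of transfers that return to their origin are a common AML red flag.
for cycle in nx.simple_cycles(g):
    print("Possible round-tripping:", " -> ".join(cycle + cycle[:1]))
```

Finding such patterns quickly across billions of transactions is one reason these industries need high performance graph computing rather than batch analysis after the fact.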
To read the full article, please use this link: Knowledge Graphs 2.0: High Performance Computing Emerges