Most companies today are experimenting with large language models (LLMs) and generative AI, and they tend to fall into two categories, according to Citi Ventures' Carbonara.
The first: more conservative companies that look at the technology in a centralized way, creating centers of excellence and developing policies on how they want to experiment.
The second: organizations that could be under threat if they don't start pushing the envelope with generative AI as soon as possible. That goes for the customer service space, "where it's clearly going to be transformative in a massive way."
Addressing the audience in a fireside chat at this week’s VentureBeat Transform 2023, Carbonara said, “Right now the shift that everyone is going through in both big business and startups is, ‘Okay, how does this new technology affect me? What’s my strategy here? What’s my moat? How can I use it to my advantage? Does it threaten me?'”
From simple bots to “hyperautomation”
Even in this era of gen AI and widespread experimentation, automation remains an important area in which companies are investing heavily, Carbonara pointed out.
Clearly, automation is being used in many different ways, he said. Citi Ventures looks at it as using software to automate different processes in a large enterprise: transaction processing, data processing, customer experience, customer onboarding.
He described automation as going through three stages. The first is what he called "RPA 1.0," the initial ability of software robots to manipulate digital systems. The second was intelligent process automation, which adds a layer of intelligence to that process.
We are now in a stage of "hyperautomation," he said, which involves running more complex tasks across multiple systems using multiple technologies. One example: applying optical character recognition (OCR) to figure out what a document is and natural language processing (NLP) to put it into context, then feeding the result into an algorithm so the data can be used to make decisions.
“So it went from sort of a single bot, to intelligence, to multiple intelligence with sort of a meta layer of orchestration and control on top of it,” Carbonara said.
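The OCR-to-NLP-to-decision flow Carbonara describes can be sketched as a minimal orchestration layer chaining independent stages. The stage functions below are hypothetical stand-ins (simple stubs), not real OCR or NLP implementations; a production system would call actual engines at each step.

```python
from dataclasses import dataclass

@dataclass
class Document:
    raw_bytes: bytes

def ocr_extract(doc: Document) -> str:
    # Stub: a real pipeline would invoke an OCR engine here.
    return doc.raw_bytes.decode("utf-8")

def nlp_classify(text: str) -> str:
    # Stub: a keyword rule standing in for an NLP document classifier.
    return "invoice" if "invoice" in text.lower() else "other"

def decide(doc_type: str) -> str:
    # Route the document based on the extracted context.
    return "send_to_payments" if doc_type == "invoice" else "send_to_review"

def hyperautomation_pipeline(doc: Document) -> str:
    # The "meta layer of orchestration": each stage feeds the next.
    return decide(nlp_classify(ocr_extract(doc)))

print(hyperautomation_pipeline(Document(b"Invoice #42: $500 due")))
```

The point of the sketch is the orchestration shape, a single controller coordinating multiple specialized technologies, rather than any one stage's implementation.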
Arriving at a “golden dataset”
Today, the biggest challenge large enterprises face when it comes to automation is data quality, Carbonara said: acquiring high-quality data and creating a "golden dataset" to make informed, strategic decisions.
“Whether it’s the most advanced LLM or a very simple model, if you don’t have quality data, then you have a challenge in actually getting a good result,” said Carbonara.
Another bottleneck is the integration of cutting-edge technologies into legacy systems. Organizations need to determine whether such systems will scale and whether they will be able to handle the demands placed on them. And, particularly in regulated industries, there needs to be a level of auditability, controls and governance.
Data quality, governance key
Looking to the future, he predicted that all big businesses will have AI agents of some sort performing different tasks. These could be thought of as autonomous agents that interact with each other (for example, a software-building agent interacting with a security agent on an identified vulnerability).
These agents will have access to some data stores, he said, so organizations will need to figure out how to build governance around that. How can agents access the data, and what can they do with it? Can they just read it, or can they also write and update it?
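The read versus read-write question could be modeled as a simple access-policy table consulted before any agent touches a data store. This is only an illustrative sketch; the agent and store names below are hypothetical, not anything described in the article.

```python
from enum import Enum, auto

class Access(Enum):
    READ = auto()
    READ_WRITE = auto()

# Hypothetical policy table: which agent may touch which data store, and how.
POLICY = {
    ("security_agent", "vulnerability_db"): Access.READ_WRITE,
    ("build_agent", "vulnerability_db"): Access.READ,
}

def can_read(agent: str, store: str) -> bool:
    # Any listed grant implies at least read access.
    return (agent, store) in POLICY

def can_write(agent: str, store: str) -> bool:
    # Writes require an explicit READ_WRITE grant.
    return POLICY.get((agent, store)) is Access.READ_WRITE

print(can_read("build_agent", "vulnerability_db"))   # True
print(can_write("build_agent", "vulnerability_db"))  # False
```

Centralizing grants in one table like this is what makes the access auditable, which matters in the regulated industries Carbonara mentions.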
“I think there are a lot of interesting questions here for large enterprises about how to get the data quality and data governance in place to enable these capabilities that these autonomous agents will realize,” he said.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.