Less than two years after the release of ChatGPT, enterprises are showing great interest in using generative AI in their operations and products. A new survey by Dataiku and Cognizant of 200 senior analytics and IT leaders worldwide found that nearly all organizations are investing significant sums to explore generative AI use cases or have already put them into production.
However, the path to full implementation and full productivity is not without obstacles, and these challenges create opportunities for firms providing generative AI services.
Significant investment in generative AI
Survey results announced today at VB Transform highlight the significant financial commitments behind generative AI initiatives. Nearly three-quarters (73%) of respondents plan to spend more than $500,000 on generative AI in the next 12 months, with almost half (46%) committing more than $1 million.
However, only a third of surveyed organizations have a specific budget allocated to generative AI initiatives. More than half fund their generative AI projects from other sources, including IT, data science, or analytics budgets.
It is unclear how pouring money into generative AI affects the departments that might otherwise have benefited from those budgets, and the return on investment (ROI) of this spending remains uncertain. However, there is optimism that the value delivered will ultimately justify the cost, as progress in large language models (LLMs) and other generative models shows no signs of slowing down.
“As more LLM use cases and applications emerge across enterprises, IT teams need a way to easily monitor both performance and costs to get the most out of their investments and identify problematic usage patterns before they significantly impact the bottom line,” the report reads in part.
A previous Dataiku study showed that enterprises are exploring a wide range of applications, from improving customer support to streamlining internal operations such as software development and data analysis.
Ongoing challenges in implementing generative AI
Despite the enthusiasm around generative AI, integration is easier said than done. Most survey respondents reported infrastructure barriers that keep them from using LLMs the way they would like. They also face other challenges, including compliance with regional regulations such as the EU AI Act, as well as internal policy hurdles.
The operational costs of generative models also remain a barrier. Hosted LLM services such as Microsoft Azure ML, Amazon Bedrock, and the OpenAI API remain popular choices for exploring and productionizing generative AI in organizations. These services are easy to use and abstract away the technical challenges of setting up GPU clusters and inference engines. However, their token-based pricing model also makes it difficult for CIOs to manage the costs of large-scale generative AI projects.
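To illustrate why token-based pricing is hard to budget for, here is a minimal back-of-the-envelope sketch in Python. The per-token rates, request volumes, and token counts are hypothetical assumptions for illustration, not actual vendor pricing.

```python
# Minimal sketch: estimating monthly spend under token-based pricing.
# All prices and volumes below are hypothetical assumptions, not vendor quotes.

def monthly_cost(requests_per_day: int,
                 input_tokens_per_request: int,
                 output_tokens_per_request: int,
                 price_per_1k_input: float,
                 price_per_1k_output: float,
                 days: int = 30) -> float:
    """Return the estimated monthly cost in dollars."""
    daily = (requests_per_day * input_tokens_per_request / 1000 * price_per_1k_input
             + requests_per_day * output_tokens_per_request / 1000 * price_per_1k_output)
    return daily * days

# Example: a support assistant handling 50,000 requests a day.
estimate = monthly_cost(
    requests_per_day=50_000,
    input_tokens_per_request=1_500,   # prompt plus retrieved context
    output_tokens_per_request=300,    # generated reply
    price_per_1k_input=0.01,          # assumed rate per 1K input tokens
    price_per_1k_output=0.03,         # assumed rate per 1K output tokens
)
print(f"Estimated monthly spend: ${estimate:,.0f}")
```

Under these assumed numbers, a single application already runs to roughly $36,000 a month, and modest changes in prompt length, output length, or traffic swing the total widely, which is exactly what makes forecasting and chargeback difficult.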
Alternatively, organizations can use self-hosted, open LLMs that can meet the needs of enterprise applications and significantly reduce costs. However, these models require upfront investment and in-house technical talent that many organizations lack.
Tech stack complexity makes generative AI adoption even harder. A staggering 60% of respondents reported using more than five tools or software components at every stage of the analytics and AI lifecycle, from data ingestion to MLOps and LLMOps.
Data challenges
The advent of generative AI has not eliminated pre-existing data challenges in machine learning projects. In fact, data quality and usability remain the biggest data infrastructure challenges IT leaders face, with 45% citing them as their top concern. Next comes data access issues, cited by 27% of respondents.
Most organizations are sitting on large stores of data, but their data infrastructure was built before the era of generative AI and without machine learning in mind. Data often lives in disconnected silos and is stored in incompatible formats. It must be preprocessed, cleansed, anonymized, and consolidated before it can be used for machine learning. Data engineering and data ownership remain significant challenges for most machine learning and AI projects.
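As a rough illustration of what that preparation involves, here is a minimal sketch that consolidates two hypothetical silos, cleans the result, and strips direct identifiers. The file names and columns are assumptions for illustration, not details from the report.

```python
# Minimal sketch of the preparation steps described above, assuming two
# hypothetical silos (a CRM export and a billing export) in different formats.
import hashlib
import pandas as pd

crm = pd.read_csv("crm_export.csv", parse_dates=["signup_date"])   # assumed file
billing = pd.read_json("billing_export.json")                      # assumed file

# Consolidate: align the join key and merge the silos into one table.
billing = billing.rename(columns={"cust_id": "customer_id"})
df = crm.merge(billing, on="customer_id", how="inner")

# Cleanse: drop duplicates and rows missing required fields, normalize types.
df = df.drop_duplicates(subset="customer_id")
df = df.dropna(subset=["monthly_spend"])
df["monthly_spend"] = df["monthly_spend"].astype(float)

# Anonymize: replace direct identifiers with irreversible hashes, drop PII.
df["customer_id"] = df["customer_id"].astype(str).map(
    lambda v: hashlib.sha256(v.encode()).hexdigest()[:16]
)
df = df.drop(columns=["email", "full_name"], errors="ignore")

# Persist a single, consolidated, ML-ready dataset.
df.to_parquet("training_ready.parquet")
```

In practice this work is spread across many tools, formats, and owners, which is why data engineering and data ownership remain persistent bottlenecks.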
“Even with all the tools organizations have at their disposal, people still haven’t mastered data quality (as well as usability—is it fit for purpose and does it meet user needs?),” the study says. “It’s almost ironic that the biggest challenge facing the modern data stack is… not modern at all.”
Opportunities amidst challenges
“The reality is that generative AI will continue to change and evolve, with different technologies and vendors coming and going. How can IT leaders get into the game while also remaining agile to what’s next?” said Conor Jensen, Field CDO at Dataiku. “All eyes are on whether this challenge—on top of rising costs and other risks—will dwarf generative AI’s value production.”
As generative AI moves from exploratory projects to a technology that underpins scalable operations, generative AI service providers can support enterprises and developers by offering better tools and platforms.
As the technology evolves, there will be many opportunities to simplify the technology and data stacks of generative AI projects, reducing integration complexity and helping developers focus on solving problems and delivering value.
Enterprises can also prepare for the wave of generative AI technologies even if they are not yet deploying them. By running small pilots and experimenting with new technologies, organizations can uncover pain points in their data infrastructure and policies and start preparing for the future. At the same time, they can begin building internal skills so that they have more options and are better prepared to leverage the technology’s full potential and drive innovation in their industries.