The CEOs of Dropbox and Figma are backing Lamini, a startup building a generative AI platform for enterprises

Lamini, a Palo Alto-based startup building a platform to help enterprises deploy generative AI technology, has raised $25 million from investors including Stanford computer science professor Andrew Ng.

Lamini, founded a few years ago by Sharon Zhou and Greg Diamos, has an interesting sales pitch.

Zhou and Diamos argue that many generative AI platforms are too general-purpose and do not have solutions or infrastructure tailored to corporate needs. Lamini, on the other hand, was built from the ground up with enterprises in mind and focuses on ensuring the high accuracy and scalability of generative AI.

“The top priority of almost every CEO, CIO, and CTO is leveraging generative AI in the organization for maximum ROI,” Zhou, CEO of Lamini, told TechCrunch. “But while it’s easy to get a working demo on a laptop for an individual developer, the path to production is littered with failures at every turn.”

According to Zhou, many companies have expressed frustration with the hurdles to meaningfully adopting generative AI across their business functions.

A March poll from MIT Insights found that only 9% of organizations have widely adopted generative AI, though 75% have experimented with it. The top obstacles include a lack of IT infrastructure and capabilities, weak governance structures, insufficient skills and high implementation costs. Security is a major factor, too: in a recent survey from Insight Enterprises, 38% of companies said security affects their ability to leverage generative AI technology.

So what is Lamini’s answer?

Zhou says “every piece” of Lamini’s technology stack has been optimized for enterprise-scale generative AI workloads, from hardware to software, including the engines used to support model orchestration, fine-tuning, running and training. “Optimized” is a vague word, yes, but Lamini is pioneering one step that Zhou calls “memory tuning,” a technique to train a model on data such that it recalls parts of that data exactly.

Memory tuning could potentially cut down on hallucinations, Zhou says, or instances where a model makes up facts in response to a request.

“Memory tuning is a training paradigm that is as efficient as fine-tuning, but goes beyond it: training a model on proprietary data that includes key facts, figures and statistics, resulting in a highly accurate model,” Nina Wei, an AI designer at Lamini, told me via email, “one that can memorize and recall the exact match of any key piece of information instead of generalizing or hallucinating.”

I’m not sure I buy that. “Memory tuning” appears to be more of a marketing term than an academic one; there are no research papers on the technique, at least none that I managed to turn up. I’ll leave it to Lamini to show evidence that its “memory tuning” is better than the other hallucination-reducing techniques that are being, or have been, tried.

Fortunately for Lamini, memory tuning is not the only differentiator.

Zhou says the platform can operate in highly secure environments, including air-gapped ones. Lamini lets companies run, fine-tune and train models in a range of configurations, from on-premises data centers to public and private clouds. And it scales workloads “elastically,” reaching over 1,000 GPUs if an application or use case demands it, she says.

“Incentives are currently misaligned in the market for closed-source models,” Zhou said. “We aim to put control back in the hands of more people, not just a few, starting with the companies that care most about control and have the most to lose from their proprietary data being owned by someone else.”

For what it’s worth, Lamini’s co-founders have quite a track record in AI. They’ve also separately crossed paths with Ng, which no doubt explains his investment.

Zhou was previously a faculty member at Stanford University, where she led a group researching generative artificial intelligence. Before earning her PhD in computer science under Ng, she was a machine learning product manager at Google Cloud.

Diamos, for his part, co-founded MLCommons, an engineering consortium dedicated to creating standard benchmarks for AI models and hardware, as well as MLCommons’ benchmarking suite, MLPerf. He also led AI research at Baidu, where he collaborated with Ng while the latter was chief scientist there. Diamos was also a software architect on Nvidia’s CUDA team.

The co-founders’ industry connections appear to have given Lamini a fundraising advantage. In addition to Ng, Lamini’s investors include Figma CEO Dylan Field, Dropbox CEO Drew Houston, OpenAI co-founder Andrej Karpathy and, oddly enough, Bernard Arnault, CEO of luxury goods giant LVMH.

AMD Ventures is also an investor (somewhat ironic considering Diamos’ Nvidia roots), as are First Round Capital and Amplify Partners. AMD got involved early, supplying Lamini with data center hardware, and today Lamini runs many of its models on AMD Instinct GPUs, bucking the industry trend.

Lamini makes the lofty claim that its model training and running performance is comparable to that of equivalent Nvidia GPUs, depending on the workload. Since we’re not equipped to test that claim, we’ll leave it to third parties.

To date, Lamini has raised $25 million across seed and Series A rounds (Amplify led the Series A). Zhou says the money will be put toward tripling the company’s 10-person team, expanding its compute infrastructure and kicking off development of “deeper technical optimizations.”

There are many enterprise-focused generative AI vendors that compete with aspects of Lamini’s platform, including tech giants like Google, AWS and Microsoft (via its OpenAI partnership). Google, AWS and OpenAI in particular have been aggressively courting the enterprise in recent months, introducing features like streamlined fine-tuning, fine-tuning on private data and more.

I asked Zhou about Lamini’s customers, revenue and overall go-to-market dynamics. At this fairly early stage, she wasn’t willing to reveal much, but she did say that AMD (via the AMD Ventures tie-in), AngelList and NordicTrack are among Lamini’s early (paying) users, along with several undisclosed government agencies.

“We are growing rapidly,” she added. “Our number one challenge is serving customers. We’ve only been handling inbound demand because we’ve been swamped. Given the interest in generative AI, we’re not representative of the overall tech slowdown; unlike our peers in the AI world, we have gross margins and revenue that look more like a regular tech company’s.”

Amplify general partner Mike Dauber said: “We believe there is a huge opportunity for generative AI in the enterprise. While there are many AI infrastructure companies out there, Lamini is the first I see that takes enterprise problems seriously and creates a solution that helps enterprises unlock the enormous value of their private data while meeting even the most stringent compliance and security requirements.”
