Exclusive: FriendliAI collects $20M seed extension to build out AI inference platform

FriendliAI, an AI inference platform startup, has raised $20 million in a seed extension round, the company told Crunchbase News exclusively.


According to FriendliAI, AI inference is "the last mile" where users actually engage with artificial intelligence. When you ask a chatbot a question and get a response generated by an AI model, that process is called inference.

AI inference has become critically important, with an estimated 80% to 90% of GPU compute devoted to inference and only the remainder used for training, according to FriendliAI CEO Byung-Gon "Gon" Chun.

Byung-Gon Chun

"The AI inference market is exploding as more organizations move from AI experimentation to production deployment," he said.

Chun founded FriendliAI in 2021 as a spinout of Seoul National University, where he has been conducting research on accelerating AI model performance for over a decade. The company began focusing on AI inference in 2022, even before the release of OpenAI's ChatGPT. The core product and engineering team also comes from Chun's research group at Seoul National University.

At the end of 2023, FriendliAI moved its headquarters to Redwood City, California, said Chun, who was previously an AI researcher at Microsoft and Facebook.

In short, the startup aims to help companies deploy AI models faster, cheaper and more simply. Speed matters for LLMs for many reasons: fast inference can translate into lower operating costs, especially in cloud-based environments where compute resources are billed by usage.

"Instead of spending huge amounts of money and time building, optimizing and maintaining complex infrastructure for AI serving, they can use our optimized GPU platform to deploy and scale AI inference," Chun said.

By the numbers

Capstone Partners led FriendliAI's raise, which also included participation from new backers Sierra Ventures, Graduates, KDB Investment and KB Securities.

The startup raised a $6 million seed round at the end of 2021, also led by Capstone. While the company declined to disclose its valuation, Chun noted that it was up compared to FriendliAI's last raise.

FriendliAI's technology has gained significant traction, delivering GPU cost savings of up to 90% while providing "the fastest LLM serving performance on the market."

While declining to disclose hard revenue figures, Chun said FriendliAI recorded rapid growth in both usage and revenue in 2025, driven by accelerating adoption of generative AI in production. He expects revenue to be 6x to 7x higher than in 2024.

"Although we are not yet profitable, our priority has been efficient scaling, ensuring that gross margins remain strong even as we expand capacity," he told Crunchbase News in an interview.

Diverse customer base

FriendliAI works with AI-driven companies ranging from startups to large enterprises. For example, it recently established a partnership with LG Electronics. The company is also the exclusive API provider for Exaone, LG AI Research's foundation model.

The company has about 25 to 30 large customers. One of them is Scatter Lab, a chatbot company in Korea, which, according to Chun, achieved a significant reduction in infrastructure costs by using the FriendliAI platform to run multiple LLMs.

The startup's revenue model is usage-based pricing for inference. Usage is measured in two ways: by GPU hours consumed on its platform, or by the number of tokens processed or image-generation steps performed.
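The two metering modes described above can be sketched as a simple billing formula. This is a minimal illustration only: the function name and all rates below are hypothetical placeholders, not FriendliAI's published pricing.

```python
def usage_bill(gpu_hours=0.0, tokens=0, image_steps=0,
               usd_per_gpu_hour=3.50,          # placeholder rate
               usd_per_million_tokens=0.60,    # placeholder rate
               usd_per_image_step=0.001):      # placeholder rate
    """Hypothetical usage-based bill combining the two metering modes
    the article mentions: GPU hours consumed, or tokens processed /
    image-generation steps performed."""
    return (gpu_hours * usd_per_gpu_hour
            + (tokens / 1_000_000) * usd_per_million_tokens
            + image_steps * usd_per_image_step)

# A dedicated-GPU customer might be metered by hours:
dedicated = usage_bill(gpu_hours=10)
# A serverless customer might be metered by tokens:
serverless = usage_bill(tokens=2_000_000)
```

The point of the sketch is simply that billing scales with consumption rather than with a fixed license fee.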

Customers can choose from three product options, depending on their needs: dedicated endpoints, which allocate GPUs exclusively to that customer; serverless endpoints, which provide API access to popular AI models; or containers, which run directly within the customer's own infrastructure.

Chun said the company invented "continuous batching," a pioneering technique in the field of LLM serving (think ChatGPT).

"LLM inference traffic is very dynamic and uneven. Requests arrive at irregular times, and they don't all take the same amount of time to complete. That's why the traditional method is not enough," he told Crunchbase News. "We came up with continuous batching to solve this problem. With continuous batching, you can dynamically add new requests to the batch, or remove completed ones from it, at a fine-grained level, keeping the batch efficiently utilized."

He added that this allows the system to maintain high GPU utilization even when traffic is dynamic and uneven.
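The scheduling idea Chun describes can be illustrated with a toy simulation. This is a sketch of the general continuous-batching concept, not FriendliAI's engine: requests are modeled only by how many decode steps they need, and after every step the scheduler evicts finished requests and admits queued ones, instead of waiting for the whole batch to drain as static batching would.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Request:
    rid: int         # request id
    steps_left: int  # decode steps (tokens) this request still needs

def continuous_batching(requests, max_batch_size):
    """Iteration-level scheduling: after each decode step, completed
    requests leave the batch and queued requests join immediately."""
    queue = deque(requests)
    batch, finished_order, iterations = [], [], 0
    while queue or batch:
        # Admit queued requests up to capacity before every iteration.
        while queue and len(batch) < max_batch_size:
            batch.append(queue.popleft())
        # One decode step for every request currently in the batch.
        for req in batch:
            req.steps_left -= 1
        iterations += 1
        # Evict finished requests at per-iteration granularity.
        finished_order.extend(r.rid for r in batch if r.steps_left == 0)
        batch = [r for r in batch if r.steps_left > 0]
    return finished_order, iterations

# Four requests of uneven length, GPU batch capacity of 2.
reqs = [Request(0, 5), Request(1, 1), Request(2, 3), Request(3, 2)]
order, iters = continuous_batching(reqs, max_batch_size=2)
```

With these inputs the short request (id 1) finishes after one step and its slot is immediately reused, so all four requests complete in 6 iterations; a static scheduler that waits for each full batch to finish would need 8. That slot reuse is what keeps GPU utilization high under uneven traffic.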

Proprietary inference engine

In Chun's view, FriendliAI stands out in an increasingly competitive landscape because it specializes in AI inference. One of its competitors, for example, is Fireworks AI.

The FriendliAI platform is powered by a proprietary inference engine that applies deep algorithmic and system-level optimizations on GPUs, which he says enables models to run at much lower cost and higher speed.

"We offer the widest model coverage in the industry," Chun told Crunchbase News, noting that FriendliAI supports both open-source and custom models, with more than 420,000 models directly deployable from Hugging Face.

According to Eun-Ggan Song, a partner at Capstone Partners, FriendliAI has shown "exceptional technical innovations" in the AI inference space.

"Their platform's ability to deliver excellent performance while reducing costs makes them an ideal partner for scaling AI operations," he said in a written statement. "We are excited to lead this round."
