Deal Dive: Human Native AI Builds Marketplace for AI Training Licensing Deals

To be accurate, AI systems and large language models need to be trained on vast amounts of data, but they shouldn't be trained on data they don't have the right to use. OpenAI's licensing deals with The Atlantic and Vox last week showed that both sides of the table are interested in striking licensing deals for AI training content.

Human Native AI is a London-based startup building a marketplace to broker such deals between the many companies building LLM projects and the rights holders willing to license their data.

Its goal is to help AI companies find data on which to train their models, while ensuring that rights holders consent and are compensated. Rights holders upload their content for free and connect with AI companies through revenue-share or subscription deals. Human Native AI also helps rights holders prepare and price their content and monitors for copyright infringements. The company takes a cut of each transaction and charges AI companies fees for its transaction and monitoring services.

James Smith, CEO and co-founder, told TechCrunch that the idea for Human Native AI came from his previous experience working at Google's DeepMind. DeepMind also ran into problems with not having enough good data to properly train its systems. Then he saw other AI companies encountering the same problem.

“We seem to be living in the Napster era of generative AI,” Smith said. “Can we move on to a better era? Can we make it easier to get content? Can we give creators some level of control and compensation? I kept thinking, why isn’t there a market?”

He pitched the idea to his friend Jack Galilee, an engineer at GRAIL, during a walk in the park with their kids, just as Smith had done with many other potential startup ideas. But unlike with past pitches, Galilee said they should go for it.

The company launched in April and is currently in beta. Smith said demand from both sides has been very encouraging, and that they have already signed several partnerships that will be announced in the near future. Human Native AI announced a £2.8 million seed round this week led by LocalGlobe and Mercuri, two U.K. micro-VCs. Smith said the company plans to use the funding to grow its team.

“I’m the CEO of a two-month-old company and I’ve been able to meet with CEOs of 160-year-old publishing companies,” Smith said. “This suggests that there is great demand from the publishing side. Similarly, every conversation with a large AI company goes exactly the same way.”

While what Human Native AI is building is still in its very early stages, it looks like a missing piece of infrastructure in the burgeoning AI industry. The big AI players need vast amounts of data for training, and giving rights holders an easier way to work with them, while keeping full control over how their content is used, looks like an approach that could make both sides of the table happy.

“Sony Music just sent letters to 700 AI companies asking them to stop using its content,” Smith said. “That is the size of the market and the potential customers looking to acquire data. The number of publishers and rights holders may number in the thousands, if not tens of thousands. We think that’s why there is a need for infrastructure.”

It could also be particularly useful for smaller AI companies that don't necessarily have the resources to sign a deal with Vox or The Atlantic but still need access to training data. Smith said they're hoping for that too, though all of the significant licensing deals so far have involved larger AI players. He hopes Human Native AI will help level the playing field.

“One of the main challenges with content licensing is that it involves high up-front costs, which significantly limits the number of people you can work with,” Smith said. “How do we increase the number of buyers for your content and lower the barriers to entry? We think that’s really exciting.”

Another interesting element is the future potential of the data Human Native AI collects. Smith said the company could eventually offer rights holders greater clarity on how to price their content, based on historical transaction data from the platform.

It is also a good time to launch Human Native AI. Smith said that as the EU's AI Act evolves and potential AI regulation takes shape in the U.S., it will become even more urgent for AI companies to source their data ethically, and to have the receipts to prove it.

“We are optimistic about the future of AI and its impacts, but we need to make sure that we as an industry are responsible and don’t decimate the industries that got us to this point,” Smith said. “It wouldn’t be good for human society. We need to make sure we find the right ways to enable people to participate. We are optimistic on the human side of AI.”
