
Vellum.ai has raised $5 million as demand for generative AI services surges

Vellum.ai announced this morning that it has closed a $5 million seed round, raised to further its goal of helping companies bring LLM use cases into production. The startup did not say who led the round, but it confirmed that Rebel Fund, Eastlink Capital, Pioneer Fund, Y Combinator, and a number of other investors participated.

The Need for Vellum:

According to Akash Sharma, CEO and co-founder of Vellum, the firm now has 40 paying clients, and its revenue is growing 25% to 30% every month. Sharma and his co-founders, Noa Flaherty and Sidd Seethepalli, first worked with GPT-3 when its beta launched in 2020, while they were at Dover, another Y Combinator company from the 2019 batch. At Dover they built generative AI features to produce recruiting emails, job descriptions, and the like, but they found they were spending too much time on their prompts and had no way to assess their quality. They also ended up building their own tools for fine-tuning and semantic search. The sheer volume of manual work, Sharma explained, kept piling up.

Modern LLMs and Their Requirements:

It is not unusual for a new market to open up opportunities to build tools. But today's LLMs may not only transform the AI market itself; they may also expand it. Sharma said, "It was never possible to use natural language [prompts] to get results from an AI model until recently released LLMs. The shift to accepting natural language inputs expands the [AI] market significantly. Because a product manager or a software engineer, literally anyone can be a prompt engineer." A rapid increase in demand tends to track market size, so it's safe to infer there is significant business demand for LLMs. "It is simple to spin up an LLM-powered prototype and launch it," Sharma explained. "But when companies end up taking something like that to production, they realize that there are many edge cases that come up, which tend to provide weird results." So if businesses want their LLM-powered products to hold up, they must do more than put a thin skin over raw GPT query outputs.

Vellum declined to disclose its pricing structure but said its services cost in the three to four figures per month. Combined with its 40 paying clients, that gives Vellum a reasonably respectable run rate for a seed-stage firm. The platform lets prompt engineers compare model outputs side by side and helps search company-specific data to add context to specific prompts. It also offers tools for testing and version control, to help ensure that prompts are returning the right results.
