Stability AI Launches StableCode for Enhanced Code Generation

Stability AI is best known for its Stable Diffusion text-to-image generation model, but the generative AI startup's ambitions extend beyond that domain. It is now expanding into code generation as well.

Stability AI has recently introduced StableCode, its first open large language model (LLM) designed to help users generate programming code. StableCode comes in three variants (a brief loading sketch follows the list):

  1. A base model for general use
  2. An instruction-tuned model
  3. A long-context-window model (accommodating up to 16,000 tokens)
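
For readers who want to try the models, here is a minimal sketch of loading one of these checkpoints with the Hugging Face transformers library. The model ID below is an assumption for illustration; consult Stability AI's Hugging Face organization for the actual repository names.

```python
# Minimal sketch: loading a StableCode checkpoint via Hugging Face
# transformers. The model ID is assumed for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablecode-completion-alpha-3b"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # downloads several GB

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```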

StableCode starts from the programming-language data in the open-source BigCode project, which Stability AI then further filtered and optimized. StableCode supports development in Python, Java, Go, C++, and Markdown. Christian Laforte, head of research at Stability AI, said:

“What we would like to do with this kind of model is to do a similar thing as we did for Stable Diffusion, which helped everyone in the world to become an artist. We’d like to do the same thing with the StableCode model: basically allow anyone that has good ideas [and] maybe has a problem, to be able to write a program that would just fix that problem.”

StableCode: Built on BigCode and big ideas:

Like all large language models (LLMs), StableCode needs data to train on, and it obtained that data from the BigCode project. Using BigCode as the foundation for a generative AI coding tool is not a novel concept: in May, Hugging Face and ServiceNow introduced the open StarCoder LLM, also built on BigCode. Nathan Cooper, lead research scientist at Stability AI, said:

“We love BigCode, they do amazing work around data governance, model governance and model training. We took their datasets and we applied additional filters for quality and also for constructing the large-context-window version of the model, and then we trained it on our cluster.”
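
The exact filters Stability AI applied are not public, but a speculative sketch of the kind of quality heuristics commonly applied to code corpora looks like this; the thresholds below are illustrative assumptions, not Stability AI's actual pipeline.

```python
# Speculative sketch of code-corpus quality filtering; thresholds are
# illustrative assumptions, not Stability AI's actual criteria.
def passes_quality_filter(source: str) -> bool:
    lines = source.splitlines()
    if not lines:
        return False
    max_line = max(len(line) for line in lines)
    avg_line = sum(len(line) for line in lines) / len(lines)
    alnum_frac = sum(ch.isalnum() for ch in source) / max(len(source), 1)
    # Drop minified/generated files (very long lines) and binary-like
    # blobs (low fraction of alphanumeric characters).
    return max_line <= 1000 and avg_line <= 100 and alnum_frac >= 0.25

samples = ["def add(a, b):\n    return a + b\n", "x" * 5000]
print([passes_quality_filter(s) for s in samples])  # [True, False]
```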

StableCode’s longer context length is a game-changer for code generation:

StableCode’s extended version has a context window of 16,000 tokens, which Stability AI says is larger than that of any other model. According to Cooper, the longer context window allows the use of more specialized prompts for code generation. Cooper said:

“You can use this longer context window to let the model know more about your code base, and what other functions are defined in other files. So that when it does suggest code, it can be more tailor-made to your code base and to your needs.”
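
To make that concrete, here is a minimal sketch of how a tool might exploit a 16,000-token window by concatenating several project files into one prompt, so completions can see functions defined in other files. The characters-per-token heuristic and file contents are illustrative assumptions, not StableCode specifics.

```python
# Minimal sketch: packing a multi-file code base into one large prompt.
# The token budget heuristic below is a rough assumption for illustration.
MAX_TOKENS = 16_000
CHARS_PER_TOKEN = 4  # rough rule of thumb for code

def build_prompt(files: dict[str, str], task: str) -> str:
    budget = MAX_TOKENS * CHARS_PER_TOKEN
    context = "\n\n".join(f"# file: {name}\n{body}" for name, body in files.items())
    # If the combined context overflows the window, keep the most recent code.
    return f"{context[-budget:]}\n\n# task: {task}\n"

prompt = build_prompt(
    {
        "utils.py": "def slugify(title: str) -> str:\n    return title.lower().replace(' ', '-')",
        "models.py": "class Post:\n    def __init__(self, title):\n        self.title = title",
    },
    "add a Post.url property that uses slugify",
)
print(prompt)
```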

Roping in better code generation with rotary position embedding (RoPE):

StableCode, like other generative AI models, relies on the transformer neural network architecture. Where StarCoder encodes token positions with ALiBi (Attention with Linear Biases), StableCode uses a different technique called rotary position embedding (RoPE). According to Cooper, the ALiBi approach weighs current tokens more heavily than preceding ones, which is less optimal for code. He said:

“I don’t think that coding lends itself to this idea of weighing the present more important than the past, so we use … RoPE, [which] does not have this sort of bias where you’re weighing the present more than the past.”
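
For the curious, here is a minimal NumPy sketch of the published RoPE scheme, which rotates each pair of query/key features by an angle proportional to the token's position. It illustrates the technique Cooper names, not StableCode's actual implementation.

```python
# Minimal NumPy sketch of rotary position embedding (RoPE). Illustrative
# of the published technique, not StableCode's internal code.
import numpy as np

def rope(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply RoPE to x of shape (seq_len, dim).

    Each feature pair (2i, 2i+1) is rotated by an angle that grows with
    the token's absolute position, so attention scores between rotated
    vectors depend only on relative offsets between tokens.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "feature dimension must be even"
    # Per-pair rotation frequencies: theta_i = base^(-2i/dim)
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)     # (dim/2,)
    angles = np.outer(np.arange(seq_len), inv_freq)      # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x_even * cos - x_odd * sin
    out[:, 1::2] = x_even * sin + x_odd * cos
    return out

# Queries and keys are rotated before attention is computed.
q = rope(np.random.randn(8, 64))
k = rope(np.random.randn(8, 64))
scores = q @ k.T
```

Because the score between two rotated vectors depends only on their relative offset, a function defined far earlier in a file is not systematically down-weighted, which matches Cooper's reasoning about code.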
