Stability AI launches StableLM, open-source alternatives to ChatGPT

The large-language-model sector continues to swell, as Stability AI, maker of the popular image-generation tool Stable Diffusion, has launched a suite of open-source language-model tools.

Dubbed StableLM, the publicly available alpha versions of the suite currently contain models featuring 3 billion and 7 billion parameters, with 15-billion-, 30-billion- and 65-billion-parameter models noted as “in progress,” and a 175-billion-parameter model planned for future development.

By comparison, OpenAI’s GPT-4 has a parameter count estimated at 1 trillion, roughly six times that of its predecessor, GPT-3, at 175 billion.

Parameter count alone may not be a good measure of large-language-model (LLM) efficacy, however, as Stability AI noted in the blog post announcing StableLM’s launch:

“StableLM is trained on a new experimental dataset built on The Pile, but three times larger with 1.5 trillion tokens of content. […] The richness of this dataset gives StableLM surprisingly high performance in conversational and coding tasks, despite its small size of 3 to 7 billion parameters.”

It’s unclear at this time exactly how robust the StableLM models are. The Stability AI team noted on the organization’s GitHub page that more information about the LLMs’ capabilities would be forthcoming, including model specifications and training settings.

Related: Microsoft is developing its own AI chip to power ChatGPT

Provided the models perform well enough in testing, the arrival of a powerful, open-source alternative to OpenAI’s ChatGPT could prove interesting for the cryptocurrency trading world.

As Cointelegraph reported, people are building advanced trading bots on top of the GPT API and new variants that incorporate third-party-tool access, such as BabyAGI and AutoGPT.
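The basic pattern behind such bots is simple: send a market summary to a language model and parse its free-form reply into a trade action. The sketch below illustrates that loop with a stubbed-out model call; `query_llm` is a hypothetical placeholder, and a real bot would call the OpenAI API or a self-hosted StableLM model there instead.

```python
# Minimal sketch of the LLM-driven trading-bot pattern. `query_llm` is a
# hypothetical stub; a real bot would send the prompt to an LLM API here.
import re

def query_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call a hosted or local model.
    return "Signal: HOLD. Volatility is elevated and there is no clear trend."

def parse_signal(reply: str) -> str:
    """Extract a buy/sell/hold decision from free-form model output."""
    match = re.search(r"\b(buy|sell|hold)\b", reply, re.IGNORECASE)
    # Default to "hold" (no action) when the model's answer is ambiguous.
    return match.group(1).lower() if match else "hold"

prompt = ("BTC is up 4% in 24 hours on rising volume. "
          "Answer with Signal: BUY, SELL, or HOLD, then a one-line rationale.")
print(parse_signal(query_llm(prompt)))  # prints "hold" with the stub above
```

The defensive parsing step matters in practice: LLM replies are unstructured text, so a bot must normalize them before wiring them to real orders.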

The addition of open-source models into the mix could be a boon for tech-savvy traders who don’t want to pay OpenAI’s access premiums.

Those interested can test a live interface for the 7-billion-parameter StableLM model hosted on Hugging Face. As of this article’s publishing, however, the demo was frequently overwhelmed and at capacity.
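For readers who would rather experiment locally than wait on the hosted demo, the tuned StableLM alpha checkpoints use a chat format built from special tokens, per Stability AI's StableLM repository. The sketch below only assembles that prompt string; the token names and the model ID mentioned in the comment should be verified against the current model card, as the alpha release may change.

```python
# Sketch of the <|SYSTEM|>/<|USER|>/<|ASSISTANT|> chat format documented for
# the tuned StableLM alpha checkpoints (per Stability AI's StableLM repo).
SYSTEM_PROMPT = (
    "<|SYSTEM|># StableLM Tuned (Alpha version)\n"
    "- StableLM is a helpful and harmless open-source AI language model.\n"
)

def build_prompt(user_message: str) -> str:
    """Wrap one user turn in StableLM's chat tokens.

    The model generates its reply after the trailing <|ASSISTANT|> token.
    """
    return f"{SYSTEM_PROMPT}<|USER|>{user_message}<|ASSISTANT|>"

prompt = build_prompt("Summarize today's BTC price action in one sentence.")
# This string would then be tokenized and passed to the 7B model, e.g. via
# transformers' AutoModelForCausalLM with "stabilityai/stablelm-tuned-alpha-7b".
print(prompt.startswith("<|SYSTEM|>"))  # prints True
```

Running the 7B model itself requires downloading the weights from Hugging Face, so only the prompt-building step is shown here.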