Data cloud company Snowflake has introduced an open source large language model, Arctic LLM, as part of a growing collection of AI services that help businesses make use of their data. Typical use cases include data analysis, such as sentiment analysis of reviews; chatbots for customer service or sales; and business intelligence queries, like the extraction of revenue information.
Snowflake’s Arctic is offered alongside LLMs from Meta, Mistral AI, Google and Reka in its Cortex service, which is currently available only in limited regions. In June, Snowflake announced that Cortex would become accessible via the AWS Asia Pacific (Tokyo) region, covering Japan and APAC. Customers elsewhere in APAC and globally are expected to gain access to the offering over time.
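As a rough illustration of the use cases above, sentiment analysis of reviews can be expressed as a single Cortex function call in SQL. The sketch below only assembles the query string in Python; the table and column names (`reviews`, `review_text`) are illustrative assumptions, and actually running the query requires a Snowflake account with Cortex enabled in the region.

```python
def sentiment_query(table: str, text_col: str) -> str:
    """Build a SQL statement that scores each row's sentiment
    using Snowflake Cortex's SENTIMENT function.

    Table and column names here are hypothetical examples."""
    return (
        f"SELECT {text_col}, "
        f"SNOWFLAKE.CORTEX.SENTIMENT({text_col}) AS sentiment "
        f"FROM {table}"
    )

print(sentiment_query("reviews", "review_text"))
```

The helper keeps the SQL in one place so the same pattern can be reused for other Cortex functions.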
Arctic will also be accessible via hyperscaler Amazon Web Services, as well as other model gardens and catalogues used by businesses, including Hugging Face, Lamini, Microsoft Azure, the NVIDIA API catalogue, Perplexity, Together AI and others, according to the company.
What is Snowflake Arctic?
Arctic is Snowflake’s new “state-of-the-art” LLM, launched in April 2024 and designed primarily for enterprise use cases. The company has shared benchmark data showing Arctic performs well against other LLMs on several tasks, including SQL code generation and instruction following.
Snowflake’s Head of AI, Baris Gultekin, said the LLM was built in three months on a budget of approximately $2 million, around one-eighth the cost of some similar models, pushing the limits of how quickly and affordably an enterprise-grade LLM can be created.
What are the standout features of Snowflake Arctic?
Arctic LLM’s goal is to provide “efficient intelligence”: it excels at common enterprise tasks and is less expensive to use when building custom AI models on enterprise data. It also pushes the open source envelope, having been released under an open source Apache 2.0 licence.
The Arctic model is designed specifically to address demand for “conversational SQL data copilots, code copilots and RAG chatbots,” rather than the general-purpose world knowledge offered by many other open source LLMs, including Meta’s Llama models.
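For the “conversational SQL data copilot” case, Arctic can be invoked through Cortex’s COMPLETE function. The helper below is a minimal sketch that only assembles the SQL text: `snowflake-arctic` is assumed here as the Cortex model identifier for Arctic, and the prompt is a made-up example rather than anything from Snowflake’s documentation.

```python
def complete_query(model: str, prompt: str) -> str:
    """Build a SQL statement calling Cortex COMPLETE with a given model.

    Single quotes in the prompt are doubled, per SQL
    string-literal escaping rules."""
    escaped = prompt.replace("'", "''")
    return f"SELECT SNOWFLAKE.CORTEX.COMPLETE('{model}', '{escaped}')"

print(complete_query("snowflake-arctic",
                     "Write a SQL query that totals revenue by region"))
```

Parameterising the model name makes it easy to compare Arctic’s output against the other Cortex-hosted models mentioned above.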
SEE: Zetaris on the enterprise data mess and federated data lakes
Capabilities in “enterprise intelligence”
Snowflake created its own “enterprise intelligence” metric to measure the LLM’s performance, combining coding, SQL generation and instruction-following capabilities.
On standard AI benchmarking tests, which score LLMs as a percentage across specific domains of capability, Arctic compared favourably against models from Databricks, Meta and Mistral. According to Snowflake, the model’s enterprise intelligence was particularly impressive when set against LLMs built on much larger budgets.

Training and inference efficiency
Gultekin said the Arctic LLM gives enterprise clients a more affordable way to train custom LLMs on their own data. The model was also built for efficient inference, reducing costs and improving practicality for enterprise deployments.
Open source under the Apache 2.0 licence
Snowflake’s decision to make the Arctic LLM open source under an Apache 2.0 licence is partly attributable to what Gultekin described as the AI team’s deep background in open source. The company has released the model’s weights and code, along with data recipes and research insights.
Snowflake believes genuine open source contributions from developers will help both the industry and the product advance more quickly, and Gultekin said that the ability to peek behind the scenes would help enterprise customers trust the model more.
What effects will Snowflake Arctic have on the AI sector?
Snowflake’s Arctic release caused a splash in the enterprise data and tech community, thanks to its training speed and efficiency as well as its SQL generation capabilities. Gultekin said the company’s decision to “push the envelope on open source” has sparked interest in the research community.
SEE: Our comparison of Snowflake with Azure Synapse Analytics
“This is our first release, and it sets a really good benchmark. There will not be a single winner in the market because all customers are very interested in making their own choices. We have already seen a tonne of usage, and we expect that to continue,” he said.
Does Snowflake have an AI background?
Snowflake has previously offered a number of machine learning capabilities. Amid the generative AI boom in 2023, it acquired several AI companies, including NXYZ, a company Gultekin co-founded and led, and Neeva, a search company. Since then, Snowflake has expanded its generative AI platform and AI search capabilities and continues to add LLMs.