We had a jam-packed week alongside more than 60,000 attendees at Amazon Web Services (AWS) re:Invent, one of the largest hands-on conferences in the cloud computing industry. Engaging with partners and customers, and showcasing what’s new on the Snowflake product front, made for a dynamic time in Las Vegas. Here are highlights from the collaborations, integrations and product enhancements that we were proud to dig into throughout the week.
Bring gen AI to your governed enterprise data with Snowflake Cortex
No matter your industry, department or role, you can leverage generative AI (gen AI) and large language models (LLMs) to increase efficiencies and uncover new solutions to business challenges. But without a governed data foundation, you can’t trust results or unlock all that’s possible with these breakaway technologies.
Snowflake Cortex (now in private preview) gives users access to industry-leading LLMs (e.g., Llama 2) so they can quickly analyze data and build differentiated AI applications while keeping that data protected from unintended use. Because the LLMs are brought to secure and governed data, there is no need to manage integrations or operate separate graphics processing unit (GPU) infrastructure, meaning users of any technical skill level can create value with gen AI technologies within seconds, all inside Snowflake’s security perimeter.
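To make that concrete, here is a minimal sketch of batch inference over a governed table from Snowpark Python. It assumes a SNOWFLAKE.CORTEX.COMPLETE function and a Llama 2 model; because Cortex is in private preview, the exact function name and signature may differ in your account, and the connection parameters, table and column names below are placeholders.

```python
# Minimal sketch: batch inference over a governed table with a Snowflake
# Cortex LLM function (assumed here to be SNOWFLAKE.CORTEX.COMPLETE; Cortex
# is in private preview, so names and signatures may differ). The table,
# column and model names are placeholders.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

summaries = session.sql(
    """
    SELECT
        ticket_id,
        SNOWFLAKE.CORTEX.COMPLETE(
            'llama2-70b-chat',
            'Summarize this support ticket in one sentence: ' || ticket_text
        ) AS summary
    FROM support_tickets
    """
).collect()

for row in summaries:
    print(row["TICKET_ID"], row["SUMMARY"])
```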
When Snowflake Cortex is paired with Streamlit, the front-end Python library, developers can create chatbots and other LLM-powered experiences in a matter of minutes. And with Snowpark Container Services (coming to public preview soon in select AWS regions), they’ll have full flexibility to run inference with any open source model and to fine-tune foundation models on proprietary data in a matter of hours, significantly reducing time to value.
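Building on the same idea, here is a hedged sketch of a tiny Streamlit chat app that routes each user prompt through a Cortex LLM function. The Cortex call, model name and connection parameters are assumptions and placeholders, not a definitive implementation.

```python
# Minimal sketch: a Streamlit chat UI that sends each prompt to an LLM through
# a Snowflake Cortex function. Connection parameters, the model name and the
# COMPLETE call are placeholders/assumptions (Cortex is in private preview).
import streamlit as st
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}


@st.cache_resource
def get_session() -> Session:
    """Create (and cache) one Snowpark session for the app."""
    return Session.builder.configs(connection_parameters).create()


session = get_session()
st.title("Cortex chatbot sketch")

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay prior turns so the conversation stays visible across reruns.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask a question"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Pass the prompt as a bind variable to the Cortex LLM function.
    result = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('llama2-70b-chat', ?) AS response",
        params=[prompt],
    ).collect()
    answer = result[0]["RESPONSE"]

    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```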
This additional Snowpark runtime enables developers to seamlessly deploy, manage and scale custom containers (including those incorporating LLMs) on GPU instances inside their Snowflake-managed infrastructure. When working with less sensitive data, or when the right governance and security controls are in place on an AWS account, developers can also use their Snowflake data with LLMs from services such as Amazon Bedrock, integrated through Snowpark external access functionality, currently in public preview. That integration lets developers choose from a wide array of external LLMs and plug in whichever model best fits their use case.
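As a rough sketch of that external access pattern, the snippet below registers a Snowpark Python UDF that calls a Bedrock-hosted model with boto3. It assumes an administrator has already created an external access integration (here named BEDROCK_ACCESS_INTEGRATION) allowing outbound traffic to the Bedrock runtime endpoint, plus a generic-string secret (here BEDROCK_CRED) holding AWS credentials; every object name, the region and the model ID are placeholders, and the external_access_integrations and secrets options require a recent Snowpark release.

```python
# Minimal sketch: a Snowpark Python UDF that calls an Amazon Bedrock model via
# Snowpark external access (public preview at the time of writing). Assumes an
# administrator has already created:
#   - an external access integration BEDROCK_ACCESS_INTEGRATION that allows
#     outbound traffic to the Bedrock runtime endpoint, and
#   - a generic-string secret MY_DB.MY_SCHEMA.BEDROCK_CRED holding AWS keys
#     as a small JSON document.
# All object names, the region and the model ID are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import udf

connection_parameters = {
    "account": "<account_identifier>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()


@udf(
    name="ask_bedrock",
    replace=True,
    packages=["boto3"],
    external_access_integrations=["BEDROCK_ACCESS_INTEGRATION"],
    secrets={"cred": "MY_DB.MY_SCHEMA.BEDROCK_CRED"},
    session=session,
)
def ask_bedrock(prompt: str) -> str:
    # These imports run inside Snowflake's Python runtime, not on the client.
    import json

    import _snowflake  # exposes the secret attached above
    import boto3

    creds = json.loads(_snowflake.get_generic_secret_string("cred"))
    client = boto3.client(
        "bedrock-runtime",
        region_name="us-east-1",
        aws_access_key_id=creds["aws_access_key_id"],
        aws_secret_access_key=creds["aws_secret_access_key"],
    )
    # Invoke a Bedrock-hosted model; the request/response shape depends on
    # the model family you pick.
    response = client.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps(
            {"prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
             "max_tokens_to_sample": 300}
        ),
    )
    return json.loads(response["body"].read())["completion"]


# Enrich a governed table with model output; rows are processed in Snowflake.
session.table("support_tickets").with_column(
    "summary", ask_bedrock("ticket_text")
).show()
```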
Accelerate machine learning and AI workflows with Snowflake, Amazon SageMaker and Amazon Bedrock
Amazon SageMaker is a popular machine learning (ML) platform that developers use to create, train and deploy models for a wide variety of use cases, such as sales forecasting and fraud detection. Snowflake already integrates deeply with many Amazon SageMaker features, and at AWS re:Invent we showcased additional functionality with Amazon SageMaker Canvas, a no-code workspace for building ML solutions. Snowflake users can connect to their data directly from Amazon SageMaker Canvas to generate accurate ML predictions without writing any code. The connector also enables users to expedite feature engineering and build ML models while avoiding unnecessary data copies. In addition, Snowflake users can more quickly create custom models with imported data by accessing ready-to-use foundation models from Amazon Bedrock and Amazon SageMaker JumpStart.
Developers can now also integrate Snowpark, Snowflake’s toolkit for running Python and other languages in Snowflake, with Amazon SageMaker Studio Notebooks. This lets them keep the familiar development interface of a notebook while pushing complex data preparation and feature engineering steps down to run in Snowflake, rather than copying and managing data inside their notebook instance. Beyond all this, the integration between Snowflake and Amazon SageMaker simplifies data architectures and enhances data security to help users transform information into actionable insights.
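Here is a minimal sketch of that pattern from a Studio notebook; the table and column names are illustrative placeholders, and only the aggregated feature set leaves Snowflake.

```python
# Minimal sketch: pushing feature engineering down to Snowflake from a
# SageMaker Studio notebook with Snowpark. Table and column names are
# illustrative placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

connection_parameters = {
    "account": "<account_identifier>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# The aggregation below is built lazily and executed in Snowflake, so raw
# rows never land on the notebook instance.
features = (
    session.table("transactions")
    .group_by("customer_id")
    .agg(
        F.count("amount").alias("txn_count"),
        F.avg("amount").alias("avg_amount"),
        F.max("amount").alias("max_amount"),
    )
)

# Only the aggregated feature set is pulled back as a pandas DataFrame for
# model training with SageMaker.
train_df = features.to_pandas()
print(train_df.head())
```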
To learn more about these recent developments on generative AI and integrations with AWS, tune in to BUILD, Snowflake’s upcoming virtual conference for developers.
Develop reliable streaming pipelines with Snowpipe Streaming and Amazon Managed Streaming for Kafka (Amazon MSK)
Creating AI solutions involves a wide range of data types, and more often than not, data is needed in real time to keep models and applications current. A Snowflake theater session at this year’s re:Invent showcased how a financial services provider uses Snowpipe Streaming and Amazon MSK to feed real-time data to its dashboards, ML models and applications. Because Snowpipe Streaming is designed for ingesting time-series data with near-real-time latencies, using it in conjunction with Amazon MSK lets developers achieve an attractive cost/latency profile on a unified infrastructure while establishing streamlined pipelines for both batch and streaming data. With fewer pipelines to manage, customers benefit from a simplified infrastructure and a lower total cost of ownership.
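As a rough sketch of what such a pipeline could look like, the snippet below registers the Snowflake Kafka connector in Snowpipe Streaming mode with a Kafka Connect cluster that reads from an Amazon MSK topic. It assumes a Kafka Connect REST endpoint at a placeholder URL (with MSK Connect, the same configuration would instead be supplied through the AWS console or API); all names, URLs and credentials are placeholders, and property names can vary by connector version.

```python
# Minimal sketch: registering the Snowflake Kafka connector in Snowpipe
# Streaming mode against an Amazon MSK topic, via the Kafka Connect REST API.
# Every name, URL and credential below is a placeholder, and property names
# can vary by connector version.
import requests

CONNECT_URL = "http://<kafka-connect-host>:8083"  # Kafka Connect REST endpoint

connector = {
    "name": "snowflake-sink-trades",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "tasks.max": "2",
        "topics": "trades",  # the Amazon MSK topic to drain
        # Route rows through Snowpipe Streaming instead of file-based Snowpipe.
        "snowflake.ingestion.method": "SNOWPIPE_STREAMING",
        "snowflake.url.name": "<account_identifier>.snowflakecomputing.com:443",
        "snowflake.user.name": "<connector_user>",
        "snowflake.private.key": "<private_key>",
        "snowflake.role.name": "<role>",
        "snowflake.database.name": "<database>",
        "snowflake.schema.name": "<schema>",
        "snowflake.topic2table.map": "trades:TRADES_RAW",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
    },
}

resp = requests.post(f"{CONNECT_URL}/connectors", json=connector, timeout=30)
resp.raise_for_status()
print(resp.json())
```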
Take advantage of award-winning application development
Each year, through its AWS Partner Awards program, AWS recognizes “partners who have demonstrated outstanding results and innovation using AWS products and solutions.” And Snowflake is honored to have been selected as the Global ISV Data and Analytics Partner of the Year for 2023. In awarding this distinction, AWS lauded Snowflake for its transformative impact on how developers build, deliver, distribute and operate their applications through the Snowflake Native App Framework. Customers are building and monetizing applications using the Snowflake Native App Framework on AWS, benefiting from the security, governance and network of the Snowflake Data Cloud.
To foster further application development with Snowflake and AWS, Snowflake recently launched the Powered by Snowflake Funding Program, which intends to invest up to $100 million (U.S.) toward the next generation of early-stage startups building Snowflake Native Apps on AWS. The program will provide select startup participants with up to $1 million in free Snowflake credits on AWS over four years. Learn more about the program and how to apply here.