Cloudflare has launched Workers AI, a platform that provides the infrastructure to deploy and scale AI models efficiently and cost-effectively, targeting the next generation of AI applications.
The company has established strategic partnerships to offer GPU access on its global network, enabling low-latency AI inference near users. Paired with the Data Localization Suite, Workers AI helps customers prepare for emerging compliance regulations related to AI.
Cloudflare’s privacy-centric approach ensures that data used for inference is not used to train LLMs. Additionally, Cloudflare provides a model catalog to help developers with a range of AI use cases, such as large language models, speech-to-text, image classification, and sentiment analysis.
“Cloudflare has all the infrastructure developers need to build scalable AI-powered applications, and can now provide AI inference as close to the user as possible. We’re investing to make it easy for every developer to have access to powerful, affordable tools to build the future,” said Matthew Prince, CEO and co-founder of Cloudflare. “Workers AI will empower developers to build production-ready AI experiences efficiently and affordably, and in days, instead of what typically takes entire teams weeks or even months.”
Cloudflare’s new vector database, Vectorize, now enables developers to build full-stack AI applications entirely on Cloudflare: from generating embeddings with the built-in models in Workers AI and indexing them in Vectorize, to querying them and storing the source data in R2.
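A minimal sketch of that pipeline: embed a query with a Workers AI text-embedding model, then look up nearest neighbors in a Vectorize index. In a deployed Worker, `env.AI` and `env.VECTORIZE` would be bindings configured in `wrangler.toml`; here they are mocked so the flow can run anywhere, and the model name and index contents are illustrative assumptions, not part of the announcement.

```javascript
// Sketch of the full-stack flow described above, with mocked bindings.
async function semanticSearch(env, question) {
  // 1. Generate an embedding for the query text via a Workers AI model.
  const { data } = await env.AI.run("@cf/baai/bge-base-en-v1.5", {
    text: [question],
  });

  // 2. Query the Vectorize index for the closest stored embeddings.
  const results = await env.VECTORIZE.query(data[0], { topK: 3 });

  // 3. Return the ids of the best matches; the full source documents
  //    themselves would typically be fetched from R2 by id.
  return results.matches.map((m) => m.id);
}

// --- mock bindings for local demonstration only ---
const env = {
  AI: {
    run: async (_model, _input) => ({ data: [[0.1, 0.2, 0.3]] }),
  },
  VECTORIZE: {
    query: async (_vector, { topK }) => ({
      matches: Array.from({ length: topK }, (_, i) => ({
        id: `doc-${i}`,
        score: 1 - i * 0.1,
      })),
    }),
  },
};

semanticSearch(env, "What is Workers AI?").then((ids) => console.log(ids));
```

The same `semanticSearch` function works unchanged inside a Worker once the mocks are replaced by real bindings, which is the point of keeping the bindings as parameters.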
Lastly, Cloudflare is launching the AI Gateway to enhance the reliability, observability, and scalability of AI applications. AI spending is rapidly growing, with projections reaching $154 billion this year and exceeding $300 billion by 2026, according to the latest forecasts from IDC.
Cloudflare also announced partnerships with Hugging Face to optimize the most popular models to run on Cloudflare’s global network, with Microsoft to make it easier for companies to seamlessly deploy AI models, and with Databricks and Meta.
The post Cloudflare releases platform for AI inference appeared first on SD Times.