Cloudera Unveils Six New Accelerators for ML Projects, Enhancing AI Deployment Efficiency


Cloudera announced six new Accelerators for ML Projects (AMPs), designed to reduce time-to-value for enterprise AI use cases. The new additions focus on providing enterprises with cutting-edge AI techniques and working examples within Cloudera that can assist with AI integration and drive more impactful results.

AMPs are end-to-end machine learning (ML) projects that can be deployed with a single click directly from the Cloudera platform. Each AMP encapsulates industry-leading practices for tackling complex ML challenges, with workflows that transition smoothly no matter where enterprises run examples or deploy data.

With its collection of AMPs, Cloudera is committed to making AI more accessible so businesses can accelerate adoption and maximize the value of both their own data and the generated AI outputs. The latest AMPs and updates include:

Fine-Tuning Studio - Provides users with an all-encompassing application and “ecosystem” for managing, fine-tuning, and evaluating LLMs.

RAG with Knowledge Graph - A demonstration of how to power a RAG (retrieval augmented generation) application with a knowledge graph to capture relationships and context not easily accessible by vector stores alone.

PromptBrew - Offers AI-powered assistance to create high-performing and reliable prompts via a simple user interface.

Document Analysis with Cohere CommandR and FAISS - Showcases RAG using CommandR as the LLM and FAISS as the vector store (a minimal sketch of this retrieve-then-generate pattern follows this list).

Chat with Your Documents - Building upon the previous LLM Chatbot Augmented with Enterprise Data AMP, this accelerator enhances the responses of the LLM using context from an internal knowledge base created from the documents uploaded by the user.
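For readers who want to see what the RAG pattern behind accelerators like Document Analysis with Cohere CommandR and FAISS looks like in practice, the minimal Python sketch below illustrates the retrieve-then-generate flow: documents are embedded, indexed in FAISS, and the closest chunks are passed to the LLM as context. The embed() and generate() helpers here are hypothetical placeholders standing in for a real embedding model and LLM call; this is an illustration of the general pattern, not the AMP's actual code.

```python
# Minimal RAG sketch (illustrative only; embed() and generate() are
# hypothetical placeholders, not the APIs used by the Cloudera AMPs).
import numpy as np
import faiss

def embed(texts):
    # Placeholder: a real pipeline would call an embedding model here
    # (e.g. a hosted embeddings API). Returns one vector per input text.
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)

def generate(prompt):
    # Placeholder for the LLM call (e.g. CommandR in the AMP).
    return f"[LLM answer grounded in]\n{prompt}"

documents = [
    "Cloudera AMPs are one-click, end-to-end ML project templates.",
    "RAG augments an LLM prompt with retrieved enterprise documents.",
]

# 1. Embed the documents and build a FAISS index over the vectors.
doc_vectors = embed(documents)
index = faiss.IndexFlatL2(doc_vectors.shape[1])
index.add(doc_vectors)

# 2. Retrieve the chunks closest to the user's question.
question = "What does a Cloudera AMP give me?"
_, ids = index.search(embed([question]), 1)
context = "\n".join(documents[i] for i in ids[0])

# 3. Ground the LLM's answer in the retrieved context.
print(generate(f"Context:\n{context}\n\nQuestion: {question}"))
```

In the shipped accelerators, those placeholders would be replaced by a production embedding model and an LLM such as CommandR, with the documents drawn from the user's own knowledge base.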

In addition to accelerating AI projects, Cloudera AMPs are fully open source and include deployment instructions for any environment, a further testament to Cloudera’s commitment to the open source community.

“While almost every business is experimenting with Generative AI, the technology is still so new that there are very few best practices for enterprises,” said The Futurum Group’s Chief Technology Advisor, Steven Dickens. “As a result, it’s common practice for data scientists and AI engineers to build on existing examples when starting new AI projects. However, there are many drawbacks with this approach, including added security and legal risks. AMPs remove this ambiguity by providing fully built, end-to-end solutions that give data scientists a ready-to-go MVP for various AI use cases that are proven to be effective and able to quickly drive value.”

“In today’s environment, enterprises are constrained with time and resources to get AI projects off the ground,” said Dipto Chakravarty, Chief Product Officer at Cloudera. “Our AMPs are catalysts to fast-track AI projects from concept to reality with pre-built solutions and working examples, ensuring that use cases are dependable and cost effective, while reducing development time. This enables enterprises to swiftly experience the productivity gains and efficiencies that come from AI initiatives.”

