Launch Featured Arcee AI: From Small Language Model Pioneer to Pioneering SLM-Powered Agentic AI Workflows First, we pioneered small language models (SLMs). Now, we're elevating them to their full potential, leveraging them in our end-to-end, easy-to-use agentic AI workflow platform called Arcee Orchestra. Here's a look at how we got started with SLMs, and how we're now taking them to the next level.
Launch Announcing the Arcee Model Engine Public Beta Get direct access to the small language models (SLMs) that power Arcee Orchestra, our new end-to-end, SLM-powered agentic AI platform. Sign up for the public beta of the Arcee Model Engine today.
Featured 7 Billion Reasons to Choose Arcee-Meraj-Mini: The Open-Source Arabic SLM for All Hot on the heels of our top-performing 72B Arabic-language model Arcee-Meraj, we bring you a 7B version: Arcee-Meraj-Mini, which boasts exceptional performance in instruction following, long-text generation, structured-data understanding, and structured-output generation.
Featured The Power of Non-English LLMs: Meet our Arabic model, Arcee-Meraj (معراج) We've taken our groundbreaking general-purpose model, Arcee Nova, and enhanced it for Arabic – leading to an Arabic-language LLM that's enterprise-ready, with unprecedented text-generation and comprehension capabilities.
Product Do Direct Preference Optimization (DPO) with Arcee AI's training platform Direct Preference Optimization (DPO) is one of the top methods for fine-tuning LLMs... It's available on our model training platform – and today, we bring you support for DPO on our training APIs.
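If you're new to DPO, the idea is to fine-tune a model directly on preference pairs (a preferred and a rejected completion for the same prompt) without training a separate reward model. Here's a minimal sketch of the objective in PyTorch; the function name and arguments are illustrative only and are not Arcee's training API.

```python
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """DPO loss over a batch of preference pairs (illustrative sketch).

    Each argument is a tensor of summed log-probabilities that the policy
    (the model being fine-tuned) or the frozen reference model assigns to
    the chosen (preferred) or rejected completion of each prompt.
    """
    # How much more likely each completion has become under the policy
    chosen_logratios = policy_chosen_logps - ref_chosen_logps
    rejected_logratios = policy_rejected_logps - ref_rejected_logps
    # Widen the margin between preferred and rejected completions
    margins = beta * (chosen_logratios - rejected_logratios)
    return -F.logsigmoid(margins).mean()
```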
Product Train, Merge, & Domain-Adapt Llama-3.1 with Arcee AI Get Llama-3.1 but better – customize the open-source model for all your needs, using Arcee AI's training, merging, and adaptation techniques and tools. Our team created this guide to get you started.
Insights How Do I Prep my Data to Train an LLM? So you want to train a custom language model, and you do have the requisite large set of text data. But how do you know that the data is *really ready* for model training? Our researchers here at Arcee AI tell you what to look out for.
Featured Introducing the Ultimate SEC LLM: Revolutionizing Financial Insights We built Llama-3-SEC upon the powerful Meta-Llama-3-70B-Instruct model, with the goal of providing unparalleled insights and analysis capabilities for financial professionals, investors, researchers, and anyone working with SEC filings and related financial data.
Arcee releases new repo of LLMOps tools Check out our new Model-Tools repository on GitHub: a collection of custom tools, scripts, and more to boost your LLMOps capabilities.
Insights When should I use LLMs vs SLMs? When it comes to the world of language models and Gen AI, a key question for companies looking to adopt these innovations is which model(s) to use. As if it’s not already complicated enough with the plethora of foundational models out there, it is now even more daunting…
Insights What is an SLM (Small Language Model)? The world of LLMs (Large Language Models) has cooked up a storm in recent years, with the rise of OpenAI’s GPT and the increasing proliferation of open-source language models. Much excitement abounds, and virtually everyone and their grandma are mesmerized by the fact that a chat-based LLM can…
24 AI influencers to follow for 2024 2024 is poised to be the year of AI adoption across businesses and the consumer sector. At Arcee, our goal is to continue sharing insights from both our team and other domain experts, so that we empower every business with the knowledge and ability to leverage this phenomenal technology. As…
Product LangChain + Arcee: build domain models with greater flexibility Combining Arcee’s generators and retrievers with LangChain allows you to build almost any AI application you wish, from retrieval-powered applications to fully autonomous systems using chains and agents.
Product Arcee Releases Commercial Product to Contextualize Language Models Emerging out of intensive research and development, the Arcee team is excited to release our commercial product to contextualize language models in the Arcee platform.
Product Arcee Exits Stealth with Open Core LLMs Arcee is excited to exit stealth with an open core for contextualizing domain-adapted language models (DALMs). Unifying the Retriever and Generator: As language modeling techniques have evolved, we have seen increasing effectiveness in retrieval-augmented generation (RAG), where the generator model is provided with relevant context documents from…
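To make the retrieve-then-generate pattern mentioned in that excerpt concrete, here is a conceptual sketch; `retriever`, `generator`, and their methods are hypothetical stand-ins, not Arcee or DALM APIs.

```python
def answer(question, retriever, generator, k=4):
    """Conceptual retrieval-augmented generation: retrieve, then generate."""
    # 1. Pull the k documents most relevant to the question
    docs = retriever.search(question, top_k=k)
    # 2. Ground the prompt in the retrieved context
    context = "\n\n".join(doc.text for doc in docs)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    # 3. Let the generator produce an answer conditioned on that context
    return generator.generate(prompt)
```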
Introducing Arcee.ai: Elevating Enterprise Language Models with Domain Adaptation (Meet DALM) In a rapidly evolving digital landscape, the power of language models has emerged as a driving force behind the transformation of various industries. Large Language Models (LLMs) like ChatGPT have showcased their incredible potential in comprehending natural language and excelling across a multitude of tasks. However, the journey towards harnessing…
Product The Pros and Cons of RAG Systems and Fine-Tuning in Natural Language Processing Introduction: Natural Language Processing has seen remarkable advancements in recent years, enabling machines to comprehend and generate human language effectively. Two popular approaches used in NLP are RAG (Retrieval-Augmented Generation) systems and fine-tuning. In this blog post, we will delve into the strengths and weaknesses of both approaches to gain…
Product Unveiling DALM: Revolutionizing Enterprises with Domain Adapted Language Model Systems In the rapidly evolving landscape of Generative AI, the quest for more refined and contextually aware language models has led to our development of the Domain Adapted Language Model System (DALM). This innovation promises to reshape how enterprises harness the power of language models tailored to their specific domains. In…