
Arcee
Unlock seamless domain adaptation with our specialized Domain Adapted Language Model (DALM) system.
Date | Investors | Amount | Round
---|---|---|---
- | - | $24.0m | Series A
Arcee operates in the enterprise artificial intelligence sector, specializing in the development and deployment of small language models (SLMs). The company was established in 2023 by co-founders Mark McQuade (CEO), Brian Benedict, and Jacob Solawetz, who identified enterprise reluctance to adopt generative AI stemming from security and transparency concerns with both closed-source and open-source models. The founding team brings complementary backgrounds from across the tech industry: McQuade from Hugging Face, Benedict from Tecton, and Solawetz from Roboflow, combining expertise in AI, machine learning, sales, and data engineering.
Arcee's core business revolves around providing a platform that enables companies to build, train, and deploy purpose-built AI agents powered by SLMs securely within their own cloud environments. This approach addresses key enterprise concerns by allowing clients to maintain full control over their models, data, and intellectual property. The company targets businesses with highly proprietary data that require specialized, secure, and efficient AI solutions. Its business model includes two primary offerings: Arcee Enterprise, an in-VPC (Virtual Private Cloud) deployment, and Arcee Cloud, a hosted SaaS platform, catering to different scales of operational needs. The company generates revenue through these platforms, with pricing tiers such as "Orchestra Dev" available for individual developers.
The flagship product, Arcee Orchestra, is an end-to-end agentic AI solution that facilitates the creation of custom AI workflows. It allows users to automate complex tasks by routing them to a team of specialized SLMs, which can be integrated with over 200 external systems like Slack, GitHub, and HubSpot. The platform features a continual pre-training layer, a model merging layer (Arcee pioneered the open-source MergeKit library), an alignment layer, and a Retrieval-Augmented Generation (RAG) layer. This enables the creation of highly tailored models for specific use cases such as code review, legal document analysis, or customer service automation. By focusing on SLMs, which range from millions to a few billion parameters, Arcee provides a more cost-effective, faster, and resource-efficient alternative to large language models (LLMs).
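The routing idea described above can be illustrated with a minimal sketch: a dispatcher scores a set of specialized SLMs against an incoming task and forwards the task to the best match. This is purely illustrative pseudologic (keyword matching standing in for a real classifier); the names and interfaces here are hypothetical and are not Arcee Orchestra's actual API.

```python
"""Hypothetical sketch of routing tasks to specialized SLMs.

All names (Specialist, route) are illustrative assumptions,
not Arcee's real interfaces.
"""
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Specialist:
    """A specialized SLM endpoint with trigger keywords."""
    name: str
    keywords: set[str]
    handler: Callable[[str], str] = field(
        default=lambda task: "stub response"
    )


def route(task: str, specialists: list[Specialist]) -> str:
    """Send the task to the specialist with the most keyword overlap."""
    words = set(task.lower().split())
    best = max(specialists, key=lambda s: len(s.keywords & words))
    return best.handler(task)


specialists = [
    Specialist("code-review", {"code", "diff", "review"},
               lambda t: "code-review-slm"),
    Specialist("legal", {"contract", "clause", "legal"},
               lambda t: "legal-slm"),
    Specialist("support", {"customer", "ticket", "refund"},
               lambda t: "support-slm"),
]

print(route("please review this code diff", specialists))
```

A production system would replace the keyword heuristic with a learned router and call out to actual model endpoints, but the dispatch shape is the same.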
Keywords: small language models, SLM, agentic AI, enterprise AI solutions, custom AI models, secure AI, private AI, model merging, MergeKit, Arcee Orchestra, Retrieval-Augmented Generation, RAG, continual pre-training, AI workflow automation, financial services AI, legal tech AI, healthcare AI, Mark McQuade, Brian Benedict, Jacob Solawetz, AI data privacy, domain-specific AI