    Arcee opens up new enterprise-focused, customizable AI model AFM-4.5B trained on ‘clean, rigorously filtered data’

By Techurz · August 1, 2025 · 7 Mins Read


Arcee.ai, a startup focused on developing small AI models for commercial and enterprise use, is opening up its AFM-4.5B model for limited free usage by small companies — posting the weights on Hugging Face and allowing enterprises with less than $1.75 million in annual revenue to use it without charge under a custom “Arcee Model License.”

    Designed for real-world enterprise use, the 4.5-billion-parameter model — much smaller than the tens of billions to trillions of leading frontier models — combines cost efficiency, regulatory compliance, and strong performance in a compact footprint.

AFM-4.5B was one half of a two-part release Arcee made last month. It is already “instruction tuned” (an “instruct” model), designed for chat, retrieval, and creative writing, and can be deployed immediately for these use cases in enterprises. A base model that was only pre-trained, not instruction tuned, was released at the same time, giving customers more room for customization. However, both were available only under commercial licensing terms, until now.

Arcee’s chief technology officer (CTO) Lucas Atkins also noted in a post on X that more “dedicated models for reasoning and tool use are on the way.”


“Building AFM-4.5B has been a huge team effort, and we’re deeply grateful to everyone who supported us. We can’t wait to see what you build with it,” he wrote in another post. “We’re just getting started. If you have feedback or ideas, please don’t hesitate to reach out at any time.”

The model is available now for deployment across a variety of environments — from cloud to smartphones to edge hardware.

    It’s also geared toward Arcee’s growing list of enterprise customers and their needs and wants — specifically, a model trained without violating intellectual property.

    As Arcee wrote in its initial AFM-4.5B announcement post last month: “Tremendous effort was put towards excluding copyrighted books and material with unclear licensing.”

    Arcee notes it worked with third-party data curation firm DatologyAI to apply techniques like source mixing, embedding-based filtering, and quality control — all aimed at minimizing hallucinations and IP risks.

    Focused on enterprise customer needs

    AFM-4.5B is Arcee.ai’s response to what it sees as major pain points in enterprise adoption of generative AI: high cost, limited customizability, and regulatory concerns around proprietary large language models (LLMs).

    Over the past year, the Arcee team held discussions with more than 150 organizations, ranging from startups to Fortune 100 companies, to understand the limitations of existing LLMs and define their own model goals.

    According to the company, many businesses found mainstream LLMs — such as those from OpenAI, Anthropic, or DeepSeek — too expensive and difficult to tailor to industry-specific needs. Meanwhile, while smaller open-weight models like Llama, Mistral, and Qwen offered more flexibility, they introduced concerns around licensing, IP provenance, and geopolitical risk.

    AFM-4.5B was developed as a “no-trade-offs” alternative: customizable, compliant, and cost-efficient without sacrificing model quality or usability.

    AFM-4.5B is designed with deployment flexibility in mind. It can operate in cloud, on-premise, hybrid, or even edge environments—thanks to its efficiency and compatibility with open frameworks such as Hugging Face Transformers, llama.cpp, and (pending release) vLLM.

    The model supports quantized formats, allowing it to run on lower-RAM GPUs or even CPUs, making it practical for applications with constrained resources.
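To make the memory savings concrete: quantization trades a little numerical precision for a much smaller footprint. Arcee has not detailed its scheme, so the following is only a generic illustration of symmetric int8 quantization (real inference stacks such as llama.cpp use more elaborate block-wise formats):

```python
# Illustrative sketch of symmetric int8 weight quantization; NOT Arcee's
# actual pipeline. Each weight is stored as 1 byte instead of 4 (fp32).

def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

weights = [0.02, -0.51, 1.27, -1.0]
q, scale = quantize_int8(weights)
approx = dequantize_int8(q, scale)
# q -> [2, -51, 127, -100], a ~4x memory reduction versus fp32,
# at the cost of small rounding error in the recovered weights.
```

The same idea, applied block-by-block at 4 or 8 bits, is what lets a multi-billion-parameter model fit into the RAM of a consumer GPU or CPU.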

    Company vision secures backing

    Arcee.ai’s broader strategy focuses on building domain-adaptable, small language models (SLMs) that can power many use cases within the same organization.

    As CEO Mark McQuade explained in a VentureBeat interview last year, “You don’t need to go that big for business use cases.” The company emphasizes fast iteration and model customization as core to its offering.

    This vision gained investor backing with a $24 million Series A round back in 2024.

    Inside AFM-4.5B’s architecture and training process

    The AFM-4.5B model uses a decoder-only transformer architecture with several optimizations for performance and deployment flexibility.

    It incorporates grouped query attention for faster inference and ReLU² activations in place of SwiGLU to support sparsification without degrading accuracy.
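ReLU² simply squares the standard ReLU output. Because it produces exact zeros for all non-positive inputs (unlike smooth gated activations such as SwiGLU), many activations are genuinely zero, which is what makes post-hoc sparsification practical. A minimal sketch of the elementwise function:

```python
def relu2(x):
    """ReLU-squared: max(x, 0) ** 2.
    Exact zeros for x <= 0 keep activations sparse, which supports
    sparsification better than smooth gates like SwiGLU."""
    return max(x, 0.0) ** 2

print([relu2(x) for x in [-2.0, -0.5, 0.0, 1.0, 3.0]])
# -> [0.0, 0.0, 0.0, 1.0, 9.0]
```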

    Training followed a three-phase approach:

    • Pretraining on 6.5 trillion tokens of general data
    • Midtraining on 1.5 trillion tokens emphasizing math and code
    • Instruction tuning using high-quality instruction-following datasets and reinforcement learning with verifiable and preference-based feedback

To meet strict compliance and IP standards, the model was trained on a combined total of nearly 8 trillion tokens of data curated for cleanliness and licensing safety.

    A competitive model, but not a leader

Despite its smaller size, AFM-4.5B performs competitively across a broad range of benchmarks. The instruction-tuned version averages a score of 50.13 across evaluation suites such as MMLU, MixEval, TriviaQA, and AGIEval, matching or outperforming similar-sized models like Gemma-3 4B-it, Qwen3-4B, and SmolLM3-3B.

    Multilingual testing shows the model delivers strong performance across more than 10 languages, including Arabic, Mandarin, German, and Portuguese.

    According to Arcee, adding support for additional dialects is straightforward due to its modular architecture.

    AFM-4.5B has also shown strong early traction in public evaluation environments. In a leaderboard that ranks conversational model quality by user votes and win rate, the model ranks third overall, trailing only Claude Opus 4 and Gemini 2.5 Pro.

    It boasts a win rate of 59.2% and the fastest latency of any top model at 0.2 seconds, paired with a generation speed of 179 tokens per second.

    Built-in support for agents

    In addition to general capabilities, AFM-4.5B comes with built-in support for function calling and agentic reasoning.

    These features aim to simplify the process of building AI agents and workflow automation tools, reducing the need for complex prompt engineering or orchestration layers.
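Arcee has not published AFM-4.5B’s exact calling convention, but function calling in chat models generally works the same way everywhere: the host application passes the model a JSON schema describing each tool, and the model replies with a structured call rather than free text. A generic, hypothetical sketch (the schema style below follows the common OpenAI-like convention, not a documented AFM-4.5B API):

```python
import json

# Hypothetical tool schema in the widely used OpenAI-style format.
tool = {
    "name": "get_invoice_status",
    "description": "Look up the payment status of an invoice.",
    "parameters": {
        "type": "object",
        "properties": {"invoice_id": {"type": "string"}},
        "required": ["invoice_id"],
    },
}

# A function-calling model is expected to emit a structured call like
# this instead of prose; the host app parses it and runs the function.
model_reply = '{"tool": "get_invoice_status", "arguments": {"invoice_id": "INV-1042"}}'
call = json.loads(model_reply)
assert call["tool"] == tool["name"]
print(call["arguments"]["invoice_id"])  # -> INV-1042
```

Because the model emits the call natively, the application layer only needs a JSON parser and a dispatch table, rather than fragile prompt-parsing or a separate orchestration framework.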

    This functionality aligns with Arcee’s broader strategy of enabling enterprises to build custom, production-ready models faster, with lower total cost of ownership (TCO) and easier integration into business operations.

    What’s next for Arcee?

    AFM-4.5B represents Arcee.ai’s push to define a new category of enterprise-ready language models: small, performant, and fully customizable, without the compromises that often come with either proprietary LLMs or open-weight SLMs.

    With competitive benchmarks, multilingual support, strong compliance standards, and flexible deployment options, the model aims to meet enterprise needs for speed, sovereignty, and scale.

    Whether Arcee can carve out a lasting role in the rapidly shifting generative AI landscape will depend on its ability to deliver on this promise. But with AFM-4.5B, the company has made a confident first move.

    Correction: This piece originally misspelled Arcee’s name in several places. We’ve since updated the article to correct it and regret the errors.
