    My go-to LLM tool just dropped a super simple Mac and PC app for local AI – why you should try it

By Techurz | August 5, 2025 | 4 min read


    Jack Wallen / Elyse Betters Picaro / ZDNET

    ZDNET’s key takeaways

    • The Ollama developers have released a native GUI for MacOS and Windows.
    • The new GUI greatly simplifies using AI locally.
    • The app is easy to install and lets you pull different LLMs.

    If you use AI, there are several reasons to work with it locally instead of in the cloud.

    First, it offers much more privacy. When you use a Large Language Model (LLM) in the cloud, you never know whether your queries or results are being tracked or even stored by a third party. Running an LLM locally also saves energy: the amount of energy required to serve cloud-based LLMs keeps growing and could become a problem in the future.

    Ergo, locally hosted LLMs.

    Also: How to run DeepSeek AI locally to protect your privacy – 2 easy ways

    Ollama is a tool that lets you run different LLMs. I've been using it for some time and have found that it simplifies downloading and using various models. Although it requires serious system resources (you wouldn't want to use it on an aging machine), it runs fast and lets you switch between models.

    But Ollama by itself has been a command-line-only affair. There are some third-party GUIs (such as Msty, which has been my go-to), but until now, the developers behind Ollama hadn't produced a GUI of their own.

    That all changed recently, and there’s now a straightforward, user-friendly GUI, aptly named Ollama.

    Works with common LLMs – but you can pull others

    The GUI is fairly basic, but it's designed so that anyone can jump in and start using it right away. A short list of fairly common LLMs (such as the Gemma, DeepSeek, and Qwen models) can be pulled directly from the model drop-down list: select one, and the Ollama GUI pulls it for you.

    If you want to use a model not listed, you would have to pull it from the command line like so:

    ollama pull MODEL

    Where MODEL is the name of the model you want.
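A concrete version of that pull might look like the sketch below. Note that "llama3.2" here is just an example model name from the Ollama Library; substitute whichever model you want, and the guard simply avoids an error on machines where the ollama CLI isn't installed.

```shell
# Sketch: pulling a model the GUI doesn't list, then confirming it's
# installed. "llama3.2" is an example model name - substitute your own.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2   # download the model weights (may take a while)
  ollama list            # the new model now appears here and in the GUI drop-down
  status="pulled"
else
  status="ollama CLI not found - install it from ollama.com first"
fi
echo "$status"
```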

    Also: How I feed my files to a local AI for better, more relevant responses

    You can find a full list of available models in the Ollama Library.

    After you’ve pulled a model, it appears in the drop-down to the right of the query bar.

    The Ollama app is as easy to use as any cloud-based AI interface on the market, and it's free to use on MacOS and Windows (sadly, there's no Linux version of the GUI).

    I’ve kicked the tires of the Ollama app and found that, although it doesn’t have quite the feature set of Msty, it’s easier to use and fits in better with the MacOS aesthetic. The Ollama app also seems to be a bit faster than Msty (in both opening and responding to queries), which is a good thing because local AI can often be a bit slow (due to a lack of system resources).

    How to install the Ollama app on Mac or Windows

    You're in luck: installing the Ollama app is as easy as installing any app on MacOS or Windows. Point your browser to the Ollama download page, download the app for your OS, double-click the downloaded file, and follow the directions. On MacOS, for example, you drag the Ollama app icon into the Applications folder, and you're done.

    Using Ollama is equally easy: select the model you want, let it download, then query away.

    Pulling an LLM is as easy as selecting it from the list and letting the app do its thing.

    Jack Wallen/ZDNET
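For comparison, the same select-download-query flow works from the terminal. This is a minimal sketch: "gemma3" is purely an example model name, and passing a prompt as an argument gives a one-shot answer instead of an interactive session.

```shell
# CLI equivalent of the GUI flow. "ollama run" downloads the model on
# first use, then answers the prompt given as an argument. Guarded so
# the sketch degrades gracefully on machines without the CLI.
if command -v ollama >/dev/null 2>&1; then
  ollama run gemma3 "Summarize why local LLMs improve privacy."
  result="ran"
else
  result="ollama CLI not found - install it from ollama.com first"
  echo "$result"
fi
```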

    Should you try the Ollama app?

    If you’ve been looking for a reason to try local AI, now is the perfect time. 

    Also: I tried Sanctum’s local AI app, and it’s exactly what I needed to keep my data private

    The Ollama app makes migrating away from cloud-based AI as easy as it can get. The app is free to install and use, as are the LLMs in the Ollama library. Give this a chance, and see if it doesn’t become your go-to AI tool.

    Want more stories about AI? Check out AI Leaderboard, our weekly newsletter.
