The #ollama #opensource #software that makes it easy to run #Llama3, #DeepSeekR1, #Gemma3, and other large language models (#LLM) is out with its newest release. The ollama software makes it easy to leverage the llama.cpp back-end for running a variety of LLMs while enjoying convenient integration with other desktop software.
The new ollama 0.6.2 release features support for #AMD #StrixHalo, a.k.a. the #RyzenAI Max+ laptop / SFF desktop SoC.
https://www.phoronix.com/news/ollama-0.6.2