Mistral Small 3.1 is now available on Ollama, which makes it easy to run on a Mac. The model is a 15GB download and multi-modal, so it can answer questions about images in addition to handling standard text prompts. My notes on running it locally: https://simonwillison.net/2025/Apr/8/mistral-small-31-on-ollama/