Being able to run LLMs locally and easily is a real game changer. I had heard about Ollama before and decided to take a look at it this past weekend.
Key questions I'll address are:
- Why is running LLMs locally becoming so popular?
- What is Ollama?
- Should you use Ollama?