
Sometimes going online just to use an app is a little annoying. Maybe you are on a metered connection and low on cash. Or you are far from the nearest cybercafe, or your internet is down. Or perhaps you simply don't want to connect at all. But you still need to generate something using AI. How about having your own model installed on your machine and chatting away, just like you do with ChatGPT?
Enter Ollama. Ollama gives you access to open LLM models that you can install on your machine and use without going online. You do not even need an internet connection to run it; you only need one for the initial setup, for downloading models, and for regular software updates. We'll show you how to set up Ollama and install an LLM on your machine, but first, check that your machine meets the system requirements for the model you want to run.
Now let's start. You will need an internet connection to set everything up, and a decent one at that: the Ollama installer alone is several hundred MB, since it ships with several heavy libraries. We are also assuming you are using a Windows machine.
1. Head over to https://ollama.com/download to download the Ollama installer. You can install it either through PowerShell or with the downloadable executable.
2. Once Ollama is set up on your machine, you are ready to install the LLM of your choice. Head over to https://ollama.com/search and look for the model you would like to install. We'll use the DeepSeek R1 model (https://ollama.com/library/deepseek-r1) here, since we are techies. Each model page lists the PowerShell command that downloads the model to your machine. For example, to download and run DeepSeek R1, use the command:
ollama run deepseek-r1
Once the download finishes, you can proceed to ask the model questions and watch it "thinking".
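If you later want to ask questions from a script instead of typing into the terminal, the same `ollama run` command can be driven from Python. Here is a minimal sketch using only the standard library; the model name and prompt are just examples, and it assumes Ollama is installed and the model has already been downloaded:

```python
import subprocess

def build_command(model: str, prompt: str) -> list[str]:
    # `ollama run MODEL PROMPT` prints the model's answer and exits,
    # instead of opening the interactive chat session.
    return ["ollama", "run", model, prompt]

def ask_cli(model: str, prompt: str) -> str:
    """Run the model once via the Ollama CLI and return its answer."""
    result = subprocess.run(
        build_command(model, prompt),
        capture_output=True,  # collect the answer instead of printing it
        text=True,
        check=True,           # raise if ollama is missing or the model isn't pulled
    )
    return result.stdout.strip()

# Usage (requires Ollama installed and the model pulled):
# answer = ask_cli("deepseek-r1", "Explain recursion in one sentence.")
```

This is handy for batch jobs, for example generating answers for a list of prompts overnight, all without touching the internet.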
"What about a ChatGPT-like page since I don't like command stuff?"
I can hear you asking this. Once installed, Ollama comes with its own ChatGPT-like user interface. Navigate to your system tray, right-click the Ollama icon, and choose "Open Ollama" from the context menu that pops up. Now you have a cool UI, as shown below.
In the chat box you can select the model you have just installed, or even download another one. From this interface you can "chat" with your model without any internet connection. And if you are a coder who is into pair programming, you can use it locally for code snippets and debugging help, again without connecting to the internet.
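For coders, Ollama also exposes a local REST API (on http://localhost:11434 by default), so your own scripts and editor tooling can talk to the model directly. Here is a minimal sketch using only the Python standard library; the model name is just an example, and it assumes the Ollama service is running with that model already pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the locally running model and return its reply text."""
    req = build_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires the Ollama service running locally):
# print(ask("deepseek-r1", "Write a one-line Python palindrome check."))
```

Because everything stays on localhost, no prompt or answer ever leaves your machine.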
For advanced users, Ollama also offers cloud models with stronger reasoning capabilities and faster processing. Whether you stay fully local or mix in the cloud models, it is all up to you.