An LLM on a home computer

I saw "How to run an LLM on your PC, not in the cloud, in less than 10 minutes" and decided to try it on my Linux desktop system. It took a lot more than 10 minutes, but I got it to work.

Looking at the BIOS, the system dates from 2013. Here are the details:

gmcdavid@Tyrone:~$ neofetch
  .-/+oossssoo+/-.               gmcdavid@Tyrone 
`:+ssssssssssssssssss+:`           --------------- 
-+ssssssssssssssssssyyssss+-         OS: Ubuntu 22.04.4 LTS x86_64 
.ossssssssssssssssssdMMMNysssso.       Host: HP ProDesk 600 G1 SFF 
/ssssssssssshdmmNNmmyNMMMMhssssss/      Kernel: 6.5.0-18-generic 
+ssssssssshmydMMMMMMMNddddyssssssss+     Uptime: 15 days, 19 hours, 23 mins 
/sssssssshNMMMyhhyyyyhmNMMMNhssssssss/    Packages: 2781 (dpkg), 12 (snap) 
.ssssssssdMMMNhsssssssssshNMMMdssssssss.   Shell: bash 5.1.16 
+sssshhhyNMMNyssssssssssssyNMMMysssssss+   Resolution: 1600x900 
ossyNMMMNyMMhsssssssssssssshmmmhssssssso   DE: GNOME 42.9 
ossyNMMMNyMMhsssssssssssssshmmmhssssssso   WM: Mutter 
+sssshhhyNMMNyssssssssssssyNMMMysssssss+   WM Theme: Adwaita 
.ssssssssdMMMNhsssssssssshNMMMdssssssss.   Theme: Yaru [GTK2/3] 
/sssssssshNMMMyhhyyyyhdNMMMNhssssssss/    Icons: Yaru [GTK2/3] 
+sssssssssdmydMMMMMMMMddddyssssssss+     Terminal: gnome-terminal 
/ssssssssssshdmNNNNmyNMMMMhssssss/      CPU: Intel i3-4160 (4) @ 3.600GHz 
.ossssssssssssssssssdMMMNysssso.       GPU: Intel 4th Generation Core Proce 
-+sssssssssssssssssyyyssss+-         Memory: 9055MiB / 31995MiB 
`:+ssssssssssssssssss+:`
  .-/+oossssoo+/-.                                       
 

"How to get the GPU info?" suggested that I run sudo lshw -C display, which reported:

  *-display                 
       description: VGA compatible controller
       product: 4th Generation Core Processor Family Integrated Graphics Controller
       vendor: Intel Corporation
  [several lines omitted]      

which means the GPU is integrated into the CPU and is probably not very powerful. Not surprising: this is an old machine that I bought refurbished. I added RAM to bring it up to 32GB, and it works well for what I usually do, but running AI may be a stretch.
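With no usable GPU, the whole model has to fit in system RAM, so a quick sanity check before downloading anything is worthwhile. A rough sketch (the 5 GiB figure is my own guess for a quantized 7B model plus overhead, not anything from the Ollama docs):

```shell
# Rough check: will a ~5 GiB quantized 7B model fit in available RAM?
# (5 GiB is an assumed figure; adjust for the model you plan to run.)
model_gib=5
avail_kib=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)
avail_gib=$((avail_kib / 1024 / 1024))
if [ "$avail_gib" -ge "$model_gib" ]; then
    echo "OK: ${avail_gib} GiB available for a ${model_gib} GiB model"
else
    echo "Tight: only ${avail_gib} GiB available"
fi
```

With 32GB installed and about 9GB in use, this box clears the bar comfortably.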

The Register article said it could be installed with a single curl command, so I tried it:

  gmcdavid@Tyrone:~$ curl -fsSL https://ollama.com/install.sh | sh
  Command 'curl' not found, but can be installed with:
  sudo snap install curl  # version 8.1.2, or
  sudo apt  install curl  # version 7.81.0-1ubuntu1.14
  See 'snap info curl' for additional versions.
  gmcdavid@Tyrone:~$ sudo snap install curl
  [sudo] password for gmcdavid: 
  curl 8.1.2 from Wouter van Bommel (woutervb) installed
  gmcdavid@Tyrone:~$ curl -fsSL https://ollama.com/install.sh | sh
  >>> Downloading ollama...
  ######################################################################## 100.0%
  Warning: Failed to open the file /tmp/tmp.4G2j4U60AW/ollama: No such file or directory
                                                                             0.0%
  curl: (23) Failure writing output to destination

So I tried the step-by-step procedure in "Ollama on Linux". I still got errors:

  gmcdavid@Tyrone:~$ sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   113  100   113    0     0    564      0 --:--:-- --:--:-- --:--:--   565
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Warning: Failed to open the file /usr/bin/ollama: Read-only file system
  0  281M    0  2747    0     0   4615      0 17:44:54 --:--:-- 17:44:54  4615

I really do not understand curl. (In hindsight, I suspect the problem was not curl itself but the snap packaging: snap applications are sandboxed, which would explain why it could not write to /tmp or /usr/bin; installing curl with apt instead might have avoided both errors.) However, I noticed the URL https://ollama.com/download/ollama-linux-amd64 and simply entered that into Firefox. That worked, and ollama-linux-amd64 appeared in my download directory after a short wait, since it is a 295MB file. I then manually copied it to /usr/bin/ and renamed it to simply ollama.
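Incidentally, the copy, rename, and chown steps can be collapsed into one command with install(1), which copies a file while setting its mode (and, with -o/-g, its owner) in a single step. A self-contained sketch using scratch paths, since the real command would target /usr/bin/ollama and need sudo:

```shell
# install(1) = cp + chmod (+ chown with -o/-g) in one step.
# Scratch paths are used so the example is self-contained; for real use:
#   sudo install -o root -g root -m 755 ~/Downloads/ollama-linux-amd64 /usr/bin/ollama
mkdir -p /tmp/demo-bin
printf '#!/bin/sh\necho fake ollama\n' > /tmp/ollama-linux-amd64   # stand-in binary
install -m 755 /tmp/ollama-linux-amd64 /tmp/demo-bin/ollama
ls -l /tmp/demo-bin/ollama
```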

gmcdavid@Tyrone:/usr/bin$ sudo chown root:root ollama  

fixed the file ownership, after which I followed the other steps in "Ollama on Linux":

  sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama

  cd /etc/systemd/system

  sudo nano ollama.service   [creating the file with the text from "Ollama on Linux"]

  sudo systemctl daemon-reload
  sudo systemctl enable ollama
  sudo systemctl start ollama
  
  gmcdavid@Tyrone:/etc/systemd/system$ sudo nano ollama.service
  gmcdavid@Tyrone:/etc/systemd/system$ sudo systemctl daemon-reload
  gmcdavid@Tyrone:/etc/systemd/system$ sudo systemctl enable ollama
  Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
  gmcdavid@Tyrone:/etc/systemd/system$ sudo systemctl start ollama  
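For reference, the ollama.service I pasted into nano looked roughly like this. It is a sketch based on the unit file in the Ollama Linux docs at the time, so check the current docs rather than trusting it verbatim; note that the "default.target" in the symlink message above comes from the WantedBy line:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```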

Success! Then, following the Register article, I downloaded and started the mistral model. The download took a while.
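How long "a while" is depends mostly on the connection. A back-of-the-envelope estimate for a 4.1 GB blob (the 50 Mbit/s line speed is just an assumed figure; plug in your own):

```shell
# Estimate download time for a 4.1 GB model blob at an assumed line speed.
size_mb=4100            # blob size in megabytes
speed_mbps=50           # assumed connection speed in megabits per second
secs=$(( size_mb * 8 / speed_mbps ))
echo "~$((secs / 60)) min $((secs % 60)) s at ${speed_mbps} Mbit/s"
```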

gmcdavid@Tyrone:/etc/systemd/system$ ollama run mistral
pulling manifest 
pulling e8a35b5937a5... 100% ▕████████████████▏ 4.1 GB                         
pulling 43070e2d4e53... 100% ▕████████████████▏  11 KB                         
pulling e6836092461f... 100% ▕████████████████▏   42 B                         
pulling ed11eda7790d... 100% ▕████████████████▏   30 B                         
pulling f9b1e3196ecf... 100% ▕████████████████▏  483 B                         
verifying sha256 digest 
writing manifest 
removing any unused layers 
success 
>>> Who are you and what is your purpose?
 I am an artificial intelligence designed to assist and communicate with users. My purpose is to help answer questions, provide information, and complete tasks as requested by users to make their lives easier and more productive. I can perform various functions such as setting reminders, managing calendars, sending emails, making reservations, providing news 
updates, and much more. I am always available 24/7 and constantly learning and improving to better assist my users.

>>> /?
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load    Load a session or model
  /save    Save your current session
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.

>>> /bye

It was a little slow answering my question “Who are you and what is your purpose?”, sort of like using a 300 baud dial-up system (which I did decades ago), but it works. I reran it; it did not need to download mistral again, which saved some time, but it still ran slowly. (If I read the Ollama docs right, ollama run mistral --verbose would print tokens-per-second statistics after each reply and put a number on the slowness.)

  gmcdavid@Tyrone:~$ ollama run mistral
  
  >>> Who made you?
   I was created by a team of artificial intelligence engineers and developers. They designed and programmed me with advanced natural language processing and machine learning capabilities to enable effective communication and task assistance.
  
  >>> What is the meaning of life?
   The meaning of life is a philosophical question that has been debated throughout history. Some people believe that life has no inherent meaning, and it is up to each individual to create their own purpose. Others hold various religious or spiritual beliefs that provide meaning through an afterlife or connection to a higher power. 
  Ultimately, the answer to this question depends on personal beliefs and values. It may be helpful to reflect on what brings joy, fulfillment, and satisfaction in your own life to discover your unique sense of purpose.

 >>> /bye
  gmcdavid@Tyrone:~$ sudo systemctl stop ollama
  [sudo] password for gmcdavid: 
  gmcdavid@Tyrone:~$ ollama run mistral
  Error: could not connect to ollama app, is it running?
  gmcdavid@Tyrone:~$

The last few lines are where I stopped the service and verified that it was no longer running. (Since I enabled it, it will start again at the next boot unless I also run sudo systemctl disable ollama.) I am still a little nervous about this creature 🙂
