Howto ollama
= install packages =

* archlinux

install nvidia drivers:

https://wiki.vidalinux.org/index.php?title=Howto_install_nvidia_drivers#archlinux
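
once the drivers are in place, a quick sanity check (an extra step, not part of the original howto) is to confirm the gpu is visible:

  nvidia-smi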

install ollama dependencies:

  pacman -S ollama ollama-cuda nvidia-container-toolkit docker docker-compose
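
docker is not started automatically after install; a reasonable next step (these commands are an addition, not from the original howto) is to enable the docker service and register the nvidia runtime with it so containers can use the gpu:

  systemctl enable --now docker.service
  nvidia-ctk runtime configure --runtime=docker
  systemctl restart docker.service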

= service =

copy systemd ollama daemon to /etc/systemd/system/:

  cp /usr/lib/systemd/system/ollama.service /etc/systemd/system/

edit /etc/systemd/system/ollama.service and add the following:

  Environment="OLLAMA_HOST=0.0.0.0"
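
for reference, the relevant part of the unit should then look roughly like this (the exact ExecStart path and other lines depend on the packaged unit file):

  [Service]
  ExecStart=/usr/bin/ollama serve
  User=ollama
  Group=ollama
  Restart=always
  Environment="OLLAMA_HOST=0.0.0.0"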

reload systemd daemon:

  systemctl daemon-reload

restart ollama.service:

  systemctl restart ollama.service

verify ollama is listening on all interfaces:

  netstat -tnulp | grep 11434
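
another quick check, assuming the default port 11434, is to query the http endpoint directly; a running server replies with "Ollama is running":

  curl http://localhost:11434/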

= ollama =

to search for supported models on ollama:

https://ollama.com/search

to pull a model:

  ollama pull deepseek-r1:8b

to list models installed:

  ollama list

to run a model:

  ollama run deepseek-r1:8b

to remove a model:

  ollama rm deepseek-r1:8b

to show info about model:

  ollama show deepseek-r1:8b
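
with the server listening on the network, the ollama http api can also be queried directly; a minimal curl example (an addition to the original steps), assuming the deepseek-r1:8b model pulled above and the default port:

  curl http://localhost:11434/api/generate -d '{"model": "deepseek-r1:8b", "prompt": "why is the sky blue?", "stream": false}'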

= create your own ssl cert =

create rsa key:

  openssl genrsa -out server.key 3072

create certificate csr:

  openssl req -new -key server.key -out server.csr

fill in the following blanks:

  Country Name (2 letter code) []: US
  State or Province Name (full name) []: Puerto Rico
  Locality Name (eg, city) []: San Juan
  Organization Name (eg, company) []: Vidalinux.com Corp.
  Organizational Unit Name (eg, section) []: Linux Consulting
  Common Name (eg, your name or your server's hostname) []: chat.example.com
  Email Address []: chat@example.com
  Please enter the following 'extra' attributes
  to be sent with your certificate request
  A challenge password []: just press enter
  An optional company name []: just press enter

create the certificate:

  openssl x509 -req -days 365 -in server.csr -signkey server.key -out server.crt
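
to double-check the resulting self-signed certificate before using it (an extra step, not in the original howto):

  openssl x509 -in server.crt -noout -subject -dates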

= openwebui =

clone git repo:

  git clone https://github.com/vidalinux/ollama.git

copy server.key and server.crt to nginx/ssl/:

  cd ollama/openwebui
  mkdir nginx/ssl
  cp server.key server.crt nginx/ssl/

deploy services using docker-compose:

  docker-compose up -d
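
to confirm the containers came up correctly (an extra check, not part of the original steps):

  docker-compose ps
  docker-compose logs -f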

add your domain to /etc/hosts file:

  echo "127.0.0.1 chat.example.com" >> /etc/hosts

to access the openwebui interface, open the following url in your browser:

  https://chat.example.com
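
because the certificate is self-signed, the browser will warn about it and curl needs -k; a quick connectivity check (an addition to the original steps):

  curl -k https://chat.example.com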

= vscodium =

for development, install vscodium, a fork of vscode without telemetry:

  pacman -S vscodium --noconfirm

once installed, go to the extensions view and install the following extension:

  continue

in the continue extension settings, select your own ai model provider:

  ollama
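
as a rough sketch only (not from the original howto, and the exact schema depends on the continue version), a local ollama model can be declared in ~/.continue/config.json along these lines:

  {
    "models": [
      {
        "title": "deepseek-r1 8b (local)",
        "provider": "ollama",
        "model": "deepseek-r1:8b",
        "apiBase": "http://localhost:11434"
      }
    ]
  }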