Compare commits

...

5 commits

Author SHA1 Message Date
LRVT dfd7ff1d7a Update README.md 2024-04-23 14:01:17 +02:00
LRVT a6c2acd44a Update README.md 2024-04-23 12:54:28 +02:00
LRVT c6942a02f6 Update README.md 2024-04-23 12:44:27 +02:00
LRVT fbca5ac56d Update README.md 2024-04-23 12:41:10 +02:00
L4RM4ND ead126fdb4 add llms 2024-04-23 12:40:10 +02:00
5 changed files with 105 additions and 0 deletions

View file

@@ -61,6 +61,7 @@ docker compose up
- [Games and Control Panels](#games-and-control-servers)
- [Genealogy](#genealogy)
- [Identity Management - Single Sign-On (SSO) & LDAP](#identity-management---single-sign-on-sso--ldap)
- [LLM & AI](#large-language-models--ai)
- [Miscellaneous](#miscellaneous)
- [Money, Budgeting & Management](#money-budgeting--management)
- [Note-taking & Editors](#note-taking--editors)
@@ -122,6 +123,15 @@ A [proxy](https://en.wikipedia.org/wiki/Proxy_server) is a server application th
- [Keycloak](https://github.com/keycloak/keycloak-containers/tree/main/docker-compose-examples) - Keycloak is an open-source Identity and Access Management (IAM) solution for modern applications and services.
- [lldap](examples/lldap) - lldap is a lightweight authentication server that provides an opinionated, simplified LDAP interface for authentication. It integrates with many backends, from KeyCloak to Authelia to Nextcloud and more.
### Large Language Models & AI
**[`^ back to top ^`](#-project-list)**
A [Large Language Model (LLM)](https://en.wikipedia.org/wiki/Large_language_model) is a language model notable for its ability to achieve general-purpose language generation and other natural language processing tasks such as classification. LLMs can be used for text generation, a form of generative [AI](https://en.wikipedia.org/wiki/Artificial_intelligence), by taking an input text and repeatedly predicting the next token or word.
- [Ollama + Open WebUI](examples/ollama-ui) - Get up and running with Llama 3, Mistral, Gemma, and other large language models using Ollama, with an interactive, user-friendly web UI provided by Open WebUI (formerly known as Ollama WebUI).
- [Serge](examples/serge) - A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.
### Virtual Private Network (VPN) & Remote Access
**[`^ back to top ^`](#-project-list)**

View file

@@ -0,0 +1,23 @@
# References
- https://github.com/ollama/ollama
- https://hub.docker.com/r/ollama/ollama
- https://github.com/open-webui/open-webui
# Notes
You can spawn Ollama first and then download the [respective LLM models](https://ollama.com/library) via docker exec. Alternatively, spawn the whole stack directly and download LLM models within Open WebUI using a browser.
````
# spawn ollama and ui
docker compose up -d
# (optional) download an llm model via docker exec
docker exec ollama ollama run llama3:8b
````
Afterwards, we can browse Open WebUI on `http://127.0.0.1:8080` and register our first user account. You may want to disable open user registration later on by uncommenting the `ENABLE_SIGNUP` environment variable and recreating the Open WebUI container.
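For example, disabling open registration would mean uncommenting the corresponding line in the `ui` service's `environment` section of the compose file (a sketch, mirroring the variables used in the compose file below):

````yaml
    environment:
      # disable open user registration after creating the first (admin) account
      - "ENABLE_SIGNUP=false"
      - "OLLAMA_BASE_URL=http://ollama:11434"
````

Note that a plain container restart does not pick up compose file changes; run `docker compose up -d` again so the container is recreated with the new environment.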
> [!TIP]
>
> You likely want to pass a GPU into the Ollama container. Please read [this](https://hub.docker.com/r/ollama/ollama).
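As a sketch of the GPU passthrough mentioned in the tip above (assuming an NVIDIA GPU and the NVIDIA Container Toolkit installed on the host; AMD GPUs use a different setup, see the linked Docker Hub page), the `ollama` service could be extended like this:

````yaml
services:
  ollama:
    image: ollama/ollama:latest
    # reserve all available NVIDIA GPUs for this container
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
````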

View file

@@ -0,0 +1,39 @@
services:
ui:
image: ghcr.io/open-webui/open-webui:main
container_name: ollama-ui
restart: always
ports:
- 8080
expose:
- 8080
volumes:
- ${DOCKER_VOLUME_STORAGE:-/mnt/docker-volumes}/ollama/open-webui:/app/backend/data
environment:
#- "ENABLE_SIGNUP=false"
- "OLLAMA_BASE_URL=http://ollama:11434"
#networks:
# - proxy
#labels:
# - traefik.enable=true
# - traefik.docker.network=proxy
# - traefik.http.routers.ollama-ui.rule=Host(`ai.example.com`)
# - traefik.http.services.ollama-ui.loadbalancer.server.port=8080
# # Optional part for traefik middlewares
# - traefik.http.routers.ollama-ui.middlewares=local-ipwhitelist@file,authelia@docker
ollama:
image: ollama/ollama:latest
container_name: ollama
restart: always
expose:
- 11434
volumes:
- ${DOCKER_VOLUME_STORAGE:-/mnt/docker-volumes}/ollama/data:/root/.ollama
#networks:
# - proxy
#networks:
# proxy:
# external: true

View file

@@ -0,0 +1,7 @@
# References
- https://github.com/serge-chat/serge
# Notes
TBD

View file

@@ -0,0 +1,26 @@
services:
serge:
image: ghcr.io/serge-chat/serge:main
container_name: serge
restart: unless-stopped
ports:
- 8008
expose:
- 8008
volumes:
- ${DOCKER_VOLUME_STORAGE:-/mnt/docker-volumes}/serge/weights:/usr/src/app/weights
- ${DOCKER_VOLUME_STORAGE:-/mnt/docker-volumes}/serge/datadb:/data/db/
#networks:
# - proxy
#labels:
# - traefik.enable=true
# - traefik.docker.network=proxy
# - traefik.http.routers.serge.rule=Host(`serge.example.com`)
#   - traefik.http.services.serge.loadbalancer.server.port=8008
# # Optional part for traefik middlewares
# - traefik.http.routers.serge.middlewares=local-ipwhitelist@file,authelia@docker
#networks:
# proxy:
# external: true