Owl Blog

What does an owl do to be intentional?

A personal space for thoughtful engineering, practical learning, and consistent growth.


Reproducible Local LLM Stack on a Laptop — Docker, k3d, Ollama, Open WebUI


3/1/2026 by AdminOwl | 3 min read
local-llm DevOps docker k3d ollama open-webui Ansible
This post documents a reproducible DevOps workflow for running a local LLM stack on an Acer Predator laptop (64 GB RAM, RTX 5070 Ti with 12 GB VRAM, 2 TB storage). It covers preparing the host, optionally passing the GPU through to containers, running Ollama for local inference, creating a lightweight Kubernetes cluster with k3d to host the UI, deploying Open WebUI, and automating the whole stack with Ansible. The automation and manifests referenced throughout live in https://github.com/binarobb/local-llm-stack.
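The workflow above could be wired together along these lines. This is a minimal sketch, not the playbook from the linked repo: the play layout, the k3d cluster name, and the manifest path are assumptions, and GPU passthrough flags are omitted.

```yaml
# site.yml — hypothetical sketch of the automation described above.
# Assumes docker, k3d, kubectl, and the community.docker collection
# are already installed on the host.
- name: Local LLM stack on a laptop
  hosts: localhost
  connection: local
  tasks:
    - name: Run Ollama on the host via Docker (GPU flags omitted)
      community.docker.docker_container:
        name: ollama
        image: ollama/ollama:latest
        ports:
          - "11434:11434"   # Ollama's default API port
        state: started

    - name: Create a lightweight k3d cluster for the UI
      ansible.builtin.command: k3d cluster create llm
      register: k3d_out
      changed_when: k3d_out.rc == 0
      failed_when: k3d_out.rc != 0 and 'already exists' not in k3d_out.stderr

    - name: Deploy Open WebUI into the cluster (manifest path is assumed)
      ansible.builtin.command: kubectl apply -f manifests/open-webui.yaml
```

Running `ansible-playbook site.yml` would then bring the stack up in one shot; making the k3d and kubectl tasks properly idempotent (rather than shelling out) is left to the real playbook.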