
Things I played with in 2025

おしょうさん

Jan 2025

  • Reviewed my LazyVim setup
  • Wrote an Ansible playbook to set up my working environment (a sketch follows this list)
    • bash, git, nvim, LazyVim, mise
  • Reviewed the repositories for services running on Docker
    • dns
    • rp (reverse proxy)
    • ldap
    • gotify
    • gitlab
    • kroki
    • jupyter notebook
    • gitlab runner
    • certbot
    • keepalived
  • Tried out LLMs locally using llama.cpp, vLLM, and Ollama
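
As a rough idea of what that working-environment playbook does, here is a minimal sketch assuming apt-based hosts; the package list, the mise install command, and the LazyVim starter clone are illustrative, not the actual repo contents.

```yaml
# Hypothetical sketch of a workstation setup playbook (not the real one).
- name: Set up working environment
  hosts: workstations
  tasks:
    - name: Install base packages
      become: true
      ansible.builtin.apt:
        name:
          - bash
          - git
          - neovim
        state: present
        update_cache: true

    - name: Install mise for the login user (assumed install method)
      ansible.builtin.shell: curl -fsSL https://mise.run | sh
      args:
        creates: "{{ ansible_env.HOME }}/.local/bin/mise"

    - name: Drop in the LazyVim starter config as the nvim config
      ansible.builtin.git:
        repo: https://github.com/LazyVim/starter
        dest: "{{ ansible_env.HOME }}/.config/nvim"
```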

Feb 2025

  • Redesigned the lab environment, along with a new public repo for writing the new post
  • Tested Cilium L2 announcements and the Gateway API (a sketch follows this list)
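
For reference, a minimal sketch of the L2 side, assuming Cilium's v2alpha1 CRDs; the pool CIDR, interface regex, and names are made-up example values, and older Cilium releases use spec.cidrs instead of spec.blocks in the IP pool.

```yaml
# Hypothetical sketch: hand out LoadBalancer IPs and announce them over L2.
apiVersion: cilium.io/v2alpha1
kind: CiliumLoadBalancerIPPool
metadata:
  name: lab-pool
spec:
  blocks:
    - cidr: 192.168.10.0/28
---
apiVersion: cilium.io/v2alpha1
kind: CiliumL2AnnouncementPolicy
metadata:
  name: lab-l2
spec:
  interfaces:
    - ^eth[0-9]+
  externalIPs: true
  loadBalancerIPs: true
```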

Mar 2025

  • Purchased a Mac mini M4 and made it my local LLM server (Ollama & Open WebUI); a compose sketch follows this list
  • Tested out the Open WebUI APIs
  • Wrote test pipes and functions on Open WebUI
  • Tested MCP servers
  • omg I built my own "sebastian" persona on latte, must write an article on this...
    • open webui, ollama, latte, had dLLM Mercury and Gemma3 come up with the persona text
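
A minimal sketch of the Open WebUI side of that setup, assuming Ollama runs natively on the Mac (listening on its default port 11434) and Open WebUI runs as a container; the host port and volume name are just common defaults, not necessarily what I used.

```yaml
# Hypothetical compose sketch: Open WebUI container talking to a natively
# running Ollama on the same Mac mini (Ollama's default port is 11434).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    volumes:
      - open-webui:/app/backend/data
    restart: unless-stopped

volumes:
  open-webui:
```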

Apr 2025

  • Letta (MemGPT) requires more machine resources than I have to run with a local LLM... gave up on it. The LLM just loses track of memory and frequently forgets how to even use the tool, whose instructions are always included in the system prompt...
  • Bought ANOTHER mini PC to use as a hypervisor
  • Built a Hugo website using the Docsy theme
    • Had problems back then with the latest Hugo (0.146.3?) and the latest Docsy (0.11.0)
    • Hugo 0.145.0 works okay except for Mermaid; had a partials override replace the script that comes with the theme to get it working
    • MkDocs Material theme... Hugo w/ Docsy theme... I like both
  • Testing out Gemma 3 quantization-aware trained (QAT) models
    • Gotta keep an eye on the Ollama library for Gemma 3; it's the only model that gets continuous updates...
    • Testing out Cogito, Mistral Small, and Granite 3.3 as well, hoping some of them eventually work out well with my MCP tools
  • Load from running GitLab 17.10.4 was crazy. Its CPU usage occasionally hit the ceiling (1100+% on 12 cores) and the service rebooted... Upgraded to 17.11.0 and am now monitoring.
  • Planning to test out Gitea, hoping it will be less demanding
  • Revised my Ansible project for homelab-v3 to support projects on Gitea
  • Wrote Ansible tasks and a Docker Compose project to deploy and register the Gitea act_runner (a sketch follows this list)
  • Established a backup and restore process for Gitea 1.23.7 & Postgres 16.8, wrote a script to upload the backup data to Cloudflare R2, and put everything together with a systemd timer and service to run the backup periodically
  • Learned Gitea Actions (GitHub Actions compatible) for the first time... very different from how GitLab pipelines look
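
The act_runner compose project looks roughly like the sketch below; the instance URL, runner name, and the way the registration token is injected are placeholders, and the docker.sock path would differ on a rootless host.

```yaml
# Hypothetical sketch of the Gitea act_runner compose project.
services:
  act-runner:
    image: docker.io/gitea/act_runner:latest
    environment:
      GITEA_INSTANCE_URL: https://gitea.example.internal   # placeholder URL
      GITEA_RUNNER_REGISTRATION_TOKEN: ${RUNNER_REG_TOKEN}  # from .env / secret
      GITEA_RUNNER_NAME: compose-runner-01
    volumes:
      - ./data:/data
      - /var/run/docker.sock:/var/run/docker.sock
    restart: unless-stopped
```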

May 2025

  • Migration from GitLab to Gitea (WIP)
    • Designing my own "Gitea Pages" to migrate what I have published on GitLab Pages
    • Migrated 200 repos
    • Rewriting GitLab CI/CD pipeline jobs as Gitea Actions workflows (an example follows this list)
    • Revising Ansible playbooks to use the service definition repositories hosted on Gitea (WIP)
  • Continuously updating my dotfiles repository
    • Exited LazyVim, replacing its plugins with my own selections and customizations
    • Developed more Go programs and shell scripts as easy-access CLI tools
  • Implemented the kube-prometheus stack using the community Helm chart (values sketch after this list)
  • Shut down the lab-hlv3 Kubernetes cluster
  • Migrated the main hlv3 Kubernetes cluster's GitOps source from GitLab to Gitea
  • Planning to do hlv4, another Kubernetes cluster built from scratch
  • Qwen3 30B-A3B is my favorite local LLM to run on the Mac mini M4, as it's a lot faster than Gemma 3 27B
  • Playing with Gemma 3 27B and Gemma 3n 4B on Google
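
As an example of the rewrite target, a minimal Gitea Actions workflow sketch; the trigger, labels, and commands are placeholders rather than one of my real pipelines.

```yaml
# Hypothetical .gitea/workflows/build.yaml sketch (GitHub Actions syntax).
name: build
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the site
        run: make build
      - name: Upload the result
        uses: actions/upload-artifact@v3
        with:
          name: site
          path: public/
```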
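
And for the kube-prometheus stack, a hypothetical values.yaml excerpt for the prometheus-community/kube-prometheus-stack chart; the retention and storage numbers here are made up.

```yaml
# Hypothetical values excerpt for the kube-prometheus-stack Helm chart.
grafana:
  enabled: true
alertmanager:
  enabled: true
prometheus:
  prometheusSpec:
    retention: 14d
    storageSpec:
      volumeClaimTemplate:
        spec:
          resources:
            requests:
              storage: 20Gi
```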

Jun 2025

  • Migrated all repos from GitLab to Gitea, rewriting workflows and setting up a GitLab Pages alternative locally
  • Shut down the self-hosted GitLab
  • Deployed Sophos Firewall on Proxmox and created an additional bridge to connect devices into a separate, internal network

Aug 2025

  • Set up Proxmox VE 9, shut down Proxmox VE 8.4
  • Set up and ran Debian 9 servers on Proxmox
  • Moved the reverse proxy and DNS services to rootless Docker hosts (sketch below)
  • LLM: qwen3:30b-a3b is still my favorite, over gemma3n:e4b and gpt-oss:20b, to run on my Mac mini M4
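
A rough compose sketch of a reverse proxy under rootless Docker; Caddy is only an example since the proxy in use isn't named above, and the high host ports reflect that rootless Docker can't bind ports below 1024 unless net.ipv4.ip_unprivileged_port_start is lowered.

```yaml
# Hypothetical sketch: reverse proxy on a rootless Docker host.
# Ports 8080/8443 are used because unprivileged users can't bind <1024
# by default; lower ip_unprivileged_port_start to use 80/443 directly.
services:
  proxy:
    image: docker.io/library/caddy:2
    ports:
      - "8080:80"
      - "8443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy-data:/data
    restart: unless-stopped

volumes:
  caddy-data:
```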