Lobby Labs // Homelab Systems

Scientific-grade infrastructure for the personal cloud.

Lobby Labs is a living Ubuntu-based homelab focused on self-hosting, resilience testing, and hands-on research into modern infrastructure patterns. Everything is measured, versioned, and designed to evolve.

Telemetry feed

  • Services: 18 (apps, data, routing)
  • Uptime: 99.95% (rolling 90 days)
  • Containers: 25 (Docker + systemd)
  • Alerts: 6 (last 30 days)

Core stack

Infrastructure pillars

Each pillar is treated as a research track with experiments, documentation, and automated recovery drills.

Hosting & Apps

Business and admin systems run directly on the Ubuntu host with clear separation between public and internal tools.

  • Lobby Studio admin
  • Server dashboard
  • Internal tooling
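
As an illustration of that separation, internal tools can bind to the loopback interface only, so they are reachable from the host but never exposed publicly; the snippet below is a minimal sketch using Python's standard library, with the port and handler chosen purely for demonstration.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Internal tooling listens on loopback only; it is never bound to 0.0.0.0.
# Public-facing apps are published through the Cloudflare tunnel instead.
server = HTTPServer(("127.0.0.1", 8088), SimpleHTTPRequestHandler)  # port 8088 is arbitrary
print("Internal dashboard reachable at http://127.0.0.1:8088 (host-only)")
server.serve_forever()
```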

Networking & Edge

All inbound traffic is routed through Cloudflare with reverse proxying and secure tunnels.

  • Cloudflare proxy
  • Cloudflare tunnels
  • Service routing rules
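
A quick way to confirm a service is actually being served through the Cloudflare edge is to check the headers Cloudflare adds to proxied responses. The sketch below uses only the standard library; the hostname is a placeholder for one of the tunneled services.

```python
import urllib.request

URL = "https://dash.example.com"  # placeholder for a tunneled hostname

with urllib.request.urlopen(URL, timeout=10) as resp:
    server = resp.headers.get("Server", "")
    cf_ray = resp.headers.get("CF-RAY")

# Cloudflare-proxied responses report "Server: cloudflare" and carry a CF-RAY id.
if server.lower() == "cloudflare" and cf_ray:
    print(f"Proxied by Cloudflare (ray ID {cf_ray})")
else:
    print("Warning: response did not come through the Cloudflare edge")
```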

Automation & Bots

Always-on bots handle community automation and operational tasks across Discord.

  • Sportscord
  • FightDemons
  • Webhook automations
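
The bot code itself isn't published here; as a rough sketch, an always-on Discord bot built with the discord.py library looks roughly like this (the command and token handling are illustrative, not the actual Sportscord or FightDemons code):

```python
import os
import discord

intents = discord.Intents.default()
intents.message_content = True  # required to read message text
client = discord.Client(intents=intents)

@client.event
async def on_ready():
    print(f"Logged in as {client.user}")

@client.event
async def on_message(message):
    if message.author == client.user:
        return
    if message.content.startswith("!ping"):  # illustrative command
        await message.channel.send("pong")

client.run(os.environ["DISCORD_TOKEN"])  # token supplied via environment variable
```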

Game Infrastructure

Containerized game servers are managed through Pelican for fast provisioning and isolation.

  • Pelican panel
  • Bedrock + Modded Minecraft
  • Hytale
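
Outside the Pelican panel, a lightweight way to spot-check the game containers is to ask Docker directly for their state; the container names below are hypothetical stand-ins for whatever Pelican provisions.

```python
import subprocess

# Hypothetical container names; Pelican assigns its own identifiers.
SERVERS = ["mc-bedrock", "mc-modded", "hytale"]

for name in SERVERS:
    result = subprocess.run(
        ["docker", "inspect", "-f", "{{.State.Status}}", name],
        capture_output=True, text=True,
    )
    status = result.stdout.strip() if result.returncode == 0 else "not found"
    print(f"{name}: {status}")
```
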
Security posture (updated hourly)

  • Edge protection: Cloudflare (tunnels + proxy)
  • Exposure score: Low (only tunneled services)
  • Patch cadence: Monthly (Ubuntu updates)

Lab protocol

How the lab is used.

The Lobby Labs server supports production apps, community automation, and game hosting in one place. It’s built to keep the business running while giving room to experiment with new services as they come online.

Business ops

Admin tools and internal dashboards keep day-to-day operations moving.

Community automation

Discord bots and webhooks handle engagement and moderation tasks.

Game hosting

Containerized servers power Minecraft and Hytale for friends and the community.

Alerts & oversight

Service alerts route to Discord for fast visibility.
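
Alert delivery is just a Discord webhook call; a minimal sketch with the standard library looks like this (the webhook URL and message are placeholders):

```python
import json
import urllib.request

WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"  # placeholder

payload = {"content": "ALERT: uptime check failed for dash.example.com"}  # example message
req = urllib.request.Request(
    WEBHOOK_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)  # Discord returns 204 No Content on success
```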

Roadmap

Upcoming experiments

The next research sprints focus on resilience, AI tooling, and additional self-hosted apps.

Phase 01

Private AI LLM

Local LLM stack using Ollama + Open WebUI for private inference on the server.
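
Once the stack is running, apps on the server could call the local model over Ollama's REST API; the sketch below assumes Ollama's default local endpoint and an already-pulled model, with the model name and prompt chosen only for illustration.

```python
import json
import urllib.request

# Assumes Ollama is listening on its default local port with a pulled model.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",  # illustrative model name
        "prompt": "Summarise last night's backup log in one sentence.",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```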

Phase 02

AdventureLog

Self-host AdventureLog to track and plan trips inside the lab stack.

Phase 03

Self-hosted Sentry

Bring in self-hosted Sentry for error tracking across apps and bots.
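
Once the self-hosted instance is up, apps and bots would report to it through the standard Sentry SDK pointed at the lab's own DSN; the sketch below is illustrative, with a placeholder DSN and a stand-in job.

```python
import sentry_sdk

# Placeholder DSN pointing at the self-hosted Sentry instance, not sentry.io.
sentry_sdk.init(
    dsn="https://<public_key>@sentry.internal.lab/<project_id>",
    traces_sample_rate=0.1,  # sample a fraction of transactions for performance data
)

def nightly_sync():
    # Stand-in for real app or bot code.
    raise RuntimeError("sync failed")

try:
    nightly_sync()
except Exception as exc:
    sentry_sdk.capture_exception(exc)  # event appears in the self-hosted dashboard
```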