Home Server Dashboard

Figure: Real-time home server monitoring with the Glance dashboard.

I got really into building a home server this month, and honestly, it turned into way more than I expected.

It started simple - I wanted to digitize my old DVD collection and have everything in one place. But once I started researching, I realized this was a perfect chance to learn Docker, Linux server management, and all the technologies I’ve been wanting to get more hands-on experience with.


Why I Built It Myself

I could have bought a Synology NAS and called it a day. They cost around $400 and everything just works, but where’s the learning in that?

I wanted to actually understand:

  • How Linux servers work
  • How Docker containers talk to each other
  • Why services crash and how to fix them
  • How to set up monitoring so I know when things break

Plus, I wanted to build something that would genuinely grow my technical skills - managing a fleet of Docker containers teaches a lot more than buying a prebuilt NAS.


The Hardware


Figure: Minisforum UN100P mini PC running headless.

I went with a cheap refurbished mini PC - a Minisforum UN100P for $105.

It’s tiny, silent, and uses around 6 watts of power at idle.

Specs:

  • Intel N100 processor
  • 16GB RAM
  • 512GB storage
  • Runs headless (no monitor needed)

Could I have gone for something with more performance headroom? Sure. But this is plenty for what I needed, and it was cheap enough that if I messed up completely, I wouldn’t feel terrible about it.


The Software Stack

This is where it got interesting. I’m running Ubuntu Server (no GUI, just command line), and everything runs in Docker containers.

What’s running:

  • Plex (the actual media server)
  • A few management tools to keep everything organized
  • Monitoring dashboards so I can see what’s happening
  • Some automation to handle transcoding and repetitive tasks

Why Docker? Because if something breaks, I can just restart that one container instead of the whole server.

Plus, Docker is widely used in production environments, so it’s great hands-on experience.
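That isolation is easiest to see in a Compose file: each service is its own unit, so a crash in one doesn't mean touching the rest. Here's a minimal sketch - the service names, image tags, and paths are illustrative, not my exact config:

```yaml
# docker-compose.yml (illustrative sketch, not my real setup)
services:
  plex:
    image: plexinc/pms-docker:latest
    restart: unless-stopped      # Docker brings it back up if it crashes
    volumes:
      - /srv/media:/data         # hypothetical media path

  dashboard:
    image: glanceapp/glance:latest
    restart: unless-stopped
    ports:
      - "8080:8080"
```

If the dashboard misbehaves, `docker compose restart dashboard` bounces just that container - Plex keeps streaming the whole time.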


What I Actually Learned

  • Docker networking is confusing at first.
    I spent an entire day figuring out why containers couldn’t talk to each other. Turns out I needed to create a custom network. Once I got it, it made sense.

  • Health checks are important.
    Services would show as “running” but weren’t actually working. I learned to add health checks so Docker knows if something’s actually broken.

  • Set resource limits on everything.
    One service had a memory leak and took down the whole server. Now everything has CPU and RAM limits so one bad service can’t kill everything else.

  • Monitoring from the start.
    I added monitoring tools after things started breaking. Should have done that on day one.

  • Document everything.
    I kept notes as I went, and when something broke weeks later, those notes saved me hours of troubleshooting.


The Struggles

Not gonna lie, there were some frustrating moments:

  • Spent 3 hours troubleshooting networking before realizing I had a typo in the config
  • Had a storage issue where the system only saw 100GB instead of 512GB (still haven’t fixed that)
  • One service kept crashing on startup — turned out it was starting before its dependencies were ready
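That last one is exactly what Compose's `depends_on` with a health condition is for. A generic sketch (the services here are placeholders, not my stack):

```yaml
# Sketch: make a service wait until its dependency is actually healthy
services:
  db:
    image: postgres:16
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      retries: 5

  app:
    image: my-app:latest         # hypothetical dependent service
    depends_on:
      db:
        condition: service_healthy   # don't start until db passes its health check
```

Without the `condition`, Compose only waits for the dependency's container to start, not for the service inside it to be ready - which is exactly the trap I fell into.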

What’s Next

I’m planning to set up remote access via Cloudflare Tunnel so I can access it from anywhere. Right now, it only works on my home network.
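Conveniently, cloudflared itself runs as just another container. Something like this should be the shape of it - the token is a placeholder you get from the Cloudflare Zero Trust dashboard, and I haven't set this up yet:

```yaml
# Sketch: Cloudflare Tunnel as a container (token is a placeholder)
services:
  cloudflared:
    image: cloudflare/cloudflared:latest
    command: tunnel --no-autoupdate run
    environment:
      - TUNNEL_TOKEN=${TUNNEL_TOKEN}   # issued when you create the tunnel in the dashboard
    restart: unless-stopped
```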

I’ll also need to fix that storage issue (likely a partition problem), and I should really automate my backups instead of doing them manually.
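For the backups, the simplest starting point is probably a nightly rsync job in cron - the paths below are made up for illustration:

```
# crontab fragment: nightly backup at 02:00 (paths are hypothetical)
0 2 * * * rsync -a --delete /srv/media/ /mnt/backup/media/ >> /var/log/media-backup.log 2>&1
```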

A task for another day.


Why This Matters for My Career

This project gave me hands-on experience with:

  • Linux server administration
  • Docker and containerization
  • Network configuration
  • Troubleshooting production-like systems
  • Infrastructure monitoring

It’s rewarding to see how much this overlaps with what professionals do in DevOps and system administration - managing containers, troubleshooting, and keeping systems stable.

Plus, it’s running 24/7 with minimal issues, which feels pretty good.


Final Thoughts

Is this overkill for organizing some videos? Probably.

But I learned more in one month of actually doing this than I would have from watching tutorials or reading documentation.

Breaking things and fixing them is how you actually learn.

If you’re thinking about building something similar - just start. You’ll mess up, Google a lot, and learn way more than you expected.