TinyPilot: Build a KVM Over IP for Under $100

TinyPilot is my inexpensive, open-source device for controlling computers remotely. It works even before the operating system boots, so I use TinyPilot to install new OSes and debug boot failures on my bare metal homelab servers.

TinyPilot is now available for sale

The kit includes all the parts you need to build your own TinyPilot.

This post details my experience creating TinyPilot and shows how you can build your own for under $100 using a Raspberry Pi.

Photo of TinyPilot connecting two computers

Using TinyPilot to control my Ubuntu laptop from Chrome on my Microsoft Surface

I don’t want your life story; just tell me how to build it 🔗︎

If you’re a grinch who wants to skip my fascinating tale of triumph and despair in developing TinyPilot, jump directly to the section, “How to Build Your Own TinyPilot.”

Demo 🔗︎

Why TinyPilot? 🔗︎

A few years ago, I built my own home server for testing software. It’s been a valuable investment, and I use it every day.

Photo of my homelab VM server

The homelab server I built in 2017 to host my virtual machines

The server has no keyboard or monitor attached because I access it over ssh or a web interface. This is a convenient setup, but it also turns small issues into a colossal pain.

Every few months, I’ll screw something up and prevent the server from booting or joining the network, effectively locking me out of the machine. To get things running again, I have to disconnect everything, drag the server over to my desk, and juggle cables around to connect the server to the keyboard and monitor at my desktop.

Commercial solutions 🔗︎

Friends have raved to me about their experience with iDRAC. It’s a chip in Dell servers that provides a virtual console from the moment the system powers on. I briefly considered an iDRAC for my next home server, but its hefty price tag quickly put an end to that. The license alone costs $300, and it requires expensive custom hardware.

Screenshot of $300 price for iDRAC 9 Enterprise license

A license for Dell’s iDRAC technology costs $300 per machine plus the cost of hardware

Next, I looked at commercial KVM over IP solutions. They provide similar functionality to Dell’s iDRAC, but they’re external devices that connect to a computer’s keyboard, video, and mouse ports (hence the name KVM). Sadly, they’re even more expensive, ranging in price from $500 to $1000 per unit.

Screenshot of purchase page for Raritan Dominion KVM over IP

Commercial KVM over IP devices cost between $500 and $1,000.

As lazy as I am about dragging servers around, I couldn’t justify spending $500 to save myself the trouble of swapping cables around a few times per year.

So, I did what any appropriately irrational programmer would do: spend several hundred hours building my own KVM over IP.

Building a KVM over IP with Raspberry Pi 🔗︎

The Raspberry Pi is a small, inexpensive single-board computer. The devices are powerful enough to run a full desktop operating system, yet they cost only $30-60, which makes them a popular tool among hobbyists and programmers.

Raspberry Pi in the palm of my hand

The Raspberry Pi is a fully-functional computer that fits on a single board and costs only $30-60.

Recent versions of the Pi support USB on-the-go (USB OTG), which allows the Pi to impersonate USB devices such as keyboards, thumb drives, and microphones.

As a proof of concept of my Pi-as-KVM idea, I created a simple web app called Key Mime Pi.

Screenshot of Key Mime Pi web interface

Key Mime Pi, my early precursor to TinyPilot that only supported keyboard forwarding.

Key Mime Pi connects to another computer via USB and registers as a USB keyboard. It also presents a web page and listens for JavaScript key events. As the user types, Key Mime Pi captures the key events and translates them into keystrokes through its fake USB keyboard. This causes the keystrokes to appear on the target computer. I described this behavior in depth in my previous post.
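Under the hood, USB gadget mode exposes a device file (typically /dev/hidg0) that accepts standard 8-byte HID keyboard reports: one modifier byte, one reserved byte, and up to six key codes. Here’s a rough sketch of the idea — the report-building code below is my own illustration, not Key Mime Pi’s actual implementation:

```python
# Sketch of sending a keystroke through a USB HID gadget device.
# A boot-protocol keyboard report is 8 bytes: [modifiers, reserved,
# keycode1..keycode6]. Key codes come from the USB HID usage tables.

KEY_A = 0x04           # HID usage ID for the letter "a"
MOD_LEFT_SHIFT = 0x02  # modifier bitmask for left shift

def build_report(modifiers: int, keycodes: list[int]) -> bytes:
    """Build an 8-byte HID keyboard report."""
    if len(keycodes) > 6:
        raise ValueError("a boot-protocol report holds at most 6 keys")
    # Pad the key code list out to exactly six entries.
    padded = keycodes + [0] * (6 - len(keycodes))
    return bytes([modifiers, 0] + padded)

def send_keystroke(report: bytes, device: str = "/dev/hidg0") -> None:
    """Write a key-down report followed by an all-zeros key-up report."""
    with open(device, "wb") as hid:
        hid.write(report)
        hid.write(bytes(8))  # releasing all keys ends the keystroke

# Pressing shift+a types a capital "A" on the target machine:
# send_keystroke(build_report(MOD_LEFT_SHIFT, [KEY_A]))
```

The key-up (all zeros) report matters: without it, the target machine behaves as if the key is held down forever.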

The challenge of capturing video 🔗︎

Keyboard forwarding isn’t so useful if you can’t see what’s happening on the screen. My obvious next step was to find a way to capture my server’s display output in the Pi and render it in the browser.

My first attempt at video capture was to use the Lenkeng LKV373A HDMI extender. Daniel Kučera (aka danman) did an excellent job reverse engineering this device. It was available from Chinese merchants on eBay for around $40, so it seemed like my best option.

Photo of Lenkeng LKV373A HDMI extender

The Lenkeng LKV373A HDMI extender was my first attempt at HDMI video capture.

Capturing video was tricky because the LKV373A transmitter isn’t a video capture device. Its intended purpose is to pair with an LKV373A receiver that converts the network stream back to HDMI output. In danman’s investigation, he discovered a way to intercept and capture the video stream, but the LKV373A speaks a non-standard variant of the RTP protocol that few video tools understand.

Fortunately, danman contributed a patch to ffmpeg that handles the LKV373A’s goofy behavior, so I was able to render the stream using ffmpeg’s video player:

ffplay -i udp://239.255.42.42:5004
Screenshot of ffplay rendering video stream from LKV373A

Rendering the video stream from the LKV373A with ffplay

It was here that I received my first taste of a problem that dogged me throughout the project: latency. There was almost a one-second delay between the target computer and the video playback on my desktop.

Photo showing 838 milliseconds of latency with the LKV373A

The LKV373A introduced 838 milliseconds of latency before any re-encoding.

I tried playing around with ffplay’s many command-line flags to speed up the stream, but I never pushed past 800 milliseconds. And that was on my desktop with its high-end GPU and CPU. It didn’t bode well performance-wise for my scrappy little Raspberry Pi.

Fortunately, I found a better solution by complete coincidence.

HDMI to USB dongle 🔗︎

While mindlessly scrolling through Twitter, I happened to see a tweet by Arsenio Dev about a low-cost HDMI to USB dongle he had just purchased:

Screenshot of Arsenio Dev’s tweet about the HDMI to USB dongle

A tweet from Arsenio Dev tipped me off to a better video capture solution.

Capturing video at 1080p resolution and 30 frames per second seemed a little too good to be true, so I ordered one from eBay. It was only $11, including shipping. I don’t even know what you call it — it has no brand name, so I’ll just call it “the HDMI dongle.” There are several variants, but they’re all just different housings around the same MacroSilicon MS2109 chip.

Screenshot of HDMI dongle for sale on eBay for $11.20

HDMI to USB dongle available on eBay for $11.20 with free shipping

When the device arrived a few days later, it blew me away. Without any tinkering, it showed up as a UVC video capture device as soon as I plugged it into the Raspberry Pi.

$ sudo v4l2-ctl --list-devices
bcm2835-codec-decode (platform:bcm2835-codec):
        /dev/video10
        /dev/video11
        /dev/video12

UVC Camera (534d:2109): USB Vid (usb-0000:01:00.0-1.4): <<< HDMI capture dongle
        /dev/video0
        /dev/video1
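As a hypothetical helper (not part of TinyPilot), you could locate the dongle’s device nodes by parsing this listing, keying on the `534d:2109` USB ID shown above:

```python
# Parse `v4l2-ctl --list-devices` output to find the /dev/video*
# nodes belonging to a capture device, matched by a substring of its
# header line.

def find_video_devices(listing: str, match: str) -> list[str]:
    """Return the device paths listed under the device whose
    unindented header line contains `match`."""
    devices: list[str] = []
    capturing = False
    for line in listing.splitlines():
        if not line.strip():
            continue
        if not line.startswith((" ", "\t")):  # header line
            capturing = match in line
        elif capturing:
            devices.append(line.strip())
    return devices

listing = """\
bcm2835-codec-decode (platform:bcm2835-codec):
\t/dev/video10
\t/dev/video11

UVC Camera (534d:2109): USB Vid (usb-0000:01:00.0-1.4):
\t/dev/video0
\t/dev/video1
"""
print(find_video_devices(listing, "534d:2109"))
# ['/dev/video0', '/dev/video1']
```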

Within minutes, I was able to capture and restream HDMI video:

# On the Pi
ffmpeg \
  -re \
  -f v4l2 \
  -i /dev/video0 \
  -vcodec libx264 \
  -f mpegts udp://10.0.0.100:1234/stream

# On my Windows desktop
ffplay.exe -i udp://@10.0.0.100:1234/stream

It was so darn convenient, too. The LKV373A was nearly brick-sized and required its own power source and Ethernet cable. The HDMI dongle was as small as a thumb drive and required nothing more than a USB port.

Comparison of Lenkeng LKV373A with HDMI dongle

The Lenkeng LKV373A HDMI extender (left) was larger and required more connections than the HDMI dongle (right).

The only problem was, again, latency. The Pi’s rebroadcast of the video stream lagged the source computer by 7-10 seconds.

Photo showing video latency of up to 10 seconds with the HDMI dongle

Using ffmpeg to stream video from my Pi, there was a delay in the video of up to 10 seconds.

I wasn’t sure if this delay came from the dongle itself, ffmpeg on the Pi, or ffplay on my desktop. Arsenio Dev reported latency of 20 ms, so it seemed like faster performance was possible if I delved into ffmpeg’s arcane and mysterious command-line flags.

Another stroke of luck spared me from that miserable task.

Borrowing from a similar project 🔗︎

When I published my previous blog post about Key Mime Pi, I received a comment from Max Devaev, who encouraged me to check out his project, Pi-KVM.

Max's comment: Hi:) Take a look at this project: https://github.com/pikvm/pikvm We have already done and debugged many things

Max Devaev pointed me to his existing Pi-KVM project.

GPIO pins

My previous experience with breadboards involved accidentally melting them.

I had looked at Pi-KVM briefly, but its requirements of breadboards and soldering scared me off.

At Max’s suggestion, I gave Pi-KVM a second look, particularly interested in how he solved the video latency issue. I noticed that he captured video through a tool called uStreamer.

Note: From further discussions with Max, I’ve learned that Pi-KVM does support builds without soldering or breadboards.

uStreamer: a super-fast video streamer 🔗︎

Have you ever found a tool that’s so good, it solves problems you hadn’t even anticipated?

Right out of the box, uStreamer reduced my latency from 8 seconds to 500-600 milliseconds. But it also eliminated a whole chain of extra work.

500 ms latency with uStreamer and the HDMI dongle

uStreamer reduced my latency by a factor of 15.

Prior to uStreamer, I wasn’t sure how to get video from ffmpeg into the user’s browser, but I knew it was possible somehow. I tested this mostly-accurate tutorial for piping video from ffmpeg to nginx using HLS, but it added even more latency. And it still left open problems like how to start and stop streaming on HDMI cable connects and disconnects and how to translate the video to a browser-friendly format.

uStreamer solved all of this. It ran its own minimal HTTP server that served video in Motion JPEG, a format browsers play natively. I didn’t have to bother with HLS streams or getting ffmpeg and nginx to talk to each other.
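Motion JPEG over HTTP is conceptually simple: a single long multipart response where each part is a complete JPEG frame. Here’s a toy illustration of the idea — this is my sketch, not uStreamer’s implementation, and the boundary string and `capture_frames` source are placeholders:

```python
# Toy Motion JPEG streamer: serves frames as one long
# multipart/x-mixed-replace HTTP response, which browsers render as
# live video inside an <img> tag.
from http.server import BaseHTTPRequestHandler, HTTPServer

BOUNDARY = b"frameboundary"  # arbitrary multipart boundary

def frame_part(jpeg: bytes) -> bytes:
    """Wrap one JPEG frame as a multipart body part."""
    headers = (b"--" + BOUNDARY + b"\r\n"
               + b"Content-Type: image/jpeg\r\n"
               + b"Content-Length: " + str(len(jpeg)).encode()
               + b"\r\n\r\n")
    return headers + jpeg + b"\r\n"

def capture_frames():
    """Hypothetical frame source: a real streamer would read JPEG
    frames from the V4L2 capture device (e.g., /dev/video0)."""
    yield b"\xff\xd8\xff\xd9"  # placeholder JPEG bytes

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header(
            "Content-Type",
            "multipart/x-mixed-replace; boundary=" + BOUNDARY.decode())
        self.end_headers()
        # A real streamer loops forever, writing each captured frame.
        for jpeg in capture_frames():
            self.wfile.write(frame_part(jpeg))

# HTTPServer(("", 8000), StreamHandler).serve_forever()
```

Because each part is a self-contained JPEG, there’s no inter-frame compression and no decoder state to synchronize, which is part of why the format is so latency-friendly.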

The tool was so fully-featured that I assumed Max had simply forked it from a more mature project, but I was mistaken. This maniac wrote his own video encoder in C just to squeeze the maximum performance out of Pi hardware. I quickly donated to Max, and I invite anyone who uses his software to do the same.

Improving video latency 🔗︎

uStreamer reduced my latency from 10 seconds down to ~600 milliseconds. That was a huge leap forward but still a noticeable delay. I told Max I was interested in funding uStreamer further if he could find ways to improve performance, so we got to chatting.

Max was interested in the HDMI dongle I was using since he’d never seen that particular device. He invited me to create a shared shell session using tmate so that he could access my Pi remotely.

Screenshot of conversation where Max offers to help me via tmate

Max offered to either help improve latency or frame me for a federal crime. Fortunately, he ended up doing the former.

After a few minutes of testing how uStreamer ran on my hardware, Max ran the v4l2-ctl utility and saw a line that fascinated him but totally went over my head:

$ sudo v4l2-ctl --all
Driver Info:
        Driver name      : uvcvideo
        Card type        : UVC Camera (534d:2109): USB Vid
...
Format Video Capture:
        Width/Height      : 1280/720
        Pixel Format      : 'MJPG' (Motion-JPEG)
...
Streaming Parameters Video Capture:
        Capabilities     : timeperframe
        Frames per second: 30.000 (30/1)

The HDMI dongle was delivering the video stream in Motion JPEG format! uStreamer’s hardware-assisted encoding was fast, but it was totally unnecessary, as modern browsers play Motion JPEG natively.

We configured uStreamer to skip re-encoding and just pass through the video stream as-is.

Photo showing 200ms of latency after eliminating re-encode step

Skipping the extra re-encode step on the Pi reduced latency from 600 ms down to 200 ms.

Latency went from 600 milliseconds all the way down to 200 ms. It’s not instantaneous, but it’s low enough to forget the delay after using it for a few minutes.

TinyPilot in action 🔗︎

Remember way back at the beginning of this post when I said I wanted TinyPilot so that I could access my headless VM server before it boots? Well, it works and I do!

I iterated on Key Mime Pi to make a new web interface that integrates the video capture functionality:

I built a new headless VM server this year and used TinyPilot to install Proxmox, an open-source hypervisor and web interface for managing VMs.

TinyPilot allowed me to manage the entire install from my browser. It was definitely more pleasant than my old process of dragging computers around and swapping cables back and forth.

How to build your own TinyPilot 🔗︎

Parts list 🔗︎

Want an all-in-one TinyPilot kit?

Support TinyPilot’s development by purchasing an official TinyPilot kit. It includes all the parts you need to build TinyPilot and guarantees access to premium versions of TinyPilot software I may release in the future.

Install Raspberry Pi OS Lite 🔗︎

To begin, install Raspberry Pi OS Lite (formerly known as Raspbian) on a microSD card.

Screenshot of Rufus

I use Rufus to write my Pi microSD cards, but any whole-disk imaging tool will work.

Enable SSH access by placing a file called ssh on the microSD’s boot partition. If you’re connecting over wireless, you also need a wpa_supplicant.conf file.
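For wireless setup, a minimal wpa_supplicant.conf looks like the following — substitute your own country code and network credentials:

```
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourNetworkPassword"
}
```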

When you finish preparing the microSD card, insert it into your Pi device.

Install a case (optional) 🔗︎

The Raspberry Pi 4 famously generates a lot of heat. You can run it without any cooling, but you’ll likely hit stability issues over time.

I like this minimalist case because it’s inexpensive and passively cools the Pi without the complexity of a powered fan:

Minimal aluminum case for Raspberry Pi

This minimalist aluminum case cools your Pi well without the complexity of a fan.

Connect to the machine via USB 🔗︎

To enable TinyPilot to function as a virtual keyboard, connect your Pi’s USB-C port to a USB-A port on the target machine:

Note: Prefer USB 3.0 ports, as they provide more power to the Pi.

Attach the HDMI capture dongle 🔗︎

To complete the physical assembly, insert the HDMI dongle into one of the Pi’s USB ports. Then, connect an HDMI cable to the dongle, and plug the other end into the display output of your target computer.

Note: If the computer you’re connecting to has no HDMI output, you should be able to use a DisplayPort to HDMI cable or a DVI to HDMI cable, though I haven’t tested these personally.

Connect an Ethernet cable 🔗︎

If you’re connecting to your Pi over wired LAN, attach a network cable to your Pi’s Ethernet port:

Photo of Ethernet cable connected to Pi device

Connect an Ethernet cable to your Pi.

Note: You can skip this step if you configured wireless access by adding a wpa_supplicant.conf file above.

Install the TinyPilot software 🔗︎

SSH into your Pi device (default credentials for Raspberry Pi OS are pi / raspberry), and run the following commands:

curl -sS https://raw.githubusercontent.com/mtlynch/tinypilot/master/quick-install \
  | bash -
sudo reboot

If you’re appropriately suspicious of piping a random web script into your shell, I encourage you to inspect the source.

The script bootstraps a self-contained Ansible environment with my TinyPilot Ansible role. It installs four services that run on every boot:

  • nginx: a popular open-source web server
  • ustreamer: a lightweight HTTP video streaming server
  • usb-gadget: a script that enables the Pi’s “USB gadget mode,” which allows the Pi to impersonate USB devices
  • tinypilot: the web interface I created for TinyPilot

Using TinyPilot 🔗︎

After you run the install script, TinyPilot’s web interface will be available at http://raspberrypi/:

Screenshot of TinyPilot web interface

When setup is complete, you can access TinyPilot’s web interface at http://raspberrypi/ on your local network.

The power problem 🔗︎

The biggest limitation of this setup is power. Relying on the target computer for power means that when the target shuts down, the Pi suffers an unexpected power cut.

Further, the Pi 4 needs 3 amps for stable operation, though it can run at lower power. A computer’s USB 3.0 port provides only 0.9 amps, and USB 2.0 provides only 0.5 amps, which is why you may see these warnings in the Pi’s system logs:

$ sudo journalctl -xe | grep "Under-voltage"
Jun 28 06:23:15 pikvm kernel: Under-voltage detected! (0x00050005)
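The hex value in that log line packs the Pi firmware’s throttling status bits, the same bitmask that `vcgencmd get_throttled` reports. A small decoder, written as a sketch using the bit meanings from the Raspberry Pi documentation:

```python
# Decode the Raspberry Pi firmware throttling status bitmask that
# appears in under-voltage kernel log messages and in the output of
# `vcgencmd get_throttled`.

THROTTLE_BITS = {
    0: "under-voltage detected",
    1: "ARM frequency capped",
    2: "currently throttled",
    3: "soft temperature limit active",
    16: "under-voltage has occurred",
    17: "ARM frequency capping has occurred",
    18: "throttling has occurred",
    19: "soft temperature limit has occurred",
}

def decode_throttle_status(status: int) -> list[str]:
    """Return the human-readable conditions set in the status word."""
    return [desc for bit, desc in THROTTLE_BITS.items()
            if status & (1 << bit)]

for condition in decode_throttle_status(0x00050005):
    print(condition)
# under-voltage detected
# currently throttled
# under-voltage has occurred
# throttling has occurred
```

So the 0x00050005 above means the Pi was under-voltage and throttled at that moment, and that both conditions had also occurred earlier in the boot.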

To solve this problem, I worked with an engineering firm to create a custom circuit board that splits the Pi’s USB-C port into two. The first port accepts USB power, so you can still deliver a full 3 Amps to the Pi. The second accepts USB data out, so the Pi can still impersonate a USB keyboard.

Importantly, the power connector’s data port excludes a USB power line. This ensures that voltage differences between the computer’s power source and the Pi’s power source won’t cause undesirable power backflows.

Note: Without a proper connector, there’s a risk of hardware damage if you power the Pi from an external power source while it’s connected to a computer. See the TinyPilot wiki for additional details.

TinyPilot kits 🔗︎

If you’d like to support further development of this software, consider donating or purchasing a TinyPilot kit. Kits include all the equipment you need to build your own TinyPilot and come with a preformatted microSD card, so you don’t need to configure any software.

Screenshot of TinyPilot order page

Purchasing a TinyPilot kit supports future development of TinyPilot and guarantees you access to premium features I may add in the future.

Source code 🔗︎

All TinyPilot software is open-source under the permissive MIT license:

  • tinypilot: The TinyPilot web interface and backend.
  • ansible-role-tinypilot: The Ansible role for installing TinyPilot and its dependencies as systemd services.

Special thanks to Max Devaev for his incredible work on uStreamer and his contributions to TinyPilot.
