TinyPilot is my inexpensive, open-source device for controlling computers remotely. It works even before the operating system boots, so I use TinyPilot to install new OSes and debug boot failures on my bare metal homelab servers.
TinyPilot is now available for sale. The kit includes all the parts you need to build your own TinyPilot.
This post details my experience creating TinyPilot and shows how you can build your own for under $100 using a Raspberry Pi.
I don’t want your life story; just tell me how to build it 🔗︎
If you’re a grinch who wants to skip my fascinating tale of triumph and despair in developing TinyPilot, jump directly to the section, “How to Build Your Own TinyPilot.”
Why TinyPilot? 🔗︎
A few years ago, I built my own home server for testing software. It’s been a valuable investment, and I use it every day.
The server has no keyboard or monitor attached because I access it over ssh or a web interface. This is a convenient setup, but it also turns small issues into a colossal pain.
Every few months, I’ll screw something up and prevent the server from booting or joining the network, effectively locking me out of the machine. To get things running again, I have to disconnect everything, drag the server over to my desk, and juggle cables around to connect the server to the keyboard and monitor at my desktop.
Commercial solutions 🔗︎
Friends have raved to me about their experience with iDRAC. It’s a chip in Dell servers that provides a virtual console from the moment the system powers on. I briefly considered an iDRAC for my next home server, but its hefty price tag quickly put an end to that. The license alone costs $300, and it requires expensive custom hardware.
Next, I looked at commercial KVM over IP solutions. They provide similar functionality to Dell’s iDRAC, but they’re external devices that connect to a computer’s keyboard, video, and mouse ports (hence the name KVM). Sadly, they’re even more expensive, ranging in price from $500 to $1000 per unit.
As lazy as I am about dragging servers around, I couldn’t justify spending $500 to save myself the trouble of swapping cables around a few times per year.
So, I did what any appropriately irrational programmer would do: spend several hundred hours building my own KVM over IP.
Building a KVM over IP with Raspberry Pi 🔗︎
The Raspberry Pi is a small, inexpensive single-board computer. The devices are powerful enough to run a full desktop operating system, so their $30-60 price point makes them a popular tool among hobbyists and programmers.
Recent versions of the Pi support USB on-the-go (USB OTG), which allows the Pi to impersonate USB devices such as keyboards, thumb drives, and microphones.
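In gadget mode, each keystroke travels to the target machine as a fixed 8-byte HID report. Here's a rough sketch of that report format (this is illustrative, not TinyPilot's actual code; the `/dev/hidg0` path assumes a Pi already configured with a HID keyboard gadget):

```python
def build_keyboard_report(modifiers: int, keycodes: list[int]) -> bytes:
    """Pack a standard 8-byte USB HID boot keyboard report:
    byte 0:    modifier bitmask (e.g., 0x02 = left shift)
    byte 1:    reserved
    bytes 2-7: up to six concurrently pressed key codes
    """
    if len(keycodes) > 6:
        raise ValueError("boot keyboard reports carry at most 6 keys")
    return bytes([modifiers, 0] + keycodes + [0] * (6 - len(keycodes)))


# Pressing the 'a' key (HID usage ID 0x04), then releasing all keys:
press = build_keyboard_report(0x00, [0x04])
release = build_keyboard_report(0x00, [])

# On a Pi in gadget mode, the reports would be written to the HID
# gadget device (commonly /dev/hidg0):
# with open("/dev/hidg0", "wb") as hid:
#     hid.write(press)
#     hid.write(release)
```

The target computer can't tell these reports apart from a physical keyboard's, which is what makes the technique work before any OS or driver loads.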
As a proof of concept of my Pi-as-KVM idea, I created a simple web app called Key Mime Pi.
The challenge of capturing video 🔗︎
Keyboard forwarding isn’t so useful if you can’t see what’s happening on the screen. My obvious next step was to find a way to capture my server’s display output in the Pi and render it in the browser.
My first attempt at video capture was to use the Lenkeng LKV373A HDMI extender. Daniel Kučera (aka danman) did an excellent job reverse engineering this device. It was available from Chinese merchants on eBay for around $40, so it seemed like my best option.
Capturing video was tricky because the LKV373A transmitter isn’t a video capture device. Its intended purpose is to pair with an LKV373A receiver that converts the network stream back to HDMI output. In danman’s investigation, he discovered a way to intercept and capture the video stream, but the LKV373A speaks a non-standard variant of the RTP protocol that few video tools understand.
Fortunately, danman contributed a patch to ffmpeg that handles the LKV373A’s goofy behavior, so I was able to render the stream using ffmpeg’s video player:
ffplay -i udp://220.127.116.11:5004
It was here that I received my first taste of a problem that dogged me throughout the project: latency. There was almost a one-second delay between the target computer and the video playback on my desktop.
I tried playing around with ffplay’s many command-line flags to speed up the stream, but I never pushed past 800 milliseconds. And that was on my desktop with its high-end GPU and CPU. It didn’t bode well performance-wise for my scrappy little Raspberry Pi.
Fortunately, I found a better solution by complete coincidence.
HDMI to USB dongle 🔗︎
Then I came across a cheap HDMI to USB dongle that claimed to capture video at 1080p resolution and 30 frames per second. That seemed a little too good to be true, so I ordered one from eBay. It was only $11, including shipping. I don’t even know what to call it; it has no brand name, so I’ll just call it “the HDMI dongle.” There are several variants, but they’re all just different housings over the same MacroSilicon MS2109 chip.
When the device arrived a few days later, it blew me away. Without any tinkering, it showed up as a UVC video capture device as soon as I plugged it into the Raspberry Pi.
$ sudo v4l2-ctl --list-devices
bcm2835-codec-decode (platform:bcm2835-codec):
        /dev/video10
        /dev/video11
        /dev/video12

UVC Camera (534d:2109): USB Vid (usb-0000:01:00.0-1.4):   <<< HDMI capture dongle
        /dev/video0
        /dev/video1
Within minutes, I was able to capture and restream HDMI video:
# On the Pi
ffmpeg \
  -re \
  -f v4l2 \
  -i /dev/video0 \
  -vcodec libx264 \
  -f mpegts udp://10.0.0.100:1234/stream

# On my Windows desktop
ffplay.exe -i udp://@10.0.0.100:1234/stream
It was so darn convenient, too. The LKV373A was nearly brick-sized and required its own power source and Ethernet cable. The HDMI dongle was as small as a thumb drive and required nothing more than a USB port.
The only problem was, again, latency. The Pi’s rebroadcast of the video stream lagged the source computer by 7-10 seconds.
I wasn’t sure if this delay came from the dongle itself, ffmpeg on the Pi, or ffplay on my desktop. Arsenio Dev reported latency of 20 ms, so it seemed like faster performance was possible if I delved into ffmpeg’s arcane and mysterious command-line flags.
Another stroke of luck spared me from that miserable task.
Borrowing from a similar project 🔗︎
I had looked at Pi-KVM briefly, but its requirements of breadboards and soldering scared me off.
At Max’s suggestion, I gave Pi-KVM a second look, particularly interested in how he solved the video latency issue. I noticed that he captured video through a tool called uStreamer.
uStreamer: a super-fast video streamer 🔗︎
Have you ever found a tool that’s so good, it solves problems you hadn’t even anticipated?
Right out of the box, uStreamer reduced my latency from 8 seconds to 500-600 milliseconds. But it also eliminated a whole chain of extra work.
Prior to uStreamer, I wasn’t sure how to get video from ffmpeg into the user’s browser, but I knew it was possible somehow. I tested this mostly-accurate tutorial for piping video from ffmpeg to nginx using HLS, but it added even more latency. And it still left open problems like how to start and stop streaming on HDMI cable connects and disconnects and how to translate the video to a browser-friendly format.
uStreamer solved all of this. It ran its own minimal HTTP server that served video in Motion JPEG, a format browsers play natively. I didn’t have to bother with HLS streams or getting ffmpeg and nginx to talk to each other.
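To see why this format is so browser-friendly: Motion JPEG over HTTP is just a multipart response in which each part is a complete JPEG frame, and the browser replaces the displayed image whenever a new part arrives. A simplified sketch of the framing (illustrative only, not uStreamer’s actual implementation):

```python
def mjpeg_part(jpeg_frame: bytes, boundary: bytes = b"frame") -> bytes:
    """Wrap one JPEG image as a part of a multipart/x-mixed-replace stream.

    An MJPEG server sends a response with the header
    Content-Type: multipart/x-mixed-replace; boundary=frame
    and then emits one of these parts per video frame.
    """
    return (
        b"--" + boundary + b"\r\n"
        + b"Content-Type: image/jpeg\r\n"
        + b"Content-Length: " + str(len(jpeg_frame)).encode() + b"\r\n\r\n"
        + jpeg_frame + b"\r\n"
    )


# A fake 7-byte payload with JPEG start/end markers, just to show the framing:
part = mjpeg_part(b"\xff\xd8...\xff\xd9")
```

Because the browser handles all of the reassembly itself, the server never has to transcode, segment, or negotiate anything, which is exactly what makes the format so low-effort compared to HLS.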
The tool was so fully-featured that I assumed Max simply forked it from a more mature project, but I was mistaken. This maniac wrote his own video encoder in C just to squeeze the maximum performance out of Pi hardware. I quickly donated to Max and invite anyone who uses his software to do the same.
Improving video latency 🔗︎
uStreamer reduced my latency from 10 seconds down to ~600 milliseconds. That was a huge leap forward but still a noticeable delay. I told Max I was interested in funding uStreamer further if he could find ways to improve performance, so we got to chatting.
Max was interested in the HDMI dongle I was using since he’d never seen that particular device. He invited me to create a shared shell session using tmate so that he could access my Pi remotely.
After a few minutes of testing how uStreamer ran on my hardware, Max ran the v4l2-ctl utility and saw a line that fascinated him but totally went over my head:
$ sudo v4l2-ctl --all
Driver Info:
    Driver name      : uvcvideo
    Card type        : UVC Camera (534d:2109): USB Vid
    ...
Format Video Capture:
    Width/Height     : 1280/720
    Pixel Format     : 'MJPG' (Motion-JPEG)
    ...
Streaming Parameters Video Capture:
    Capabilities     : timeperframe
    Frames per second: 30.000 (30/1)
The HDMI dongle was delivering the video stream in Motion JPEG format! uStreamer’s hardware-assisted encoding was fast, but it was totally unnecessary, as modern browsers play Motion JPEG natively.
We configured uStreamer to skip re-encoding and just pass through the video stream as-is.
Latency went from ~600 milliseconds all the way down to ~200 milliseconds. It’s not instantaneous, but it’s low enough that you forget the delay after using it for a few minutes.
TinyPilot in action 🔗︎
Remember way back at the beginning of this post when I said I wanted TinyPilot so that I could access my headless VM server before it boots? Well, it works and I do!
I iterated on Key Mime Pi to make a new web interface that integrates the video capture functionality:
I built a new headless VM server this year and used TinyPilot to install Proxmox, an open-source hypervisor and web interface for managing VMs.
TinyPilot allowed me to manage the entire install from my browser. It was definitely more pleasant than my old process of dragging computers around and swapping cables back and forth.
How to build your own TinyPilot 🔗︎
Parts list 🔗︎
Want an all-in-one TinyPilot kit?
Support TinyPilot’s development by purchasing an official TinyPilot kit. It includes all the parts you need to build TinyPilot and guarantees access to premium versions of TinyPilot software I may release in the future.
- Raspberry Pi 4 (all variants work)
- USB-C to USB-A cable (Male/Male)
- HDMI to USB capture dongle
- Strangely, these don’t have a brand name, but you can recognize them by their appearance.
- They’re generally available on eBay for $11-15.
- microSD card (Class 10, 8 GB or larger)
- HDMI to HDMI cable
- Or [other] to HDMI, depending on how your target machine displays output.
- (Optional) A USB-C OTG split connector
- Requires two additional USB-A to microUSB cables.
- (Optional) A cooling case, heat sink, or fan
- Choose a case that provides access to the Pi’s GPIO pins.
- I use this minimalist, passive cooling case.
Install Raspberry Pi OS Lite 🔗︎
To begin, install Raspberry Pi OS lite (formerly known as Raspbian) on a microSD card.
Enable SSH access by placing a file called ssh on the microSD’s boot partition. If you’re connecting over wireless, you also need a wpa_supplicant.conf file on the same partition with your network’s details.
When you finish preparing the microSD card, insert it into your Pi device.
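If you’re connecting over Wi-Fi, Raspberry Pi OS reads network details from a wpa_supplicant.conf file on the boot partition. A sketch of that file (the country, ssid, and psk values are placeholders; substitute your own):

```
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="your-network-name"
    psk="your-network-password"
}
```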
Install a case (optional) 🔗︎
The Raspberry Pi 4 famously generates a lot of heat. You can run it fine without cooling, but you’ll likely hit stability issues over time.
I like this minimalist case because it’s inexpensive and passively cools the Pi without the complexity of a powered fan:
Connect to the machine via USB 🔗︎
To enable TinyPilot to function as a virtual keyboard, connect your Pi’s USB-C port to a USB-A port on the target machine:
Attach the HDMI capture dongle 🔗︎
To complete the physical assembly, insert the HDMI dongle into one of the Pi’s USB ports. Then, connect an HDMI cable to the dongle, and plug the other end into the display output of your target computer.
Connect an Ethernet cable 🔗︎
If you’re connecting to your Pi over wired LAN, attach a network cable to your Pi’s Ethernet port:
Install the TinyPilot software 🔗︎
SSH into your Pi device (the default credentials for Raspberry Pi OS are username pi, password raspberry), and run the following commands:
curl -sS https://raw.githubusercontent.com/mtlynch/tinypilot/master/quick-install \
  | bash -
sudo reboot
If you’re appropriately suspicious of piping a random web script into your shell, I encourage you to inspect the source. The script installs and configures the following components:
- nginx: a popular open-source web server
- ustreamer: a lightweight HTTP video streaming server
- usb-gadget: a script enabling Pi’s “USB gadget mode,” which allows the Pi to impersonate USB devices
- tinypilot: the web interface I created for TinyPilot
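The usb-gadget component works through Linux’s configfs interface. A condensed sketch of how a USB HID keyboard gadget gets registered (the gadget name and ID values here are illustrative, the HID report descriptor is omitted for brevity, and this is not TinyPilot’s exact script):

```
modprobe libcomposite
cd /sys/kernel/config/usb_gadget/
mkdir -p g1 && cd g1

echo 0x1d6b > idVendor          # illustrative vendor ID
echo 0x0104 > idProduct         # illustrative product ID

mkdir -p functions/hid.usb0
echo 1 > functions/hid.usb0/protocol       # 1 = keyboard
echo 1 > functions/hid.usb0/subclass       # 1 = boot interface subclass
echo 8 > functions/hid.usb0/report_length  # 8-byte keyboard reports
# (a standard HID keyboard report descriptor must also be written to
#  functions/hid.usb0/report_desc)

mkdir -p configs/c.1
ln -s functions/hid.usb0 configs/c.1/
ls /sys/class/udc > UDC         # bind the gadget to the USB controller
```

Once the gadget binds to the USB device controller, the Pi’s USB-C port presents itself to the target machine as an ordinary keyboard.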
Using TinyPilot 🔗︎
After you run the install script, TinyPilot will be available at:
The power problem 🔗︎
The biggest limitation of this setup is power. Relying on the target computer for power means that when the target shuts down, the Pi suffers an unexpected power cut.
Further, the Pi 4 needs 3 amps for stable operation, though it can run at lower power. A computer’s USB 3.0 port provides only 0.9 amps, and USB 2.0 provides only 0.5 amps, which is why you may see these warnings in the Pi’s system logs:
$ sudo journalctl -xe | grep "Under-voltage"
Jun 28 06:23:15 pikvm kernel: Under-voltage detected! (0x00050005)
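You can also check whether the Pi has ever been under-powered with the firmware’s `vcgencmd get_throttled` command, which reports a bitmask like the 0x00050005 above. A small decoder for the firmware-documented bit positions (the decoder itself is my own sketch, not part of TinyPilot):

```python
THROTTLE_FLAGS = {
    0: "under-voltage detected",
    1: "ARM frequency capped",
    2: "currently throttled",
    3: "soft temperature limit active",
    16: "under-voltage has occurred",
    17: "ARM frequency capping has occurred",
    18: "throttling has occurred",
    19: "soft temperature limit has occurred",
}


def decode_throttled(mask: int) -> list[str]:
    """Decode the bitmask reported by `vcgencmd get_throttled`."""
    return [desc for bit, desc in THROTTLE_FLAGS.items() if mask & (1 << bit)]


# The 0x50005 value from the log above sets bits 0, 2, 16, and 18:
flags = decode_throttled(0x50005)
```

For the log line above, the decoder shows the Pi was under-voltage and throttled at the time, and that both conditions had also occurred earlier in the session.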
To solve this problem, I worked with an engineering firm to create a custom circuit board that splits the Pi’s USB-C port into two. The first port accepts USB power, so you can still deliver a full 3 Amps to the Pi. The second accepts USB data out, so the Pi can still impersonate a USB keyboard.
Importantly, the power connector’s data port excludes a USB power line. This ensures that voltage differences between the computer’s power source and the Pi’s power source won’t cause undesirable power backflows.
TinyPilot kits 🔗︎
If you’d like to support further development of this software, consider donating or purchasing a TinyPilot kit. Kits include all the equipment you need to build your own TinyPilot and come with a preformatted microSD card, so you don’t need to configure any software.
Source code 🔗︎
All TinyPilot software is open-source under the permissive MIT license:
- tinypilot: The TinyPilot web interface and backend.
- ansible-role-tinypilot: The Ansible role for installing TinyPilot and its dependencies as systemd services.
Special thanks to Max Devaev for his incredible work on uStreamer and his contributions to TinyPilot.