
**TL;DR:** Use streamlink to detect when a Twitch channel goes live, pipe the HLS feed through ffmpeg into a clean MP4, and run the whole thing under systemd so it restarts itself. Below is the exact script we run on customer VPS boxes. Total setup time: ~10 minutes.
## Why a VPS, not your home PC

Recording Twitch on a home PC means the PC must stay on 24/7, your connection chokes when streams overlap, and a power blip kills the recording. A €5/mo VPS with a 50 GB SSD records every stream to disk reliably; you then rsync the finished MP4s home overnight.
Minimum VPS spec for 1 channel at 1080p60:
- 1 vCPU
- 1 GB RAM
- 50 GB SSD (Twitch tops out around 6-8 Mbps, i.e. roughly 2.7-3.6 GB per hour of 1080p60; budgeting ≈10 hours per 50 GB leaves comfortable headroom)
- 25 Mbps download
Multiple channels in parallel: add 1 vCPU per stream.
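To size the disk for other bitrates, the arithmetic is just bitrate → GB per hour. A small helper sketch (the `hours_per_disk` name is ours; the 6-8 Mbps figure is Twitch's usual source cap, not a guarantee):

```shell
# hours_per_disk GB MBPS: hours of recording that fit on GB gigabytes at a
# constant MBPS video bitrate (Mbps / 8 * 3600 / 1000 = GB per hour).
hours_per_disk() {
  awk -v gb="$1" -v mbps="$2" \
    'BEGIN { printf "%.1f\n", gb / (mbps * 3600 / 8 / 1000) }'
}

hours_per_disk 50 8   # 1080p60 near Twitch's cap: ~13.9 hours
```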
## Step 1: Install dependencies

On Ubuntu 22.04 / Debian 12:

```shell
sudo apt update
sudo apt install -y python3-pip ffmpeg curl
sudo pip3 install --upgrade streamlink
streamlink --version
ffmpeg -version | head -1
```

Confirm both print versions; streamlink 6.x or newer is required for Twitch's current HLS endpoints. Note that Debian 12's pip refuses system-wide installs by default (PEP 668), so either append `--break-system-packages` or install streamlink with `pipx` instead.
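If you script the install, it's worth gating on the version instead of eyeballing it. A minimal sketch, assuming GNU `sort -V` is available:

```shell
# version_ge A B: succeed if version string A >= B (GNU sort -V ordering)
version_ge() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

if version_ge "$(streamlink --version | awk '{print $NF}')" 6.0.0; then
  echo "streamlink is new enough"
else
  echo "streamlink older than 6.0.0, upgrade it" >&2
fi
```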
## Step 2: Get a Twitch OAuth token (optional but recommended)

Without auth, Twitch injects mid-roll ads into the recording every ~10 minutes. With a Twitch Turbo or subscriber token, recordings are clean. Quick anonymous test:

```shell
streamlink --twitch-disable-ads twitch.tv/CHANNEL best
```
To pass an auth token (find it via browser dev tools → Application → Cookies → `auth-token`):

```shell
streamlink \
  --twitch-api-header "Authorization=OAuth YOUR_OAUTH_TOKEN" \
  --twitch-disable-ads \
  twitch.tv/CHANNEL best
```
Store the token in `/root/.config/streamlink/config` (`chmod 600`):

```
twitch-api-header=Authorization=OAuth YOUR_OAUTH_TOKEN
twitch-disable-ads
```
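If you provision boxes from a script, the same config can be written and locked down in one shot. A sketch (the `write_streamlink_config` helper name is ours, not streamlink's):

```shell
# write_streamlink_config DIR TOKEN: create DIR/config containing the OAuth
# token and ad filtering, readable only by the owner (dir 700, file 600).
write_streamlink_config() {
  local dir="$1" token="$2"
  install -d -m 700 "$dir"
  ( umask 077
    printf 'twitch-api-header=Authorization=OAuth %s\ntwitch-disable-ads\n' \
      "$token" > "$dir/config" )
}

# As root: write_streamlink_config /root/.config/streamlink YOUR_OAUTH_TOKEN
```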
## Step 3: The recording script

Save as `/usr/local/bin/twitch-record.sh` and `chmod +x` it:

```shell
#!/usr/bin/env bash
set -euo pipefail

CHANNEL="${1:-}"
OUT_DIR="${OUT_DIR:-/var/recordings}"
QUALITY="${QUALITY:-best}"
RETRY_DELAY="${RETRY_DELAY:-60}"

if [[ -z "$CHANNEL" ]]; then
  echo "Usage: $0 <channel-name>" >&2
  exit 1
fi

mkdir -p "$OUT_DIR/$CHANNEL"

while true; do
  # --json exits non-zero when the channel is offline, so it doubles as a probe
  if streamlink --json "twitch.tv/$CHANNEL" "$QUALITY" >/dev/null 2>&1; then
    TS="$(date -u +%Y%m%d-%H%M%S)"
    OUT="$OUT_DIR/$CHANNEL/${CHANNEL}_${TS}.mp4"
    echo "[$(date -Is)] $CHANNEL is LIVE, recording to $OUT"
    streamlink \
      --twitch-disable-ads \
      --hls-live-restart \
      --retry-streams 5 \
      --retry-max 3 \
      -O "twitch.tv/$CHANNEL" "$QUALITY" \
      | ffmpeg -hide_banner -loglevel warning \
          -i pipe:0 \
          -c copy \
          -movflags +faststart \
          "$OUT" || true
    echo "[$(date -Is)] $CHANNEL stream ended"
  fi
  sleep "$RETRY_DELAY"
done
```
Test it manually:

```shell
sudo OUT_DIR=/var/recordings /usr/local/bin/twitch-record.sh shroud
```
When the channel goes live, you'll see a new .mp4 appear and grow in real time.
## Step 4: Run it as a systemd service

`/etc/systemd/system/twitch-record@.service`:

```ini
[Unit]
Description=Auto-record Twitch channel %i
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
User=root
Environment=OUT_DIR=/var/recordings
Environment=QUALITY=best
ExecStart=/usr/local/bin/twitch-record.sh %i
Restart=always
RestartSec=30
StandardOutput=journal
StandardError=journal

[Install]
WantedBy=multi-user.target
```
Enable per channel:

```shell
sudo systemctl daemon-reload
sudo systemctl enable --now twitch-record@shroud
sudo systemctl enable --now twitch-record@summit1g
sudo systemctl status twitch-record@shroud
journalctl -u twitch-record@shroud -f
```
Each channel runs in its own process; systemd restarts it after a crash, brings it back after a reboot, and rides out network drops.
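With more than a couple of channels, it's easier to drive the instances from a list file. A sketch that prints the `systemctl` commands for review before applying them (`channels.txt`, one channel name per line, is our convention):

```shell
# enable_cmds FILE: print one 'systemctl enable --now' command per
# non-empty line of FILE (each line is a Twitch channel name).
enable_cmds() {
  while IFS= read -r ch; do
    if [ -n "$ch" ]; then
      printf 'systemctl enable --now twitch-record@%s\n' "$ch"
    fi
  done < "$1"
}

# Review, then apply: enable_cmds channels.txt | sudo sh
```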
## Step 5: Storage rotation (delete old recordings)

Daily cron job at `/etc/cron.daily/twitch-rotate`:

```shell
#!/bin/bash
# Delete recordings older than 14 days
find /var/recordings -type f -name "*.mp4" -mtime +14 -delete
# Delete empty subdirectories
find /var/recordings -mindepth 1 -type d -empty -delete
```
`chmod +x /etc/cron.daily/twitch-rotate`, and keep the filename free of dots, since run-parts skips names containing them. Adjust `+14` to your retention window.
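Age-based rotation can still fill the disk during a long event weekend. A complementary sketch that deletes oldest-first until a free-space floor is met (GNU `find` and `df` assumed; `rotate_by_space` is our name):

```shell
# rotate_by_space DIR MIN_FREE_GB: delete the oldest .mp4 under DIR,
# repeatedly, until DIR's filesystem has at least MIN_FREE_GB GB free.
rotate_by_space() {
  dir="$1"; min_gb="$2"
  while :; do
    avail="$(df -BG --output=avail "$dir" | tail -n1 | tr -dc '0-9')"
    [ "$avail" -ge "$min_gb" ] && break
    oldest="$(find "$dir" -type f -name '*.mp4' -printf '%T@ %p\n' \
      | sort -n | head -n1 | cut -d' ' -f2-)"
    [ -n "$oldest" ] || break   # nothing left to delete
    rm -f -- "$oldest"
  done
}
```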
## Optional: Auto-upload to S3 / Backblaze B2 / your home server

After ffmpeg exits, fire a hook. Replace the `ffmpeg ... "$OUT" || true` line with:

```shell
ffmpeg ... "$OUT" || true
if [[ -f "$OUT" && -s "$OUT" ]]; then
  rclone copy "$OUT" "b2:my-bucket/twitch/$CHANNEL/" --transfers 4 \
    && rm "$OUT"
fi
```
Set up rclone once with `rclone config` for any S3-compatible target. B2 is among the cheapest for cold storage, at well under a cent per GB per month.
## Common gotchas

| Symptom | Cause | Fix |
|---|---|---|
| Recordings full of ads | No auth token | Add a Twitch OAuth token to the streamlink config |
| File ends mid-stream | streamlink HLS hiccup | Already handled by `--hls-live-restart --retry-streams 5` |
| 0-byte mp4 files | ffmpeg got no stdin (channel ended instantly) | Add a `[[ -s "$OUT" ]]` check before keeping the file |
| High CPU on small VPS | Re-encoding instead of stream-copying | Confirm `-c copy` in the ffmpeg command |
| systemd unit won't start | streamlink not in `$PATH` for non-login shells | Use the absolute path `/usr/local/bin/streamlink` |
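For the 0-byte case, a periodic sweep is simpler than guarding every exit path. A sketch (`prune_tiny` is our helper; `-size -1048576c` means strictly under 1 MiB, using byte units to sidestep find's block rounding):

```shell
# prune_tiny DIR: delete .mp4 files smaller than 1 MiB, which are almost
# always failed or instantly-ended recordings.
prune_tiny() {
  [ -d "$1" ] || return 0
  find "$1" -type f -name '*.mp4' -size -1048576c -delete
}

prune_tiny /var/recordings
```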
## Why this beats every "Twitch recorder" SaaS
- No 7-day storage cap like most consumer recorders
- No 1080p downgrade to save the SaaS bandwidth
- No re-uploading VODs to YouTube for retention, where they get DMCA-flagged
- You own the file — record once, archive forever
A €5/mo Hetzner CX22 records ~6 channels in parallel comfortably. For higher concurrency or guaranteed dedicated CPU, see our streaming VPS plans — same bare-metal Ryzen hosts your OBS instances run on.
## Want this fully managed?
If you'd rather skip the systemd setup, our 24/7 OBS streaming VPS plans include automated recording out of the box: drop in your channel list, recordings land in your panel, lifecycle policy auto-rotates them. Same script under the hood, zero terminal time.
