Watching livestreams with friends using the SyncPlay feature in Jellyfin

August 07, 2021 · 8 minutes

Ever since the great SyncPlay feature in Jellyfin came out, I've been wondering if it was possible to use it to watch livestreams. This led me down the rabbit hole of figuring out whether it was even possible, and how, and this is what I found.


The beginning

I attempted a lot of different things in the beginning, trying to figure out what the actual issues were that prevented it from working. The first thing I found was that Jellyfin is really made to work with complete video files, which is a bit of an issue when you are trying to watch a livestream. Jellyfin didn't actually have a problem opening a video file that was still being written to with new data. The problem was that it only recognised whatever video data was there when the file was opened, and there didn't seem to be an obvious way to get Jellyfin to read the newly written data once that initial stretch had played, without restarting video playback. Then I thought: what if we could trick Jellyfin into thinking the video is longer than it actually is? So I had a look at different media container formats.

Multimedia container formats

With the assumption that the video needed to be played in a browser, I needed something widely supported. The first stop was MP4.

MP4

I tried a few different things with command-line flags in FFmpeg, but I didn't manage to create an MP4 file that reported itself as longer than it actually was, so I quickly moved on.

MPEG-TS

MPEG-TS is the format Jellyfin uses when it transcodes video, and it is built around broadcasting, so on the surface it isn't a bad choice. I also managed to create a video file that reported itself as being longer than it actually was, but I ran into the original issue once again: the video stopped playing once it had played whatever data was there when the file was initially loaded.

MKV

MKV is a great container format, but there is no support for playing it directly in browsers, so the video has to be at least remuxed first. Luckily, remuxing is a lot lighter than transcoding, so as long as the browser supports the video and audio codecs inside the MKV file, it can work on low-end hardware. So with that in mind I thought, let's try.

It pretty much worked on the first try, and the part I had thought of as a negative, the remuxing, was actually what made the whole setup work.
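If you want to check up front whether a remux alone will be enough, ffprobe can list the codecs inside the MKV; livestream.mkv here is just a stand-in for your own file:

ffprobe -v error -show_entries stream=codec_type,codec_name -of default=noprint_wrappers=1 livestream.mkv

H.264 video with AAC audio is about as widely supported as it gets; anything more exotic will likely force a full transcode instead of a plain remux.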

How it works

Let's say we have a livestream that we are currently downloading with Streamlink, putting it into an MKV container that we pretend is 1 hour long, and storing it on disk. Then we play the file in Jellyfin, and let's say there is 1 minute worth of video available when we press play. When we press play, Jellyfin starts an FFmpeg process to remux the video, since the browser can't handle the MKV container directly. FFmpeg quickly works through the available data and exits at the 1-minute mark because there is no more video data. Jellyfin just keeps playing until we hit the 1-minute mark, and when it does, it restarts the FFmpeg process: Jellyfin thinks the video is 1 hour long, and because the MKV file can't natively be played, it has to restart FFmpeg in order to continue playback.
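For context, the remux Jellyfin kicks off is an FFmpeg process that copies the streams out of the MKV into segments the browser can play. Jellyfin's real command line is much longer and built up from its own settings; the line below is only a minimal sketch of that kind of remux, with a made-up output path and segment length:

ffmpeg -i livestream.mkv -codec copy -f hls -hls_time 6 /config/transcodes/STREAM_ID.m3u8

When the process is restarted at the 1-minute mark, the same kind of command runs again, this time with a seek (-ss) to the current playback position, which is where the seeking cost described below comes from.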

Now this setup isn't perfect. It works great for short watch times, but you start to run into problems the longer you watch. Jellyfin only starts the FFmpeg process when it needs the next video segment, so there is no buffering. That isn't really a problem when you initially start watching, but every time the FFmpeg process starts, it has to seek through the file until it reaches your current position before it can continue remuxing. The longer you watch, the longer that seek takes and the bigger the pauses get whenever FFmpeg needs to be restarted.

Version 2.0

To get around the seeking issues, and to avoid possibly having to babysit FFmpeg while watching, I ended up having FFmpeg read the input file in real time. That meant the stream download and the remux could move along at a constant rate, great! Or at least somewhat: it caused a few problems with the way I initially did it. I had everybody join a SyncPlay room, and for each person I manually started an FFmpeg process that mimicked the one Jellyfin starts, except mine ran in real time. It worked, but it meant that if somebody dropped out for some reason, we would either have to start from the beginning or wait for the real-time remux to catch up again. This is because when you press play, an ID is generated based on a few parameters, including the current time, so it will always be different. That means you can't reuse the previous remux if, for example, your browser crashes.

To work around that, I made some small modifications to Jellyfin, created a little helper script, and built a Docker container for it so it would be easy to switch back and forth between my custom version and the official one. First, I modified the code to create a predictable name for the transcode cache files, which does a few things: I can start the transcode on the command line before even pressing play in Jellyfin, everybody shares the same transcode cache files so only one remux needs to run, and if somebody crashes they can jump right back in. Lastly, to make it work properly, I disabled the code that deletes the transcode cache when somebody leaves the playback session; since everyone shares the same files, they would otherwise be deleted for everyone. Instead, the little helper script used to start the transcode cleans up the cache files when the FFmpeg process is stopped.
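To give an idea of what that helper does, here is a rough sketch of its shape: start a real-time FFmpeg remux that writes to a predictable name under the transcode directory, and remove those files again when the process is stopped. The real script lives in the repository linked below; the FFmpeg options, the cache name and the fallback path here are simplified placeholders.

#!/bin/sh
# Rough sketch of the start-transcode helper, not the real thing.
INPUT="$1"
CACHE_NAME="livestream"
TRANSCODE_DIR="${JELLYFIN_LIVESTREAM_TRANSCODE_DIR:-/config/transcodes}"

cleanup() {
  # Remove the shared transcode cache when the remux is stopped.
  rm -f "$TRANSCODE_DIR/$CACHE_NAME"*
}
trap cleanup INT TERM EXIT

# Read the input in real time and write segments under the predictable name.
ffmpeg -re -i "$INPUT" -codec copy -f hls -hls_time 6 "$TRANSCODE_DIR/$CACHE_NAME.m3u8"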

How to get it set up

Container

If you would like to try it out, it's luckily pretty easy, especially if you already have Jellyfin set up with the official Docker container. The container is intended to be a drop-in replacement for the official one, with one required and one optional config option. If you're unsure how to get started with the official Docker container, you can have a look at the instructions on the Jellyfin wiki.

The two options are configured through environment variables and are JELLYFIN_LIVESTREAM_TRANSCODE_DIR and JELLYFIN_LIVESTREAM_DEFAULT_MEDIA_PATH.

JELLYFIN_LIVESTREAM_TRANSCODE_DIR has to be set to the transcode directory used by Jellyfin, or it won't work properly. By default, it is set to /config/transcodes.

JELLYFIN_LIVESTREAM_DEFAULT_MEDIA_PATH is a quality-of-life option that lets you set the path to the media library that will contain the livestreams. This allows you to run the helper script with just the name of the file instead of the absolute path. Remember, it should be the path to the library inside the container.
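Putting the two variables together, starting the container ends up looking roughly like the official docker run instructions with the extra environment variables added. The image name, host paths and library location below are placeholders for your own setup:

docker run -d \
  --name jellyfin \
  -e JELLYFIN_LIVESTREAM_TRANSCODE_DIR=/config/transcodes \
  -e JELLYFIN_LIVESTREAM_DEFAULT_MEDIA_PATH=/media/livestreams \
  -v /path/to/config:/config \
  -v /path/to/media:/media \
  -p 8096:8096 \
  IMAGE_NAME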

When you have the container up and running and a livestream download going you can start the transcode using the below command:

docker exec -it CONTAINER_NAME start-transcode livestream.mkv

If you didn't specify a default media path, use the absolute path to livestream.mkv inside the container instead. To stop the transcode, you can just press Ctrl+C; that will stop the FFmpeg process and clean up the transcode cache.

A livestream and a long MKV file

To get a livestream you want to watch with some friends, I would recommend Streamlink. It's a great program and very easy to use. Most, if not all, livestreams you download will be stored in an MP4 container, whereas we want an MKV container. To get that, you have two options.

The first option is to pipe the stream directly to FFmpeg and have FFmpeg remux it into an MKV container with an arbitrarily long duration. In this example I use 6 hours.

streamlink -O "LIVESTREAM_LINK" best | ffmpeg -i pipe: -codec copy -t 06:00:00 livestream.mkv

Or you can save the .mp4 and then have FFmpeg read from that file in real-time.

streamlink -o temp.mp4 "LIVESTREAM_LINK" best

And then

ffmpeg -re -i temp.mp4 -codec copy -t 06:00:00 livestream.mkv

The first option is nice because you don't have to store the video twice, but I have experienced issues with the audio being delayed using that method. If you run into that as well, try the second method.

To manage all of this, I can definitely recommend having a look at tmux or screen.

Links

You can find a prebuilt docker container of the custom Jellyfin version on Docker Hub.

You can find the source on my Gitea instance, and there is also a mirror on GitHub. I'm active in both places, so you are welcome to open an issue on either platform if you have any bugs, feature requests, questions, or anything else that comes to mind.