… They typed “make all” by themselves? Nice. I usually use a script to do that.
It adds a certain handcrafted quality to the code that you just can’t find elsewhere, like hand grinding your coffee beans with an electric grinder or turning the lights on using a wall switch.
Newbs. “all” is the default target, so they did twice as much work as they needed to.
It’s always ffmpeg under the hood
So I decompiled the GNU C compiler once, and yes, it was just an elaborate use of ffmpeg
Does anyone not use ffmpeg at this point?
FFmpeg
For them it’s just “the code”
as if the technology is what makes livestream services difficult anyways lmao
it’s just expensive as shit because it involves a lot of data. A lot of data throughput and buffering just for ingesting and distributing the live streams themselves, technical and business administration to keep things running, moderation to ensure compliance with content laws and data protection regulations, and then there are still all the other fancy features major platforms offer if you want to compete for users.
Multiple resolution options with server-side rescaling for users with slower connections? Graphics computing power.
Store past broadcasts? Massive amounts of data storage capacity.
Social features? Even more moderation.

And we haven’t even touched on the monetary issue of “How do you pay for all that?” and all its attached complexity. You could be running the nicest platform in the world, but without any funding, it won’t run very long.
So if I cut the server-side rescaling, target just the developed countries, skip moderation, and don’t store broadcasts, it should be easy? Noted.
I mean, the minimum you need is some authentication mechanism, a secure certificate, an authenticated endpoint to send a live data feed to, an endpoint to query a given live data feed from, maybe a website to serve the whole thing for people who don’t have their own tool for reading and playing back a live data feed…
…and the infrastructure to distribute that data feed from ingest to content delivery. Easy.
(Note: easy does not mean cheap. Even if live data feed ingest and delivery were easy to implement (which I doubt), and you skipped buffering (to reduce memory demands) and used only a single server (to spare yourself such stupid things as distributed networks, load balancing, redundancy, or the costs of scaling cloud solutions), you’d still have the computational overhead of network operations and, of course, massive data throughput.)
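For fun, the “minimum you need” list above can be sketched as a toy with just the Python standard library: one authenticated endpoint to push chunks of a feed to, one endpoint to read the accumulated feed back, and a single server with no buffering strategy — exactly the cut-down setup the note warns about. Everything here (the token, the endpoint paths, the in-memory feed) is made up for illustration, not how any real platform does it:

```python
import threading
import http.server
from http.client import HTTPConnection

# Hypothetical credentials; a real service would issue per-user tokens over TLS.
AUTH_TOKEN = "secret-token"
# In-memory byte buffer standing in for one live data feed.
FEED = bytearray()

class IngestHandler(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        # Authenticated ingest endpoint: reject pushes without the token.
        if self.headers.get("Authorization") != f"Bearer {AUTH_TOKEN}":
            self.send_response(401)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        FEED.extend(self.rfile.read(length))
        self.send_response(204)
        self.end_headers()

    def do_GET(self):
        # Query endpoint: serve whatever has been ingested so far.
        self.send_response(200)
        self.send_header("Content-Length", str(len(FEED)))
        self.end_headers()
        self.wfile.write(bytes(FEED))

    def log_message(self, *args):
        pass  # keep the demo quiet

# Single server, single thread — the "skip everything" configuration.
server = http.server.HTTPServer(("127.0.0.1", 0), IngestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Push one chunk with the right token, then read the feed back.
conn = HTTPConnection("127.0.0.1", port)
conn.request("POST", "/ingest", b"frame-1",
             {"Authorization": f"Bearer {AUTH_TOKEN}"})
r = conn.getresponse()
r.read()
assert r.status == 204
conn.request("GET", "/feed")
body = conn.getresponse().read().decode()
print(body)  # → frame-1
server.shutdown()
server.server_close()
```

Of course this ignores everything that makes the real thing hard: a streaming protocol (RTMP/SRT/HLS) instead of whole-buffer HTTP, TLS, and the throughput problem the note above is about.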
The FFmpeg team is pretty based: https://xcancel.com/FFmpeg/status/1775178803129602500
I appreciate that they know the value of their work and criticize companies for their ridiculous exploitation and underpayment of open source devs, as well as for claiming open source libraries as their own work.
The downside is some shitty far-right service is now getting free publicity.
my company in a nutshell.
we made up a shitty name and an even shittier AI logo for what’s essentially an off-the-shelf white-label appliance we configure a little differently.
Not to defend them, but he did follow up with this:
This is referring to the technology we just released into BETA for premium subscribers, which delivers one of the lowest latencies for livestreaming (significantly better than YouTube’s latency).
This does not refer to encoding
https://xcancel.com/chrispavlovski/status/1856090182275215803
Although quality != latency, so idk.
Deleted, apparently