The AI filter applied server-side to YouTube Shorts (and only Shorts, not regular videos) is horrible, and it feels like a deliberate case of boiling the frog. If everyone gets used to overly smooth skin, weirdly pronounced wrinkles, waxy hair, and strange ringing around moving objects, then AI-generated content will stand out less when they start injecting it into the feed. At first I thought this must be some client-side upscaling filter, but tragically it is not. There are no data savings at all, and there's no way for uploaders or viewers to turn it off. I guess I wasn't cynical enough.
I think they are doing it to mask the extreme compression they apply to YouTube Shorts.
Wow - I had not considered this as the intention.
I’ve been saying for a while that the end game for addictive short-form chum feeds like TikTok and YouTube Shorts is to drop human creators entirely. They’ll be AI-generated slop feeds that people will scroll, and scroll, and scroll. Basically just a never-ending feed of brain rot and ads.
Meta already teased making this ("Vibes") in September. Also OpenAI's homepage for their Sora tool is a bunch of AI video shorts.
There's already a huge number of AI-generated channels on YouTube. The only difference is that they're uploaded by channel owners. What's going to happen very quickly (if not already) is that YouTube itself will start "testing" AI content that it creates on what will look like new channels. Within a few years they'll promote this "content" until it occupies most of the time and views on the platform.
I buy into this conspiracy theory, it's genius. It's literally a boiling-the-frog kind of strategy against users. Eventually, everyone will get too lazy to go through the mental reasoning of judging every piece of content ("is this AI?") and spending energy trying to find clues.
And over time the AI content will improve to the point where telling it apart becomes impossible, and then the Great AI Swappening will occur.
Perhaps the shorter/dumber the medium and format, the less discerning an audience it attracts. We're seeing a split between people who reject the idea of content without the subtext of the human creation behind it, and people who just take content for what it is on the surface without knowing why it should matter how it was created.
Yes, but what happens when the AIs themselves begin to brainrot (as happens when they are not fed their usual sustenance of information from humans and the real world)?
Have you seen what people watch on these things? It won’t matter. In fact, the surreal incoherent schizo stuff can work well for engagement.
> the surreal incoherent schizo stuff can work well for engagement.
There are already popular subreddits (something blursed ai, I think) where people upload this type of content, and it seems to be getting decent engagement.