This take seems to make some huge assumptions: that Google is going to take a direct interest in content creation; that fake content untethered from any reality will interest viewers; that fake content can reliably approximate a very large world of distributed points of view; that training data has a long shelf life; that developing ideas and writing content is not relevant to content creation; that audiences won't care if they never see people like themselves in content again; that past training data can account for news; and many more related contingencies.
Where AI is really valuable to content creation is in reducing production costs for a large, diffuse network of creators, not in eliminating the creators.
AI has no ideas of its own. If what it's going to do is soak up the existing work of actual people and regurgitate or recontextualize others' work at drastically lower production costs, eclipsing that real work, then it is going to be regulated out of the picture, either by law or through the complete poisoning of its sources.
I suspect that Google will be driven to reduce its cut and/or subsidize the tools that assist creators, and that ad revenue per view will deflate as content becomes more provisional and more narrowly targeted at viewers' sharply decreasing attention spans. And as production costs decrease, the share commanded by novel, embedded sources of capture will rise, because original real-world content is necessary both for training and for the substance of programming.
Parallel to these shifts, original ("OG") content capture is going to develop watermarking approaches and systems that ensure the piper gets paid.
The net effect might be an overall shrinking space of truly original, real-world-relevant content... To put this another way, the rise of AI will produce a paradoxical return to the times before YouTube, and ultimately to completely different paradigms for content capture and distribution. Google's first concern should be that its AI business plans will obviate YouTube, and in that light it seems obvious and expected that removing the "you" obviates the tube.
> I worked for a few years inside one of the large teams at Google that touched Youtube’s product monetization roadmap.
A team that works on the product monetization roadmap will be focused on increasing monetization, and that's by design. Their roadmap won't (and shouldn't) include other things.
Similarly, a team that's focused on creator features will only care about new features. You can't draw the opposite conclusion from the creator features roadmap and say "omg, they don't even think about monetization". Yeah, duh - it's a different team that's figuring that out.
Note to author, don't get ahead of yourself...