“Dead Internet Theory” would turn into the “Law of Dead Internet” if that happens. It’s pretty close right now as it is. At that point either a new “Internet” is born from a technology renaissance, or humans continue to co-exist with AI in their new role as Internet Zombie Cash Cows.
I think tools for detecting and filtering out AI material from search results would go a long way toward improving the current situation, and they’re a middle ground between an internet revolution and a technological dystopia. There is still an unfathomably large amount of good information on the internet; the issue is that there is 20x more trash. And the trash is scaling rapidly, while humans are not.
If you haven’t already, give the Marginalia search engine a try. They’re doing something interesting in this space: you can filter out results with JavaScript, affiliate links, tracking, ads, and cookies. After filtering, the internet feels a lot more like it did 20 years ago, more sincere, more human.
If I recall correctly, Marginalia is made and maintained by one guy. As the trash-to-good-content ratio worsens, I think more people will want to build on and use projects like Marginalia.
Ironically, those tools to filter out AI will also be AI. I do believe they’ll be necessary, but also what the fuck. It’s a bit like a bunch of people have decided to just piss all over the place, and rather than cleaning it up and putting an end to the rampant pissing, everybody’s just gonna end up putting on masks so they don’t have to smell it.
Not necessarily. I once stumbled upon an enormous uBlock “AI filter” that was just a list of CSS rules hiding search results that referenced a predefined list of AI sites.
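That kind of filter is trivial to generate without any model in the loop. Here’s a minimal sketch of the same idea in Python; the blocked domains, the “.g” result selector, and the exact rule format are assumptions for illustration, not the actual list I saw:

    # Generate uBlock-style cosmetic rules that hide any Google result
    # linking to a domain on a hand-maintained blocklist. No AI involved,
    # just a static list of CSS-based hiding rules.

    # Placeholder blocklist; a real one would be far longer.
    BLOCKED_DOMAINS = [
        "example-ai-content-farm.com",
        "example-generated-recipes.net",
    ]

    # Assumption: ".g" is the container class Google currently uses for a
    # search result; it changes over time, so treat it as a placeholder.
    RULE_TEMPLATE = 'google.com##.g:has(a[href*="{domain}"])'

    def build_filter_list(domains):
        """Return one cosmetic hiding rule per blocked domain."""
        return "\n".join(RULE_TEMPLATE.format(domain=d) for d in domains)

    if __name__ == "__main__":
        print(build_filter_list(BLOCKED_DOMAINS))

Paste the output into uBlock’s “My filters” tab and the listed sites vanish from search results, no classifier required.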
Filtering doesn’t necessarily have to be driven by AI.
Take recipes, for example. It’s now almost impossible to get non-AI recipe results out of a search engine. But simple hardcoded parameters would make the results more human: a preference for older results, for pages without affiliate links (Marginalia does this), for sites with fewer than 5 domains executing JavaScript, plus some analysis of the domain’s registration date and activity, and of the top-level domain to filter out blogspam.
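A rough sketch of what those hardcoded parameters could look like as a scoring step, assuming you already have crawl metadata for each result; every threshold, weight, and field name here is made up for illustration, and none of it is meant to describe how Marginalia actually does it:

    from datetime import date

    # Hand-tuned heuristic score for "how human does this result look".
    # No AI: just a handful of fixed rules over crawl metadata.

    SPAMMY_TLDS = {"top", "xyz", "online"}  # assumption: TLDs common in blogspam

    def human_score(page):
        """page: a dict of pre-crawled facts about one search result."""
        score = 0

        # Prefer older content; the flood of generated trash is recent.
        if page["published"] < date(2022, 1, 1):
            score += 2

        # Affiliate links are a strong junk signal.
        if not page["has_affiliate_links"]:
            score += 2

        # Many third-party domains running JavaScript usually means
        # ads, trackers, and heavy monetization.
        if page["js_third_party_domains"] < 5:
            score += 1

        # Freshly registered domains churn out far more spam.
        if page["domain_registered"] < date(2020, 1, 1):
            score += 1

        # Penalize cheap TLDs favored by blogspam.
        if page["tld"] in SPAMMY_TLDS:
            score -= 2

        return score

    def rerank(results):
        """Order results by the heuristic, most human-looking first."""
        return sorted(results, key=human_score, reverse=True)

None of this needs a model, and all of it is cheap enough to run at query time.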
My hope is that eventually a new paradigm of search engine optimization will emerge, maybe even an open standard for the absence of excessive JavaScript, affiliate links, social media buttons, etc. Sites that lack those elements are far less likely to be junk.