The inundation of AI-generated slop on various media platforms increasingly risks making them unusable. It’s an issue affecting everything from Spotify to indie games platforms like Itch.io, making it harder for users to find quality content (see some tips on how to identify AI 3D models).
Itch.io is now trying to save things with a new AI filter. It will require those who submit assets to declare whether generative AI was used to make them. The move has generally been welcomed by indie developers. But there are still some questions.
“We are now requiring asset page creators to tag their use of generative AI in their work. You can learn more here: https://t.co/RBwTllQR10 On your dashboard you can find a bulk tagging dialog if you have pages that need tagging.” – Itch.io, November 20, 2024
Itch.io says its AI Generation Disclosure requires asset page creators to tag their use of generative AI so it can be filtered out of search results more easily. A new field on project edit pages asks whether the project contains the results of generative AI. If you select ‘yes’, you’ll be asked what kinds of generative AI were used – graphics, sound, text and dialogue, or code – and sub-tags will be applied based on your selections. There’s also a bulk tagging dialog for creators with multiple pages that need tagging.
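Conceptually, the system boils down to tag-based filtering. The sketch below illustrates the idea in Python; the tag names (`ai-generated`, `ai-generated-graphics` and so on) are illustrative assumptions, not itch.io’s actual internal identifiers.

```python
# Hypothetical sketch of tag-based filtering, loosely modelled on
# itch.io's AI disclosure sub-tags. Tag names are assumptions for
# illustration only.

ASSETS = [
    {"title": "Pixel Tileset", "tags": {"2d", "tileset"}},
    {"title": "Synth Pack", "tags": {"audio", "ai-generated", "ai-generated-sound"}},
    {"title": "UI Icons", "tags": {"2d", "ai-generated", "ai-generated-graphics"}},
]

def filter_no_ai(assets):
    """Return only assets that carry no generative-AI tag at all."""
    return [a for a in assets if "ai-generated" not in a["tags"]]

def filter_ai_kind(assets, kind):
    """Return assets tagged with a specific AI sub-tag, e.g. 'sound'."""
    return [a for a in assets if f"ai-generated-{kind}" in a["tags"]]

print([a["title"] for a in filter_no_ai(ASSETS)])             # ['Pixel Tileset']
print([a["title"] for a in filter_ai_kind(ASSETS, "sound")])  # ['Synth Pack']
```

The point of the sub-tags is exactly this kind of negative filtering: a user who wants no AI content excludes the parent tag, while a user who only objects to, say, AI art can filter on a single sub-tag.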
Assets that were made using generative AI and aren’t tagged accordingly will no longer be eligible for indexing on browse pages. The site’s creator Leaf Corcoran said there would be a grace period for people to update their pages, after which the site plans to use “user reports to handle pages that have not been addressed”.
“This field is now required for all asset creators on itch.io. If you have a public asset page on itch.io and you view your dashboard, you will now see a blocking dialog instructing you to classify your pages,” leafo wrote.
The statement only mentions assets rather than games, but the ‘No AI’ tag appears in itch.io’s games library, suggesting that the filter will apply there too.
Developers have welcomed the move. “I had voiced despair at how flooded with AI the assets section had become, and even stopped publishing on itch altogether. This is very good news, a good start, and I’ll definitely come back to itch if negative filtering does a good job of filtering out the slop,” one person wrote.
However, there are questions about how it will work in practice. “If I use ChatGPT to debug and help me with code, do I also need to include a tag indicating it was made with AI?” one person has asked. “If that’s the case, I believe there will be a lot of games with the AI tag if developers are honest about it. If the goal is to filter out bad assets, I don’t think this will help much.”
There are also questions about how liars will be identified. “I can definitely see hostile reporting of assets being an upcoming issue. What will be used to decide if a project does or does not contain AI-generated content?” one person has asked.
Some are pushing for a complete ban. “No one wants it, and it’s just a shitty grift that actively harms the indies that your platform specifically caters towards,” one person wrote about AI content.
However, Itch.io’s policy of enforced disclosure sounds like the most sustainable approach to dealing with AI content, provided the site can actually enforce it. Users need to be able to make informed decisions about the content they use and consume, and being able to easily filter out the AI rubbish will save people time. I hope we see other platforms follow suit.
If there were any doubts about the risk of using AI assets, just look at the backlash against the Coca-Cola AI Christmas ad.