The best AI image generators have improved massively in quality, to the point where their output can at times be hard to distinguish from photography. But can they get any better? That might depend on whether they can avoid AI cannibalisation.

Some artists have begun fighting back against AI by using Nightshade, a tool designed to "poison" AI image generators by feeding them corrupted training data. But it seems the generators could end up poisoning themselves. AI images are now so widespread online (see our pick of the best and worst AI advertising) that they're likely to be hoovered up too if future models are trained on images scraped from the net. And that could throw a spanner in the works.