Found it first here - https://mastodon.social/@BonehouseWasps/111692479718694120
Not sure if this is the right Lemmy community to discuss it in?
Have been for a while. Pretty annoying and I wish you could filter them out.
The Google AI that pre-screens search results isn’t able to distinguish real photos from fake AI-generated ones. So there’s no way to filter out all the trash, because we’ve made generative AI just good enough to snooker search AI.
A lot of them mention in the description that they’re using an AI art generator. Even filtering out only the self-reported ones would be useful.
That still requires a uniform method of tagging art as such. Which is absolutely a thing that could be done, but there’s no upside to the effort. If your images all get tagged “AI” and another generator’s don’t, what benefit is that to you? And that’s before we even get into which digital standard gets used for the tagging. Do we embed it in the image itself (more reliable, but harder to implement)? As structured metadata (easier to apply, but also easier to spoof or strip off)? Or is Google just expected to parse this information from a kaleidoscope of generating and hosting standards?
Times like this, it would be helpful for - say - the FCC or ICANN to get involved. But that would be Big Government Overreach, so it ain’t going to happen.