Information pollution.
The information available to us is polluted. It is hard to take in information without taking in pollution.
Information pollution is age-old, but our current situation is new. Generative AI is causing an explosion of information pollution, mechanizing and automating it, in a way analogous to how the industrial revolution caused an explosion of air and water pollution.
Re this from @deevybee:
https://mastodon.social/@deevybee/113095441167609490
I’m shopping this term around because I think it puts into useful perspective some things we’re not doing a great job of getting our heads around.
Think about:
- incentives that drive pollution
- what strategies have and have not stopped pollution
- who benefits from pollution
- cost (or impossibility) of clean-up
- ongoing discovery of unrecognized impacts
- disproportionate impacts of pollution (wealth, race, location…)
- pollution as a de facto tool of oppression & supremacy (Flint)
- how pollution moves from individual impact (a poisoned well) to systemic failure (climate change)
Now: apply those thoughts to •information pollution•.
@inthehands with respect to scholarly works in particular, we also need to reform the peer review process. I don’t know what the right way to do peer review is, but the way we do it now is bad in several ways, including accepting too much pollution.
( https://mastodon.social/@ShadSterling/113097784879653538 )
@inthehands … Oh no, I bet some journals are working on using LLMs to generate review comments, aren’t they?
@ShadSterling
Yes. It was already a barely functioning, unsustainable model. LLMs are pushing it past the breaking point.