
Information pollution.

The information available to us is polluted. It is hard to take in information without taking in pollution.

Information pollution is age-old, but our current situation is new. Generative AI is causing an explosion of information pollution, mechanizing and automating it, in a way analogous to how the industrial revolution caused an explosion of air and water pollution.

Re this from @deevybee:
mastodon.social/@deevybee/1130

Dorothy Bishop (@deevybee@mastodon.social): "Rather chilling study showing high volume of #chatGPT generated papers on Google Scholar in subject areas with policy implications - and these are the ones that were picked up on a v simple phrase search. https://misinforeview.hks.harvard.edu/article/gpt-fabricated-scientific-papers-on-google-scholar-key-features-spread-and-implications-for-preempting-evidence-manipulation/ #disinformation #publication #policy"

I’m shopping around this term because I think it puts in useful perspective some things we’re not doing a great job of getting our heads around.

Paul Cantrell

Think about:

- incentives that drive pollution
- what strategies have and have not stopped pollution
- who benefits from pollution
- cost (or impossibility) of clean-up
- ongoing discovery of unrecognized impacts
- disproportionate impacts of pollution (wealth, race, location…)
- pollution as a de facto tool of oppression & supremacy (Flint)
- how pollution moves from individual impact (a poisoned well) to systemic failure (climate change)

Now: apply those thoughts to •information pollution•.

@inthehands …. Oh no, I bet some journals are working on using LLMs to generate review comments, aren’t they

@ShadSterling
Yes. It was already a barely functioning, unsustainable model. LLMs are pushing it past the breaking point.