There's a host of legal risks that AI companies, and companies that use generative AI, are putting themselves in the path of, and we don't talk about them enough:

📜 It's pretty clear Section 230, the foundational law enabling today's internet, DOES NOT protect AI-generated content like that from ChatGPT, Claude or Google's generative search experience

🚗💥🚙 Generative AI could also put companies at risk of product liability claims

My deep dive:

1/🧵

(gift link)

wsj.com/tech/ai/the-ai-industr

“If in the coming years we wind up using AI the way most commentators expect, by leaning on it to outsource a lot of our content and judgment calls, I don’t think companies will be able to escape some form of liability.”

-- Jane Bambauer, professor of law at the University of Florida

She's written a whole paper on yet a *third* category of legal risk that using generative AI could open companies up to, which I didn't even have space for:

papers.ssrn.com/sol3/papers.cf

2/🧵

Thomas Favre-Bulle 🏳‍🌈

@mimsical Counter: "using AI the way most commentators expect" is already far from the most common use case today, and it will matter less and less.

Section 230 doesn't apply to, e.g., automated pipelines of internal documents, and using AI for them doesn't change that.

For all the media attention on content creation for public consumption, most use is very boring office work.

@erispoe @mimsical

This. Content generation is very visible and in the public mind at the moment, but a lot of the real utility is internal or in the back end.