A.I. experts downplay ‘nightmare scenario of evil robot overlords’. Over 1,300 sign letter claiming it’s a ‘force for good, not a threat to humanity’
It honestly makes my blood boil when I read “AI will destroy everything” or “kill everyone” or similar. No, it’s not going to happen. We are nowhere close to that, and we will never, intentionally or unintentionally, create unkillable Terminator-style beings, because there simply is no material to build them with.
The big tech companies that want to freeze AI just want to control it themselves. Full speed ahead!
@SSUPII @throws_lemy @technology concur.
Skynet is a red herring.
The real issue is that #AI is putting more stress on long-standing problems we haven’t solved well. Good opportunity to think carefully about how we want to distribute the costs and benefits of knowledge work in our society.
“AI stole my book/art” is not that different from “show my work in the search results, but only enough that people still click through to my page”
“AI is taking all the jobs” is not that different from “you outsourced all the jobs overseas”
“the AI lied to me!” is not that different from “that twitter handle lied about me!”
The main difference is scale, speed, and cost. As things continue to speed up, social norms and regulations fall behind faster. #ai
AI scraping and stealing people's art is literally nothing like a search engine.
Maybe that would hold up if the original artist were paid and credited/linked to, but right now there is literally zero upside to having your artwork stolen by big tech.
@donuts would you please share your thinking?
I certainly agree that you can see the current wave of Generative AI development as “scraping and stealing people’s art.” But it’s not clear to me why crawling the web and publishing the work as a model is more problematic than publishing crawl results through a search engine.
For example, image search has been contentious for very similar reasons.
1. You post a picture online for people to see, and host some ads to make money when people look at it.
2. Then Google starts showing the picture in image search results.
3. People view the image on Google and never visit your site or click on your ads. Worst case, Google hotlinks it and you incur increased hosting costs with zero extra ad revenue.
I certainly think that a Generative AI model is a more significant harm to the artist, because it impacts future, novel work in addition to already-published work.
However, in both cases the key issue is a lack of clear and enforceable licensing on the published image. We retreat to asking “is this fair use?” and watching for new Library of Congress guidance. We should do better.
Search engines are rightly considered fair use because they provide a mutual benefit to both the people who are looking for "content" and the people who create that same "content". They help people find stuff, which is basically good for everyone.
On the other hand, artists derive zero benefit from having their art scraped by big tech companies. They aren't paid licensing fees (they should be), they aren't credited (they should be), and their original content is not visible or being advertised in any way. To me, it's simply exploitation right now, and I hope that things can change in the future so that it can benefit everyone, artists included.