• 0 Posts
  • 119 Comments
Joined 2 years ago
Cake day: July 14th, 2023

    1. Fuck AI
    2. This judge’s point is absolutely true:

    “You have companies using copyright-protected material to create a product that is capable of producing an infinite number of competing products,” Chhabria said. “You are dramatically changing, you might even say obliterating, the market for that person’s work, and you’re saying that you don’t even have to pay a license to that person.”

    1. AI apologists’ response to that will invariably be “but it’s sampling from millions of people at once, not just that one person”, which always sounds like the fractions-of-a-penny scene
    2. Fuck copyright
    3. A ruling against fair use for AI will almost certainly deal collateral damage to perfectly innocuous scraping projects like linguistic analysis, despite the Copyright Office’s acknowledgement of the issue:

    To prevent both harms, the Copyright Office expects that some AI training will be deemed fair use, such as training viewed as transformative, because resulting models don’t compete with creative works. Those uses threaten no market harm but rather solve a societal need, such as language models translating texts, moderating content, or correcting grammar. Or in the case of audio models, technology that helps producers clean up unwanted distortion might be fair use, where models that generate songs in the style of popular artists might not, the office opined.

    1. We really need to regulate against AI — right now — but doing it through copyright might be worse than not doing it at all



  • The author seems to have fallen for two tricks at once: the MPAA/RIAA playbook of seeing all engagement with content through the lens of licensing, and the AI hype machine telling everyone that someday they will love AI slop.

    He mentions people complaining that stock photo sites, book portals, and music streaming services are all degrading in quality because of AI slop, but his conclusion is that people will start seeking out AI content because it’s not copyrighted.

    Regardless… The position of those in power has not changed. They never believed in copyright as a guiding concept, only as a means to an end. That end being: We, the powerful, will control culture, and we will use it to benefit ourselves.

    Before generative AI, the approach was to keep the cultural landscape well-groomed – something you’d wanna pay to experience. Mindfully grown and pruned, with clear walking paths, toll booths at each entrance, and harsh penalties for littering or stepping on the grass. You were allowed to have your own toll-free parks outside the secure perimeter, ones that continued the walking paths in mutually beneficial ways, as long as visitors didn’t track mud back in.

    But now? The landscape is no longer about creating a well-manicured amusement park worth the price of admission. There’s oil under the surface. And it’s time to frack the hell out of it. It’s too bad about the toxic slurry that will accumulate up top, making the walled and unwalled parks alike into an intolerable biohazard. There are resources to extract. Externalities are an end-user problem.

    Yeah, turning culture into an expensive amusement park was a horrible mistake. But I wouldn’t get too eager to gloat over seeing the tide of sludge pour over their walls. We’ll still be on the outside, drowning in it.