• Nick Clegg, former Meta executive and UK Deputy Prime Minister, has reiterated a familiar line when it comes to AI and artist consent.
  • He said that any push for consent would “basically kill” the AI industry.
  • Clegg added that the sheer volume of data that AI is trained on makes it “implausible” to ask for consent.
  • Tattorack@lemmy.world · 2 days ago

    If asking for permission is going to kill an industry, then that industry should be killed.

    • Bravo@eviltoast.org · 2 days ago

      In principle I agree. The problem is that there are countries which don’t care about respecting the law, and if you kill AI in the West, all that will happen is that the West gets left behind.

    • dumbpotato@lemmy.cafe · 2 days ago

      No, it’s not like saying that.

      Please stop trying to use rape as a way to get an emotional response for something unrelated.

  • phlegmy@sh.itjust.works · 3 days ago

    Cool, so I’ll get started on building an automated business that sells cheap access to all the music, movies and shows on the streaming services.

    Getting consent for each title would basically kill my business and would be implausible, so I’ll just assume it’s ok.

  • Treczoks@lemmy.world · 3 days ago

    If a business cannot survive without breaking the law, then it is not a business but a criminal organisation.

    • dumbpotato@lemmy.cafe · 2 days ago

      Contrary to popular belief among useful idiots, copyright and patent laws are not there to protect the working class.

      If copyright and patent laws actually protected workers, why have we not seen rulers fight back against them until now?

      This should be eye-opening to most of you, but that would involve admitting you were wrong.

      Most people can’t do that.

  • Benchamoneh@lemmy.dbzer0.com · 2 days ago

    If an industry can’t survive without resorting to copyright theft then maybe it’s not a viable business.

    Imagine the businesses that could exist if only they didn’t have to pay copyright holders. What makes the AI industry any different or more special?

  • vane@lemmy.world · 3 days ago

    I have a proposition. Raid them with police and search their computers for stolen data like you would do with your citizens.

  • FreakinSteve@lemmy.world · 3 days ago

    oh noes

    Look, these goddamn assholes have got in their head that they have a right to profit.

    NOBODY HAS A RIGHT TO PROFIT.

    You have a right to try to create a profit, and there are rules to that. You’re gonna lose your billions in investment if you can’t plagiarize content? Fuck you, your loss, and you shoulda fucking known better when the idea was presented to you.

    Assholes

  • Flickerby@lemm.ee · 3 days ago

    If your industry can’t exist without theft then your industry doesn’t deserve to exist, pretty simple.

    • toastmeister@lemmy.ca · 3 days ago

      Copying isn’t theft, the original still exists. Just like watching pirated movies.

      • Rose@slrpnk.net · 3 days ago

        The AI industry doesn’t want to abolish or reform copyright law, they just want an exception so that they can keep appropriating shit. On the contrary, they’re pretty mad that AI stuff isn’t covered by more copyright.

        AI bros are not on the side of open culture.

      • Flickerby@lemm.ee · 3 days ago

        If someone pirates a movie for home use it’s no big deal, because yes. If someone pirates a movie and then opens a movie theatre and starts charging people to watch the movie, that’s an entirely different matter. AI is a business generating income, not a person skipping out on a $4 rental fee.

      • Vanilla_PuddinFudge@infosec.pub · 3 days ago

        Copying isn’t theft, the original still exists. Just like watching pirated movies.

        Shit take when the results are used for profit. Most of us that pirate aren’t legally monetizing our stash.

      • Blackmist@feddit.uk · 3 days ago

        Cool, so I can torrent without a VPN now?

        Oh, only the super rich can benefit. How convenient.

      • Don_alForno@feddit.org · 3 days ago

        As long as people get punished for pirating media, corporations need to license their shit just the same.

  • dumbpotato@lemmy.cafe · 2 days ago

    Rules for thee, not for me.

    I thought copyright and patent laws were supposed to protect the little guy? Looks like as soon as they protect the little guy from big business, they stop mattering.

    It’s almost like they weren’t there to protect the little guy, which is why big businesses never fought back against them.

    I guess the useful idiots were wrong, again. Color me not-surprised.

  • CosmoNova@lemmy.world · 4 days ago

    If abiding by the law destroys your business, then you are a criminal. Simple as.

    • ☂️-@lemmy.ml · 4 days ago

      yes. but honestly, we should use this opportunity to push for better copyright law.

      • 6nk06@sh.itjust.works · 3 days ago

        You can’t have a better law. Copyright laws are one-sided towards $billion companies. They would never agree to give more power to small creators or (worse) open-source projects who rely on such laws without making money.

        • ☂️-@lemmy.ml · 3 days ago

          yeah i mean that’s why we should push for it, and not wait for lying politicians to do it for us.

        • Oniononon@sopuli.xyz · 3 days ago

          Yes you can. Raise awareness, vote, contact representatives, organise, and sign a large petition. This is EU only; if you’re in the US, use the 2nd Amendment as intended in order to get your democracy back.

    • Even_Adder@lemmy.dbzer0.com · 4 days ago

      But the law is largely the reverse. It only denies the use of copyrighted works in certain ways. Using things “without permission” forms the bedrock on which artistic expression and free speech are built.

      AI training isn’t only for mega-corporations. Setting up barriers like these only benefits the ultra-wealthy and will end with corporations gaining a monopoly on a public technology by making it prohibitively expensive and cumbersome for regular folks. What the people writing this article want would mean the end of open access to competitive, corporate-independent tools and would jeopardize research, reviews, reverse engineering, and even indexing information. They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.

      I recommend reading this article by Kit Walsh and this one by Tory Noble, staff attorneys at the EFF; this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries; and these two by Cory Doctorow.

      • I Cast Fist@programming.dev · 4 days ago

        They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.

        Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don’t mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.

        Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free. “But they spent money and time and resources!” So did everyone who created the stuff they’re using for their training, so they can fuck off.

        The article by Tory also says these things:

        This facilitates the creation of art that simply would not have existed and allows people to express themselves in ways they couldn’t without AI. (…) Generative AI has the power to democratize speech and content creation, much like the internet has.

        I’d wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.

        • Even_Adder@lemmy.dbzer0.com · 4 days ago

          Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don’t mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.

          You can plagiarize with a computer with copy & paste too. That doesn’t change the fact that computers have legitimate non-infringing use cases.

          Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free.

          I agree

          I’d wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.

          But 99.9% of the internet is stuff that no one would miss. Things don’t have to have value to you to be worth having around. That trash could serve as inspiration for that 0.1% or garner feedback that helps people improve.

          • I Cast Fist@programming.dev · 4 days ago

            The apparent main use for AI thus far is spam and scams, which is what I was thinking about when dismissing most content made with it. While the internet was already chock full of that before AI, AI’s availability is increasing those problems tenfold.

            Yes, people use it for other things, like “art”, but most people using it for “art” are trying to get a quick buck ASAP before customers get too smart to fall for it. Writers already had a hard time getting around; now they have to deal with a never-ending deluge of AI books, plus the risk of a legally-distinct-enough copy of their work showing up the next day.

            Put another way, the major use of AI thus far is “I want to make money without effort.”

            • Even_Adder@lemmy.dbzer0.com · 4 days ago

              It definitely seems that way depending on what media you choose to consume. You should try to balance the doomer scroll with actual research and open source news.

              • I Cast Fist@programming.dev · 4 days ago

                I’m basing it mostly on personal and family experience. My mom often ends up watching AI-made videos (stuff that’s just an AI narrator over an AI image slideshow), my RPG group has poked fun at the number of AI books that Amazon keeps suggesting to them, and anyone using Instagram will, sooner or later, see adverts of famous people endorsing bogus products or sites via the magic of AI.

                • Even_Adder@lemmy.dbzer0.com · 4 days ago

                  So you don’t interact with AI stuff outside of that? Have you seen any cool research papers or messed with any local models recently? Getting a bit of experience with the stuff can help you better inform people and see through the more bogus headlines.

          • skulblaka@sh.itjust.works · 4 days ago

            I don’t really disagree with your other two points, but

            You can plagiarize with a computer with copy & paste too. That doesn’t change the fact that computers have legitimate non-infringing use cases.

            They sure do, of which that is not one. That’s de facto copyright infringement or plagiarism. Especially if you then turn around and sell that product.

            • 8uurg@lemmy.world · 3 days ago

              The key point being made is that if you are committing de facto copyright infringement or plagiarism by creating a copy, it shouldn’t matter whether that copy was made through copy-paste, by re-compressing the same image, or by using an AI model. The product here is the copy-paste operation, the image editor, or the AI model, not the (copyrighted) image itself. You can still sell computers with copy-paste (despite some attempts from large copyright holders with DRM), and you can still sell image editors.

              However, unlike copy-paste and the image editor, the AI model could memorize and emit training data without the input implying the copyrighted work (excluding the case where the image itself, or a highly detailed description of the work, was provided, since then it is clearly the user who is at fault and intending for this to happen).

              At the same time, it should be noted that exact replication of training data isn’t desirable in any case, and online image-generation services could include an image similarity check against their training data; many probably do this already. A rough sketch of what such a check might look like is below.
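              For illustration only: a minimal sketch of such a similarity check using perceptual hashes, assuming the Python `imagehash` and `Pillow` libraries; the threshold, directory layout, and function names are invented for the example, and a real service would likely use far more robust near-duplicate detection.

              ```python
              # Hypothetical sketch: flag generated images that are near-duplicates
              # of training images using perceptual hashes. Threshold and paths are
              # illustrative assumptions, not any service's actual implementation.
              from pathlib import Path

              import imagehash
              from PIL import Image

              HAMMING_THRESHOLD = 8  # assumed cutoff; smaller distance = more similar


              def build_index(training_dir: str) -> dict[str, imagehash.ImageHash]:
                  """Precompute a perceptual hash for every training image."""
                  return {
                      path.name: imagehash.phash(Image.open(path))
                      for path in Path(training_dir).glob("*.png")
                  }


              def looks_memorized(generated_path: str, index: dict[str, imagehash.ImageHash]) -> bool:
                  """Return True if the generated image is suspiciously close to a training image."""
                  gen_hash = imagehash.phash(Image.open(generated_path))
                  # Subtracting two ImageHash objects yields their Hamming distance.
                  return any(gen_hash - h <= HAMMING_THRESHOLD for h in index.values())


              # Example usage (paths are placeholders):
              # index = build_index("training_images/")
              # if looks_memorized("output.png", index):
              #     print("Withhold output: likely training-data regurgitation")
              ```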

      • tane6@lemm.ee · 4 days ago

        The fact that this is upvoted is so funny but unsurprising given the types who visit this site

        • despoticruin@lemm.ee · 4 days ago

          Yeah, anyone who thinks stealing content explicitly for financial gain is fair use needs their head checked.

  • DrownedRats@lemmy.world · 3 days ago

    If being declined consent is going to kill your industry, then maybe your industry deserves to die.

    Fucking rapist mentality right there.

    • Tobberone@lemm.ee · 2 days ago

      My thought exactly. If consent isn’t needed, what other actions do they deem justified without consent?

      This is not an IP issue, this is about human rights.