We Asked A.I. to Create the Joker. It Generated a Copyrighted Image.
Artists and researchers are exposing copyrighted material hidden within A.I. tools, raising fresh legal questions.

  • Random_Character_A@lemmy.world · 10 months ago

    Can a tool create? It generated.

    Anyway, in a case like this, is creation even a factor in liability?

    In my opinion, whoever first gets monetary value from the piece should be liable.

    NYTimes?

    • wildginger@lemmy.myserv.one · 10 months ago

      “I didn’t kill him, officer, my murder robot did. Oh, sure, I built it and programmed it to stab Jenkins to death for an hour. Oh, yes, I charged it, set it up in his house, and made sure all the programming was set. Ah, but your honor, I didn’t press the on switch! Jenkins did, after I put a note on it that said ‘not an illegal murderbot’ next to the power button. So really, the murderbot killed him, and if you like, maybe even Jenkins did it! But me? No, sir, I’m innocent!”

        • Ross_audio@lemmy.world · 10 months ago

          And someone created the AI programming too.

          Then someone trained that AI.

          It didn’t just come out of the aether; there’s a manual on how to do it.

          • Random_Character_A@lemmy.world · 10 months ago

            Yes, but in your previous example the person specifically created a machine to stab a specific person.

            The example would be apt if you created a program that generates code for industrial machines to insert parts into products, then uploaded a generated program without checking the code, and it stabbed some random guy.

                • Ross_audio@lemmy.world · 10 months ago

                  The comparison to liability for industrial machines is actually quite apt.

                  If you design a machine that kills someone during reasonable use, you are liable.

                  Aircraft engineers have a 25-year liability on their work. A mistake they might make could kill hundreds.

                  There is always a human responsible for the actions of a machine. Even unintended results carry liability.

                  If you upload a program to a machine and someone dies as a result, you’re in hot water.

                  Moving away from life and death, unintended copyright infringement by a machine hasn’t been tested. But it’s likely to be ruled that at least some of the builders of that machine are responsible.

                  AI “self-driving” cars are getting away with it by only offering an assist to driving, keeping the driver responsible. That’s only possible because you need a license to drive a car in the first place.

                  AI images like this are the equivalent of a fully self-driving car. You set the destination, and it drives you there. The liability falls on the process of driving, or the process of creating. Since the machine is doing that process, the designers are the ones liable.

    • Ross_audio@lemmy.world · 10 months ago

      So by that logic: I prompted you with a question. Did I create your comment?

      I used you as a tool to generate language. If it was a Pulitzer-winning response, could I gain the plaudits and profit, or should you?

      If it then turned out to be plagiarism on your part, should I get the credit for that?

      Am I liable for what you say when I have had no input into the generation of your personality and thoughts?

      The creation of that image required building a machine learning model.

      It required training a machine learning model.

      It required prompting that machine learning model.

      All three are required steps to produce that image, and all are part of its creation.

      The part copyright holders will focus on is the training.

      Human beings are held liable if they see and then copy an image for monetary gain.

      An AI has done exactly this.

      It could be argued that the most responsible and most controlled element of the process, and therefore the most liable, is the input of training data.

      Either the AI model is allowed to absorb the world and create work, and is held liable under the same rules as a human artist, in which case the AI is liable.

      Or the AI model is assigned no responsibility itself, but then it should never have been given copyrighted work without a license to reproduce it.

      Either way the owners have a large chunk of liability.

      If I ask a human artist to produce a picture of Donald Duck, they legally can’t. Even if they went ahead and broke the law, Disney could take them to court and win.

      The same would be true of any business.

      The same is true of an AI as either its own entity, or the property of a business.

      • Random_Character_A@lemmy.world · 10 months ago

        I’m not a non-sentient construct that creates stuff.

        …and when copyright law was written, there were no non-sentient things generating stuff.

        • Ross_audio@lemmy.world · 10 months ago

          There is literally no way to prove whether you’re sentient.

          Descartes found that limitation.

          The only definition in law is whether you have competency to be responsible. The law assumes you do as an adult unless it’s proven you don’t.

          Given the limits of AI, the court is going to assume it to be a machine. And a machine has operators, designers, and owners. Those are the humans responsible for that machine.

          It’s perfectly legitimate to sue a company for using a copyright-breaking machine.

          • Random_Character_A@lemmy.world · 10 months ago

            You almost seem like you get the problem, but then you flounder away.

            The law hasn’t caught up with a world of generative programs. A.I. will not be considered sentient, and they will have this same discussion in court.

            • Ross_audio@lemmy.world · 10 months ago

              It doesn’t matter whether AI is sentient or not. It has a designer, trainer, and owner.

              Once you prove that the actions taken by the AI, even as just a machine, breach copyright, liability is easily assigned.

              • Random_Character_A@lemmy.world · 10 months ago

                Agree to disagree, and time will tell, but you must see there are factors here that haven’t existed before in the history of humanity.

                • Ross_audio@lemmy.world · 10 months ago

                  Who knows how the laws will change because of AI. But as the law currently stands, it’s just a matter of proving it to a court. That’s the main barrier.

                  This is strong evidence an AI is breaking the law.

                  • Random_Character_A@lemmy.world · 10 months ago

                    That Joker could have been somebody’s avatar picture with a matching username.

                    A.I. can’t understand copyright, and useful A.I. can’t be built by shielding it from every piece of material somebody thinks is their IP. It needs to learn to understand humans, and it needs human material to do so. A shitload of it. Who’s up for some manual filtering?

                    If we go by NYTimes standards, we’d better mothball the entire AI endeavor.