I just listened to an AI-generated audiobook, and if it hadn’t said it was AI, I’d have thought it was human-made. It has different voices, dramatization, sound effects… The last I’d heard about this tech was a post saying Stephen Fry’s voice had been stolen and replicated by AI. Since then, nothing, even though it has clearly advanced incredibly fast. You’d expect more buzz for something that went from obviously AI to indistinguishable from human work so quickly. How is it that no one is talking about AI-generated audiobooks and their rapid improvement? This seems like a huge deal to me.

  • LadyLikesSpiders@lemmy.ml · +105 / -8 · 1 year ago

    Ah yes, Audio AI. I can’t wait for this rapidly approaching future where you literally won’t be able to trust the validity of anything your senses tell you anymore.

      • AVincentInSpace@pawb.social · +1 · 1 year ago

        But up until this point, you see, there has always been one medium that is difficult/expensive enough to convincingly fake that it can reasonably be used as proof that something actually happened. If technology advances to the point where a video of something happening is no more convincing than a text description that it happened, and no other more sophisticated, harder-to-fake medium steps in to replace it…

        I don’t want to live in a world where the truth is anything you can convince your friends of, you feel me?

        • mindbleach@sh.itjust.works · +1 · 1 year ago

          “Up until this point” meaning maybe eighty years where unexpected events had any chance of being on film or televised, and several decades where amateur video was even theoretically possible.

          And solid corroborating evidence still barely moved the needle whenever it was footage of cops trying to kill someone.

          And what’s going to make bodycams necessary regardless is chain of custody demonstrating either (a) that the footage matching the victims’ account really did come from the camera strapped to the chest of the accused, or (b) that some motherfucker orchestrated a cover-up, which itself demonstrates consciousness of guilt.

    • AdmiralShat@programming.dev · +36 / -1 · 1 year ago

      Imagine the day when people post videos of the president saying literally anything, with pitch-perfect voice synthesis.

      Imagine going to prison for a generated clip of you confessing to a crime.

      • FaceDeer@kbin.social · +27 / -2 · 1 year ago

        Once the tech is that good, a recording of your confession will be useless as evidence in court.

        • AdmiralShat@programming.dev · +13 · edited · 1 year ago

          …but it already is that good. The fact that celebrities are having to come out and say it wasn’t them in an ad is proof enough that it can fool people.

          You only need to fool a jury

          • FaceDeer@kbin.social · +9 · 1 year ago

            Then we’ll have to take more care with how jury trials are conducted. It’s always been possible to fool juries; that’s often a lawyer’s entire strategy.

        • xkforce@lemmy.world · +8 / -3 · 1 year ago

          Everything will be useless in court. Audio evidence? Worthless. Video evidence? Worthless. Physical evidence? Prove that it wasn’t planted. That kind of AI is a fucking nightmare, and no one really understands the danger it poses.

          • FaceDeer@kbin.social · +9 / -1 · 1 year ago

            AI can’t tamper with physical evidence. It can’t fake financial records or witness testimony. Many kinds of audio and visual recordings will still have sufficient authentication and chain of custody to be worthwhile.

            The main kind of evidence that these AI generators make untenable is the kind where someone just shows up and says “look at this video of X confessing to Y that I happen to have,” which was never a particularly good sort of evidence to base a court case on to begin with.

            • xkforce@lemmy.world · +7 / -1 · edited · 1 year ago

              Witness testimony is already a very unreliable source of evidence. And again, evidence can be planted. Hell, there was doubt about the chain of custody before AI could just make up audio and video. The validity of the chain of custody boils down to the cops and government in general being trusted enough to not falsify it when it suits them.

              Sufficiently advanced AI can, and eventually will, create deepfakes that can’t reliably be proven false. Any test that can be used to authenticate media can also be used by the forger to select generated media that passes that same scrutiny (see the sketch at the end of this comment).

              I love the optimism and I hope you’re right but I don’t think you are. I think that deepfake AI should scare people a whole lot more than it does.
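
              Schematically, the selection loop I mean is trivial to write. A toy sketch (both functions are stand-ins invented for illustration, not real models):

              ```python
              import hashlib

              def generate_fake(seed: int) -> str:
                  # Stand-in for a generative model producing one candidate clip.
                  return f"clip-{seed}"

              def detector_score(clip: str) -> float:
                  # Stand-in for an authenticity detector: higher means "looks more real".
                  digest = hashlib.sha256(clip.encode()).digest()
                  return int.from_bytes(digest[:4], "big") / 2**32

              def forge_until_it_passes(threshold: float = 0.999, max_tries: int = 100_000):
                  # Rejection sampling: keep only candidates that the defenders' own test accepts.
                  for seed in range(max_tries):
                      candidate = generate_fake(seed)
                      if detector_score(candidate) >= threshold:
                          return candidate
                  return None

              print(forge_until_it_passes())
              ```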

              • FaceDeer@kbin.social · +2 · 1 year ago

                The validity of the chain of custody boils down to the cops and government in general being trusted enough to not falsify it when it suits them.

                There are ways to cryptographically validate chain of custody (a toy example follows at the end of this comment). If we’re in a world where only video with a valid chain of custody can be used in court, then those methods will see widespread adoption. You also didn’t address any of the other kinds of evidence I mentioned that AI can’t tamper with. Sure, you can generate a video of someone doing something horrible. But in a world where it is known that such videos can be generated, what jury would ever convict someone based solely on a video like that? It’s frankly ridiculous.

                This is very much the typical fictional-dystopia scenario, where one assumes all the possible negative uses of the technology will work perfectly but ignores all the ways those uses could be countered. You can spin a scary sci-fi tale from that kind of speculation, but it’s not really a useful way of predicting how the actual future is likely to go.
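
                To make the chain-of-custody point concrete, here is a toy hash-chain sketch (standard-library Python; the handlers and field names are invented for the example, not any court’s actual system). Each custody event is linked to the hash of the previous one, so editing an earlier record invalidates every later entry:

                ```python
                import hashlib
                import json

                def custody_entry(prev_hash: str, handler: str, action: str, evidence_sha256: str) -> dict:
                    # Record one custody event and link it to the previous entry's hash.
                    # (In practice each entry would also carry the handler's digital signature.)
                    body = {
                        "prev_hash": prev_hash,
                        "handler": handler,
                        "action": action,
                        "evidence_sha256": evidence_sha256,
                    }
                    body["entry_hash"] = hashlib.sha256(
                        json.dumps(body, sort_keys=True).encode()
                    ).hexdigest()
                    return body

                def chain_is_intact(chain: list) -> bool:
                    # Recompute every link; a retroactive edit breaks all later hashes.
                    prev = "genesis"
                    for entry in chain:
                        fields = dict(entry)
                        claimed = fields.pop("entry_hash")
                        recomputed = hashlib.sha256(
                            json.dumps(fields, sort_keys=True).encode()
                        ).hexdigest()
                        if fields["prev_hash"] != prev or recomputed != claimed:
                            return False
                        prev = claimed
                    return True

                evidence = hashlib.sha256(b"bodycam footage").hexdigest()
                chain = [custody_entry("genesis", "Officer A", "recorded", evidence)]
                chain.append(custody_entry(chain[-1]["entry_hash"], "Evidence clerk", "checked in", evidence))
                print(chain_is_intact(chain))   # True

                chain[0]["handler"] = "someone else"   # tamper with an earlier record
                print(chain_is_intact(chain))   # False
                ```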

        • Moneo@lemmy.world · +5 · 1 year ago

          That got me thinking about when we’ll hear of the first case of AI-generated security camera footage being used to frame someone, which in turn makes me wonder when it will become standard procedure for cameras to digitally sign their footage.
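
          For what it’s worth, the signing half is already cheap to do in software. A toy sketch of the idea, using the Python cryptography package’s Ed25519 API (the key handling and names here are illustrative assumptions, not any real camera vendor’s scheme):

          ```python
          import hashlib

          from cryptography.exceptions import InvalidSignature
          from cryptography.hazmat.primitives.asymmetric import ed25519

          # In a real camera the private key would live in tamper-resistant hardware;
          # generating it here is purely for demonstration.
          camera_key = ed25519.Ed25519PrivateKey.generate()
          camera_pub = camera_key.public_key()

          def sign_footage(video_bytes: bytes) -> bytes:
              # Camera side: hash the recorded clip and sign the digest at capture time.
              return camera_key.sign(hashlib.sha256(video_bytes).digest())

          def verify_footage(video_bytes: bytes, signature: bytes) -> bool:
              # Reviewer side: check the signature against the camera's public key.
              try:
                  camera_pub.verify(signature, hashlib.sha256(video_bytes).digest())
                  return True
              except InvalidSignature:
                  return False

          clip = b"raw footage bytes"
          sig = sign_footage(clip)
          print(verify_footage(clip, sig))         # True: clip untouched since capture
          print(verify_footage(clip + b"x", sig))  # False: edited after signing
          ```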

      • Shyfer@ttrpg.network · +19 · 1 year ago

        Or imagine politicians like Trump saying the most heinous stuff and then denying it, claiming it’s fake or AI-generated. How will people know? You won’t even be able to trust your own eyes or ears anymore.

      • Helix 🧬@feddit.de · +4 / -1 · 1 year ago

        Guess we’ll have to resort to digital watermarking with personal certificates, then.

      • LadyLikesSpiders@lemmy.ml · +4 · 1 year ago

        You know some people are just gonna generate that fucking locker room smell, the reek of hormones and axe body spray, to terrorize people

    • FooBarrington@lemmy.world · +4 · 1 year ago

      Tech like this has been available for a number of years, and has most likely already been used against you. It’s now becoming available to the broader masses, but that might just be a blessing in disguise, since increased awareness will hopefully also make you suspicious of the cases that are already happening.

      • LadyLikesSpiders@lemmy.ml · +9 / -3 · 1 year ago

        Yes, but you could tell those weren’t real. They still needed real voice actors, real sound design, studios and stages and resources. Now anyone with a halfway decent rig can fake shit to a very believable degree. Even with CGI you’d swear is fantastic, you notice its fakeness once the novelty wears off.