• sunzu2@thebrainbin.org · 3 months ago

    Too late to the grift…

    Why even collab with that clown, when Apple has money to hire talent and buy NVIDIA cards to do the job…

      • sunzu2@thebrainbin.org · 3 months ago

        Learn something new every day… Well, Apple had better tuck that dick and buy these GPUs then, it seems. I doubt their own design can compete tbh… if it did, NVIDIA would not be running a train on the global GPU market.

        • thejml@lemm.ee · 3 months ago

          Considering how long Apple has been putting neural cores in all of its chips, and how quickly its in-house silicon has outpaced competitors (the M series, for example), I feel like not only will Apple beat NVIDIA at this, it will do so by a decent amount.

          That said, NVIDIA will continue to sell worldwide in this market, since Apple keeps its chips in its own hardware only; so if you’re not running a Mac/iOS device, you’ll be using NVIDIA chips.

          Either way, even if Apple just keeps up, competition is still best for everyone, so I welcome this development.

          • Akrenion@slrpnk.net · 3 months ago

            The M chips are only good for inference, not for training. That is still unparalleled with CUDA. Pun intended.

            • nilloc@discuss.tchncs.de · 3 months ago

              > That is still unparalleled with CUDA.

              I still don’t understand how an open source alternative with better hardware support hasn’t happened yet.

              • herrvogel@lemmy.world · 3 months ago

                The problem has two sides: software and hardware. You can open source the software side all you want, it’s not gonna go very far when it has to fight against the hardware instead of working with it.

                ROCm is open source, but it’s AMD. Their hardware has historically not been as powerful, and therefore not as attractive, to the target audience, so progress has been slow.

    • narc0tic_bird@lemm.ee · 3 months ago

      They can, and they are: they’re making their own chip designs to do the job.

      The cloud part of Apple Intelligence already runs on Apple-designed hardware.