• jsomae@lemmy.ml
    link
    fedilink
    arrow-up
    1
    ·
    edit-2
    1 hour ago

    That’s good. They shouldn’t care. I’ll keep saying this until the cows come home: AI is not something that can be used responsibly by most people. As the technology currently exists, it has rare and specific use cases – anything where you can accept a high failure rate, or can verify an answer more easily than you can posit one. This is completely against user expectation, and when the market realizes this, the bubble could pop.

  • RememberTheApollo_@lemmy.world
    link
    fedilink
    arrow-up
    11
    ·
    6 hours ago

    What benefit is there?

    AI basically takes what I already see in search results and tries to make it a conversational summary. I don’t want a conversation or to read a made-up wiki summary, just give me the correct and pertinent result. The problem is that they put AI first while search result quality has been deteriorating for years, and two wrongs don’t make forcing it on users right.

    • thermal_shock@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      2 hours ago

      I’m in the same boat. Besides some image generators my kid and I used to create some avatars, I don’t get it. Don’t need a conversation, just give me search results. Don’t waste my time.

  • lacaio da inquisição@lemmy.eco.br
    link
    fedilink
    arrow-up
    5
    arrow-down
    1
    ·
    5 hours ago

    From what I understand, this is the trend because Apple Silicon works. It has a GPU tightly integrated with the CPU and plenty of unified memory for AI tasks, all in a minimal case. You can run DeepSeek (the 671B one) on it. Who wouldn’t want that? The problem is that those companies’ hardware, specifically the firmware, is not to be trusted.

    Imagine a world where you would have to jailbreak everything on your PC for it to work. I think that’s what they’re going for. AI is really useful, and if they can make something like a Mac Studio cheaper, it has obvious value.
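
    For anyone wondering what “running it locally” actually looks like on a unified-memory machine, here’s a minimal sketch using the llama-cpp-python bindings. The model filename, context size, and prompt are placeholders I made up; an actual 671B DeepSeek needs a heavily quantized build and an enormous amount of unified memory:

    ```python
    # Minimal local-inference sketch (illustrative; file name and settings are assumptions).
    from llama_cpp import Llama

    llm = Llama(
        model_path="deepseek-q4_k_m.gguf",  # hypothetical quantized GGUF file on disk
        n_gpu_layers=-1,                    # offload all layers to the integrated GPU (Metal on Apple Silicon)
        n_ctx=4096,                         # context window size
    )

    result = llm("Summarize unified memory in one sentence.", max_tokens=64)
    print(result["choices"][0]["text"])
    ```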

  • some_guy@lemmy.sdf.org
    link
    fedilink
    arrow-up
    3
    ·
    6 hours ago

    The hype around this shit is astounding. That people who make decisions about products from huge brands keep buying in is shocking to me. How can something so useless (to most people) capture the imagination of educated and intelligent people? It’s a sign of how broken capitalism is. Rational thought is replaced by fear of missing out.

    • aesthelete@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      3 hours ago

      Most of the concrete value that can be delivered by connecting things to the Internet and applying simple algorithms has already been extracted by Silicon Valley. The only capital-extraction mechanisms left are difficult things that only a government has the capital access to make real progress on (e.g., AGI, advanced robotics, self-driving vehicles, space exploration) and hyped-up garbage that big investment firms think they can use to extract value from either the public (through scams like cryptocurrency) or other investors (through LLM and AI hype) and sell before people figure out that it’s smoke and mirrors.

      We made real progress on the backs of mostly government-funded research efforts like DARPA programs and GPS. The industry was able to optimize and innovate the shit out of the earliest computing breakthroughs, to the point where you now have a device in your pocket that can hold several Libraries of Congress and beats anything put out in desktop form 20 years ago. But since the tech companies that matter are all giant, there just aren’t ways for them to grow market share (everyone’s already their customer) or get many more dollars out of their existing customers. All that’s left are scams and bad business practices. That’s why we’re in the golden age of enshittification.

    • Spider@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      2
      ·
      4 hours ago

      Not educated and intelligent people, wealthy investors.

      We’re sighing at having to build all these features the boss wants; we know they’re stupid, and we see the lost opportunity cost that could have gone toward improving other things instead. I’m tired and the job market is so ass right now.

  • mriswith@lemmy.world
    link
    fedilink
    English
    arrow-up
    28
    ·
    16 hours ago

    I thought that was pretty obvious by now? Based on how hard the companies are trying to force-feed people their latest version through constant notifications about assistants, assisted search, etc.

    It’s one of the greatest flaws of relying on social media for market research: Tech-bros being overly loud about things like AI, NFTs, etc. tricks companies into thinking more people are interested than actually are.

    Now they’ve invested tons of money and people aren’t biting, so they’re constantly nagging people to engage so they can justify their expenditure.

      • kibiz0r@midwest.social
        link
        fedilink
        English
        arrow-up
        20
        ·
        edit-2
        16 hours ago

        Basically:

        Intel, AMD, and Microsoft are all going down a dead-end road called x86_64, especially on portable devices.

        Apple and Google took a turn ages ago, towards an alternative called aarch64. Originally just for phones, but now for everything.

        VR headsets, Raspberry Pis, IoT devices, etc. also tend to run aarch32 or aarch64.

        Microsoft has been trying to follow suit, but it hasn’t gone well so far. Windows on ARM (the aarch64 version of Windows) is supremely unpopular, for a lot of (mostly good) reasons.

        So people avoid the devices or ditch them because none of their apps run natively. But Microsoft basically has no choice but to keep pushing.

        So the end result is, Microsoft is subsidizing tons of excellent hardware that will never be used for Windows cuz it’s just not ready yet.

        But Linux is!

        Edit:

        Funny thing is, ARM (company behind aarch64) keeps shooting themselves in the foot, to the point where lots of companies are hedging their bets with a dark horse called RISC-V that never had a snowball’s chance in Hell before, but now could possibly win.

        And if Microsoft still hasn’t built a new home on aarch64 by the time that happens, they may accidentally be in the best position to capitalize on it.

        • sunzu2@thebrainbin.org
          link
          fedilink
          arrow-up
          6
          ·
          7 hours ago

          RISC-V is to CPUs what Linux is to operating systems.

          It will win in the end because closed-source corpo trash will always enshittify and erode its own market position. Just like micro-shit is the best marketer for Linux.

          • aesthelete@lemmy.world
            link
            fedilink
            arrow-up
            2
            ·
            3 hours ago

            It’s ultimately a war of attrition, which is something the quarterly crowd can never reliably win. They’re zip-tied to the market, while the open alternative can keep doing the hard but necessary work of getting better over time and can stand to be ignored for decades because it’s mostly hobbyists.

            • sunzu2@thebrainbin.org
              link
              fedilink
              arrow-up
              1
              ·
              3 hours ago

              it’s mostly hobbyists.

              I can stay retarded longer than they can stay solvent. There are millions of us, and more people are joining every day!

      • Grappling7155@lemmy.ca
        link
        fedilink
        arrow-up
        12
        ·
        16 hours ago

        ARM architecture, 64-bit. It’s the style of CPU in your phone and in MacBooks, known for being energy efficient, and its performance is getting better too.

        The big downside though is that loads of old Windows apps aren’t going to run on these as effortlessly as they would on conventional x86-64 CPUs from Intel and AMD.

          • Squizzy@lemmy.world
            link
            fedilink
            arrow-up
            1
            ·
            5 hours ago

            Phones are the hardest one; there just are no practical alternatives to the main two. Even degoogling is centered around Pixels and other mainstream brands.

            I’m looking at Nothing at the moment; I want something green and private.

      • OK, so I ran this past a techie colleague. Here’s how he summarized this for me.

        • @jagged_circle@feddit.nl is drawing a superficial parallel between CPU speculation and LLM/AI unpredictability without acknowledging the crucial differences in determinism, transparency, and user experience.
        • He’s relying on the likelihood that others in the conversation may not know the technical details of “CPU speculation”, allowing him to sound authoritative and dismissive (“this is old news, you just don’t get it”).
        • By invoking an obscure technical concept and presenting it as a “gotcha,” he positions himself as the more knowledgeable, sophisticated participant, implicitly belittling others’ concerns as naïve or uninformed.

        He is, in short, using bad-faith argumentation. He’s not engaging with the actual objection (AI unpredictability and user control); instead he’s derailing the conversation with a misleading-to-flatly-invalid analogy that serves more to showcase his own purported expertise than to clarify or resolve the issue.

        The techniques he’s using are:

        • Jargon as Gatekeeping:
          Using technical jargon or niche knowledge to shut down criticism or skepticism, rather than to inform or educate.

        • False Equivalence:
          Pretending two things are the same because they share a superficial trait, when their real-world implications and mechanics are fundamentally different.

        • Intellectual One-upmanship:
          The goal isn’t to foster understanding, but to “win” the exchange and reinforce a sense of superiority.

        To put his bad objection in plain English, he’s basically saying: “You’re complaining about computers guessing? Ha! They’ve always done that, you just don’t know enough to appreciate it.” But in reality, he’s glossing over the fact that:

        • CPU speculation is deterministic, traceable, and (usually) invisible to the user.

        • LLM/AI “guessing” is probabilistic, opaque, and often the source of user frustration.

        • The analogy is invalid, and the rhetorical move is more about ego than substance.

        TL;DR: @jagged_circle@feddit.nl is using his technical knowledge not to clarify, but to obfuscate and assert dominance in the conversation without regard to truth, a pretty much straightforward techbrodude move.
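
        If the deterministic-vs-probabilistic point above sounds abstract, here’s a toy Python sketch of the difference. Everything in it is made up purely for illustration (the “branch predictor” is wildly simplified, and none of this comes from either commenter):

        ```python
        import random

        def predict_branch(history: list[bool]) -> bool:
            """CPU-speculation-style guess: deterministic, and a wrong guess is
            silently rolled back, so the user never sees it."""
            return history.count(True) * 2 >= len(history)  # "taken" if taken at least half the time

        def sample_answer(token_probs: dict[str, float]) -> str:
            """LLM-style guess: drawn from a probability distribution, so the same
            question can produce different answers on different runs."""
            tokens, weights = zip(*token_probs.items())
            return random.choices(tokens, weights=weights, k=1)[0]

        history = [True, True, False, True]
        print(predict_branch(history), predict_branch(history))  # always identical

        probs = {"Paris": 0.7, "Lyon": 0.2, "Marseille": 0.1}
        print(sample_answer(probs), sample_answer(probs))        # may differ between runs
        ```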

        • jagged_circle@feddit.nl
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          6
          ·
          6 hours ago

          How do you think I’m grifting?

          Speculation caused huge security issues. Both of these technologies cause enormous harm.

          • Dude, in case my breakdown of your argumentation style didn’t make it clear: piss off. You’re a dishonest grifter with no opinion anybody should be paying attention to. Your parents should be ashamed of their accidental conception of you. AND you’re stupid enough to push AI bullshit in a group literally called “Fuck AI”.

            Go away. Your mother is calling you.

            Oops. She just called you something else.

            • jagged_circle@feddit.nl
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              4
              ·
              5 hours ago

              Be nice and learn to read. I said speculation is bad. I didn’t grift for AI. It’s also bad.

  • SplashJackson@lemmy.ca
    link
    fedilink
    arrow-up
    21
    arrow-down
    1
    ·
    1 day ago

    I’m waiting for all AI “features” to be isolated onto a single chip so I can just reach into the case with a pair of pliers and crush it into dust.

    • techt@lemmy.world
      link
      fedilink
      arrow-up
      6
      arrow-down
      2
      ·
      21 hours ago

      Then you’d just have to deal with errors from processes expecting it to be there. Probably better to instead not use software that implements it, which I assume you’re doing anyway.

  • whodatdair@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    10
    arrow-down
    4
    ·
    edit-2
    15 hours ago

    Can’t wait to buy a used laptop with an NPU and a bunch of RAM for my home lab to run private LLMs on. Just gotta be patient. 🙃