I cannot reproduce this on Google.
Maybe they fixed it; I was able to reproduce it on Mixtral.
Totally reproducible, just with slightly different prompts.
There’s going to be an entire generation of people growing up with this and “learning” this way. It’s like every tech company got together and agreed to kill any chance of smart kids.
This makes me happy since I won’t lose my job to newbies.
Isn’t it the opposite? Kids see so many examples of obviously wrong answers they learn to check everything
How do they know something is obviously wrong when they're trying to learn it? For “bananum”, sure, but for anything at school or college?
One can hope…
And yet it doesn’t even list ‘Plum’, or did it just think ‘Applum’ was just a variation of a plum?
Well, plum originally comes from applum which morphed into a plum so yeah.
And that’s absolutely not true.
I will start using this as a factoid
Did you know that “factoid” means small piece of trivia? /S
So will a bunch of LLMs
Tomatum… that’s the one
Looks like someone set Google to “Herakles Mode”.
This is the tech you’re all afraid of?
Evaporated 20 gallons of water in Nevada for this
Reminds me of how the “1800 gallons for one burger” statistic uses annual rainfall to calculate that as if it was captured, stored, and used from our kitchen sinks.
What’s that got to do with datacenters using evaporative cooling?
Also, if you’re curious: cows drink 9-20 gallons of water a day, and over the typical 1-2 year lifespan that amounts to 3,285 gallons on the conservative side, or up to 14,000 gallons in hot climates, per cow. Depending on the cut, that’s some 800 quarter pound patties, and using that conservative figure of 1 year at 9 gallons a day, that is about…
4 gallons per burger’s worth of meat. That total 3,000-14,000 gal/cow water usage is certainly an issue, especially in hot climates, but why make up bullshit?
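The back-of-envelope math above checks out in a few lines (all inputs are the comment’s ballpark figures, not measured data):

```python
# Rough per-burger water math, using the comment's ballpark figures.
gal_per_day_low, gal_per_day_high = 9, 20    # daily drinking water per cow
days_low, days_high = 365, 2 * 365           # 1-2 year lifespan

lifetime_low = gal_per_day_low * days_low    # conservative lifetime total
lifetime_high = gal_per_day_high * days_high # hot-climate lifetime total

patties_per_cow = 800                        # quarter-pound patties, per the comment
gal_per_patty = lifetime_low / patties_per_cow

print(lifetime_low, lifetime_high, round(gal_per_patty, 1))  # 3285 14600 4.1
```

So the drinking-water cost really is about 4 gallons per patty on the conservative end; the 1,800-gallon figure only appears once rainfall on pasture is counted as “used” water.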
Because statistics like those often ignore the fact that the water they’re counting is inaccessible for other uses. They calculate the rainwater used to make the grass grow, water we neither collect nor have available for other uses, but it makes the number higher and more shocking. If you see “14,000 gallons of water per cow” you think that’s how much water we’ve “lost” when in reality, it’s a massive bucket of rainwater they’re drinking out of, not a hit to our irrigation or water treatment facilities.
It’s a misleading statistic meant to shock and manipulate you into a specific way of thinking, a lot like your original comment. I don’t give a shit how much rainwater a cow drinks, I care about how much is being pulled from local irrigation. Rainwater is going to lie in the dirt and evaporate anyway, so why is that being counted? If the answer to how much water is being pulled from our infrastructure is nearly zero, that’s how many fucks I dedicate to it.
Should datacenters be operating in silicon valley where water is already scarce? No. But people shouldn’t also be living in a fucking desert, overdrawing from the river that lets anyone live there, so maybe they should move. Not like they can’t afford to.
You should learn about thermodynamics
You should learn how to structure a retort.
I’m just matching effort 🤷‍♂️
Gemini thought we name food like we name a periodic table
plutonium is food once
A gram of plutonium has enough calories to last you the rest of your life.
Coconutum
You did WHAT to em?
I barely know 'em!
Same vibe: HHGregg is Havaving a Summummer Sale
Some “AI” LLMs resort to light hallucinations. And then ones like this straight-up gaslight you!
MFer accidentally got “plum” right and didn’t even know it…
Factual accuracy in LLMs is “an area of active research”, i.e. they haven’t the foggiest how to make them stop spouting nonsense.
duckduckgo figured this out quite a while ago: just fucking summarize wikipedia articles and link to the precise section it lifted text from
We can’t fleece investors with that though, needs more “AI”.
Because accuracy requires that you make a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, full humans aren’t that great at this task. This isn’t a small problem, I don’t think you solve it without creating AGI.
It’s crazy how bad AI gets if you make it list names ending with a certain pattern. I wonder why that is.
It can’t see what tokens it puts out, you would need additional passes on the output for it to get it right. It’s computationally expensive, so I’m pretty sure that didn’t happen here.
doesn’t it work literally by passing in everything it said to determine what the next word is?
I’m not an expert, but it has something to do with full words vs partial words. It also can’t play Wordle because it doesn’t have a proper concept of individual letters in that way; it’s trained to only handle full words.
That’s interesting, didn’t know
they don’t even handle full words; it’s just arbitrary groups of characters (including spaces and other stuff like apostrophes, afaik) that are represented to the software as indexes in a list. It literally has no clue what language even is; it’s a glorified calculator that happens to work on words.
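A toy greedy tokenizer makes the point: the model only ever sees the list of indexes, never the letters. (The vocabulary here is invented for illustration; real BPE vocabularies are learned from data.)

```python
# Toy longest-match tokenizer. The vocabulary is made up for illustration.
VOCAB = ["straw", "berry", "um", "ban", "an", "a", "b", "e", "m", "n", "r", "s", "t", "u", "w", "y"]
TOKEN_ID = {tok: i for i, tok in enumerate(VOCAB)}

def tokenize(text):
    """Greedily match the longest vocabulary entry at each position."""
    ids = []
    i = 0
    while i < len(text):
        for tok in sorted(VOCAB, key=len, reverse=True):
            if text.startswith(tok, i):
                ids.append(TOKEN_ID[tok])
                i += len(tok)
                break
        else:
            raise ValueError(f"no token matches at position {i}")
    return ids

# "strawberry" comes out as two opaque indexes. The model never sees the
# individual letters, which is why letter-level questions are hard for it.
print(tokenize("strawberry"))  # [0, 1]
print(tokenize("bananum"))     # [3, 4, 2]  -> "ban" + "an" + "um"
```

From the model’s side, “strawberry” is just the pair `[0, 1]`, so counting its r’s, or listing words that end in “um”, means reasoning about characters it was never shown.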
I mean, isn’t any program essentially a glorified calculator?
LLMs aren’t really capable of understanding spelling. They’re token prediction machines.
LLMs have three major components: a massive database of “relatedness” (how closely related the meanings of terms are), a transformer (figuring out which of the previous words have the most contextual meaning), and statistical modeling (the likelihood of the next word, like what your cell phone does.)
LLMs don’t have any capability to understand spelling, unless it’s something it’s been specifically trained on, like “color” vs “colour” which is discussed in many training texts.
"Fruits ending in ‘um’ " or "Australian towns beginning with ‘T’ " aren’t discussed in the training data often enough to build strong relatedness associations for, so it’s incapable of answering those sorts of questions.
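Those three pieces can be sketched in a few lines of numpy with made-up numbers (real models have billions of learned parameters, not a random 4-word vocabulary):

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["fruits", "ending", "in", "plum"]
d = 8  # embedding width

# 1) "relatedness": each token gets a vector; in a trained model, words
#    with similar meanings get similar vectors (here they're just random).
E = rng.normal(size=(len(VOCAB), d))

def attention(X):
    # 2) transformer self-attention: each position scores every earlier
    #    position and mixes their vectors by those weights.
    scores = X @ X.T / np.sqrt(d)
    scores = np.tril(scores) + np.triu(np.full_like(scores, -1e9), 1)  # causal mask
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ X

def next_token_probs(token_ids):
    # 3) statistical model: project the last position back onto the
    #    vocabulary and softmax into next-token probabilities.
    X = attention(E[token_ids])
    logits = X[-1] @ E.T
    p = np.exp(logits - logits.max())
    return p / p.sum()

p = next_token_probs([0, 1, 2])  # "fruits ending in ..."
print({word: round(float(prob), 3) for word, prob in zip(VOCAB, p)})
```

Note that every step operates on whole-token vectors: nothing in this pipeline ever looks inside a token at its letters, which is the point being made above.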
Make em say ummmmmmm na na na naaaaa
Strawberrum sounds like it’ll be at least 20% abv. I’d like a nice cold glass of that.
Strawberrum? Barely knew 'em!
Googlum is brokum
Despite that, it delivers its results with much applum!
Quality pum
I just tried to have Gemini navigate to the nearest Starbucks and the POS found one 8hrs and 38mins away.
Absolute trash.
Just tried it with Target and again, it’s sending me to Raleigh, North Carolina.
that leads me to believe it thinks you’re in North Carolina. Have you given Gemini location access? Are you on a VPN?
No VPN, it all has proper location access. I even tried it with a local restaurant that I didn’t think was a chain, and it found one in Tennessee. I’m like 10 minutes away from where I told it to go.
It seems to think you need to leave Alabama but aren’t ready for a state as tolerable as Georgia
I would totally leave if the “salary to cost of living” ratio wasn’t so damn good.
I’d move to Germany or the Netherlands or Sweden or Norway so fast if I could afford it.