If you want to visit a doctor for every minor thing, feel welcome. So far, LLMs have correctly predicted every health issue I have had and provided better and more accurate information than the doctor visit afterwards.
This does not mean they are infallible. But you can easily check what they suggest and see whether the symptoms match other websites and the doctor's description.
lol no
“Easily just do the same search you would have done afterwards anyway.” Truly, the LLM is very helpful and not just an uncertainty-adding middle step that, by your own admission, you rely on over medical professionals.
Do you believe doctors to be all-wise? They are similarly prone to error. Their heads are not Wikipedia.
Doctors have liability and the ability to self-regulate their confidence and understand certainty. Also, lol, “similarly prone to error”? No, human cognition is not a transformer.
Do not give medical advice. Neither you nor any LLM is licensed or capable of doing so. Yes, that means you should be held legally liable if that advice ever leads to harm, as should AI companies for convincing you and others of their grift.