fossilesque@mander.xyz to Science Memes@mander.xyz · English · 3 days ago
Being Difficult
ptu@sopuli.xyz · 2 days ago
It’s called the heuristic method, and those doing it know the limitations, whereas LLMs will just confidently put out garbage and claim it is true.
ranzispa@mander.xyz · 2 days ago
Scientific calculations, and other approaches as well, put out garbage all the time; that is the main point of what I said above. Some limitations are known, just as it is known that LLMs have the limitation of hallucinating.
ptu@sopuli.xyz · 1 day ago
My critique wasn’t about the outcome of the results, but about how they were achieved. LLMs hallucinating makes computers produce “human errors”, which makes them less deterministic, and determinism is the key reason I prefer doing some things on a computer.