@Em0nM4stodonEm :official_verified: It is.

The one thing LLMs are good at is convincing people that they are useful. That gets people to spread them while providing absolutely no benefit.

They are trained to hold a convincing conversation. Because conversation is how we interact with other humans, when something is good at talking we assume it must be good at everything else... even when it was specifically designed to game that metric.
