Not trying to drag on @codinghorror (Jeff Atwood) here, but I’ve seen this repeated in a few places and he just ends up being the convenient toot (he retooted it into his own timeline, so I’m not picking it out of a thread here).

This is correct. LLMs are a tool. They can be dangerous when used wrong but helpful when used correctly. Academically correct.

The problem is that the way they’re being sold and the way they’re being implemented are incorrect. And that’s human nature.

