The thing is, animal cognition is inextricably an embodied process. Affect is not a side-effect of cognition but its root.

The fact that we have computerised the production of outputs plausibly similar to those of animal cognition only means that we anthropomorphise the process that produces those plausible outputs.

We wrongly assign intention and goals to AI models like LLMs because we incorrectly assume the nature of their insides based on their outsides.

It is meaningless to talk of AI goals or intent, or at least meaningless to treat them as in any way isomorphic to animal goals or intent, because the mechanism that produces goals and intent simply does not exist in AI models.

