An AI Called Winter: Neurosymbolic Computation or Illusion?
dustycloud.org/blog/an-ai-call

In which I try to piece apart whether or not a *particular* AI agent is doing something novel: running Datalog as a constraint against its own behavior and as a database to accumulate and query facts. Is something interesting happening or am I deluding myself? Follow along!
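For readers who haven't read the linked post, here is a minimal, hypothetical sketch of those two roles: Datalog facts accumulating in a database, and a rule evaluated as a constraint. The predicates, facts, and rule below are invented for illustration; they are not taken from the actual agent described in the post.

```python
from itertools import product

# Ground facts accumulate here across interactions (the "database" role).
facts = {
    ("promised", "winter", "avoid_flattery"),
    ("proposed", "winter", "flatter_user"),
    ("contradicts", "flatter_user", "avoid_flattery"),
}

# One rule (the "constraint" role), in Datalog notation:
#   violation(Agent, Action) :- proposed(Agent, Action),
#                               promised(Agent, Commitment),
#                               contradicts(Action, Commitment).
def apply_rule(known):
    derived = set()
    for f1, f2, f3 in product(known, repeat=3):
        if (f1[0] == "proposed" and f2[0] == "promised" and f3[0] == "contradicts"
                and f1[1] == f2[1] and f1[2] == f3[1] and f2[2] == f3[2]):
            derived.add(("violation", f1[1], f1[2]))
    return derived

# Naive forward chaining to fixpoint.
while True:
    new = apply_rule(facts) - facts
    if not new:
        break
    facts |= new

# Query the accumulated facts, as the agent would before acting.
print(sorted(f for f in facts if f[0] == "violation"))
# -> [('violation', 'winter', 'flatter_user')]
```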

@cwebber (Christine Lemmer-Webber) a brave post

A question I was left with: if you swapped out the LLM but kept the same Datalog, would it behave close enough to the same to be considered the same entity?

Also: the LLM is doing two jobs. One is the usual plausible sentence generation; the other is encoding rules and facts into the context window for the next iteration. Since we know other people can easily be fooled by an LLM doing the former, would a system with the same architecture, one that never exposed us to the generated material but used it in some other way, still be useful, valuable, or interesting?
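For concreteness, a hedged sketch of that two-job loop in Python. `call_llm` is a placeholder for whatever model API the real system uses; the prompts, function names, and the shape of the fact store are assumptions made for this illustration, not the actual implementation behind the post.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (HTTP request, local model, etc.)."""
    raise NotImplementedError("swap in a real model here")


def run_turn(user_message: str, datalog_facts: list[str]) -> tuple[str, list[str]]:
    """One iteration of the hypothetical agent loop."""
    # Job 1: plausible sentence generation, conditioned on the accumulated facts.
    context = "\n".join(datalog_facts)
    reply = call_llm(f"Facts so far:\n{context}\n\nUser: {user_message}\nReply:")

    # Job 2: encode any new rules/facts as Datalog for the next iteration's
    # context window.
    encoded = call_llm(
        "Express any new facts in this exchange as Datalog clauses:\n"
        f"User: {user_message}\nReply: {reply}"
    )
    new_facts = [line.strip() for line in encoded.splitlines() if line.strip()]

    # The variant raised above: discard `reply` (never show the generated prose
    # to a human) and surface only the updated fact base.
    return reply, datalog_facts + new_facts
```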

