If I'm in a video game, I'd like to request a better player. The problem I see with AGI is that these machines, with their large language models, may be able to use their predictive capabilities to simulate emotional responses, but they don't have actual emotional experiences. Per Robert Sapolsky, they don't experience the hormonal brain bath and the genetics of "one minute before, ten minutes before, ten days before," etc. LLMs may scrape all of the written knowledge of humankind, but they will never experience it. I think the fear that we are creating computerized sociopaths is not unreasonable (the films Ex Machina and Colossus: The Forbin Project come to mind).

The idea of being in a simulation makes reason stare: any being with the intelligence to create one might have the intelligence to do something better with its time. The nature of the quantum universe seems the better explanation, but thinking about it at 6:45 AM makes going back to bed an enticing proposition.

When we look at Sapolsky's proposition, one might feel "I am what I am" is the nihilistic conclusion. However, a change in environment (such as involvement in a recovery community) actually can bring about change in the person, and I _think_ Sapolsky would agree, in spite of there being no "prime mover neuron." But I digress...
How creepy to have Steve Buscemi as your addiction counselor. But I digress: it is a hard lesson that willpower doesn't work in recovery, as the drive to use operates outside the brain's executive function. Yet much of society sees addiction as a simple lack of willpower rather than the condition it is. Nice clip: thank you.