Feynman said if you can't explain it simply, you don't understand it. i took that literally and started explaining concepts to claude like it's never heard of them. no jargon, no shortcuts. plain language only.
what i didn't expect... claude pushes back. and it pushes back exactly where the explanation gets fuzzy. that's not a coincidence. that's the gap.
been running this on jake's material. lesson lands, i explain it back, claude presses on the parts that don't hold. what i thought i understood and what i can actually defend out loud are two very different things.
The AI isn't teaching you. it's exposing what you never actually learned.
anyone else using AI as a sparring partner?