Activity

Memberships

Learning with AI & Technology

134 members • Free

10 contributions to Learning with AI & Technology
Thursday 2/19/26 AI coding
Hello Folks. I was offline for the last two meetings of the course, but I hope to reconnect soon. I can't recommend this YouTube video highly enough: https://www.youtube.com/watch?v=bDcgHzCBgmQ The poster is Nate B Jones. I recommend anything by him, though he is technical and not an easy listen. As you may know, AI is now writing computer code. Nate's main point here is that the BIG change that has to take place in this transition from human to AI code writing is that the humans have to adjust. Developers who wish to stay employed have to shift their skills toward specification writing and scenario development. I would be happy to ATTEMPT to answer any follow-up questions.
Sunday 2/8/26 Learning vs Complete Mastery
I am struck by the idea of effort in learning. Since age 22, I have pretty much had the luxury of learning what I want to learn. Mainly, being a spoiled child, I learn what I want; once I am not enjoying the learning, I stop. In learning, my main goal is usually not to move the information to muscles. I do not need muscle memory of a Rubik's Cube strategy; I can just look at the book ONCE I understand that there is a path. Same with solving a system of equations. I move knowledge from external sources (books, YouTube, etc.) to personal control (notes, searchable material) and usually stop there, not really moving all the material into my own brain. Just enough understanding to access the notes.
Wednesday 2/4/26 Teach your colleagues, don't Hide It
We have all seen the employee who has a skill (learned how to access the SQL database, say, or use the whiteboard) and thinks their job security depends on keeping that skill to themselves. They are wrong. If you want real job stability, as soon as you learn something, offer to teach it to others; offer a lunch talk. Managers are always looking for someone who can master a given assignment and is available to take on an even more challenging one.
Tuesday 2/3/26 Education as out-of-the-box reinforcement
Could AI be a major education disruptor? Too early to tell, but that allows me to speculate without the drag of data! Not simple disruptions, like requiring all tests to be supervised in school so AI is not used, or lectures given by AI, etc. Maybe the whole emphasis switches to how to develop students with curiosity. Mastering skills and material is out. The one attribute we must grow is motivation to attack problems: thinking out of the box. A whole new education system that no longer pushes mastery of French but bad-ass curiosity. Of course one has to master material, learn what is in the box, before one can think outside of it. A major change in emphasis. First we reward mastery; then, oops, sorry, now instead of understanding all that has gone before, we need you to think about why A is important, and what if B were the true guiding principle. What would one put in this new curriculum? Visual illusions: you know, those pictures where you look and see two women, then look longer and see a compote. One line looks longer; no, they are the same length. The mystery has one ending; no, here is another ending that fits the facts also. Luckily I only have to outline the ideas, not come up with the out-of-the-box curriculum. Thoughts?
0 likes • Feb 3
NotebookLM commented on the above: The provided text explores the potential for artificial intelligence to fundamentally redefine the objectives of the modern educational system. Rather than prioritizing the mastery of information or specific technical skills, the author suggests that schools should pivot toward fostering intense curiosity and innovative thinking. This pedagogical transition would emphasize the ability to challenge existing norms and approach problems from unconventional angles rather than simply absorbing established facts. The author envisions a new curriculum centered on mystery and subjective interpretation, using tools like visual illusions to encourage students to see beyond surface-level truths. Ultimately, the source argues that while foundational knowledge remains necessary, the primary goal of future learning must be the cultivation of motivation and an inquisitive mindset.
0 likes • Feb 3
Claude (free tier) then had the final word. You're touching on something really fundamental here: the tension between knowledge acquisition and intellectual autonomy that AI is forcing us to confront much faster than we expected. Your core intuition feels right: when information mastery becomes trivially accessible through AI, education's value proposition has to shift. But I think the interesting question isn't whether to abandon mastery for curiosity, but how the relationship between them might fundamentally change. The visual illusion example is revealing. Those illusions are powerful precisely because you need some mastery first; you have to understand visual perception well enough to recognize when it's being subverted. The "aha" moment requires both knowledge and the willingness to question it. So maybe the new model isn't "curiosity instead of mastery" but something like "mastery in service of productive confusion"? What strikes me about your speculation is that it's not entirely new; it echoes progressive education movements, Socratic dialogue, scientific method training. The difference is that AI might finally make this economically viable at scale. When you don't need humans to deliver rote content, you can redirect educational labor toward what's genuinely hard: cultivating intellectual courage, comfort with ambiguity, the ability to ask better questions. For curriculum, I'd push beyond visual illusions to things like:
- Competing explanatory frameworks for the same phenomenon (why did Rome fall? Here are five legitimate scholarly interpretations)
- Problems with no clear solutions, where the quality of thinking matters more than the answer
- Deliberate exposure to being wrong and recovering from it
- Metacognitive training: learning to notice your own assumptions and biases
The real challenge isn't designing such a curriculum; educators have been doing versions of this for decades.
It's the systemic friction: assessment systems built around measurable mastery, parent expectations, credential signaling, educational inequality. How do you "test" curiosity at scale? How do you ensure equitable access to human mentorship when that becomes education's core value?
Monday 2/2/26 Thought
AI will force education to move from problem solving to problem finding. In this rant I am going to fail measurably by trying to do two things. The first is the above idea. It comes from the YouTuber Nate B Jones. I usually suggest that only technical people listen to him, but I am breaking the rule. Now we move to the second "thing". I have used NotebookLM to parse the Nate piece. If you have a Gmail account, send it to me at [email protected] and I will send you an invitation to view this Nate piece in a notebook. It is all free. Back to thing 1, Nate. Nate points out that the education system has historically optimized for problem solving, but the AI era will increasingly reward "problem finding" and the ability to frame the right questions. Furthermore, as technical skills like programming become commoditized, human value will shift toward taste, judgment, and the ability to curate high-quality outputs from a sea of AI-generated options. The above summary was written for me by NotebookLM. Come see what I am talking about.
0 likes • Feb 3
I forgot to give the URL of the Nate post. You do not need to watch it to participate in the NotebookLM game above, but you can also just ignore the game and watch Nate! https://www.youtube.com/watch?v=pxuXV3Q6tGY
1-10 of 10
Tom Pears
2
5 points to level up
@tom-pears-2925
Long retired, 80 years young. 12 years teaching, 30 in high tech, deep into helping people understand and use AI

Active 19d ago
Joined Jan 25, 2026