<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Education on John Schultz</title><link>https://johndschultz.com/tags/education/</link><description>Recent content in Education on John Schultz</description><generator>Hugo -- 0.155.3</generator><language>en-us</language><lastBuildDate>Mon, 30 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://johndschultz.com/tags/education/index.xml" rel="self" type="application/rss+xml"/><item><title>Teaching What You're Still Learning</title><link>https://johndschultz.com/thoughts/teaching-what-youre-still-learning/</link><pubDate>Mon, 30 Mar 2026 00:00:00 +0000</pubDate><guid>https://johndschultz.com/thoughts/teaching-what-youre-still-learning/</guid><description>The most honest form of teaching happens when you&amp;#39;re still fixing the thing you&amp;#39;re teaching about. The gap between what you teach and what you&amp;#39;ve mastered isn&amp;#39;t hypocrisy. It&amp;#39;s the curriculum.</description><content:encoded><![CDATA[<h2 id="the-observation">The Observation</h2>
<p>In January I stood in front of 100 auctioneers at a state convention and told them &ldquo;AI assists. You decide.&rdquo; Three hours of live demos, chaining tools together, a 7-day challenge. I showed them 200 hours of reclaimed marketing time. The room bought in. The closing line about the train leaving the station landed.</p>
<p>In February I started one-on-one coaching with a colleague, walking her through the same tools at a slower pace. In March I did it again for 20 community leaders on a Zoom call. And now I&rsquo;m building a 3.5-hour continuing education course that&rsquo;s basically the same material refined a fourth time.</p>
<p>Two days before that course research started, I found a bug where stale cached data was silently injecting garbage into every outbound email for a full day. My own AI memory system, the one I&rsquo;d built to make my tools smarter, was serving bad data because nobody built in decay.</p>
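<p>The missing safeguard is a standard one: give every cached entry a time-to-live so stale data expires instead of being served forever. A minimal sketch in Python, with hypothetical names (this is not the actual system's API):</p>

```python
import time

class MemoryStore:
    """Key-value memory with time-based decay.

    Illustrates the safeguard the bug was missing: entries older
    than ttl_seconds are evicted, not silently served.
    """

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, stored_at)

    def put(self, key, value):
        self._data[key] = (value, time.time())

    def get(self, key, default=None):
        entry = self._data.get(key)
        if entry is None:
            return default
        value, stored_at = entry
        if time.time() - stored_at > self.ttl:
            # Stale: evict and refuse to answer rather than serve garbage.
            del self._data[key]
            return default
        return value
```

<p>Returning nothing on a stale hit forces the caller to refetch, which is louder, and safer, than quietly reusing yesterday's data.</p>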
<p>And two weeks before that, I realized I&rsquo;d given a consulting client the exact triage framework I haven&rsquo;t applied to my own company. Document processes before hiring. Sort decisions into only-you, delegatable, and automatic. I diagnosed his problem in ten minutes and haven&rsquo;t solved the same one at my company in fourteen years.</p>
<h2 id="what-i-think-is-actually-happening">What I Think Is Actually Happening</h2>
<p>There&rsquo;s a version of expertise where you only teach what you&rsquo;ve fully mastered. That version is clean and safe and mostly useless, because by the time you&rsquo;ve mastered something in technology, the thing you mastered has already moved.</p>
<p>The alternative is teaching from the middle of it. I&rsquo;m teaching auctioneers to use AI tools while simultaneously discovering that the AI systems I&rsquo;ve built have fundamental design flaws. I&rsquo;m telling rooms full of people to trust but verify while my own cache was serving unverified garbage for a full day without anyone catching it.</p>
<p>That&rsquo;s not disqualifying. It might be the whole point.</p>
<p>The convention seminar worked because I could show real numbers from real campaigns. The leadership session worked because I told two stories where I learned something by getting it wrong first. In both of those stories, the AI framing had created tunnel vision, and I only caught it by flipping the prompt. The one-on-one coaching works because my colleague watches me problem-solve in real time, not deliver a finished product.</p>
<p>Every good teaching moment I can point to came from something I was still figuring out. None of them came from polished expertise.</p>
<h2 id="the-principle">The Principle</h2>
<p>Mastery isn&rsquo;t a prerequisite for teaching. Proximity is. The person one step ahead of you in the fog is more useful than the person who mapped the whole terrain from a helicopter. They know where the ground is soft because they just stepped in it.</p>
<p>The memory bug made me a better teacher of AI skepticism than any slide about hallucinations could. Realizing I hadn&rsquo;t taken my own advice makes the triage framework more honest, not less. I can say &ldquo;I haven&rsquo;t done this either, and here&rsquo;s what it&rsquo;s costing me.&rdquo;</p>
<p>The gap between what you teach and what you&rsquo;ve mastered isn&rsquo;t something to hide. It&rsquo;s the most credible thing you bring to the room.</p>
<h2 id="whats-open">What&rsquo;s Open</h2>
<p>The CE course research covers the tools and trends well enough. But does it include enough of the &ldquo;here&rsquo;s where I got it wrong&rdquo; stories? The convention and leadership sessions landed hardest on the failures. If the course leans too much on the polished version, it&rsquo;ll lose the thing that actually teaches.</p>
<p>There&rsquo;s also a tension between the &ldquo;AI assists, you decide&rdquo; framing (which puts the human in control) and the memory bug insight (which shows the AI quietly making decisions without anyone noticing). The first is aspirational. The second is operational. The course probably needs both.</p>
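<p>One way to hold both framings at once is structural rather than aspirational: the tool can draft anything, but nothing irreversible executes without an explicit human yes. A sketch of that gate, with hypothetical function names:</p>

```python
def send_with_approval(draft, approve, send):
    """AI assists (the draft); a human decides (the gate).

    approve: callable taking the draft, returning True or False.
    send: callable that performs the irreversible action.
    Nothing goes out unless a person explicitly says so.
    """
    if approve(draft):
        send(draft)
        return "sent"
    return "held for revision"
```

<p>The point of the pattern is where the default lands: when the human does nothing, nothing happens, which is the opposite of a cache quietly deciding on its own.</p>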
<p>And the consulting work leaves one more open question: have I documented the &ldquo;why&rdquo; behind my own systems? The teaching version of that question: am I teaching people to use the tools, or am I teaching them how to think about when the tools fail? Those are different courses.</p>
]]></content:encoded></item></channel></rss>