AI Beyond the Chat
July 5, 2025

Why conversational design needs more pedagogy and less personality
We've trained AI to talk like us. But we forgot to teach it how to teach.
Every new chatbot or assistant is pitched as more "natural," more "human," more "fluent." But fluency without function is performative intelligence. What if, instead of optimizing for tone and charm, we designed for clarity, confidence, and learning?
When I think about AI product design, I'm not interested in whether it sounds like a person. I care about whether it helps people become more capable thinkers. Not just faster answer-seekers, but more reflective decision-makers.
Too often, conversational UX is treated as a style guide. Choose the right voice. Add a few emojis. Tweak your fallback phrases. But that isn't conversation. That's branding. If we believe in human-centered design, then we have to ask a better question: What makes a conversation meaningful?
The answer isn't personality. It's pedagogy.
Teaching Is the Interface
Teaching, in any medium, means revealing your thought process. It means supporting reflection, not just output. It means knowing when to step back and when to challenge. It means asking follow-up questions that expand the user's awareness of what's possible.
These aren't common features in today's AI tools.
An AI that teaches doesn't just say, "Here's what you should do." It says, "Here's how I arrived at that." It invites critique. It makes its reasoning visible. It supports confidence, not compliance. And it leaves room for the user to challenge, reflect, and explore.
If AI is going to act as a co-pilot, it needs to stop navigating for users and start navigating with them.
Natural Isn't the Same as Nurturing
Being "natural" doesn't make a conversation helpful. An assistant that sounds like your coworker might be charming, but charm doesn't scaffold understanding. Sometimes, "natural" conversation can feel too fast, too smooth, too agreeable.
Good teaching is not always smooth. It pauses. It adapts. It redirects. It makes you think.
Helpful AI isn't about being clever or casual. It's about being constructively curious. It asks you why you made a certain choice. It encourages you to try another perspective. It slows you down in the moments that matter.
What This Could Look Like
Imagine a writing assistant that asks what you're trying to say emotionally before suggesting edits. It doesn't just fix your grammar — it shows the pattern behind your argument and asks whether it reflects your intent. It highlights what's missing based on the purpose of your writing, not just sentence structure.
Or imagine a research assistant that shows you not just sources, but perspectives. One that flags when your citations come from a single worldview. One that suggests opposing views and invites deeper reflection instead of simplifying the issue into one answer.
This kind of AI doesn't just respond. It reveals. It doesn't just serve. It scaffolds.
What We Need Now
We don't need AI that sounds louder, faster, or more confident. We need AI that helps us pause, reflect, and learn. We need wiser tools, not smoother ones.
Most designers are still asking how AI should sound. But we should be asking: How should it help us think?
That's the future worth building.