The Tuning Fork
What happens when your own perceptual instrument is calibrated differently, and why that turned out to be the whole point.
My partner said something to me a few years ago that I still think about.
We were in the middle of a conversation. I was explaining something I was excited about, going deep on some technical detail, building the argument layer by layer the way I do. He stopped me and said: "Do you realize you're not picking up on me not being interested at all in what you're talking about?"
He wasn't being cruel. He was being accurate. I had missed every signal. The shortened responses. The body language pulling away. The polite nodding that meant "I'm done" and not "keep going." All of it had sailed past me while I was inside the structure of my own thought.
That moment crystallized something I'd been circling for years. I don't read rooms the way most people read rooms. I never have.
I'm autistic. Formally diagnosed, ASD. And I have severe combined-type ADHD.
For most of my life I didn't have the vocabulary for either of those things. I just knew that certain parts of being human felt like they were running on a different operating system than what everyone else seemed to have. Eye contact felt like staring into the sun. Conversations had rules I could observe but not feel. Small talk was a protocol I memorized rather than a thing I enjoyed. And my attention was either a laser or a scattered mess, nothing in between.
I want to be clear: this is not a disability narrative. I'm not telling you this because it's inspirational. I'm telling you because these two things, the autism and the ADHD, turned out to be the reason I can do what I do. They're not obstacles I designed around. They're the instruments I design with.
The autism gave me something specific. Because I don't process social cues automatically, I had to learn them the way you learn an instrument. Theory, practice, application, more practice, repeating that loop until the thing that started as conscious effort became something closer to fluency.
I didn't learn what a smile means by feeling it. I learned it by observing patterns. Cataloging responses. Testing hypotheses in real interactions and noting what worked and what didn't. Someone leans forward, usually engagement. Feet pointing toward the door means they're ready to leave. Eyes breaking down and to the left, that's internal processing. To the right, they're constructing something to say.
Most people absorb this stuff unconsciously as children. Their social cognition is intuitive, built from millions of micro-interactions processed automatically by neural architecture that handles it below the level of awareness. They can read a room without knowing how they read a room.
I can't do that. What I can do is tell you exactly what I'm reading and why. I can articulate the patterns because I had to learn them explicitly. The mechanics under the intuition, the part that's invisible to everyone else, that's the only version I have.
Working the door at the nightclub was where this became an advantage instead of a limitation.
A thousand face-to-face interactions a night. A 300-capacity venue, people cycling through every couple of hours. I had to get fast at reading signals I couldn't process naturally. So I built systems. Specific cues mapped to specific responses. Posture combinations that predicted behavior. Vocal patterns that told me whether someone was escalating or winding down.
Analytical social cognition. The deliberate kind, where every read is a conscious act of pattern recognition, nothing like the warm, effortless version most people run on. Slower at first. But with thousands of repetitions per week, it got fast. And because it was explicit rather than intuitive, it was transferable. I could apply it to any social system, not just the one I grew up in.
Most people miss this about neurodivergent perception: when you learn a skill analytically that others learn intuitively, you end up understanding the mechanics at a level they never need to reach. A native speaker knows their language fluently. A non-native speaker who achieved fluency knows the grammar.
I know the grammar of social perception. And people interact with websites the same way they interact with other humans.
The ADHD is a different instrument entirely.
Some experiences are literally painful for me. Not metaphorically. Cluttered interfaces, confusing navigation, forms that ask for the same information twice, pages that make me hunt for the thing I came to find. These don't just annoy me. They hurt. The cognitive load registers in my body as a physical sensation, a tightness, an agitation, a need to escape.
What most designers theorize about, I feel.
That sensitivity is extreme. Most users experience friction as mild annoyance or unconscious abandonment. They leave a confusing site without knowing why they left. For me, the "why" is screaming. I can point at the exact element that broke the experience, name the cognitive load it imposed, and explain why it triggered the exit response. Because I live on the sharp end of that response every day.
The ADHD also gave me something I didn't expect: years of building mental prostheses.
If something is out of sight, it's out of mind. I mean that literally: if I can't see it, it doesn't exist in my working memory. So I keep post-it notes on my desk for today's tasks. Not because I'm organized. Because without them, I forget what I'm doing mid-sentence.
I have to actively "turn on" when listening to people. Processing the non-verbal cues, the implications, the context, the said-vs-unsaid. Neurotypical people handle this automatically. For me it's a learned skill, like a dancer controlling every muscle on stage. It looks natural from the audience. Backstage, it's deliberate effort.
Those prostheses, the external memory aids, the active listening protocols, the friction-reduction strategies, they map directly to design.
"Out of sight, out of mind" becomes a design question: what must we show versus what we want to show versus what we could show? Then cut everything except what we must show. Active listening becomes user empathy: who is this for, what's at stake, what's the desired outcome, what's being said and what's not being said?
The scaffolding I build for my own brain is the scaffolding I build for users.
There's a concept in accessibility called the curb cut effect. It comes from a literal observation: when cities started cutting curbs at crosswalks for wheelchair users, everyone benefited. Parents with strollers. Delivery workers with hand trucks. Travelers with rolling luggage. Runners. Skateboarders. A modification designed for a specific disability turned out to be better design for everyone.
My entire career in one concept.
Design is prosthesis for human cognitive functions and limitations. I know that because I've been building prostheses for myself my whole life. The strategies I developed to manage my own attention, reduce my own cognitive load, and compensate for my own processing gaps, those turned out to be exactly what users need, not because they share my diagnosis, but because the human brain has universal constraints that good design accommodates and bad design ignores.
The post-it note principle (out of sight, out of mind) applies to every user. If your call-to-action scrolls off screen, it stops existing. The active listening principle applies to every interface. If your design doesn't signal that it understands the user's goal, the user feels unheard.
What I built for my own cognitive accessibility helps everyone. The curb cut effect, in pixels instead of concrete. It keeps showing up in design because human cognition has limits, and designing for those limits is what good design is.
Around 2012, a friend I met at a Chicago meetup about emotions in video games suggested I try improv.
I went in already analytical. The only way I know how to go into anything. And improv, it turned out, was the bridge between the two halves of my perceptual system: the analytical social cognition from the autism and the raw sensitivity from the ADHD.
My improv teacher gave me a formula: (Listen) x (Act + React). The formula for being in the moment. If Listen equals zero, if you're not attending, then everything else multiplies by zero. Nothing you do matters if you haven't heard what's actually happening.
That formula hit me like a brick. It was the nightclub door distilled into mathematics. Every successful interaction I'd ever had, every table visit at 1:10, every greeting that activated attention instead of triggering a ritual, they all started with listening. With receiving the room before trying to change it.
Improv taught me other principles that turned out to be design tools, not metaphors.
"Yes, and." Accept what exists and build on it. Don't throw everything out when something feels wrong. Brick by brick, not a castle at a time. In design: inherit the existing brand, acknowledge the current state, then layer improvements. Stakeholders who feel heard will let you push further than stakeholders who feel overridden.
"If this is true, what else is true?" One insight cascades. If this demographic behaves this way on mobile, what else follows? If this is the brand voice on the homepage, it should be the brand voice everywhere. Follow the thread.
"It's not for you, it's for them." Protect the design from stakeholder preferences. Design for the target's autopilot, not your own comfort. Not the CEO's taste. Not the designer's portfolio. The user's actual perceptual experience.
These aren't improv rules borrowed as clever analogies. They're operational tools I use every week. When a client says "can we add a banner here?" my response is yes-and: "Yes, and if we add it, here's what it does to the user's attention at this point in the scroll. Is that the trade-off we want?" When I audit a site, I'm asking "if this is true, what else is true?" about every signal the design sends.
All of this connects back to the diagnostic process I described at the end of the last chapter.
Step 1 of the PFD diagnostic is Feel. Arrive at the page and let the emotional response fire before your conscious brain translates it into words. "This makes me feel X." The feeling comes before the language. System 1 before System 2.
That step, the pre-verbal emotional read, is where my ADHD sensitivity becomes the diagnostic instrument. Most designers can learn to do this. They can practice pausing before analyzing, noticing their gut reaction, giving the emotional response room to register. But for me, it's not optional. The feeling arrives whether I want it to or not, louder than it arrives for most people, impossible to ignore.
The emotional response IS the diagnostic instrument. Like a tuning fork that resonates at the slightest vibration while everyone else needs to put their ear to the surface.
This is the part of PFD that may not be fully teachable. I've thought about this honestly. The lens, the empathy-first approach, the "design for how they perceive, not how you wish they'd perceive" mindset, that's completely teachable. Give anyone that lens and they get better immediately. The pattern library, the hundreds of projects of instant recognition, that takes years. No shortcut. But the sensitivity, the neurodivergent tuning fork, the thing that lets me feel cognitive friction before I can name it? That might be native hardware.
The good news: the diagnostic process, Feel, Unpack, Diagnose, Prescribe, can be taught as a substitute. You don't need my nervous system to run the protocol. You need to practice the pause. You need to learn to notice your own pre-verbal reactions instead of skipping past them. You need to treat the emotional read as data instead of noise.
The tuning fork is calibrated differently in my case. But everyone has one. Most people just haven't learned to listen to it.
I want to be direct about what this chapter is really saying, because it's easy to read it as "neurodivergence is a superpower" and leave it there. That framing is too simple.
Autism and ADHD cost me things. Relationships where I missed signals for months. Jobs where I couldn't mask well enough. Days where the ADHD makes it impossible to start a task I genuinely want to do. The sensitivity that lets me feel friction in a design also means I feel friction in a grocery store, in a crowded room, in a conversation that goes sideways. It's not selective. It's all the time.
What I'm saying is narrower and more specific. The same cognitive architecture that creates difficulty in some contexts creates professional advantage in one specific context: understanding how humans perceive and process designed experiences. The analytical social cognition lets me see patterns that intuitive readers can't articulate. The friction sensitivity lets me feel problems that typical users walk past. And the years of building prostheses for my own brain taught me what good scaffolding looks like, because I can't function without it.
Not a superpower. A trade-off that happened to land in a useful place.
The improv filled the last gap. Before improv, I had the analytical read and the emotional sensitivity, but they operated in parallel. The analysis was one track, the feeling was another, and bridging them in real time was clumsy. Improv taught me to integrate them. To feel the room and act on the analysis simultaneously. To let the emotional read inform the analytical response without either one drowning out the other.
The diagnostic protocol came directly from that integration. Feel first: the ADHD instrument. Then Unpack what specifically triggered that feeling. Then Diagnose which of the five layers is the source. Then Prescribe what would resolve the violation without creating new ones. The analytical and the intuitive, working together, sequenced deliberately because that's how I learned to do it.
The tuning fork doesn't work without the analysis. The analysis doesn't work without the tuning fork. The methodology is the integration of both.
If you remember one thing from this chapter, make it this.
Your own perceptual system is an instrument. It has a calibration. It has biases and sensitivities and blind spots. The first step in designing for how other people perceive is understanding how you perceive. Not to center yourself in the process, but to know your instrument well enough to compensate for its tuning.
If you're neurotypical, your social cognition is probably intuitive. That means you'll read rooms quickly but struggle to articulate what you read. Practice the pause. Practice naming what you feel before you analyze it. Practice the Feel step until it becomes habitual.
If you're neurodivergent, your instrument is calibrated differently, not worse, not better, just tuned to different frequencies. Figure out what your calibration catches that others miss, and what it misses that others catch. Then build the protocol that accounts for both.
The diagnostic starts with Feel because perception starts before language. Before analysis. Before any conscious processing at all. If you skip that step, you're designing from your assumptions instead of from the user's experience. And your assumptions, no matter how educated, are not the same as the pre-verbal read.
The tuning fork doesn't lie. But you have to learn to hear it.
Next: The Foundation. On why working memory holds 3–5 things (not 7), and what happens when your site steals all of them before the user starts.
Key Terms
| Term | Definition |
| --- | --- |
| Analytical social cognition | Learning social cues through deliberate observation and pattern recognition rather than intuitive absorption. Slower to develop, but produces explicit, transferable understanding of the mechanics. |
| Friction sensitivity | The heightened awareness of cognitive load that comes with ADHD. Where most users experience friction as mild annoyance, friction sensitivity registers it as a physical, unavoidable signal. |
| Mental prostheses | External scaffolding built to compensate for cognitive gaps: post-it notes for working memory, active listening protocols for attention, progressive disclosure for information overload. The same scaffolding good design provides to all users. |
| Curb cut effect | A modification designed for a specific disability that turns out to be better design for everyone. Cognitive accessibility features work the same way. |
| (Listen) x (Act + React) | The improv formula for being in the moment. If Listen equals zero, everything multiplies by zero. The foundation of both good improv and good design. |
| The tuning fork | The pre-verbal emotional response that serves as the diagnostic instrument in Perception-First Design. Everyone has one. Most people haven't learned to listen to it. |