The 4-Lens Scan: 90 Seconds to See What Algorithms Hide

Sara was a city planner in Monterrey when her safety app flagged a “suspicious” figure near a bus stop. She almost acted on it. Then she paused — and recognized her neighbor’s cousin, waiting in the rain.

That pause is what the 4-Lens Scan is about. Four questions, ninety seconds: Who becomes invisible when we optimize? What assumptions are hiding in the code? What are the long-term ripples? And what’s actually driving me right now — reality, or the story the algorithm is telling?

Sara’s experience is a reminder that AI doesn’t just process data; it shapes perception. The 4-Lens Scan is designed to interrupt that shaping long enough for you to see what’s actually there.

Sara went on to implement what she called “Algorithmic Pause Points” across the city’s operations: institutionalized moments requiring human review before accepting AI recommendations. Permit approvals for underserved neighborhoods went up 34% when operators realized that “missing documentation” often just meant forms in unrecognized languages.

The scan is quick enough to become habit, and that’s the point. It’s not meditation; it’s closer to checking your mirrors before changing lanes.

The 4-Lens Scan is one of 21 practical tools from AI and the Art of Being Human by Jeffrey Abbott and Andrew Maynard. The characters and narratives in the book are fictional — designed to reveal truths about AI and being human that only stories can capture.