DEF CON 27 Social Engineering Village Notes

Micah Zenko - Red Teaming Insights and Examples from Beyond

Notes By Aryan Giri

Speaker: Micah Zenko, Consultant & National Security Expert

Background: PhD in Political Science, former positions at Harvard and Council on Foreign Relations, columnist at Foreign Policy

Key Focus: Red teaming as a mindset and methodology beyond cybersecurity

Red Teaming: Beyond Cybersecurity

Micah Zenko presents a unique perspective on red teaming that extends far beyond traditional cybersecurity applications, framing it as a fundamental mindset and approach to critical thinking.

Core Definition: "Red teaming to me is really a mindset. It's an approach and a series of techniques. It's not a cyber thing."

The Problem: Why We Can't Think Critically

"Think about what you do each day. Nobody decides then and there what to do at 9:00 a.m. each day. You have a series of expected dress, expected rituals, norms and cultures. Over time, all of these behaviors shape and limit the way you think."

The De-skilling of Workers:

Zenko references Thorstein Veblen's observation that the transition from agricultural and craft work to manufacturing de-skilled workers, narrowing the range of what they were expected to think about.

CIA Example: A deputy director of intelligence described how she "became a CIA analyst" when turning into the wooded lot at Langley, stopping her free thinking. This demonstrates how organizational environments shape and limit thinking.

Why Red Teaming is Necessary

Fundamental Principle: "You can't grade your own homework. Organizations are systematically incapable of identifying their blind spots and challenging their assumptions."

Two Core Reasons for Red Teaming:

What's Between Our Ears

Cognitive biases and mental shortcuts that prevent objective self-evaluation

What's Between All of Us

Organizational pathologies that inhibit critical thinking and dissent

Cognitive Biases: The Individual Barriers

Status Quo Bias - The QWERTY Example:

The QWERTY Keyboard Legacy:

  • Invented by Christopher Latham Sholes in Milwaukee; the layout was designed to keep early typewriters' type bars from jamming
  • Sold to Remington rifle company in 1870
  • No technical reason for QWERTY dominance - takes 3-4 days to train out of it
  • People could type 30-40% faster with alternative layouts
  • Demonstrates how we accept inefficient systems as "morally correct"

Blind Spot Bias - The Meta Bias:

Research Finding: In a study of 7,700 Americans taking cognitive bias tests, only 1 in 7,700 said they were more biased than average. 85% thought they were less biased than others.

Doctor Example: 20% of doctors admit being influenced by pharmaceutical gifts, but estimate 80% of their peers are influenced.

Organizational Pathologies

Groupthink

An illusion of unanimity: people don't express what they actually believe, in order to maintain unit cohesion and morale

Hierarchy Problems

Leaders believe their doors are always open, but people don't voice dissent because they think it's pointless

Voicing Sideways

When people stop giving feedback upward and instead gossip with coworkers

Maverick Hunting

Organizations systematically "hunt down and kill" mavericks who challenge conventional wisdom

Cultural Impact: "You can have the best strategies, plans, processes, and people - if you have the wrong culture, it destroys them. Culture is harder to measure because it's in the walls of where we work."

Historical Examples of Red Teaming

The Devil's Advocate (1230-1982)

  • First institutionalized red team in the Catholic Church
  • Position created to find disconfirming information about saint nominations
  • Eliminated by Pope John Paul II in 1982
  • Result: More people were canonized in the following 25 years than in the previous 1,900 years
  • Pope John Paul II was himself canonized just 12 years after his death

Modern Red Teaming Origins (1950s)

  • Started at RAND Corporation in Santa Monica
  • "Red" referred to Soviet Red Army
  • Modeled NATO vs. Soviet strategic interactions
  • Foundations in brainstorming, behavioral sciences, game theory, and organizational theory

Types of Red Teaming

Simulations

Testing plans and strategies by envisioning failure scenarios. Used with large banks for data-breach response planning involving the entire C-suite and board.

Alternative Analysis

A series of techniques for challenging assumptions. Reference: the US Army Red Team Handbook, version 9.

Red Team Quadrant: The easiest target is adversarial/operational work (how the bad guys break in); the hardest is your own ideas and thinking ("pentesting" your own thinking).

Real-World Case Study: Healthcare Rounding Strategy

Situation:

  • 100 chief nursing officers at major healthcare company
  • Rounding strategy (team-based patient care) had failed
  • Doctors systematically didn't show up for rounds
  • Nurses primed to blame doctors for all problems

Red Team Intervention:

  • Pre-mortem analysis: Imagine strategy failed completely in one year
  • Constraint: Couldn't blame the doctors
  • Result: Identified real underlying issues

Discoveries & Solutions:

  • No common training standard → Implemented standardized training
  • High nurse turnover → Raised pay to reduce attrition
  • No family engagement → Set up 24-hour text notification system

Outcome:

Within 3-4 months, significant improvements in patient satisfaction metrics.

Facilitator Role: "I'm a disinterested outside facilitator with no vested interest in outcomes. I'm just good at facilitation and changed how they thought about the problem."

Red Teaming Best Practices

Senior Leadership Buy-in

If the boss doesn't care about the exercise, resource it, authorize access, and act on the findings, nothing else matters

Act on Findings

Doing red teaming and then ignoring the results is worse than not doing it at all; it signals that dissent isn't welcome

Pre-engagement Preparation

Spend weeks interviewing participants, sending preparatory emails, addressing concerns in advance

Maverick Approach

Avoid going it alone: individual mavericks are systematically hunted down in organizations, making the lone-maverick approach exhausting and ineffective

Leadership Therapy: "The first act of red teaming is really therapy with senior leaders - what do you think you care about most? They don't actually know what they care about most."

Becoming a Personal Red Teamer

Three Essential Practices:

1. Recognize and Mitigate Biases

Bookmark Wikipedia's list of 100+ cognitive biases. Before consequential decisions, review and identify which biases might be affecting your judgment.

2. Read Outside Your Interests

Average person visits 10-12 websites daily with almost no monthly variation. Reading within your field is maintenance, not critical thinking.

Zenko's Rotation: He rotates through eight outside interest areas each year (history, sociology, behavioral sciences, scientific topics)

3. Effective Voicing and Receiving

Learn to voice concerns effectively and as leaders, learn to receive feedback properly.

Power Cues That Inhibit Voice:
  • Being overly well-dressed
  • Cold office settings
  • Intimidating social media profiles (e.g., free climbing El Capitan)

Creativity and Innovation Through Red Teaming

Creating Adjacencies for Better Thinking:

Physical Removal

You can't think outside the box while sitting at your desk. Physically remove yourself from your work environment daily.

Moments of Transition

Highest creativity occurs during transitions (walking, jogging, incidental conversations) not during focused work

Incubation Effects

Immerse yourself in the details of a problem, then step away. The deeper the immersion, the more anchored you become; a period of forgetting is required to gain novel perspectives.

Research Basis: University of Chicago's Mihaly Csikszentmihalyi documented that peak creative performance happens during transitional moments, not focused work sessions.

Q&A Insights

Giving Criticism

Research shows people receive criticism better from peers than senior leaders. Criticism should be direct, actionable, and delivered by someone at their level.

Building Buy-in

Spend 6+ weeks pre-gaming with participants, addressing concerns, ensuring everyone understands the purpose and value.

Multidisciplinary Teams

"Tyranny of expertise" means technical experts become too immersed in their field to see discontinuities. Need diverse perspectives including non-experts.

Key Takeaways

Essential Red Teaming Principles:

  1. Red teaming is fundamentally a mindset, not just a cybersecurity technique
  2. Organizations and individuals are systematically incapable of objective self-evaluation
  3. Cognitive biases and organizational pathologies prevent critical thinking
  4. Successful red teaming requires senior leadership commitment and action on findings
  5. Personal red teaming involves recognizing biases, seeking diverse perspectives, and creating thinking adjacencies
  6. Creativity flourishes during transitions and incubation periods, not focused work sessions

Final Thought: "The surest way to make nobody ask a question is to say 'if there's no questions we can all go home' - but red teaming is about always asking the hard questions, even when it's uncomfortable."