When I work with executive committees on their AI strategy, one question keeps coming up: "What do we do with our managers?" If AI automates coordination, analysis, and reporting -- 40 to 60% of a manager's time -- what is their new role?
To answer that question, I built the M3K framework: Mindset, Methods, Metrics, Knowledge.
The four dimensions of M3K
Mindset -- From "controller" to "orchestrator." The AI-Native manager orchestrates collaboration between humans and AI. They no longer check whether the report is properly formatted -- AI does that. They ensure that the question asked of the AI is the right one.
Methods -- New rituals. The standup becomes an "AI Review." Reporting is generated by AI; the manager interprets rather than produces. One-on-ones focus on human-AI collaboration skills.
Metrics -- Measure what matters: the quality of decisions made with AI assistance, the speed of adaptation to new tools, and measurable business impact.
Knowledge -- The manager as curator. Which models to use for which tasks? Which prompts work? The manager becomes the guardian of their team's "knowledge graph."
90-day plan
Days 1-30 -- Mindset diagnostic. Assess where your managers stand by observing their actual practices.
Days 31-60 -- Methods pilot. Start with 2 to 3 volunteer managers, introduce the new rituals, and capture first results.
Days 61-90 -- Metrics + Knowledge deployment. Roll out the new AI-Native metrics, stand up the AI knowledge base, and prepare for scaling.
In my experience, organizations that invest in managerial transformation before deploying AI at scale achieve adoption rates 3 times higher. M3K is a pragmatic tool, tested in real-world contexts.