Tech Leads, AI, and the Context Problem
Generative AI has moved from experimentation to everyday reality in engineering teams. Code assistants are now everywhere, and code production has accelerated dramatically. But behind the speed gains, a deeper shift is emerging: AI doesn’t just accelerate delivery, it reshapes where the real constraints live.
This is a revolution, but not a symmetric one. AI removes friction in writing code, while amplifying pressure everywhere else — review, integration, quality, and alignment. As a result, responsibility moves up the stack. The center of gravity shifts toward Tech Leads, Developer Experience, and the systems that hold teams together.
AI changes how developers write code, but it also destabilizes the entire software delivery pipeline unless context, standards, and constraints are made explicit and shared. Hence the rise of an emerging discipline: context engineering.
Geoffrey Graveaud, DevEx, Accelerate & Craft coach at Inside, and Arthur Magne, co-founder of Packmind, see this transformation daily in the field. They observe changing skill sets, rising expectations around Developer Experience, a sharp increase in review and cognitive load, and a growing realization: AI only scales when teams regain control of context.
In this conversation, they share a grounded perspective on what is really changing in engineering practices, how the Tech Lead role is evolving, and which levers help teams turn AI from a source of chaos into a reliable force multiplier.
What Has Really Changed in Developers’ Daily Work Over the Last Two Years With AI Assistants and Code Generation?
Arthur Magne:
From conversations with Tech Leads and teams, the most visible change concerns the developer’s posture. Writing code has never been the ultimate goal, but the profession was often perceived that way. With AI, that mindset is becoming obsolete: machines can generate code for us, and fast. Developers now have to express intent clearly, steer the AI, and keep perspective over the full delivery chain. Writing code without understanding its context is no longer sufficient.
This shift also reshapes roles. Some intermediate steps between design, engineering management, and development are shrinking. Developers move closer to the business, because the value lies more in understanding the requirement than in typing the code.
What we also see in the field is a gap between profile types. Developers with “medium” experience who didn’t catch the AI wave early are sometimes struggling, while “AI-native” juniors, capable of using tools without cultural friction, are pulling ahead. They may not always master theoretical foundations like compiler internals or language semantics, but they know how to use AI effectively. Their ability to market themselves hinges on that fluency: they come in with a new way of working.
Geoffrey Graveaud:
My perspective complements Arthur’s. Seniors who adopt AI also become extremely sought after. Many companies prefer to invest in experienced profiles that can deliver in days what a junior couldn’t produce under the same conditions. Their productivity explodes once they understand how to work with AI.
From my point of view, it’s not about youth or experience alone — it’s about training. Training still prepares engineers to write code, but too often doesn’t give them perspective on quality or testing. Ironically, AI reproduces the same mistakes humans make, but at much higher volume and speed. Classic problems don’t disappear — they arrive ten times faster and ten times more often. That’s why craftsmanship practices, attention to context, and design reflexes become essential.
“AI imitates what it sees: if the project is clean, structured, and tested, it amplifies the positive. If the project is unstable or poorly designed, it amplifies the flaws as well.”
Why Can AI-Driven Acceleration Paradoxically Increase the Mental and Operational Load on Tech Leads?
Arthur Magne:
During a webinar with the CTO of DX, one phrase struck me: “Code production has never been the real problem in the enterprise.” What’s challenging is the entire pipeline up to production, not just writing lines of code. If we amplify production with AI without addressing the rest of the chain, it creates imbalance.
Code arrives faster and in larger quantities, but the capacity for review, integration, testing, and observability doesn’t keep pace. The concrete result is Tech Lead overload: constantly reviewing, arbitrating, and securing output. Some studies already show measurable effects, with Tech Leads spending significantly more time on code review, sometimes dramatically more. Without investment in Developer Experience and good practices, Tech Leads end up crushed by the workload, with no clear positive impact on delivery.
Geoffrey Graveaud:
Exactly. On the mental load side, there’s a simple reality: difficult-to-read code consumes working memory. Code smells make code harder to understand and maintain. Our working memory has limited capacity — after a few dozen seconds, if elements aren’t anchored, we have to reread them. With AI, if you multiply the volume of poor-quality code, review becomes nightmarish.
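To make that working-memory tax concrete, here is a contrived Python sketch (not from the interview): the first version forces a reviewer to hold every nested branch in mind at once, while the second, behaviorally equivalent version can be read and checked one line at a time.

```python
# A deliberately hard-to-read version: the reviewer must keep every nested
# branch in working memory to know what actually comes out.
def proc(d):
    if d:
        if d.get("type") == "invoice":
            if d.get("amount") and d["amount"] > 0:
                return d["amount"] * 1.2
            else:
                return 0
        else:
            return 0
    return 0


# The same behavior with guard clauses and explicit names: each condition can
# be read, verified, and then forgotten, which is what keeps review load low.
VAT_RATE = 1.2

def invoice_total_with_vat(order: dict | None) -> float:
    """Return the VAT-inclusive total of an invoice order, or 0.0 otherwise."""
    if not order or order.get("type") != "invoice":
        return 0.0
    amount = order.get("amount") or 0
    if amount <= 0:
        return 0.0
    return amount * VAT_RATE
```

Neither version is hard to write, but only one is cheap to review, and review is precisely the step AI-generated volume puts under pressure.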
Context hasn’t changed: deadlines remain the same, environments are sometimes unstable, and quality pressure increases. You get a larger volume of code, more potential instability, and the same or higher expectations. Without improvements in pairing or review processes, burnout among Tech Leads and Lead Developers clearly rises.
The Tech Lead role must evolve. It can no longer be just the expert who codes the hardest parts. It becomes a mentor, a fast and effective reader of large codebases, and a facilitator of good practices. Tech Leads must also help developers take ownership of AI output: “You can use AI, but you must be able to explain why you made these choices.” The future Tech Lead must remain in constant learning mode, experimenting and spreading new practices.
Does Generative AI Force a Return to Craftsmanship and Platform Thinking?
Arthur Magne:
This transformation is exciting. We’ve been working on software quality for more than a decade — long before AI. Initially, the focus was mentoring and training developers. What really makes the difference is work on context: capturing decisions, rules, and conventions, then transmitting them to developers. Today, we must do the same — but also transmit these practices directly to AI agents.
The core subject hasn’t changed: structure context so it’s readable, up to date, and understandable by humans — and now by AI as well. Precision in wording, intent, and structure matters more than ever. Practices that secure code remain indispensable. In reality, the software factory is no longer just tools or IDEs — it’s the entire context chain the team builds. If that chain is solid, AI accelerates. If it’s fragile, AI amplifies risk.
How Is the Tech Lead Role Evolving?
Geoffrey Graveaud:
What we observe begins with behavior changes before formal role changes. Tech Leads and Lead Developers must rebalance their posture: slow down on certain commitments, reduce deadline pressure, and put quality and stability back at the center. Over long-term projects, this shift is essential.
The role moves toward coaching and cultural adoption. Tech Leads must go from theory to practice to convince teams. They lead by example, embodying craftsmanship practices and helping teams adopt new reflexes. New skills around technical vigilance are becoming essential — understanding AI evolution, vendor changes, and separating solid advances from hype.
Arthur Magne:
Across many organizations, one thing is clear: teams that invested in Developer Experience before AI arrived gained a visible advantage. Internal platforms, guides, and rules — often undervalued work — become critical today. They improve onboarding, productivity, and AI effectiveness.
Not every company has a DevEx team, and even when they do, it’s often under-resourced. Tech Leads must step into that gap: contribute to DevEx, transmit practices, and integrate AI into team workflows.
What Helps Relieve Tech Leads and Improve DevEx?
Geoffrey Graveaud:
The starting point is working seriously on AI adoption. We can’t just stack tools; we need to understand what works and why. Large, evidence-based studies point to specific structural capabilities that make AI adoption effective. Having a dedicated group to test, synthesize, and guide these practices can dramatically reduce the burden on Tech Leads.
Arthur Magne:
The renewed focus on AI has highlighted subjects some organizations previously undervalued — documentation is a good example. Today, its value is obvious. Context matters, and it’s rarely as unique as teams believe.
In practice, the first step is building a real DevEx approach tied to AI. Adoption doesn’t happen organically at scale. You need concrete, high-value use cases to build confidence. That’s where context engineering becomes critical: structuring documentation, sharing it, maintaining a common base of prompts, instructions, and best practices.
Many teams start with a shared repository of prompts, which is a fine beginning — but insufficient once you aim for real efficiency. What’s missing is consolidation, governance, maintenance, and distribution.
This is exactly the need Packmind addresses. The goal is to capture, share, and evolve standards and operating modes — how to write code, how to test, how to reason — across humans and AI agents, while remaining tool-agnostic.
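As a rough illustration of the “maintenance and distribution” part only (this is not Packmind’s implementation, and every path and file name below is hypothetical), a team could start with a small script that pushes one canonical set of standards and agent instructions into each repository:

```python
"""Minimal sketch of distributing shared context: copy one canonical set of
standards and instruction files into every team repository so that humans and
AI agents read the same, versioned context. All paths are illustrative."""
from pathlib import Path
import shutil

# Hypothetical canonical context: coding standards, a review checklist, and
# the instruction file most code assistants pick up from the repo root.
CANONICAL_DIR = Path("engineering-context")  # assumed checkout of a central repo
SHARED_FILES = ["CODING_STANDARDS.md", "REVIEW_CHECKLIST.md", "AGENTS.md"]
TEAM_REPOS = [Path("repos/checkout-service"), Path("repos/billing-service")]

def sync_context(canonical: Path, repos: list[Path], files: list[str]) -> None:
    """Copy each canonical file into each repo, reporting what changed."""
    for repo in repos:
        for name in files:
            source = canonical / name
            target = repo / name
            if target.exists() and target.read_bytes() == source.read_bytes():
                continue  # already up to date
            shutil.copyfile(source, target)
            print(f"updated {target}")

if __name__ == "__main__":
    sync_context(CANONICAL_DIR, TEAM_REPOS, SHARED_FILES)
```

The point is less the script than the discipline it enforces: a single versioned source of truth that both humans and AI agents read, instead of prompt copies drifting independently in every repository.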
Final Message to Teams Navigating AI Adoption
Arthur Magne:
The key idea remains the same: context is king. Without clear technical and business context, AI cannot deliver its full potential. Concepts like specification-driven development only work if the foundations are solid. Context engineering is what turns AI into an amplifier rather than a debt multiplier.
Craftsmanship practices and lean principles haven’t disappeared — they’ve become mandatory. Convincing organizations of this was hard before AI. Today, it’s obvious. We need more engineering, more experience, and more rigor.
AI adoption should be incremental. Add context gradually. Capture it better. Share it. Industrialize step by step. Only then can more advanced AI workflows emerge.
Geoffrey Graveaud:
Tech Leads cannot absorb all of this alone. Culture must evolve. Coaching, mentoring, and knowledge sharing become central. At the same time, every dimension of quality — tests, readability, conventions, reviews — has never been more important. AI amplifies everything, good and bad. Teams that invest in these fundamentals will gain a lasting advantage.