LOBUS.WORKS

Every engineering leader has an AI strategy.
Most of those strategies are incoherent.

The gap between your CEO’s AI ambitions and your engineering org’s actual readiness is where value gets destroyed. I help close it.

Patterns I see repeatedly

01

The delegation trap

The CEO declared an AI-first strategy. A committee was formed. Nobody owns outcomes. Six months later, 20 engineers have Copilot licences, no experiments have shipped, and your best people are interviewing at companies that let them actually use the tools.

02

The amplifier blind spot

AI amplifies what you already have. If you have strong engineering practices, clear ownership, and small autonomous teams, AI will accelerate you. If you have manual QA, political capital allocation, and 12-person code reviews, AI will amplify the dysfunction. Most organisations are investing in AI tools before fixing the foundations.

03

The lane confusion

Your CEO tells the board you’re an AI innovator. Your CISO blocks every tool request. Your legal team takes 90 days to review a $500/month SaaS licence. Your hiring process bans AI use in interviews. You haven’t picked a lane — you’re in all of them at once, and the inconsistency is costing you credibility, talent, and time.

AI adoption is a change management problem

Adopting AI is not like rolling out a new CI/CD tool or migrating to the cloud. It’s closer to managing through an acquisition. The tech stack gets audited. Roles get redefined. Structures that were designed for a different era need to be redesigned for the one you’re entering.

The organisations getting outsized returns from GenAI didn’t start by selecting tools. They started by investing in culture, engineering excellence, and developer experience. They created safe experimentation environments. They assigned dedicated leadership. They accepted a short-term productivity dip to build the capabilities that would pay off at scale.

The ones that failed delegated it to a committee, required proven ROI before any experiment, blocked tool access behind 90-day legal reviews, and then wondered why their best engineers left for companies that took AI seriously.

How I work with engineering leaders on GenAI

01

GenAI Readiness Diagnostic

2–4 weeks

I assess your engineering maturity, team culture, developer experience, and governance readiness for AI adoption. You get a clear picture of where you are, where the bottlenecks will appear, and what to fix before accelerating. Not a survey — a diagnostic based on interviews, artefact review, and structural analysis.

02

AI Operating Model Design

3–6 months

I work with your leadership team to redesign team structures, delivery governance, and accountability chains for AI-augmented engineering. This includes change management strategy, experimentation frameworks, skills development, and the organisational design changes required to capture real value from GenAI.

03

Executive AI Strategy Workshop

1–2 days

I facilitate your leadership team through the strategic decisions most organisations skip: Where do you sit on the AI adoption curve? What does that actually mean for policies, hiring, investment, and team structure? What’s your exposure if you move too slowly — or too fast? You leave with alignment, not a slide deck.

Why me

I’m not a consultant who studies AI adoption. I’m a VP who led a GenAI research team and shipped an internal product that improved developer efficiency by 60%. I navigated the politics, the CISO negotiations, the board expectations, and the talent retention challenges — at enterprise scale, in a regulated industry.

I combine organisational systems coaching, engineering leadership, and executive coaching — the three disciplines required to make AI adoption actually work. Because the problem is never the model. It’s always the organisation.

I’m writing a book on governing AI-assisted software delivery at enterprise scale — grounded in systems safety engineering, aviation human factors, and financial controls. The thinking goes deeper than a slide deck.

Let’s talk about your AI strategy

Get in touch