GPS arrived, and at some point we stopped reading maps. There was no decision made — it just happened, almost imperceptibly, over a few months of simply not needing to. A similar shift is now underway in finance: AI is taking over tasks that until recently counted as skilled, hands-on work. According to Deloitte, 87% of CFOs say AI will be very or extremely important to the finance function in 2026. The age of paper maps, it seems, has passed here too.
McKinsey reports that finance teams are already using AI to close reporting faster, build more accurate forecasts, monitor working capital in real time, and surface new savings opportunities. Which means the CFO role is starting to look different. Where it once meant someone who pulls the numbers together, it now means deciding which tasks can be handed to AI and which cannot, and where human judgment needs to stay in the loop before automation starts producing fast, confident, and wrong conclusions.
We spoke with Stefan Boehmer — CFO and AI transformation specialist in finance. His background is unusual for this space: he has worked across industries ranging from basic research and medical devices to mobility and material handling/logistics, before spending years at the intersection of finance, operational efficiency, and AI implementation. In this interview, he reflects on why companies risk losing the ability to think for themselves if they move too fast, how the CFO role is changing under automation, and why the defining scarcity of the coming years may have nothing to do with money — and everything to do with AI literacy.

2Digital: In your view — what is a CFO actually for in the age of AI?
Stefan: AI will replace 70% of finance tasks, because let’s be honest, most of what a finance person does is repetitive. Closing the books happens every month. Paying invoices — same thing.
We now need to elevate the CFO position. We have to stay relevant.
To do that, we need to look at the bigger picture — and first, we have to tell a story. It’s no longer enough to just report a number. If a company sees a revenue drop, the story looks completely different depending on whether a customer delayed an order by a month, or the company actually lost significant order intake.
I also believe the CFO role will shift more towards the technical side. We have to understand technology, apply it, orchestrate it. We used to have one ERP system. Now there are 50,000 finance apps to choose from. We need to know which ones to pick, how they interact, and what breaks if one goes wrong. When I say apps — I mean agents. That’s the world of AI.
And the last piece: people. People will be our differentiator versus AI. Relationships become far more important — empathy, motivating the team, engaging the team. In a nutshell: IQ — intelligence; TQ — technical understanding; EQ — emotional empathy. CFOs who develop all three will stay relevant.
2Digital: AI makes it easy to build persuasive narratives around anything, even when the data is weak. How do you stay grounded in the facts when you can’t hand-check everything?
Stefan: It’s a fair point. Before GPS, I drove with maps and common sense. Sometimes easy, sometimes not, but I knew how to find my way. With GPS, most people don’t know how to navigate without it anymore. They’ve de-learned it, or never learned it at all.
The same risk exists with AI. In my opinion, people cannot outsource creativity and curiosity. We also have to keep in mind that AI doesn’t develop new ideas. It gives answers based on existing data. So if you want a genuine breakthrough, something truly creative — that has to happen outside of AI.

AI can help you find out what other developers are working on, what other companies are doing. But that’s copying. If you want to disrupt a market, if you want to build something new — you have to put your hands on it yourself.
2Digital: How do you know when a team is ready for AI adoption?
Stefan: I don’t think there’s a benchmark for that.
I do believe that you always have to bring the people along first. Start with an AI literacy journey — explain what AI is, show the benefits, and then develop the use cases together with the team.
In my last role, we automated customer invoicing. We had 600 to 1,000 invoices a month — boring, repetitive, and people were making mistakes. People were leaving the company because they didn’t want to do that month in, month out. Once we automated it and raised the quality, the work got easier for everyone around it. The team could see and feel the difference.
As for choosing what to automate — I always go by use cases. AI is a tool, not a solution. You start with a process that isn’t delivering the outcome you want, you define a use case with clear KPIs, and then you prioritize the ones with the highest return. That could be a quick win with low investment, or it could be something more complex that’s a genuine bottleneck for the business.
And I’d always look at RPA and traditional process improvement first, alongside AI. AI is not the right answer for everything.
2Digital: Can some decisions already be fully outsourced to AI?
Stefan: Absolutely, but it depends on which decisions, and what oversight you put in place.
In finance, we have to explain everything to auditors. So we need to understand how AI makes decisions and what it decides. Take incoming supplier invoices that aren’t PO-based — someone on the team has to route them for approval. And they don’t always remember who they sent a similar invoice to last month, or two months ago. On top of that, repetitive invoices end up coded to different accounts, because manual coding is inherently inconsistent.
AI can handle those routing and coding decisions. But in the end, someone still needs to look at it and confirm it’s correct. You always need that human checkpoint, otherwise you can’t rely on the output.
It’s the same with surgery. There are operations where AI robots perform the procedure. But the surgeon is still there, and the surgeon is still responsible. Work can be outsourced. Responsibility cannot.
2Digital: You ran an AI literacy program before deploying anything. But most CFOs don’t have six months for internal education.
Stefan: Go slow, take off fast.
Establish the foundations, do the training properly — then you scale, then you go fast. A lot of companies skip that entirely and jump straight into execution. That’s why 95% of AI use cases fail. They fail because of people. They fail because the use case was never clearly defined in the first place.
2Digital: Should employees find and pay for their own AI tools, or does the company need to provide them centrally?
Stefan: The company needs to provide them. And as a CFO, I have clear reasons for that.
First, most personal licenses aren’t valid for business use — different license types, and you’d be violating agreements. Second, if everyone picks their own tools, you lose synergies and you lose consistency. We saw exactly that in engineering teams where people used different design tools — never in the company’s best interest.
It goes beyond just picking the right tool, though. You need to understand the privacy implications: where is the data stored, what happens to it, what are the associated risks? That’s why I believe you need an AI committee — a group that reviews tools, understands what’s actually under the hood, and flags the risks before anything gets deployed.
2Digital: Traditional institutions are losing credibility and can’t keep pace with new technology. Some professors admit their students know the tools better than they do. So who is globally responsible for knowledge transfer now? Maybe Google should become the new university?
Stefan: The problem with that, of course, is that you need education independent of any single company. Google would only teach Google — not Salesforce, not anything else.
For me, this is fundamentally a local and regional challenge. Corporations have to work closely with educational institutions to define what they’ll actually need in employees five to ten years from now. A good example was COVID — companies hired IT people en masse because everyone went remote. Students saw the demand and flooded into IT degrees. Three or four years later, when they graduated, nobody was hiring — because the market was already saturated from the COVID wave.
We need to look further ahead. And the work has to start earlier than university. Universities expect students to arrive ready to learn; they shouldn’t be teaching the basics. That means starting in high school. There’s actually a lot happening on this front in Texas right now. The University of Texas at Dallas just received a $4 million fund that includes local high schools like Uplift Education.

I’m mentoring several students at the moment, and I’m genuinely impressed by the tools they bring. But I also see a sharp divide. Students have all these powerful, sophisticated tools — and no deep expertise. I go into companies, and they have deep expertise — but they don’t understand the tools. We have to bridge that gap. Bring both sides together, leverage each side.
2Digital: You frame AI as augmenting human potential — but we’re seeing massive layoffs and people struggling to find work. How should a CFO draw the line between augmentation, role redesign, and headcount reduction?
Stefan: I think what we see in the press is misleading.
A lot of companies that are cutting headcount simply hired too many people during COVID and are now correcting that. They call it AI-related — but it isn’t. Then you have companies like Microsoft investing so heavily in data centers that they don’t have enough bottom line left, so they make a trade-off. They’re not laying people off because AI replaced their work. They’re laying people off to fund infrastructure.
Look at who’s actually doing the cutting — Salesforce, Google, Meta. These are the same companies driving the AI boom. If you go into the real economy, I don’t see widespread AI-driven layoffs, because the adoption rate simply isn’t there yet. We’re still figuring out what to do with it.
That said, 2026 is the year of execution. We’ve had two years of experimentation, we know what works, and now we implement. And if you implement well, AI is like any other automation tool. No different, fundamentally. The tools themselves will become a commodity — so many agents, so many AI companies competing, prices will drop. The real differentiator will be execution. How well do you deploy it? How well do you connect it? How well do you train your people?
2Digital: You wrote that the new poor won’t be defined by money, but by AI knowledge. What did you mean by that?
Stefan: I’m not concerned that AI will take jobs. What I am concerned about is that AI will create a deep divide — between people who understand it and people who don’t. And it won’t be AI taking jobs from humans. It’ll be people with AI knowledge taking jobs from people without it.
The important thing is — AI can be learned. Even people in poor countries, with limited resources, can move up through AI literacy. And we’re already seeing that. More and more AI experts are emerging from outside the US, from all over the world. They’re building solutions, developing agents, and bringing fresh ideas.
But the people who don’t learn it — or can’t learn it because they lack access — will be left behind. The access gap is real in parts of the world where the basic infrastructure isn’t there yet. And then there’s a different group: people who simply don’t want to learn, often the older generation, who don’t see the benefit or resist the change.
Either way, the outcome is the same. You get left behind.
2Digital: If there’s something important that rarely gets asked, something you think people consistently miss about AI transformation — what is it?
Stefan: People, platform, performance. In that order.
You start with the people, you select the right tool, and then you execute. Simple framework — but most companies get it backwards.
And here’s the thing about AI: it makes everything faster. But if bad data goes in, you get bad results faster. So you have to deeply understand what you’re doing and where you’re implementing it. That’s why I see more and more forward-deployed engineers coming into companies to optimize processes from the inside — because transformation is company-specific. Where do you stand? What are your bottlenecks? Then you improve.
I’m a trained Six Sigma Black Belt. For me, whether it’s AI or not, the process is always the same — you start with the status quo, define the KPIs you want to hit, and then implement the changes. AI doesn’t change that logic. It just raises the stakes if you skip it.

