
Written by Lukas Ebner

Friday, 4:20 PM. The project manager opens the time tracking dashboard and does the math: the team has burned through 78 of 100 budgeted hours. Progress according to last week's status call: roughly 60 percent. Roughly. He closes his laptop, drives home, and hopes the numbers will somehow work out. On Monday, they don't. Turns out it was closer to 45 percent.
Anyone who has worked at a digital agency or IT services company knows this moment. The only question is whether you catch it early enough — or at invoicing.
The Gap Between Gut Feeling and Reality
McKinsey and the University of Oxford studied large-scale IT projects with budgets exceeding $15 million. The findings are bleak: costs ran 45 percent over budget on average, timelines slipped by 7 percent, and the delivered value fell 56 percent short of expectations. One in six projects turned into a financial black swan — over 200 percent cost overrun.
You might argue that $15 million projects have little in common with a 20-person agency. Fair point. But the pattern scales down perfectly. A team that plans 100 hours for a web project and realizes at hour 80 that they're only halfway done faces the same structural problem as a Fortune 500 company. Just with less cushion.
Project controlling is the attempt to close this gap between gut feeling and reality in a systematic way. No magic. No enterprise-grade ERP rollout. Just: look at the numbers regularly, compare, adjust.
Why Textbook Methods Don't Work for Service Companies
The classic definition of project controlling — planned vs. actual comparison, Earned Value Analysis, milestone trend analysis — comes from a world of construction projects and heavy industry. The German standard DIN 69901 defines it as securing project goals through continuous comparison, deviation analysis, and corrective measures.
Sounds reasonable. And it is — in principle.
The problem: agencies and IT service companies operate under conditions these methods weren't designed for. No fixed scope, because the client has new ideas after sprint two. No clear percentage of completion, because software doesn't grow from the bottom up like a building. And not one large project, but twelve running in parallel, competing for the same people.
If you've ever tried applying a textbook Earned Value Analysis to an agile web development project, you know what I mean. The method assumes you know the total scope and can measure progress as a percentage. On a fixed-price relaunch with shifting requirements, both assumptions are fiction.
What Actually Matters
For service companies, project controlling is less a methodology and more an attitude: How much of my budget is spent? How much work is actually done? Does the ratio make sense?
It sounds trivial. But in practice, most teams don't fail because they lack methods — they fail because nobody asks these questions on a regular basis. Time tracking is running, but no one looks at the data. The budget exists in the proposal, but no one checks it against reality.
The Metrics That Actually Matter for Service Businesses
Forget the iron triangle from the textbooks for a moment. Not because it's wrong, but because it's too abstract. Here are the numbers that make a real difference in a 10- to 50-person agency:
Budget Burn Rate
How fast am I burning through the budget relative to progress? Example: a project has a 100-hour budget. After two weeks, 40 hours are logged and the project is 35 percent done. The burn rate is slightly above plan — not yet alarming. If the project were only 20 percent done, that would be a different story entirely.
This single number — hours consumed divided by completion percentage — tells most teams more than any Earned Value formula ever will.
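The ratio described above is simple enough to sketch in a few lines. This is an illustrative helper, not a prescribed implementation; the function name and the normalized form (share of budget over share of progress) are my own choices:

```python
def burn_rate(hours_logged: float, budget_hours: float, pct_complete: float) -> float:
    """Share of budget consumed divided by share of work completed.

    1.0 means spend and progress are aligned; above 1.0 the project
    is burning budget faster than it delivers.
    """
    if pct_complete <= 0:
        raise ValueError("completion must be > 0 to compute a ratio")
    return (hours_logged / budget_hours) / pct_complete

# The example from the text: 40 of 100 hours logged, 35 percent done.
print(round(burn_rate(40, 100, 0.35), 2))  # 1.14 — slightly above plan
print(round(burn_rate(40, 100, 0.20), 2))  # 2.0 — a different story entirely
```

A value of 1.14 is worth a glance on Friday; a value of 2.0 is worth a phone call.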
Contribution Margin per Project
What's left of the project revenue after deducting actual labor costs? Many agencies only find out months after delivery. Or never. Those who connect their time tracking to financial controlling see this number in real time. And sometimes discover that the flagship project everyone is proud of has been quietly eating the quarter's margin.
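The calculation behind this metric is one subtraction, which is exactly why it's painful that so few agencies see it in real time. A minimal sketch — the revenue, hours, and internal cost rate below are made-up illustrative figures:

```python
def contribution_margin(revenue: float, hours_logged: float, hourly_cost: float) -> float:
    """Project revenue minus actual labor cost.

    hourly_cost is the fully loaded internal cost rate (an assumed
    figure here — plug in your own), not the billing rate.
    """
    return revenue - hours_logged * hourly_cost

# Hypothetical flagship project: 12,000 fixed price, 140 hours at 70/h internal cost.
margin = contribution_margin(12_000, 140, 70)
print(margin)                       # 2200.0
print(round(margin / 12_000, 2))    # 0.18 — the margin ratio
```

The point is not the formula but the input: without actual logged hours feeding it, the number stays a guess until months after delivery.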
Forecast Accuracy
How close were your estimates to reality — not on a single project, but averaged across all projects? This meta-metric reveals whether the estimation problem is systematic. If the average overrun sits at 130 percent, the team isn't bad at estimating — they're consistently estimating too low. That's a different problem than random outliers.
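Averaging the actual-to-estimate ratio across finished projects is enough to see whether the bias is systematic. A sketch under the assumption that you have (estimated, actual) hour pairs available; the history data below is invented for illustration:

```python
def average_overrun(projects: list[tuple[float, float]]) -> float:
    """Mean of actual/estimated across projects, as a percentage.

    100 means estimates were on target on average; 130 means the team
    consistently estimates about 30 percent too low.
    """
    ratios = [actual / estimated for estimated, actual in projects]
    return 100 * sum(ratios) / len(ratios)

# (estimated hours, actual hours) per project — illustrative numbers only.
history = [(100, 128), (40, 55), (80, 100), (60, 81)]
print(round(average_overrun(history)))  # 131 — a systematic bias, not random outliers
```

If the ratios cluster tightly above 100, the fix is a calibration factor on future estimates, not better discipline on individual projects.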
According to PMI research, projects with structured controlling experience significantly less scope creep. Without systematic monitoring, scope almost always drifts — the only question is when you notice.
Five Patterns That Quietly Kill Projects
From what I can tell after fifteen years of running agency and software businesses, it's rarely the big disasters that make projects unprofitable. It's patterns you only see when you look regularly.
The first is the Friday problem. Hours get logged on Friday evening, from memory. What took three hours on Wednesday becomes "roughly two" — because nobody remembers exactly. Over a quarter, these small gaps add up to surprising numbers.
Then there's the optimism bias in status meetings. "We're almost done" is the most dangerous sentence in project work. It's usually wrong. Not because anyone is lying, but because humans chronically underestimate remaining effort. The 90-percent trap: the last ten percent takes as long as the first ninety.
Scope creep without a price tag is another classic. The client asks if you could "just quickly" adjust the contact form. Sure, no problem. Two hours. Then four, because the CMS doesn't cooperate. These micro-changes add up — we've analyzed why IT projects really fail, and this pattern shows up every time.
One that almost never makes it onto anyone's radar: internal projects without a budget. The website redesign, the CRM migration — nobody tracks them because there's no invoice attached. They just run. Sometimes for months. Tying up capacity that's missing from client work.
And finally, the spreadsheet trap. Controlling happens in a file that the project manager updates when they have time. Which means: never. At the end of the quarter, someone tries to piece together a summary from time entries, proposals, and invoices. That's not controlling. That's archaeology.
A Pragmatic Roadmap: Implement Project Controlling in Four Weeks
I'm skeptical of big rollouts. At eins+null, my previous company, we tried implementing controlling "properly" twice — with templates, training sessions, the whole program. Both times it fizzled out after six weeks.
What worked: starting small and making the value visible immediately.
In week one, one number per project is enough. Every project manager checks booked hours against budget on Friday. No analysis, no reports — just the number. If you're juggling multiple projects, do it for your three to five biggest ones. Takes ten minutes.
In week two, add a traffic light. Green means hours and progress are aligned. Yellow: there's a deviation, but it's still manageable. Red: without intervention, you'll blow the budget. Three colors, no philosophy.
Week three brings the 15-minute review — the team sits down once a week to go through the red and yellow projects. What happened? What do we do? Who talks to the client? No status meeting, no slides. Fifteen minutes, standing if necessary.
After four weeks, a retrospective: What changed since we started paying attention? Usually the answer is: we caught problems two weeks earlier than before. That's enough motivation to keep going.
The rest — dashboard automation, margin calculations, forecast tracking — comes once the basics are in place. Not before.
The Uncomfortable Truth About Controlling
Project controlling has a downside: it shows you things you might not want to see. That your favorite project is unprofitable. That the best developer on the team — the one everyone relies on — is logging the most hours on projects that are already over budget, because they're the only one who can fix things. Or that the standard 80-hour estimate for web projects hasn't been accurate for three years, but nobody updated it.
That's probably why so few service companies do it consistently. Not because the tools are missing or the methods are too complicated. But because not knowing is sometimes more comfortable than knowing.
The numbers don't disappear just because you're not looking, though. They show up at the latest in the annual review — and by then it's too late to course-correct.
We built Leadtime partly because at eins+null, we too often found out what went wrong only after the fact. Not because we're smarter — but because we got tired of always learning the truth at the end.


