
When Annual Performance Reviews Fail

Filip Ivanković · 8 min read · Teams & Culture

I once sat through a performance review where the marketing manager was rated “exceeds expectations” across the board. Three months later the business replaced the entire team. The review measured activity. The business needed outcomes.

Annual performance reviews are one of the most universally disliked rituals in corporate life. Managers dread writing them. Employees dread receiving them. HR chases both sides for weeks to get them completed on time. And when they are finally done, the vast majority change nothing about how anyone actually works.

For marketing teams specifically, the annual review model is particularly broken. Marketing operates on campaign cycles, algorithm changes, competitive shifts and seasonal patterns that bear no resemblance to a twelve-month calendar. Evaluating a marketer once a year on goals set twelve months earlier is like grading a pilot on a flight plan they filed before knowing the weather.

95%

of managers say their company’s performance review process is ineffective, according to Deloitte research across Australian and global workforces

The problem is not that reviews exist. Feedback matters. Accountability matters. The problem is that the standard review model fails on five specific dimensions, and most organisations have never stopped to question any of them.

1. Measuring activity instead of impact

The most common failure in marketing performance reviews is measuring what people did rather than what it produced. Campaign launches. Content pieces published. Ads created. Social posts scheduled. These are inputs. They tell you someone was busy. They tell you nothing about whether the business moved forward.

This happens because activity is easy to observe and easy to count. Impact requires a measurement framework that connects marketing work to commercial outcomes, and most businesses have not built one. So the review defaults to what is visible rather than what matters.

The result is predictable. Teams learn to optimise for visible busyness. The person who launches twelve campaigns gets rated higher than the person who killed eight underperformers and doubled down on four winners. The system rewards volume over judgement, and over time the culture follows.

67%

of Australian marketing teams report being measured primarily on output volume rather than commercial contribution

The fix is straightforward but requires discipline. Every marketing role should have two to three outcome metrics that connect to business results. Revenue influenced. Pipeline generated. Customer acquisition cost. Conversion rate improvement. Not tasks completed. Outcomes delivered.

2. Wrong metrics at the wrong altitude

Even when organisations try to move beyond activity measurement, they frequently choose metrics at the wrong altitude. Channel metrics like impressions, CTR, keyword rankings and social engagement feel more sophisticated than counting tasks. But they are still proxies. They measure marketing’s internal performance rather than its contribution to the business.

A paid media specialist should not be reviewed solely on ROAS if the business goal is market share growth. An SEO lead should not be reviewed on ranking positions if the pages ranking are not the pages that convert. A content marketer should not be reviewed on traffic if the traffic is not the right traffic.

The altitude problem is especially acute in Australian mid-market businesses where marketing teams are small and individuals wear multiple hats. When one person manages paid media, organic social and email, which channel metric do you use for their review? The answer is none of them individually. You use the commercial outcome their combined work produced.

When the metrics in the review don’t match the metrics the CEO cares about, the review is theatre. Everyone goes through the motions. Nothing changes.

3. Goals set in January, irrelevant by April

Annual goal-setting assumes a degree of predictability that marketing has not had for at least a decade. A set of goals written in January can be completely obsolete by Q2 due to algorithm changes, competitive moves, budget cuts, product pivots or market shifts that nobody could have predicted.

Google made over 4,500 changes to its search algorithm in 2024 alone. Meta overhauled its ad auction system twice. Privacy regulations continued to erode tracking accuracy across every channel. A marketer whose annual goals include specific ranking targets or attribution model outputs is being evaluated against a landscape that no longer exists.

The better approach is rolling quarterly objectives tied to a stable annual direction. The annual direction stays constant: grow pipeline by 30%, reduce CAC by 15%, increase conversion efficiency by 20%. The quarterly objectives flex based on what the data is showing and what the market is doing. Reviews happen against the quarterly objectives, with the annual direction as the compass.

4,500+

algorithm changes Google made in a single year. Annual goals cannot account for this pace of change

4. Feedback delayed is feedback wasted

The annual review cycle means that by the time feedback is delivered, the context has changed so completely that the feedback is nearly useless. Telling someone in December that a campaign they ran in March could have been better does not help them improve. The channels have changed. The audience has shifted. The learnings are stale.

This is not a new observation. Every management consultant and HR thought leader has been saying it for twenty years. And yet the annual review persists because organisations have not replaced it with anything better. They know the cadence is wrong but the alternative feels like more work.

It does not have to be. The most effective marketing teams I have worked with in Australia run 30-minute monthly performance conversations. Not formal reviews. Conversations. What worked this month. What did not. What are we changing next month. No forms. No ratings. No HR system. Just a manager and a team member looking at the same data and making decisions together.

The formal review still happens, but it is a summary of twelve monthly conversations rather than a once-a-year surprise. Nobody learns anything new in the annual review because the important conversations already happened in real time.

5. No connection to career growth

The final failure is the most damaging for retention. Most annual reviews evaluate past performance without connecting it to future growth. They are backward-looking assessments dressed up as development conversations. The employee hears what they did well and what they need to improve. They rarely hear how this connects to where they want to go.

Marketing talent in Australia is genuinely scarce. The good people have options. When a high performer sits through a review that says nothing about their trajectory, their development or their path to the next role, they start looking. Not because the review was negative. Because it was irrelevant to what they actually care about.

People don’t leave jobs because their review was bad. They leave because their review told them the organisation has no plan for their future. That’s a fundamentally different problem.

The fix requires managers to know what their team members want. That sounds obvious. In practice, most managers have never asked. They assume the person wants a promotion, a pay rise or both. Sometimes the person wants to specialise. Sometimes they want to move sideways into a different discipline. Sometimes they want to lead. The review should be built around those aspirations, not around a standardised form that treats every role the same way.

What a better model looks like

The organisations getting this right share a few common practices. They have replaced or supplemented the annual review with monthly or quarterly performance conversations. They measure outcomes rather than activity. They set goals at a cadence that matches the pace of the market. They connect individual performance to individual career aspirations, not just to organisational KPIs.

None of this requires expensive software or radical restructuring. It requires a decision that performance management is too important to do once a year, and a commitment to making it a regular part of how the team operates.

For marketing teams specifically, the shift from annual reviews to continuous performance management is not optional. The pace of change in digital marketing is too fast, the measurement landscape is too complex and the talent market is too competitive to rely on a model designed for a different era.

2.4x

Companies with continuous feedback models retain marketing talent 2.4x longer than those using annual-only reviews

The businesses that figure this out will keep their best people, improve faster and compound their advantages over competitors who are still filing annual review forms three months late. The ones that do not will keep wondering why their best marketers keep leaving for organisations that invest in their growth.

Filip Ivanković

Founder of New Rebellion. 10+ years in performance marketing across agencies and in-house teams. Writes about the gap between marketing activity and commercial outcomes.