The roots of Expected Goals (xG) and its journey from “nerd nonsense” to the mainstream 

Data: it’s not usually the most emotionally charged subject. But in the aftermath of Bayern Munich’s 2016 European exit at the hands of Atlético Madrid, the blustering of a wild-eyed Craig Burley would have the uninitiated believing the polar opposite. 

“Are you talking to me about expected goals in a Champions League semi-final that they’ve just lost?! What an absolute load of nonsense that is. I expect things at Christmas from Santa Claus but they don’t come, right? What I deal in is facts. That’s what the game is about. Whether you like it, I like it or anybody likes it, the game is about results. That’s why managers change jobs and get the sack, right? None of this nerd nonsense about expected goals.”

His fury was directed at fellow ESPN pundit Gabriele Marcotti. The unfortunate Marcotti had the temerity to highlight the fact that, although Bayern had failed to win the match by the two-goal margin which would have seen them reach the final, Pep Guardiola’s side had outperformed Diego Simeone’s in terms of xG – and by a thoroughly convincing margin, too.

Now, Burley’s belligerent indignation was not exactly out of character (there is an entire ‘Craig Burley goes off on…’ YouTube subculture if you dare venture down that particular internet rabbit hole), but it is typical of a certain demographic. 

Members of this agitated strand view the trend in football as out with the old, in with the VAR review. If they see Gary Neville grappling with a labyrinthine video replay interface on Monday Night Football, they wearily shake their heads before switching to Only Fools and Horses on UK Gold. In short, they believe that modernity is slowly disenchanting football. 

It’s an understandable perspective, perhaps even a noble one. To dismiss the anti-tech brigade as Luddites when many of their grievances are well-meaning and rooted in sound logic is a cop-out. In an epoch which has come to be defined by an inescapable symbiosis with technology and a relentless obsession with the quantifiable, football arguably should sidestep the sci-fi as much as it can.

But, as with offside nowadays, it’s a question of where you draw the line. Everyone’s happy with referees’ headsets. Most have no issue with goalline technology. But the swarm begins to buzz as soon as those two little letters are uttered: xG. What is it about this metric that so boils the blood? 


For those who don’t already know, expected goals is a measurement of how likely it is that a particular goalscoring chance will be taken. It takes into account a host of variables: the striker’s distance from goal, the positioning of the goalkeeper, the body part used to take the shot, the angle from goal – and these only begin to scratch the surface; there are countless more.

Using data from thousands of hours of video, a number is calculated that reflects the likelihood of an opportunity resulting in a goal. In the case of an open goal from a yard out, this number might be 0.99, while a piledriver from 30 yards on a striker’s weaker foot might be 0.05. The fact that xG models ostensibly attempt to strip the game of the intangible by interpreting its every minutia as a factor contributing to a statistic is perhaps one of the reasons it is so tribalising.
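To make that concrete, here is a minimal sketch of how such a probability might be produced. The features and coefficients below are purely illustrative assumptions, not those of any real provider’s model; actual xG models are trained on enormous shot datasets and use far more variables.

```python
import math

def toy_xg(distance_m: float, angle_deg: float, header: bool = False) -> float:
    """Toy expected-goals estimate for a single shot.

    The coefficients are invented for illustration: the probability
    rises for close, central chances and falls with distance, a
    narrow shooting angle, or a header rather than a kicked shot.
    """
    z = 1.2 - 0.18 * distance_m + 0.035 * angle_deg - 0.8 * (1 if header else 0)
    return 1.0 / (1.0 + math.exp(-z))  # logistic function squashes z into (0, 1)
```

Plugging in a one-metre tap-in with the whole goal to aim at gives a value above 0.98, while a 27-metre effort from a tight angle comes out around 0.05 – roughly in line with the examples above.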

Traditionalists don’t want academics encroaching into sport, and expected goals is very much a product of academia. You don’t have to be a statistician to record the numbers traditionally displayed in a post-match graphic – possession, shots, shots on target, corners, fouls – but to figure out each team’s expected goals, you manifestly do. 

The exact origins of xG are disputed. The term itself was coined in 1993 by Vic Barnett and Sarah Hilditch in their investigation into the impact of artificial pitches on home team performance, but the first recorded use of a metric which can be seen as a precursor to the one we use today was the brainchild of Charles Reep and Bernard Benjamin in the 1960s. Reep and Benjamin were both statisticians but in terms of their involvement with football, the two men could hardly have been further removed. 

Almost to the point of neurosis, Charles Reep was a football obsessive. In his formative years, he hungrily absorbed Plymouth matches from the terraces of Home Park before moving to London after joining the RAF in 1928. In the capital, Reep’s obsession grew. After attending a series of lectures given by Arsenal skipper Charles Jones, he became enchanted by the attacking wing play instilled at the north London club by Herbert Chapman.

From then on, Reep began to watch football with a different, more scientific gaze. He drew up play-by-play charts, logged the number of attacking moves in a match and observed correlations between various passing combinations and goals. His findings turned him into a champion of the long-ball and the first true performance analyst in football.

Bernard Benjamin, on the other hand, was a seasoned academic, health scientist and statistician. While Reep was often spotted on the terraces of Home Park wearing a miner’s helmet to illuminate his notepad, Benjamin – a keen pianist and watercolour painter – was more comfortable poring over data by the glow of an office lamp than at the coalface.


Had they been active at the time, the Twitterati would most likely have dismissed him as an armchair fan. Together, Reep and Benjamin undertook a vast data-mining exercise. From 1952 to 1967 they observed 667 matches, travelling to four World Cups in the process. They found that, with startling uniformity, it took roughly ten shots to score a goal. This analysis, coupled with Reep’s almost militant advocacy of the long-ball, meant a very rudimentary form of the expected goals metric was born.
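In today’s terms, Reep and Benjamin’s finding amounts to pricing every attempt identically. A sketch of that rule of thumb (the 0.1 figure is just the rough average described above):

```python
def reep_xg(shots_taken: int) -> float:
    """Reep and Benjamin's rough rule – one goal per ten shots –
    values every attempt at ~0.1 goals, regardless of how good the
    chance actually was. Modern models instead assign each shot its
    own probability."""
    return 0.1 * shots_taken
```

A team taking 15 shots would be "expected" to score 1.5 goals under this rule, whether those shots were tap-ins or hopeful punts – which is precisely the crudeness later models set out to fix.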

While analytics in today’s game have all but killed off the hoof upfield at the elite level, in the late 60s, 70s and early 80s, the study was one of its driving forces. Reep’s namesake, Charles Hughes, elaborated on his ideas to create a slightly more nuanced doctrine: “POMO” or Position of Maximum Opportunity.

Hughes began his career in football coaching the England amateur and British Olympic teams. He held the position until 1974, when amateur status was abolished, before moving into administration with the FA. While many regard Hughes – who espoused long-ball tactics as vociferously as Reep – as a pernicious influence on the English game, his POMO theory, which concerned the areas of the pitch and the methods of assist from which a goal is most likely to result, is resonant in xG algorithms today, albeit in a more sophisticated manner. 

Coming from the mid-20th century, Reep and Hughes’ ideas disprove the perception that the use of analytics in football is an entirely modern phenomenon. But it is the altogether more la-di-da-sounding family of “advanced metrics” – of which xG is a member – to which many fans object. 

For them, we have American sports analytics to thank – or vilify, depending on your point of view. Perhaps the insularity of football, and of the English game in particular, is the reason this system is rejected with such venom. As well as the fact that it sounds like it belongs alongside R2-D2 and C-3PO in a galaxy far, far away, xG is – horror of horrors – foreign to football. 

The use of advanced metrics has been par for the course in American sports for decades. Sabermetrics in baseball – made famous to non-American audiences through the movie Moneyball – is probably the highest-profile example of using data to establish objective facts about in-game performance. Similar ideas were introduced into basketball, gridiron and ice hockey before statisticians began to wonder whether they could be applied to football too. 

Long before Opta had brought xG to prominence, various researchers attempted to create a model which outlined the factors which might affect the success of an attempt on goal. The aim was to look beyond goals in football in the same way that sabermetrics had looked beyond batting average in baseball in order to create a more complete statistical picture.


In 2004, Richard Pollard – coincidentally a great friend of Charles Reep’s – published a paper outlining a number of these factors, with the help of Samuel Taylor and Jake Ensum, who would go on to become Head of Analysis at Tottenham. The distance from goal, angle, space enjoyed by the striker, number of touches before taking the shot, type of build-up play, location of the assist: all were used in their model to evaluate the quality of a goalscoring chance. 

From then on, new developments in expected goals models flowed freely. The flood of academic research into advanced metrics coincided with the explosive growth of the statistics industry in football, and data collection in sport suddenly became a multi-million-pound business. The catalyst was the new technology available to data collection companies as machine learning and optical spatial tracking boomed. Rather than basing their xG models on essentially anecdotal evidence, they now had cameras capable of collecting millions of pieces of data per game. As such, xG became – and continues to become – ever more credible. 
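The “learning” step the industry moved to can be illustrated with a deliberately tiny example: fitting a one-variable logistic model to a handful of invented shots by gradient descent. Real providers fit far richer models on millions of logged events; everything below – the data, the learning rate, the single distance feature – is an assumption for illustration only.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def fit_xg(shots, lr=0.01, epochs=2000):
    """Fit P(goal) = sigmoid(b0 + b1 * distance) by stochastic
    gradient descent on the log-loss. `shots` is a list of
    (distance_in_metres, scored) pairs with scored in {0, 1}."""
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        for d, y in shots:
            err = sigmoid(b0 + b1 * d) - y  # gradient of log-loss w.r.t. the linear score
            b0 -= lr * err
            b1 -= lr * err * d
    return b0, b1

# Invented sample: close-range shots mostly score, long-range ones mostly don't.
sample = [(3, 1), (5, 1), (6, 0), (10, 1), (12, 0), (18, 0), (25, 0), (30, 0)]
b0, b1 = fit_xg(sample)
```

The fitted slope comes out negative – the model has “learned” from the data that distance hurts a shot’s chances – which is exactly the relationship earlier analysts had to assert by hand.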

Though behind the scenes xG was the talk of the proverbial town, it had barely landed in the public consciousness by 12 August 2017, the day it first featured in Match of the Day’s post-match statistical round-up. It was then that the fury mill really began to turn. How could this fancy-pants piece of data with American roots have found its way into something as traditional and quintessentially British as Match of the Day? It perhaps wasn’t helpful that the BBC presented the statistic without comment – many a fan felt this soulless little number was sneering at them. 

This shouldn’t have been the case. Goals are the currency of football, but we make judgements about a team or player’s quality using other standards too. No-one who watched Celtic’s 2-1 victory over Barcelona in 2012 – achieved with just 11 percent possession and 166 passes to Barcelona’s 955 – could claim that they were the better side on the night. Whenever we watch a game in which it’s obvious that the better side lost, it’s almost certain that xG will look more favourably on them. 

That doesn’t mean it wasn’t beautiful when an 18-year-old Tony Watt’s strike ten minutes from time filled the chilly Glasgow air with scorching, inexhaustible joy. The thing that makes football great is its fundamentally chaotic nature and the use of statistics to better understand the game can never rob it of that. To claim that it could is to do it a staggering disservice. 

“We were robbed”, “a real smash and grab”, “the scoreline didn’t show the complete picture”: all are refrains which serve the same purpose as xG. The metric might seem snooty at first, but football fans have been inadvertently collecting their own data sets since they first saw a ball kicked.

By Adam Williams
