The Squiggle 2018, 2019 and 2020 appreciation thread (and other analytics)


jarrod_island_doggies
Hey Final Siren, you've mentioned how this week's been a trainwreck for models over tipsters. One game that I think had a lot of people scratching their heads was the full suite of models going all in on the Swans. What do you reckon caused that significant skew between skynet and Joe/Jane Public?
 


Final Siren
Good question!

So the game in question looks like this:

[Screenshot: model tips for the Gold Coast v Sydney game]


The market, which is a good estimate of public opinion, pegged the Suns as a 52% win chance, but all the models tipped Sydney, and many were very confident about it.

You really have to ask the individual model authors, but here are my own wild guesses:
  • FMI and The Flag are overconfident models, imo. That is, they lean too heavily in favour of the team they think will win. This shows up in very low Bit scores in unpredictable/even seasons.
  • Swinburne places a lot of store in venue-specific performance, so possibly had a unique idea about how well the Suns play at the SCG, where I believe they'd never won before. [edit: They were 1-4 there.]
  • Live Ladders, which is one of the best models of the bunch (if not the best), had this to say:
  • And this sentiment was echoed by Matter of Stats:
  • Some models don't consider Ins/Outs at all, and those that do, including mine, don't place that much emphasis on it. I thought I'd end up tipping Suns, once the Kennedy and Heeney outs became official, but they were mostly balanced out by the Ins - whereas I don't think most people saw it that way.
  • Models tend to be less reactive than humans - which is usually a good thing, because humans overreact. Gold Coast in 2020 have improved a lot but models are taking a while to be convinced.
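For anyone curious what "overconfident" actually costs on a bits-style leaderboard, here's a toy sketch. It assumes the common formulation (1 + log2 of the probability assigned to the actual result; draws are scored differently and skipped here), which may differ in detail from any particular competition's rules:

```python
import math

def bits(p_home, home_won):
    """Bits earned on one game, given the probability assigned to the
    home team and the actual result (draws ignored for simplicity)."""
    p = p_home if home_won else 1.0 - p_home
    return 1.0 + math.log2(p)

# An overconfident model vs a cautious one when the favourite loses:
print(round(bits(0.90, False), 2))  # -2.32
print(round(bits(0.60, False), 2))  # -0.32

# When the favourite wins, the extra confidence buys relatively little:
print(round(bits(0.90, True), 2))   # 0.85
print(round(bits(0.60, True), 2))   # 0.26
```

The asymmetry is the point: the penalty for a confident miss dwarfs the reward for a confident hit, which is why a lean-hard model bleeds bits in an even, upset-heavy season.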
 


jarrod_island_doggies
Thanks for a really in-depth response.

One thing I suppose many may have "felt" in the lead-up was not so much that Gold Coast had improved out of sight (even though the signs, small sample size aside, are that they've certainly improved), but that their improvement, set against an undersized and undermanned Swans who have been trending down on a bunch of models for over a year now, placed a lot of chips in GC's favour. The interesting part is that we can't really go back and replay the game under the same conditions to see whether the models had reason behind their apparent madness. It's a mug's game for you lot sometimes!

The other funny thing is that the Suns have now won two of their past three Swans games at the SCG. Trend? Time will tell.
 

threenewpadlocks
Seems like a human could have done a better job assessing player ins/outs (even though in the vast majority of cases, we don't).

The big differential between Fiorini and Powell, for example, makes sense given their differences in statistical production at AFL level. But every human could figure out that it's probably a change that doesn't push the needle in either direction: both are currently fringe players for the Suns, and whilst Fiorini has had good seasons in the past, he has only been selected for 3 of the first 6 games and was dropped after a 9-touch game.

That could very well have been the difference between the Squiggle and the punters.
 

twarby
The market, which is a good estimate of public opinion, pegged the Suns as a 52% win chance, but all the models tipped Sydney, and many were very confident about it.
Mostly true, though the market is still shaped by a handful of bettors who run models like the above, and there was a small move towards the Swans on game day. Most models are still very low on Gold Coast for the reasons you touched on, and cluster injuries for the Swans are a bit harder to get right for player-based models.
Eyeballing the models above, the ones using some sort of player input were less keen on Sydney, which makes sense, but home-field advantage was enough to keep them tipping the Swans.

It's easy to forget the week before though, when the public was all over the Suns vs Melbourne and all the models and pro bettors backed Melbourne from evens to -10.5.


Seems like a human could have done a better job assessing player ins/outs (even though in the vast majority of cases, we don't).

A human has no accurate way of knowing what a player is worth; good player models know this. 80% of the models above don't know who is playing the game.
 

Final Siren
Seems like a human could have done a better job assessing player ins/outs (even though in the vast majority of cases, we don't).
Yes, I feel like that's true in general. Models aren't magic; the only reason they tend to do well is that they're free from a collection of human biases. In particular:
  • Models pay equal attention to every game.
  • Models have no problem remembering how well Team X performed against Team Y in Round Z.
  • Models don't leap to conclusions from tiny data sets.
  • Models don't ignore the difference between a 6-goal margin and a 9-goal margin.
On the flip side, models have very limited information compared to humans, since they can't, you know, watch the freaking game.

For the above reasons, models are relatively strong during home & away, when there are too many matches for us to easily keep track of, and relatively weak in finals (and especially the Grand Final).

There's absolutely no reason why a person who watches a lot of football, and who is aware of and can temper their own biases, shouldn't reliably outperform a model. But this is a lot easier said than done, because we are all so beholden to our biases that we don't even realize we have them.
 

Simon_Nesbit
Question without notice:

Have any of the stat/model boffins come up with a player rating akin to the NBA's win shares?
I remember reading years ago about Net Player Value (IIRC). It was relatively simple: it took note of scores made versus the players on the ground.

I.e. every player had an "X" scoring differential while they were on the ground, and their team had a "Y" differential while they were off (bench/not selected). X > Y gave a value bigger than 1.
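Something like that can be sketched in a few lines. This is a toy on/off differential with invented stint data, not any published rating system, and it uses a difference rather than the X/Y ratio (differentials can go negative, which breaks a ratio):

```python
def on_off_rating(stints):
    """Toy on/off rating: the team's scoring differential per minute with
    the player on the ground, minus the differential with them off.
    Each stint is (minutes, points_for, points_against, player_on)."""
    on_mins = off_mins = 0
    on_diff = off_diff = 0
    for minutes, pts_for, pts_against, player_on in stints:
        if player_on:
            on_mins += minutes
            on_diff += pts_for - pts_against
        else:
            off_mins += minutes
            off_diff += pts_for - pts_against
    return on_diff / on_mins - off_diff / off_mins

# Made-up stints: the team scores freely with the player on, leaks goals off.
stints = [
    (20, 30, 12, True),
    (10, 6, 18, False),
    (15, 24, 18, True),
    (5, 0, 6, False),
]
print(round(on_off_rating(stints), 2))  # 1.89
```

The real versions of this idea (NBA plus/minus, RAPM and the like) also adjust for teammate and opponent quality, since raw on/off confounds a player with whoever shares the ground with them.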
 

Dr Tigris
FS, it'll be interesting to see how the accuracy of Squiggle and the other models holds up by season's end this year. If it does, you'll have gone a long way to proving how robust the models are. But with all of the disruptions, I wouldn't be surprised if things are a bit more random this year.
 


mouncey2franklin
The models are hopeless, and this will be proven once again when the Suns run rings around the hapless Dogs at Skase Stadium in a few hours' time.

(Seriously though this is a cool thread, thanks to the statisticians for the insights, fascinating to say the least).
 

Final Siren
A new look at crowds & home advantage, based on 1,534 soccer matches without fans in Europe.

https://www.economist.com/graphic-d...ums-have-shrunk-football-teams-home-advantage

Interesting points are:
  • Umpire bias has disappeared: home teams have received 50% of red cards with no crowds vs 46% with crowds.
  • Home teams have had fewer shots at goal: 53% with no crowds vs 55% with crowds.
  • Home teams have won a little bit less: 56% with no crowds vs 58% with crowds.
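One back-of-the-envelope caveat on that last bullet (my own arithmetic, not from the article): over roughly 1,534 matches, a 58% to 56% shift is only about 1.6 standard errors under a simple normal approximation, so the drop in home wins could on its own still be noise:

```python
import math

n = 1534        # crowd-free matches in the study
p_base = 0.58   # home result rate with crowds (treated as a fixed baseline)
p_obs = 0.56    # home result rate without crowds

se = math.sqrt(p_base * (1 - p_base) / n)  # binomial standard error
z = (p_base - p_obs) / se
print(round(z, 2))  # 1.59, short of the conventional 1.96 cutoff
```

That doesn't mean the effect isn't real, just that the win-rate evidence by itself is weaker than the red-card and shot-count shifts make it look.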
 

iluvparis
Oh hai there LicoriceAllsorts
 

Final Siren
Actually this is probably what I should have posted in the first place. From best fixture to worst in Rounds 1-5, the net benefit of Home Ground Advantage + strength of opposition:

Team | HGA | Opposition | Net Benefit
Port Adelaide | -10.3 | 60.2 | 49.9
Sydney | 0.4 | 44.8 | 45.2
Geelong | 6.0 | 20.1 | 26.1
Brisbane Lions | 16.1 | 6.2 | 22.3
Fremantle | -14.3 | 34.4 | 20.1
Gold Coast | 14.9 | -1.9 | 13.0
West Coast | -4.3 | 15.8 | 11.5
Adelaide | -4.3 | 12.9 | 8.6
North Melbourne | -2.8 | -6.4 | -9.2
Essendon | -1.4 | -11.4 | -12.8
Carlton | -3.7 | -18.3 | -22.0
Collingwood | -3.0 | -20.1 | -23.1
Richmond | 2.0 | -29.0 | -27.0
Western Bulldogs | -1.0 | -27.6 | -28.6
Melbourne | -4.7 | -28.3 | -33.0
Greater Western Sydney | 16.7 | -54.6 | -37.9
St Kilda | -1.1 | -47.9 | -48.9
Hawthorn | -5.3 | -48.6 | -53.9
I keep meaning to update this table at the end of a round, but WHEN IS THAT??

Anyway at the end of R11, from easiest to hardest fixture:

Team | HGA | Opposition | Net Benefit
Gold Coast | 29 | 21 | 50
Port Adelaide | 12 | 30 | 41
Adelaide | 25 | 2 | 27
Sydney | 11 | 15 | 26
West Coast | 21 | 2 | 23
Essendon | -16 | 39 | 23
Brisbane Lions | 29 | -15 | 13
North Melbourne | -2 | 12 | 10
Collingwood | -27 | 18 | -8
Fremantle | 11 | -27 | -16
Geelong | -11 | -9 | -21
St Kilda | -21 | -8 | -29
Melbourne | -19 | -14 | -32
GWS | 21 | -75 | -54
Carlton | -12 | -49 | -61
Western Bulldogs | -18 | -43 | -61
Hawthorn | -18 | -46 | -64
Richmond | -14 | -51 | -65

  • The Giants have had a ridiculously difficult draw so far, which means they have many softer games to come: post-R11 they have Adelaide, Sydney, Carlton, Fremantle, St Kilda, and West Coast.
  • Also with a relatively tough run so far: Richmond, Carlton, Hawthorn, and the Bulldogs.
  • At the other end of the scale, the Bombers are about to hit the uphill part of their fixture, with games remaining against St Kilda, Richmond, Hawthorn, Port Adelaide, Geelong, Melbourne, and West Coast.
  • As you'd expect, Brisbane and Gold Coast have done best out of HGA. Collingwood have done the worst.
By the end of R13, which is as far as the AFL has fixtured:
  • The Hawks will reach GWS levels of fixture difficulty, then ski downhill toward the end of their season by playing Adelaide, Essendon, Gold Coast, St Kilda, and the Bulldogs.
  • The Hawks will also become the team with the worst net HGA.
  • West Coast will surpass both QLD teams for best net HGA (mainly because of crowd numbers).
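For anyone wondering what the table's arithmetic actually is: the "Net Benefit" column is just per-game home-ground advantage plus opponent-strength points summed over the fixture. A minimal sketch of that bookkeeping, with invented numbers:

```python
def fixture_difficulty(games):
    """Sum home-ground advantage and opponent strength over a fixture.
    Each game is (hga_points, opponent_rating); a positive opponent_rating
    means a weaker-than-average opponent, i.e. an easier game."""
    hga = sum(g[0] for g in games)
    opp = sum(g[1] for g in games)
    return {"HGA": hga, "Opposition": opp, "Net Benefit": hga + opp}

# Invented numbers: two home games vs weak sides, one away vs a contender.
print(fixture_difficulty([(8, 12), (6, 15), (-5, -30)]))
# {'HGA': 9, 'Opposition': -3, 'Net Benefit': 6}
```

A team like GWS above has a strongly positive HGA column but a heavily negative Opposition column, which is exactly how you end up near the bottom of the table despite friendly venues.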
 

Dr Tigris
And the Tigers ... it just stays hard (?)
 
