The Squiggle is back in 2023 (and other analytics)


Similar question. But given the premiers all won at least 3 games against good opposition to win the flag, and the location shows the final spot, there is an inbuilt increase in position for the premier. Would it be more reasonable to show the premier's position at round 22, which would show where you need to be prior to finals? As is typical, all premiership teams shoot into the zone after their 3 wins.

Great comment. Fascinated to see the answer to this.

There is no pre-finals premiership zone. Sometimes teams win the flag by being the best and merely maintaining that form throughout the finals (like Hawthorn 2014/2015), and other times teams ramp up and start performing a lot better in finals (like Hawthorn 2008, or any of the last three premiers).

So we can't really say, "This is where you need to be pre-finals." It's more like, "If you're HERE, you can probably win the flag by maintaining current form, but if you're HERE, you need some significant but plausible improvement during the finals, while if you're HERE, you need to be the 2016 Bulldogs."

Also, of course, it matters a lot where your competition are. Collingwood 2011 were an all-time great team, but had to compete against an even better all-time great team.

Yeah, I get that; it's clear how the system works. I do agree, though, that it would be interesting to see how close these teams were to their final premiership position on the squiggle at the end of 22 H&A games versus the end of the year. How much movement, and in which directions... although it could easily tell as much about how finals are played as it does about the teams' performance.
I have this old graphic from the last time the question came up in 2016! It's not particularly revealing, though.

[Attached image: end-of-home-and-away-premiers.jpg]
 


Fixed! Thanks for letting me know.

It's working weirdly, though. I tried a bunch of simulations with 5% luck and no rewriting history, and the minor round results were the same every time (Geelong 18, GWS 15, Collingwood 15, Adelaide 13, WCE 13, Adelaide make the four on percentage). The major round results were random, though.
 
Ah, right you are! Thanks, I'll look at it tomorrow.

Must have broken when I rejiggered my ladder predictor.
 
Squiggle Doors should be actually fixed now.

While testing, I found a version where Geelong (1st) get upset by Adelaide (4th) in the first week of finals, smash Hawthorn in the semi, fly to Perth and tie 77-77 with the Eagles, then win by 3 points in extra time, and finally destroy Adelaide in the GF. That would be exciting.
 


Am I the only one who does not understand how the squiggle works?
It's all a bit confusing at first, but it can be boiled down pretty simply in the end.
It works a bit like any person trying to predict games, except it can store more information and carries fewer inaccurate biases.
Basically, it uses ratings for each team to predict scores for games, then compares the final scores to the predicted scores.
If a team scores more than predicted, their attack rating increases, so they move up. If they hold a team to less than predicted, their defense rating increases, so they move right.
The new ratings are then used to predict scores for future games.

There are complicating factors, but that mostly explains the behaviour of the squiggle.
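The predict-then-nudge loop described above can be sketched roughly like this. To be clear, this is an illustrative toy, not Squiggle's actual code: the baseline score, the learning rate `k`, and the functional form are all invented for the example.

```python
BASELINE = 85  # assumed average AFL team score, for illustration only

def predict_score(attack, opp_defense, baseline=BASELINE):
    """A team's predicted score: its attack rating, offset by the
    opponent's defense rating, around a league-average baseline."""
    return baseline + attack - opp_defense

def update_ratings(attack, defense, opp_attack, opp_defense,
                   points_for, points_against, k=0.1):
    """Nudge ratings toward the observed result.

    Scoring more than predicted raises attack (the team moves up);
    conceding less than predicted raises defense (it moves right).
    """
    predicted_for = predict_score(attack, opp_defense)
    predicted_against = predict_score(opp_attack, defense)
    attack += k * (points_for - predicted_for)
    defense += k * (predicted_against - points_against)
    return attack, defense

# A team rated +5 attack / +3 defense beats an average (0, 0) team 100-70:
# it outscored its prediction of 90, and conceded less than the expected 82,
# so both ratings rise.
atk, dfn = update_ratings(5, 3, 0, 0, points_for=100, points_against=70)
```

The new `(atk, dfn)` pair then feeds straight back into `predict_score` for the team's next game, which is the whole loop in miniature.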
 
 
Adelaide +14 v Richmond
Essendon +5 v Hawthorn
Gold Coast v St Kilda +1 (Virtually tipping a tie)
Fremantle +9 v Port Adelaide
Carlton v Western Bulldogs +11
North Melbourne v GWS +3

A proud week for the algorithm, falling over the line on 6/6. 72/111 for the season.

1. Geelong 25.9
2. GWS 18.7
3. Collingwood 13.4
4. Adelaide 11.3
5. Essendon 5.9 (+4)
6. West Coast 5.8 (+1)
7. Port Adelaide 5.0 (-1)
8. Fremantle 4.6 (+2)
9. Sydney 4.5 (-1)
10. North Melbourne 4.0 (-5)
11. Brisbane -2.5 (+2)
12. Hawthorn -3.3
13. Richmond -4.0 (-2)
14. Western Bulldogs -11.6
15. Melbourne -11.7
16. St Kilda -16.4
17. Carlton -20.2
18. Gold Coast -25.4

Tons of movement around 5-10, but with the ratings so close together that's mostly cosmetic. The algorithm inserts a null round where teams have a bye, which can cause ratings to settle, often in the direction of zero.
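The bye-week "null round" settling might be imagined as a small regression-toward-zero step; the decay factor below is purely an assumption for illustration, since the real mechanism isn't documented here.

```python
def null_round(rating, decay=0.05):
    """On a bye there is no result to learn from, so the rating is
    simply nudged a little back toward zero (an assumed 5% decay)."""
    return rating * (1 - decay)

# A team rated +5.8 with a bye settles slightly toward zero,
# while a negative rating would drift up toward zero instead.
rating = null_round(5.8)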

West Coast Eagles +9 v Essendon
Sydney +13 v Hawthorn
Melbourne v Fremantle +7
St Kilda v Brisbane +8
Port Adelaide v Geelong +14
Western Bulldogs v Collingwood +25
 
Am I the only one who does not understand how the squiggle works?
In general, any model takes the results of games and tries to remove the biases from them to figure out how strong the teams really are. The major biases that Squiggle tries to remove are:
  • Strength of opposition team - e.g. beating the Suns by 5 goals is less impressive than beating Geelong by 5 goals
  • Home ground advantage - e.g. beating West Coast in Melbourne is less impressive than beating them in Perth
  • Ins/Outs - e.g. beating Richmond with major Outs is less impressive than beating them at full strength
  • Goalkicking accuracy - e.g. winning 10.1 to 6.20 probably means you just got lucky
There's a bit more to it - e.g. Squiggle rates teams separately on attack and defense, and is fond of strong defensive performances - but that's the most important part.

To predict a game, it takes the unbiased rating of the two teams and adds back in biases it expects to exist based on the venue and Ins/Outs, etc.
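That last step, taking the unbiased ratings and adding the expected biases back in, could be sketched like this. The home-ground and ins/outs numbers below are invented for the example, not Squiggle's actual values.

```python
def predict_margin(home_rating, away_rating, hga=8.0, ins_outs_adj=0.0):
    """Predicted home-team margin: the difference in the two teams'
    unbiased ratings, plus the biases expected to exist in this
    particular game (venue advantage, ins/outs, etc.)."""
    return (home_rating - away_rating) + hga + ins_outs_adj

# Ratings on a points scale: home team +10, away team +4, and the away
# team missing key players (worth an assumed +3 to the home side).
margin = predict_margin(10, 4, hga=8.0, ins_outs_adj=3.0)
```

The same adjustments run in reverse when rating a finished game: the biases are subtracted from the result before the ratings are updated, which is how "beating West Coast in Perth" ends up counting for more than beating them in Melbourne.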
 

Thanks for that, it was interesting to read.

Do you rate the Crows? I personally don't. I think our wins this season can be strongly linked to the points you raised in your post; furthermore, there's at most one win that impressed me, and that was the one against the Giants at home, where they were missing a couple of key players.
 
Yeah, I do - it's an even year, and while they're clearly below Geelong, GWS & Collingwood, they deserve to be among - and possibly even at the front of - that next pack of teams.

It's true that the Crows were aided by home advantage in beating GWS, and by Richmond's injuries in that win, but even after accounting for those factors, they performed well. In fact, on those performances, Squiggle would still have Adelaide winning both games at a neutral venue, all other things being equal.

The win over Melbourne was very shaky - Adelaide should have lost that - but the month before was solid, with a narrow loss to West Coast despite more scoring shots, a 1-pt loss away to Brisbane, and wins over Port, Freo and St Kilda (away).
 
Mmm, it's an interesting one!

Both teams were mildly upgraded by Squiggle after the game, because it likes extremely strong defensive performances more than it likes extremely strong attacking performances. This is because it's more common for great defensive games to signal strong future performances than high-scoring games.

Personally I suspect this is because it's a little easier for a team to get off the chain and score very heavily in two or three quarters than it is to lock an opposition down for the entire game, as is required to hold them to a low total score. (Squiggle considers behinds to be worth more than half a goal, so a "strong defensive performance" can't just be due to the oppo missing shots.) So when a team tightly restricts scoring shots for a whole match, Squiggle likes that a whole lot.
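One way to picture the behinds remark is a shot-weighted score in which a behind counts for more than half a goal. The 3.5-point weight below is an assumption for illustration; the actual treatment isn't spelled out here.

```python
def shot_weighted_score(goals, behinds, behind_value=3.5):
    """Toy rating-style score where a behind is worth more than half a
    goal (3.5 of a goal's 6 points, an assumed weight). Under this
    weighting, conceding few scoring shots matters far more than
    whether the opposition happened to convert them."""
    return goals * 6 + behinds * behind_value

# Adelaide 7.9 (51) d Fremantle 5.4 (34): on raw points a 17-point win,
# but on shot-weighted scores both defenses look genuinely strong,
# because neither low total can be explained away as missed shots.
ade = shot_weighted_score(7, 9)
fre = shot_weighted_score(5, 4)
```

Compare an inaccurate team kicking 5.15: on raw points that's only 45, but shot-weighted it's 82.5, so the defense that conceded it gets little credit.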

Usually a scoreline like 7.9 to 5.4 means terrible weather, but this time it really just was two very defensive performances. So it will be interesting to see how these teams go.

One note: If you're using Live Squiggle, just be careful with your screen resolution, because if you have the chart stretched out very wide, it will exaggerate defensive (horizontal) movement. You probably want to click the "1:1" button first.

Interesting to come back to this 6 weeks later... nothing conclusive, but both Adelaide and Fremantle have indeed gone pretty well since this infamous game in Round 7.
 

Interestingly the same kind of game followed up just two weeks later when the Dockers played the Dons.
 
That was low-scoring - 8.12 (60) to 7.11 (53) - but much more like a normal game where both teams were just a bit inaccurate. It featured 38 scoring shots, which is 52% more than when Adelaide beat Fremantle 7.9 (51) to 5.4 (34).

Or to put it another way, scoring shots in R9 ESS v FRE were 15% down on normal, but R7 ADE v FRE was 44% down.
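The percentages above check out if you tally the scoring shots; note the season-average figure is back-solved from the quoted "15% down", so treat it as implied rather than sourced.

```python
# Scoring shots = goals + behinds for each side.
ess_fre = 8 + 12 + 7 + 11   # R9 ESS 8.12 v FRE 7.11 -> 38 scoring shots
ade_fre = 7 + 9 + 5 + 4     # R7 ADE 7.9 v FRE 5.4   -> 25 scoring shots

increase = ess_fre / ade_fre - 1   # 0.52 -> the "52% more" figure
avg = ess_fre / (1 - 0.15)         # implied season average, ~44.7 shots
ade_down = 1 - ade_fre / avg       # ~0.44 -> the "44% down" figure
```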

Actually even that doesn't adequately convey the freakish nature of that game. Here is the season so far, one dot per game. I don't even need to label it.

[Attached image: Screenshot from 2019-06-18 16-47-17.png]
 

