The Squiggle is back in 2023 (and other analytics)


Sydney v Essendon +23
Western Bulldogs +3 v Brisbane
Carlton v Collingwood +40
Gold Coast +6 v Melbourne
St Kilda +2 v West Coast
Port Adelaide v Adelaide +6
North Melbourne v Geelong +34
Hawthorn v GWS +18
Fremantle +24 v Richmond

4/9. Running total 42/72

1. Geelong 25.7
2. Collingwood 19.7
3. Adelaide 11.0 (+1)
4. GWS 8.8 (-1)
5. Essendon 4.3 (+1)
6. West Coast 3.4 (+1)
7. Port Adelaide 2.3 (+1)
8. Fremantle 2.2 (-3)
9. Richmond 1.4 (+3)
10. Hawthorn 1.3 (+4)
11. Western Bulldogs -0.1 (-1)
12. Brisbane -2.6 (-3)
13. North Melbourne -4.5
14. St Kilda -7.6 (-3)
15. Carlton -12.4 (+2)
16. Sydney -13.5 (+2)
17. Melbourne -14.0 (-1)
18. Gold Coast -18.2 (-3)

West Coast +26 v Melbourne
Collingwood +27 v St Kilda
Brisbane v Adelaide +4
Geelong +26 v Western Bulldogs
Essendon +11 v Fremantle
North Melbourne +14 v Sydney
Port Adelaide +31 v Gold Coast
Richmond +1 v Hawthorn (almost tips a tie)
GWS +26 v Carlton
I find it quite intriguing how your algorithm has a completely different perspective on the Tigers compared to the Squiggle. The Squiggle has us 4th on aggregate and has held us in the upper group pretty much all season, bar a few brief mid-table moments. Your algorithm had us way down there last week, and still has us mid-pack, a long way off the pace.
I'm not saying one is clearly more accurate - 42 tips shows the power of the algorithm and is well above the average tipster this year (according to footytips, at least). It's just interesting that two tools can produce such different rankings and still perform similarly!
 

My algorithm is more sensitive to current form, and takes no account whatsoever of anything more than 13 rounds ago. (So, at the moment, the oldest round it considers is the last week before last year's finals.) If the wind changes, I can pick it up more quickly. Against that, it will overreact to a short-term blip from a strong team, or a brief surge in form from a weak team.
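For anyone curious, the general shape of such a rating might look something like this minimal sketch (the linear decay and the per-round adjusted-margin input are assumptions, not necessarily the actual method):

```python
from collections import defaultdict

def form_ratings(results, window=13, weights=None):
    """Rate teams by a weighted average of per-round adjusted margins over
    the last `window` rounds. `results` is a list of rounds, oldest first;
    each round maps team -> adjusted margin (None on a bye). The default
    linear decay (newest round counts most) is an assumption."""
    recent = results[-window:]
    if weights is None:
        weights = [i + 1 for i in range(len(recent))]  # linear recency decay
    totals, wsums = defaultdict(float), defaultdict(float)
    for w, rnd in zip(weights, recent):
        for team, margin in rnd.items():
            if margin is None:
                continue  # bye: contributes nothing
            totals[team] += w * margin
            wsums[team] += w
    return {team: totals[team] / wsums[team] for team in totals}
```

Trimming `results` to this season alone would give the "no heed to last year" variant below, and passing `weights=[1.0] * window` the "each match weighted equally" one.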
 
Just as an exercise, applying my algorithm but paying no heed to last year's results at all produces this table:

1. Geelong 29.7
2. Collingwood 20.1
3. Adelaide 11.7
4. GWS 9.7
5. Essendon 4.8
6. Hawthorn 3.3
7. Port Adelaide 2.3
8. Fremantle 2.0
9. Richmond 1.1
10. West Coast 0.4
11. Western Bulldogs -0.6
12. Brisbane -3.5
13. North Melbourne -4.4
14. St Kilda -9.1
15. Carlton -13.6
16. Sydney -14.9
17. Melbourne -17.0
18. Gold Coast -21.8

And the same again, but weighting each match this year equally:

1. Geelong 29.7
2. Collingwood 19.6
3. GWS 14.1
4. Adelaide 7.7
5. Fremantle 5.9
6. West Coast 4.5
7. Hawthorn 3.7
8. Port Adelaide 2.0
9. Essendon 1.5
10. Richmond -0.6
11. Brisbane -0.7
12. Western Bulldogs -2.1
13. St Kilda -4.9
14. North Melbourne -11.8
15. Carlton -16.0
16. Sydney -16.5
17. Gold Coast -17.3
18. Melbourne -19.1
 


How much attention you should pay to old form is a tough question, and different models have different opinions. Squiggle is more responsive this year to early-season form, but it's still quite attentive to how good a team was 12 months ago. Other models are even more heavily weighted toward older form.

There are cases where it's clearly advantageous to respond quickly, and cases where it's not. St Kilda this year, for example -- Squiggle believed in their early form, and was able to use that to correctly tip an upset over Melbourne, but since then it's cost it twice (v Adelaide and v West Coast).

There's probably no one-size-fits-all answer... teams rise or fall for different reasons. Really a model probably needs to be able to pick up on other signals (like personnel changes, or how differently the team is playing according to in-game stats) to figure out how sustainable a form change is likely to prove.
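As a toy illustration of that trade-off (not Squiggle's actual weighting): under an exponential decay, the half-life controls how much of the rating rides on recent results.

```python
# Toy illustration: how much of the total weight sits on the last 5 rounds
# under exponential decay with a short vs long half-life (values assumed).
def weight(rounds_ago, half_life):
    return 0.5 ** (rounds_ago / half_life)

for half_life in (3, 13):  # responsive vs conservative
    ws = [weight(k, half_life) for k in range(26)]
    recent_share = sum(ws[:5]) / sum(ws)
    print(f"half-life {half_life:>2}: last 5 rounds carry "
          f"{recent_share:.0%} of the total weight")
# half-life  3: ~69% -- reacts fast, but noisy
# half-life 13: ~31% -- stable, but slow to notice a form change
```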
 

7/9. Running total 49/81

1. Geelong 28.2
2. Collingwood 21.4
3. GWS 15.8 (+1)
4. Adelaide 9.1 (-1)
5. Richmond 7.7 (+4)
6. Essendon 5.2 (-1)
7. Fremantle 3.1 (+1)
8. Port Adelaide 2.9 (-1)
9. West Coast 1.0 (-3)
10. Brisbane -1.3 (+2)
11. Western Bulldogs -2.0
12. Hawthorn -3.0 (-2)
13. North Melbourne -7.9
14. St Kilda -8.9
15. Sydney -10.2 (+1)
16. Melbourne -13.4 (+1)
17. Carlton -19.2 (-2)
18. Gold Coast -20.8

Sydney v Collingwood +25
Hawthorn +1 v Port Adelaide
Western Bulldogs +6 v North Melbourne
Adelaide +21 v West Coast
Gold Coast v Geelong +43
Richmond +2 v Essendon
Melbourne v GWS +24
St Kilda +9 v Carlton
Fremantle +17 v Brisbane
 
Squiggle is a mess; no team stands out.

I know it likes the better defence, and Geelong went backwards despite their win against the Dogs, and I guess a similar sort of win against North. Possibly the higher conceded scores (80 and 89 points) have made Squiggle think Geelong's defence is a bit porous, and maybe Geelong's surprising and probably unsustainable accuracy over the last few games is in there too. The Giants, though, made a big move of course with their big win over the Blues, and the low score they kept them to.
 

7/9. Running total 56/90


1. Geelong 26.8
2. Collingwood 18.8
3. GWS 16.6
4. Richmond 11.0 (+1)
5. West Coast 4.3 (+4)
6. Adelaide 4.2 (-2)
7. Essendon 2.2 (-1)
8. Hawthorn 2.1 (+4)
9. Fremantle 0.7 (-2)
10. Brisbane -1.0
11. Port Adelaide -1.4 (-3)
12. North Melbourne -4.0 (+1)
13. Sydney -6.17 (+2)
14. Western Bulldogs -6.21 (-3)
15. St Kilda -10.4 (-1)
16. Melbourne -12.7
17. Gold Coast -20.01 (+1)
18. Carlton -20.02 (-1)

North Melbourne v Richmond +15
Collingwood +27 v Fremantle
GWS +46 v Gold Coast
Geelong +38 v Sydney
Brisbane +3 v Hawthorn
Melbourne v Adelaide +11
St Kilda v Port Adelaide +9
Essendon +22 v Carlton
West Coast +20 v Western Bulldogs


Note: I’m redoing a tip from this round - it’s clearly an error for two reasons: I don’t count home ground advantage for matches in China, and the tip is inconsistent with the ratings anyway. So Port must clearly be tipped over St Kilda, and the margin without any home ground advantage applied must be +9, not St Kilda by 3.
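A minimal sketch of the kind of check involved (the function name, HGA figure, and ratings here are all illustrative, not the actual model):

```python
def tip_margin(home_rating, away_rating, hga=9.0, neutral_venue=False):
    """Predicted margin for the nominal home team: rating gap plus home
    ground advantage, with HGA zeroed for neutral venues like Shanghai."""
    return (home_rating - away_rating) + (0.0 if neutral_venue else hga)

# Hypothetical ratings: a 9-point gap in the away side's favour, with no
# HGA applied, tips the nominally-away team by 9 at a neutral venue.
print(tip_margin(-4.5, 4.5, neutral_venue=True))  # -9.0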
 

(Fixed the St Kilda/Port tip)

7/9. Running total 63/99.

1. Geelong 26.1
2. GWS 19.2 (+1)
3. Collingwood 13.5 (-1)
4. West Coast 11.8 (+1)
5. Port Adelaide 9.6 (+6)
6. Adelaide 6.6
7. Fremantle 5.3 (+2)
8. Richmond 5.0 (-4)
9. North Melbourne 4.4 (+3)
10. Essendon 4.1 (-3)
11. Brisbane 3.0 (-1)
12. Hawthorn 0.9 (-4)
13. Sydney -3.0
14. Western Bulldogs -11.0
15. Melbourne -11.3 (+1)
16. St Kilda -18.6 (-1)
17. Gold Coast -21.5
18. Carlton -24.6

This round was characterised by the rise of the middle. Richmond and Collingwood fell, while substantial wins for West Coast, Port, North, Essendon and Brisbane all caused their ratings (if not their placings) to rise. In contrast, the bottom - Bulldogs, St Kilda, Gold Coast and Carlton - all fell, leaving a substantial gap between the contenders and the rest.

Richmond v Geelong +21
Carlton v Brisbane +22
Gold Coast v North Melbourne +18
Adelaide v GWS +4
Sydney v West Coast +3
Collingwood +25 v Melbourne
 
Ladder based on scoring shots for match outcomes

GWS 9-2
Collingwood 8-3
Geelong 8-3
Brisbane 7-4
Port Adelaide 6-4-1
Richmond 6-4-1
Western Bulldogs 5-4-2
Adelaide 5-5-1
Essendon 5-5-1
Fremantle 5-5-1
Melbourne 5-5-1
St Kilda 5-5-1
Hawthorn 5-6
West Coast 5-6
North Melbourne 4-5-2
Gold Coast 2-8-1
Carlton 2-9
Sydney 1-8-2
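
For anyone wanting to reproduce a ladder like this, a minimal sketch of the idea: re-decide each match on total scoring shots (goals + behinds) rather than points, with equal shot counts counted as a draw. The data format is an assumption.

```python
from collections import defaultdict

def scoring_shot_ladder(matches):
    """matches: iterable of (team_a, goals_a, behinds_a,
                             team_b, goals_b, behinds_b).
    Each match is decided by total scoring shots rather than points."""
    record = defaultdict(lambda: [0, 0, 0])  # [wins, losses, draws]
    for ta, ga, ba, tb, gb, bb in matches:
        shots_a, shots_b = ga + ba, gb + bb
        if shots_a > shots_b:
            record[ta][0] += 1; record[tb][1] += 1
        elif shots_b > shots_a:
            record[tb][0] += 1; record[ta][1] += 1
        else:
            record[ta][2] += 1; record[tb][2] += 1  # shot-count draw
    # Sort by wins (desc), then losses (asc), matching the ladder above
    return sorted(record.items(), key=lambda kv: (-kv[1][0], kv[1][1]))
```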
 


I reckon it's showing the Giants moving right towards the sweet spot of premiership cups. Geelong will move closer once they start beating some good teams again. They've had a month of playing sides outside of the eight.
If we beat Richmond this week (who Squiggle surprisingly still rates), and by at least 4 goals, you'll see Geelong move diagonally to the right.
 


Thank you!
 
Is there a way to find the week-by-week Season predictor? Would love to see how several teams have fallen/risen, etc.

Not for the ladder predictor specifically. You can see how finishing position likelihoods (the colorful bars on the right) change over time in the Tower, and of course how team ratings change in the main chart.

I will add a way to see ladder predictions change as well, sooner or later, although probably first for the Aggregate Ladder, which draws from Squiggle + other models.
 
I do wonder if Squiggle overrates Richmond because it overvalues their scoring shots.

The Tigers seem to have a game style that involves hacking the ball forward and trying to invent a goal once it gets there, which probably results in fewer clean shots at goal than most teams. Squiggle doesn't know this, and assumes that when a team is unusually accurate (or inaccurate), that's just random variation on the day, which shouldn't be expected to continue from week to week.

So when there's a game like last Friday's, where to my eye North Melbourne (15.9 99) completely outclassed Richmond (9.8 62), Squiggle gives the Tigers some credit for not being too far away on scoring shots.

Then again, supposedly Champion Data's Expected Scores take all this stuff into account (pressure of the shot, where it was taken from), and they also say the match was closer than the scoreboard suggested: a 20 pt victory to the North rather than the 37 pts it was in reality.
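
To make the scoring-shot logic concrete, here's a rough back-of-the-envelope version (the ~3.65 points-per-shot figure is an assumed league-average conversion, not Squiggle's actual parameter):

```python
# Value every scoring shot at a league-average rate rather than what it
# actually scored. ~3.65 pts/shot assumes roughly 53% of shots are goals:
# 0.53 * 6 + 0.47 * 1 = 3.65.
AVG_PTS_PER_SHOT = 3.65

def shot_based_score(goals, behinds):
    return (goals + behinds) * AVG_PTS_PER_SHOT

north = shot_based_score(15, 9)     # 24 shots -> ~87.6 "expected" points
richmond = shot_based_score(9, 8)   # 17 shots -> ~62.1
print(round(north - richmond, 1))   # ~25.6, vs the actual 37-point margin
```

On that view the margin shrinks in the same direction Champion Data suggests, because Richmond's shot count was closer than their score was.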
 

Do you think Geelong's chart location is being negatively impacted by their accuracy? They seem to have the opposite style, with slower, higher-percentage entries inside 50 and a high conversion rate on their I50s and shots.

Our accuracy has seemed abnormally high in a lot of games this year, but now that we're halfway through the season, perhaps it's not abnormal at all and we're just accurate.
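
One rough way to test that (a toy sketch with made-up numbers, not anything Squiggle does): treat every scoring shot as an independent league-average attempt and ask how surprising the observed conversion would be.

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of kicking at least
    k goals from n shots if accuracy is really just league average."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical half-season: 300 scoring shots, 180 goals (60%) against an
# assumed ~53% league-average goal rate.
print(f"{binom_tail(300, 180, 0.53):.3f}")  # small value -> accuracy looks real
```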
 
Then Champion Data are full of crapola. If anything the scoreline flattered Richmond. I know I am one-eyed (and blind in that one too), but that was an absolute flogging all over the ground.
 

Classic response - two evidence-based conclusions wiped away by someone who 'watched' his team and decided they were completely dominant.
 
I'm certainly not going to drag out the replay to have a look for myself, but did the game have an abnormally high proportion of shots from general play? Expected score is pretty handy with set shots, because not a lot changes aside from the measured angle and distance, but general-play shots have a lot of other variables it can't account for. That's the only way I can see such a massive difference from reality.
 

