2019 team rankings according to Champion Data

Someone posited earlier that a tackle and free kick in the forward fifty is worth more than one in the midfield. CD take this into account.

Basically, they use machine learning to estimate the expected value of the next score, and an event's worth is the change in that expectation from before the event to after it.
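
Roughly, that means an event's rating credit is the shift it causes in the model's expected next score. A minimal sketch of the idea, with a made-up lookup table standing in for Champion Data's actual machine-learned model:

```python
# Toy illustration of "expected next score" event valuation. The zones and
# numbers are invented for the example; the real model is learned from
# historical event data, not a hand-written table.

# Expected points of the NEXT score, from the rated team's point of view,
# keyed by (zone the ball is in, whether the rated team has possession).
EXPECTED_NEXT_SCORE = {
    ("fwd50", True): 4.2,    ("fwd50", False): -0.4,
    ("midfield", True): 1.8, ("midfield", False): -1.8,
    ("def50", True): 0.4,    ("def50", False): -4.2,
}

def event_value(zone: str, had_ball_before: bool, has_ball_after: bool) -> float:
    """Credit for an event = expectation after it minus expectation before."""
    return (EXPECTED_NEXT_SCORE[(zone, has_ball_after)]
            - EXPECTED_NEXT_SCORE[(zone, had_ball_before)])

# A tackle winning a free kick flips possession. In the forward 50 the swing
# is bigger than in the midfield, so the same act earns more credit there:
print(event_value("fwd50", False, True))     # -0.4 -> 4.2 = +4.6
print(event_value("midfield", False, True))  # -1.8 -> 1.8 = +3.6
```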

What do the outliers (Brisbane and WCE, for example) tell you about where deficiencies may currently exist in how the AI is evaluating on-field play?

The positive expectation the AI thinks Brisbane players are accruing isn't translating to the bottom line (wins/losses), with the reverse being true for WCE.

Do small sample sizes and "luck" (bad bounces, bad officiating, etc.) account for these kinds of outliers, or is it more a case of the AI still grappling with an extremely complex game, with something about how these teams' players contribute to winning/losing games not being captured by the system at this point in time?
 
When I saw the Lions play early this year, they were actually doing a lot right. This would be reflected in their stats. They just played like a young team that hadn't learnt to win yet.

WC played a different game style than most clubs this year. A bit like the Hawks in their dominant period. It's very focussed on maintaining possession, and their stats will show a lot of uncontested marks and possessions.

Most clubs are more territory focussed and will have better contested stats, training for and seeking out contests more.

Without breaking down the algorithm, I expect how it rates uncontested possessions might be a strong clue.

I don't think it's silly to predict a big rise for the Lions BTW, they did start winning late last year. Nothing's certain from year to year though.
 

The sample size is an interesting one, because you would naturally think the bigger the better, but the bigger it is, the further back in history you go. The game is evolving quickly, which may make machine-learned data from past games inaccurate for today's game. There are also things that are common but hard to measure, for example a kick to a marking contest where the kicker has deliberately put it to his teammate's advantage. I believe this one is considered too subjective for CD to count.

You would also have to look at the players coming in and out post-2018. Recency bias also plays a part: Dom Sheed was amazing in the grand final, but his worth is calculated off a weighted average of his last 40-odd games.
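
A "weighted 40-odd game average" is easy to sketch: recent games simply count more. The window and half-life below are illustrative guesses, not Champion Data's actual parameters:

```python
def weighted_rating(game_ratings: list[float], window: int = 40,
                    half_life: float = 15.0) -> float:
    """Weighted average of per-game ratings, most recent game first.
    A game's weight halves every `half_life` games back, so recency bias
    is built in deliberately, but one outlier game can't dominate.
    Window and half-life are made-up parameters for illustration."""
    recent = game_ratings[:window]
    weights = [0.5 ** (i / half_life) for i in range(len(recent))]
    return sum(r * w for r, w in zip(recent, weights)) / sum(weights)

# One 30-point grand final on top of an otherwise 12-average season:
print(weighted_rating([30.0] + [12.0] * 39))  # ~13.0, not 30
```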

People look at who won the grand final and then say it is the best list, but outsiders win more frequently than the public imagines. West Coast were paying $50 before the season started. No one thought they had a stellar list then.

Halfway through the season the AFL stopped having WA umpires for West Coast home games. The AFL implied they were getting a helping hand from the umpires. If that's the case, it might make a team look better than they actually are. (For all you West Coast fans: yes, the Eagles did in fact win the only game that counts, which I do know is played at the MCG.)

I think for the machine learning to have a big enough sample size, they use what the average team would expect to score from a given position rather than what this particular team would expect to score. Different teams strategise to make best use of their particular resources, which makes them different from the average team.
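
That pooling choice is a classic sample-size trade-off, easy to see in a sketch. Below, a league-average expectation table is contrasted with per-team tables, assuming a log of (team, zone, points-of-next-score) events; everything here is a hypothetical stand-in, not CD's pipeline:

```python
from collections import defaultdict

def expectation_tables(events):
    """events: iterable of (team, zone, next_score_points) tuples.
    Returns (pooled, per_team): pooled averages over every team's events;
    per_team keeps them separate. Pooled cells have ~18x the sample, so
    they're far less noisy, but they erase genuine style differences:
    a possession side and a territory side get the same baseline."""
    pooled_sum, pooled_n = defaultdict(float), defaultdict(int)
    team_sum, team_n = defaultdict(float), defaultdict(int)
    for team, zone, pts in events:
        pooled_sum[zone] += pts
        pooled_n[zone] += 1
        team_sum[(team, zone)] += pts
        team_n[(team, zone)] += 1
    pooled = {z: pooled_sum[z] / pooled_n[z] for z in pooled_n}
    per_team = {k: team_sum[k] / team_n[k] for k in team_n}
    return pooled, per_team

# Tiny demo with invented events:
pooled, per_team = expectation_tables([
    ("WCE", "fwd50", 6), ("WCE", "fwd50", 0),
    ("RICH", "fwd50", 6), ("RICH", "fwd50", 6),
])
print(pooled["fwd50"])             # 4.5 (league view)
print(per_team[("WCE", "fwd50")])  # 3.0 (team view)
```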

I don't know the answer to the big question though, and for peace of mind as a Freo fan I like to think of 2018 as an unusual Monte Carlo event.
 
2018

1. Sydney Swans – Five elite players (Lance Franklin, Dan Hannebery, Tom Papley, Josh Kennedy and Dane Rampe).
Note: The Swans also have a stunning 12 above average players.

2. Port Adelaide – Six elite players (Robbie Gray, Paddy Ryder, Justin Westhoff, Charlie Dixon, Chad Wingard and Travis Boak).
Note: The Power has 10 above average players, with Jack Watts and Steven Motlop in that category. New recruit Tom Rockliff was rated average last year, largely because of injury.

3. Adelaide – Six elite players (Brodie Smith, Rory Sloane, Rory Laird, Eddie Betts, Tom Lynch and Taylor Walker).
Note: The Crows have five above average players.

4. GWS – Four elite players (Zac Williams, Toby Greene, Jeremy Cameron and Lachie Whitfield).
Note: The Giants have 10 above average players.

5. Melbourne – Four elite players (Jake Lever, Christian Petracca, Jayden Hunt and Tom McDonald).
Note: The Demons have eight above average players including Clayton Oliver, Max Gawn, Jack Viney and Nathan Jones.

6. Geelong – Four elite players (Patrick Dangerfield, Gary Ablett, Sam Menegola and Daniel Menzel).
Note: The Cats have seven above average players.

7. Western Bulldogs – One elite player (Jason Johannisen).
Note: The Dogs have nine above average players and 12 average players.

8. Richmond – Three elite players (Shane Edwards, Dustin Martin and Alex Rance).
Note: The Tigers have nine above average players.

9. Hawthorn – Three elite players (Ben McEvoy, Luke Breust and Cyril Rioli).
Note: The Hawks have seven above average players.

10. Collingwood – Three elite players (Jeremy Howe, Scott Pendlebury and Jack Crisp).

11. Essendon – Two elite players (Anthony McDonald-Tipungwuti and Michael Hurley).
Note: The Bombers have nine above average players.

12. West Coast – Six elite players (Jeremy McGovern, Elliot Yeo, Shannon Hurn, Josh Kennedy, Luke Shuey and Nic Naitanui).
Note: The Eagles have two above average players. Andrew Gaff is listed as an average player.

13. North Melbourne – One elite player (Todd Goldstein).
Note: The Kangaroos have six above average players.

14. St Kilda – One elite player (Jack Sinclair).
Note: The Saints have seven above average players.

15. Gold Coast – Two elite players (Aaron Hall and Tom Lynch).
Note: The Suns have four above average players.

16. Brisbane Lions – Two elite players (Daniel Rich and Dayne Zorko).
Note: The Lions have four above average players, including Luke Hodge.

17. Fremantle – One elite player (Nat Fyfe).
Note: The Dockers have five above average players.

18. Carlton – One elite player (Sam Docherty).

https://www.sen.com.au/news/2018/01/31/champion-data-ranks-your-club's-list-for-2018/
Shows you how much of a joke Champion Data is. Freo apparently had the worst best 22 in the comp last year and finished 14th.
 
Shows the flaws in a stats-based system. As of last year, I don't think it's unfair to rank WCE's defence top 2, their forwards top 2-3, and their midfield 6th at worst; even being extremely pessimistic, that's a bottom-of-the-top-8 side at worst.
 
Essendon only have 2 elite players? What about Merrett, Smith, Shiel, Heppell, Fantasia etc.?

Get stuffed Champion Data lols

I think they have underrated the number of elite players... but you, sir, have vastly overrated. You think Essendon have 7 elite players?
 
Just shows you can win a premiership by getting lucky and doing half the work of other teams
 

Shows the flaws in a stats-based system. As of last year, I don't think it's unfair to rank WCE's defence top 2, their forwards top 2-3, and their midfield 6th at worst; even being extremely pessimistic, that's a bottom-of-the-top-8 side at worst.

You won the flag, why do you care?
 
For those that did not see

Top 5 Key Forwards
1. Lance Franklin
2. Jack Riewoldt
3. Tom Hawkins
4. Jack Gunston
5. Tom McDonald

Top 5 Key Defenders
1. Jeremy McGovern
2. Harris Andrews
3. Alex Rance
4. Majak Daw
5. Jake Lever

Top 5 Rucks
1. Brodie Grundy
2. Max Gawn
3. Nic Naitanui
4. Paddy Ryder
5. Ben McEvoy

https://www.sen.com.au/news/2018/11/29/champion-datas-top-five-key-forwards-defenders-and-rucks/
Gawn is the AA starting ruck yet rated second-best in the comp; good logic there.

Lever missed most of the year and is also named in the top 5 defenders, seems legit... I would have replaced him with May.
 
It isn't an official stat, though I think it should be, but I find that metres gained divided by disposals is a good way to judge players who rack up cheap stats.

I did this a few years ago (a quick sketch of the arithmetic follows the two lists):

Heeney - 328m (20.6 disposals)
Parker - 307m (25.3 disposals)
Kennedy - 305m (28.8 disposals)
Mills - 295m (17.3 disposals)
Rampe - 285m (16.6 disposals)
Jack - 281m (17.7 disposals)
Hannebery - 280m (24.7 disposals)
McVeigh - 266m (18.5 disposals)
Rohan - 255m (9.6 disposals)
Grundy - 253m (16.8 disposals)
Towers - 224m (14.7 disposals)
Papley - 233m (15.6 disposals)
Cunningham - 207m (14.2 disposals)
Melican - 197m (11.9 disposals)
Hewett - 162m (18.7 disposals)

So, in other words, these were the Swans' most damaging players per disposal.

Metres gained per disposal:

Rohan - 25.8m
Franklin - 25.7m
Newman - 19.2m
Jones - 17.9m
Rampe - 17.2m
Mills - 17.1m
Lloyd - 16.7m
Melican - 16.6m
Heeney - 15.9m
Jack - 15.9m
Towers - 15.2m
Grundy - 15.1m
Papley - 14.9m
Cunningham - 14.6m
McVeigh - 14.4m
Parker - 12.1m
Hannebery - 11.3m
Kennedy - 10.6m
Hewett - 8.7m
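
For what it's worth, the second list is just the first divided through, which is easy to verify. A handful of the names above, with the rounded per-game inputs, so small discrepancies are expected:

```python
# Metres per disposal = metres gained per game / disposals per game.
# Per-game figures taken from the first list above (rounded inputs, so
# results can differ slightly from the second list, e.g. Rohan ~26.6
# here vs the 25.8 listed).
per_game = {
    "Heeney": (328, 20.6), "Parker": (307, 25.3), "Kennedy": (305, 28.8),
    "Rohan": (255, 9.6), "Hannebery": (280, 24.7), "Hewett": (162, 18.7),
}
for name, (metres, disposals) in sorted(per_game.items(),
        key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{name}: {metres / disposals:.1f}m per disposal")
# Rohan tops the list on the fewest disposals; whether that is "damage"
# or an artifact of the metric is exactly the argument below.
```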

That's not effective at all.

A bloke who torps it 70m to a 50/50 gets 70m gained even if the other side kills the contest and rebounds for a goal, whereas someone like Josh Kennedy might win the ball in dispute in a contest and handball 5m backwards to the sweeper who sets up a scoring play.

Kennedy is -5m on that metric.

Bloke who torps it 70m won't get the footy without Kennedy winning it to him in the first place.

It's like saying Gary Rohan is a more effective player than Patrick Cripps, which is just plain wrong.
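
In other words, the metric books the metres regardless of what they led to. A two-event caricature of the torp/Kennedy example above, with invented numbers, purely illustrative:

```python
# Raw metres-gained credit vs what actually followed the disposal.
# Both scores are made up to mirror the prose example above.
events = [
    ("torp to a 50/50", 70, -6),     # +70m credited; opposition rebound goal
    ("Kennedy's handball", -5, +6),  # -5m credited; sets up a team goal
]
for name, metres_credit, next_score_for_team in events:
    print(f"{name}: {metres_credit:+}m credited, "
          f"next score {next_score_for_team:+} points for his team")
```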
 
Gawn is the AA starting ruck yet rated second-best in the comp; good logic there.

Lever missed most of the year and is also named in the top 5 defenders, seems legit... I would have replaced him with May.

That doesn't make sense.
Champion Data is statistical analysis.
AA is just seven old blokes' opinion, and there probably hasn't been a closer-run contest than the Gawn v Grundy debate this year.

I don't mind people thinking Grundy was better, they were both ******* sensational and it's a shitty pissing contest to name a best.

Gawn's a better tap ruck and marking player, but Grundy gets more disposals and plays as an extra midfielder with much better disposal by both hand and foot.

Both had unreal seasons though.
 
Shows you how much of a joke Champion Data is. Freo apparently had the worst best 22 in the comp last year and finished 14th.
It's analysing lists going into 2019; Neale is rated extremely highly in their rankings. They have a big problem though: WC haven't lost anyone and are ranked 11th. There's a flaw in their system.
 
It's analysing lists going into 2019; Neale is rated extremely highly in their rankings. They have a big problem though: WC haven't lost anyone and are ranked 11th. There's a flaw in their system.

You need only look at their 2018 predictions to know it's garbage stats.

All they are proving is either:
You can't quantify the unquantifiable
Or
Their markers for performance are completely wrong

Either of the above is true. Or both. But it's at least one.
 
There's no "opinion" in these stats. And they don't suggest taking them as any guide to future performance.
I think the question is whether they mean anything. It all comes down to what stats they choose to prioritise, and from people's comments, the choices don't seem to align that well with reality.

Also, what's important changes all the time, and different styles of list work well for different game plans. For example, Richmond and the Dogs relied on speed and hardness, while West Coast were more traditional and relied on talls marking in defence and forward, and quick, clean ball movement.
 
I assume the majority of posters believe that the ladder will remain the same in 2019 as it was in 2018?

I'm not saying the list looks spot on, but just because x team won the flag or x team hasn't finished bottom 6 in decades doesn't mean their list is best/poor.

Teams overachieve, underachieve.

Anyone who suggests that a team who finishes 5th has the 5th best list is a moron. They might have the 3rd best or the 10th best.
 
Shows you how much of a joke Champion Data is. Freo apparently had the worst best 22 in the comp last year and finished 14th.

14th v 18th, you are splitting hairs there buddy.

Plus I'm sure you would consider Lyon a better coach than Bolton, Richardson and Dew.

Plus Freo have a much bigger home ground advantage than Carlton, St Kilda and GC, who spent the year playing away from home because of the Commonwealth Games.

Champion Data are not taking those last two points into account. Nor, for example, Carlton getting savaged by injuries.

Their system has thrown up some clearly wrong results, but having the 2018 version of Freo as the worst best 22 is not one of the bad ones; they might not have been the absolute worst, but they were pretty close.
 
