Champion Data’s List & Player Ratings - An Accurate Tool or Total Fabrication?

They are perfectly fine if you understand the context and methodology in which the rating system was created. Unfortunately that goes beyond most people, so it's routinely trashed as useless.

The most common misunderstanding is that these types of ranking systems are inferring something. They are not inferring anything; it's simply a ranking system built on a set of measurements and assumptions. Once you accept that they are just a way to present what is happening on the field, along with some assumptions - which you can agree with or not - it's a lot more digestible.

No statistical rating system is perfect. You have to look into each one to see the methodology behind it, which is why there will always be cases of players or teams that don't fit the eye test. Use them in the context in which they were created, alongside your own eye test, and they can be useful tools.
 
Understand that perfectly, but curious as to what their measures are that have the GWS midfield so low for example.
 
they had an explanation of their player rating system on the afl site last year, but as the format of the site has turned to s**t I'm struggling to find it now. I'm sure it's still on there somewhere...

Is it possible that the name power or reputation of some teams' players exceeds their actual output on the field?
 

living_in_syd


The AFL app is hopeless, it keeps crashing. So bad
 

nineteen eighty

If they just listed the teams in order of 2019 ladder position it would make more sense. Their data tells us less than simply looking at what happened last year.

I always find their data in isolation is fine. One can view the data and make up their own mind.

Their data, when used by CD to interpret outcomes, is shite at times.
 
If I just cut and pasted last year's ladder and said "this is my ranking of the lists", it would be no less sensible than CD's "data-driven" account. So what are they adding?

It's not clear that adding extra layers of data reveals anything useful, accurate or insightful.
 
That's a lot of fence-sitting waffle.

The statistics are used to express a hierarchy which doesn't pass the smell test.
 

Yojimbo

Midfields (Starting Six) x 2019 Brownlow Votes: Expressed in ladder format.

West Coast: 69 Votes
Western Bulldogs: 66 Votes
Collingwood: 62 Votes
Brisbane: 57 Votes
GWS Giants: 50 Votes
Carlton: 47 Votes
Fremantle: 45 Votes
Geelong: 43 Votes

Richmond: 42 Votes
North Melbourne: 39 Votes
Essendon: 37 Votes
Melbourne: 34 Votes
Adelaide: 33 Votes
Port Adelaide: 33 Votes
Hawthorn: 31 Votes
St Kilda: 30 Votes
Sydney: 29 Votes
Gold Coast: 17 Votes

Now obviously Fremantle with (45) is skewed slightly, with Fyfe (33) and Walters (11) making up almost all
of the registered votes. Plus it was up to me to determine each team's starting midfield six, which leaves the door
ajar for some statistical manipulation, to say the very least. My point is all about CONTEXT: switch
Tim Kelly's votes back to Geelong and boom, up to the pointy end we go. Life is CONTEXT.
 

seanoff


but their model doesn’t present what is happening on the field. Geelong had the best defence last year, the numbers clearly back it up, everybody struggled to score against them. When essentially the same defence rates 16th, you can defend your model all you want, but the model is at best poor, and I am leaning towards completely useless as it’s misleading and demonstrably wrong.

they need to tweak the context and methodology, otherwise putting out this information just reflects poorly on them. It seems to me they are just aggregating player ratings, whereas they probably need to add more data to the equation so they don’t look like idiots.

predicting the future is a difficult beast, but if you are using stats as the basis for your predictions, at least use a methodology that is arguable.
 

More often than not the articles you see like this are from some journo with no background in data analysis, and they strip all context from it. CD are better than people give them credit for.
 

Final Siren

Mmm I kind of agree, but I think you're being a bit generous to CD.

For one thing, a lot of their numbers are definitely more than simple tallies of stats. For example, "Pressure Rating" is based on a model of what CD think pressure is. "Expected Score" is from a CD model of what they think a typical team would have scored from the same opportunities. Except for very basic stats, CD numbers carry with them a subjective opinion about what they think good football is.

Another thing is that they resist publishing clear predictions, but must know that each year they produce a list like this, a bunch of articles will be written by the same journalists they work closely with all year, like "CD predict a rise for the Bulldogs." And the general public naturally has an appetite for predictions made by the AFL's official stats company. So I do feel like they have their cake and eat it too, to a degree, by facilitating a bunch of predictions, but not being held to them.
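To illustrate the "model, not tally" point above: an "Expected Score" style metric is usually built by valuing each shot at what a typical team converts from that situation. A toy sketch - the probabilities below are invented for illustration and are not CD's actual model:

```python
# Toy "expected score": value each shot at the league-average points a
# typical team would get from its situation bucket. Every probability
# here is made up for illustration; CD's actual buckets are proprietary.
GOAL, BEHIND = 6, 1

def expected_points(p_goal, p_behind):
    """League-average points for a shot with these conversion odds."""
    return GOAL * p_goal + BEHIND * p_behind

# (p_goal, p_behind) for three hypothetical shots:
# a set shot from the goal square, a snap under pressure, a long bomb.
shots = [(0.85, 0.12), (0.40, 0.35), (0.15, 0.30)]
x_score = sum(expected_points(pg, pb) for pg, pb in shots)
print(round(x_score, 2))  # 9.17
```

The subjectivity lives in the buckets and probabilities, not the arithmetic - which is exactly why two modellers can disagree about what "good football" the number rewards.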
 

Final Siren

Also just FYI "infer" is when you use what you know to figure out something else - like using handball and tackle numbers to calculate a pressure rating. Which is indeed what CD do. You mean "imply," which is when you suggest more information than is immediately apparent.
 

seanoff

yeah, but I do, and I don’t need a journo to tell me that a basically unchanged defence that was the best in the AFL last year and the year before is rated 16th now. That one rating is enough to call the methodology into question. Two years is not a statistical anomaly, or even a trend; it's the truth. And they were better in 2019 than 2018.

ranking Geelong 16th is a gross error that a grade 4 kid could spot. Someone should be looking at that going, yeah, nah, try again, that’s obviously incorrect.

their stats are good, but conclusions that are discordant with the actual results must lead to questions about what conclusions they are drawing, and why, from all the raw data they have.
 

Final Siren

Statistical output and actual performance are two different things, though. It's the stats vs watching-football test.
There are plenty of players who are statistically better relative to others; it's often not the best measure.

Below is Dusty's 2017 season vs Mitchell's 2017 season. Statistically you could make a case that Mitchell's numbers look better: more disposals, better efficiency, fewer turnovers, more tackles, that all looks pretty good. Watching football tells you Dusty was the better player on the field in 2017.



Stat (per game)           Martin    Mitchell
Kicks                       19.2      13.9
Handballs                   10.6      21.8
Disposals                   29.8      35.8
Marks                        4.1       5.3
Goals                        1.5       0.5
Behinds                      1.2       0.5
Tackles                      3.5       6.5
Hitouts                      0         0
Inside 50s                   6.0       3.5
Goal Assists                 1.2       0.4
Frees For                    1.5       1.4
Frees Against                1.8       1.6
Contested Possessions       14.5      14.8
Uncontested Possessions     14.4      21.5
Effective Disposals         19.3      26.1
Disposal Efficiency %       64.8%     72.9%
Clangers                     5.0       4.5
Contested Marks              1.0       0.1
Marks Inside 50              1.0       0.3
Clearances                   6.4       6.3
Rebound 50s                  0.9       2.1
One Percenters               1.0       0.9
Bounces                      1.0       0.3
Time On Ground %            85.2      87.5
Centre Clearances            3.4       2.3
Stoppage Clearances          3.0       4.0
Score Involvements           9.0       6.9
Metres Gained              482.9     308.4
Turnovers                    6.2       4.8
 
The West Australian ran an article yesterday claiming Champion Data ranked their pin-up boy Nic Naitanui as the best player in the AFL.

I wouldn’t buy that toe rag if you paid me all the oil in Saudi Arabia, so I’m a bit sketchy on the details, but has anyone seen the data The West is referencing? To be honest I wouldn’t put it past CD to conclude that an injury-prone ruckman who hasn’t made an AA since 2012 is the league’s best player, but then again I definitely wouldn’t put it past The West to ‘interpret’ the data that way so they could manufacture yet another Nic Nat headline.

Either way it seems a bit baffling to me.
 
I don’t even know how this whole Champion Data ranking system works or what criteria they use, but in 2019 Hawthorn ranked 1st in contested marks, 1st in intercept marks and 3rd in points conceded. Yet the Hawthorn back six is ranked 11th in the competition? That makes a lot of sense.
 

Ishmael_


It's probably based on time on ground vs impact. Naitanui's immense in the centre square, wins the majority of his taps to advantage and follows up his work with ground-ball gets, both of which generally lead to deep forward-50 entries and scores. And he generally starts in the centre, where he does his best work, and is then rotated off for the around-the-ground stuff, which would skew his score.

They did this last year: if you extrapolate every player's data to 100% game time, Nic Nat comes out on top.

Obviously this is purely a thought experiment, because Naitanui's never going to play a full game and his impact would very likely go down if he did, but I imagine that's the rationale behind it.
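The extrapolation described above is presumably just a linear scale-up of per-game output to a full game. A sketch with invented ratings (not CD's actual numbers) showing how a low-minutes player can leapfrog a higher raw rating:

```python
def scaled_rating(rating, tog_pct):
    """Linearly extrapolate a per-game rating to 100% time on ground.

    Assumes output scales linearly with game time - unrealistic for a
    ruckman like Naitanui, as the post notes, but it's the obvious way
    to do the "100% game time" thought experiment.
    """
    return rating * 100.0 / tog_pct

# Invented numbers for illustration only.
print(scaled_rating(15.0, 60.0))  # 25.0 - big rating on limited minutes
print(scaled_rating(20.0, 90.0))  # about 22.2 - higher raw rating loses
```

So under this scheme a player rated 15 in 60% game time "beats" one rated 20 in 90%, which is exactly the kind of result that makes the headline while burying the assumption.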
 