That's a joke. How do they publish this shite when West Coast's midfield are clearly the 14th best
The new rankings have them 4th, about right.
That's a joke. How do they publish this shite when West Coast's midfield are clearly the 14th best
Or just indicates a large difference in statistical output for Melbourne players year to year.
Last year they ranked Melbourne's list as number 1.
Twelve months later pretty much the same list is ranked 13th.
That basically sums up the accuracy of CD rankings IMO.
Understand that perfectly, but curious as to what their measures are that have the GWS midfield so low for example.
They are perfectly fine if you understand the context and methodology in which the rating system was created. Unfortunately that goes beyond most people, so it's routinely trashed as useless.
The most common misunderstanding is that these types of ranking systems are inferring something. They are not inferring anything; they are simply a ranking built on a set of measurements and assumptions. Once you accept that they are just a way to present what is happening on the field, along with some assumptions you can agree with or not, it's a lot more digestible.
No statistical rating system is perfect. You have to look into the methodology behind each one, which is why there will always be cases of players or teams that don't fit the eye test. Use them in the context in which they were created, alongside your own eye test, and they can be useful tools.
they had an explanation of their player rating system on the afl site last year, but as the format of the site has turned to s**t I'm struggling to find it now. I'm sure it's still on there somewhere...
Understand that perfectly, but curious as to what their measures are that have the GWS midfield so low for example.
they had an explanation of their player rating system on the afl site last year, but as the format of the site has turned to s**t I'm struggling to find it now. I'm sure it's still on there somewhere...
Is it possible that the name power or reputation of some teams' players exceeds their actual output on the field?
If they just listed the teams in order of 2019 ladder position it would make more sense. Their data tells us less than simply looking at what happened last year.
If I just cut and pasted last year's ladder and said "this is my ranking of the lists", it would be no less sensible than CD's "data-driven" account. So what are they adding?
I always find their data in isolation is fine. One can view the data and make up their own mind.
Their data when used by CD to interpret outcomes is shite at times.
That's a lot of fence-sitting waffle.
But their model doesn't present what is happening on the field. Geelong had the best defence last year, and the numbers clearly back it up: everybody struggled to score against them. When essentially the same defence rates as 16th, you can defend your model all you want, but the model is at best poor, and I am leaning towards completely useless, as it's misleading and demonstrably wrong.
They need to tweak the context and methodology, otherwise putting out this information just reflects poorly on them. It seems to me they are just aggregating player ratings, whereas they probably need to add more data to the equation so they don't look like idiots.
Predicting the future is a difficult beast, but if you are using stats as the basis for your predictions, at least use a methodology that is arguable.
Mmm I kind of agree, but I think you're being a bit generous to CD.
More often than not the articles you see like this are from some journo with no background in data analysis, and they strip all context from it. CD are better than people give them credit for.
https://www.zerohanger.com/champion-data-ranks-every-clubs-list-ahead-of-2019-season-26529/
It's somewhere in the middle. Didn't Champion Data have Melbourne rated the second best list last year and West Coast like ninth?
Or was it Melbourne with the second best midfield? Can't remember; either way, I know they finished second last.
Or just indicates a large difference in statistical output for Melbourne players year to year.
Statistical output vs actual performance are two different things though. It's the stats vs watching football test.
they had an explanation of their player rating system on the afl site last year, but as the format of the site has turned to s**t I'm struggling to find it now. I'm sure it's still on there somewhere...
Is it possible that the name power or reputation of some teams' players exceeds their actual output on the field?
| Player 1 (avg) | Statistic | Player 2 (avg) |
|---|---|---|
| 19.2 | Kicks | 13.9 |
| 10.6 | Handballs | 21.8 |
| 29.8 | Disposals | 35.8 |
| 4.1 | Marks | 5.3 |
| 1.5 | Goals | 0.5 |
| 1.2 | Behinds | 0.5 |
| 3.5 | Tackles | 6.5 |
| 0 | Hitouts | 0 |
| 6.0 | Inside 50s | 3.5 |
| 1.2 | Goal Assists | 0.4 |
| 1.5 | Frees For | 1.4 |
| 1.8 | Frees Against | 1.6 |
| 14.5 | Contested Possessions | 14.8 |
| 14.4 | Uncontested Possessions | 21.5 |
| 19.3 | Effective Disposals | 26.1 |
| 64.8% | Disposal Efficiency % | 72.9% |
| 5.0 | Clangers | 4.5 |
| 1.0 | Contested Marks | 0.1 |
| 1.0 | Marks Inside 50 | 0.3 |
| 6.4 | Clearances | 6.3 |
| 0.9 | Rebound 50s | 2.1 |
| 1.0 | One Percenters | 0.9 |
| 1.0 | Bounces | 0.3 |
| 85.2 | Time On Ground % | 87.5 |
| 3.4 | Centre Clearances | 2.3 |
| 3.0 | Stoppage Clearances | 4.0 |
| 9.0 | Score Involvements | 6.9 |
| 482.9 | Metres Gained | 308.4 |
| 6.2 | Turnovers | 4.8 |
Your humble opinion?
A few weird ones. North and Hawthorn rated a bit too highly for mine, but that's my opinion.
Top 5 for goals over the last decade and top 25 for goal assists, ok. There should be plenty more players in there and plenty not in it, but trust BF to pick on Tex.
The fact Taylor Walker is listed as an "elite" player tells you everything you need to know about the list.
The West Australian ran an article yesterday claiming Champion Data ranked their pin-up boy Nic Naitanui as the best player in the AFL.
I wouldn't buy that toe rag if you paid me all the oil in Saudi Arabia, so I'm a bit sketchy on the details, but has anyone seen the data The West is referencing? To be honest I wouldn't put it past CD to conclude that an injury-prone ruckman who hasn't made an All-Australian side since 2012 is the league's best player, but then again I definitely wouldn't put it past The West to 'interpret' the data that way so they could manufacture yet another Nic Nat headline.
Either way it seems a bit baffling to me.