Review Autopsy vs Freo


Look at Macrae's rating and then come chat to me mate

You should be asking why he got the rating he did and letting that help inform your opinion on his impact, instead of the other way around. He was ineffectual, and from what I've heard of Jack, he'd be the first to admit he was below his best.
 
Oh please mate

It tells me Adams was ok and Suckling and JJ about par with Cordy?

It constantly rates Moz low

Go and speak to AFL coaches about champion data and ask what they think of its accuracy.

CD is as useful as SuperCoach points to rate players and their effect on the games

We all love Bont but best player on the ground?? Most influential to the outcome? NO

Wait a minute.... So you watch football and then look at CD to help you decide who played well and who didn't? Ummmm ok
 


No disrespect to you MD, but I find it's those that criticise the player rankings the most who know the least about them. They're incredibly in-depth and certainly a valuable indicator of a player's overall positive impact for their team.
In the main, I agree with you, but Macrae is way, way too underrated by CD's measurements.
 

Too right MD - thought I'd fubar-ed majorly picking Bont as Capt in SuperCoach and just about fell off my chair when I saw he top-scored for us
 
True, in fact the majority of those pure defenders are not respected nearly enough.

So then how can CD be relevant??

I had a chat to Monty and Hansen last year at a league coaching review, and they both commented that CD is misleading and totally unusable to clubs, as it misses or overweights many stats that clubs use as key performance indicators.
 
Coaches Votes

10 S Hill
6 Fyfe
4 Hamling
3 B Hill
3 Mundy
3 J J
1 Hunter
My guess (there are quite a few other viable permutations):
LYON
5 - S Hill
4 - Fyfe
3 - B Hill or Mundy
2 - B Hill or Mundy
1 - B Hill or Mundy

BEVERIDGE
5 - S Hill
4 - Hamling (ex-WB)
3 - JJ (WB)
2 - Fyfe
1 - Hunter (WB)
 
So then how can CD be relevant??
As a way of rating players, they're not.

They're only really useful if you're interested in seeing how players fare when you rate them a certain way. They shouldn't be taken as the definitive player ratings but they don't pretend to be anyway.

Clubs don't use them and people shouldn't use them to guide their opinions. Some people just find it interesting to look at how an equity model rates players.
 
I'm not going to go through every post to quote them so I'll do a general reply here.

Champion Data recording specific stats and producing overall rankings are two different things.

Champion Data ranking points, also used for SuperCoach, were created in the late 1990s and have had only minor tweaks since. They are a "box score" based statistic: they measure things like contested vs uncontested possessions, disposal efficiency and intercept possessions, and weight them in such a way that you can compare and contrast across games and years.
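
For anyone curious what "weighted box-score statistic" means in practice, here's a minimal sketch. The stat categories and weights below are entirely made up for illustration - Champion Data's actual formula and weights are proprietary - but the structure (a weighted sum of event counts) is the point:

```python
# Hypothetical sketch of a box-score ranking-points formula.
# These weights are invented purely to illustrate the weighted-sum
# structure; they are NOT Champion Data's real (proprietary) values.
WEIGHTS = {
    "contested_possession": 4.5,
    "uncontested_possession": 1.5,
    "effective_disposal": 2.0,
    "ineffective_disposal": -1.0,
    "intercept_possession": 3.0,
}

def ranking_points(stat_line: dict) -> float:
    """Weighted sum of one player's box-score counts for one game."""
    return sum(WEIGHTS[stat] * count for stat, count in stat_line.items())

# A pretend stat line for a single game:
game = {
    "contested_possession": 12,
    "uncontested_possession": 18,
    "effective_disposal": 22,
    "ineffective_disposal": 8,
    "intercept_possession": 4,
}
print(ranking_points(game))  # → 129.0
```

Because every game is scored by the same fixed weights, numbers from different games and seasons are directly comparable - which is all this kind of statistic claims to offer.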

Nobody's claiming it's more than that.

AFL Player Ratings points were developed over a much longer period. About 15 years ago the concept of "equity" was created to quantify the advantage of possession at different locations on the ground, and to compare and contrast exactly how far up the ground it is worthwhile to concede possession. Once Champion Data started measuring pressure in 2010, they could convert this to an individual level, and after a couple of years of research it turned into AFL Player Ratings points, which measure how each player's acts change the equity state of the game, accumulated across every game.
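
A toy sketch of that equity idea: an equity function estimates the expected net points for the team in possession given field position, and a player is credited (or debited) with the change in equity their act produces. The zone table and values below are invented for illustration, not Champion Data's actual model:

```python
# Toy illustration of an equity-based rating.
# EQUITY maps a game state (here, just a field zone for the team in
# possession) to expected net points. Values are made up.
EQUITY = {
    "defensive_50": -1.0,
    "midfield": 0.5,
    "forward_50": 2.5,
    "goal": 6.0,
}

def rating_delta(before: str, after: str) -> float:
    """Equity change credited to the player whose act moved the game
    from `before` to `after` (negative if the act lost ground)."""
    return EQUITY[after] - EQUITY[before]

# A kick from midfield that hits a target inside forward 50:
print(rating_delta("midfield", "forward_50"))  # → 2.0
# A turnover that hands the ball back from forward 50 to midfield:
print(rating_delta("forward_50", "midfield"))  # → -2.0
```

A player's rating for the game is then just the sum of these deltas over every act they're involved in - which is why the metric rewards damaging ball use rather than raw possession counts.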

The funny thing is, neither of those all-in-one, catch-all statistics was ever developed to be used by clubs for ranking, list management or coaching purposes.

Ranking points were initially created for the newspapers in the late '90s, with Ted Hopkins writing an article on them.

Player Ratings points were created in part because AFL Media wanted their own statistic for their website, and in part because their creator at CD was researching new ways to measure player performance as part of a PhD.

By and large, neither was created to help teams win games of football as a list management or coaching tool.

These stats miss a lot of the nuance in differences between team styles or list-build priorities.

Clubs use stats heavily, but they use them in the context of their own video and tactical priorities. Once you understand a number in the context of your own game plan, it can be very useful, because the stats ignore human biases and watch and record every game equally.

Claiming things like "it doesn't measure defence" is kind of ridiculous, because no creator or user of these statistics claims otherwise. Defence is inherently about preventing things, like a goal being kicked, from ever happening in the first place. So how can you attribute the fact that something didn't happen to Morris?

Individual statistics are useful in the right context, with an understanding of how they relate to what we are trying to achieve - nobody outside the club knows what our benchmarks are with certain stats, or for that matter whether we even have benchmarks in some stats.

The future of stats and analytics is GPS positional data, but that's a post for another time.
 

Exactly, they're a useful tool for comparing apples with apples. Anyone with half a brain could see that there is literally no statistical method for comparing a Dale Morris to a Bont. Criticising a metric because of that is so laughably moronic.
 
Hahahha add another lemming
 



So they are pretty much useless you agree

Cheers
 

Funny, because some people keep jamming them down your throat when you suggest a player isn't that good, and when the same player has a bad day (often) they say CD points are relative to each game and he had a different role.

Looking at the CD points from our game on the weekend, they show you in one sample how unreliable and basically useless they are. No one that watched that game could agree that the players' influence on the game is close to the CD rankings.

It's a stat used by those attempting to look intelligent in their own analysis, and they throw it at you as "proof" of their opinions.

Any of us can sit down and lay out long-winded, analytical, stats-based theories and then use CD to back up said theory in an attempt to look quasi-intellectual and belittle others we disagree with. But those of us who can watch footy with our own eyes, and don't need a laptop to tell us who played well, see through that utter twaddle and can make our own assessment of who played well and who played badly.
 
So they are pretty much useless you agree

Cheers
No, because they're the best tool we have in the absence of watching every minute of every game, understanding the context of the stats, and having the wide range of stats available to clubs and the media.

There's a difference between arguing dumb stuff like "were Bontempelli's first 50 games better than Pendlebury's first 50" (an actual argument I was having on the main board) and wider list management or tactical, on-field strategy uses. In the absence of watching all 100 games and trying to quantify things like which team played more stoppage-heavy footy, giving its midfielders more opportunity to generate points (which CD ranking points partially adjust for), I can instead point to their ranking point averages and say that Bontempelli being 5 points ahead on average means he almost certainly had the better 50 games. That's a fundamentally different usage of stats. We understand certain players produce points well because of the tactical scheme they're part of, and that certain elements of value to one team are less valuable to another. For example, we rotate midfielders between the midfield and the forward line more than other clubs, so players like (say) Andrew Swallow are less valuable to us because they can't play forward effectively. Nobody's saying ranking points can quantify that (though there are some other stats and data analysis research that can help a little bit).

I don't think you're making that distinction and you're painting everything with the same brush.
 

I was actually referencing the initial poster, who was attempting to tell me that Macrae played poorly due to his CD score.

As in, his score is an absolute indicator of how well or badly he played, and his influence on the game must be minimal if he has a low score.

As we both know, many acts that are very important to the outcome are unmeasurable (i.e. spoils, positioning, pressure), and that means defensive players are always lower on the "scale".

All statistical instruments have their uses but, as some contend, they are not the be-all and end-all. Anyone who looks at CD and takes it as a reliable rating for each and every player on the day has no idea.

I understand the way YOU use CD, and although I disagree with it I can see the theory behind it. But some who have not a clue how it should be used correctly, and then trot it out as "proof" someone played well, only prove they are unable to watch football and form their own opinions.
 
But once you account for the advantages and disadvantages, it's a tool among others. It's a straw man argument you're proposing here, claiming that stats are held up as the answer to everything when nobody's claiming that.

Likewise with Macrae: nobody's saying he played a poor game on his score alone, more that, given he scored poorly, it's worth investigating and taking a second glance at how and why he scored poorly, once we understand the measurement system and the logic behind the number.

It's like Bontempelli scoring highly, like being their Norm Smith medallist, because his kicking efficiency was 100%. Watching the game live, we might not notice the difference between him shanking 2 or 3 kicks and him not shanking one at all. It's only when you look at the game on review that you say "s**t, yeah, maybe he did play an excellent game given all his kicks were effective, and it was a cognitive flaw not to notice the true value of 100% kicking efficiency as the game was unfolding".

It's the same principle but in reverse for Macrae.
 

Noooooo, he intimated that Macrae had a bad game and CD proved this.

That was my initial point and still is.

I watched the replay; Macrae did not play badly, and on the CD "scale" there is no way he was that low.

You can watch that game 100 times and Adams would still have a sh!t day, but by CD he didn't.

As I've stated, I think the whole thing is a skid mark and totally irrelevant, but if some disagree that's fine. Just please don't tell me it's a definite indicator of a good or bad game.

Mate, AFL coaches think it's misleading and irrelevant. I think I'd back them before the general public on statistical tools.
 
Or it could be that CD enabled him to reflect more directly on Macrae's game, which upon consideration was poor, with CD being one element of that consideration.

Coaches think they're crap because ultimately they're interested in structures and tactics, and these stats don't answer how effectively players are adhering to those structures and tactics.

That doesn't mean for our purposes they're crap.

Unless you personally know what Bevo's pre-game instructions were to every single player, they're a handy tool for avoiding the very human elements of watching a game, like misreading events or simply being unable to take in the entire game at once.

Given that in the match-day thread you said you only noticed 3 of Jong's 15 disposals to half time, you are going down this line of argument. Nobody's claiming that Jong played a better game than another player just because he had more touches; there are factors like decision making that statistics can't measure. It's simply that humans find it inherently difficult to follow 22 players at once in a fast-paced game, while the data notices and collates every disposal from every player without human flaws like distraction or an inability to closely monitor a fast-paced game.

It's simplifying it for the Macrae argument, but the principle remains the same.
 
Mate, you seem a reasonable bloke, but if you don't think his increased midfield time is a factor in our stats falling, I'm amazed. I've never said he is wholly to blame; they are all down, but when you put an inferior player in the midfield it affects everything.

Let's pretend it's not Jong and we put Jason Tutt in there. Stats would drop and so would effectiveness, and that would trickle down to affect the other players, as their work rate would then need to cover the lesser player.

BUT in Jong's defence, the others have been down, so he is now doing more than he is capable of, and that is affecting the side of the game he is good at.

He has been far from our worst player and has played two exceedingly good quarters for a player of his level, but to say he, and by extension any of the midfielders, is playing well is wrong.

They have been sh!te and a major factor in why we haven't been great, and like it or not, Jong is and has to be partly to blame for that.

Great post MD!!!
 
