Is there anything more Port Adelaide than finishing top of the ladder with the highest clearance differential, and then spending the next pre-season coming up with a gameplan built around losing the clearances?
So you want Tex Walker elevated to greatness because he has had a good 10 games? Great players aren't great for a couple of weeks; they are great for a sustained period. That's what they are trying to show over 40 games.
You can evaluate players at a point in time and over a career, or in this case, a chunk of a career. 40 games is a decent mix of both to rank a floating 800 players a season.

What do you mean by "greatness"? I usually see that term used to describe a player who has been excellent over a long period of time (e.g. their whole career), but not necessarily at this instant in time.
For example, I would consider Buddy a "great" player, but at this point in time I probably wouldn't rank him as one of the top footballers in the league. Conversely, no, I would not say that Walker is a "great" player, but at this point in time he is arguably the most dominant key forward.
When you are ranking players the intention, as I perceive it, is to assess the quality of one player compared to their peers at this point in time. Not last season, not five years ago, and not over the past ten years. And yes, to some extent you cannot make that evaluation on a small set of games, but equally you cannot make that assessment on games played over a year ago. I don't think a game that Charlie Dixon played 2 years ago should be used to assess his quality now. It is irrelevant.
To illustrate how ridiculous this approach is, a major driver of Dixon going up the rankings is poor games from 2019 dropping out of his assessment window. So, in effect, the system values him more highly simply because we are further away from 2019.
Yes, he should be rated badly.
So DBJ is sh*t at the moment and that's how we should assess him, completely forgetting he was an All Australian last year and a near All Australian the year before? That's goldfish-memory stuff.
Nicks' half-time address must be pure gold.

Adelaide is still yet to win a Q3.
Drew's last month of footy ranks him 21st of 324 players who have played at least 3 games.
I have never paid attention to the AFL ratings before and don't claim to understand how they arrive at their ratings. This I don't understand at all:
View attachment 1169421
So -
1. The top 5 Crows players all rank above every Port player except Boak, our number one ranked player.
2. Amon, who has 21 AFLCA votes, can't get a look in the top 5 above Drew, who has 2 votes.
3. Ben Seedsman is ranked 6 positions above Ollie Wines, who must easily be in the top 5 most likely to win the Brownlow.
Hmmm...
RussellEbertHandball, our slow starts and, more often, slow finishes can be seen in the bottom of the second image:
Q1 - 9th (W, 50%; Pts, 104%)
Q2 - 1st (W, 71%; Pts, 150%)
Q3 - 4th (W, 71%; Pts, 137%)
Q4 - 11th (W, 46%; Pts, 105%)
Oddly enough, if I understood the table correctly, our score distribution is somewhat balanced: 24-28-25-23. The difference for us, then, seems to be more in the defensive side of things.
Is it possible to see the distribution of Pts Against? Or do those numbers consider both for and against, instead?
Have I read the distribution correctly? Is it only on Points For?

Not that website.
Does this site provide what you want? Click on the Quarter Statistics tab on this page (and on any team page):
2021 Port Adelaide Team Page - FinalSiren.com
AFL football statistics on AFL teams, games and players (finalsiren.com)
As I have written before, the ratings punish you heavily if you fu** up, and you get few points for doing the average, expected thing: take an uncontested mark and you get no points, though you do in Dream Team and SuperCoach.
The spectacular things, like kicking 50m or 60m goals or kicking a bag of goals, get you more points than a goal from 15m. Spectacular marks, though, get you the same points as a standard contested mark.
Wines racks up a lot of clangers, kicks mainly, so he gets marked down heavily with negative points, and his kicking has stopped him being graded as an elite player by CD despite an elite rating for hardball gets.
Walker kicks a lot of long goals, so he gets a lot of positive points for those; he doesn't fu** up many kicks, so he cops few negatives, and he doesn't get negatives if he is a bit lazy and doesn't impact a contest.
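To make that pattern concrete, here's a toy sketch. The point values are entirely invented for illustration (they are not Champion Data's actual weightings), but they mirror the pattern described above: hard or spectacular acts score well, routine expected acts score nothing, and errors score negative.

```python
# Toy illustration only -- NOT Champion Data's real formula.
# Invented point values mirroring the pattern described above.
TOY_POINTS = {
    "goal_50m_plus": 8.0,      # long goals rewarded heavily
    "goal_15m": 4.0,           # easy goals worth less
    "contested_mark": 3.0,     # spectacular marks score the same as these
    "uncontested_mark": 0.0,   # the average, expected act earns nothing
    "clanger_kick": -4.0,      # errors punished hard
}

def toy_rating(actions):
    """Sum toy point values over a list of recorded actions."""
    return sum(TOY_POINTS.get(a, 0.0) for a in actions)

# A Wines-type game: big contested numbers, but several clangers.
wines_like = ["contested_mark"] * 3 + ["clanger_kick"] * 4 + ["goal_15m"]
# A Walker-type game: fewer touches, but long goals and no clangers.
walker_like = ["goal_50m_plus"] * 3 + ["uncontested_mark"] * 5

print(toy_rating(wines_like))   # 3*3 - 4*4 + 4 = -3.0
print(toy_rating(walker_like))  # 3*8 + 0 = 24.0
```

Under a scheme like this, the busier player can easily rate below the cleaner one, which is exactly the Wines/Walker effect being described.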
CD feeds this data to the media and the clubs, and they use it as part of their analysis. It also goes to the All Australian selectors, who use it as part of their arguments for the players they are pushing.
It's also why CD does rankings by position: they know that players in similar positions pick up similar positive and negative points. It's also why every year CD comes out with their elite or above-average ratings for each position at each club. Most are reasonable (elite is the top 10% for that position, e.g. key forward, midfielder or general defender), but every year some players get those high ratings and people say WTF, how is that guy considered elite?
It ain't perfect, but it's been a consistent methodology, with the odd tweak, since 2012. I've said many times before that I'm sceptical of the algorithm, but I found the document they published in 2013 earlier this year, and now understand better how it works. It has deficiencies, but any system would, as it's done pretty close to real time.
The fact that you don't hear club analysts publicly bagging the system, and you don't hear head or assistant coaches come out and say it's sh*t and they totally ignore it, tells you that they use it as part of their analysis tools. Tom Mitchell might regularly get 35 disposals, but if 18-20 of them are meaningless one-two handballs, that will come out in these ratings.
In the Collingwood game review thread I had a discussion with Sleezy between posts #346 and #351 and cut and pasted some stuff from that 2013 document. See:
Review - Rd 10 Port vs Collingwood (www.bigfooty.com)
Have a read thru the 2013 document attached to see how defenders pick up pts.

The one glaring omission from the system is the undervaluation of defensive play. It seems to allocate a set number of ranking points to all the members of an offensive chain based on their influence on that offensive chain.
IMO defenders should gain ranking points for interrupting an offensive chain, with the points scaling with the likelihood of an opposition goal being scored through that chain prior to the defender's action.
The way it's set up, it appears that players can lose points for messing up an offensive chain, but defenders don't gain points for defensive play, as the points seem to be based solely on contributions to offensive chains.
At the very least, I'd like to see a defensive player's impact on their opponent's ranking for the game added to their score. E.g. McKenzie kept De Goey (av. 9.5 ranking points) to only 2 ranking points this year, so he should gain 7.5 points for his defensive actions, in addition to his contribution to offensive chains for the game.
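As a sketch, that suggested tweak is just a differential. This is only the poster's proposal, not anything in the actual ratings, and the names and numbers are the McKenzie/De Goey example from above:

```python
# Sketch of the suggested adjustment above, not part of the real AFL ratings.
def defensive_bonus(opponent_avg_rating: float, rating_conceded: float) -> float:
    """Points a defender gains for holding a direct opponent below that
    opponent's average rating (negative if the opponent beat his average)."""
    return opponent_avg_rating - rating_conceded

# De Goey averages 9.5 ranking points; McKenzie held him to 2.
print(defensive_bonus(9.5, 2.0))  # 7.5
```

One design question with this idea is attribution: it only works cleanly when a defender has a single identifiable direct opponent for the whole game.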
Interesting. It seems I'd misunderstood the way points are allocated. Defenders will always be undervalued in that system because of where they play: defensive play that far from the attacking goal carries an inherent disadvantage, given the negative equity of being so close to the defensive goal.
I can see 5 levels:

Record against Top-4 sides (2016-2020) - including Finals [II]
Over that period, we have the 16th-best overall record against Top-4 sides.
View attachment 1172825
Note: those 'W%' in yellow are all above average.
---
This is how our overall record from the period was distributed:
View attachment 1172826
I think this kind of distribution is to be expected, but Port's looks too damn perfect!
Note: For a 22-game season, the equivalent record would vary depending on the double-ups and Port's ladder position. The starting point would be our 10-8 record (which puts us roughly within a 14-8 to 10-12 range).
*Overall, 10-8 (18)
- v. Top-4, 0-4 (4)
- v. 5th-8th, 2-2 (4)
- v. Others, 8-2 (10)
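For what it's worth, the middle of that range is just the 18-game record stretched proportionally to 22 games. This is a back-of-the-envelope calculation only; the real equivalent depends on which sides you double up against, which is why the note above quotes a range rather than one number.

```python
# Back-of-the-envelope: stretch an 18-game record to a 22-game season.
def scale_record(wins: int, losses: int, target_games: int = 22):
    played = wins + losses
    scaled_wins = round(wins * target_games / played)
    return scaled_wins, target_games - scaled_wins

print(scale_record(10, 8))  # (12, 10), the midpoint of the 14-8 to 10-12 range
```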
Giants seem to have totally blown it. Just 2 wins against Top-4 sides indicates they have dropped plenty of points they shouldn't have…

Gee and WCE have played finals all 5 years, Richmond the last 4, GWS the first 4. Sydney have made finals in 3 of the years, like WB and Coll, but they have a good history of being giant-killers.