Explaining level changes

Division 8: Redland Z v BAWA (Thu 14 Jul 2016)

Match played between Robert Burr (home) and Andrew Hardie (away). Match won by Robert Burr. Result: 9-2,9-3,9-2:9-4,9-7,5-9,1-9,10-9:9-0,9-1,9-2:9-1,9-3,9-2.

Starting level for Robert Burr: 1,097, level confidence: 72% (set manually). Starting level for Andrew Hardie: 845, level confidence: 59%. Robert Burr was expected to win as he is currently playing 30% better than Andrew Hardie.

Robert Burr won all of the games and 79% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 97% (English scoring). These are weighted and combined to calculate that Robert Burr played 97% better than Andrew Hardie in this match.

Assuming that any level change is shared between both players, this result suggests that Robert Burr actually played at a level of 1,351 and Andrew Hardie at a level of 686. Without any damping, both players would need to be adjusted by 23% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 20% and 20% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Robert Burr becomes +17% and for Andrew Hardie -20%. After applying standard match damping, the adjustment for Robert Burr becomes +9.2% and for Andrew Hardie -11%. Applying the match/event weighting of 65% for 'Mixed Summer 2016', the adjustment for Robert Burr is +6% and for Andrew Hardie is -7%. Finally, limits on the amount of change allowed for a single match, based on player level, level confidence and time since last match, cap Andrew Hardie at a -5% level change.
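The "shared between both players" step above can be sketched as follows. This is a minimal illustration, not the system's actual code: it assumes the level change is split as an equal multiplicative shift, up for the winner and down for the loser, so that the adjusted level ratio matches how much better the winner actually played (1.97 for "97% better" here). The function name is hypothetical.

```python
import math

def implied_levels(winner_level, loser_level, performance_ratio):
    """Split the surprise in a result equally between both players.

    performance_ratio is how much better the winner actually played,
    e.g. 1.97 for "97% better". Returns the levels each player appears
    to have played at, plus the shared undamped adjustment fraction.
    """
    expected_ratio = winner_level / loser_level
    # Equal multiplicative shift x, applied up to the winner and down
    # to the loser, so that (winner*x)/(loser/x) == performance_ratio.
    x = math.sqrt(performance_ratio / expected_ratio)
    return winner_level * x, loser_level / x, x - 1.0

# This match: levels 1,097 and 845, winner played 97% better.
w, l, adj = implied_levels(1097, 845, 1.97)
print(round(w), round(l), f"{adj:.0%}")  # → 1351 686 23%
```

This reproduces the figures quoted above: implied levels of 1,351 and 686, and a shared 23% adjustment before any damping.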
In general a player's level won't go up by more than 10% or drop by more than 5% if they have played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased for one more match played. Robert Burr: 85%, Andrew Hardie: 77%. It is then reduced based on how unexpected the result is. Robert Burr: 69%, Andrew Hardie: 63%.

A final adjustment of -1.1% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for Robert Burr: 1,167, level confidence: 69%. Final level for Andrew Hardie: 785, level confidence: 63%.
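The weighting-and-limits tail of the pipeline can be sketched like this. It is a simplified stand-in (the helper name and the fixed limits are assumptions; the real rules vary with level, confidence and time since last match): scale the post-damping adjustment by the event weighting, then clamp it to the per-match limits.

```python
def damped_adjustment(raw, event_weight, lower_limit, upper_limit):
    """Scale a raw fractional adjustment by the event weighting, then
    clamp it to the per-match limits (e.g. -5% to +10% for a player
    who has played within the last 7 days)."""
    weighted = raw * event_weight
    return max(lower_limit, min(upper_limit, weighted))

# Andrew Hardie in this match: -11% after damping, 65% event weight,
# limited to no more than a 5% drop.
print(f"{damped_adjustment(-0.11, 0.65, -0.05, 0.10):.1%}")  # → -5.0%
```

With these inputs the weighted adjustment is -7.15% (quoted as -7% above), which the limit then caps at -5%, matching the figure reported for Andrew Hardie.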