Explaining level changes

Premier B: Redland Z v University of Bath 1 (Sun 18 Mar 2018)

Match played between Mike Martin (home) and Aidan O'Brien (away). Match won by Mike Martin. Result: 3-11, 11-7, 11-7, 11-7.

Starting level for Mike Martin: 3,609, level confidence: 76% (level set manually). Starting level for Aidan O'Brien: 6,303, level confidence: 45%. Aidan O'Brien was expected to win, as he was playing 75% better than Mike Martin going into the match.

Mike Martin won 75% of the games and 53% of the points. The games result would be expected if he were around 25% better; the points result would be expected if he were around 13% better (PAR scoring). These are weighted and combined to calculate that Mike Martin played 21% better than Aidan O'Brien in this match. An upset!

Assuming that any level change is shared between both players, for this result it looks like Mike Martin actually played at a level of 5,244 and Aidan O'Brien at a level of 4,338. Without any damping, both players would need to be adjusted by 45% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 33% and 33% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Mike Martin changes to +19% and for Aidan O'Brien to -33%.

After applying standard match damping, the adjustment for Mike Martin becomes +6.8% and for Aidan O'Brien -9.1%. Given Aidan O'Brien's level and the type of match played, an additional damping of 5.3% has been applied to his level change. Looks like he wasn't taking the match too seriously...

Applying the match/event weighting of 75% for 'Mixed Spring 2017/2018', the adjustment for Mike Martin becomes +5.1% and for Aidan O'Brien -6.3%. Limits on the amount of change allowed for a single match, based on player level, level confidence and time since the last match, restrict Aidan O'Brien to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they have played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played: Mike Martin 87%, Aidan O'Brien 67%. It is then reduced based on how unexpected the result was: Mike Martin 60%, Aidan O'Brien 46%.

A final adjustment of -2.3% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Mike Martin: 3,798, level confidence: 60%. Final level for Aidan O'Brien: 5,725, level confidence: 46%.
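To make the arithmetic above easier to follow, here is a minimal Python sketch of the same pipeline. The site's actual formulas (how the games and points results are mapped to percentages, the confidence weighting, the standard and additional damping, and the calibration step) are not published on this page, so the function names, the 2/3 games weighting and the overall structure below are assumptions for illustration only; only the shared-level split, the 75% event weighting and the +10% / -5% limits come directly from the text.

# Illustrative sketch of the level-adjustment steps described above.
# All constants and helper names are assumptions, not the site's formulas.

import math


def combined_performance_ratio(games_ratio, points_ratio, games_weight=2/3):
    # Combine the 'better by games' and 'better by points' ratios on a log
    # scale. The 2/3 weighting is an assumption chosen so that 1.25 and 1.13
    # combine to roughly the 1.21 quoted in the example.
    return math.exp(games_weight * math.log(games_ratio)
                    + (1 - games_weight) * math.log(points_ratio))


def shared_performance_levels(winner_level, loser_level, ratio):
    # Share the implied change between both players while keeping the product
    # of their levels fixed. With levels 3,609 and 6,303 and a ratio of 1.21
    # this gives roughly the 5,244 and 4,338 'played at' levels above.
    winner_played_at = math.sqrt(winner_level * loser_level * ratio)
    return winner_played_at, winner_played_at / ratio


def apply_weighting_and_limit(change, event_weighting=0.75,
                              max_gain=0.10, max_drop=0.05):
    # Scale a fractional level change by the match/event weighting, then cap
    # it at the per-match limits (roughly +10% / -5% if the player has played
    # in the last 7 days; the relaxation for inactivity is omitted here).
    weighted = change * event_weighting
    return max(-max_drop, min(max_gain, weighted))


if __name__ == "__main__":
    ratio = combined_performance_ratio(1.25, 1.13)             # ~1.21
    played_at = shared_performance_levels(3609, 6303, ratio)   # ~(5244, 4338)
    raw_change = played_at[0] / 3609 - 1                        # ~+0.45
    # The allowance for level difference, the confidence weighting and the
    # standard damping (not reproduced here) reduce the winner's +45% to
    # +6.8% before the event weighting and caps are applied.
    final_change = apply_weighting_and_limit(0.068)             # ~+0.051
    print(round(ratio, 3), [round(x) for x in played_at],
          round(raw_change, 3), round(final_change, 3))

Keeping the product of the two levels fixed makes the adjustment the same for both players on a ratio scale (both move by a factor of about 1.45 here), which is one way to realise the "change is shared between both players" assumption stated above.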