Explaining level changes

Premier A: University of Bath 4 v University of Bristol Chiefs (Wed 21 Jan 2009)

Match played between Stefan Stanton (home) and Stephen Parsons (away). Match won by Stephen Parsons. Result: 1-9, 7-9, 6-9.

Starting level for Stefan Stanton: 2,467, level confidence: 66%. Starting level for Stephen Parsons: 1,893, level confidence: 67%. Stefan Stanton was expected to win as he is currently playing 30% better than Stephen Parsons.

Stephen Parsons won all of the games and 66% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 41% (English scoring). These are weighted and combined to calculate that Stephen Parsons played 46% better than Stefan Stanton in this match. An upset!

Assuming that any level changes are shared between both players, for this result it looks like Stephen Parsons actually played at a level of 2,608 and Stefan Stanton at a level of 1,791. Without any damping, both players would need to be adjusted by 38% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 32% and 32% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Stephen Parsons becomes +32% and for Stefan Stanton -32%.

After applying standard match damping, the adjustment for Stephen Parsons becomes +12.7% and for Stefan Stanton -12%. Applying the match/event weighting of 75% for 'Mixed Spring 2008/2009', the adjustment for Stephen Parsons is +9.5% and for Stefan Stanton is -8.6%. Limits on the amount of change for a single match, based on player level, level confidence and time since the last match, restrict Stefan Stanton to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played: Stephen Parsons 82%, Stefan Stanton 81%. It is then reduced based on how unexpected the result was: Stephen Parsons 60%, Stefan Stanton 59%.

A final adjustment of +0.2% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Stefan Stanton: 2,355, level confidence: 59%. Final level for Stephen Parsons: 2,078, level confidence: 60%.
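The "playing 30% better" expectation and the implied in-match levels can be reconstructed from the figures quoted above. The sketch below is an illustrative Python reconstruction, not the site's published algorithm: the assumption that the pair's combined (geometric mean) level stays fixed when the performance gap is "shared between both players" is inferred from the 2,608 / 1,791 numbers, and the function names are made up for this example.

import math

def percent_better(level_a, level_b):
    # How much better (as a percentage) level_a is than level_b.
    return (level_a / level_b - 1.0) * 100.0

def implied_levels(start_level_1, start_level_2, winner_played_better_pct):
    # Share the observed performance gap between both players while keeping
    # the pair's combined (geometric mean) level unchanged -- an assumption
    # inferred from the worked figures above, not a published formula.
    mean = math.sqrt(start_level_1 * start_level_2)
    ratio = 1.0 + winner_played_better_pct / 100.0
    return mean * math.sqrt(ratio), mean / math.sqrt(ratio)

# Pre-match expectation: Stanton (2,467) against Parsons (1,893).
print(round(percent_better(2467, 1893)))        # 30 -> Stanton expected to win

# Parsons actually played 46% better in this match.
winner_level, loser_level = implied_levels(2467, 1893, 46)
print(round(winner_level), round(loser_level))  # roughly 2611 and 1788; the quoted
                                                # 2,608 and 1,791 differ only because
                                                # the 46% figure is itself rounded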
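The chain of reductions applied to the raw ±32% adjustment (standard match damping, the 75% event weighting, then the single-match limit) can be sketched as a simple pipeline. This is a hypothetical composition built only from the numbers in the explanation above: the example's damping is evidently per-player (the same 32% becomes +12.7% for one player and -12% for the other), so the damping factors below are back-solved purely for illustration.

import math

def damped_adjustment(raw_pct, match_damping, event_weight, limit_pct):
    # Apply standard match damping, then the match/event weighting, then the
    # per-match limit (which itself depends on level, level confidence and
    # time since last match -- here passed in as a precomputed cap).
    adj = raw_pct * match_damping
    adj *= event_weight
    if abs(adj) > limit_pct:
        adj = math.copysign(limit_pct, adj)
    return adj

# Stefan Stanton: -32% raw, damped and weighted down, then capped by the
# -5% single-match limit for a player who has played within the last 7 days.
print(damped_adjustment(-32.0, 0.375, 0.75, 5.0))            # -5.0

# Stephen Parsons: +32% raw -> about +12.7% after damping -> about +9.5%
# after the 75% event weighting; the +10% limit is not reached.
print(round(damped_adjustment(32.0, 0.397, 0.75, 10.0), 1))  # 9.5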
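Composing the capped adjustment with the +0.2% pool calibration reproduces the final levels to within rounding. The multiplicative composition below is an assumption for illustration, not taken from the site's documentation.

def final_level(start_level, adjustment_pct, calibration_pct=0.2):
    # Scale the starting level by the capped per-match adjustment and then by
    # the automatic pool calibration applied equally to every player in the pool.
    return start_level * (1 + adjustment_pct / 100) * (1 + calibration_pct / 100)

print(round(final_level(1893, +9.5)))   # 2077 -- quoted final level 2,078
print(round(final_level(2467, -5.0)))   # 2348 -- quoted 2,355; the gap suggests the
                                        # quoted -5% cap is itself a rounded figure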
Notes