Explaining level changes

Division 5: Workout Harbourside v Weston Shankers (Wed 22 Mar 2017)

Match played between Stephen Williams (home) and Harvey Lane (away). Match won by Harvey Lane. Result: 3-9, 9-3, 0-9, 5-9.

Starting level for Stephen Williams: 1,147, level confidence: 81%. Set manually.
Starting level for Harvey Lane: 1,136, level confidence: 81%. Set manually.

Prediction: Stephen Williams to win, as he is currently playing 1% better than Harvey Lane.

Harvey Lane won 75% of the games and 64% of the points. A games result like this would be expected if he were better by around 25%, and a points result like this if he were better by around 35% (English scoring). These two figures are weighted and combined to calculate that Harvey Lane played 28% better than Stephen Williams in this match.

Assuming that any level change is shared between both players, this result suggests that Harvey Lane actually played at a level of 1,293 and Stephen Williams at a level of 1,008. Without any damping, both players would need to be adjusted by 14% to match this result.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Harvey Lane is +14% and for Stephen Williams is -14%.

After applying standard match damping, the adjustment for Harvey Lane becomes +7.8% and for Stephen Williams becomes -7.8%.

Applying the match/event weighting of 75% for 'Mixed Spring 2016/2017', the adjustment for Harvey Lane becomes +5.8% and for Stephen Williams becomes -5.7%.

Limits on the amount of change for a single match, based on player level, level confidence and time since last match, restrict Stephen Williams to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played (Harvey Lane: 90%, Stephen Williams: 90%), then reduced based on how unexpected the result was (Harvey Lane: 79%, Stephen Williams: 79%).

A final adjustment of +0.9% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Stephen Williams: 1,114, level confidence: 79%.
Final level for Harvey Lane: 1,200, level confidence: 79%.
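The steps above can be read as a small pipeline: combine the games and points advantages into a single "played X% better" figure, work out the raw adjustment shared between the two players, then damp, weight and cap it. The Python sketch below walks through that pipeline using this match's numbers. The function names (combined_advantage, adjust_levels) and all constants (the 0.7 games weight, the 0.56 damping factor, the caps) are illustrative assumptions chosen to roughly reproduce the figures quoted above; the site does not publish its exact formulas, and the confidence updates and pool calibration steps are omitted.

```python
def combined_advantage(games_adv: float, points_adv: float,
                       games_weight: float = 0.7) -> float:
    """Combine the advantage implied by games won with the advantage implied
    by points won. A games weight of 0.7 reproduces the 25%/35% -> 28%
    combination in this match, but the real weighting is an assumption."""
    return games_weight * games_adv + (1.0 - games_weight) * points_adv


def adjust_levels(winner_level: float, loser_level: float,
                  played_better: float,        # e.g. 0.28 for "played 28% better"
                  damping: float = 0.56,       # standard match damping (assumed)
                  event_weight: float = 0.75,  # 75% for 'Mixed Spring 2016/2017'
                  max_up: float = 0.10,        # usual per-match limits for players
                  max_down: float = 0.05):     # who have played in the last 7 days
    """Share the implied change between both players, then damp, weight
    and cap it, mirroring the steps described in the explanation."""
    current_gap = winner_level / loser_level   # 1136 / 1147, about 0.99
    implied_gap = 1.0 + played_better          # 1.28
    # With equal level confidence (81% each) the change is shared equally,
    # so each player moves by the square root of the required gap correction:
    # about +/-14% here, matching the implied levels of 1,293 and 1,008.
    raw = (implied_gap / current_gap) ** 0.5 - 1.0
    change = raw * damping * event_weight      # roughly 5.8% after damping/weighting
    winner_change = min(change, max_up)
    loser_change = min(change, max_down)       # Stephen Williams capped at -5%
    return winner_level * (1.0 + winner_change), loser_level * (1.0 - loser_change)


if __name__ == "__main__":
    advantage = combined_advantage(0.25, 0.35)           # -> 0.28
    new_winner, new_loser = adjust_levels(1136, 1147, advantage)
    print(round(new_winner), round(new_loser))           # before confidence/calibration steps
```

Sharing the change geometrically, with each player moving by the square root of the required gap correction, is what makes a 28% performance gap correspond to roughly equal and opposite 14% adjustments before damping.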
Notes