Explaining level changes

Division 6: Chew Valley Winter B v BAWA (Wed 29 Nov 2017)

Match played between Sam Hume (home) and Stephen Green (away). Match won by Sam Hume. Result: 9-6, 9-3, 9-7.

Starting level for Sam Hume: 402, level confidence: 29%. Starting level for Stephen Green: 923, level confidence: 59%. Stephen Green was expected to win as he was currently playing 130% better than Sam Hume.

Sam Hume won all of the games and 63% of the points. This games result would be expected if he were better by around 55% or more; this points result would be expected if he were better by around 32% (English scoring). These are weighted and combined to calculate that Sam Hume played 40% better than Stephen Green in this match. An upset!

Assuming that any level change is shared between both players, this result suggests that Sam Hume actually played at a level of 720 and Stephen Green at a level of 516. Without any damping, both players would need to be adjusted by 79% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 47% and 47% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Sam Hume changes to +47% and for Stephen Green to -19%. After applying standard match damping, the adjustment for Sam Hume becomes +23% and for Stephen Green -10.7%. Applying the match/event weighting of 75% for 'Mixed Autumn 2017/2018', the adjustment for Sam Hume becomes +17% and for Stephen Green -7.8%. Finally, limits on the amount of change allowed for a single match, based on player level, level confidence and time since last match, cap Sam Hume at +10% and Stephen Green at -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.
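The damping, weighting and limiting steps above can be sketched as a pipeline over a signed percentage change. This is a minimal illustration, not the actual SquashLevels code: the function names, the 0.5 damping factor and the treatment of percentages as simple multiplicative factors are all assumptions made for the example.

```python
def capped(pct, up_limit=10.0, down_limit=5.0):
    """Clamp a signed percentage change to the per-match limits
    (by default +10% up, -5% down, as for a recently active player)."""
    return min(pct, up_limit) if pct >= 0 else max(pct, -down_limit)

def adjust(start_level, signed_pct, damping=0.5, event_weight=0.75,
           up_limit=10.0, down_limit=5.0):
    """Apply match damping, event weighting and per-match limits to a
    raw confidence-weighted % change, then return (new_level, applied_pct).
    The damping factor of 0.5 is an assumption for illustration."""
    pct = signed_pct * damping            # standard match damping
    pct = pct * event_weight              # match/event weighting (75% here)
    pct = capped(pct, up_limit, down_limit)
    return start_level * (1 + pct / 100), pct

# With the figures from this match: Sam's raw +47% is damped, weighted
# and finally capped at the +10% limit; Stephen's -19% is capped at -5%.
sam_level, sam_pct = adjust(402, +47.0)
steve_level, steve_pct = adjust(923, -19.0)
```

With these assumed factors, the intermediate values land close to (though not exactly on) the +23%/+17% and -10.7%/-7.8% figures quoted above; the real system evidently uses more refined damping than a flat 0.5.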
Increase level confidence due to one more match played: Sam Hume 54%, Stephen Green 77%. Reduce level confidence based on how unexpected the result was: Sam Hume 30%, Stephen Green 43%. A final adjustment of +0.6% has been made to both players as part of the automatic calibration that is performed after each match; all players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for Sam Hume: 451, level confidence: 30%. Final level for Stephen Green: 871, level confidence: 43%.
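The confidence bookkeeping and pool calibration described above can be sketched as follows. The formulas are assumptions for illustration only (the published figures imply a per-match confidence gain and a surprise penalty of roughly 0.56 for this match, but the exact functions are not given):

```python
def update_confidence(conf, surprise, gain=0.5):
    """Raise confidence (0..1) for one more match played, then reduce it
    according to how unexpected the result was (surprise in 0..1).
    Both the gain formula and the penalty are assumed for this sketch."""
    conf = conf + (1 - conf) * gain       # one more match played
    conf = conf * (1 - surprise)          # unexpected result reduces it
    return conf

def calibrate(levels, factor=1.006):
    """Apply the same calibration factor (+0.6% here) to every player in
    the pool, so the pool stays equivalent to other player pools."""
    return [lvl * factor for lvl in levels]

# Every level in the pool moves by the same +0.6%, so relative
# differences within the pool are unchanged.
pool = calibrate([402.0, 923.0])
```

Because calibration multiplies every level in a pool by the same factor, it changes no one's standing within the pool; it only keeps the pool's overall scale aligned with other pools.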