Explaining level changes

Division 1: Gordano - Gordano v Thornbury A (Wed 21 Feb 2024)

Match played between Steve Gale (home) and Rob Livingstone (away). Match won by Steve Gale. Result: 8-15, 15-13, 15-10, 15-10.

Starting level for Steve Gale: 1,605, level confidence: 89% (set manually). Starting level for Rob Livingstone: 2,091, level confidence: 63%. Rob Livingstone was expected to win as he is currently playing 30% better than Steve Gale.

Steve Gale won 75% of the games and 52% of the points. The games result would be expected if he were around 25% better; the points result would be expected if he were around 10% better (PAR scoring). These are weighted and combined to calculate that Steve Gale played 20% better than Rob Livingstone in this match. An upset!

Assuming that any level change is shared between both players, for this result it looks like Steve Gale actually played at a level of 2,008 and Rob Livingstone at a level of 1,671. Without any damping, both players would need to be adjusted by 25% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 22% and 22% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Steve Gale changes to +15% and for Rob Livingstone to -22%.

After applying standard match damping, the adjustment for Steve Gale becomes +7.4% and for Rob Livingstone -9%. Applying the match/event weighting of 75% for 'Mixed Spring 2023/2024', the adjustment for Steve Gale becomes +5.6% and for Rob Livingstone -6.6%.

Limits on the amount of change for a single match, based on player level, level confidence and time since last match, restrict Rob Livingstone to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased for one more match played (Steve Gale: 94%, Rob Livingstone: 79%) and then reduced based on how unexpected the result is (Steve Gale: 75%, Rob Livingstone: 63%).

A final adjustment of -0.1% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Steve Gale: 1,692, level confidence: 75%. Final level for Rob Livingstone: 1,988, level confidence: 63%.
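How the games and points results are combined into a single "played 20% better" figure is not spelled out above. The sketch below shows one plausible reading: a weighted geometric mean of the two implied ratios, with more weight on games than points. The 0.7/0.3 weights and the function name are assumptions for illustration, not the site's actual formula.

```python
import math

def combined_performance(games_ratio, points_ratio, games_weight=0.7):
    """Combine the 'better by' ratios implied by games and by points.

    Assumption: a weighted geometric mean, with more weight on the games
    result than on the points result. The weights are illustrative only.
    """
    return math.exp(games_weight * math.log(games_ratio)
                    + (1 - games_weight) * math.log(points_ratio))

# Games suggest ~25% better, points (PAR) suggest ~10% better;
# combined this comes out at roughly 20% better.
print(combined_performance(1.25, 1.10))  # ~1.20
```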
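The "shared between both players" step can be pictured as splitting the gap between the expected and observed performance ratios equally, in ratio terms, between the two levels. This is a minimal sketch under that assumption; it reproduces the 2,008 / 1,671 "played at" levels and the undamped 25% adjustment quoted above, but it is not the site's published code.

```python
def shared_played_levels(level_a, level_b, played_ratio):
    """Estimate the levels both players 'actually played at'.

    played_ratio is how much better A performed than B in this match
    (1.20 for '20% better'). The discrepancy against the ratio expected
    from the starting levels is split equally, so A moves up and B moves
    down by the same factor (the undamped 25% adjustment in the text).
    """
    expected_ratio = level_a / level_b            # 1605 / 2091 ~= 0.77
    discrepancy = played_ratio / expected_ratio   # 1.20 / 0.77 ~= 1.56
    shared_factor = discrepancy ** 0.5            # ~1.25, i.e. 25% each way
    return level_a * shared_factor, level_b / shared_factor

print(shared_played_levels(1605, 2091, 1.20))  # ~(2007, 1672)
```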
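The later steps scale the raw adjustment down and then clamp it. The sketch below treats match damping and event weighting as simple multiplicative factors and the per-match limits as a clamp; the 0.41 damping factor is back-derived from the -22% to -9% step above and, like the function itself, is an assumption rather than the real implementation (the quoted -6.6% suggests the actual weighting is not purely multiplicative).

```python
def damped_limited_adjustment(raw_adj, damping, event_weight,
                              max_up=0.10, max_down=0.05):
    """Scale a signed fractional adjustment and clamp it to per-match limits.

    Assumption: damping and event weighting act as multiplicative factors.
    max_up/max_down stand in for the limits derived from player level,
    confidence and days since the last match (+10% / -5% defaults above).
    """
    adj = raw_adj * damping * event_weight
    return max(-max_down, min(max_up, adj))

# Rob Livingstone's side of the example: -22% raw, ~0.41 match damping,
# 75% event weighting (~ -6.8%), then clamped to the -5% per-match limit.
print(damped_limited_adjustment(-0.22, 0.41, 0.75))  # -0.05
```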