Explaining level changes

Steve Smith v Rob Wilde (Mon 29 Jan 2024)

Match won by Steve Smith. Result: 15-12, 15-10, 15-8.

Starting level for Steve Smith: 1,936 (level confidence: 73%; set manually). Starting level for Rob Wilde: 852 (level confidence: 53%). Steve Smith was expected to win as he is currently playing 127% better than Rob Wilde.

Steve Smith won all of the games and 60% of the points. A games result like this would be expected if he were better by around 55% or more; a points result like this would be expected if he were better by around 50% (PAR scoring). These estimates are weighted and combined to calculate that Steve Smith played 52% better than Rob Wilde in this match.

Because of the difference between the players' levels, an allowance of up to 16% is made for the likelihood that Steve Smith was taking it easy. This gives him an allowed level range for this match of 1,257 to 1,936 within which his level is unaffected. Steve Smith played at level 1,451 and remained within his allowed range, so his level will not be adjusted.

On the assumption that Steve Smith would normally have been playing at level 1,629 (based on typical behaviour), Rob Wilde played better than expected and therefore gains a pre-damping level increase of 12%. Allowing for the difference in level between the players, including some additional protection for the better player, the adjustments are reduced to 0% and 8% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Steve Smith remains 0% and for Rob Wilde +8%.

After applying standard match damping, the adjustment for Steve Smith remains 0% and for Rob Wilde becomes +5.5%. Applying the 50% match/event weighting for 'Westbury David Lloyd Boxes', the adjustment for Steve Smith is 0% and for Rob Wilde +2.7%.

Level confidence is increased because one more match has been played: Steve Smith 85%, Rob Wilde 73%.
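The calculation above combines two independent estimates of how much better the winner played (one from games won, one from points won) into a single figure, then checks the winner's in-match level against his allowed range. A minimal sketch of that logic, assuming a simple 50/50 weighted average of the two estimates (the actual weighting used by the rating system is not published) and using the numbers from this match:

```python
def combined_performance(games_ratio: float, points_ratio: float,
                         games_weight: float = 0.5) -> float:
    """Combine the games-based and points-based estimates of how much
    better the winner played. The 50/50 weighting is an assumption
    for illustration, not the system's published formula."""
    return games_weight * games_ratio + (1 - games_weight) * points_ratio

def within_allowed_range(played_level: float, low: float, high: float) -> bool:
    """A winner whose in-match level stays inside the allowed range
    (the 'taking it easy' allowance) keeps his level unchanged."""
    return low <= played_level <= high

# Games suggest ~55% better (ratio 1.55); points suggest ~50% (ratio 1.50).
perf = combined_performance(1.55, 1.50)   # ~1.52, i.e. played ~52% better

# Steve's allowed range for this match was 1,257-1,936; he played at 1,451.
steve_adjusted = not within_allowed_range(1451, 1257, 1936)   # False
```

With the assumed 50/50 weighting the combined estimate lands at about 52%, matching the report; any other weighting would simply shift it between the 50% and 55% bounds.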
Level confidence is then reduced according to how unexpected the result was: Steve Smith 70%, Rob Wilde 59%.

A final adjustment of -0.1% is applied to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Steve Smith: 1,933 (level confidence: 70%). Final level for Rob Wilde: 874 (level confidence: 59%).
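Rob Wilde's chain of adjustments (+8% after protection, +5.5% after damping, +2.7% after the 50% event weighting, then the -0.1% calibration) is consistent with each stage raising the adjustment ratio to a fractional exponent, e.g. 1.08^0.5 ≈ 1.027 for the 50% weighting. That exponent model, and the ~0.7 damping exponent below, are inferences from the figures shown here, not a published specification. A sketch:

```python
def damp(adjustment: float, factor: float) -> float:
    """Scale a percentage adjustment by raising its ratio (1 + adj)
    to `factor`. Exponent model inferred from the report's figures;
    the system's actual damping formula is not published."""
    return (1 + adjustment) ** factor - 1

raw = 0.08                                  # post-protection gain: +8%
after_damping = damp(raw, 0.7)              # assumed ~0.7 exponent -> ~+5.5%
after_weighting = damp(after_damping, 0.5)  # 50% event weighting   -> ~+2.7%

# Apply to Rob Wilde's starting level, then the -0.1% pool calibration.
final_level = round(852 * (1 + after_weighting) * (1 - 0.001))  # ~874
```

Under these assumptions the chain reproduces the report's numbers: +5.5%, +2.7%, and a final level of 874 from Rob Wilde's starting 852.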