Explaining level changes

BaD Premier League: Workout Harbourside 1 v Redland B (Tue 05 Dec 2023)

Match played between Andy Lowe (home) and Cameron Haddow (away). Match won by Cameron Haddow. Result: 3-11, 11-7, 8-11, 10-12.

Starting level for Andy Lowe: 9,487, level confidence: 50%. Starting level for Cameron Haddow: 6,849, level confidence: 71%. Set manually. Prediction: Andy Lowe to win, as he was playing 39% better than Cameron Haddow going into the match.

Cameron Haddow won 75% of the games and 56% of the points. The games result would be expected if he were better by around 25%; the points result would be expected if he were better by around 28% (PAR scoring). These are weighted and combined to calculate that Cameron Haddow played 26% better than Andy Lowe in this match. An upset!

Assuming that any level change is shared between both players, this result suggests that Cameron Haddow actually played at a level of 9,049 and Andy Lowe at a level of 7,180. Without any damping, both players would need to be adjusted by 32% to match this result. Allowing for the difference in level between the players, the adjustments have been reduced to 27% and 27% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Cameron Haddow becomes +19% and for Andy Lowe -27%. After applying standard match damping, the adjustment for Cameron Haddow becomes +5.6% and for Andy Lowe -7%.

Given Cameron Haddow's level and the type of match played, an additional damping of 7.3% has been applied to his level change. Given Andy Lowe's level and the type of match played, an additional damping of 16% has been applied to his level change. Looks like he wasn't taking the match too seriously...

Applying the match/event weighting of 75% for 'Mixed Spring 2023/2024', the adjustment for Cameron Haddow is +3.9% and for Andy Lowe is -4.4%.

Level confidence is increased for one more match played (Cameron Haddow: 84%, Andy Lowe: 70%), then reduced based on how unexpected the result is (Cameron Haddow: 64%, Andy Lowe: 53%).

A final adjustment of -0.2% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for Andy Lowe: 9,071, level confidence: 53%. Final level for Cameron Haddow: 7,108, level confidence: 64%.
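The shared-level step above can be reproduced from the stated figures. The sketch below is a minimal illustration, assuming the 26% performance ratio is split symmetrically (multiplicatively) between the two players; the function and variable names are hypothetical and are not the site's actual code.

```python
import math

def shared_performance_levels(level_winner, level_loser, performance_ratio):
    """Split a performance ratio symmetrically between two players.

    Assumes the winner's performed level is raised and the loser's lowered
    by the same multiplicative factor f, chosen so that the ratio of the
    performed levels equals the observed performance ratio:
        (level_winner * f) / (level_loser / f) = performance_ratio
    """
    f = math.sqrt(performance_ratio * level_loser / level_winner)
    return level_winner * f, level_loser / f, f

# Figures from the report: Cameron Haddow started at 6,849, Andy Lowe at 9,487,
# and Cameron was judged to have played 26% better in this match.
cam_performed, andy_performed, factor = shared_performance_levels(6849, 9487, 1.26)
print(round(cam_performed), round(andy_performed), f"{factor - 1:.0%}")
# -> roughly 9049, 7180 and a 32% undamped adjustment, matching the
#    report's 9,049 / 7,180 / 32% after rounding
```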
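The later damping and weighting steps are multiplicative, so the final adjustments can be checked from the intermediate figures. The sketch below only covers the steps whose inputs are stated above (the per-player additional damping and the event weighting); the earlier level-difference, confidence, and standard match damping steps use formulas the report does not disclose, so their outputs are taken as given. Names are illustrative.

```python
def apply_damping(adjustment_pct, extra_damping, event_weighting):
    """Apply the per-player additional damping, then the match/event weighting,
    to an adjustment expressed in percent (positive = level increase)."""
    return adjustment_pct * (1 - extra_damping) * event_weighting

# After standard match damping the report gives +5.6% for Cameron and -7% for Andy.
cam = apply_damping(+5.6, extra_damping=0.073, event_weighting=0.75)
andy = apply_damping(-7.0, extra_damping=0.16, event_weighting=0.75)
print(f"Cameron: {cam:+.1f}%  Andy: {andy:+.1f}%")
# -> Cameron: +3.9%  Andy: -4.4%, as reported
```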