Explaining level changes

Division 1: BAWA / Kingswood v Bradley Stoke Squash Club A (Wed 20 Oct 2021)

Match played between Simon Strange (home) and Ian Doherty (away). Match won by Ian Doherty. Result: 10-15, 4-15, 10-15.

Starting level for Simon Strange: 1,651, level confidence: 46%. Starting level for Ian Doherty: 2,140, level confidence: 63%. Ian Doherty to win as he is currently playing 30% better than Simon Strange.

Ian Doherty won all of the games and 65% of the points. This games result would be expected if he were better by around 55% or more; this points result would be expected if he were better by around 88% (PAR scoring). These are weighted and combined to calculate that Ian Doherty played 88% better than Simon Strange in this match.

Assuming that any level changes are shared between both players, for this result it looks like Ian Doherty actually played at a level of 2,574 and Simon Strange at a level of 1,373. Without any damping, both players would need to be adjusted by 20% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 18% each. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Ian Doherty changes to +13% and for Simon Strange to -18%. After applying standard match damping, the adjustment for Ian Doherty becomes +5.9% and for Simon Strange -8.2%. Applying the match/event weighting of 75% for 'Mixed Autumn 2021/2022', the adjustment for Ian Doherty becomes +4.4% and for Simon Strange -6%. Finally, limits on the amount of change for a single match, based on player level, level confidence and time since last match, cap Simon Strange at a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.
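The "shared" step and the later multipliers above are straightforward arithmetic. The sketch below reproduces the headline numbers; the function name and the equal-split formulation are assumptions for illustration, not SquashLevels' published code:

```python
def shared_level_split(level_a, level_b, observed_ratio):
    """Split an observed performance ratio equally between two players.

    Assuming any level change is shared, each player's level moves by
    the same factor so that the new ratio matches the observed one.
    """
    expected_ratio = level_a / level_b                 # 2140/1651 ~ 1.30, i.e. 30% better
    factor = (observed_ratio / expected_ratio) ** 0.5  # equal share of the surplus
    return level_a * factor, level_b / factor

ian, simon = 2140, 1651
actual_ian, actual_simon = shared_level_split(ian, simon, 1.88)
# ~2577 and ~1371; the page shows 2,574 and 1,373 because its 88% figure is rounded
print(round(actual_ian), round(actual_simon))

# The later steps are simple multipliers: event weighting, then the per-match cap.
drop = 0.082 * 0.75          # damped -8.2% weighted at 75% -> 6.15%, shown as -6%
drop = min(drop, 0.05)       # single-match drop limit of 5% applies to Simon Strange
print(round(simon * (1 - drop)))  # 1568, before the final calibration shift
```

Each player moves by the same factor (here about 1.20, the 20% the page mentions), which is why the implied "actual" levels sit symmetrically either side of the starting ones.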
Increase level confidence due to one more match played: Ian Doherty 79%, Simon Strange 68%. Reduce level confidence based on how unexpected the result is: Ian Doherty 66%, Simon Strange 57%.

A final adjustment of -0.1% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool have been adjusted equally in order to remain equivalent to other player pools.

Final level for Simon Strange: 1,570, level confidence: 57%. Final level for Ian Doherty: 2,231, level confidence: 66%.
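The capped per-match adjustment and the pool-wide calibration shift compose multiplicatively. A minimal sketch, assuming a -0.1% calibration as described above (the function name is mine, not any published API):

```python
def apply_calibration(level, adjustment, calibration=-0.001):
    """Apply a player's per-match adjustment, then the pool-wide
    calibration shift that is applied equally to everyone in the pool."""
    return level * (1 + adjustment) * (1 + calibration)

print(round(apply_calibration(2140, 0.044)))   # ~2232 vs the published 2,231
print(round(apply_calibration(1651, -0.05)))   # ~1567 vs the published 1,570
# the small gaps come from rounding in the page's intermediate percentages
```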