Explaining level changes

Division 4: Lansdown Summer C v Avon West / Centre For Sport - K1 (Wed 31 Jul 2019)

Match played between Adam Conisbee (home) and Mike Secker (away). Match won by Adam Conisbee. Result: 15-8, 15-7, 15-4.

Starting level for Adam Conisbee: 2,128, level confidence: 83%. Set manually.
Starting level for Mike Secker: 1,691, level confidence: 88%. Set manually.

Adam Conisbee was expected to win as he is currently playing 26% better than Mike Secker.

Adam Conisbee won all of the games and 70% of the points. This games result would be expected if he were better by around 55% or more. This points result would be expected if he were better by around 137% (PAR scoring). These are weighted and combined to calculate that Adam Conisbee played 137% better than Mike Secker in this match.

Assuming that any level change is shared between both players, for this result it looks like Adam Conisbee actually played at a level of 2,919 and Mike Secker at a level of 1,233. Without any damping, both players would need to be adjusted by 37% to match this result.

Allowing for the difference in level between the players, the adjustments have been reduced to 33% and 33% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Adam Conisbee changes to +33% and for Mike Secker to -30%.

After applying standard match damping, the adjustment for Adam Conisbee becomes +12.4% and for Mike Secker becomes -13%. Applying the match/event weighting of 65% for 'Mixed Summer 2019', the adjustment for Adam Conisbee is +8.1% and for Mike Secker is -7.9%.

Limits are then applied to the amount of change for a single match, based on player level, level confidence and time since the last match, so Mike Secker is limited to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played: Adam Conisbee 91%, Mike Secker 94%. It is then reduced based on how unexpected the result is: Adam Conisbee 66%, Mike Secker 68%.

A final adjustment of -0.1% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for Adam Conisbee: 2,297, level confidence: 66%.
Final level for Mike Secker: 1,608, level confidence: 68%.
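The "26% better" and "137% better" figures above can be reproduced from the numbers quoted in the report. A minimal sketch, assuming the expected gap is simply the ratio of the starting levels and that the points-based gap under PAR scoring is the ratio of points won (both are assumptions; the site's exact formulas are not published here):

```python
# Hypothetical reconstruction of the headline percentages from the figures
# quoted above. The formulas are assumptions, not the published algorithm.

start_a, start_m = 2128, 1691          # starting levels (Adam, Mike)
games = [(15, 8), (15, 7), (15, 4)]    # PAR scoring, Adam's score first

# Expected gap: ratio of starting levels.
expected_ratio = start_a / start_m
print(f"expected: Adam {expected_ratio - 1:.0%} better")        # ~26%

# Observed gap from points: assumed to be the ratio of points won.
pts_a = sum(a for a, m in games)       # 45
pts_m = sum(m for a, m in games)       # 19
print(f"points share: {pts_a / (pts_a + pts_m):.0%}")           # ~70%
observed_ratio = pts_a / pts_m
print(f"points result: Adam {observed_ratio - 1:.0%} better")   # ~137%
```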
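The implied "played at" levels (2,919 and 1,233) and the raw 37% adjustment are consistent with sharing the level change equally in ratio terms: both players are scaled by the same factor, up for the winner and down for the loser, until the ratio of adjusted levels matches the observed performance gap. A sketch under that assumption:

```python
import math

start_a, start_m = 2128, 1691
expected_ratio = start_a / start_m     # ~1.26, Adam expected ~26% better
observed_ratio = 45 / 19               # ~2.37, Adam played ~137% better

# Shared adjustment: find factor f so that (start_a * f) / (start_m / f)
# equals the observed ratio.
f = math.sqrt(observed_ratio / expected_ratio)
print(f"raw adjustment: {f - 1:.0%}")            # ~37% each way
print(f"Adam 'played at' {start_a * f:,.0f}")    # ~2,919
print(f"Mike 'played at' {start_m / f:,.0f}")    # ~1,233
```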
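The remaining steps scale that raw adjustment down in stages (level-difference damping, confidence weighting, standard match damping, event weighting, per-match limits, pool calibration). Using only the end-of-chain percentages quoted above, a sketch of how the final levels come out; the quoted percentages are rounded, so the reproduced figures differ from the published final levels by a few points:

```python
start_a, start_m = 2128, 1691

# End-of-chain adjustments quoted above (rounded): +8.1% for Adam;
# Mike's event-weighted -7.9% is cut to -5% by the per-match limit;
# then a -0.1% pool calibration is applied to both players.
adj_a = +0.081
adj_m = -0.050
calibration = -0.001

final_a = start_a * (1 + adj_a) * (1 + calibration)
final_m = start_m * (1 + adj_m) * (1 + calibration)
print(f"Adam: {final_a:,.0f}")   # ~2,298 (published: 2,297)
print(f"Mike: {final_m:,.0f}")   # ~1,605 (published: 1,608)
```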