Explaining level changes

Division 4: Redland Fireflies v Kingsdown Stallions (Thu 07 Nov 2019)
Match played between David Langbridge (home) and George Minchin (away).
Match won by George Minchin. Result: 8-15, 15-17, 13-15.

Starting level for David Langbridge: 1,335, level confidence: 82%. Set manually.
Starting level for George Minchin: 856, level confidence: 75%. Set manually.

David Langbridge was expected to win, as he was currently playing 56% better than George Minchin.

George Minchin won all of the games and 57% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 31% (PAR scoring). These are weighted and combined to calculate that George Minchin played 39% better than David Langbridge in this match. An upset!

Assuming that any level change is shared between both players, this result implies that George Minchin actually played at a level of 1,259 and David Langbridge at a level of 908. Without any damping, both players' levels would need to be adjusted by 47% to match this result.

Allowing for the difference in level between the players, the adjustments are reduced to 36% and 36% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for George Minchin becomes +36% and for David Langbridge -32%. After applying standard match damping, the adjustment for George Minchin becomes +18.4% and for David Langbridge -15%. Applying the match/event weighting of 75% for 'Mixed Autumn 2019/2020', the adjustment for George Minchin is +14% and for David Langbridge -10.5%. Finally, limits on the amount of change allowed for a single match, based on player level, level confidence and time since last match, cap George Minchin at a +10% and David Langbridge at a -5% level change.
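The combining and "shared level change" steps above can be sketched as follows. The site does not publish its formulas, so two things here are inferred from the quoted numbers and should be read as assumptions: a 1:2 games:points weighting (which happens to reproduce the quoted 39% from the 55% and 31% figures), and the choice of implied levels so that their ratio matches the observed performance while their product, the pair's combined level, is unchanged.

```python
import math

# ASSUMPTION: a 1:2 games:points weighting reproduces the quoted 39%
# combined figure from the 55% (games) and 31% (points) estimates.
combined_pct = (1 * 55 + 2 * 31) / 3  # -> 39.0

def implied_levels(winner_level, loser_level, perf_ratio):
    """Split a shared level change between two players.

    ASSUMPTION: the implied (winner, loser) levels are chosen so that
    their ratio equals perf_ratio while their product equals the product
    of the starting levels, i.e. the pair's combined level is preserved.
    """
    product = winner_level * loser_level
    winner = math.sqrt(product * perf_ratio)
    return winner, winner / perf_ratio

george, david = implied_levels(856, 1335, 1 + combined_pct / 100)
# george and david come out close to the quoted 1,259 and 908; the small
# differences are rounding in the quoted 39% performance figure.
```

Under this reconstruction the quoted implied levels of 1,259 and 908 are recovered to within a couple of points, which is consistent with the 39% figure itself being rounded.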
In general a player's level won't go up by more than 10% or drop by more than 5% if they have played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played: George Minchin 86%, David Langbridge 91%. It is then reduced based on how unexpected the result was: George Minchin 59%, David Langbridge 62%.

A final adjustment of -0.1% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for David Langbridge: 1,270, level confidence: 62%.
Final level for George Minchin: 940, level confidence: 59%.
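The per-match limit described above can be sketched as a simple clamp. The real limits also depend on player level and level confidence, and the relaxation schedule for longer gaps between matches is not specified, so the linear scaling below is purely illustrative:

```python
def clamp_adjustment(pct_change, days_since_last_match=7):
    """Clamp a proposed percentage level change to per-match limits.

    Within 7 days of the previous match the limits are +10% / -5%.
    ASSUMPTION: the text says the limits relax for longer gaps but gives
    no schedule, so the linear scaling here is illustrative only; the
    real limits also factor in player level and level confidence.
    """
    relax = max(1.0, days_since_last_match / 7)
    upper, lower = 10.0 * relax, -5.0 * relax
    return max(lower, min(upper, pct_change))

# Reproduces the capped adjustments quoted above:
clamp_adjustment(14.0)    # George Minchin: +14% capped to +10%
clamp_adjustment(-10.5)   # David Langbridge: -10.5% capped to -5%
```

With the default 7-day gap this reproduces the caps applied in this match: George Minchin's +14% is held to +10% and David Langbridge's -10.5% to -5%.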