Explaining level changes

David Albrow v David Williams (Tue 09 Jan 2018)

Match won by David Albrow. Result: 9-2, 9-4, 9-5.

Starting level for David Albrow: 232 (level confidence: 53%). Starting level for David Williams: 320 (level confidence: 59%). David Williams was expected to win, as he was playing 38% better than David Albrow going into the match.

David Albrow won all of the games and 71% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 58% (English scoring). These two figures are weighted and combined to calculate that David Albrow played 57% better than David Williams in this match. An upset!

Assuming that any level change is shared between both players, this result suggests that David Albrow actually played at a level of 341 and David Williams at a level of 217. Without any damping, both players would need to be adjusted by 47% to match this result. Allowing for the difference in level between the players, the adjustments have been reduced to 39% and 39% respectively. Factoring in the relative levels of confidence, which lets players with low confidence in their levels change more quickly, the adjustment for David Albrow stays at +39% and the adjustment for David Williams changes to -33%.

After applying standard match damping, the adjustment for David Albrow becomes +19.6% and for David Williams -17%. Applying the match/event weighting of 50% for 'Workout Harbourside Boxes' brings the adjustment for David Albrow to +9.8% and for David Williams to -7.9%. Limits on the amount of change allowed for a single match, based on player level, level confidence and time since last match, then cap David Williams at a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played:
David Albrow: 73%, David Williams: 77%. Level confidence is then reduced according to how unexpected the result was: David Albrow: 49%, David Williams: 52%.

A final adjustment of -0.4% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for David Albrow: 254 (level confidence: 49%). Final level for David Williams: 304 (level confidence: 52%).
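The "shared level change" step above can be sketched in Python. The exact formula is not published, so the function below is a guess under the assumption that the implied change is split equally between the two players in multiplicative terms; the function name is illustrative. It does reproduce the worked numbers: starting levels 232 and 320 with a 57% performance gap give played levels of roughly 341 and 217.

```python
import math

def shared_played_levels(level_a, level_b, performance_ratio):
    """Split the implied level change equally (multiplicatively) between
    both players, so that adjusted_a / adjusted_b == performance_ratio.
    Assumed reconstruction, not the site's published formula."""
    factor = math.sqrt(performance_ratio * level_b / level_a)
    return level_a * factor, level_b / factor

played_a, played_b = shared_played_levels(232, 320, 1.57)
# played_a ≈ 341, played_b ≈ 217, matching the worked example above;
# each player moves by the same factor (~1.47, i.e. the 47% mentioned).
```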
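The per-match limit step can be sketched the same way. The +10%/-5% caps for players active in the last 7 days are stated above, but the relaxation rule for longer gaps is not, so the linear scaling below is purely illustrative and the function name is my own.

```python
def clamp_adjustment(pct_change, days_since_last_match,
                     up_limit=10.0, down_limit=5.0):
    """Cap a single-match level change (in percent). The relaxation for
    gaps longer than 7 days is an assumed linear rule, for illustration."""
    if days_since_last_match > 7:
        scale = days_since_last_match / 7
        up_limit *= scale
        down_limit *= scale
    return max(-down_limit, min(up_limit, pct_change))

clamp_adjustment(-7.9, 3)  # → -5.0, as in David Williams's case
clamp_adjustment(9.8, 3)   # → 9.8, David Albrow is within the +10% cap
```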
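Finally, a sketch of how the match adjustment and the -0.4% pool calibration might combine into the final levels. Applying the single-match cap after the calibration is an assumption, but it is the ordering that reproduces both published final levels (254 and 304); the function name is illustrative.

```python
def final_level(start, match_pct, calibration_pct, up_limit=10.0, down_limit=5.0):
    """Apply the match adjustment and pool calibration multiplicatively,
    then cap the overall single-match change. The ordering (cap applied
    last) is an assumption that matches the worked example."""
    raw = start * (1 + match_pct / 100) * (1 + calibration_pct / 100)
    low = start * (1 - down_limit / 100)
    high = start * (1 + up_limit / 100)
    return round(min(high, max(low, raw)))

final_level(232, 9.8, -0.4)   # → 254 for David Albrow
final_level(320, -7.9, -0.4)  # → 304 for David Williams (hits the -5% floor)
```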