Explaining level changes

Division 3: Redland Capybaras v Workout Harbourside (Wed 13 Jan 2016)
Match played between Adam Chambers (home) and Clive Stiff (away).
Match won by Adam Chambers. Result: 9-3, 9-3, 9-4.

Starting level for Adam Chambers: 1,240 (level confidence: 77%, set manually).
Starting level for Clive Stiff: 1,847 (level confidence: 39%).

Clive Stiff was expected to win, as he is currently playing 49% better than Adam Chambers.

Adam Chambers won all of the games and 73% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 67% (English scoring). These are weighted and combined to calculate that Adam Chambers played 67% better than Clive Stiff in this match. An upset!

Assuming that any level change is shared between both players, this result implies that Adam Chambers actually played at a level of 1,956 and Clive Stiff at a level of 1,171. Without any damping, both players would need to be adjusted by 58% to match this result (a sketch of this calculation appears at the end of this explanation). Allowing for the difference in level between the players, the adjustments are reduced to 45% and 45% respectively.

Factoring in the relative levels of confidence, which lets players with low confidence in their levels change more quickly, the adjustment for Adam Chambers becomes +23% and for Clive Stiff -45%.

After applying standard match damping, the adjustment for Adam Chambers becomes +11.5% and for Clive Stiff -17%.

Applying the match/event weighting of 75% for 'Mixed Spring 2015/2016', the adjustment for Adam Chambers becomes +8.6% and for Clive Stiff -12%.

Limits on the amount of change from a single match, based on player level, level confidence and time since last match, cap Clive Stiff at a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they have played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played: Adam Chambers 88%, Clive Stiff 63%. It is then reduced according to how unexpected the result was: Adam Chambers 56%, Clive Stiff 40%.

A final adjustment of -0.2% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools (the second sketch below shows how the quoted percentages compound).

Final level for Adam Chambers: 1,345 (level confidence: 56%).
Final level for Clive Stiff: 1,756 (level confidence: 40%).
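The shared-adjustment step can be reproduced from the figures quoted above. Here is a minimal Python sketch, assuming "shared" means multiplying the winner's level and dividing the loser's level by the same factor; the function name and derivation are assumptions chosen to match the quoted numbers, not the site's published code.

```python
import math

def shared_adjustment(winner_level, loser_level, performance_ratio):
    # Choose a single factor k so that, after multiplying the winner's level
    # by k and dividing the loser's level by k, the two levels sit in the
    # observed performance ratio. (Assumed derivation, picked because it
    # reproduces the numbers quoted in the explanation.)
    k = math.sqrt(performance_ratio * loser_level / winner_level)
    return winner_level * k, loser_level / k, k

adam, clive, k = shared_adjustment(1240, 1847, 1.67)
print(round(adam), round(clive), f"{(k - 1) * 100:.0f}%")
# 1956 1171 58%  -- the implied levels and undamped adjustment quoted above
```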
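The remaining stages are quoted as percentage adjustments that compound on the starting levels. The sketch below only illustrates how those quoted percentages combine; the helper is hypothetical, and the small differences from the reported final levels of 1,345 and 1,756 presumably come from rounding in the quoted percentages.

```python
def compound(start_level, pct_changes):
    # Apply a sequence of percentage adjustments multiplicatively.
    level = start_level
    for pct in pct_changes:
        level *= 1 + pct / 100
    return level

# Adam Chambers: +8.6% after event weighting, then the -0.2% pool calibration.
print(round(compound(1240, [8.6, -0.2])))   # ~1344 (reported as 1,345)

# Clive Stiff: capped at -5% by the single-match limit, then -0.2% calibration.
print(round(compound(1847, [-5.0, -0.2])))  # ~1751 (reported as 1,756)
```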