Explaining level changes

Division 8: Colston v University of Bristol Fresh (Wed 04 Oct 2017)

Match played between Dave Adlam (home) and Thomas Ma (away). Match won by Dave Adlam. Result: 9-2,9-0,9-0 : 9-5,3-9,9-3,9-4 : 9-3,6-9,9-0,9-5 : 9-6,9-6,9-1 : 9-1,9-1,4-9,9-5.

Starting level for Dave Adlam: 806, level confidence: 58%. Starting level for Thomas Ma: 419, level confidence: 16%. Dave Adlam was expected to win as he was playing 92% better than Thomas Ma going into the match.

Dave Adlam won all of the games and 93% of the points. This games result would be expected if he were better by around 55% or more. This points result would be expected if he were better by around 278% (English scoring). These are weighted and combined to calculate that Dave Adlam played 278% better than Thomas Ma in this match.

Assuming that any level changes are shared between both players, for this result it looks like Dave Adlam actually played at a level of 1,130 and Thomas Ma at a level of 299. Without any damping, both players would need to be adjusted by 40% to match this result. Allowing for the difference in level between the players, the adjustments have been reduced to 28% and 28% respectively.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Dave Adlam changes to +7.7% and for Thomas Ma to -28%. After applying standard match damping, the adjustment for Dave Adlam becomes +5.3% and for Thomas Ma becomes -15%. Applying the match/event weighting of 75% for 'Mixed Autumn 2017/2018', the adjustment for Dave Adlam becomes +4% and for Thomas Ma -10.6%.

Limits are then applied to the amount of change allowed for a single match, based on player level, level confidence and time since last match, so that Thomas Ma is limited to a -10% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played. Dave Adlam: 76%, Thomas Ma: 40%. It is then reduced based on how unexpected the result is. Dave Adlam: 54%, Thomas Ma: 29%.

A final adjustment of -1.3% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for Dave Adlam: 839, level confidence: 54%. Final level for Thomas Ma: 371, level confidence: 29%.
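To make the shared-adjustment arithmetic above concrete, the sketch below reproduces the implied match levels (roughly 1,130 and 299) and the raw 40% adjustment from the starting levels and the measured 278% performance gap, and shows how a weighted, limited change could then be applied. This is a minimal illustration, not the site's actual implementation: the function names, the exponent form of the weighting, and the default 10%/5% limits are assumptions, and the confidence-weighting and standard match-damping steps are left out because their exact formulas aren't stated above.

```python
import math

def implied_match_levels(level_a, level_b, performance_ratio):
    """Split a measured performance ratio between the two players so that
    the geometric mean of their levels is unchanged ("level changes are
    shared between both players").

    performance_ratio is how many times better player A played than
    player B in this match (3.78 for "278% better")."""
    combined = math.sqrt(level_a * level_b)   # combined level stays the same
    shift = math.sqrt(performance_ratio)      # change shared equally by both
    return combined * shift, combined / shift

def apply_weighted_change(level, factor, weight=1.0, max_up=0.10, max_down=0.05):
    """Apply a proportional change `factor` to `level`, scaled by `weight`
    (e.g. 0.75 event weighting) and clamped to per-match limits.

    Scaling the change as an exponent and the 10%/5% defaults are
    assumptions; the real limits depend on player level, level confidence
    and time since the last match."""
    weighted = factor ** weight                            # weight the change, not the level
    clamped = min(max(weighted, 1.0 - max_down), 1.0 + max_up)
    return level * clamped

# Figures from the match above.
adlam, ma = 806, 419
performance = 1.0 + 278 / 100        # Adlam played 3.78x (278%) better

implied_adlam, implied_ma = implied_match_levels(adlam, ma, performance)
print(round(implied_adlam), round(implied_ma))              # ~1130 and ~299
print(f"raw adjustment: {implied_adlam / adlam - 1:+.0%}")  # ~+40% before any damping

# Thomas Ma's fully damped and weighted change of -10.6% is clamped to the
# -10% limit quoted above (wider than the general -5%, presumably because
# of his low confidence or the time since his previous match).
print(round(apply_weighted_change(ma, 1 - 0.106, max_down=0.10)))  # ~377, before the -1.3% calibration
```

Sharing the change via the geometric mean is what keeps the pair's combined level constant: 806/419 and 1,130/299 multiply to roughly the same total, and only the ratio between the two players moves.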