Explaining level changes

Tim Wollaston v Cameron Morrison (Wed 23 Nov 2022)
Match won by Tim Wollaston. Result: 15-4, 15-8, 15-11.

Starting level for Tim Wollaston: 1,049, level confidence: 80%. Set manually.
Starting level for Cameron Morrison: 1,314, level confidence: 75%. Set manually.

Cameron Morrison was expected to win as he is currently playing 25% better than Tim Wollaston.

Tim Wollaston won all of the games and 66% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 96% (PAR scoring). These are weighted and combined to calculate that Tim Wollaston played 96% better than Cameron Morrison in this match. An upset!

Assuming that any level changes are shared between both players, this result suggests that Tim Wollaston actually played at a level of 1,642 and Cameron Morrison at a level of 839. Without any damping, both players would need to be adjusted by 57% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 49% and 49% respectively.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Tim Wollaston changes to +46% and for Cameron Morrison to -49%. After applying standard match damping, the adjustment for Tim Wollaston becomes +22.1% and for Cameron Morrison becomes -21%. Applying the match/event weighting of 50% for 'Redland Green Boxes' brings the adjustment for Tim Wollaston to +11.1% and for Cameron Morrison to -9.6%.

Limits are then applied to the amount of change for a single match, based on player level, level confidence and time since last match, so that Tim Wollaston is limited to a +10% and Cameron Morrison to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased for one more match played (Tim Wollaston: 89%, Cameron Morrison: 86%), then reduced based on how unexpected the result is (Tim Wollaston: 57%, Cameron Morrison: 55%).

A final adjustment of -0.1% has been made to both players as part of the automatic calibration performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for Tim Wollaston: 1,153, level confidence: 57%.
Final level for Cameron Morrison: 1,250, level confidence: 55%.
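The "shared between both players" step can be reproduced from the quoted figures. The sketch below (Python, for illustration only; the rating system's own code and exact formulas are not published in this explanation) finds the single multiplicative factor that scales the winner up and the loser down by the same proportion until their implied levels match the observed performance ratio of 1.96:

    import math

    def implied_played_levels(winner_level, loser_level, performance_ratio):
        """Split a surprising result evenly between the two players.

        performance_ratio is how much better the winner played than the
        loser in this match (1.96 for "96% better"). The factor k scales
        the winner up and the loser down by the same proportion so that
        (winner_level * k) / (loser_level / k) == performance_ratio.
        """
        k = math.sqrt(performance_ratio * loser_level / winner_level)
        return winner_level * k, loser_level / k, k

    tim_start, cam_start = 1049, 1314
    tim_played, cam_played, k = implied_played_levels(tim_start, cam_start, 1.96)
    print(round(tim_played), round(cam_played), f"{k - 1:+.0%}")
    # roughly 1642 and 839, with an undamped adjustment of about 57%;
    # small differences come from the rounding of the quoted 96%.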
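The later steps each shrink that raw adjustment in turn. The exact damping and confidence-weighting formulas are internal to the rating system and are not reproduced here; the sketch below only illustrates the shape of the chain, using placeholder constants alongside the 50% event weighting and the +10%/-5% per-match limits quoted above:

    def attenuate(raw_change, damping=0.5, event_weight=0.5,
                  limit_up=0.10, limit_down=0.05):
        """Illustrative attenuation of a raw per-match level change.

        raw_change is a signed fraction (+0.46 for +46%). In the real
        system the damping, confidence weighting and limits are derived
        from player level, level confidence and time since the last
        match; the constants here are placeholders.
        """
        change = raw_change * damping          # standard match damping
        change *= event_weight                 # e.g. 50% for 'Redland Green Boxes'
        return min(max(change, -limit_down), limit_up)  # per-match cap

    tim_change = attenuate(0.46)     # raw +46% ends up capped at +10%
    cam_change = attenuate(-0.49)    # raw -49% ends up capped at -5%
    print(f"{tim_change:+.1%}", f"{cam_change:+.1%}")

    # Applying the capped change and the -0.1% pool calibration to Tim
    # Wollaston's starting level of 1,049 gives roughly the 1,153 quoted above.
    print(round(1049 * (1 + tim_change) * (1 - 0.001)))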