Explaining level changes

James Murray v Henry Dunnill (Wed 24 May 2023)
Match won by Henry Dunnill. Result: 8-15, 10-15, 5-15.

Starting level for James Murray: 1,014, level confidence: 52%. Starting level for Henry Dunnill: 496, level confidence: 57%.

Prediction: James Murray to win, as he is currently playing 104% better than Henry Dunnill.

Henry Dunnill won all of the games and 66% of the points. This games result would be expected if he were better by around 55% or more. This points result would be expected if he were better by around 96% (PAR scoring). These are weighted and combined to calculate that Henry Dunnill played 96% better than James Murray in this match. An upset!

Assuming that any level changes are shared between both players, for this result it looks like Henry Dunnill actually played at a level of 992 and James Murray at a level of 507. Without any damping, both players would need to be adjusted by 100% to match this result.

Allowing for the difference in level between the players, including some additional protection for the better player, the adjustments have been reduced to 62% and 52% respectively.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Henry Dunnill changes to +57% and for James Murray to -52%.

After applying standard match damping, the adjustment for Henry Dunnill becomes +27% and for James Murray becomes -25%.

Applying the match/event weighting of 50% for 'Redland Green Boxes', the adjustment for Henry Dunnill is +13% and for James Murray is -11.1%.

Applying the limits on the amount of change allowed for a single match, which are based on player level, level confidence and time since last match, Henry Dunnill is limited to a +10% and James Murray to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played: Henry Dunnill 76%, James Murray 72%. It is then reduced based on how unexpected the result was: Henry Dunnill 38%, James Murray 36%.

A final adjustment of +9.5% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for James Murray: 964, level confidence: 36%. Final level for Henry Dunnill: 656, level confidence: 38%.
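As a rough illustration of the arithmetic above, here is a short Python sketch using this match's numbers. The helper names, the geometric split of the "shared" level change, and the way the later damping stages are handled are all assumptions made for illustration; the exact formulas behind each damping stage are not given here, so the post-damping figures reported above are used directly.

```python
import math


def points_ratio(games):
    """How much better the winner played, judged on points alone.

    `games` is a list of (loser_points, winner_points) tuples. Under PAR
    (point-a-rally) scoring every rally scores a point, so the raw points
    ratio is used directly: 45/23 here, i.e. roughly "96% better".
    """
    loser_points = sum(g[0] for g in games)
    winner_points = sum(g[1] for g in games)
    return winner_points / loser_points


def shared_played_levels(winner_level, loser_level, result_ratio):
    """Split the implied level change equally between both players.

    Finds the factor f such that (winner_level * f) / (loser_level / f)
    equals result_ratio -- the levels both players appear to have actually
    played at in this match, before any damping.
    """
    current_ratio = winner_level / loser_level
    f = math.sqrt(result_ratio / current_ratio)
    return winner_level * f, loser_level / f, f


def cap_single_match_change(factor, max_up=0.10, max_down=0.05):
    """Clamp one match's level change to the allowed band (+10% / -5%)."""
    return min(max(factor, 1.0 - max_down), 1.0 + max_up)


# --- This match -----------------------------------------------------------
games = [(8, 15), (10, 15), (5, 15)]   # (James's points, Henry's points)
james_level, henry_level = 1014, 496   # starting levels

result_ratio = points_ratio(games)     # ~1.96, i.e. Henry ~96% better
henry_played, james_played, raw_factor = shared_played_levels(
    henry_level, james_level, result_ratio)
print(f"Played-at levels: Henry ~{henry_played:.0f}, James ~{james_played:.0f}")
print(f"Undamped adjustment factor: x{raw_factor:.2f} "
      f"(both adjusted by ~{(raw_factor - 1) * 100:.0f}%)")

# The raw +100% change is then damped in stages (better-player protection,
# confidence weighting, standard match damping, 50% event weighting). The
# damping curves themselves are not spelled out above, so the post-damping
# adjustments reported there (+13% / -11.1%) are used directly and only the
# final per-match cap is applied.
henry_factor = cap_single_match_change(1.13)    # +13%, capped to +10%
james_factor = cap_single_match_change(0.889)   # -11.1%, capped to -5%
print(f"Capped adjustments: Henry x{henry_factor:.2f}, James x{james_factor:.2f}")
```

Run as-is, this prints played-at levels of roughly 992 and 507, an undamped factor of about x2.00, and capped adjustments of x1.10 and x0.95, matching the +10% and -5% limits described above.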
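The confidence update and the pool calibration can be sketched in the same spirit. This is illustration only: the size of the per-match confidence bump, the "unexpected result" reduction and the +9.5% calibration factor are taken as plain inputs (the bump and reduction below are back-solved from the figures reported above, not known formulas), and the pre-calibration levels in the last call are hypothetical placeholders because they are not reported above.

```python
def update_confidence(confidence, match_bump, surprise_reduction):
    """Raise confidence for one more match played, then reduce it according
    to how unexpected the result was. Both factors are treated as given."""
    raised = min(confidence + match_bump, 1.0)
    return raised * (1.0 - surprise_reduction)


def apply_pool_calibration(levels, factor):
    """Apply the post-match calibration equally to every player in the pool
    so the pool stays comparable with other player pools (+9.5% here)."""
    return {name: level * factor for name, level in levels.items()}


# Values reported above: Henry 57% -> 76% -> 38%, James 52% -> 72% -> 36%.
henry_conf = update_confidence(0.57, match_bump=0.19, surprise_reduction=0.50)
james_conf = update_confidence(0.52, match_bump=0.20, surprise_reduction=0.50)
print(f"Confidence: Henry {henry_conf:.0%}, James {james_conf:.0%}")

# Placeholder pre-calibration levels, purely to show the call shape.
print(apply_pool_calibration({"Henry Dunnill": 600, "James Murray": 880},
                             factor=1.095))
```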