Explaining level changes

Peter Hadwin v Steve Owens (Thu 17 Aug 2023)
Match won by Steve Owens. Result: 6-15, 10-15, 4-15.

Starting level for Peter Hadwin: 369, level confidence: 71%. Set manually.
Starting level for Steve Owens: 283, level confidence: 76%. Set manually.

Peter Hadwin was expected to win as he was playing 30% better than Steve Owens going into the match. Steve Owens won all of the games and 69% of the points. This games result would be expected if he were better by around 55% or more; this points result would be expected if he were better by around 125% (PAR scoring). These are weighted and combined to calculate that Steve Owens played 125% better than Peter Hadwin in this match. An upset!

Assuming that any level change is shared between both players, this result suggests that Steve Owens actually played at a level of 485 and Peter Hadwin at a level of 215. Without any damping, both players would need to be adjusted by 71% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 60% for each. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Steve Owens becomes +56% and for Peter Hadwin -60%. After standard match damping, the adjustment for Steve Owens becomes +26.9% and for Peter Hadwin -28%. Applying the match/event weighting of 50% for 'Workout Harbourside Boxes' brings the adjustment for Steve Owens to +13% and for Peter Hadwin to -12%. Finally, limits on the amount of change allowed for a single match, based on player level, level confidence and time since the last match, cap Steve Owens at a +10% and Peter Hadwin at a -5% level change. In general, a player's level won't rise by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.
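The "shared adjustment" step can be sketched as follows. The square-root split below is an assumption chosen to reproduce the quoted figures (485, 215 and 71%), not the site's published formula: if the winner moves up by a factor r and the loser moves down by the same factor, r must satisfy (start_winner × r) / (start_loser / r) = performance ratio.

```python
import math

def implied_played_levels(start_winner, start_loser, perf_ratio):
    """Split a match's implied performance ratio equally between both players.

    perf_ratio is how much better the winner is judged to have played than
    the loser (2.25 here, i.e. "125% better").  Solving
    (start_winner * r) / (start_loser / r) == perf_ratio for r gives the
    shared adjustment factor.
    """
    r = math.sqrt(perf_ratio * start_loser / start_winner)
    return start_winner * r, start_loser / r, r

# Numbers from this match: Owens started at 283, Hadwin at 369, and the
# combined games/points weighting judged Owens to have played 125% better.
owens_played, hadwin_played, r = implied_played_levels(283, 369, 2.25)
print(round(owens_played), round(hadwin_played), f"{r - 1:.0%}")  # → 485 215 71%
```

The later damping, confidence and event-weighting steps each shrink this raw 71% before the per-match limits are applied.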
Level confidence is increased for one more match played: Steve Owens 87%, Peter Hadwin 84%. It is then reduced based on how unexpected the result was: Steve Owens 51%, Peter Hadwin 49%. A final adjustment of +0.3% has been made to both players as part of the automatic calibration performed after each match; all players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Peter Hadwin: 352, level confidence: 49%.
Final level for Steve Owens: 312, level confidence: 51%.
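The last two steps above, capping the proposed change at the per-match limits and then applying the +0.3% pool calibration, reproduce the final levels exactly. A minimal sketch, assuming the cap and calibration are simple multiplicative factors (the function name and default limits are illustrative, with the 7-day limits of +10%/-5% quoted above):

```python
def apply_limits_and_calibration(start, adjustment, up_limit=0.10,
                                 down_limit=0.05, calibration=0.003):
    """Cap a proposed per-match level change at the player's limits, then
    apply the pool-wide calibration that follows every match."""
    if adjustment >= 0:
        capped = min(adjustment, up_limit)      # rise capped at up_limit
    else:
        capped = max(adjustment, -down_limit)   # drop capped at down_limit
    return start * (1 + capped) * (1 + calibration)

# Owens: +13% proposed, capped at +10%; Hadwin: -12% proposed, capped at -5%.
owens_final = apply_limits_and_calibration(283, 0.13)
hadwin_final = apply_limits_and_calibration(369, -0.12)
print(round(owens_final), round(hadwin_final))  # → 312 352
```

These match the final levels reported for the match (312 and 352); the confidence figures are updated separately as described above.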