Explaining level changes

Premier A: Redland Anacondas v University of Bath 1 (Wed 29 Mar 2017)

Match played between Dan West (home) and Tim Arthur (away). Match won by Dan West. Result: 11-4,11-5,11-8:9-11,5-11,5-11:11-7,11-6,11-3:13-11,11-4,7-11,11-8:?-?.

Starting level for Dan West: 13,600, level confidence: 52%. Starting level for Tim Arthur: 5,362, level confidence: 74%. Set manually.

Dan West to win, as he is currently playing 154% better than Tim Arthur.

Dan West won all of the games and 66% of the points. A games result like this would be expected if he were better by around 55% or more; a points result like this would be expected if he were better by around 94% (PAR scoring). These are weighted and combined to calculate that Dan West played 94% better than Tim Arthur in this match.

Due to the difference between the players' levels, an allowance is made for the likelihood that Dan West was taking it easy, by anything up to 19%. This gives him an allowed level range for this match of 8,446 to 13,600 within which his level is not affected. In this case, Dan West played at level 10,703 and remained within his allowed range, so his level will not be adjusted.

On the assumption that Dan West would normally have been playing at level 11,005 (based on typical behaviour), Tim Arthur played better than expected and therefore gains a pre-damping level increase of 2.8%.

Allowing for the difference in level between the players, the adjustments are reduced to 0% and +1.8% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Dan West stays at 0% and for Tim Arthur becomes +1.2%. After applying standard match damping, the adjustments become 0% and +0.7%. Given Tim Arthur's level and the type of match played, an additional damping of 1.6% is applied to his level change. A match/event weighting of 75% is then applied for 'Mixed Spring 2016/2017', leaving the adjustment for Dan West at 0% and for Tim Arthur at +0.5%.

Level confidence increases because one more match has been played: Dan West 72%, Tim Arthur 86%. It is then reduced according to how unexpected the result was: Dan West 63%, Tim Arthur 75%.

A final adjustment of -2.1% has been made to both players as part of the automatic calibration performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for Dan West: 13,083, level confidence: 63%. Final level for Tim Arthur: 5,371, level confidence: 75%.
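The percentages above appear to follow two simple ratio rules: "X% better" is the ratio of the two levels minus one, and under PAR scoring the performance implied by a share of points won is the ratio of points won to points lost minus one. Below is a minimal Python sketch assuming those relationships; the function names and rounding are my own, and the page does not say how the games and points results are weighted together.

# Sketch of the ratio rules the page's figures appear to follow.
# The exact games/points weighting used by the site is not described here.

def percent_better(level_a: float, level_b: float) -> float:
    """How much 'better' player A is than player B, as used above:
    the ratio of their levels minus 1, expressed as a percentage."""
    return (level_a / level_b - 1.0) * 100.0

def percent_better_from_points(points_share: float) -> float:
    """Performance implied by the share of points won under PAR scoring:
    points won divided by points lost, minus 1, as a percentage."""
    return (points_share / (1.0 - points_share) - 1.0) * 100.0

if __name__ == "__main__":
    # Pre-match prediction: levels 13,600 vs 5,362 -> ~154% better.
    print(round(percent_better(13_600, 5_362)))       # 154
    # Winning 66% of the points -> ~94% better under PAR scoring.
    print(round(percent_better_from_points(0.66)))    # 94
    # The 'played at' level of 10,703 corresponds to roughly 100% better
    # than Tim Arthur's 5,362, so the displayed 94% is evidently a rounded
    # figure and the games result is combined in before the level is set.
    print(round(percent_better(10_703, 5_362)))       # 100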
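The adjustment itself is whittled down through a chain of stages before it reaches the level. The sketch below reproduces Tim Arthur's chain (+2.8% to +1.8% to +1.2% to +0.7% to +0.5%) on the assumption that each stage simply scales the change by a factor. Only the 75% event weighting is stated explicitly on the page; the other factors are back-computed from the rounded percentages shown, and reading "an additional damping of 1.6%" as a further 1.6% reduction of the change is my assumption.

# Sketch of the per-match adjustment pipeline, assuming multiplicative stages.
# Factors other than the 75% event weighting are illustrative back-calculations.

from typing import List, Tuple

def apply_pipeline(pre_damping_pct: float,
                   stages: List[Tuple[str, float]]) -> float:
    """Scale a pre-damping level change (in %) through successive stages,
    printing the running value after each one."""
    change = pre_damping_pct
    print(f"pre-damping change: {change:+.1f}%")
    for name, factor in stages:
        change *= factor
        print(f"after {name:<28} {change:+.1f}%")
    return change

if __name__ == "__main__":
    stages = [
        ("level-difference reduction", 1.8 / 2.8),  # implied ~0.64
        ("confidence weighting",       1.2 / 1.8),  # implied ~0.67
        ("standard match damping",     0.7 / 1.2),  # implied ~0.58
        ("additional damping (1.6%)",  1 - 0.016),  # assumed interpretation
        ("event weighting (75%)",      0.75),       # stated on the page
    ]
    apply_pipeline(2.8, stages)  # ends at roughly +0.5%, as shown above
    # The -2.1% automatic calibration is separate: it is applied equally to
    # every player in the pool, not as part of this per-match chain.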