Explaining level changes
Premier B: BAWA A v Lansdown 1 (Wed 02 Dec 2015)

Match played between Stuart Dagnall (home) and Jeremy Goulding (away). Match won by Jeremy Goulding. Result: 5-9, 1-9, 5-9.

Starting level for Stuart Dagnall: 2,274 (level confidence: 59%). Starting level for Jeremy Goulding: 4,485 (level confidence: 80%), set manually.

Jeremy Goulding was expected to win, as he was playing 97% better than Stuart Dagnall going into the match. Jeremy Goulding won all of the games and 71% of the points. The games result would be expected if he were better by around 55% or more, and the points result if he were better by around 58% (English scoring). These are weighted and combined to calculate that Jeremy Goulding played 57% better than Stuart Dagnall in this match.

Because of the difference between the players' levels, an allowance is made for the likelihood that Jeremy Goulding was taking it easy, by anything up to 12%. This gives him an allowed level range for this match of 3,374 to 4,485 within which his level is not affected. In this case, Jeremy Goulding played at level 3,756 and remained within his allowed range, so his level will not be adjusted.

On the assumption that Jeremy Goulding would normally have been playing at level 3,952 (based on typical behaviour), Stuart Dagnall played better than expected and therefore gains a pre-damping level increase of 5.2%.

Allowing for the difference in level between the players, the adjustments are reduced to 0% and 3.7% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Jeremy Goulding remains 0% and for Stuart Dagnall +3.7%. After applying standard match damping, the adjustment for Jeremy Goulding becomes 0% and for Stuart Dagnall +2.5%. Applying the match/event weighting of 75% for 'Mixed Autumn 2015/2016', the adjustment for Jeremy Goulding is 0% and for Stuart Dagnall +1.8%.

Level confidence increases because one more match has been played: Jeremy Goulding 89%, Stuart Dagnall 77%. It is then reduced according to how unexpected the result was: Jeremy Goulding 80%, Stuart Dagnall 69%.

A final adjustment of -0.2% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Stuart Dagnall: 2,312 (level confidence: 69%). Final level for Jeremy Goulding: 4,478 (level confidence: 80%).
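To make the sequence of adjustments easier to follow, here is a minimal sketch in Python. Only the ordering of the steps and the numbers quoted for this match come from the report; the function names, signatures and individual factors are illustrative assumptions, because the actual SquashLevels formulas and constants are not given in the text (the factors below are simply back-solved from the rounded percentages quoted above).

```python
# Hypothetical sketch of the adjustment pipeline described in the report.
# All function names and factors are assumptions for illustration only.

def stronger_player_adjustment(played_level, allowed_low, allowed_high):
    """The stronger player's level is left alone while he plays within
    his allowed range (the 'taking it easy' allowance)."""
    if allowed_low <= played_level <= allowed_high:
        return 0.0  # within range: no adjustment
    # The formula for out-of-range play is not published in the report.
    raise NotImplementedError("out-of-range adjustment not published")

def weaker_player_adjustment(pre_damping_gain, level_diff_factor,
                             confidence_factor, match_damping_factor,
                             event_weight):
    """Chain of multiplicative reductions applied to the raw level gain."""
    gain = pre_damping_gain
    gain *= level_diff_factor     # allow for the gap between the players
    gain *= confidence_factor     # low confidence => change more quickly
    gain *= match_damping_factor  # standard per-match damping
    gain *= event_weight          # e.g. 75% for 'Mixed Autumn 2015/2016'
    return gain

# Jeremy Goulding: played at 3,756, allowed range 3,374 to 4,485 -> 0%.
goulding_adj = stronger_player_adjustment(3756, 3374, 4485)

# Stuart Dagnall: +5.2% pre-damping gain; the factors are back-solved
# from the reported 3.7%, 3.7%, 2.5% and 1.8% steps, so they are
# approximations rather than the engine's real constants.
dagnall_adj = weaker_player_adjustment(
    pre_damping_gain=0.052,
    level_diff_factor=0.037 / 0.052,
    confidence_factor=1.0,
    match_damping_factor=0.025 / 0.037,
    event_weight=0.75,
)

calibration = -0.002  # -0.2% pool calibration applied to both players

print(f"Jeremy Goulding: {4485 * (1 + goulding_adj) * (1 + calibration):,.0f}")
# -> 4,476 (report: 4,478; the quoted -0.2% is rounded)
print(f"Stuart Dagnall:  {2274 * (1 + dagnall_adj) * (1 + calibration):,.0f}")
# -> 2,312 (matches the reported final level)
```

Because the report only quotes rounded percentages, the back-solved factors, and the small discrepancy for Jeremy Goulding's final level, reflect rounding rather than the real engine constants.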