Explaining level changes

Premier C: Lansdown 2 v David Lloyd Lovecars (Wed 08 Feb 2017)

Match played between Tim Brooksbank (home) and Ian Stuart (away). Match won by Ian Stuart. Result: 0-9, 1-9, 2-9.

Starting level for Tim Brooksbank: 1,825, level confidence: 53%. Starting level for Ian Stuart: 3,591, level confidence: 71%. Set manually.

Ian Stuart was expected to win as he was playing 97% better than Tim Brooksbank going into the match.

Ian Stuart won all of the games and 90% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 209% (English scoring). These are weighted and combined to calculate that Ian Stuart played 209% better than Tim Brooksbank in this match.

Assuming that any level changes are shared between both players, this result suggests that Ian Stuart actually played at a level of 4,500 and Tim Brooksbank at a level of 1,456. Without any damping, both players would need to be adjusted by 25% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 17% and 17% respectively. Factoring in the relative levels of confidence, which lets players with low confidence in their levels change more quickly, the adjustment for Ian Stuart becomes +13% and for Tim Brooksbank -17%.

After standard match damping, the adjustment for Ian Stuart becomes +5% and for Tim Brooksbank -7.9%. Applying the match/event weighting of 75% for 'Mixed Spring 2016/2017', the adjustment for Ian Stuart is +3.8% and for Tim Brooksbank is -5.8%.

Limits are then applied to the amount of change allowed for a single match, based on player level, level confidence and time since last match; Tim Brooksbank is limited to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played (Ian Stuart: 84%, Tim Brooksbank: 73%), then reduced based on how unexpected the result was (Ian Stuart: 67%, Tim Brooksbank: 58%).

A final adjustment of +0.8% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Tim Brooksbank: 1,774, level confidence: 58%. Final level for Ian Stuart: 3,710, level confidence: 67%.
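
As a quick sanity check on the figures above, the short Python sketch below (variable names are illustrative, not taken from the site) recomputes the pre-match "97% better" comparison from the two starting levels and the "90% of the points" share from the 0-9, 1-9, 2-9 scoreline.

    # Recompute the pre-match comparison and the points share quoted above.
    tim_level, ian_level = 1825, 3591
    print(round(100 * (ian_level / tim_level - 1)))             # 97 -> "97% better"

    games = [(0, 9), (1, 9), (2, 9)]                            # Tim's score first in each game
    tim_points = sum(t for t, _ in games)
    ian_points = sum(i for _, i in games)
    print(round(100 * ian_points / (tim_points + ian_points)))  # 90 -> "90% of the points"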
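The shared-adjustment step can also be reproduced approximately. The sketch below assumes the split is geometric, i.e. both players are moved by the same factor until the ratio of their levels matches the combined in-match performance ratio of 3.09 (209% better); the site's exact formula isn't stated, so treat this as an illustration rather than the actual implementation.

    # Illustrative reconstruction of the shared, undamped adjustment.
    # Assumption: both players move by the same factor (geometric split).
    tim_level, ian_level = 1825, 3591
    performance_ratio = 1 + 2.09                      # Ian played ~209% better in this match

    shared_factor = (performance_ratio / (ian_level / tim_level)) ** 0.5
    print(round(100 * (shared_factor - 1)))           # ~25   -> the 25% undamped adjustment
    print(round(ian_level * shared_factor))           # ~4501 -> "played at a level of 4,500"
    print(round(tim_level / shared_factor))           # ~1456 -> "at a level of 1,456"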
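The remaining steps scale that raw adjustment down. The sketch below chains the event weighting and the per-match limit onto the damped figures quoted above (+5% and -7.9%); the damping itself isn't reproduced because its formula isn't given, and the clamp simply encodes the "+10% / -5% within 7 days" rule, so this is a simplified illustration only.

    # Event weighting and per-match limits applied to the damped adjustments.
    EVENT_WEIGHT = 0.75                    # 'Mixed Spring 2016/2017' weighting

    def limit(adjustment, max_up=0.10, max_down=0.05):
        # Default per-match limits for a player who has played in the last 7 days;
        # the real limits also depend on level, confidence and time since last match.
        return max(-max_down, min(max_up, adjustment))

    ian_adj, tim_adj = 0.05, -0.079        # after standard match damping (from the text)
    ian_adj *= EVENT_WEIGHT                # about +3.8%
    tim_adj *= EVENT_WEIGHT                # about -5.9% here; the text shows -5.8%
    print(round(limit(ian_adj), 3))        # 0.038 -> within limits, unchanged
    print(round(limit(tim_adj), 3))        # -0.05 -> limited to a -5% drop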