Explaining level changes

Division 8: Colston v BAWA (Wed 04 Nov 2015)

Match played between John Press (home) and Andrew Hardie (away). Match won by Andrew Hardie.
Result: 4-9,10-8,4-9,4-9:5-9,9-5,9-1,9-7:9-0,9-0,9-2:9-7,9-8,9-6:9-3,8-10,9-5,9-0.

Starting level for John Press: 697, level confidence: 77%. Set manually.
Starting level for Andrew Hardie: 605, level confidence: 89%. Set manually.
John Press was expected to win, as he was currently playing 15% better than Andrew Hardie.

Andrew Hardie won 75% of the games and 61% of the points. The games result would be expected if he were better by around 25%; the points result would be expected if he were better by around 26% (English scoring). These are weighted and combined to calculate that Andrew Hardie played 25% better than John Press in this match.

Assuming that any level change is shared between both players, this result suggests that Andrew Hardie actually played at a level of 727 and John Press at a level of 580. Without any damping, both players would need to be adjusted by 20% to match this result.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Andrew Hardie changes to +16% and for John Press to -19%.

After applying standard match damping, the adjustment for Andrew Hardie becomes +9.3% and for John Press -10.6%.

Applying the match/event weighting of 75% for 'Mixed Autumn 2015/2016', the adjustment for Andrew Hardie becomes +7% and for John Press -7.7%.

Limits on the amount of change for a single match, based on player level, level confidence and time since last match, restrict John Press to a -5% level change. In general, a player's level won't go up by more than 10% or drop by more than 5% if they have played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased for one more match played. Andrew Hardie: 95%, John Press: 88%.
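The shared-adjustment step above can be sketched numerically. SquashLevels does not publish its formulas, so the function below is an assumption: it splits the observed performance ratio symmetrically (a geometric split) between the two players, which reproduces the quoted effective levels and the 20% undamped adjustment.

```python
import math

def shared_adjustment(winner_level, loser_level, observed_ratio):
    """Split an observed performance ratio equally between both players.

    If the winner performed `observed_ratio` better than the loser
    (e.g. 1.25 for "25% better"), find the factor x such that raising
    the winner by x and lowering the loser by x reproduces that ratio:
        (winner_level * x) / (loser_level / x) = observed_ratio
    """
    x = math.sqrt(observed_ratio * loser_level / winner_level)
    return winner_level * x, loser_level / x

# Figures from the match above: Andrew Hardie started at 605,
# John Press at 697, and Andrew played about 25% better.
hardie, press = shared_adjustment(605, 697, 1.25)
# -> roughly 726 and 581, matching the quoted effective levels of
# 727 and 580 to within rounding, with x ~ 1.20 (the 20% adjustment).
```

Here x comes out at about 1.20 for both players, consistent with the text's statement that each level would move by 20% before damping.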
Level confidence is then reduced based on how unexpected the result was. Andrew Hardie: 79%, John Press: 73%.

A final adjustment of +0.4% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for John Press: 671, level confidence: 73%.
Final level for Andrew Hardie: 645, level confidence: 79%.
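The event-weighting and per-match-limit steps above can be sketched as follows. This is a minimal sketch under stated assumptions: simple multiplicative scaling by the event weight (which gives -7.95% rather than the quoted -7.7% for John Press, so the real system presumably rounds or compounds differently), and the general +10%/-5% bounds treated as fixed defaults, whereas the real limits vary with level, confidence and time since the last match.

```python
def weight_and_limit(change_pct, event_weight=0.75,
                     max_drop=-5.0, max_gain=10.0):
    """Scale a damped per-match adjustment by the event weighting,
    then clamp it to the per-match limits.  The +10%/-5% defaults are
    the general limits quoted for players active in the last 7 days."""
    weighted = change_pct * event_weight
    return max(max_drop, min(max_gain, weighted))

# Andrew Hardie: +9.3% damped -> +7.0% after 75% weighting (inside limits)
print(round(weight_and_limit(9.3), 1))    # 7.0
# John Press: -10.6% damped -> weighted below the floor, capped at -5%
print(round(weight_and_limit(-10.6), 1))  # -5.0
```

The clamp is why John Press ends at -5% even though the weighted adjustment was larger: the floor binds for him but not for Andrew Hardie.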