Explaining level changes

Division 2B: BAWA v Redland Z (Wed 18 Jun 2008)
Match played between Matt Waters (home) and Peter Joyce (away). Match won by Peter Joyce.
Result: 1-9, 8-10, 0-9 : 3-9, 0-9, 9-6, 10-8, 9-4 : 3-9, 4-9, 9-7, 2-9 : 0-9, 7-9, 7-9.

Starting level for Matt Waters: 941, level confidence: 63%. Starting level for Peter Joyce: 1,181, level confidence: 71% (set manually). Peter Joyce was expected to win as he is currently playing 26% better than Matt Waters.

Peter Joyce won all of the games and 76% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 81% (English scoring). These are weighted and combined to calculate that Peter Joyce played 81% better than Matt Waters in this match.

Assuming that any level change is shared between both players, this result suggests that Peter Joyce actually played at a level of 1,418 and Matt Waters at a level of 784. Without any damping, both players would need to be adjusted by 20% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 18% each. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Peter Joyce becomes +16% and for Matt Waters -18%. After applying standard match damping, the adjustments become +8.6% for Peter Joyce and -10.2% for Matt Waters. Applying the match/event weighting of 65% for 'Mixed Summer 2008' gives +5.6% for Peter Joyce and -6.4% for Matt Waters. Finally, limits on the amount of change allowed for a single match, based on player level, level confidence and time since the last match, cap Matt Waters at a -5% level change. In general a player's level won't rise by more than 10% or drop by more than 5% if they've played in the last 7 days, though those limits are relaxed if their previous match was further back.
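The shared-adjustment step can be sketched as follows. This is an illustrative reconstruction, not the rating system's actual code: it assumes the "shared" split keeps the product of the two players' levels constant, which reproduces the 20% raw adjustment and the implied playing levels of 1,418 and 784 quoted above.

```python
import math

def shared_adjustment(loser_level, winner_level, implied_ratio):
    """Split a result between both players while keeping the combined
    rating (the product of the two levels) constant.

    implied_ratio: how much better the winner played, e.g. 1.81 for 81%.
    Returns the multiplicative factor applied to the winner's level;
    the loser's level is divided by the same factor.
    """
    current_ratio = winner_level / loser_level   # winner's current advantage
    return math.sqrt(implied_ratio / current_ratio)

# Worked numbers from the match above (Matt 941, Peter 1,181, 81% better):
factor = shared_adjustment(941, 1181, 1.81)
print(round((factor - 1) * 100))   # raw adjustment before damping, ~20%
print(round(1181 * factor))        # Peter's implied playing level, ~1,418
print(round(941 / factor))         # Matt's implied playing level, ~784
```

The square root shares the correction equally (in multiplicative terms) between the two players, which is why both raw adjustments come out at the same 20% before the confidence and damping steps pull them apart.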
Level confidence is increased because one more match has been played: Peter Joyce 84%, Matt Waters 80%. It is then reduced based on how unexpected the result was: Peter Joyce 70%, Matt Waters 66%. A final adjustment of -0.1% is made to both players as part of the automatic calibration performed after each match; all players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Matt Waters: 895, level confidence: 66%. Final level for Peter Joyce: 1,245, level confidence: 70%.
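The closing limit-and-calibration steps can be sketched as below, assuming both are simple multiplicative clamps. The function name and the default limit percentages are illustrative (the real limits depend on level, confidence and days since the last match), and rounding in the intermediate percentages means the output differs by a point or two from the published final levels.

```python
def apply_limit_and_calibration(level, pct_change,
                                floor_pct=-5.0, cap_pct=10.0,
                                calibration_pct=-0.1):
    """Clamp a single-match percentage change to the per-match limits,
    then apply the pool-wide calibration shift. All parameters are
    percentages; defaults are the 'played in the last 7 days' limits."""
    clamped = max(floor_pct, min(cap_pct, pct_change))
    level = level * (1 + clamped / 100)         # capped match adjustment
    return level * (1 + calibration_pct / 100)  # pool calibration (-0.1%)

# Matt's -6.4% is clamped to -5%; Peter's +5.6% is within the limits.
print(round(apply_limit_and_calibration(941, -6.4)))
print(round(apply_limit_and_calibration(1181, 5.6)))
```

Applying the calibration multiplicatively to every player in the pool shifts all levels by the same proportion, so relative standings within the pool are unchanged.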