Explaining level changes

Division 4B: Kingsdown v BAWA (Wed 11 Feb 2015)

Match played between Andy Page (home) and Steve R Walker (away). Match won by Steve R Walker. Result: 3-9,2-9,2-9:9-1,7-9,9-6,9-0:3-9,9-5,9-6,9-0:9-2,2-9,9-1,9-5:9-2,9-1,9-2.

Starting level for Andy Page: 503, level confidence: 70%. Starting level for Steve R Walker: 537, level confidence: 57%. Steve R Walker was predicted to win, as his level going into the match was 7% higher than Andy Page's.

Steve R Walker won all of the games and 79% of the points. A games result like this would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 97% (English scoring). These are weighted and combined to calculate that Steve R Walker played 97% better than Andy Page in this match.

Assuming that any level change is shared between both players, this result implies that Steve R Walker actually played at a level of 729 and Andy Page at a level of 370. Without any damping, both players would need to be adjusted by 36% to match this result (reconstructed in the first sketch below).

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Steve R Walker becomes +35% and for Andy Page -27%. After applying standard match damping, the adjustment for Steve R Walker becomes +17.7% and for Andy Page -14%. Applying the match/event weighting of 75% for 'Mixed Spring 2014/2015', the adjustment for Steve R Walker becomes +13% and for Andy Page -10.4%.

Limits on the amount of change from a single match, based on player level, level confidence and time since last match, restrict Steve R Walker to a +10% and Andy Page to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is first increased because one more match has been played (Steve R Walker: 76%, Andy Page: 84%) and then reduced according to how unexpected the result was (Steve R Walker: 56%, Andy Page: 62%).

A final adjustment of -0.2% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Andy Page: 478, level confidence: 62%. Final level for Steve R Walker: 590, level confidence: 56%.
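
The "shared level change" step can be reconstructed from the numbers above: if "X% better" is read as a level ratio of 1 + X/100, then multiplying Steve R Walker's level by a factor f and dividing Andy Page's by the same f until the ratio matches the match performance reproduces the implied levels of 729 and 370 and the 36% raw adjustment. The Python sketch below is that reconstruction; the function name and the interpretation of the percentages are assumptions, not the site's published formula.

    # Hypothetical reconstruction of the "shared level change" step.
    # The names and the formula are inferred from the numbers quoted above,
    # not taken from a published algorithm.

    def shared_adjustment(level_a, level_b, performance_ratio):
        """Given two starting levels and how much better player B played than
        player A in this match (1.97 for "97% better"), return the implied
        played-at levels and the shared adjustment factor f. The change is
        shared symmetrically: B's level is multiplied by f and A's divided
        by f, so that the new ratio matches the match performance."""
        current_ratio = level_b / level_a                  # 537/503 ~ 1.07, i.e. "7% better"
        f = (performance_ratio / current_ratio) ** 0.5     # shared multiplicative change
        return level_a / f, level_b * f, f

    andy, steve = 503, 537
    implied_andy, implied_steve, f = shared_adjustment(andy, steve, 1.97)
    print(round(implied_steve), round(implied_andy))       # 729 370
    print(f"raw adjustment ~ {(f - 1) * 100:.0f}%")        # ~36%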
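
The later stages can be illustrated the same way. The confidence-weighting and damping formulas are not spelled out above, so the hedged sketch below simply takes the damped adjustments (+17.7% and -14%) as inputs and shows the event weighting, the per-match limits and the final level arithmetic; all names are placeholders.

    # Hedged sketch of the later stages: event weighting, per-match limits and
    # the final level arithmetic. The confidence-weighting and damping formulas
    # are not given above, so the damped figures (+17.7% / -14%) are taken as
    # inputs rather than computed.

    def clamp(pct, lower, upper):
        """Per-match limit on how far a level may move in one match."""
        return max(lower, min(upper, pct))

    steve_pct, andy_pct = +17.7, -14.0         # adjustments after standard match damping

    # Event weighting of 75% for 'Mixed Spring 2014/2015', assumed to scale the
    # percentage adjustment directly.
    steve_pct *= 0.75                          # ~13.3, reported as +13%
    andy_pct *= 0.75                           # -10.5, reported as -10.4%

    # Limits based on level, confidence and recency: at most +10% up or -5% down
    # when the player has played within the last 7 days.
    steve_pct = clamp(steve_pct, -5.0, +10.0)  # capped at +10%
    andy_pct = clamp(andy_pct, -5.0, +10.0)    # capped at -5%

    # Apply to the starting levels. The additional -0.2% pool calibration and
    # display rounding account for the small remaining differences.
    print(537 * (1 + steve_pct / 100))         # ~590.7, reported final level 590
    print(503 * (1 + andy_pct / 100))          # ~477.9, reported final level 478

Under these assumptions the event weighting and the per-match caps alone land within rounding distance of the reported final levels of 590 and 478.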