Explaining level changes

Division 2: University of Bath 3 v BAWA / Kingswood (Wed 03 Feb 2016)

Match played between Mike Gamblin (home) and Simon Strange (away). Match won by Simon Strange. Result: 1-9, 9-10, 9-3, 2-9.

Starting level for Mike Gamblin: 2,072, level confidence: 76%. Set manually.
Starting level for Simon Strange: 1,307, level confidence: 86%. Set manually.

Mike Gamblin was expected to win as he was playing 59% better than Simon Strange going into the match.

Simon Strange won 75% of the games and 60% of the points. The games result would be expected if he were better by around 25%, and the points result would be expected if he were better by around 24% (English scoring). These are weighted and combined to calculate that Simon Strange played 25% better than Mike Gamblin in this match. An upset!

Assuming that any level change is shared between both players, this result suggests that Simon Strange actually played at a level of 1,837 and Mike Gamblin at a level of 1,474. Without any damping, both players would need to be adjusted by 41% to match this result.

Allowing for the difference in level between the players, the adjustments are reduced to 31% and 31% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Simon Strange becomes +28% and for Mike Gamblin -31%. After applying standard match damping, the adjustment for Simon Strange becomes +13% and for Mike Gamblin -12%. Applying the match/event weighting of 75% for 'Mixed Spring 2015/2016', the adjustment for Simon Strange is +9.8% and for Mike Gamblin is -8.8%.

Limits are then applied to the amount of change allowed for a single match, based on player level, level confidence and time since the last match; here Mike Gamblin is limited to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is first increased because one more match has been played (Simon Strange: 92%, Mike Gamblin: 87%), then reduced according to how unexpected the result was (Simon Strange: 66%, Mike Gamblin: 62%).

A final adjustment of -0.1% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Mike Gamblin: 1,970, level confidence: 62%.
Final level for Simon Strange: 1,433, level confidence: 66%.
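The arithmetic above can be checked step by step. The sketches below are approximations in Python, not SquashLevels' own code, and any helper names are made up. First, the pre-match expectation and the score shares:

```python
def percent_better(level_a: float, level_b: float) -> float:
    """How much better player A is playing than player B, as a percentage."""
    return (level_a / level_b - 1.0) * 100.0

mike, simon = 2072, 1307
print(f"Mike playing {percent_better(mike, simon):.0f}% better")  # ~59%

# Game scores as (Mike, Simon) pairs, taken from the result above.
games = [(1, 9), (9, 10), (9, 3), (2, 9)]
simon_games = sum(1 for m, s in games if s > m)
simon_points = sum(s for _, s in games)
total_points = sum(m + s for m, s in games)
print(f"Simon won {simon_games / len(games):.0%} of the games")      # 75%
print(f"Simon won {simon_points / total_points:.0%} of the points")  # ~60%
```

The mapping from those shares to "better by around 25%/24%" is internal to SquashLevels and is not reproduced here.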
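The "shared" played-at levels of 1,837 and 1,474 are consistent with keeping the combined level of the two players fixed (their product, i.e. geometric mean) while setting their ratio to the observed performance ratio. A sketch under that assumption:

```python
import math

def played_at_levels(level_outperformer: float, level_other: float,
                     performance_ratio: float) -> tuple[float, float]:
    """Split the result so the players' combined (geometric-mean) level is
    unchanged while their ratio matches the observed performance ratio.
    This reading of 'shared between both players' is an assumption."""
    combined = level_outperformer * level_other
    hi = math.sqrt(combined * performance_ratio)
    lo = math.sqrt(combined / performance_ratio)
    return hi, lo

simon_played, mike_played = played_at_levels(1307, 2072, 1.25)
print(round(simon_played), round(mike_played))  # ~1840, ~1472 (report: 1,837 and 1,474)

# The undamped adjustment each player would need to match this result:
print(f"Simon up by {simon_played / 1307 - 1:.0%}, "
      f"Mike down by {2072 / mike_played - 1:.0%}")  # ~41% each, before damping
```

The small difference from the reported 1,837/1,474 comes from using the rounded 25% figure; the unrounded ratio (about 1.246) reproduces the reported levels.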
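The chain of damping steps is reported only as outputs, so the sketch below simply records them; the one step that is explicit is the event weighting, which scales the percentage change by the 75% weight:

```python
# Adjustment chain for Simon Strange, percentages as reported above.
raw       = +41.0  # undamped change implied by the result
reduced   = +31.0  # after allowing for the level difference
confident = +28.0  # after factoring in relative level confidence
damped    = +13.0  # after standard match damping

event_weight = 0.75  # 'Mixed Spring 2015/2016'
print(f"{damped * event_weight:+.1f}%")  # +9.8%, as reported
```

Mike Gamblin's -12% becomes -8.8% rather than -9.0%, which suggests the underlying damped figure was nearer -11.7% before rounding.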
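The per-match limit behaves like a clamp. The +10%/-5% limits for players who have played within the last 7 days are stated above; the relaxation for longer gaps is described but not quantified, so the linear schedule below is purely illustrative:

```python
def clamp_change(factor: float, days_since_last_match: int) -> float:
    """Cap the per-match level change. The within-7-days limits are from
    the explanation above; the linear relaxation beyond 7 days is a guess."""
    relax = max(1.0, days_since_last_match / 7.0)  # hypothetical schedule
    upper = 1.0 + 0.10 * relax
    lower = 1.0 - 0.05 * relax
    return min(max(factor, lower), upper)

# Days since last match is not given in the report; 3 is a placeholder.
print(clamp_change(0.912, 3))  # 0.95  -> Mike's -8.8% is limited to -5%
print(clamp_change(1.098, 3))  # 1.098 -> Simon's +9.8% is within +10%
```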
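The confidence updates are given only as outputs (Simon 86% to 92% to 66%, Mike 76% to 87% to 62%). The reduction step happens to be consistent with dividing confidence by roughly one plus the undamped surprise; that is an observation fitted to these two data points, not a published formula:

```python
def reduce_confidence(conf: float, surprise: float) -> float:
    """Illustrative only: reduce confidence after an unexpected result.
    The division by (1 + surprise) is reverse-engineered from this one
    match, not taken from SquashLevels documentation."""
    return conf / (1.0 + surprise)

print(round(reduce_confidence(0.92, 0.40) * 100))  # 66 (Simon)
print(round(reduce_confidence(0.87, 0.40) * 100))  # 62 (Mike)
```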
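Putting the pieces together with the -0.1% pool calibration reproduces the final levels to within rounding (the report rounds its intermediate percentages, so the last digit or two can differ):

```python
calibration = 0.999  # -0.1% applied equally to every player in the pool

simon_final = 1307 * 1.098 * calibration  # +9.8% change, then calibration
mike_final  = 2072 * 0.95  * calibration  # capped at -5%, then calibration
print(round(simon_final), round(mike_final))  # ~1434 and ~1966 (report: 1,433 and 1,970)
```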