Explaining level changes

Division 3: Bradley Stoke Squash Club A v Shepton Mallet B (Wed 10 Feb 2016)

Match played between George Morgan (home) and Simon Ashfield (away). Match won by George Morgan. Result: 9-2, 9-2, 9-1 : 10-9, 6-9, 9-0, 9-3 : 9-0, 9-3, 9-4 : 9-5, 9-6, 6-9, 4-9, 3-9 : 9-1, 10-8, 9-2.

Starting level for George Morgan: 1,752 (level confidence 85%, set manually). Starting level for Simon Ashfield: 1,512 (level confidence 85%, set manually). George Morgan is expected to win as he is currently playing 16% better than Simon Ashfield (1,752 / 1,512 ≈ 1.16).

George Morgan won all of the games and 84% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 134% (English scoring). These are weighted and combined to calculate that George Morgan played 134% better than Simon Ashfield in this match.

Assuming that any level change is shared between both players, this result suggests that George Morgan actually played at a level of 2,490 and Simon Ashfield at a level of 1,064. Without any damping, both players would need to be adjusted by 42% to match this result.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for George Morgan becomes +39% and for Simon Ashfield -39%. After applying standard match damping, the adjustment for George Morgan becomes +15.4% and for Simon Ashfield -16%. Applying the match/event weighting of 75% for 'Mixed Spring 2015/2016', the adjustment for George Morgan is +12% and for Simon Ashfield is -12%.

Limits on the amount of change allowed for a single match, based on player level, level confidence and time since last match, restrict George Morgan to a +10% level change and Simon Ashfield to -5%. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played (George Morgan: 92%, Simon Ashfield: 92%), then reduced according to how unexpected the result was (George Morgan: 65%, Simon Ashfield: 65%).

A final adjustment of -0.1% has been made to both players as part of the automatic calibration performed after each match; all players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for George Morgan: 1,924 (level confidence 65%). Final level for Simon Ashfield: 1,440 (level confidence 65%).
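To make the sequence of steps above concrete, here is a minimal Python sketch of the same pipeline. It is not the site's actual implementation: the damping factor of 0.4, the fixed +10%/-5% caps, and the omission of the confidence-weighting and calibration steps are simplifying assumptions, and the function name and parameters are hypothetical. Fed with this match's starting levels and the combined 134% performance figure, it reproduces the implied levels (2,490 and 1,064), the 42% undamped adjustment, and approximately the final capped levels.

import math

def explain_adjustment(
    winner_level: float,        # pre-match level of the winner
    loser_level: float,         # pre-match level of the loser
    observed_ratio: float,      # how much better the winner played, e.g. 2.34 for "134% better"
    damping: float = 0.4,       # hypothetical damping factor (not published by the site)
    event_weight: float = 0.75, # match/event weighting, e.g. 75% for this league
    max_up: float = 0.10,       # default cap: a level rises at most 10% per match
    max_down: float = 0.05,     # default cap: a level drops at most 5% per match
):
    """Sketch of the level-update steps described above (implied levels ->
    damping -> event weighting -> per-match limits). The confidence-weighting
    and pool-calibration steps are deliberately omitted."""
    pre_ratio = winner_level / loser_level

    # Share the discrepancy equally between the two players: a single factor f
    # moves the winner up and the loser down so the implied ratio matches the
    # observed performance: (winner*f) / (loser/f) = observed  =>  f = sqrt(observed/pre).
    f = math.sqrt(observed_ratio / pre_ratio)
    implied_winner = winner_level * f
    implied_loser = loser_level / f
    raw_change = f - 1.0                      # e.g. 0.42 for a 42% undamped adjustment

    # Damping and event weighting both scale the raw change down.
    weighted = raw_change * damping * event_weight

    # Per-match limits (relaxed in the real system if the last match was long ago).
    winner_change = min(weighted, max_up)
    loser_change = min(weighted, max_down)

    return {
        "implied_levels": (round(implied_winner), round(implied_loser)),
        "undamped_change": raw_change,
        "final_winner_level": round(winner_level * (1.0 + winner_change)),
        "final_loser_level": round(loser_level * (1.0 - loser_change)),
    }

# Example with the numbers from this match (134% better => ratio of 2.34):
print(explain_adjustment(winner_level=1752, loser_level=1512, observed_ratio=2.34))

Under these assumptions the caps are what bind here: the damped, weighted change still exceeds both limits, so the winner ends up +10% and the loser -5%, close to the published final levels of 1,924 and 1,440 before the -0.1% calibration.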