Explaining level changes

Premier League: Rhiwbina v Devon & Exeter Premier (Tue 19 Nov 2024)
Match played between Peter Berkley (home) and Tim Arthur (away).
Match won by Tim Arthur. Result: 8-11, 5-11, 1-11.

Starting level for Peter Berkley: 8,371 (level confidence: 60%).
Starting level for Tim Arthur: 10,573 (level confidence: 74%). Set manually.

Tim Arthur was expected to win, as he was playing 26% better than Peter Berkley going into the match.

Tim Arthur won all of the games and 70% of the points. This games result would be expected if he were better by around 55% or more. This points result would be expected if he were better by around 136% (PAR scoring). These are weighted and combined to calculate that Tim Arthur played 136% better than Peter Berkley in this match.

Assuming that any level changes are shared between both players, for this result it looks like Tim Arthur actually played at a level of 14,444 and Peter Berkley at a level of 6,128. Without any damping, both players would need to be adjusted by 37% to match this result. Allowing for the difference in level between the players, the adjustments are reduced to 32% and 32% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Tim Arthur changes to +26% and for Peter Berkley to -32%.

After applying standard match damping, the adjustment for Tim Arthur becomes +6.7% and for Peter Berkley -8.3%. Applying the match/event weighting of 85% for 'Mixed South West Premier League 2024/2025', the adjustment for Tim Arthur becomes +5.7% and for Peter Berkley -7%.

Limits are then applied to the amount of change allowed for a single match, based on player level, level confidence and time since the last match; as a result, Peter Berkley is limited to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence is increased because one more match has been played (Tim Arthur: 86%, Peter Berkley: 77%) and then reduced based on how unexpected the result was (Tim Arthur: 63%, Peter Berkley: 57%).

A final adjustment of -0.3% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for Peter Berkley: 8,067 (level confidence: 57%).
Final level for Tim Arthur: 10,986 (level confidence: 63%).
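The core arithmetic quoted above can be checked directly from the score. The sketch below is an illustrative reconstruction, not the site's actual code: it assumes the points-based performance ratio is simply the ratio of points won, and that "shared" level changes mean the two implied match levels keep the same combined (geometric mean) level as the starting levels. The games-based estimate, damping and confidence weighting formulas are not given above, so they are not reproduced.

from math import sqrt

# Starting levels quoted above.
peter_level, tim_level = 8371, 10573

# Pre-match prediction: Tim is currently playing this much better.
pre_match_ratio = tim_level / peter_level                   # ~1.26 -> "26% better"

# Match score (PAR scoring): 11-8, 11-5, 11-1 to Tim.
tim_points, peter_points = 11 + 11 + 11, 8 + 5 + 1
points_share = tim_points / (tim_points + peter_points)     # ~0.70 -> "70% of the points"

# Assumed points-based performance ratio: ratio of points won.
performance_ratio = tim_points / peter_points               # ~2.36 -> "136% better"

# Assumed "shared" change: implied match levels whose ratio matches the
# performance ratio while the combined (geometric mean) level stays unchanged.
combined = sqrt(peter_level * tim_level)
tim_played_at = combined * sqrt(performance_ratio)          # ~14,444
peter_played_at = combined / sqrt(performance_ratio)        # ~6,128

# Undamped adjustment needed to match this result (same size for both players).
raw_adjustment = tim_played_at / tim_level - 1              # ~0.37 -> "37%"

# Event weighting of 85% applied to the already-damped adjustments quoted above.
event_weight = 0.85
tim_adj, peter_adj = 0.067 * event_weight, -0.083 * event_weight   # ~+5.7% / ~-7%

print(f"prediction ratio {pre_match_ratio:.2f}, points share {points_share:.0%}")
print(f"performance ratio {performance_ratio:.2f}")
print(f"implied levels: Tim {tim_played_at:,.0f}, Peter {peter_played_at:,.0f}")
print(f"raw adjustment {raw_adjustment:.0%}, weighted: {tim_adj:+.1%} / {peter_adj:+.1%}")

Running this reproduces the quoted 26%, 70%, 136%, 14,444 / 6,128 and 37% figures, and the +5.7% / -7% weighted adjustments.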
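The per-match limit step can be sketched as a simple clamp. The caps of +10% up and -5% down for a player who has played within the last 7 days come from the text; how those caps relax for longer gaps is not specified, so the linear relaxation below is a placeholder assumption, as is the assumption that Peter Berkley's previous match was within the last 7 days.

def limit_adjustment(adjustment: float, days_since_last_match: int,
                     max_up: float = 0.10, max_down: float = 0.05) -> float:
    """Clamp a proposed fractional level change for a single match."""
    if days_since_last_match > 7:
        relax = days_since_last_match / 7   # assumed: caps grow with the gap
        max_up, max_down = max_up * relax, max_down * relax
    return max(-max_down, min(max_up, adjustment))

# Peter's proposed -7% change is capped at -5%; Tim's +5.7% is within the caps.
print(limit_adjustment(-0.070, days_since_last_match=7))   # -0.05
print(limit_adjustment(+0.057, days_since_last_match=7))   # 0.057

The -0.3% pool calibration applied afterwards is a straight multiplication of both players' levels by 0.997; how that value is derived, and how the confidence updates are computed, is not explained above.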