Explaining level changes

Division 2: University of Bath 2 v Redland Eagles (Wed 23 Oct 2024)

Match played between Justin Lilico (home) and Keir Williams (away). Match won by Justin Lilico.

Result: 15-6,15-9,15-6:12-15,15-9,15-13,6-15,19-17:12-15,17-15,15-11,15-9:7-15,14-16,7-15:15-9,15-10,5-15,13-15,6-15.

Starting level for Justin Lilico: 3,139 (level confidence: 70%). Starting level for Keir Williams: 2,577 (level confidence: 76%). Set manually.

Justin Lilico to win, as he is currently playing 22% better than Keir Williams.

Justin Lilico won all of the games and 68% of the points. A games result like this would be expected if he were better by around 55% or more; a points result like this would be expected if he were better by around 114% (PAR scoring). These are weighted and combined to calculate that Justin Lilico played 114% better than Keir Williams in this match.

Assuming that any level changes are shared between both players, this result suggests that Justin Lilico actually played at a level of 4,163 and Keir Williams at a level of 1,943. Without any damping, both players would need to be adjusted by 33% to match this result.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Justin Lilico changes to +29% and for Keir Williams to -26%.

After applying standard match damping, the adjustment for Justin Lilico becomes +10% and for Keir Williams -9.8%.

Applying the match/event weighting of 75% for 'Mixed Autumn 2024/2025', the adjustment for Justin Lilico becomes +7.5% and for Keir Williams -7.2%.

Limits on the amount of change for a single match, based on player level, level confidence and time since last match, restrict Keir Williams to a -5% level change. In general a player's level won't go up by more than 10% or drop by more than 5% if they've played in the last 7 days, but those limits are relaxed if their previous match was further back.

Level confidence increases because one more match has been played (Justin Lilico: 84%, Keir Williams: 87%), then is reduced based on how unexpected the result is (Justin Lilico: 63%, Keir Williams: 66%).

A final adjustment of +0.6% has been made to both players as part of the automatic calibration performed after each match; all players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for Justin Lilico: 3,370 (level confidence: 63%). Final level for Keir Williams: 2,485 (level confidence: 66%).
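The implied in-match levels quoted above (4,163 and 1,943) and the undamped 33% adjustment can be reproduced from the starting levels under the "shared change" assumption: a single factor f scales the winner's level up and the loser's level down until their ratio matches the combined performance. A minimal sketch of that arithmetic is below; the symmetric-factor formula is inferred from the quoted numbers, not a published formula.

```python
import math

# Starting levels quoted above.
winner_level = 3139   # Justin Lilico
loser_level = 2577    # Keir Williams

# Combined in-match performance: the winner played ~114% better, a ratio of ~2.14.
performance_ratio = 2.14

# Shared-change assumption: one factor f moves the winner up and the loser down
# until (winner_level * f) / (loser_level / f) = performance_ratio,
# so f = sqrt(performance_ratio * loser_level / winner_level).
f = math.sqrt(performance_ratio * loser_level / winner_level)

print(f"undamped adjustment: ~{(f - 1) * 100:.0f}%")      # ~33%, as quoted above
print(f"implied winner level: ~{winner_level * f:,.0f}")   # within a point or two of 4,163
print(f"implied loser level:  ~{loser_level / f:,.0f}")    # within a point or two of 1,943
```

The small differences from the quoted levels come from rounding the 114% performance figure.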
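The remaining steps are applied in sequence to that raw ±33%: confidence weighting, standard match damping, event weighting and the per-match limit. The sketch below only illustrates that ordering; the per-step factors are passed in because the explanation quotes each step's output (+29%/-26%, +10%/-9.8%, +7.5%/-7.2%, a -5% cap) without giving the underlying formulas, and the function name, arguments and example values are mine, not the site's.

```python
def adjusted_level(level, raw_change, confidence_factor, damping_factor,
                   event_weight, max_up=0.10, max_down=0.05):
    """Illustrative only: applies the steps in the order described above.
    max_up/max_down are the within-7-days limits quoted in the explanation;
    the explanation says they are relaxed for longer gaps between matches."""
    change = raw_change * confidence_factor        # confidence weighting
    change *= damping_factor                       # standard match damping
    change *= event_weight                         # event weighting (0.75 for this event)
    change = max(-max_down, min(max_up, change))   # per-match limit on level change
    return level * (1 + change)


# Example call shape only (the factor values here are illustrative, not the
# real per-step factors): a -33% raw change is damped and weighted, then
# capped at the -5% limit before being applied to the level.
new_level = adjusted_level(2577, raw_change=-0.33, confidence_factor=0.8,
                           damping_factor=0.38, event_weight=0.75)
```

The confidence updates (up for one more match played, then down for an unexpected result) and the +0.6% pool calibration described above sit outside this per-match chain and are applied afterwards.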