
Computer analysis of World Chess Champions

CG 2006. Computer analysis of World Chess Champions. Matej Guid and Ivan Bratko. Introduction: Who was the best chess player of all time? Chess players of different eras never met across the chess board, so there is no well-founded, objective answer. Computers...


Presentation Transcript


  1. CG 2006 Computer analysis of World Chess Champions Matej Guid and Ivan Bratko

  2. Introduction Who was the best chess player of all time? • Chess players of different eras never met across the chess board. • There is no well-founded, objective answer. Computers... • were so far mostly used as a tool for statistical analysis of players’ results. High-quality chess programs... • provide an opportunity for an objective comparison. Statistical analysis of results does NOT reflect: • the true strengths of the players, • the quality of play. I Wilhelm Steinitz, 1886 - 1894

  3. Related work Jeff Sonas, 2005: • a rating scheme based on tournament results from 1840 to the present, • ratings are calculated for each month separately, and a player’s activity is taken into account. Disadvantages • Playing level has risen dramatically in recent decades. • The ratings in general reflect the players’ success in competition, but NOT directly their quality of play. II Emanuel Lasker, 1894 - 1921

  4. Our approach • computer analysis of individual moves played • determine players’ quality of play regardless of the game score • the differences in players’ styles were also taken into account • calm positional players vs. aggressive tactical players • a method to assess the difficulty of positions was designed Analysed games • 14 World Champions (classical version) from 1886 to 2004 • analyses of the matches for the title of “World Chess Champion” • a slightly adapted version of the chess program Crafty was used III Jose Raul Capablanca, 1921 - 1927

  5. The modified Crafty • Instead of a time limit, we limited the search to a fixed depth. • Backed-up evaluations from depth 2 to 12 were obtained for each move. • Quiescence search remained turned on to prevent horizon effects. Advantages • complex positions automatically get more computation time, • the program could be run on computers of different computational power. Obtained data • best move and its evaluation, • second-best move and its evaluation, • move played and its evaluation, • material state of each player. IV Alexander Alekhine, 1927 - 1935 and 1937 - 1946
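The per-move data listed on this slide can be pictured as a simple record. The following Python sketch is illustrative only: the class and field names are assumptions, not taken from the authors' code.

```python
from dataclasses import dataclass

@dataclass
class MoveAnalysis:
    """One analysed move, holding the data listed on the slide (names assumed)."""
    best_move: str      # Crafty's best move at the deepest search
    best_eval: float    # its backed-up evaluation, in pawn units
    second_move: str    # second-best move
    second_eval: float  # its evaluation
    played_move: str    # move actually played in the game
    played_eval: float  # evaluation of the played move
    material: int       # material state

# Hypothetical record for a single position:
record = MoveAnalysis("Nf3", 0.35, "d4", 0.22, "g3", 0.10, 39)
print(round(record.best_eval - record.played_eval, 2))  # error on this move: 0.25
```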

  6. Average error • average difference between the moves played and the best evaluated moves • the basic criterion Formula: average error = ∑ |best-move evaluation − played-move evaluation| / number of moves • “Best move” = Crafty’s decision resulting from a 12-ply search Constraints • Evaluations started on move 12. • Positions where both the move suggested and the move played were evaluated outside the interval [−2, 2] were discarded. • Positional players are expected to commit fewer errors than tactical players, due to somewhat less complex positions. V Max Euwe, 1935 - 1937
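The average-error criterion and its two constraints can be sketched in a few lines of Python. The input format, a list of (move number, best-move evaluation, played-move evaluation) tuples in pawn units, is an assumption for illustration.

```python
def average_error(moves):
    """Average |best - played| over analysed moves, per the slide's constraints:
    evaluations start on move 12, and positions where both evaluations fall
    outside [-2, 2] are discarded."""
    errors = []
    for move_no, best_eval, played_eval in moves:
        if move_no < 12:
            continue  # evaluations started on move 12
        if abs(best_eval) > 2 and abs(played_eval) > 2:
            continue  # position already decided either way: discard
        errors.append(abs(best_eval - played_eval))
    return sum(errors) / len(errors) if errors else 0.0

# Hypothetical game fragment: move 11 is skipped, move 14 is discarded.
game = [(11, 0.5, 0.5), (12, 0.3, 0.1), (13, 0.4, 0.4), (14, 5.0, 4.2)]
print(round(average_error(game), 2))  # mean of 0.2 and 0.0: 0.1
```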

  7. Average error V Max Euwe, 1935 - 1937

  8. Blunders • Big mistakes can be quite reliably detected with a computer. • We label a move as a blunder when the numerical error exceeds 1.00. VI Mikhail Botvinnik, 1948 - 1957, 1958 - 1960, and 1961 - 1963
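The blunder criterion is a straightforward threshold on the per-move error; a minimal sketch, with the error list format assumed:

```python
BLUNDER_THRESHOLD = 1.00  # a move is a blunder when its error exceeds 1.00 pawn

def blunder_rate(errors):
    """Fraction of analysed moves whose |best - played| error exceeds 1.00."""
    blunders = sum(1 for e in errors if e > BLUNDER_THRESHOLD)
    return blunders / len(errors)

# Hypothetical per-move errors: one blunder among four moves.
print(blunder_rate([0.0, 0.2, 1.5, 0.1]))  # 0.25
```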

  9. Complexity of a position Basic idea • A given position is difficult when different “best moves”, which considerably alter the evaluation of the root position, are discovered at different search depths. Assumption • This definition of complexity also applies to humans. • This assumption is in agreement with experimental results. Formula: complexity = ∑ |best-move evaluation − second-best-move evaluation|, summed over search depths i where bestᵢ ≠ bestᵢ₋₁ VII Vasily Smyslov, 1957 - 1958

  10. Complexity of a position Euwe-Alekhine, 16th World Championship 1935 complexity = 0.00 VII Vasily Smyslov, 1957 - 1958


  19. Complexity of a position Euwe-Alekhine, 16th World Championship 1935 complexity = 0.00 + (1.30 − 1.16) = 0.14 VII Vasily Smyslov, 1957 - 1958


  21. Complexity of a position Euwe-Alekhine, 16th World Championship 1935 complexity = 0.14 + (4.46 − 1.60) = 0.14 + 2.86 = 3.00 VII Vasily Smyslov, 1957 - 1958
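The running calculation on the preceding slides can be sketched as follows. The per-depth input format and the move names are assumptions; the two evaluation pairs (1.30, 1.16) and (4.46, 1.60) are the ones from the Euwe-Alekhine example.

```python
def complexity(per_depth):
    """Sum |best - second best| at every depth where the best move changes
    from the previous depth, per the slide-9 formula."""
    total = 0.0
    prev_best = None
    for best_move, best_eval, second_eval in per_depth:
        if prev_best is not None and best_move != prev_best:
            total += abs(best_eval - second_eval)
        prev_best = best_move
    return total

# One (best move, best eval, second-best eval) triple per search depth;
# move names "a", "b", "c" are placeholders.
depths = [("a", 1.10, 1.00),   # shallowest depth: no previous best move
          ("a", 1.05, 0.90),   # best move unchanged: contributes nothing
          ("b", 1.30, 1.16),   # change: adds 1.30 - 1.16 = 0.14
          ("b", 1.40, 1.20),   # unchanged
          ("c", 4.46, 1.60)]   # change: adds 4.46 - 1.60 = 2.86
print(f"{complexity(depths):.2f}")  # 3.00
```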

  22. Complexity of a position VII Vasily Smyslov, 1957 - 1958

  23. Average error in equally complex positions • How would players perform if they faced equally complex positions? • What would be their expected error if they were playing in another style? VIII Mikhail Tal, 1960 - 1961
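One way to answer these questions is to average each player's error within complexity buckets and then weight the bucket averages by a single, shared distribution of position complexity. This is a hedged sketch of that idea; the authors' exact procedure may differ, and all numbers below are hypothetical.

```python
def expected_error(bucket_errors, common_weights):
    """bucket_errors: a player's average error per complexity bucket.
    common_weights: shared fraction of positions per bucket (sums to 1).
    Returns the player's expected error on the common complexity mix."""
    return sum(e * w for e, w in zip(bucket_errors, common_weights))

# Hypothetical buckets: low / medium / high complexity.
weights = [0.5, 0.3, 0.2]
positional = [0.02, 0.08, 0.20]  # small errors in the calm positions they seek out
tactical = [0.03, 0.07, 0.15]    # larger raw error, but in harder positions
print(round(expected_error(positional, weights), 3))  # 0.074
print(round(expected_error(tactical, weights), 3))    # 0.066
```

On a common complexity mix, a player whose raw average error is worse can come out ahead, which is exactly the style correction the slide is after.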

  24. Percentage of best moves played • It alone does NOT reveal the true strength of a player. IX Tigran Petrosian, 1963 - 1969

  25. The difference in best move evaluations X Boris Spassky, 1969 - 1972

  26. Percentage of best moves played... and the difference in best-move evaluations XI Robert James Fischer, 1972 - 1975

  27. Material XII Anatoly Karpov, 1975 - 1985

  28. Credibility of Crafty as an analysis tool • By limiting search depth, we achieved automatic adaptation of the time used to the complexity of a given position. • Occasional errors cancel out through statistical averaging (around 1,400 analyses were performed, altogether over 37,000 positions). Using another program instead of Crafty... • An open-source program was required, so that the program could be modified. • Analyses of “Man against the machine” matches indicate that Crafty competently assesses the strength of the strongest chess programs. XIII Garry Kasparov, 1985 - 2000

  29. Conclusion • The slightly modified chess program Crafty was applied as a tool for computer analysis, aiming at an objective comparison of chess players of different eras. • Several evaluation criteria were designed: • average difference between the moves played and the best evaluated moves • rate of blunders (big errors) • expected error in equally complex positions • rate of best moves played & difference in best-move evaluations • A method to assess the difficulty of positions was designed, in order to bring all players to a “common denominator”. • The results might appear quite surprising; overall, they can be nicely interpreted by a chess expert. XIV Vladimir Kramnik, 2000 -

  30. ? XIV Vladimir Kramnik, 2000 -
