
Chess Transformer — Neural Network That Learns To Play Chess

Mikhail Raevskiy
2 min read · Aug 31, 2020


Chess Transformer is a language model trained to play chess. The neural network predicts the next move from the history of moves played so far in the game. The Transformer model was trained on 2.8 million chess games recorded in Portable Game Notation (PGN). The developers have published a Colab notebook in which you can play a game of chess against the model.
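As an illustration of how next-move prediction from a PGN history works in practice, here is a minimal sketch using the Hugging Face transformers library and the python-chess package. The base gpt2-large checkpoint below is only a stand-in for the authors' fine-tuned model; their actual weights and sampling settings are not assumed here.

```python
# Minimal sketch: predict the next chess move from a PGN move history.
# "gpt2-large" is a stand-in for the fine-tuned chess checkpoint.
import chess
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

history = "1. e4 e5 2. Nf3 Nc6 3. Bb5"          # game so far, in PGN
inputs = tokenizer(history, return_tensors="pt")

# Sample a short continuation; the first generated token(s) should be the reply.
output_ids = model.generate(
    **inputs,
    max_new_tokens=6,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
continuation = tokenizer.decode(output_ids[0], skip_special_tokens=True)
candidate = continuation[len(history):].split()[0]  # first proposed move in SAN

# Replay the history on a board, then reject illegal candidates with python-chess.
board = chess.Board()
for san in history.replace(".", " ").split():
    if san and not san.isdigit():
        board.push_san(san)
try:
    board.push_san(candidate)
    print("model plays:", candidate)
except ValueError:
    print("illegal move sampled, resample:", candidate)
```

In a full pipeline, illegal or malformed samples would simply be resampled until a legal SAN move is produced.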

Generated example games from GPT-2. Each generated game is captured as video and split for move-by-move analysis, using Portable Game Notation (PGN) as input, with the moving pieces and automated strategy annotations rendered by the chess package. Source: arXiv

Training details

GPT-2 with 774 million parameters was used as the transformer architecture. GPT-2 is a state-of-the-art generative model for natural language processing tasks, developed by researchers at OpenAI.

The model was fine-tuned for 30,000 steps. The trained model correctly filters out invalid moves and demonstrates defensive strategies such as the Slav Exchange.
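As a rough sketch of what such a fine-tuning run could look like (assuming the Hugging Face transformers library; the hyperparameters and placeholder data below are illustrative, not the authors' exact configuration):

```python
# Sketch of fine-tuning GPT-2 (774M) on PGN game strings as a causal LM.
import torch
from torch.utils.data import DataLoader
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")  # 774M-parameter GPT-2
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2-large")
model.train()

# In the real setup this would be the 2.8M PGN movetext strings.
pgn_games = ["1. e4 e5 2. Nf3 Nc6 3. Bb5 a6 4. Ba4 Nf6"]  # placeholder data

def collate(batch):
    enc = tokenizer(batch, return_tensors="pt", padding=True,
                    truncation=True, max_length=512)
    # Causal LM objective: predict the next token of the game transcript.
    # A real run would mask padded positions out of the loss (label = -100).
    enc["labels"] = enc["input_ids"].clone()
    return enc

loader = DataLoader(pgn_games, batch_size=2, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

step = 0
while step < 30_000:                      # the article reports ~30k fine-tuning steps
    for batch in loader:
        loss = model(**batch).loss        # cross-entropy over next-move tokens
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        step += 1
        if step >= 30_000:
            break
```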

Common move frequency statistics of diverse example moves from the generative model. Source: arXiv

Strategic positions from large GPT-2 generations. Upper left: a classic English Opening; upper right: a Slav Exchange; lower left: a King's Indian Defense (KID); lower right: a Nimzowitsch Variation. Source: arXiv

Chess Transformer is an example of how transformer models, originally developed for NLP tasks, can be applied to more general strategic modeling problems.

