Commit Graph

5678 Commits

Author SHA1 Message Date
Tomasz Sobczyk
a9cfaa4d98 Add a tool for rescoring fens from an epd file with fixed depth search 2020-12-21 10:48:20 +09:00
Tomasz Sobczyk
f56613ebf6 Add 'validation_count' option for 'learn' that specifies how many positions to use for validation 2020-12-20 09:47:30 +09:00
Tomasz Sobczyk
a7378f3249 Make next_fen in opening_book a critical section 2020-12-14 09:03:04 +09:00
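For context, a minimal sketch of the critical-section pattern this commit describes: a mutex serializing access to a shared opening-book cursor. The class layout and member names below are illustrative assumptions, not the repository's actual opening_book code.

    // Sketch only: serialize concurrent calls to next_fen() with a mutex.
    #include <cstddef>
    #include <mutex>
    #include <optional>
    #include <string>
    #include <vector>

    struct OpeningBook {
        std::vector<std::string> fens;
        std::size_t cursor = 0;
        std::mutex mutex;

        // The lock makes the read-then-advance of `cursor` atomic with
        // respect to other threads calling next_fen() at the same time.
        std::optional<std::string> next_fen() {
            std::lock_guard<std::mutex> lock(mutex);
            if (cursor >= fens.size())
                return std::nullopt;
            return fens[cursor++];
        }
    };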
Joost VandeVondele
76fbc5e3d0 Make score sign flip optional
Bug fix: flipping the score is not needed for fishtest, so make it optional.
2020-12-13 09:32:46 +09:00
kennyfrc
f4b4430380 remove unnecessary makefile commands and fix blas on mac 2020-12-13 09:31:52 +09:00
Joost VandeVondele
9c65e868f9 Enhance pgn_to_plain.py
If a score can be parsed from the comment field in the pgn, it is added to the output.
This format works for the fishtest pgns and is quite common (cutechess-cli, among others).
2020-12-11 00:33:34 +09:00
Tomasz Sobczyk
d99ba07b81 Fix incorrect enpassant flag for moves read from uci format in the binpack lib 2020-12-11 00:31:32 +09:00
Joost VandeVondele
b49fd3ab30 Add -lstdc++fs to the link line of gcc
older versions of gcc (<8.1) need this, even if they accept -std=c++17

with this patch, the code can be run on fishtest again,
at least by the majority of workers (fishtest doesn't require c++17 to be available)

See e.g.
https://tests.stockfishchess.org/tests/view/5fcfbf801ac1691201888235

Bench: 3820648
2020-12-09 08:40:34 +09:00
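As a minimal sketch of what makes the extra library necessary (an assumed example, not a file from this repository): any use of std::filesystem under the gcc versions mentioned above has to be linked with -lstdc++fs, while newer gcc links it automatically.

    // Example link line, per the commit message above:
    //   g++ -std=c++17 check_fs.cpp -lstdc++fs
    #include <filesystem>
    #include <iostream>

    int main() {
        namespace fs = std::filesystem;
        // Any std::filesystem call pulls in the separate fs library
        // on the affected gcc versions.
        std::cout << "cwd: " << fs::current_path() << '\n';
        return 0;
    }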
nodchip
ae045e2cd8 Merge pull request #258 from kennyfrc/stockfish-nnue-2020-08-30-macos
mac-compatible makefile with instructions for stockfish-nnue-2020-08-30
2020-12-09 08:39:36 +09:00
Kenn Costales
055f907315 Merge branch 'master' into stockfish-nnue-2020-08-30-macos 2020-12-08 22:49:11 +08:00
kennyfrc
bb26ce5aa1 mac-specific makefile with compilation instructions 2020-12-08 22:14:18 +08:00
Tomasz Sobczyk
3a1bd1185f Add binpack coarse shuffle tool. 2020-12-06 19:08:52 +09:00
Tomasz Sobczyk
28d6d7cb03 Avoid computing gradient for validation loss. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
fafb9557a8 Get train loss from update_parameters. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
4eb0e77a2a Store references instead of copying the results of intermediate autograd computations. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
6cd0b03098 Add some comments regarding the current state of autograd loss computation. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
99cb869db3 Reintroduce use_wdl. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
cf6bc7ecaf Cleanup around get_loss 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
256c4b55ec Properly apply gradient norm clipping after it's scaled in the update_parameters. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
de675e3503 Reintroduce optional scaling of the teacher signal. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
01ae7b1e2c Simplify passing constants that may vary between calls. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
cbd973fdaa Detect constant expressions in autograd and return 0 grad early. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
e975889132 Move cross_entropy calculation to a separate function. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
891abf5511 Make the autograd loss expression chain thread_local. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
8adf00ae6e Identify a single evaluation chain by ID in autograd to prevent cache reuse for subsequent evaluations of the same expression tree. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
cb812c742c Add [[nodiscard]] attributes to autograd functions. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
26f19e1429 Make automatic differentiation node types constexpr. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
aec6017195 When forming an autograd expression only copy parts that are rvalue references, store references to lvalues. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
a5c20bee5b Apply gradient clipping. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
d103867558 Add memoization to the autograd expression evaluator. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
aa55692b97 Cross entropy loss. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
539bd2d1c8 Replace the old loss/grad calculation completely. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
b71d1e8620 Pass the new loss function to update_parameters 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
5a58eb803a Loss func with autograd 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
541fb8177a More utility in autograd. 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
6ce0245787 Basic autograd 2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
1322a9a5fd Prevent false sharing of num_calls counter in the shared input trainer. Fix current_operation not being local to the executing thread. 2020-11-30 08:54:53 +09:00
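A generic sketch of the false-sharing fix this commit names, under the assumption of 64-byte cache lines; the struct and names below are illustrative, not the shared input trainer's actual layout.

    // Sketch only: give each worker thread its own cache-line-aligned
    // counter so concurrent increments do not keep invalidating one
    // shared cache line (false sharing).
    #include <atomic>
    #include <cstddef>
    #include <cstdint>

    constexpr std::size_t kCacheLineSize = 64;  // typical on x86-64

    struct alignas(kCacheLineSize) PerThreadCounter {
        // alignas pads sizeof(PerThreadCounter) up to 64 bytes, so
        // adjacent array elements never share a cache line.
        std::atomic<std::uint64_t> num_calls{0};
    };

    PerThreadCounter counters[8];  // one slot per worker thread

    std::uint64_t total_calls() {
        std::uint64_t sum = 0;
        for (const auto& c : counters)
            sum += c.num_calls.load(std::memory_order_relaxed);
        return sum;
    }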
Tomasz Sobczyk
2aa7f5290e Fix comparison of integers with different signedness. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
a97b65eaef Fix compilation error with USE_BLAS 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
622e0b14c2 Remove superfluous example shuffling. Shuffling now only happens on reading. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
34510dd08a Remove used examples asynchronously. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
0bee8fef64 Don't unnecessarily copy the batch part. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
e954b14196 Prefetch weights for feature transformer backprop to shared cache. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
8009973381 Special case for alpha=1 in saxpy, slight performance increase. 2020-11-30 08:54:53 +09:00
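For reference, a sketch of the classic saxpy kernel (y <- alpha*x + y) with an alpha == 1 fast path that skips the per-element multiply; this illustrates the idea behind the commit above, not the repository's exact code.

    #include <cstddef>

    // y[i] += alpha * x[i] for i in [0, n)
    void saxpy(std::size_t n, float alpha, const float* x, float* y) {
        if (alpha == 1.0f) {
            for (std::size_t i = 0; i < n; ++i)
                y[i] += x[i];          // pure add, no multiply per element
        } else {
            for (std::size_t i = 0; i < n; ++i)
                y[i] += alpha * x[i];  // general case
        }
    }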
Tomasz Sobczyk
49b2dcb1f3 Preallocate memory for unique_features. Keep the training_features temporary buffer as a thread_local so we reuse the storage. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
1c8495b54b Remove handwritten saxpy because compilers optimize the second loop anyway. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
15c528ca7b Prepare feature transformer learner. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
a3c78691a2 Prepare input slice trainer. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
401fc0fbab Prepare clipped relu trainer. 2020-11-30 08:54:53 +09:00
Tomasz Sobczyk
774b023641 Add chunked for_each with workers. 2020-11-30 08:54:53 +09:00