Tomasz Sobczyk
3f73c40412
More deterministic move accuracy validation.
2020-12-24 10:16:59 +09:00
Tomasz Sobczyk
8ca82646a9
Use plain NNUE eval for the validation loss calculation instead of first performing qsearch
2020-12-22 10:35:19 +09:00
Tomasz Sobczyk
6853b4aac2
Simple filtering for validation data.
2020-12-22 09:40:25 +09:00
Tomasz Sobczyk
f56613ebf6
Add 'validation_count' option for 'learn' that specifies how many positions to use for validation
2020-12-20 09:47:30 +09:00
Joost VandeVondele
b49fd3ab30
Add -lstdc++fs to the link line of gcc
Older versions of gcc (< 8.1) need this, even if they accept -std=c++17.
With this patch, the code can be run on fishtest again,
at least by the majority of workers (fishtest doesn't require c++17 to be available).
See e.g.
https://tests.stockfishchess.org/tests/view/5fcfbf801ac1691201888235
Bench: 3820648
2020-12-09 08:40:34 +09:00
Tomasz Sobczyk
28d6d7cb03
Avoid computing gradient for validation loss.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
fafb9557a8
Get train loss from update_parameters.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
4eb0e77a2a
Store references instead of copying the results of intermediate autograd computations.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
6cd0b03098
Add some comments regarding the current state of autograd loss computation.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
99cb869db3
Reintroduce use_wdl.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
cf6bc7ecaf
Cleanup around get_loss
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
256c4b55ec
Properly apply gradient norm clipping after the gradient is scaled in update_parameters.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
de675e3503
Reintroduce optional scaling of the teacher signal.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
01ae7b1e2c
Simplify passing constants that may vary between calls.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
e975889132
Move cross_entropy calculation to a separate function.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
891abf5511
Make the autograd loss expression chain thread_local.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
a5c20bee5b
Apply gradient clipping.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
aa55692b97
Cross entropy loss.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
539bd2d1c8
Replace the old loss/grad calculation completely.
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
5a58eb803a
Loss function with autograd
2020-12-02 08:56:20 +09:00
Tomasz Sobczyk
9030020a85
Add smart_fen_skipping option to learn.
2020-11-23 19:22:11 +09:00
Tomasz Sobczyk
3cee6881ee
Move the terminal position check to after qsearch, because qsearch may itself end up in a terminal position.
2020-11-23 08:29:38 +09:00
Tomasz Sobczyk
3dbc45bdfc
Add gradient clipping.
2020-11-16 10:08:56 +09:00
Tomasz Sobczyk
00bc80c3c4
Add assume_quiet option to the learner.
2020-11-15 22:18:13 +09:00
Tomasz Sobczyk
69bc3ef9be
Output loss more often.
2020-11-14 12:33:25 +09:00
Tomasz Sobczyk
ee0917a345
Pass ThreadPool to update_parameters, propagate, and backpropagate.
2020-10-29 09:21:19 +09:00
Tomasz Sobczyk
317fda2516
Cleanup eval saving and lr scheduling.
2020-10-28 23:08:05 +09:00
Tomasz Sobczyk
f81fa3d712
Replace global_learning_rate with learning_rate local to the learner and passed to update_parameters as a parameter.
2020-10-28 09:36:07 +09:00
Tomasz Sobczyk
cde6ec2bf2
Make all grad-related functions in learn static. Pass calc_grad as a parameter.
2020-10-27 14:47:50 +09:00
Tomasz Sobczyk
e4868cb59e
Move setting of the learn search limits to the learner.
2020-10-27 14:47:07 +09:00
Tomasz Sobczyk
c229929d26
Remove the position parameter from learn.
2020-10-27 00:35:43 +09:00
Tomasz Sobczyk
a8066cd4a9
Rename elmo lambdas
2020-10-27 00:33:58 +09:00
Tomasz Sobczyk
f7de49eb66
Create a collective parameter struct for the learner.
2020-10-27 00:33:58 +09:00
Tomasz Sobczyk
2c477d76ec
Cleaner and more extensive output during training initialization.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
4b72658409
Synchronize printed info regions in the learner and sfen reader.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
cf3edfed82
Improve info messages.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
c49ae541c4
Add layer info for check_health. Print subsequent info lines from the same scope with "-->" instead of "INFO:" for clarity.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
8ddef320e6
Print an additional newline before the calc_loss progress output instead of after check_health in the feature transformer layer.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
a351c1d65e
Add verbose flag to learn. Only print update_parameters info when verbose=true
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
ec436d3dfd
Print some weight update stats
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
371acaa0b5
Allow changing sfen reader buffer sizes for the learn command.
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
8fb208598b
Pass the shuffle flag in the constructor
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
31f94a18b3
Update readme and docs after the change from loop to epochs.
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
fc3788f630
Use cyclic sfen reader for learning, change loop option to epochs.
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
ad3d1b42e4
Make sfen reader only stop when it's destroyed. Now it is fully RAII.
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
c58aa9696a
Start sfen reader worker thread in the constructor.
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
0636e1256d
Add cyclic mode to the sfen reader. Make the sfen reader take all files at construction.
2020-10-25 19:22:56 +09:00
Tomasz Sobczyk
c7ac3688a7
Move the old convert functionality from learn into its own commands.
2020-10-24 08:52:42 +09:00
Tomasz Sobczyk
9564a52523
Remove whole-file shuffling as it does not change learning behaviour, only works for bin, and is considered harmful for binpack.
2020-10-23 09:33:20 +09:00
Tomasz Sobczyk
7b4a769cca
Fix base_dir not being applied to singular filenames.
2020-10-22 20:01:55 +09:00