Tomasz Sobczyk
b5714c4084
Parallelize input slice trainer backprop.
2020-10-31 11:52:26 +09:00
Tomasz Sobczyk
941897ff2c
Optimize trainer clipped relu backpropagate.
2020-10-31 11:50:12 +09:00
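The clipped-ReLU backpropagation this commit optimizes can be sketched as follows (illustrative standalone functions, not the actual trainer code): the forward pass clamps to [0, 1], and the gradient passes through only where the input was strictly inside the clipping range.

```cpp
#include <cassert>

// Forward pass: clamp(x, 0, 1).
float clipped_relu(float x) {
    return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x);
}

// Backward pass: the upstream gradient flows through only where the
// activation was not clipped; clipped inputs get zero gradient.
float clipped_relu_grad(float x, float upstream_grad) {
    return (x > 0.0f && x < 1.0f) ? upstream_grad : 0.0f;
}
```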
Tomasz Sobczyk
c96743c5bd
Optimize feature transformer backpropagation stats.
2020-10-31 11:49:29 +09:00
Tomasz Sobczyk
2c10b1babc
Optimize feature transformer clipped relu.
2020-10-31 11:48:02 +09:00
Tomasz Sobczyk
a56d8124d8
Replace non-blas parts of trainers with our own blas-like routines.
2020-10-31 08:36:58 +09:00
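A "blas-like routine" of the kind this commit refers to can be sketched with the classic axpy operation, y += alpha * x (the name and signature here are illustrative, not taken from the actual trainer code):

```cpp
#include <cstddef>

// BLAS-style axpy: accumulate a scaled vector, y[i] += alpha * x[i].
// A simple contiguous loop like this is easy for the compiler to vectorize.
void axpy(std::size_t n, float alpha, const float* x, float* y) {
    for (std::size_t i = 0; i < n; ++i)
        y[i] += alpha * x[i];
}
```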
Tomasz Sobczyk
ee0917a345
Pass ThreadPool to update_parameters, propagate, and backpropagate.
2020-10-29 09:21:19 +09:00
Tomasz Sobczyk
f1e96cab55
Align trainer arrays to cache line.
2020-10-29 09:12:50 +09:00
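The cache-line alignment mentioned above can be sketched like this (hypothetical struct and names, assuming a 64-byte cache line): padding per-thread trainer state to the line size keeps adjacent entries from sharing a cache line, avoiding false sharing between threads.

```cpp
#include <cstddef>

constexpr std::size_t kCacheLineSize = 64;  // assumed line size

// alignas pads and aligns the struct so each instance starts on its own
// cache line; an array of these never straddles entries across lines.
struct alignas(kCacheLineSize) ThreadGradients {
    float values[16];  // 16 * 4 bytes = exactly one 64-byte line
};
```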
Tomasz Sobczyk
ec9e49e875
Add a HalfKA architecture (a product of K - king, and A - any piece) along with all required infrastructure. HalfKA doesn't discriminate kings compared to HalfKP. Keep old architecture as the default one.
2020-10-29 09:10:01 +09:00
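The "product of K and A" can be illustrated with a hypothetical index computation: one feature per (king square, piece, piece square) triple, so the index space is 64 king squares times the piece-square space. The piece encoding below (12 kinds: 6 types x 2 colors) and the function name are assumptions for illustration, not the real feature set's layout.

```cpp
constexpr int kSquares = 64;
constexpr int kPieceKinds = 12;  // assumed: 6 piece types * 2 colors

// Flatten (king square, piece kind, piece square) into a single feature
// index; every triple maps to a distinct index in [0, 64 * 12 * 64).
int halfka_index(int king_sq, int piece_kind, int piece_sq) {
    return (king_sq * kPieceKinds + piece_kind) * kSquares + piece_sq;
}
```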
Tomasz Sobczyk
317fda2516
Cleanup eval saving and lr scheduling.
2020-10-28 23:08:05 +09:00
Tomasz Sobczyk
680654b254
Add dots to output every epoch for progress visualization.
2020-10-28 09:36:43 +09:00
Tomasz Sobczyk
f81fa3d712
Replace global_learning_rate with learning_rate local to the learner and passed to update_parameters as a parameter.
2020-10-28 09:36:07 +09:00
Tomasz Sobczyk
cde6ec2bf2
Make all grad related functions in learn static. Pass calc_grad as a parameter.
2020-10-27 14:47:50 +09:00
Tomasz Sobczyk
ba390a7f9a
Print the used factorizer when initializing training.
2020-10-27 00:32:39 +09:00
Tomasz Sobczyk
0e528995c2
Print avg bias/weight for affine transform and feature transformer during training.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
fe766f4f42
Additional output from layers during training.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
2c477d76ec
Cleaner and more detailed output during training initialization.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
b882423005
Bring back info for finished evalsave. Update tests with the new message.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
4b72658409
Synchronize printed info regions in the learner and sfen reader.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
cf3edfed82
Improve info messages.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
c49ae541c4
Add layer info for check_health. Print subsequent infos from the same scope with "-->" instead of "INFO:" for clarity.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
8ddef320e6
Print an additional new line before calc_loss progress instead of after check_health in the feature transformer layer.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
a351c1d65e
Add verbose flag to learn. Only print update parameters info when verbose=true
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
ec436d3dfd
Print some weight update stats
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
be3937c37b
Print layers and their indices during training initialization.
2020-10-25 22:18:28 +09:00
Tomasz Sobczyk
f7530de20d
Fix assertion in trainer
2020-10-23 09:35:41 +09:00
Tomasz Sobczyk
146a6b056e
PascalCase -> snake_case for consistency with the rest of the codebase.
2020-10-19 18:37:23 +09:00
Tomasz Sobczyk
69ea3d30b2
Move the extra new line to after check health.
2020-10-19 08:29:51 +09:00
Tomasz Sobczyk
9023edc3c8
Add missing includes.
2020-10-19 08:29:51 +09:00
Tomasz Sobczyk
77624addf2
Cleanup last ".." in include paths.
2020-10-19 08:29:51 +09:00
Tomasz Sobczyk
497f689aa3
Cleanup nnue
2020-10-19 08:29:51 +09:00
Tomasz Sobczyk
c286f9cd7d
Cleanup trainer.
2020-10-19 08:29:51 +09:00
Tomasz Sobczyk
ea8eb415de
Cleanup trainer features.
2020-10-18 22:24:24 +09:00
Tomasz Sobczyk
3041adb080
Cleanup layers.
2020-10-18 19:32:15 +09:00
Tomasz Sobczyk
0d4c3014ca
Cleanup features.
2020-10-17 23:19:16 +09:00
Tomasz Sobczyk
ca760c3a5b
Cleanup architecture files.
2020-10-17 20:01:09 +09:00
Tomasz Sobczyk
3cf193a90e
Properly handle cases in verify and init when SkipLoadingEval is set.
2020-10-17 08:44:38 +09:00
Tomasz Sobczyk
5db46d0c82
Verify whether there is a network being used during training.
2020-10-17 08:44:38 +09:00
Tomasz Sobczyk
0494adeb2c
Move nnue evaluation stuff from evaluate.h to nnue/evaluate_nnue.h
2020-10-15 20:37:03 +09:00
noobpwnftw
d865159bd6
Fix variable initialization in test commands
2020-09-29 17:30:08 +08:00
noobpwnftw
a8b502a975
Merge remote-tracking branch 'remotes/origin/master'
Bench: 3618595
2020-09-29 17:09:14 +08:00
noobpwnftw
c065abdcaf
Use incremental updates more often
Use incremental updates for accumulators for up to 2 plies.
Do not copy accumulator. About 2% speedup.
Passed STC:
LLR: 2.95 (-2.94,2.94) {-0.25,1.25}
Total: 21752 W: 2583 L: 2403 D: 16766
Ptnml(0-2): 128, 1761, 6923, 1931, 133
https://tests.stockfishchess.org/tests/view/5f7150cf3b22d6afa5069412
closes https://github.com/official-stockfish/Stockfish/pull/3157
No functional change
2020-09-28 16:54:35 +02:00
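The incremental accumulator update described above can be sketched as follows (illustrative types and a tiny dimension; the real accumulator and feature sets are much larger): when only a few features change between positions, the accumulator is patched by adding and removing the corresponding weight rows instead of being recomputed from scratch.

```cpp
#include <array>
#include <vector>

constexpr int kDims = 4;  // illustrative accumulator width
using Accumulator = std::array<int, kDims>;

// weights[f] is the row summed into the accumulator while feature f is
// active; an incremental update touches only the changed features.
void apply_changes(Accumulator& acc,
                   const std::vector<std::array<int, kDims>>& weights,
                   const std::vector<int>& added,
                   const std::vector<int>& removed) {
    for (int f : removed)
        for (int i = 0; i < kDims; ++i) acc[i] -= weights[f][i];
    for (int f : added)
        for (int i = 0; i < kDims; ++i) acc[i] += weights[f][i];
}
```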
noobpwnftw
5e8a49f7f2
Restore lambda and gradient function post-merge and minor fixes.
bench: 3788313
2020-09-26 12:55:02 +09:00
noobpwnftw
9827411b7c
Merge remote-tracking branch 'remotes/nodchip/master' into trainer
2020-09-24 21:45:28 +08:00
noobpwnftw
5be8b573be
Merge remote-tracking branch 'remotes/origin/master' into trainer
2020-09-23 19:02:27 +08:00
noobpwnftw
411adab149
Merge remote-tracking branch 'remotes/nodchip/master' into trainer
2020-09-23 18:29:30 +08:00
Stéphane Nicolet
9a64e737cf
Small cleanups 12
- Clean signature of functions in namespace NNUE
- Add comment for countermove based pruning
- Remove bestMoveCount variable
- Add const qualifier to kpp_board_index array
- Fix spaces in get_best_thread()
- Fix indentation in capture LMR code in search.cpp
- Rename TtmemDeleter to LargePageDeleter
Closes https://github.com/official-stockfish/Stockfish/pull/3063
No functional change
2020-09-21 10:41:10 +02:00
Sami Kiminki
485d517c68
Add large page support for NNUE weights and simplify TT mem management
Use TT memory functions to allocate memory for the NNUE weights. This
should provide a small speed-up on systems where large pages are not
automatically used, including Windows and some Linux distributions.
Further, since we now have a wrapper for std::aligned_alloc(), we can
simplify the TT memory management a bit:
- We no longer need to store separate pointers to the hash table and
its underlying memory allocation.
- We also get to merge the Linux-specific and default implementations
of aligned_ttmem_alloc().
Finally, we'll enable the VirtualAlloc code path with large page
support also for Win32.
STC: https://tests.stockfishchess.org/tests/view/5f66595823a84a47b9036fba
LLR: 2.94 (-2.94,2.94) {-0.25,1.25}
Total: 14896 W: 1854 L: 1686 D: 11356
Ptnml(0-2): 65, 1224, 4742, 1312, 105
closes https://github.com/official-stockfish/Stockfish/pull/3081
No functional change.
2020-09-21 08:43:48 +02:00
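An aligned-allocation wrapper of the kind this commit describes can be sketched as below (hypothetical function name; the large-page paths such as VirtualAlloc on Windows are platform-specific and omitted): the key detail is that std::aligned_alloc requires the size to be a multiple of the alignment, so the wrapper rounds up.

```cpp
#include <cstdlib>

// Allocate `size` bytes aligned to `alignment` (a power of two).
// std::aligned_alloc requires size to be a multiple of alignment,
// so round the requested size up before calling it.
void* aligned_large_alloc(std::size_t alignment, std::size_t size) {
    std::size_t rounded = (size + alignment - 1) / alignment * alignment;
    return std::aligned_alloc(alignment, rounded);
}
```

The returned pointer is released with std::free, as with any std::aligned_alloc allocation.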
Tomasz Sobczyk
d4737819cd
Fix castling rights feature encoding.
2020-09-20 20:10:03 +09:00
noobpwnftw
26f63fe741
Merge remote-tracking branch 'remotes/origin/master' into trainer
2020-09-19 03:38:37 +08:00
noobpwnftw
a47a3bfc7c
Merge remote-tracking branch 'remotes/nodchip/master' into trainer
2020-09-19 02:14:17 +08:00