Adam optimizer
AI.
алгоритм оптимизации "Адам" (the "Adam" optimization algorithm)
(an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, intended to train artificial neural networks; arxiv.org; Alex_Odeychuk)
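The definition above can be illustrated with a minimal sketch of the Adam update rule. This is an assumption-laden illustration, not part of the dictionary entry: the function name `adam_step`, the quadratic test objective, and the default hyperparameters (learning rate 0.001, beta1 = 0.9, beta2 = 0.999) are all introduced here for demonstration.

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: adaptive estimates of the first (m) and second (v)
    moments of the gradient, with bias correction at step t (1-indexed)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)             # bias-corrected second moment
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 5001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
# theta approaches the minimum at 0
```

Because the per-step displacement is roughly the learning rate times a sign-like quantity, Adam walks steadily toward the minimum even when raw gradient magnitudes vary widely.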