English ⇄ Russian
Adam optimizer
artificial intelligence
алгоритм оптимизации "Адам" (an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, intended to train artificial neural networks; arxiv.org) Alex_Odeychuk
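The gloss above describes Adam's core idea: it keeps adaptive estimates of the gradient's first and second moments and uses their bias-corrected values to scale each update. A minimal single-parameter sketch of that update rule (the function name `adam_step`, the learning rate, and the toy objective `f(x) = x²` are illustrative assumptions, not from the entry):

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update using adaptive estimates of lower-order moments."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction for zero-initialized moments
    v_hat = v / (1 - beta2 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    grad = 2.0 * theta                          # gradient of f(x) = x^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
# theta ends close to the minimum at 0
```

Because the step is scaled by the moment ratio rather than the raw gradient, Adam behaves like a bounded-step method; in the toy loop it walks steadily toward the minimum and then oscillates within a band on the order of the learning rate.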