Atsushi Nitanda
Person information
2020 – today
- 2024
- [c28] Atsushi Nitanda, Ryuhei Kikuchi, Shugo Maeda, Denny Wu: Why is parameter averaging beneficial in SGD? An objective smoothing perspective. AISTATS 2024: 3565-3573
- [c27] Yuka Hashimoto, Sho Sonoda, Isao Ishikawa, Atsushi Nitanda, Taiji Suzuki: Koopman-based generalization bound: New aspect for full-rank weights. ICLR 2024
- [c26] Atsushi Nitanda, Kazusato Oko, Taiji Suzuki, Denny Wu: Improved statistical and computational complexity of the mean-field Langevin dynamics under structured data. ICLR 2024
- [i24] Atsushi Nitanda: Improved Particle Approximation Error for Mean Field Neural Networks. CoRR abs/2405.15767 (2024)
- [i23] Dake Bu, Wei Huang, Andi Han, Atsushi Nitanda, Taiji Suzuki, Qingfu Zhang, Hau-San Wong: Provably Transformers Harness Multi-Concept Word Semantics for Efficient In-Context Learning. CoRR abs/2411.02199 (2024)
- 2023
- [c25] Taiji Suzuki, Atsushi Nitanda, Denny Wu: Uniform-in-time propagation of chaos for the mean-field gradient Langevin dynamics. ICLR 2023
- [c24] Atsushi Nitanda, Kazusato Oko, Denny Wu, Nobuhito Takenouchi, Taiji Suzuki: Primal and Dual Analysis of Entropic Fictitious Play for Finite-sum Problems. ICML 2023: 26266-26282
- [c23] Atsushi Suzuki, Atsushi Nitanda, Taiji Suzuki, Jing Wang, Feng Tian, Kenji Yamanishi: Tight and fast generalization error bound of graph embedding in metric space. ICML 2023: 33268-33284
- [c22] Taiji Suzuki, Denny Wu, Atsushi Nitanda: Mean-field Langevin dynamics: Time-space discretization, stochastic gradient, and variance reduction. NeurIPS 2023
- [c21] Taiji Suzuki, Denny Wu, Kazusato Oko, Atsushi Nitanda: Feature learning via mean-field Langevin dynamics: classifying sparse parities and beyond. NeurIPS 2023
- [i22] Yuka Hashimoto, Sho Sonoda, Isao Ishikawa, Atsushi Nitanda, Taiji Suzuki: Koopman-Based Bound for Generalization: New Aspect of Neural Networks Regarding Nonlinear Noise Filtering. CoRR abs/2302.05825 (2023)
- [i21] Atsushi Nitanda, Ryuhei Kikuchi, Shugo Maeda: Parameter Averaging for SGD Stabilizes the Implicit Bias towards Flat Regions. CoRR abs/2302.09376 (2023)
- [i20] Atsushi Nitanda, Kazusato Oko, Denny Wu, Nobuhito Takenouchi, Taiji Suzuki: Primal and Dual Analysis of Entropic Fictitious Play for Finite-sum Problems. CoRR abs/2303.02957 (2023)
- [i19] Atsushi Suzuki, Atsushi Nitanda, Taiji Suzuki, Jing Wang, Feng Tian, Kenji Yamanishi: Tight and fast generalization error bound of graph embedding in metric space. CoRR abs/2305.07971 (2023)
- [i18] Taiji Suzuki, Denny Wu, Atsushi Nitanda: Convergence of mean-field Langevin dynamics: Time and space discretization, stochastic gradient, and variance reduction. CoRR abs/2306.07221 (2023)
- 2022
- [c20] Atsushi Nitanda, Denny Wu, Taiji Suzuki: Convex Analysis of the Mean Field Langevin Dynamics. AISTATS 2022: 9741-9757
- [c19] Kazusato Oko, Taiji Suzuki, Atsushi Nitanda, Denny Wu: Particle Stochastic Dual Coordinate Ascent: Exponential convergent algorithm for mean field neural network optimization. ICLR 2022
- [c18] Naoki Nishikawa, Taiji Suzuki, Atsushi Nitanda, Denny Wu: Two-layer neural network on infinite dimensional data: global optimization guarantee in the mean-field regime. NeurIPS 2022
- [i17] Atsushi Nitanda, Denny Wu, Taiji Suzuki: Convex Analysis of the Mean Field Langevin Dynamics. CoRR abs/2201.10469 (2022)
- 2021
- [j1] Atsushi Nitanda, Tomoya Murata, Taiji Suzuki: Sharp characterization of optimal minibatch size for stochastic finite sum convex optimization. Knowl. Inf. Syst. 63(9): 2513-2539 (2021)
- [c17] Shingo Yashima, Atsushi Nitanda, Taiji Suzuki: Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features. AISTATS 2021: 1954-1962
- [c16] Shun-ichi Amari, Jimmy Ba, Roger Baker Grosse, Xuechen Li, Atsushi Nitanda, Taiji Suzuki, Denny Wu, Ji Xu: When does preconditioning help or hurt generalization? ICLR 2021
- [c15] Atsushi Nitanda, Taiji Suzuki: Optimal Rates for Averaged Stochastic Gradient Descent under Neural Tangent Kernel Regime. ICLR 2021
- [c14] Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Kenji Yamanishi, Marc Cavazza: Generalization Error Bound for Hyperbolic Ordinal Embedding. ICML 2021: 10011-10021
- [c13] Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Kenji Yamanishi, Marc Cavazza: Generalization Bounds for Graph Embedding Using Negative Sampling: Linear vs Hyperbolic. NeurIPS 2021: 1243-1255
- [c12] Taiji Suzuki, Atsushi Nitanda: Deep learning is adaptive to intrinsic dimensionality of model smoothness in anisotropic Besov space. NeurIPS 2021: 3609-3621
- [c11] Atsushi Nitanda, Denny Wu, Taiji Suzuki: Particle Dual Averaging: Optimization of Mean Field Neural Network with Global Convergence Rate Analysis. NeurIPS 2021: 19608-19621
- [i16] Yuto Mori, Atsushi Nitanda, Akiko Takeda: BODAME: Bilevel Optimization for Defense Against Model Extraction. CoRR abs/2103.06797 (2021)
- [i15] Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Marc Cavazza, Kenji Yamanishi: Generalization Error Bound for Hyperbolic Ordinal Embedding. CoRR abs/2105.10475 (2021)
- 2020
- [c10] Atsushi Nitanda, Taiji Suzuki: Functional Gradient Boosting for Learning Residual-like Networks with Statistical Guarantees. AISTATS 2020: 2981-2991
- [i14] Shun-ichi Amari, Jimmy Ba, Roger B. Grosse, Xuechen Li, Atsushi Nitanda, Taiji Suzuki, Denny Wu, Ji Xu: When Does Preconditioning Help or Hurt Generalization? CoRR abs/2006.10732 (2020)
- [i13] Atsushi Nitanda, Taiji Suzuki: Optimal Rates for Averaged Stochastic Gradient Descent under Neural Tangent Kernel Regime. CoRR abs/2006.12297 (2020)
- [i12] Shintaro Fukushima, Atsushi Nitanda, Kenji Yamanishi: Online Robust and Adaptive Learning from Data Streams. CoRR abs/2007.12160 (2020)
- [i11] Linchuan Xu, Jun Huang, Atsushi Nitanda, Ryo Asaoka, Kenji Yamanishi: A Novel Global Spatial Attention Mechanism in Convolutional Neural Network for Medical Image Classification. CoRR abs/2007.15897 (2020)
- [i10] Atsushi Nitanda, Denny Wu, Taiji Suzuki: Particle Dual Averaging: Optimization of Mean Field Neural Networks with Global Convergence Rate Analysis. CoRR abs/2012.15477 (2020)
2010 – 2019
- 2019
- [c9] Atsushi Suzuki, Jing Wang, Feng Tian, Atsushi Nitanda, Kenji Yamanishi: Hyperbolic Ordinal Embedding. ACML 2019: 1065-1080
- [c8] Atsushi Nitanda, Taiji Suzuki: Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors. AISTATS 2019: 1417-1426
- [c7] Atsushi Nitanda, Tomoya Murata, Taiji Suzuki: Sharp Characterization of Optimal Minibatch Size for Stochastic Finite Sum Convex Optimization. ICDM 2019: 488-497
- [c6] Satoshi Hara, Atsushi Nitanda, Takanori Maehara: Data Cleansing for Models Trained with SGD. NeurIPS 2019: 4215-4224
- [i9] Atsushi Nitanda, Taiji Suzuki: Refined Generalization Analysis of Gradient Descent for Over-parameterized Two-layer Neural Networks with Smooth Activations on Classification Problems. CoRR abs/1905.09870 (2019)
- [i8] Satoshi Hara, Atsushi Nitanda, Takanori Maehara: Data Cleansing for Models Trained with SGD. CoRR abs/1906.08473 (2019)
- [i7] Taiji Suzuki, Atsushi Nitanda: Deep learning is adaptive to intrinsic dimensionality of model smoothness in anisotropic Besov space. CoRR abs/1910.12799 (2019)
- [i6] Shingo Yashima, Atsushi Nitanda, Taiji Suzuki: Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features. CoRR abs/1911.05350 (2019)
- 2018
- [c5] Atsushi Nitanda, Taiji Suzuki: Gradient Layer: Enhancing the Convergence of Adversarial Training for Generative Models. AISTATS 2018: 1008-1016
- [c4] Atsushi Nitanda, Taiji Suzuki: Functional Gradient Boosting based on Residual Network Perception. ICML 2018: 3816-3825
- [i5] Atsushi Nitanda, Taiji Suzuki: Gradient Layer: Enhancing the Convergence of Adversarial Training for Generative Models. CoRR abs/1801.02227 (2018)
- [i4] Atsushi Nitanda, Taiji Suzuki: Functional Gradient Boosting based on Residual Network Perception. CoRR abs/1802.09031 (2018)
- [i3] Atsushi Nitanda, Taiji Suzuki: Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors. CoRR abs/1806.05438 (2018)
- 2017
- [c3] Atsushi Nitanda, Taiji Suzuki: Stochastic Difference of Convex Algorithm and its Application to Training Deep Boltzmann Machines. AISTATS 2017: 470-478
- [i2] Atsushi Nitanda, Taiji Suzuki: Stochastic Particle Gradient Descent for Infinite Ensembles. CoRR abs/1712.05438 (2017)
- 2016
- [c2] Atsushi Nitanda: Accelerated Stochastic Gradient Descent for Minimizing Finite Sums. AISTATS 2016: 195-203
- 2015
- [i1] Atsushi Nitanda: Accelerated Stochastic Gradient Descent for Minimizing Finite Sums. CoRR abs/1506.03016 (2015)
- 2014
- [c1] Atsushi Nitanda: Stochastic Proximal Gradient Descent with Acceleration Techniques. NIPS 2014: 1574-1582
last updated on 2025-01-28 23:38 CET by the dblp team
all metadata released as open data under CC0 1.0 license