Nitanda Atsushi  二反田 篤史

Researcher Number 60838811
Affiliation (based on past project information)
2021 – 2024: Kyushu Institute of Technology, Faculty of Computer Science and Systems Engineering, Associate Professor
2019 – 2020: The University of Tokyo, Graduate School of Information Science and Technology, Assistant Professor
Review Section / Research Field (as Principal Investigator)
Basic Section 61030: Intelligent informatics-related
Keywords (as Principal Investigator)
Stochastic optimization / Non-convex optimization / Mean-field theory / Neural networks / Deep learning / Machine learning / Optimization of probability measures / Mean-field optimization / Mean-field neural networks / Ultra-high-dimensional neural networks / Accelerated variance reduction methods / Kernel methods / Stochastic optimization methods / Langevin dynamics / Stochastic gradient descent
  • Research Projects (2 results)
  • Research Products (57 results)
  • Development of adaptive learning method based on optimization of probability measures (Principal Investigator)

    • Principal Investigator
      二反田 篤史
    • Project Period (FY)
      2022 – 2023
    • Research Category
      Grant-in-Aid for Scientific Research (B)
    • Review Section
      Basic Section 61030:Intelligent informatics-related
    • Research Institution
      Kyushu Institute of Technology
  • Study on learning dynamics of high-dimensional machine learning models and development of efficient learning methods (Principal Investigator)

    • Principal Investigator
      Nitanda Atsushi
    • Project Period (FY)
      2019 – 2021
    • Research Category
      Grant-in-Aid for Early-Career Scientists
    • Review Section
      Basic Section 61030:Intelligent informatics-related
    • Research Institution
      Kyushu Institute of Technology
      The University of Tokyo

  • [Book] 深層学習からマルチモーダル情報処理へ (From Deep Learning to Multimodal Information Processing) (2022)

    • Author(s)
      中山 英樹、二反田 篤史、田村 晃裕、井上 中順、牛久 祥孝
    • Total Pages
      248
    • Publisher
      サイエンス社
    • ISBN
      9784781915548
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Uniform-in-time Propagation of Chaos for the Mean Field Gradient Langevin Dynamics (2023)

    • Author(s)
      Taiji Suzuki, Atsushi Nitanda, Denny Wu
    • Journal Title

      The 11th International Conference on Learning Representations (ICLR2023)

      Volume: 11

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Convergence of Mean-field Langevin Dynamics: Time and Space Discretization, Stochastic Gradient, and Variance Reduction (2023)

    • Author(s)
      Taiji Suzuki, Denny Wu, Atsushi Nitanda
    • Journal Title

      Advances in Neural Information Processing Systems

      Volume: 36

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Tight and Fast Generalization Error Bound of Graph Embedding in Metric Space (2023)

    • Author(s)
      Atsushi Suzuki, Atsushi Nitanda, Taiji Suzuki, Jing Wang, Feng Tian, Kenji Yamanishi
    • Journal Title

      The 40th International Conference on Machine Learning (ICML2023)

      Volume: 202

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Primal and Dual Analysis of Entropic Fictitious Play for Finite-sum Problems (2023)

    • Author(s)
      Atsushi Nitanda, Kazusato Oko, Denny Wu, Nobuhito Takenouchi, Taiji Suzuki
    • Journal Title

      The 40th International Conference on Machine Learning (ICML2023)

      Volume: 202

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Feature Learning via Mean-field Langevin Dynamics: Classifying Sparse Parities and Beyond (2023)

    • Author(s)
      Taiji Suzuki, Denny Wu, Kazusato Oko, Atsushi Nitanda
    • Journal Title

      Advances in Neural Information Processing Systems

      Volume: 36

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Two-layer neural network on infinite dimensional data: global optimization guarantee in the mean-field regime (2022)

    • Author(s)
      Naoki Nishikawa, Taiji Suzuki, Atsushi Nitanda, Denny Wu
    • Journal Title

      Advances in Neural Information Processing Systems (NeurIPS2022)

      Volume: 35 Pages: 32612-32623

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Convex Analysis of the Mean Field Langevin Dynamics (2022)

    • Author(s)
      Atsushi Nitanda, Denny Wu, Taiji Suzuki
    • Journal Title

      Proceedings of Machine Learning Research (AISTATS2022)

      Volume: 151 Pages: 9741-9757

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Particle dual averaging: optimization of mean field neural network with global convergence rate analysis (2022)

    • Author(s)
      Nitanda Atsushi, Wu Denny, Suzuki Taiji
    • Journal Title

      Journal of Statistical Mechanics: Theory and Experiment

      Volume: 2022 Issue: 11 Pages: 114010-114010

    • DOI

      10.1088/1742-5468/ac98a8

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Particle Stochastic Dual Coordinate Ascent: Exponential convergent algorithm for mean field neural network optimization (2022)

    • Author(s)
      Kazusato Oko, Taiji Suzuki, Atsushi Nitanda, Denny Wu
    • Journal Title

      The 10th International Conference on Learning Representations

      Volume: 10

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Journal Article] Sharp characterization of optimal minibatch size for stochastic finite sum convex optimization (2021)

    • Author(s)
      Nitanda Atsushi, Murata Tomoya, Suzuki Taiji
    • Journal Title

      Knowledge and Information Systems

      Volume: 63 Issue: 9 Pages: 2513-2539

    • DOI

      10.1007/s10115-021-01593-1

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337, KAKENHI-PROJECT-18H03201
  • [Journal Article] Particle Dual Averaging: Optimization of Mean Field Neural Networks with Global Convergence Rate Analysis (2021)

    • Author(s)
      Atsushi Nitanda, Denny Wu, Taiji Suzuki
    • Journal Title

      Advances in Neural Information Processing Systems (NeurIPS2021)

      Volume: 34 Pages: 19608-19621

    • Peer Reviewed / Open Access / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features (2021)

    • Author(s)
      Shingo Yashima, Atsushi Nitanda, and Taiji Suzuki
    • Journal Title

      Proceedings of Machine Learning Research (AISTATS2021)

      Volume: 130 Pages: 1954-1962

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Deep learning is adaptive to intrinsic dimensionality of model smoothness in anisotropic Besov space (2021)

    • Author(s)
      Taiji Suzuki, Atsushi Nitanda
    • Journal Title

      Advances in Neural Information Processing Systems (NeurIPS2021)

      Volume: 34 Pages: 3609-3621

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Functional Gradient Boosting for Learning Residual-like Networks with Statistical Guarantees (2020)

    • Author(s)
      Atsushi Nitanda and Taiji Suzuki
    • Journal Title

      Proceedings of Machine Learning Research (AISTATS2020)

      Volume: 108 Pages: 2981-2991

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Hyperbolic Ordinal Embedding (2019)

    • Author(s)
      Atsushi Suzuki, Jing Wang, Feng Tian, Atsushi Nitanda, and Kenji Yamanishi
    • Journal Title

      Proceedings of Machine Learning Research (ACML2019)

      Volume: 101 Pages: 1065-1080

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors (2019)

    • Author(s)
      Atsushi Nitanda and Taiji Suzuki
    • Journal Title

      Proceedings of Machine Learning Research (AISTATS2019)

      Volume: 89 Pages: 1417-1426

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Data Cleansing for Models Trained with SGD (2019)

    • Author(s)
      Satoshi Hara, Atsushi Nitanda, and Takanori Maehara
    • Journal Title

      Advances in Neural Information Processing Systems

      Volume: 32 Pages: 4215-4224

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Journal Article] Sharp Characterization of Optimal Minibatch Size for Stochastic Finite Sum Convex Optimization (2019)

    • Author(s)
      Nitanda Atsushi, Murata Tomoya, and Suzuki Taiji
    • Journal Title

      Proceedings of the IEEE International Conference on Data Mining

      Volume: - Pages: 488-497

    • Peer Reviewed / Open Access
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimization Theory of Neural Networks under Mean-field Regime (2023)

    • Author(s)
      Atsushi Nitanda
    • Organizer
      Workshop on Optimization and Machine Learning
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Primal and Dual Analysis of Mean-field Models (2023)

    • Author(s)
      Atsushi Nitanda
    • Organizer
      RIKEN-AIP & PRAIRIE Joint Workshop on Machine Learning and Artificial Intelligence
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Convergence theory for mean-field optimization methods (2023)

    • Author(s)
      Atsushi Nitanda, Denny Wu, Taiji Suzuki
    • Organizer
      Minisymposium: Recent advances on non-convex optimization in inverse problems, imaging and machine learning, International Congress on Industrial and Applied Mathematics (ICIAM)
    • Invited / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Primal and Dual Analysis of Mean-field Models (2023)

    • Author(s)
      Atsushi Nitanda
    • Organizer
      EPFL-CIS & RIKEN AIP Joint Workshop on Machine Learning
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Parameter Averaging for SGD Stabilizes the Implicit Bias towards Flat Regions (2022)

    • Author(s)
      Atsushi Nitanda
    • Organizer
      First A*STAR CFAR - RIKEN AIP Joint Workshop on AI and Machine Learning
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Mean-field analysis of neural networks (2022)

    • Author(s)
      二反田篤史
    • Organizer
      IBISML Research Meeting
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Convex Analysis of the Mean Field Langevin Dynamics (2022)

    • Author(s)
      Atsushi Nitanda
    • Organizer
      Workshop on Functional Inference and Machine Intelligence
    • Invited / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Strengthening the inductive bias toward flatness via averaged stochastic gradient descent (2022)

    • Author(s)
      菊池竜平, 前田修吾, 二反田篤史
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Convex Analysis of the Mean Field Langevin Dynamics (2022)

    • Author(s)
      Atsushi Nitanda, Denny Wu, Taiji Suzuki
    • Organizer
      Conference on the Mathematical Theory of Deep Neural Networks (DeepMath)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Convergence of mean field gradient Langevin dynamics for optimizing two-layer neural networks (2022)

    • Author(s)
      Taiji Suzuki, Atsushi Nitanda, Denny Wu, Kazusato Oko
    • Organizer
      International Conference of the ERCIM WG on Computational and Methodological Statistics (CMStatistics 2022)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-23K24906
  • [Presentation] Particle Dual Averaging: Optimization of Mean Field Neural Networks with Global Convergence Rate Analysis (2021)

    • Author(s)
      Atsushi Nitanda, Denny Wu, Taiji Suzuki
    • Organizer
      Neural Information Processing Systems (NeurIPS2021)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Particle Stochastic Dual Coordinate Ascent: Exponential Convergent Algorithm for Mean Field Neural Network Optimization (2021)

    • Author(s)
      大古一聡, 鈴木大慈, 二反田篤史, Denny Wu
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimization methods for mean-field neural networks (2021)

    • Author(s)
      二反田篤史
    • Organizer
      Operations Research Society of Japan, Kyushu Chapter, FY2021 1st Lecture and Research Meeting
    • Invited
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Fast learning rates of averaged stochastic gradient descent for over-parameterized neural networks (2021)

    • Author(s)
      Atsushi Nitanda, Taiji Suzuki
    • Organizer
      International Conference on Econometrics and Statistics (EcoSta)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimization of mean-field neural networks with convergence rate guarantees (2021)

    • Author(s)
      二反田篤史
    • Organizer
      Annual Meeting of the Japan Society for Industrial and Applied Mathematics (JSIAM)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Deep learning is adaptive to intrinsic dimensionality of model smoothness in anisotropic Besov space (2021)

    • Author(s)
      Taiji Suzuki, Atsushi Nitanda
    • Organizer
      Neural Information Processing Systems (NeurIPS2021)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimality and superiority of deep learning for estimating functions in variants of Besov spaces (2021)

    • Author(s)
      Taiji Suzuki, Atsushi Nitanda, Kazuma Tsuji
    • Organizer
      International Conference on Econometrics and Statistics (EcoSta)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Convex Analysis of the Mean Field Langevin Dynamics (2021)

    • Author(s)
      Atsushi Nitanda, Denny Wu, Taiji Suzuki
    • Organizer
      International Conference on Artificial Intelligence and Statistics (AISTATS2022)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Efficient optimization methods for mean-field neural networks (2021)

    • Author(s)
      二反田篤史, 大古一聡, Denny Wu, 鈴木大慈
    • Organizer
      Japanese Joint Statistical Meeting
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimization theory for two-layer neural networks (2021)

    • Author(s)
      二反田篤史
    • Organizer
      The 2nd Meeting for Exchange among Young Mathematicians (第2回若手数学者交流会)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] When Does Preconditioning Help or Hurt Generalization? (2020)

    • Author(s)
      Shun-ichi Amari, Jimmy Ba, Roger Grosse, Xuechen Li, Atsushi Nitanda, Taiji Suzuki, Denny Wu, and Ji Xu
    • Organizer
      International Conference on Learning Representations (ICLR2021)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Defense against model extraction attacks via bilevel optimization (2020)

    • Author(s)
      森雄人, 二反田篤史, 武田朗子
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimal Rates for Averaged Stochastic Gradient Descent under Neural Tangent Kernel Regime (2020)

    • Author(s)
      Atsushi Nitanda and Taiji Suzuki
    • Organizer
      International Conference on Learning Representations (ICLR2021)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Optimal convergence rates of stochastic gradient descent via NTK theory (2020)

    • Author(s)
      二反田篤史, 鈴木大慈
    • Organizer
      Japanese Joint Statistical Meeting
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Convergence analysis of stochastic optimization methods (2020)

    • Author(s)
      二反田篤史
    • Organizer
      RAMP Mathematical Optimization Symposium
    • Invited
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] When Does Preconditioning Help or Hurt Generalization? (2020)

    • Author(s)
      Shun-ichi Amari, Jimmy Ba, Roger Grosse, Xuechen Li, Atsushi Nitanda, Taiji Suzuki, Denny Wu, and Ji Xu
    • Organizer
      The 12th OPT Workshop on Optimization for Machine Learning
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Particle dual averaging: an optimization method for mean-field neural networks with global convergence guarantees (2020)

    • Author(s)
      二反田篤史, Denny Wu, 鈴木大慈
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Linear convergence of the expected classification error of stochastic gradient methods with random features for kernel methods (2019)

    • Author(s)
      八嶋晋吾, 二反田篤史, 鈴木大慈
    • Organizer
      Japanese Joint Statistical Meeting
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Convergence analysis of the expected classification error of stochastic gradient methods with random features (2019)

    • Author(s)
      八嶋晋吾, 二反田篤史, 鈴木大慈
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Global convergence and inductive bias of learning algorithms (2019)

    • Author(s)
      二反田篤史
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Invited
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Global convergence and generalization analysis of high-dimensional two-layer neural networks for classification problems (2019)

    • Author(s)
      二反田篤史
    • Organizer
      情報系 WINTER FESTA Episode 5
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Data cleansing based on behavior analysis of SGD (2019)

    • Author(s)
      原聡, 二反田篤史, 前原貴憲
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Global convergence of the classification error and generalization analysis of gradient descent for high-dimensional two-layer neural networks (2019)

    • Author(s)
      二反田篤史, 鈴木大慈
    • Organizer
      Workshop on Information-Based Induction Sciences (IBIS)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Stochastic Gradient Descent with Exponential Convergence Rates for Classification Problems (2019)

    • Author(s)
      Atsushi Nitanda
    • Organizer
      Summer School 2019 on Transfer Learning
    • Invited / Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Global convergence and generalization analysis of gradient descent for high-dimensional neural networks on classification problems (2019)

    • Author(s)
      二反田篤史, 鈴木大慈
    • Organizer
      Annual Meeting of the Japan Society for Industrial and Applied Mathematics (JSIAM)
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Exponential convergence of stochastic gradient descent for binary classification problems (2019)

    • Author(s)
      Atsushi Nitanda, Taiji Suzuki
    • Organizer
      The Conference of Data Science, Statistics & Visualisation (DSSV)
    • Int'l Joint Research
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Global convergence and generalization analysis of gradient methods for high-dimensional neural networks (2019)

    • Author(s)
      二反田篤史
    • Organizer
      Operations Research Society of Japan, Research Group on Optimization and Its Applications (OPTA)
    • Invited
    • Data Source
      KAKENHI-PROJECT-19K20337
  • [Presentation] Generalization analysis of gradient methods for high-dimensional two-layer neural networks on classification problems (2019)

    • Author(s)
      二反田篤史, 鈴木大慈
    • Organizer
      Japanese Joint Statistical Meeting
    • Data Source
      KAKENHI-PROJECT-19K20337
