Hideaki Iiduka
2020 – today
- 2024
- [j37] Hiroyuki Sakai, Hideaki Iiduka: Modified Memoryless Spectral-Scaling Broyden Family on Riemannian Manifolds. J. Optim. Theory Appl. 202(2): 834-853 (2024)
- [j36] Hideaki Iiduka: Theoretical analysis of Adam using hyperparameters close to one without Lipschitz smoothness. Numer. Algorithms 95(1): 383-421 (2024)
- [i14] Naoki Sato, Hideaki Iiduka: Role of Momentum in Smoothing Objective Function in Implicit Graduated Optimization. CoRR abs/2402.02325 (2024)
- [i13] Kento Imaizumi, Hideaki Iiduka: Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates. CoRR abs/2402.15344 (2024)
- [i12] Hikaru Umeda, Hideaki Iiduka: Increasing Both Batch Size and Learning Rate Accelerates Stochastic Gradient Descent. CoRR abs/2409.08770 (2024)
- [i11] Hinata Harada, Hideaki Iiduka: Convergence of Sharpness-Aware Minimization Algorithms using Increasing Batch Size and Decaying Learning Rate. CoRR abs/2409.09984 (2024)
- 2023
- [j35] Hiroyuki Sakai, Hiroyuki Sato, Hideaki Iiduka: Global convergence of Hager-Zhang type Riemannian conjugate gradient method. Appl. Math. Comput. 441: 127685 (2023)
- [j34] Hideaki Iiduka: ϵ-Approximation of Adaptive Leaning Rate Optimization Algorithms for Constrained Nonconvex Stochastic Optimization. IEEE Trans. Neural Networks Learn. Syst. 34(10): 8108-8115 (2023)
- [c2] Hiroki Naganuma, Hideaki Iiduka: Conjugate Gradient Method for Generative Adversarial Networks. AISTATS 2023: 4381-4408
- [c1] Naoki Sato, Hideaki Iiduka: Existence and Estimation of Critical Batch Size for Training Generative Adversarial Networks with Two Time-Scale Update Rule. ICML 2023: 30080-30104
- [i10] Hiroyuki Sakai, Hideaki Iiduka: Modified memoryless spectral-scaling Broyden family on Riemannian manifolds. CoRR abs/2307.08986 (2023)
- [i9] Yuki Tsukada, Hideaki Iiduka: Relationship between Batch Size and Number of Steps Needed for Nonconvex Optimization of Stochastic Gradient Descent using Armijo Line Search. CoRR abs/2307.13831 (2023)
- [i8] Naoki Sato, Hideaki Iiduka: Using Stochastic Gradient Descent to Smooth Nonconvex Functions: Analysis of Implicit Graduated Optimization with Optimal Noise Scheduling. CoRR abs/2311.08745 (2023)
- 2022
- [j33] Hideaki Iiduka, Hiroyuki Sakai: Riemannian stochastic fixed point optimization algorithm. Numer. Algorithms 90(4): 1493-1517 (2022)
- [j32] Hiroyuki Sakai, Hideaki Iiduka: Riemannian Adaptive Optimization Algorithm and its Application to Natural Language Processing. IEEE Trans. Cybern. 52(8): 7328-7339 (2022)
- [j31] Hideaki Iiduka: Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks. IEEE Trans. Cybern. 52(12): 13250-13261 (2022)
- [i7] Naoki Sato, Hideaki Iiduka: Using Constant Learning Rate of Two Time-Scale Update Rule for Training Generative Adversarial Networks. CoRR abs/2201.11989 (2022)
- [i6] Hiroki Naganuma, Hideaki Iiduka: Conjugate Gradient Method for Generative Adversarial Networks. CoRR abs/2203.14495 (2022)
- [i5] Hideaki Iiduka: Theoretical analysis of Adam using hyperparameters close to one without Lipschitz smoothness. CoRR abs/2206.13290 (2022)
- [i4] Hideaki Iiduka: Critical Bach Size Minimizes Stochastic First-Order Oracle Complexity of Deep Learning Optimizer using Hyperparameters Close to One. CoRR abs/2208.09814 (2022)
- 2021
- [j30] Yini Zhu, Hideaki Iiduka: Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks. IEEE Access 9: 143807-143823 (2021)
- [j29] Hideaki Iiduka: Inexact stochastic subgradient projection method for stochastic equilibrium problems with nonmonotone bifunctions: application to expected risk minimization in machine learning. J. Glob. Optim. 80(2): 479-505 (2021)
- [j28] Hiroyuki Sakai, Hideaki Iiduka: Sufficient Descent Riemannian Conjugate Gradient Methods. J. Optim. Theory Appl. 190(1): 130-150 (2021)
- [i3] Hideaki Iiduka: The Number of Steps Needed for Nonconvex Optimization of a Deep Learning Optimizer is a Rational Function of Batch Size. CoRR abs/2108.11713 (2021)
- [i2] Hideaki Iiduka: Minimization of Stochastic First-order Oracle Complexity of Adaptive Methods for Nonconvex Optimization. CoRR abs/2112.07163 (2021)
- 2020
- [j27] Hiroyuki Sakai, Hideaki Iiduka: Hybrid Riemannian conjugate gradient methods with global convergence properties. Comput. Optim. Appl. 77(3): 811-830 (2020)
- [j26] Kazuhiro Hishinuma, Hideaki Iiduka: Fixed point quasiconvex subgradient method. Eur. J. Oper. Res. 282(2): 428-437 (2020)
- [j25] Hideaki Iiduka: Stochastic Fixed Point Optimization Algorithm for Classifier Ensemble. IEEE Trans. Cybern. 50(10): 4370-4380 (2020)
- [i1] Yu Kobayashi, Hideaki Iiduka: Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning. CoRR abs/2003.00231 (2020)
2010 – 2019
- 2019
- [j24] Kazuhiro Hishinuma, Hideaki Iiduka: Incremental and Parallel Machine Learning Algorithms With Automated Learning Rate Adjustments. Frontiers Robotics AI 6: 77 (2019)
- [j23] Hideaki Iiduka: Two stochastic optimization algorithms for convex optimization with fixed point constraints. Optim. Methods Softw. 34(4): 731-757 (2019)
- [j22] Hideaki Iiduka: Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions. IEEE Trans. Control. Netw. Syst. 6(4): 1354-1365 (2019)
- 2018
- [j21] Yoichi Hayashi, Hideaki Iiduka: Optimality and convergence for convex ensemble learning with sparsity and diversity based on fixed point optimization. Neurocomputing 273: 367-372 (2018)
- 2016
- [j20] Hideaki Iiduka: Proximal point algorithms for nonsmooth convex optimization with fixed point constraints. Eur. J. Oper. Res. 253(2): 503-513 (2016)
- [j19] Yoichi Hayashi, Yuki Tanaka, Tomohiro Takagi, Takamichi Saito, Hideaki Iiduka, Hiroaki Kikuchi, Guido Bologna, Sushmita Mitra: Recursive-Rule Extraction Algorithm With J48graft And Applications To Generating Credit Scores. J. Artif. Intell. Soft Comput. Res. 6(1): 35 (2016)
- [j18] Hideaki Iiduka: Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings. Math. Program. 159(1-2): 509-538 (2016)
- [j17] Hideaki Iiduka: Incremental subgradient method for nonsmooth convex optimization with fixed point constraints. Optim. Methods Softw. 31(5): 931-951 (2016)
- 2015
- [j16] Masato Uchida, Hideaki Iiduka, Isao Sugino: Modeling User Behavior in P2P Data Storage System. IEICE Trans. Commun. 98-B(1): 33-41 (2015)
- [j15] Hideaki Iiduka: Convex optimization over fixed point sets of quasi-nonexpansive and nonexpansive mappings in utility-based bandwidth allocation problems with operational constraints. J. Comput. Appl. Math. 282: 225-236 (2015)
- [j14] Hideaki Iiduka: Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping. Math. Program. 149(1-2): 131-165 (2015)
- 2014
- [j13] Hideaki Iiduka, Kazuhiro Hishinuma: Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms. SIAM J. Optim. 24(4): 1840-1863 (2014)
- 2013
- [j12] Hideaki Iiduka: Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems. SIAM J. Optim. 23(1): 1-26 (2013)
- 2012
- [j11] Hideaki Iiduka: Fixed point optimization algorithm and its application to network bandwidth allocation. J. Comput. Appl. Math. 236(7): 1733-1742 (2012)
- [j10] Hideaki Iiduka: Fixed point optimization algorithm and its application to power control in CDMA data networks. Math. Program. 133(1-2): 227-242 (2012)
- [j9] Hideaki Iiduka, Yasushi Narushima: Conjugate gradient methods using value of objective function for unconstrained optimization. Optim. Lett. 6(5): 941-955 (2012)
- [j8] Hideaki Iiduka, Isao Yamada: Computational Method for Solving a Stochastic Linear-Quadratic Control Problem Given an Unsolvable Stochastic Algebraic Riccati Equation. SIAM J. Control. Optim. 50(4): 2173-2192 (2012)
- [j7] Hideaki Iiduka: Iterative Algorithm for Triple-Hierarchical Constrained Nonconvex Optimization Problem and Its Application to Network Bandwidth Allocation. SIAM J. Optim. 22(3): 862-878 (2012)
- 2011
- [j6] Hideaki Iiduka: Three-term conjugate gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping. Appl. Math. Comput. 217(13): 6315-6327 (2011)
- [j5] Hideaki Iiduka, Masato Uchida: Fixed Point Optimization Algorithms for Network Bandwidth Allocation Problems with Compoundable Constraints. IEEE Commun. Lett. 15(6): 596-598 (2011)
- [j4] Hideaki Iiduka: Iterative Algorithm for Solving Triple-Hierarchical Constrained Optimization Problem. J. Optim. Theory Appl. 148(3): 580-592 (2011)
- [j3] Hideaki Iiduka: Decentralized Algorithm for Centralized Variational Inequalities in Network Resource Allocation. J. Optim. Theory Appl. 151(3): 525-540 (2011)
2000 – 2009
- 2009
- [j2] Hideaki Iiduka, Isao Yamada: An Ergodic Algorithm for the Power-Control Games for CDMA Data Networks. J. Math. Model. Algorithms 8(1): 1-18 (2009)
- [j1] Hideaki Iiduka, Isao Yamada: A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping. SIAM J. Optim. 19(4): 1881-1893 (2009)