KAM Theory Meets Statistical Learning Theory: Hamiltonian Neural Networks with Non-Zero Training Loss
〇Yuhan Chen, Takashi Matsubara and Takaharu Yaguchi
AAAI2022, Oral Presentation (oral acceptance rate: approx. 4.6%)
Yuhan Chen, Hideki Sano, Masashi Wakaiki and Takaharu Yaguchi, “Secret Communication Systems Using Chaotic Wave Equations with Neural Network Boundary Conditions,” Entropy, 23, 904 (2021).
〇Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi, “Geometric Integrators for Neural Symplectic Forms,” International Symposium on Nonlinear Theory and Its Applications (NOLTA2023), Sep 27, 2023. (Catania, Italy)
〇Yuhan Chen, Baige Xu, Takashi Matsubara, Takaharu Yaguchi, “Geometric Integrators for Neural Symplectic Forms,” International Congress on Industrial and Applied Mathematics (ICIAM2023, Minisymposium), Aug 23, 2023. (Tokyo)
〇Yuhan Chen, Baige Xu, Takashi Matsubara, Takaharu Yaguchi, “Variational Principle and Variational Integrators for Neural Symplectic Forms,” International Conference on Machine Learning (ICML2023, Workshop), Jul 29, 2023. (Hawaii)
〇Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi, “Variational Integrator for Hamiltonian Neural Networks,” International Symposium on Nonlinear Theory and Its Applications (NOLTA2022). (Online)
〇Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi, “KAM Theory Meets Statistical Learning Theory: Hamiltonian Neural Networks with Non-Zero Training Loss,” AAAI Conference on Artificial Intelligence (AAAI2022), Feb 26, 2022. (Online)
〇Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi, “Neural symplectic form and coordinate-free learning of Hamiltonian dynamics,” International Conference on Scientific Computation and Differential Equations (SciCADE), Jul 25, 2022. (Iceland)
〇Yuhan Chen, Takashi Matsubara, Takaharu Yaguchi, “Neural Symplectic Form: Learning Hamiltonian Equations on General Coordinate Systems,” Conference on Neural Information Processing Systems (NeurIPS2021), Dec 09, 2021. (Online)