Findings
Papers, talks, and posters. Each entry links to its detail page (parent / derivative findings, abstract, BibTeX, external resources).
Papers (16)
- Gaussian Equivalence for Self-Attention: Asymptotic Spectral Analysis of Attention Matrix
- Free Random Projection for In-Context Reinforcement Learning
- Cluster Haptic Texture Database: Haptic Texture Database with Variety in Velocity and Direction of Sliding Contacts
- PanoTree: Automated Photospot Explorer in Virtual Reality Scenes
- Understanding MLP-Mixer as a Wide and Sparse MLP
- Understanding Gradient Regularization in Deep Learning: Efficient Finite-Difference Computation and Implicit Bias
- Preliminary Study of Haptic Presentation in a VR Environment Using LRAs
- Asymptotic Freeness of Layerwise Jacobians Caused by Invariance of Multilayer Perceptron: The Haar Orthogonal Case
- Layer-Wise Interpretation of Deep Neural Networks Using Identity Initialization
- Viewpoint Planning of Projector Placement for Spatial Augmented Reality using Star-Kernel Decomposition
- The Spectrum of Fisher Information of Deep Networks Achieving Dynamical Isometry
- Selective Forgetting of Deep Networks at a Finer Level than Samples
- Cauchy noise loss for stochastic optimization of random matrix models via free deterministic equivalents
- Identifiability of parametric random matrix models
- Free deterministic equivalent Z-scores of compound Wishart models: A goodness of fit test of 2DARMA models
- De Finetti theorem for a Boolean analogue of easy quantum groups
Talks (29)
- Free Random Projection for In-Context Reinforcement Learning
- Gaussian Equivalence for Self-Attention: Spectral Analysis of Attention Matrix
- Connections between Freeness of Random Matrices and State-Space Models
- Connections between Freeness of Random Matrices and State-Space Models
- Free Random Projection for In-Context Reinforcement Learning
- Free Random Projection for In-Context Reinforcement Learning
- Free Probability Theory and Random Matrices
- Free Random Projection for In-Context Reinforcement Learning
- Random Matrices and Deep Learning
- Random Matrices, Free Probability, and Neural Networks
- Polynomials of Random Matrices in Deep Neural Networks
- Analysis of Deep Neural Networks With Random Tensors
- Random Matrices, Free Probability, and Neural Networks
- Random Matrices, Free Probability, and Deep Neural Networks
- Understanding MLP-Mixer as a Wide and Sparse MLP through Random Permutation Matrices
- Asymptotic freeness in MLP and related topics
- Random Matrices, Free Probability and Deep Learning
- Asymptotic Freeness of Layerwise Jacobians Caused by Invariance of Multilayer Perceptron
- Mathematics of Deep Neural Networks: Infinite-Dimensional Approximation, Random Matrices, and Free Probability Theory
- Asymptotic Freeness of Layerwise Jacobians Caused by Invariance of Multilayer Perceptron
- Random matrix approach to deep learning
- Analysis of Deep Neural Networks via Free Probability Theory
- Parameter Estimation of Random Matrix Models via Free Probability Theory
- Cauchy noise loss: A machine learning approach to random matrices and free probability
- Free product of von Neumann algebras
- De Finetti theorems for a Boolean analogue of easy quantum groups
- Cumulants in noncommutative probability
- A symmetry in free probability: Quantum de Finetti theorem
- Gaussian Equivalence for Self-Attention: Asymptotic Spectral Analysis of Attention Matrix
Posters (6)
- Understanding MLP-Mixer as a Wide and Sparse MLP
- Asymptotic Freeness in Jacobian of Deep Neural Networks
- Asymptotic Freeness in Jacobian of Deep Neural Networks
- Spectral Parameter Estimation of Random Matrix Models (dimension recovery via Cauchy noise loss)
- Spectral Parameter Estimation of Random Matrix Models
- Cauchy noise loss for stochastic optimization of random matrix models via free deterministic equivalents