OpenNN

Open Neural Network Research Lab

Supported by MODULABS and Brian Impact Foundation

Open Neural Network Research Lab (OpenNN) is an open research center where deep learning researchers from various companies and labs collaborate on next-generation neural network research.

The current lab director is Vincent-Daniel Yun from USC. If you have any questions, please contact [juyoung dot yun @ usc dot com].

Our current research interests are:

  • Low-Cost LLM Training & Inference
  • Neural Network Optimization and Generalization Improvement
  • Novel Neural Network Architectures
  • Analysis of Legal Issues in Deep Learning Applications

Ten researchers from ten institutions collaborated on artificial intelligence research, successfully completing the inaugural program. Information on recruitment for the second cohort will be announced later.

See our team members [Link] from various institutions



Recent news

Sep 2025
Our paper entitled "MedCLM: Learning to Localize and Reason via a CoT-Curriculum in Medical Vision-Language Models" has been submitted to a CORE Rank A main conference.
[Link]
Sep 2025
Our paper entitled "Fast Fourier Transform-Based Spectral and Temporal Gradient Filtering for Differential Privacy" has been accepted at the CIKM 2025 Human-Centric AI Workshop.
[Link] [Workshop]
Sep 2025
Our paper entitled "Sharpness-Aware Minimization with Z-Score Gradient Filtering" has been accepted at the Conference on Neural Information Processing Systems (NeurIPS) 2025 OPT Workshop.
[Link] [Workshop]
Sep 2025
Our paper entitled "SGD Convergence under Stepsize Shrinkage in Low-Precision Training" has been accepted at the Conference on Neural Information Processing Systems (NeurIPS) 2025 OPT Workshop.
[Link] [Workshop]
Sep 2025
Our paper entitled "Insights from Gradient Dynamics: Gradient Autoscaled Normalization" has been accepted at the Conference on Neural Information Processing Systems (NeurIPS) 2025 OPT Workshop.
[Link] [Workshop]