Ph.D. student at KAIST AI; previously an intern at Google DeepMind and Kakao. Research interests: LLM Inference Acceleration, Foundation Model Training, Multimodal Learning.
- KAIST AI
- Seoul, Republic of Korea
- http://www.raymin0223.com
Pinned
- itsnamgyu/block-transformer: Block Transformer: Global-to-Local Language Modeling for Fast Inference (Official Code)
- fast_robust_early_exit: Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding (EMNLP 2023 Long)
- patch-mix_contrastive_learning: Patch-Mix Contrastive Learning with Audio Spectrogram Transformer on Respiratory Sound Classification (INTERSPEECH 2023)
- sungnyun/openssl-simcore: Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning (CVPR 2023)
- self-contrastive-learning: Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network (AAAI 2023)