Han-Byul Kim

I'm a Ph.D. candidate in Computer Science & Engineering at Seoul National University, advised by Professor Sungjoo Yoo.

I'm passionate about making machine learning faster and more efficient. I focus on optimizing neural networks end to end, from algorithms and architectures to system design and hardware implementation. My research explores techniques such as quantization and neural architecture search; a minimal sketch of the core primitive is given below. I also work on deploying optimized neural networks on FPGAs through RTL design.
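For readers unfamiliar with the area, here is a minimal sketch of uniform fake quantization, the basic primitive behind much of this line of work. The function name, bit-width, and clipping value are illustrative choices for this sketch, not taken from any specific paper below.

import jax.numpy as jnp

def fake_quantize(x, num_bits=4, clip_max=1.0):
    # Uniform fake quantization: clip, snap to an integer grid, dequantize.
    # Simulates low-bit inference during float training.
    max_code = 2 ** num_bits - 1            # largest integer code, e.g. 15 for 4-bit unsigned
    scale = clip_max / max_code
    x_clipped = jnp.clip(x, 0.0, clip_max)  # clip activations to [0, clip_max]
    x_int = jnp.round(x_clipped / scale)    # map to the integer grid
    return x_int * scale                    # dequantize back to float

acts = jnp.array([0.03, 0.4, 0.75, 1.9])    # example post-ReLU activations
print(fake_quantize(acts, num_bits=4, clip_max=1.0))

Roughly speaking, searching over the clipping value and over per-layer bit-widths is where methods like the activation-clipping search and mixed-precision quantization in the papers below come in.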

LinkedIn  /  Scholar  /  Github  /  Email

Experience

Research Intern, Apple

Apr. 2024 - Present, Seattle, WA, United States

Full-time Ph.D. intern on the MIND (Machine Intelligence & Neural Design) team, DMLI

Student Researcher, Google

Oct. 2022 - Dec. 2023, Seoul, Korea

Full-time Ph.D. intern on the Model Optimization team, CoreML

Research

MetaMix: Meta-state Precision Searcher for Mixed-precision Activation Quantization

Han-Byul Kim, Joo Hyung Lee, Sungjoo Yoo, Hong-Seok Kim

AAAI Conference on Artificial Intelligence (AAAI), 2024

#Quantization, #Neural_Architecture_Search, #Mixed_Precision

JaxPruner: A concise library for sparsity research

Joo Hyung Lee, Wonpyo Park, Nicole Mitchell, Jonathan Pilault, Johan Obando-Ceron, Han-Byul Kim, Namhoon Lee, Elias Frantar, Yun Long, Amir Yazdanbakhsh, Shivani Agrawal, Suvinay Subramanian, Xin Wang, Sheng-Chun Kao, Xingyao Zhang, Trevor Gale, Aart Bik, Woohyun Han, Milen Ferev, Zhonglin Han, Hong-Seok Kim, Yann Dauphin, Gintare Karolina Dziugaite, Pablo Samuel Castro, Utku Evci

Conference on Parsimony and Learning (CPAL), 2024

#Sparsity, #Quantization, #Jax
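As an illustration of the kind of experiment a sparsity library supports, below is a generic one-shot magnitude-pruning sketch in JAX. It is a hypothetical, standalone example and does not use or reproduce JaxPruner's actual API.

import jax
import jax.numpy as jnp

def magnitude_prune(params, sparsity=0.5):
    # One-shot magnitude pruning: zero the smallest-|w| entries in each array.
    def prune_leaf(w):
        k = int(w.size * sparsity)          # number of weights to zero
        if k == 0:
            return w
        threshold = jnp.sort(jnp.abs(w).ravel())[k - 1]
        return jnp.where(jnp.abs(w) > threshold, w, 0.0)
    return jax.tree_util.tree_map(prune_leaf, params)

params = {"dense": jnp.array([[0.1, -0.8], [0.05, 0.9]])}
print(magnitude_prune(params, sparsity=0.5))  # zeros the two smallest-magnitude weights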

BASQ: Branch-wise Activation-clipping Search Quantization for Sub-4-bit Neural Networks

Han-Byul Kim, Eunhyeok Park, Sungjoo Yoo

European Conference on Computer Vision (ECCV), 2022

#Quantization, #Neural_Architecture_Search, #2-bit, #3-bit, #4-bit

For more details, please visit my LinkedIn.


Webpage source: https://github.com/jonbarron/jonbarron_website.