Hanjun Kim  

Associate Professor
School of Electrical and Electronic Engineering, Yonsei University

Ph.D. 2013, Department of Computer Science, Princeton University

Office: Engineering Hall #3-C415
Phone: +82-2-2123-2770
Email: first_name at yonsei.ac.kr

Refereed International Conference Publications

ELASM: Error-Latency-Aware Scale Management for Fully Homomorphic Encryption (USENIX Security, GitHub, PDF)
Yongwoo Lee, Seonyoung Cheon, Dongkwan Kim, Dongyoon Lee, and Hanjun Kim
32nd USENIX Security Symposium (USENIX Security), August 2023.
Acceptance Rate: 33% (190/569).

Among fully homomorphic encryption (FHE) schemes that allow computation on encrypted data, RNS-CKKS is widely used for privacy-preserving machine learning services thanks to its fixed-point arithmetic and SIMD-like vectorization. Prior works have partly automated the daunting scale management task required for RNS-CKKS fixed-point arithmetic, yet none takes the output error into consideration, preventing users from exploring a better error-latency trade-off. This work proposes a new error- and latency-aware scale management (ELASM) scheme for the RNS-CKKS FHE scheme. Since the error is the noise introduced by an RNS-CKKS operation scaled by the ciphertext scale, actively controlling the scale effectively reduces the impact of that noise on the error. ELASM explores different scale management plans that repurpose an upscale operation as an error reduction operation, estimates the output error and latency of each plan, and iteratively finds the best plan that minimizes the error-latency cost function. In addition, this work proposes a new scale-to-noise ratio (SNR) parameter and introduces fine-grained noise-aware waterlines (minimum scale requirements) for different RNS-CKKS operations, opening a new opportunity to further improve the error-latency trade-off. This work implements the proposed ideas in the ELASM compiler along with a new FHE language and type system that enforce the RNS-CKKS constraints, including SNR-based noise-aware waterlines. For ten machine learning and deep learning benchmarks, ELASM finds better error-latency trade-offs (lower Pareto curves) than state-of-the-art solutions such as EVA and Hecate.
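
The core intuition in the abstract, that the observed error shrinks as the ciphertext scale grows, so candidate scale plans can be ranked by a weighted error-latency cost, can be illustrated with a small sketch. The Python snippet below is only an illustration under stated assumptions, not the ELASM compiler's actual model or API: the ScalePlan class, the concrete bit widths, the latency estimates, and the weighted cost function are all hypothetical.

from dataclasses import dataclass

@dataclass
class ScalePlan:
    # Hypothetical summary of one scale management plan (illustrative only).
    name: str
    scale_bits: int      # log2 of the ciphertext scale after upscaling
    noise_bits: float    # log2 of the accumulated operation noise
    latency_ms: float    # estimated latency of this plan's schedule

    def error(self) -> float:
        # The decrypted error behaves roughly like noise / scale,
        # so a larger scale (upscaling) yields a smaller observed error.
        return 2.0 ** (self.noise_bits - self.scale_bits)

def cost(plan: ScalePlan, w_error: float, w_latency: float) -> float:
    # Assumed weighted error-latency cost; the weights steer the search
    # toward lower error or lower latency.
    return w_error * plan.error() + w_latency * plan.latency_ms

def pick_best(plans, w_error=1.0, w_latency=0.01) -> ScalePlan:
    # Keep the candidate plan that minimizes the cost function.
    return min(plans, key=lambda p: cost(p, w_error, w_latency))

if __name__ == "__main__":
    candidates = [
        ScalePlan("low-scale",  scale_bits=30, noise_bits=10, latency_ms=120.0),
        ScalePlan("mid-scale",  scale_bits=40, noise_bits=11, latency_ms=150.0),
        ScalePlan("high-scale", scale_bits=50, noise_bits=12, latency_ms=210.0),
    ]
    best = pick_best(candidates)
    print(f"chosen plan: {best.name}, error ~ {best.error():.2e}, "
          f"latency = {best.latency_ms} ms")

Changing the weights in pick_best traces out different points on the error-latency trade-off, which is the kind of exploration the abstract describes as finding a lower Pareto curve.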