Recent Posts
- On the effect of pre-training corpora on in-context learning by a large-scale language model
We investigated the effect of the source and size of pre-training corpora on in-context few-shot and zero-shot learning in HyperCLOVA, a Korean AI platform based on GPT-3.
Sang-Woo Lee (Conversation / Language Research), May 3, 2022
- Optimization points for HyperCLOVA services
We would like to share how we applied multi-batching and tested the caching effect in a multi-turn pattern service to optimize HyperCLOVA-based services.
Kim Minsub, Lee Sungjae (Next ML Inference TF), Mar 10, 2022
- Extending the features of HyperCLOVA API
We would like to share our experience and some examples of how we've implemented the Early Stop, Semantic Search, and P-tuning features for HyperCLOVA-based services.
Kim Minsub, Lee Sungjae (Next ML Inference TF), Feb 24, 2022
- HyperCLOVA serving framework selection
We would like to share the experience and know-how we gained while optimizing HyperCLOVA's inference performance by switching transformer frameworks.
Kim Minsub, Lee Sungjae (Next ML Inference TF), Jan 5, 2022
Meet CLOVA
Get hands-on experience with CLOVA's AI technologies.