- On the effect of pre-training corpora on in-context learning by a large-scale language model
We investigated the effect of the source and size of pre-training corpora on in-context few-shot and zero-shot learning in HyperCLOVA, a Korean AI platform based on GPT-3. (May 3, 2022)
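The summary above refers to in-context few-shot learning, where task demonstrations are placed directly in the prompt instead of fine-tuning the model. As a rough illustration only (the prompt format and function name are hypothetical, not HyperCLOVA's actual API), a few-shot prompt can be assembled like this:

```python
def build_few_shot_prompt(examples, query):
    """Concatenate demonstrations and a query into one few-shot prompt.

    In-context learning: the model sees the labeled examples at
    inference time; no weights are updated.
    """
    lines = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

demos = [("사과", "apple"), ("바다", "sea")]
print(build_few_shot_prompt(demos, "하늘"))
```

In the zero-shot case, `examples` is simply empty and the model must rely on the instruction alone.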
- Optimization points for HyperCLOVA services
We would like to share how we applied multi-batching and tested the caching effect in a multi-turn pattern service to optimize HyperCLOVA-based services. (Mar 10, 2022)
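The caching idea mentioned above exploits the fact that consecutive requests in a multi-turn service often repeat the same prompt prefix. A minimal sketch of such a response cache (the class and the stand-in model call are hypothetical, not the actual HyperCLOVA implementation):

```python
class PrefixCache:
    """Cache model outputs keyed on the full prompt string.

    In a multi-turn pattern service, identical prompts recur, so a
    hit skips the expensive model call entirely.
    """

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def get_or_compute(self, prompt, compute):
        if prompt in self._store:
            self.hits += 1
        else:
            self.misses += 1
            self._store[prompt] = compute(prompt)  # expensive model call
        return self._store[prompt]

cache = PrefixCache()
fake_model = lambda p: p.upper()  # stand-in for an inference request
cache.get_or_compute("hello", fake_model)
cache.get_or_compute("hello", fake_model)  # served from cache
print(cache.hits, cache.misses)  # → 1 1
```

A production variant would bound the cache size and account for sampling parameters in the key; this sketch only shows the hit/miss mechanics.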
- Extending the features of HyperCLOVA API
We would like to share our experience and some examples of how we've implemented the Early Stop, Semantic Search, and P-tuning features for HyperCLOVA-based services. (Feb 24, 2022)
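Of the features listed above, Early Stop is the simplest to sketch: generation is cut off as soon as a stop sequence appears, so tokens after it are never returned (or, server-side, never generated). A minimal post-processing version, with hypothetical names and not the actual HyperCLOVA implementation:

```python
def apply_early_stop(generated, stop_sequences):
    """Truncate generated text at the first occurrence of any stop sequence."""
    cut = len(generated)
    for stop in stop_sequences:
        idx = generated.find(stop)
        if idx != -1:
            cut = min(cut, idx)  # keep everything before the earliest stop
    return generated[:cut]

# A QA-style completion that runs on into the next question:
text = "Answer: 42\nQ: what comes next?"
print(apply_early_stop(text, ["\nQ:"]))  # → Answer: 42
```

In a real serving stack this check runs inside the decoding loop, so the model stops spending compute the moment the stop sequence is emitted.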
- HyperCLOVA Serving Framework Selection
We would like to share the experience and know-how we gained while optimizing HyperCLOVA's inference performance by changing the transformer framework. (Jan 5, 2022)