GPT-2 pretrained on Korean datasets.
Why do you think that https://github.com/cedrickchee/awesome-transformer-nlp is a good alternative to KoGPT?