
GPT-2: 1.5B release - OpenAI
November 5, 2019 · As the final model release of GPT-2's staged release, we're releasing the largest version (1.5B parameters) of GPT-2 along with code and model weights to facilitate detection of outputs of GPT-2 models.
GPT-2 - Wikipedia
Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained on a dataset of 8 million web pages.
openai-community/gpt2 - Hugging Face
GPT-2 is a transformers model pretrained on a very large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way.
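To make this entry concrete, here is a minimal sketch (not taken from the page above) of loading this checkpoint through the Hugging Face transformers library and sampling a continuation. It assumes transformers and torch are installed; the prompt string is just an illustration.

```python
# Minimal sketch: load the public openai-community/gpt2 checkpoint and
# sample a short continuation. Assumes `pip install transformers torch`.
from transformers import pipeline

generator = pipeline("text-generation", model="openai-community/gpt2")
result = generator("GPT-2 is a language model that", max_new_tokens=30)
print(result[0]["generated_text"])
```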
GitHub - openai/gpt-2: Code for the paper "Language Models are ...
This repository is meant to be a starting point for researchers and engineers to experiment with GPT-2. For basic information, see our model card.
OpenAI GPT2 - Hugging Face
GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text.
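The objective above is ordinary next-token cross-entropy. Below is a hedged sketch of computing it with the transformers API; GPT2LMHeadModel shifts the labels internally, so passing labels=input_ids yields the standard loss. The example sentence is arbitrary.

```python
# Sketch of the "predict the next word" objective: passing labels=input_ids
# to GPT2LMHeadModel yields the shifted next-token cross-entropy loss.
# Assumes transformers and torch are installed.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("openai-community/gpt2")
model = GPT2LMHeadModel.from_pretrained("openai-community/gpt2")

inputs = tokenizer("Language models are unsupervised multitask learners.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, labels=inputs["input_ids"])
print(f"next-token cross-entropy: {outputs.loss.item():.3f}")
```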
GPT-2: 6-month follow-up - OpenAI
August 20, 2019 · We're releasing the 774 million parameter GPT-2 language model after the release of our small 124M model in February, staged release of our medium 355M model in May, and subsequent research with partners and the AI community into the model's potential for misuse and societal benefit.
The Illustrated GPT-2 (Visualizing Transformer Language Models)
August 12, 2019 · The OpenAI GPT-2 exhibited an impressive ability to write coherent and passionate essays, beyond what we anticipated current language models could produce.
Fine-tuning GPT-2 from human preferences - OpenAI
September 19, 2019 · We've fine-tuned the 774M parameter GPT-2 language model using human feedback for various tasks, successfully matching the preferences of the external human labelers, though those preferences did not always match our own.
OpenAI has published the text-generating AI it said was too dangerous to share
November 7, 2019 · Research lab OpenAI announced it had created a new text-generating AI system called GPT-2 in February, but withheld releasing the full model as it was too dangerous to share.
gpt-2/model_card.md at master · openai/gpt-2 - GitHub
1.5 billion parameters: the fourth and largest GPT-2 version. We have also released 124 million, 355 million, and 774 million parameter models.