Reference

Papers

Implementation

  • The Annotated GPT-2 is an annotated version of the GPT-2 paper with plenty of PyTorch code.

  • This GitHub repo is a PyTorch implementation of GPT-2 by Hugging Face.

  • minGPT is a minimal PyTorch re-implementation of GPT.

  • Yet another GPT-2 implementation in PyTorch.

  • The Annotated Transformer explains in code how the transformer is implemented, and is endorsed by the author of “The Annotated GPT-2”.

  • The PyTorch tutorial on training a sequence-to-sequence model with the nn.Transformer module (a minimal usage sketch follows this list).
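As a quick illustration of the nn.Transformer module covered by that tutorial, here is a minimal sketch; the dimensions, layer counts, and dummy tensors are illustrative only and are not the hyperparameters used in the tutorial.

```python
import torch
import torch.nn as nn

# Illustrative hyperparameters, not the tutorial's actual settings.
d_model, nhead = 512, 8
model = nn.Transformer(d_model=d_model, nhead=nhead,
                       num_encoder_layers=6, num_decoder_layers=6)

# Dummy source/target sequences: (sequence length, batch size, d_model),
# since nn.Transformer defaults to batch_first=False.
src = torch.rand(10, 32, d_model)
tgt = torch.rand(20, 32, d_model)

out = model(src, tgt)
print(out.shape)  # torch.Size([20, 32, 512])
```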

APIs

OpenAI API

The documentation [1] of the official OpenAI library covers the following (a minimal completion sketch follows the list):

  • Text Completion

  • Edit / Correct Inputs

  • Similarity Comparison

  • Classification

  • Text Comprehension

  • Embedding

  • Fine-tuning
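As an illustration of the text-completion capability, here is a minimal sketch assuming the pre-1.0 openai Python package; the model name, prompt, and sampling parameters are placeholders, and the current API surface may differ.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; set your own key

# Text completion via the legacy Completion endpoint (openai < 1.0).
# Model name and parameters are illustrative, not a recommendation.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Summarize the GPT-2 paper in one sentence:",
    max_tokens=64,
    temperature=0.7,
)
print(response["choices"][0]["text"])
```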

Transformers

The transformers library from Hugging Face provides APIs to download and train pre-trained models, including GPT-2 and GPT-Neo.
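For example, downloading the pre-trained GPT-2 weights and sampling a short continuation takes only a few lines; the prompt and generation settings below are arbitrary.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Download the pre-trained GPT-2 weights and tokenizer from the Hugging Face Hub.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Generate a short continuation of a prompt (sampling settings are arbitrary).
inputs = tokenizer("The transformer architecture", return_tensors="pt")
outputs = model.generate(**inputs, max_length=30, do_sample=True, top_k=50,
                         pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```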

Fine-Tuning

  • This 150 kB text file contains transcripts of Elon Musk's podcast appearances.

  • This post shows how to retrain the GPT-2 model; a fine-tuning sketch follows this list.
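A minimal sketch of one possible fine-tuning setup, using the Hugging Face Trainer on a plain-text corpus, is shown below; it is not necessarily the approach taken in the post, and the file name transcripts.txt is a placeholder for a text file such as the one above.

```python
from transformers import (GPT2LMHeadModel, GPT2Tokenizer,
                          TextDataset, DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# "transcripts.txt" is a placeholder for the plain-text training corpus.
dataset = TextDataset(tokenizer=tokenizer,
                      file_path="transcripts.txt",
                      block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

# Training hyperparameters here are illustrative, not tuned values.
args = TrainingArguments(output_dir="gpt2-finetuned",
                         num_train_epochs=3,
                         per_device_train_batch_size=2)

trainer = Trainer(model=model, args=args,
                  data_collator=collator, train_dataset=dataset)
trainer.train()
trainer.save_model("gpt2-finetuned")
```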

Other Sources

Reference

Back to GPT.