6 points by anarmorarm 16 hours ago | 1 comment
  • anarmorarm 5 hours ago
    Edit: GPT-2, not GPT-2 Medium. The second paragraph should read:

    "With 23.8 PPL on WikiText-103, WaveletLM beats both GPT-2, which was trained on 80× more data, and Transformer-XL Standard, which..."