IPHS 300: Artificial Intelligence for the Humanities: Text, Image, and Sound


Fine-tuning Daria: Exploring the Implications of Temperature, Epochs, & Corpus Size on GPT-2 Screenplay Generation

Publication Date

Fall 2021


GPT-2 has a number of hyperparameters that can be tuned to adjust the generated text, including temperature, number of training epochs, size of the training text corpus, batch size, and number of samples generated. Because GPT-2 is a relatively new AI model, the effects of fine-tuning these settings remain largely unexplored. This paper presents experiments that adjust these GPT-2 parameters and examines how each affects the text that the model generates.
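Of the hyperparameters listed above, temperature acts at sampling time: the model's output logits are divided by the temperature before the softmax, so low values sharpen the distribution toward the most likely next token while high values flatten it. The sketch below (illustrative only, not code from the paper; the logits are made-up values) shows this effect with a plain-Python softmax:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw logits to sampling probabilities at a given temperature."""
    scaled = [x / temperature for x in logits]
    # Subtract the max for numerical stability before exponentiating.
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate next tokens.
logits = [2.0, 1.0, 0.1]

for t in (0.5, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    print(t, [round(p, 3) for p in probs])
```

At temperature 0.5 the top token dominates (more repetitive, "safer" text), while at 2.0 the probabilities move toward uniform (more surprising, less coherent text) — which is why temperature sweeps are a natural axis for the generation experiments described here.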

Creative Commons License

Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.
