IPHS 300: Artificial Intelligence for the Humanities: Text, Image, and Sound

Publication Date

Fall 2021


GPT-2 has a number of hyperparameters that can be tuned to adjust the generated text, including temperature, number of training epochs, size of the training text corpus, batch size, and number of samples generated. Because GPT-2 is a relatively new AI model, the effects of fine-tuning these parameters are still largely unexplored. This paper presents experiments adjusting these GPT-2 parameters and examines how they affect the text the model generates.
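Temperature, for instance, rescales the model's output logits before a token is sampled: lower values concentrate probability on the most likely tokens, while higher values flatten the distribution. The sketch below is not GPT-2 itself but a minimal, stdlib-only illustration of temperature-scaled softmax sampling; the function name and example logits are illustrative only.

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert raw logits to probabilities, scaled by temperature.

    Lower temperature sharpens the distribution (more repetitive,
    conservative text); higher temperature flattens it toward
    uniform (more surprising, less coherent text).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits for three candidate next tokens.
logits = [2.0, 1.0, 0.1]
cold = softmax_with_temperature(logits, temperature=0.5)
neutral = softmax_with_temperature(logits, temperature=1.0)
hot = softmax_with_temperature(logits, temperature=2.0)
# The top token's probability shrinks as temperature rises,
# which is why higher temperatures yield more varied output.
```

This is why temperature interacts with the other settings in the abstract: a small corpus or few epochs already limits the model, and a high temperature compounds the randomness of its samples.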

Creative Commons License

Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.


