This model is a fine-tuned version of GPT-2 on an IMDB fantasy synopsis dataset.

Agniwald/GPT-2-Fantasy

GPT-2-Fantasy

This model was fine-tuned with the intention of generating short fantasy stories from given keywords. You can test this model here.

Training data

The training data was scraped from the IMDB website and consists of keyword-synopsis pairs. The method of encoding the data was inspired by this repo.
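One plausible way to encode such pairs is to join the keywords and the synopsis into a single training string with separator markers. This is only a sketch: the `<|keywords|>` and `<|synopsis|>` tokens below are assumptions for illustration, not the actual format used by this repository.

```python
# Sketch of one possible keyword-synopsis encoding for fine-tuning.
# The markers <|keywords|> and <|synopsis|> are illustrative assumptions;
# <|endoftext|> is GPT-2's standard end-of-text token.

def encode_example(keywords, synopsis):
    """Join a list of keywords and a synopsis into one training string."""
    kw = ", ".join(keywords)
    return f"<|keywords|> {kw} <|synopsis|> {synopsis} <|endoftext|>"

example = encode_example(
    ["time travel", "magic", "rescue"],
    "A group of young men must band together to save their friend.",
)
```

At generation time, the same marker layout would let you prompt the model with `<|keywords|> ... <|synopsis|>` and have it continue with a story.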

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam
  • dropout: 0.2
  • learning rate schedule: exponential decay
  • epochs: 4
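To make the schedule concrete, here is a minimal pure-Python sketch of exponential learning rate decay, mirroring the formula used by `tf.keras.optimizers.schedules.ExponentialDecay`. The initial rate, decay rate, and steps per epoch below are assumptions; the README does not state the actual values.

```python
# Exponential decay: lr(step) = initial_lr * decay_rate ** (step / decay_steps).
# This mirrors tf.keras.optimizers.schedules.ExponentialDecay; the numeric
# values here (3e-5, 0.9, 1000 steps/epoch) are illustrative assumptions.

def exponential_decay(initial_lr, decay_rate, decay_steps, step):
    """Learning rate after `step` optimizer steps."""
    return initial_lr * decay_rate ** (step / decay_steps)

# Learning rate at the start of each of the 4 training epochs,
# assuming 1000 optimizer steps per epoch.
lrs = [exponential_decay(3e-5, 0.9, 1000, epoch * 1000) for epoch in range(4)]
```

Each epoch starts at 90% of the previous epoch's rate under these assumed settings, so the rate shrinks smoothly rather than in abrupt steps.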

Training results

Keywords: time travel, magic, rescue.
Output: A group of young men find themselves in the middle of a desert, and must band together to save their friend from an evil force bent on destroying his world. But what if they don't trust each other? Is there something that might help them at all costs? Can they break the spell before it's too late? Will they be able to stop this Evil once and for all? Or will they do anything about it? Above all, can they make things right with humanity? And above all, is there some really good magic going on here? What are they trying to do?

Framework versions

  • Transformers 4.29.2
  • TensorFlow 2.12.0
  • Tokenizers 0.13.3
