Inspiration

We built this project because we wanted to learn how to generate text using machine learning.

What it does

It generates a Shakespearean sonnet using a neural network.

How we built it

We used Python to build the generator. We posit that stacking three LSTMs in series on top of the base network would likely give better results; this is planned as a future improvement for a more accurate and efficient model.
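The "three LSTMs in series" idea can be sketched as follows. This is a minimal NumPy illustration, not our actual model: each LSTM layer consumes the hidden-state sequence produced by the layer below it, so the layers run in series. All shapes, weight names, and the random initialization are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_lstm(input_dim, hidden_dim):
    """Random parameters for one LSTM layer (gates packed as i, f, o, g)."""
    return {
        "W": rng.normal(scale=0.1, size=(4 * hidden_dim, input_dim)),
        "U": rng.normal(scale=0.1, size=(4 * hidden_dim, hidden_dim)),
        "b": np.zeros(4 * hidden_dim),
    }

def lstm_step(x, h, c, p):
    """One timestep of a single LSTM cell."""
    H = h.shape[0]
    z = p["W"] @ x + p["U"] @ h + p["b"]
    i = sigmoid(z[:H])           # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:])       # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def stacked_lstm(sequence, layers, hidden_dim):
    """Run a sequence through LSTM layers in series.

    The hidden-state sequence emitted by layer k becomes the
    input sequence of layer k + 1.
    """
    for p in layers:
        h = np.zeros(hidden_dim)
        c = np.zeros(hidden_dim)
        out = []
        for x in sequence:
            h, c = lstm_step(x, h, c, p)
            out.append(h)
        sequence = out
    return sequence  # hidden states of the top layer

# Three LSTMs in series over a toy embedded word sequence.
embed_dim, hidden_dim, seq_len = 8, 16, 5
layers = [make_lstm(embed_dim, hidden_dim)] + [
    make_lstm(hidden_dim, hidden_dim) for _ in range(2)
]
seq = [rng.normal(size=embed_dim) for _ in range(seq_len)]
top = stacked_lstm(seq, layers, hidden_dim)
print(len(top), top[0].shape)  # → 5 (16,)
```

In a real model the top layer's hidden states would feed a softmax over the vocabulary to pick the next word; deep-learning libraries express the same stacking by having each recurrent layer return its full output sequence to the layer above.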

We tried two approaches: generating each line independently, or conditioning each line on the lines before it. We also gave the generator the option to produce the final couplet with or without the context of the previously generated text. We found that including that context and keeping the rhyme scheme makes the sonnet more cohesive and semi-logical, even if the grammar sometimes falls apart. We also found that word analysis of Shakespearean sonnets yielded topics common in Shakespeare's work, such as strong emphases on love, death, tragedy, and emotion, and these topics heavily influenced our word selection during generation.
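The two generation modes can be sketched as below. The `next_word` function and its toy bigram table are stand-ins for the trained network's sampling step, not our actual model; the point is only the difference between generating each line from scratch and carrying earlier lines forward as context.

```python
import random

random.seed(42)

# Stand-in for the trained network: given the words generated so far,
# return a plausible next word. In the real project this would be the
# neural network's sampling step over the sonnet vocabulary.
BIGRAMS = {
    "my":    ["love", "heart"],
    "love":  ["doth", "shall"],
    "heart": ["doth", "shall"],
    "doth":  ["sing", "weep"],
    "shall": ["sing", "weep"],
    "sing":  ["my"],
    "weep":  ["my"],
}

def next_word(context):
    last = context[-1] if context else "my"
    return random.choice(BIGRAMS.get(last, ["my"]))

def generate_line(context, length=6):
    """Generate one line, optionally conditioned on earlier text."""
    words = []
    for _ in range(length):
        words.append(next_word(context + words))
    return words

def generate_sonnet(use_context=True, lines=14, length=6):
    poem, context = [], []
    for _ in range(lines):
        line = generate_line(context if use_context else [], length)
        poem.append(" ".join(line))
        if use_context:
            context += line  # later lines see everything generated so far
    return poem

for line in generate_sonnet(use_context=True)[:2]:
    print(line)
```

With `use_context=False` every line starts from an empty context, which is why independently generated lines tend to feel disconnected from each other.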

Challenges we ran into

Sometimes our model output sonnets that made absolutely no sense, though they were often hilarious, and the logic frequently broke down. We also spent much of our time learning how text generation works with neural networks, so we did not have enough time to implement everything we wanted.

Accomplishments that we're proud of

We learned a great deal about text generation, a rapidly growing field, and we are proud to have spent so many hours digging into such an interesting topic.

What we learned

We learned how to generate text from an existing corpus using a neural network, and we learned how useful LSTMs can be, even though bugs and time constraints kept us from implementing them. We also learned different ways to generate text and the structure needed to make the output usable.

What's next for A2S

We will implement A2S with three LSTMs in series on top of the neural network for better results. We can also generalize this text generation tool to other authors with significant corpora to create specialized text generators.

Built With

python