1) The input dataset from Memegenerator is a bit messy. More importantly, it does not distinctly separate the top and bottom texts (some captions use a capital letter to signify the start of the bottom text, but that convention isn't applied consistently). A good technique when encoding text for this kind of task is to use a control token (e.g. a newline) to mark the break explicitly. (The conclusion notes this problem: "One example would be to train on a dataset that includes the break point in the text between upper and lower for the image. These were chosen manually here and are important for the humor impact of the meme.")
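The control-token idea can be sketched in a few lines. This is a hypothetical encoding, not the paper's actual pipeline; the token names (`<sep>`, `<eos>`) and helper functions are made up for illustration:

```python
# Hypothetical sketch: mark the top/bottom break with an explicit control
# token so the model can learn where the break falls, instead of relying
# on inconsistent capitalization in the dataset.
BREAK_TOKEN = "<sep>"  # could equally be "\n"
EOS_TOKEN = "<eos>"

def encode_caption(top: str, bottom: str) -> list[str]:
    """Tokenize a meme caption, marking the top/bottom break explicitly."""
    tokens = top.lower().split()
    tokens.append(BREAK_TOKEN)           # control token marks the break point
    tokens.extend(bottom.lower().split())
    tokens.append(EOS_TOKEN)
    return tokens

def decode_caption(tokens: list[str]) -> tuple[str, str]:
    """Split generated tokens back into top and bottom texts."""
    if EOS_TOKEN in tokens:
        tokens = tokens[:tokens.index(EOS_TOKEN)]
    if BREAK_TOKEN in tokens:
        i = tokens.index(BREAK_TOKEN)
        return " ".join(tokens[:i]), " ".join(tokens[i + 1:])
    return " ".join(tokens), ""  # model never emitted the break

tokens = encode_caption("one does not simply", "walk into mordor")
```

At generation time the sampled `<sep>` position then decides the break for free, no manual choice needed.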
2) The use of GloVe embeddings doesn't make as much sense here, even as a starting point. Pretrained embeddings generally work best on text that follows real-world word usage, which memes do not. (In this case, it's better to let the network train the embeddings from scratch.)
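"Train from scratch" just means the embedding matrix starts random and is updated like any other weight, rather than being loaded from GloVe. A minimal NumPy sketch of that (the sizes and the SGD step are arbitrary illustration, not the paper's settings):

```python
import numpy as np

# Hypothetical sketch: randomly initialized embedding matrix treated as a
# trainable parameter, instead of loading pretrained GloVe rows.
rng = np.random.default_rng(0)
vocab_size, embed_dim = 10_000, 128

# Small random init; same shape a GloVe matrix would have had.
embeddings = rng.normal(0.0, 0.01, size=(vocab_size, embed_dim))

def lookup(token_ids):
    """Embedding-layer forward pass: a row lookup (returns a copy)."""
    return embeddings[token_ids]

def sgd_step(token_ids, grad_wrt_output, lr=0.1):
    """Backward pass for the lookup: scatter the gradient into only the
    rows that were used, so the vectors adapt to the meme text itself."""
    np.add.at(embeddings, token_ids, -lr * grad_wrt_output)

vecs = lookup(np.array([3, 7]))                   # (2, 128) word vectors
sgd_step(np.array([3, 7]), np.ones((2, embed_dim)))
```

Framework embedding layers (Keras `Embedding`, PyTorch `nn.Embedding`) do exactly this lookup-and-scatter for you.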
3) A 512-cell LSTM might be too big for a word-level model on a dataset of this size; since the captions follow fairly rigid templates, a 256-cell bidirectional LSTM might work better. You can still use 512-cell LSTMs if you have a lot of text, though.
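A rough way to see the size difference is to count parameters, using the standard LSTM parameter formula (four gates, each with input weights, recurrent weights, and a bias). The 128-dim embedding input is my assumption, not a figure from the paper:

```python
# Back-of-the-envelope LSTM parameter counts.
def lstm_params(input_dim: int, units: int) -> int:
    """Standard LSTM layer: 4 gates x (input weights + recurrent weights + bias)."""
    return 4 * ((input_dim + units) * units + units)

embed_dim = 128
p512 = lstm_params(embed_dim, 512)         # unidirectional, 512 cells: 1,312,768
p256_bi = 2 * lstm_params(embed_dim, 256)  # two 256-cell directions:     788,480
print(p512, p256_bi)
```

So the 256-cell bidirectional layer carries roughly 60% of the parameters of the 512-cell one, while still seeing context from both directions.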
I thought it was funny, though, that Richard Socher, NLP researcher and one of the authors of GloVe, is pictured in the generated memes on p. 8. ("the face you make when")
This Artificial Intelligence Learned to Create Its Own Memes and the Results will Make you ROFL!!
How scientists trained an AI to create memes by looking at images
The end is near. The singularity is here. Run for your lives!1!!
The other generated images are just dumb.
If not, I'll brb, need to set up some websites / facebook accounts.
You could tell it was automated because, every once in a while, a very reddit-specific meme would appear on the 9gag front page with a bunch of confused comments from 9gag users who didn't understand it. Here's a writeup on it from a couple of years ago [1].
I don't doubt that other clickbait sites like BoredPanda do exactly the same thing.
[1] https://www.reddit.com/r/pcmasterrace/comments/3z2wvf/about_...
If this was submitted we are certainly in the dankest timeline.