Using pre-trained word2vec with LSTM for word generation
I've created a gist with a simple generator that builds on top of your initial idea: an LSTM network wired to pre-trained word2vec embeddings, trained to predict the next word in a sentence. The data is a list of abstracts from the arXiv website. I'll highlight the most important parts here.

Gensim Word2Vec

Your code is fine, except for the number of iterations used to train it.