Most of the blog posts I have read talk about RNNs automatically generating paragraphs of text based on the data they are trained on. For example, there are RNNs that can auto-generate the script for an entire episode of Silicon Valley. Other RNNs generate text based on a sentence given as input, for example, generating a movie review from an input sentence.
My question is: can an RNN generate text based on a keyword? For example, if I type "Europe", it should generate a travel blog post about Europe. If I type "Airbnb", it should generate a review of my stay. The examples might sound too generic, but the idea is to find out whether an RNN can generate different "styles" of text depending on the keyword used. If this is possible, how would training this model differ from training a vanilla LSTM model?
Best Answer
In theory, that's feasible: in a typical sequence-to-sequence model, the encoder summarizes the input text into a single vector, and the decoder is essentially a conditioned language model. (More advanced sequence-to-sequence models add attention, copy mechanisms, and so on.)
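To make the "conditioned language model" idea concrete, here is a minimal sketch in plain numpy. The keyword's embedding is used as the decoder's initial hidden state, so every generated token is conditioned on it. The vocabulary, sizes, and weights here are hypothetical illustrations (the weights are random and untrained), not a working generator:

```python
import numpy as np

# Toy conditioned RNN language model: the keyword embedding becomes the
# initial hidden state, so the whole generated sequence depends on it.
# Vocabulary and dimensions are made up for illustration.
rng = np.random.default_rng(0)
vocab = ["<s>", "europe", "airbnb", "travel", "stay", "great", "."]
V, H = len(vocab), 8

E = rng.normal(size=(V, H)) * 0.1   # token / keyword embeddings
Wh = rng.normal(size=(H, H)) * 0.1  # recurrent (hidden-to-hidden) weights
Wx = rng.normal(size=(H, H)) * 0.1  # input-to-hidden weights
Wo = rng.normal(size=(H, V)) * 0.1  # hidden-to-vocabulary projection

def generate(keyword, steps=5):
    """Greedy decoding, conditioned on the keyword via the initial state."""
    h = np.tanh(E[vocab.index(keyword)])  # condition: keyword sets h0
    token = vocab.index("<s>")
    out = []
    for _ in range(steps):
        h = np.tanh(h @ Wh + E[token] @ Wx)  # one RNN step
        token = int(np.argmax(h @ Wo))       # greedy pick over the vocabulary
        out.append(vocab[token])
    return out

print(generate("europe"))
```

Training would replace the greedy argmax with a softmax cross-entropy loss on real keyword/text pairs; the only change from a vanilla LSTM language model is that the initial state comes from the keyword instead of zeros.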
In practice, you'll need a training set, and given the current state of the art you're probably better off using information retrieval to fetch blog posts of interest for a given keyword.
Similar Posts:
- Solved – recurrent neural networks for sentence similarity
- Solved – Using RNN (LSTM) for predicting one feature value of a time series
- Solved – Capturing initial patterns when using truncated backpropagation through time (RNN/LSTM)