TensorFlow.js Text Generation:
Train an LSTM (Long Short-Term Memory) model to generate text
Description
This example allows you to train a model to generate text in the style of some existing source text. The model is designed to predict the next character in a text given some preceding string of characters. Doing this repeatedly builds up a text, character by character.
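The character-by-character loop described above can be sketched as follows. `predictNextChar` is a hypothetical stand-in for the trained model's prediction step (in the real example this would run the LSTM on the encoded context); here a dummy predictor is used purely for illustration.

```javascript
// Sketch of the character-by-character generation loop: repeatedly feed
// the most recent characters to the model and append its prediction.
function generateText(seed, length, predictNextChar) {
  let text = seed;
  for (let i = 0; i < length; i++) {
    // The model only sees a fixed-size context window of recent characters.
    const context = text.slice(-40);
    text += predictNextChar(context);
  }
  return text.slice(seed.length); // return only the newly generated part
}

// Dummy predictor for illustration: always repeats the last character.
const echoPredictor = (context) => context[context.length - 1];
console.log(generateText('abc', 3, echoPredictor)); // → 'ccc'
```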
Status
Please select a text data source or enter your custom text in the text box below and click "Load source data".
Source Data
Model Loading/Creation
Model saved in IndexedDB: Load text data first.
LSTM layer size(s) (e.g., 128 or 100,50):
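The layer-size field above accepts either a single size ("128") or a comma-separated list ("100,50"), one entry per stacked LSTM layer. A minimal sketch of how such a string might be parsed is shown below; the function name and validation are assumptions, not the example's actual code.

```javascript
// Parse a layer-size string such as "128" or "100,50" into an array of
// positive integers, one per LSTM layer.
function parseLayerSizes(input) {
  const sizes = input.split(',').map((s) => Number.parseInt(s.trim(), 10));
  if (sizes.length === 0 || sizes.some((n) => !Number.isInteger(n) || n <= 0)) {
    throw new Error(`Invalid LSTM layer sizes: ${input}`);
  }
  return sizes;
}

console.log(parseLayerSizes('100, 50')); // → [ 100, 50 ]
```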
Model Training
It can take a while to generate an effective model. Try increasing the number of epochs to improve the results; we have found that about 50-100 epochs are needed to start generating reasonable text.
Number of Epochs:
Examples per epoch:
Batch size:
Validation split:
Learning rate:
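A rough sketch of how the training-size parameters above relate to each other: each epoch draws a fixed number of examples, a fraction of them is held out according to the validation split, and the rest are processed in mini-batches. The variable and function names here are illustrative assumptions, not the example's actual code.

```javascript
// Relate examples-per-epoch, batch size, and validation split.
function trainingStats(examplesPerEpoch, batchSize, validationSplit) {
  // Examples actually used for weight updates after holding out validation data.
  const trainExamples = Math.floor(examplesPerEpoch * (1 - validationSplit));
  return {
    trainExamples,
    batchesPerEpoch: Math.ceil(trainExamples / batchSize),
  };
}

console.log(trainingStats(2048, 128, 0.0625));
// → { trainExamples: 1920, batchesPerEpoch: 15 }
```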
Text Generation Parameters
To generate text, the model needs some number of preceding characters from which to continue; we call these characters the seed text. You can type one in, or we will extract a random substring from the input text to use as the seed text. Note that the seed text must be at least 40 characters long.
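Extracting a random 40-character seed substring, as described above, can be sketched like this (the function name is an assumption):

```javascript
// The model's context window: seeds must be exactly this many characters.
const SEED_LENGTH = 40;

// Pick a random SEED_LENGTH-character substring of the source text.
function randomSeed(sourceText) {
  if (sourceText.length < SEED_LENGTH) {
    throw new Error(`Source text must be at least ${SEED_LENGTH} characters`);
  }
  const start =
    Math.floor(Math.random() * (sourceText.length - SEED_LENGTH + 1));
  return sourceText.slice(start, start + SEED_LENGTH);
}
```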
Length of generated text:
Generation temperature:
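The generation temperature controls how randomly the next character is sampled: the model's raw scores are divided by the temperature before the softmax, so low temperatures sharpen the distribution (more predictable, repetitive text) and high temperatures flatten it (more surprising, error-prone text). A minimal dependency-free sketch of this sampling step:

```javascript
// Sample an index from raw scores (logits) after temperature scaling.
function sampleWithTemperature(logits, temperature) {
  const scaled = logits.map((x) => x / temperature);
  const maxLogit = Math.max(...scaled);
  const exps = scaled.map((x) => Math.exp(x - maxLogit)); // stable softmax
  const sum = exps.reduce((a, b) => a + b, 0);
  const probs = exps.map((e) => e / sum);
  // Draw one index according to the resulting probabilities.
  let r = Math.random();
  for (let i = 0; i < probs.length; i++) {
    r -= probs[i];
    if (r <= 0) return i;
  }
  return probs.length - 1;
}
```

At a very low temperature this behaves like picking the single most likely character; at temperature 1 it samples from the model's unmodified distribution.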
Seed text:
Model Output
Generated text: