Tensorflow Train Batches For Multiple Epochs?
I don't understand how to run the result of tf.train.batch for multiple epochs. It runs out after one pass, of course, and I don't know how to restart it. Maybe I could repeat it using tile somehow?
Solution 1:
tf.train.batch simply groups upstream samples into batches, and nothing more. It is meant to be used at the end of an input pipeline; data and epochs are dealt with upstream.
For example, if your training data fits into a tensor, you could use tf.train.slice_input_producer to produce samples. This function has arguments for shuffling and epochs.
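A minimal sketch of such a pipeline, using the TF 1.x queue-based API. The toy features/labels arrays, the batch size, and the epoch count are placeholders chosen only for illustration:

```python
import numpy as np
import tensorflow as tf  # TF 1.x queue-based input pipeline

# Hypothetical toy data: 10 samples with 3 features each, plus labels.
features = np.arange(30, dtype=np.float32).reshape(10, 3)
labels = np.arange(10, dtype=np.int32)

# slice_input_producer handles shuffling and epoch counting upstream;
# num_epochs=2 makes the queue raise OutOfRangeError after two passes.
feature_slice, label_slice = tf.train.slice_input_producer(
    [features, labels], num_epochs=2, shuffle=True)

# tf.train.batch just groups the upstream samples into batches.
feature_batch, label_batch = tf.train.batch(
    [feature_slice, label_slice], batch_size=4,
    allow_smaller_final_batch=True)

with tf.Session() as sess:
    # num_epochs creates a local counter variable, so initialize locals too.
    sess.run([tf.global_variables_initializer(),
              tf.local_variables_initializer()])
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    try:
        while not coord.should_stop():
            x, y = sess.run([feature_batch, label_batch])
            print(x.shape, y)
    except tf.errors.OutOfRangeError:
        print("Done: queue exhausted after the requested epochs.")
    finally:
        coord.request_stop()
        coord.join(threads)
```

The point is that tf.train.batch never needs to be "restarted": the producer upstream decides how many epochs flow through it, and the pipeline signals completion by raising OutOfRangeError.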