A character-level language model that generates new text in the style of its training data, built on an RNN architecture
- implement a basic bigram model
- implement a multi-layer perceptron language model, ref: "A Neural Probabilistic Language Model"
- add batchnorm, ref: "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift"
- read about the problems of batchnorm, ref: "Rethinking “Batch” in BatchNorm"
- implement an RNN architecture
- read the WaveNet paper, ref: "WaveNet: A generative model for raw audio"
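The bigram step above can be sketched as a count-based model: count character pairs in the training text, then sample each next character from the distribution conditioned on the previous one. The toy training string, function names, and smoothing-free setup here are illustrative assumptions, not this repo's actual implementation.

```python
from collections import Counter
import random

def train_bigram(text):
    """Count character bigrams into per-character next-char distributions."""
    counts = {}
    for a, b in zip(text, text[1:]):
        counts.setdefault(a, Counter())[b] += 1
    return counts

def sample(counts, start, length, seed=0):
    """Draw each next character from the learned bigram distribution."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        dist = counts.get(out[-1])
        if not dist:  # dead end: character never appeared with a successor
            break
        chars, weights = zip(*dist.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

model = train_bigram("hello world, hello there")
print(sample(model, "h", 10, seed=42))
```

A neural bigram (an embedding table trained with cross-entropy) learns the same table of conditional probabilities; the count-based version is just its closed-form optimum.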
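The MLP step follows the shape of the Bengio et al. model: embed each character in the context window, concatenate the embeddings, pass them through a tanh hidden layer, and softmax over the vocabulary. A minimal forward-pass sketch in numpy; the vocabulary size, context length, and layer widths are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, block_size, emb_dim, hidden = 27, 3, 8, 32  # illustrative sizes

# parameters: embedding table, hidden layer, output layer
C  = rng.normal(size=(vocab_size, emb_dim))
W1 = rng.normal(size=(block_size * emb_dim, hidden)) * 0.1
b1 = np.zeros(hidden)
W2 = rng.normal(size=(hidden, vocab_size)) * 0.1
b2 = np.zeros(vocab_size)

def forward(idx):
    """idx: (batch, block_size) int array of context character indices."""
    emb = C[idx].reshape(idx.shape[0], -1)       # concatenate context embeddings
    h = np.tanh(emb @ W1 + b1)                   # hidden layer
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)  # numerically stable softmax
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

probs = forward(np.array([[1, 2, 3], [0, 0, 1]]))
print(probs.shape)  # (2, 27)
```

Training would backprop cross-entropy through these same matrices; only the forward pass is shown here.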
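The batchnorm step can be sketched as follows: normalize each hidden feature to zero mean and unit variance over the batch, then apply a learned scale and shift. This shows only the train-time path; an actual layer would also track running statistics for inference. Shapes and the `eps` value are illustrative.

```python
import numpy as np

def batchnorm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch dimension, then scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    xhat = (x - mean) / np.sqrt(var + eps)
    return gamma * xhat + beta

x = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=(64, 10))
y = batchnorm(x, gamma=np.ones(10), beta=np.zeros(10))
print(y.mean(axis=0).round(6))  # each feature is ~0-mean after normalization
```

The coupling of every example to the batch statistics is exactly the behavior the "Rethinking “Batch” in BatchNorm" reading item examines.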
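The RNN step can be sketched as a vanilla recurrent cell: at each time step the hidden state is updated from the current character and the previous hidden state, and logits over the vocabulary are read out. The one-hot encoding, sizes, and weight names here are illustrative assumptions, not the repo's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden = 27, 16  # illustrative sizes

Wxh = rng.normal(size=(vocab_size, hidden)) * 0.1  # input-to-hidden
Whh = rng.normal(size=(hidden, hidden)) * 0.1      # hidden-to-hidden (recurrence)
bh  = np.zeros(hidden)
Why = rng.normal(size=(hidden, vocab_size)) * 0.1  # hidden-to-output
by  = np.zeros(vocab_size)

def rnn_step(x_onehot, h):
    """One vanilla-RNN step: returns the new hidden state and next-char logits."""
    h = np.tanh(x_onehot @ Wxh + h @ Whh + bh)
    return h, h @ Why + by

h = np.zeros(hidden)
for idx in [1, 5, 12]:  # toy character-index sequence
    x = np.zeros(vocab_size)
    x[idx] = 1.0
    h, logits = rnn_step(x, h)
print(logits.shape)  # (27,)
```

Because the same `Whh` is applied at every step, gradients flow through repeated matrix products, which is where the usual vanishing/exploding-gradient issues of vanilla RNNs come from.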