Unsupervised 2 - Autoencoder
Introduction
Supervised learning: fit(X, Y) / train(X, Y), then predict(X)
Autoencoder: train(X, X), i.e. it learns to predict (reconstruct) its own input
- if x is real-valued, use squared error (as in regression)
- if x lies in [0, 1] (e.g. binary pixels), cross-entropy can still be used as the reconstruction loss
- use sigmoid for the output layer, since all outputs must be in [0, 1]
- sharing weights between encoder and decoder is one option
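As a concrete illustration of the two loss choices above, here is a minimal numpy sketch (the function names and example values are my own, not from the notes): squared error for real-valued x, and cross-entropy when x and its reconstruction lie in [0, 1].

```python
import numpy as np

def squared_error(x, x_hat):
    # Squared reconstruction error, for real-valued x (regression-style)
    return np.sum((x - x_hat) ** 2)

def binary_cross_entropy(x, x_hat, eps=1e-12):
    # Cross-entropy reconstruction loss; valid when x and x_hat are in [0, 1]
    x_hat = np.clip(x_hat, eps, 1 - eps)  # avoid log(0)
    return -np.sum(x * np.log(x_hat) + (1 - x) * np.log(1 - x_hat))

x = np.array([0.0, 1.0, 1.0])        # e.g. binary pixel values
x_hat = np.array([0.1, 0.9, 0.8])    # a reconstruction in [0, 1]
print(squared_error(x, x_hat))
print(binary_cross_entropy(x, x_hat))
```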
# Not shared weights
Z = X * W_h + b_h
X_hat = Z * W_o + b_o

# Shared weights
Z = X * W + b_h
X_hat = Z * W.T + b_o

PS:
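The two weight schemes above can be sketched with numpy as follows. This is a forward pass only (no training loop); the dimensions, random initialization, and the sigmoid nonlinearity (applied per the earlier note that outputs should be in [0, 1]) are my own assumptions.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(0)
D, M = 4, 2                       # input dim, hidden dim (assumed sizes)
X = rng.random((3, D))            # toy batch of 3 inputs in [0, 1]

# Not shared weights: separate encoder and decoder matrices
W_h = rng.normal(size=(D, M))
b_h = np.zeros(M)
W_o = rng.normal(size=(M, D))
b_o = np.zeros(D)
Z = sigmoid(X @ W_h + b_h)
X_hat = sigmoid(Z @ W_o + b_o)

# Shared (tied) weights: the decoder reuses the encoder's W, transposed
W = rng.normal(size=(D, M))
Z = sigmoid(X @ W + b_h)
X_hat_tied = sigmoid(Z @ W.T + b_o)

print(X_hat.shape, X_hat_tied.shape)
```

Tying the weights halves the parameter count, which itself acts as a mild regularizer.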
sigmoid = binary classification (each output is in [0, 1], but the outputs need not sum to 1)
softmax = multi-class classification (the outputs sum to 1)
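The sigmoid-vs-softmax distinction can be checked numerically; a small sketch (the logit values are arbitrary examples):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def softmax(a):
    e = np.exp(a - np.max(a))     # shift by max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, -1.0])
s = sigmoid(logits)               # each entry in (0, 1); sum unconstrained
p = softmax(logits)               # entries form a distribution, sum to 1
print(s.sum(), p.sum())
```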
Regularization = a way to improve generalization
- augment the dataset, e.g. rotate or shift images
- add Gaussian noise: train on X + epsilon (denoising autoencoder)
- randomly set some entries of X to 0 (masking noise)
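The two input-corruption tricks above can be sketched in numpy like this (the noise scale and masking rate are assumed values, not from the notes); in a denoising autoencoder the corrupted X is fed in while the clean X remains the reconstruction target.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.random((5, 8))            # toy batch of inputs

# Gaussian corruption: X + epsilon
noise_std = 0.1                   # assumed noise scale
X_noisy = X + rng.normal(0.0, noise_std, size=X.shape)

# Masking corruption: zero out a random fraction of entries
drop_prob = 0.3                   # assumed corruption rate
mask = rng.random(X.shape) >= drop_prob
X_masked = X * mask

# Train with corrupted input, clean target: train(X_corrupted, X)
print(X_noisy.shape, X_masked.shape)
```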