Unsupervised 2 - Autoencoder
Introduction
Supervised learning: fit(X, Y) / train(X, Y) need labels, then predict(X)
Autoencoder: train(X, X), i.e. the network learns to reconstruct its own input, so no labels are needed
if x is real-valued, use squared error as the reconstruction loss (a regression problem)
if x is in [0, 1] (e.g. normalized pixels), cross-entropy can still be used
in that case use a sigmoid output layer, since every output must lie in [0, 1]
sharing (tying) weights between encoder and decoder, W_dec = W_enc^T, is one common option
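The points above can be sketched as a one-hidden-layer autoencoder with tied weights, sigmoid outputs, and a cross-entropy reconstruction loss. This is a minimal illustration assuming NumPy; the data, shapes, learning rate, and iteration count are all hypothetical choices, not values from the notes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ce_loss(x, x_hat, eps=1e-9):
    # cross-entropy reconstruction loss for targets in [0, 1]
    return -np.mean(x * np.log(x_hat + eps) + (1 - x) * np.log(1 - x_hat + eps))

rng = np.random.default_rng(0)
# toy data in [0, 1] with low-rank structure (hypothetical shapes)
X = sigmoid(rng.normal(size=(100, 3)) @ rng.normal(size=(3, 20)))

n_in, n_hidden, lr = 20, 8, 0.1
W = rng.normal(0.0, 0.1, (n_in, n_hidden))  # encoder weights; decoder reuses W.T (shared)
b_h = np.zeros(n_hidden)
b_o = np.zeros(n_in)

def forward(X):
    H = sigmoid(X @ W + b_h)           # encode
    return H, sigmoid(H @ W.T + b_o)   # decode with the transposed (shared) weights

loss_before = ce_loss(X, forward(X)[1])
for _ in range(200):
    H, X_hat = forward(X)
    d_out = (X_hat - X) / len(X)           # gradient w.r.t. the pre-sigmoid output
    d_hid = (d_out @ W) * H * (1.0 - H)
    W -= lr * (X.T @ d_hid + d_out.T @ H)  # encoder + decoder contributions to the tied W
    b_h -= lr * d_hid.sum(axis=0)
    b_o -= lr * d_out.sum(axis=0)

loss_after = ce_loss(X, forward(X)[1])
print(loss_before, loss_after)  # reconstruction loss should decrease
```

Because the weights are tied, the gradient for W collects both the encoder term (X.T @ d_hid) and the decoder term (d_out.T @ H).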
PS:
sigmoid = binary classification (the sum over outputs need not be 1, but each output is always in [0, 1])
softmax = multi-class classification (outputs sum to 1)
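The sigmoid-vs-softmax distinction is easy to see numerically: applied to the same logits, sigmoid outputs are each in (0, 1) but their sum is unconstrained, while softmax outputs form a probability distribution. The logit values here are arbitrary examples.

```python
import numpy as np

z = np.array([2.0, 1.0, 0.1])        # arbitrary example logits

sig = 1.0 / (1.0 + np.exp(-z))       # each entry in (0, 1); sum is unconstrained
soft = np.exp(z) / np.exp(z).sum()   # entries in (0, 1) and sum to 1

print(sig.sum())    # generally != 1
print(soft.sum())   # ~1.0
```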
Regularization = a way to improve generalization
augment the dataset, e.g. rotate or shift images
add Gaussian noise to the input: X + epsilon (denoising autoencoder)
randomly set some entries of X to 0 (masking noise, also a denoising autoencoder)
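The two input-corruption schemes above can be sketched as follows; the network is then trained to map the corrupted input back to the clean X. The noise level and masking probability are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((4, 5))                       # toy batch in [0, 1]

# Gaussian corruption: X + epsilon (sigma = 0.1 is an arbitrary choice)
X_noisy = X + rng.normal(0.0, 0.1, X.shape)

# Masking corruption: zero out each entry independently with probability p
p = 0.3
mask = rng.random(X.shape) > p
X_masked = X * mask

# train(X_corrupted, X): the reconstruction target stays the clean input
```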