A concise cheat sheet for Keras, covering fundamental concepts, common layers, model building, training, and evaluation techniques for deep learning.

Sequential Model:
A linear stack of layers. Useful for simple, feed-forward networks.

model = keras.Sequential([
    layers.Dense(64, activation='relu', input_shape=(input_dim,)),
    layers.Dense(10, activation='softmax')
])
Adding Layers:
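As a minimal sketch of adding layers one at a time (the layer sizes and input shape here are arbitrary placeholders):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Start from an empty Sequential model and append layers with model.add().
model = keras.Sequential()
model.add(layers.Dense(32, activation='relu', input_shape=(16,)))
model.add(layers.Dense(10, activation='softmax'))
```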
Functional API:
A more flexible way to define models as graphs of layers.
Common Layers:
| Layer | Description |
|---|---|
| Dense | Fully connected layer. |
| Conv2D | 2D convolutional layer (for images). |
| MaxPooling2D | Max pooling layer. |
| LSTM | Long Short-Term Memory layer (for sequences). |
| Embedding | Embedding layer (for representing words as vectors). |
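As a sketch of how the sequence-oriented layers above combine (the vocabulary size and dimensions are illustrative placeholders, not recommendations):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Embedding maps integer word indices to dense vectors; LSTM consumes the sequence.
text_model = keras.Sequential([
    layers.Embedding(input_dim=10000, output_dim=64),  # vocabulary of 10,000 words
    layers.LSTM(32),                                   # final hidden state per sequence
    layers.Dense(1, activation='sigmoid'),             # e.g. binary classification
])
```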
Using the Sequential API:
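A minimal Sequential build, mirroring the snippet at the top of the sheet (`input_dim` is a placeholder for your feature count):

```python
from tensorflow import keras
from tensorflow.keras import layers

input_dim = 20  # placeholder feature count for this sketch
model = keras.Sequential([
    layers.Dense(64, activation='relu', input_shape=(input_dim,)),
    layers.Dense(10, activation='softmax'),
])
```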
Using the Functional API:
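The same two-layer model expressed with the Functional API; layers are called on tensors, forming a graph from inputs to outputs (the input shape of 20 is a placeholder):

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(20,))                      # symbolic input tensor
x = layers.Dense(64, activation='relu')(inputs)        # call layers on tensors
outputs = layers.Dense(10, activation='softmax')(x)
model = keras.Model(inputs=inputs, outputs=outputs)    # graph from inputs to outputs
```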
Compiling the Model:
Specifying the optimizer, loss function, and metrics.

Optimizers: 'adam', 'sgd', 'rmsprop', 'adagrad'
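A minimal compile call; the optimizer, loss, and metrics can be given as strings or as objects (the loss chosen here assumes integer class labels):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(10, activation='softmax', input_shape=(20,))])

# Configure the training setup: optimizer, loss function, and metrics to track.
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```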
Layer Examples:
| Layer | Description |
|---|---|
| Dense | A fully-connected layer with ReLU activation. |
| Conv2D | 2D convolutional layer for image processing. |
| MaxPooling2D | Max pooling layer to reduce spatial dimensions. |
| Dropout | Dropout layer to prevent overfitting. |
Training:
Training the model on the training data with model.fit().

Parameters: epochs, batch_size, validation_data
Callbacks: EarlyStopping, ModelCheckpoint, TensorBoard
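A minimal fit call with an EarlyStopping callback; the tiny random dataset exists only so the sketch runs quickly:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(2, activation='softmax', input_shape=(4,))])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

# Tiny random data purely for illustration.
x_train = np.random.rand(32, 4)
y_train = np.random.randint(0, 2, size=(32,))

early_stop = keras.callbacks.EarlyStopping(monitor='loss', patience=2)
history = model.fit(x_train, y_train,
                    epochs=2, batch_size=8,
                    callbacks=[early_stop], verbose=0)
```

`history.history` holds the per-epoch loss and metric values for inspection or plotting.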
Evaluation:
Evaluating the model on the test data with model.evaluate().
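A minimal evaluate call on placeholder test data; evaluate returns the loss followed by each compiled metric:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(2, activation='softmax', input_shape=(4,))])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Random stand-in for a held-out test set.
x_test = np.random.rand(16, 4)
y_test = np.random.randint(0, 2, size=(16,))

loss, accuracy = model.evaluate(x_test, y_test, verbose=0)
```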
Prediction:
Making predictions with the model using model.predict().
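A minimal predict call; for a softmax classifier the output is one row of class probabilities per sample (the shapes here are placeholders):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(3, activation='softmax', input_shape=(4,))])

x_new = np.random.rand(5, 4)
probs = model.predict(x_new, verbose=0)  # shape (5, 3): class probabilities per sample
classes = probs.argmax(axis=1)           # most likely class index per sample
```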
Regularization:
| Technique | Description |
|---|---|
| L1 Regularization | Adds a penalty equal to the absolute value of the magnitude of coefficients. |
| L2 Regularization | Adds a penalty equal to the square of the magnitude of coefficients. |
| Elastic Net Regularization | A combination of L1 and L2 regularization. |
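In Keras these penalties are attached to a layer's weights via `kernel_regularizer`; the 0.01 factors below are arbitrary illustrative strengths:

```python
from tensorflow.keras import layers, regularizers

# L1, L2, and combined L1+L2 (elastic-net-style) weight penalties.
dense_l1 = layers.Dense(64, kernel_regularizer=regularizers.l1(0.01))
dense_l2 = layers.Dense(64, kernel_regularizer=regularizers.l2(0.01))
dense_l1l2 = layers.Dense(64, kernel_regularizer=regularizers.l1_l2(l1=0.01, l2=0.01))
```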
Batch Normalization:
Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.
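A common placement is between a layer and its activation, as in this sketch (layer sizes are placeholders):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(64, input_shape=(20,)),
    layers.BatchNormalization(),   # normalize the previous layer's activations
    layers.Activation('relu'),     # apply the nonlinearity after normalization
    layers.Dense(10, activation='softmax'),
])
```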
Saving and Loading:
Saving the model: model.save()
Loading the model: keras.models.load_model()
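A round-trip sketch of the two calls above, writing to a temporary path (`.keras` is the native Keras file format):

```python
import os
import tempfile

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([layers.Dense(10, activation='softmax', input_shape=(20,))])

# Save the architecture and weights, then restore them from disk.
path = os.path.join(tempfile.mkdtemp(), 'model.keras')
model.save(path)
restored = keras.models.load_model(path)
```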