Keras Cheat Sheet
A concise cheat sheet for Keras, covering fundamental concepts, common layers, model building, training, and evaluation techniques for deep learning.
Core Concepts
Sequential Model
A linear stack of layers.
Adding layers: pass a list of layers to the `Sequential` constructor, or call `model.add(layer)` to append layers one at a time.
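A minimal sketch of building a Sequential model with `model.add()` (the layer sizes here are illustrative, not prescribed by the cheat sheet):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential()
model.add(keras.Input(shape=(16,)))                 # declare the input shape
model.add(layers.Dense(32, activation='relu'))      # hidden layer
model.add(layers.Dense(10, activation='softmax'))   # 10-class output
```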
Functional API
A more flexible way to define models as directed graphs of layers, supporting multiple inputs, multiple outputs, and shared layers.
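A minimal Functional API sketch (shapes and sizes are illustrative): create an `Input`, chain layer calls, and wrap the result in a `Model`.

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(16,))
x = layers.Dense(32, activation='relu')(inputs)     # each layer is called on a tensor
outputs = layers.Dense(10, activation='softmax')(x)
model = keras.Model(inputs=inputs, outputs=outputs)
```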
Layers
| Layer | Description |
| --- | --- |
| `Dense` | Fully connected layer. |
| `Conv2D` | 2D convolutional layer (for images). |
| `MaxPooling2D` | Max pooling layer. |
| `LSTM` | Long Short-Term Memory layer (for sequences). |
| `Embedding` | Embedding layer (for representing words as vectors). |
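The layers above can be instantiated directly from `tensorflow.keras.layers`; the argument values below are illustrative placeholders:

```python
from tensorflow.keras import layers

dense = layers.Dense(64)                                  # fully connected, 64 units
conv = layers.Conv2D(32, kernel_size=(3, 3))              # 32 filters, 3x3 kernel
pool = layers.MaxPooling2D(pool_size=(2, 2))              # 2x2 max pooling
lstm = layers.LSTM(128)                                   # recurrent layer for sequences
embed = layers.Embedding(input_dim=10000, output_dim=64)  # integer ids -> 64-d vectors
```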
Model Building
Defining the Model
Using the Sequential API: pass a list of layers to `keras.Sequential`, or add them one at a time with `model.add()`.
Using the Functional API: create an `Input`, chain layer calls, and wrap the result in `keras.Model(inputs, outputs)`.
Compiling the Model
Specifying the optimizer, loss function, and metrics.
Optimizers: `'sgd'`, `'rmsprop'`, `'adam'`, `'adagrad'` (passed as string names or as `keras.optimizers` instances).
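A minimal compile sketch (the model and hyperparameter choices are illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(16,)),
    layers.Dense(10, activation='softmax'),
])

# Optimizer, loss, and metrics must be set before training.
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```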
Common Layers
| Layer | Description |
| --- | --- |
| `Dense(units, activation='relu')` | A fully connected layer with ReLU activation. |
| `Conv2D(filters, kernel_size)` | 2D convolutional layer for image processing. |
| `MaxPooling2D(pool_size)` | Max pooling layer to reduce spatial dimensions. |
| `Dropout(rate)` | Dropout layer to prevent overfitting. |
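The common layers above are often stacked into a small convolutional classifier; a sketch with illustrative sizes:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                  # e.g. grayscale 28x28 images
    layers.Conv2D(16, (3, 3), activation='relu'),    # feature extraction
    layers.MaxPooling2D((2, 2)),                     # spatial downsampling
    layers.Flatten(),
    layers.Dropout(0.5),                             # regularization during training
    layers.Dense(64, activation='relu'),
    layers.Dense(10, activation='softmax'),          # 10-class output
])
```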
Training and Evaluation
Training the Model
Train the model on the training data with `model.fit(x, y, ...)`.
Parameters: `epochs`, `batch_size`, `validation_split` or `validation_data`, `callbacks`, `verbose`.
Callbacks: `EarlyStopping`, `ModelCheckpoint`, `TensorBoard`, `ReduceLROnPlateau`.
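A training sketch on random placeholder data (the data, model, and epoch count are illustrative only):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder data: 64 samples, 8 features, binary labels.
x = np.random.rand(64, 8).astype('float32')
y = np.random.randint(0, 2, size=(64,))

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Stop early if validation loss stops improving.
early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=2)

history = model.fit(x, y, epochs=3, batch_size=16,
                    validation_split=0.25, callbacks=[early_stop], verbose=0)
```

`fit` returns a `History` object whose `history` dict records the loss and metrics per epoch.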
Evaluating the Model
Evaluate on held-out test data with `model.evaluate(x_test, y_test)`, which returns the loss followed by any metrics specified at compile time.
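An evaluation sketch on random placeholder data (an untrained model is used here purely for illustration):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

x_test = np.random.rand(32, 8).astype('float32')
y_test = np.random.randint(0, 2, size=(32,))

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Returns [loss, accuracy] because one metric was compiled in.
loss, accuracy = model.evaluate(x_test, y_test, verbose=0)
```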
Prediction
Make predictions on new samples with `model.predict(x_new)`, which returns the model's outputs (e.g. class probabilities for a softmax output) as a NumPy array.
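A prediction sketch (the model and input shapes are illustrative):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(3, activation='softmax'),   # 3-class probabilities
])

x_new = np.random.rand(4, 8).astype('float32')
probs = model.predict(x_new, verbose=0)      # shape: (4, 3), rows sum to ~1
```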
Advanced Features
Regularization
| Regularizer | Description |
| --- | --- |
| L1 (`regularizers.l1`) | Adds a penalty equal to the absolute value of the magnitude of the weights. |
| L2 (`regularizers.l2`) | Adds a penalty equal to the square of the magnitude of the weights. |
| Elastic Net (`regularizers.l1_l2`) | A combination of L1 and L2 regularization. |
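Regularizers are attached to layer weights via `kernel_regularizer`; a sketch with illustrative penalty strengths:

```python
from tensorflow.keras import layers, regularizers

l1_layer = layers.Dense(32, kernel_regularizer=regularizers.l1(0.01))      # L1 penalty
l2_layer = layers.Dense(32, kernel_regularizer=regularizers.l2(0.01))      # L2 penalty
elastic = layers.Dense(32, kernel_regularizer=regularizers.l1_l2(l1=0.01,  # combined
                                                                 l2=0.01))
```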
Batch Normalization
Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that maintains the mean activation close to 0 and the activation standard deviation close to 1.
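`BatchNormalization` is typically inserted between a layer and its activation; a minimal sketch:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(16,)),
    layers.Dense(32),
    layers.BatchNormalization(),   # normalize activations per batch
    layers.Activation('relu'),     # apply the nonlinearity after normalization
])
```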
Saving and Loading Models
Saving the model: `model.save('model.keras')` stores the architecture, weights, and (for a compiled model) the optimizer state.
Loading the model: `keras.models.load_model('model.keras')` restores it.
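A save/load round-trip sketch, written to a temporary directory so it is self-contained (the `.keras` format assumes a recent Keras version):

```python
import os
import tempfile

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(2),
])

path = os.path.join(tempfile.mkdtemp(), 'my_model.keras')
model.save(path)                           # architecture + weights
restored = keras.models.load_model(path)   # rebuilds the same model
```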