TensorFlow Cheat Sheet
A quick reference guide to TensorFlow, covering its core concepts, common operations, and essential functions for building and training machine learning models.
Core Concepts
Tensors
Definition: Multidimensional arrays representing data. Key Properties:
- shape: the size of each dimension, e.g. (2, 3)
- rank: the number of dimensions (a scalar has rank 0, a matrix rank 2)
- dtype: the element data type, e.g. float32, int32, string
Creating Tensors:
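A minimal sketch of the common constructors (TensorFlow 2.x assumed; the variable names are illustrative):

```python
import numpy as np
import tensorflow as tf

# Constant tensor from a Python list
a = tf.constant([[1, 2], [3, 4]])

# Tensors filled with zeros or ones
z = tf.zeros((2, 3))
o = tf.ones((3,))

# Random tensor with values drawn from a normal distribution
r = tf.random.normal((2, 2))

# Convert a NumPy array to a tensor
n = tf.convert_to_tensor(np.arange(6).reshape(2, 3))
```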
Tensor Operations:
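A few of the most common element-wise, matrix, and reduction operations (sketch; TensorFlow 2.x assumed):

```python
import tensorflow as tf

x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
y = tf.constant([[5.0, 6.0], [7.0, 8.0]])

s = tf.add(x, y)          # element-wise addition (same as x + y)
p = tf.multiply(x, y)     # element-wise multiplication (same as x * y)
m = tf.matmul(x, y)       # matrix multiplication
t = tf.transpose(x)
rs = tf.reshape(x, (4, 1))
total = tf.reduce_sum(x)  # sum of all elements
mean = tf.reduce_mean(x, axis=0)  # column-wise mean
```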
Variables
Definition: Tensors whose values can be modified during computation; used to store model parameters such as weights and biases. Initialization:
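A short sketch of `tf.Variable` initialization (TensorFlow 2.x assumed):

```python
import tensorflow as tf

# A variable holding a scalar
v = tf.Variable(0.0)

# A variable holding a randomly initialized weight matrix
w = tf.Variable(tf.random.normal((3, 2)), name="weights")

# A non-trainable variable, e.g. a step counter
step = tf.Variable(0, trainable=False, dtype=tf.int64)
```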
Updating Variables:
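Variables are updated in place with the `assign` family of methods (sketch, TensorFlow 2.x):

```python
import tensorflow as tf

v = tf.Variable(1.0)
v.assign(5.0)       # overwrite the value
v.assign_add(2.0)   # in-place add  -> 7.0
v.assign_sub(3.0)   # in-place subtract -> 4.0
```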
Graphs and Sessions (TensorFlow 1.x)
Note: This section refers to TensorFlow 1.x. TensorFlow 2.x uses eager execution by default. Graph: A computational graph representing the model structure. Session: The runtime environment in which graph operations are executed and tensors evaluated.
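A sketch of the legacy graph-and-session pattern, written against the `tf.compat.v1` compatibility module so it also runs under TensorFlow 2.x; new code should use eager execution instead:

```python
import tensorflow as tf

# Build a graph: no computation happens yet
g = tf.Graph()
with g.as_default():
    a = tf.constant(2)
    b = tf.constant(3)
    c = tf.add(a, b)

# A session executes operations in the graph
with tf.compat.v1.Session(graph=g) as sess:
    result = sess.run(c)  # evaluates c
```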
Building Models
Layers
Definition: Building blocks of neural networks. Perform specific computations on input tensors. Common Layers:
- Dense: fully connected layer
- Conv2D: 2-D convolution, typically for image data
- MaxPooling2D: spatial downsampling
- Flatten: collapses the input to a 1-D vector
- Dropout: randomly zeroes activations to reduce overfitting
- LSTM / GRU: recurrent layers for sequence data
Example:
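A minimal sketch of applying layers to tensors (TensorFlow 2.x Keras API; shapes are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers

# A Dense layer: 32 units with ReLU activation
dense = layers.Dense(32, activation="relu")
x = tf.random.normal((4, 16))   # batch of 4 samples, 16 features each
y = dense(x)                    # output shape: (4, 32)

# A Conv2D layer applied to a batch of 8x8 single-channel "images"
conv = layers.Conv2D(filters=8, kernel_size=3, padding="same")
img = tf.random.normal((4, 8, 8, 1))
out = conv(img)                 # "same" padding preserves height/width
```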
Models
Sequential Model: A linear stack of layers.
Functional API: A more flexible way to define complex models with shared layers and multiple inputs/outputs.
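Both styles, sketched for the same small binary classifier (TensorFlow 2.x Keras; layer sizes are illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sequential: a linear stack of layers
seq_model = keras.Sequential([
    keras.Input(shape=(10,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# Functional API: explicit input/output tensors allow branching,
# shared layers, and multiple inputs/outputs
inputs = keras.Input(shape=(10,))
h = layers.Dense(64, activation="relu")(inputs)
outputs = layers.Dense(1, activation="sigmoid")(h)
func_model = keras.Model(inputs=inputs, outputs=outputs)
```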
Loss Functions
Definition: Measures the difference between predicted and actual values. Common Loss Functions:
- MeanSquaredError: regression
- BinaryCrossentropy: binary classification
- CategoricalCrossentropy: multi-class classification with one-hot labels
- SparseCategoricalCrossentropy: multi-class classification with integer labels
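Loss objects are callables taking `(y_true, y_pred)`; a quick sketch with mean squared error (values are illustrative):

```python
import tensorflow as tf

mse = tf.keras.losses.MeanSquaredError()
y_true = tf.constant([1.0, 2.0, 3.0])
y_pred = tf.constant([1.0, 2.0, 5.0])

# Mean of squared errors: (0 + 0 + 4) / 3
loss = mse(y_true, y_pred)
```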
Optimizers
Definition: Algorithms for updating model parameters to minimize the loss function. Common Optimizers:
- SGD: stochastic gradient descent, optionally with momentum
- Adam: adaptive per-parameter learning rates; a common default
- RMSprop: adaptive learning rates, often used for recurrent networks
- Adagrad: per-parameter rates that shrink with accumulated gradients
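A sketch of one manual optimization step, showing how an optimizer consumes gradients from `tf.GradientTape` (TensorFlow 2.x; the function being minimized is illustrative):

```python
import tensorflow as tf

opt = tf.keras.optimizers.SGD(learning_rate=0.1)
v = tf.Variable(2.0)

# Minimize f(v) = v^2; one SGD step computes v <- v - lr * df/dv
with tf.GradientTape() as tape:
    loss = v * v
grads = tape.gradient(loss, [v])
opt.apply_gradients(zip(grads, [v]))
# v is now 2.0 - 0.1 * 4.0 = 1.6
```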
Training and Evaluation
Model Compilation
Purpose: Configures the model for training.
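A sketch of `model.compile` wiring together an optimizer, a loss, and metrics (TensorFlow 2.x Keras; the architecture is illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

model.compile(
    optimizer="adam",               # string shorthand or an optimizer instance
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
```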
Model Training
Purpose: Trains the model using the training data.
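A minimal `model.fit` sketch on a tiny synthetic dataset (TensorFlow 2.x Keras; data, architecture, and hyperparameters are illustrative):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy problem: predict whether the feature sum is positive
x_train = np.random.randn(64, 4).astype("float32")
y_train = (x_train.sum(axis=1) > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

history = model.fit(
    x_train, y_train,
    epochs=2,                # passes over the training data
    batch_size=16,
    validation_split=0.25,   # hold out 25% of the data for validation
    verbose=0,
)
# history.history maps metric names to per-epoch values
```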
Model Evaluation
Purpose: Evaluates the model’s performance on the test data.
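A sketch of `model.evaluate` (TensorFlow 2.x Keras; the model and data are illustrative, so the scores carry no meaning here):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

x_test = np.random.randn(32, 4).astype("float32")
y_test = (x_test.sum(axis=1) > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Returns [loss, accuracy] because one metric was configured
loss, acc = model.evaluate(x_test, y_test, verbose=0)
```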
Prediction
Purpose: Generates predictions on new data.
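A sketch of `model.predict` (TensorFlow 2.x Keras; an untrained illustrative model, so the outputs are arbitrary sigmoid values):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(1, activation="sigmoid"),
])

x_new = np.random.randn(5, 4).astype("float32")
preds = model.predict(x_new, verbose=0)  # one prediction per input row
```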
Saving and Loading Models
Saving Models
Purpose: Persists the trained model to disk.
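A sketch of `model.save` using the native Keras format; the filename is illustrative, and older TensorFlow releases also accept an HDF5 path (`"model.h5"`) or a SavedModel directory:

```python
import os
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(1),
])

# Native Keras format (TF >= 2.12): architecture, weights, and any
# training configuration are written to a single file
model.save("my_model.keras")
saved = os.path.exists("my_model.keras")
```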
Loading Models
Purpose: Restores a saved model from disk.
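A sketch of `keras.models.load_model`; a small model is saved first so the example is self-contained (filename and architecture are illustrative):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Create and save a small model to restore
model = keras.Sequential([keras.Input(shape=(4,)), layers.Dense(1)])
model.save("my_model.keras")

# Restore it: architecture, weights, and (if compiled) the training
# configuration are all recovered
restored = keras.models.load_model("my_model.keras")

# The restored model produces the same outputs as the original
x = np.random.randn(3, 4).astype("float32")
same = np.allclose(model.predict(x, verbose=0),
                   restored.predict(x, verbose=0))
```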
Callbacks
Purpose: Automate tasks during training (e.g., saving checkpoints, stopping early). Common Callbacks:
- EarlyStopping: stops training when a monitored metric stops improving
- ModelCheckpoint: saves the model (or its weights) during training
- TensorBoard: writes logs for visualization in TensorBoard
- ReduceLROnPlateau: lowers the learning rate when progress stalls
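A sketch of configuring callbacks (TensorFlow 2.x Keras; the monitored metric, patience, filename, and log directory are illustrative):

```python
from tensorflow import keras

callbacks = [
    # Stop training once validation loss stops improving for 3 epochs
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=3),
    # Keep only the best model seen so far
    keras.callbacks.ModelCheckpoint("best.keras", save_best_only=True),
    # Log metrics for TensorBoard visualization
    keras.callbacks.TensorBoard(log_dir="./logs"),
]
# Passed to training as: model.fit(x, y, callbacks=callbacks, ...)
```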