What is an epoch in the context of training model parameters?
Tuesday, 06 May 2025 by Carie Hughes
In the context of training model parameters within machine learning, an epoch is a fundamental concept that refers to one complete pass through the entire training dataset. During this pass, the learning algorithm processes every example in the dataset and updates the model's parameters. Completing multiple epochs is important for the model to learn from the data.
- Published in Artificial Intelligence, EITC/AI/GCML Google Cloud Machine Learning, First steps in Machine Learning, Plain and simple estimators
Tagged under: Artificial Intelligence, Epoch, Machine Learning, Model Training, Neural Networks, Optimization Algorithms
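The epoch concept above can be sketched in a few lines. This is a minimal, hypothetical example (the linear model, learning rate, and epoch count are illustrative assumptions, not from the answer): each epoch is one full pass in which every training example is seen once and the parameter is updated.

```python
# Minimal sketch: one epoch = one complete pass over the training data.
# Hypothetical linear model y = w * x trained with per-example gradient descent.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs; underlying w is 2.0

w = 0.0          # model parameter to be learned
lr = 0.05        # learning rate (illustrative choice)
num_epochs = 50  # number of complete passes over the dataset

for epoch in range(num_epochs):
    for x, y in data:              # one epoch: every example processed once
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad             # parameter update

print(round(w, 3))  # -> 2.0: w converges toward the true slope over many epochs
```

Running more epochs gives the algorithm more passes over the same data, which is why too few epochs underfit while too many can overfit.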
Are batch size, epoch and dataset size all hyperparameters?
Thursday, 07 March 2024 by José da Cruz
Batch size, epoch, and dataset size are indeed important aspects of machine learning, and the first two are commonly treated as hyperparameters. To understand this, let's consider each term individually. Batch size: the batch size is a hyperparameter that defines the number of samples processed before the model's weights are updated during training. It plays a key role in both training speed and the stability of the gradient estimates.
- Published in Artificial Intelligence, EITC/AI/GCML Google Cloud Machine Learning, First steps in Machine Learning, The 7 steps of machine learning
Tagged under: Artificial Intelligence, Batch Size, Dataset Size, Epoch, Hyperparameters, Machine Learning
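The relationship between these three quantities can be made concrete with a short calculation. This is a sketch with assumed example values (the dataset size, batch size, and epoch count are hypothetical): together they determine how many weight updates occur over a whole training run.

```python
import math

# Sketch with illustrative numbers: how batch size, number of epochs, and
# dataset size together determine the total number of weight updates.

dataset_size = 1000  # examples in the training set (a property of the data)
batch_size = 32      # hyperparameter: samples processed per weight update
num_epochs = 10      # hyperparameter: complete passes over the dataset

# Updates within one epoch: the last batch may be smaller, hence the ceiling.
steps_per_epoch = math.ceil(dataset_size / batch_size)
total_updates = steps_per_epoch * num_epochs

print(steps_per_epoch, total_updates)  # -> 32 steps per epoch, 320 updates total
```

Note the distinction the answer draws: batch size and epoch count are choices the practitioner tunes, whereas dataset size is usually a fixed property of the available data rather than a freely tunable hyperparameter.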