MICCAI Educational Challenge
For starters, the RAM available on your machine or on a cloud solution such as Google
Colab is the main limiter. These pre-trained architectures load the images as training proceeds,
and RAM usage mounts considerably as each added image is held in memory and triggers a
weight update. Training crashes the moment the number of images demands more memory
than the remaining RAM can provide.
The solution is batching. With batching, the CNN processes one batch of images at a time
and then adapts its weights. The weights of each CNN node update only once per 32-image
batch, rather than once per image. This reduces the CNN's training time and memory footprint
without compromising what the network learns.
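As a rough sketch of the arithmetic behind this (the dataset size of 320 images and the batch size of 32 are illustrative numbers, not taken from the challenge data):

```python
# Illustrative numbers: 320 training images, batches of 32
num_images = 320
batch_size = 32

# Without batching: one weight update per image per epoch
updates_without_batching = num_images

# With batching: one weight update per batch per epoch
updates_with_batching = num_images // batch_size

print(updates_without_batching, updates_with_batching)  # 320 10
```

Each update still averages the gradient over all 32 images in the batch, so the network sees every image; it simply applies the corrections in larger, less frequent steps.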
For vanilla examples of the Keras data loaders (batching) or the PyTorch dataloader, you can
access their documentation at these links: https://keras.io/api/data_loading/#timeseries-data-loading
and https://pytorch.org/tutorials/recipes/recipes/loading_data_recipe.html. To label the
datasets, each split folder should contain one sub-folder per class holding that class's images. For
instance, the training folder would contain four folders, each titled with the name of a color and
containing the corresponding images.
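The class-per-folder layout could be sketched like this (the color names and the temporary path are illustrative, following the example above; real image files would go inside each class folder):

```python
import os
import tempfile

# Build the expected layout: one sub-folder per class inside the training folder
train_dir = os.path.join(tempfile.mkdtemp(), "train")
classes = ["blue", "green", "red", "yellow"]  # illustrative color names
for cls in classes:
    os.makedirs(os.path.join(train_dir, cls))  # images of this class go here

# Loaders such as Keras' image_dataset_from_directory infer the labels
# from these sub-folder names (sorted alphabetically by default)
print(sorted(os.listdir(train_dir)))  # ['blue', 'green', 'red', 'yellow']
```

With this structure in place, the dataloader assigns each image the label of the folder it sits in, so no separate label file is needed.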
import tensorflow as tf

# Load a pre-trained backbone without its classifier head (MobileNetV2 as one example)
base_model = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg")

# Freeze the base model's layers & establish the number of classes
base_model.trainable = False
num_classes = 4

# Attach a new classification head for the four classes
predictions = tf.keras.layers.Dense(num_classes, activation="softmax")(base_model.output)
model = tf.keras.models.Model(inputs=base_model.input,
                              outputs=predictions)