
Transfer learning (Example)

import tensorflow as tf
from tensorflow.keras import layers, models
import tensorflow_hub as hub

Load dataset

URL = 'https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip'
path_to_zip = tf.keras.utils.get_file('cats_and_dogs.zip', origin=URL, extract=True)
PATH = path_to_zip.replace('cats_and_dogs.zip', 'cats_and_dogs_filtered')

train_dir = PATH + '/train'
val_dir = PATH + '/validation'

Create datasets

train_ds = tf.keras.utils.image_dataset_from_directory(train_dir, image_size=(224, 224), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(val_dir, image_size=(224, 224), batch_size=32)

Load pre-trained model from TensorFlow Hub

feature_extractor_url = "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4"
feature_extractor_layer = hub.KerasLayer(feature_extractor_url, input_shape=(224,224,3), trainable=False)

Build model

model = models.Sequential([
    layers.Rescaling(1./255),  # scale pixels to [0, 1], which the hub model expects
    feature_extractor_layer,
    layers.Dense(2, activation='softmax')  # 2 classes: cat and dog
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

Train model

model.fit(train_ds, validation_data=val_ds, epochs=5)
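After training, the model can be used for prediction with `model.predict`. A minimal sketch of that step (using a hypothetical stand-in classifier with the same input and output shapes, so the snippet runs on its own; in practice you would call `predict` on the trained model above):

```python
import numpy as np
import tensorflow as tf

# Stand-in classifier with the same shapes as the transfer-learning model
# above (hypothetical; substitute your trained model here).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(224, 224, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation='softmax'),
])

batch = np.random.rand(4, 224, 224, 3).astype('float32')  # a batch of 4 images
probs = model.predict(batch)      # per-class probabilities, shape (4, 2)
preds = np.argmax(probs, axis=1)  # class indices (directory names in alphabetical order)
```

With `image_dataset_from_directory`, class indices follow the alphabetical order of the subdirectory names, so here 0 is "cats" and 1 is "dogs".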

Key Concepts

  • Feature Extraction: Use the pre-trained model as a fixed feature extractor.
  • Fine-Tuning: Optionally unfreeze some layers and retrain them on your data.
  • Efficiency: You don’t need a huge dataset—transfer learning works well with limited data.
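The fine-tuning step in the list above boils down to flipping a layer's `trainable` flag and recompiling with a low learning rate. A sketch of the pattern, using a tiny stand-in base model so it runs on its own (in the example above, you would set `feature_extractor_layer.trainable = True` instead):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Tiny stand-in "base" playing the role of the pre-trained feature extractor
# (hypothetical; substitute the real extractor in practice).
base = models.Sequential([tf.keras.Input(shape=(4,)),
                          layers.Dense(8, activation='relu')])
base.trainable = False  # feature extraction: base weights stay frozen

model = models.Sequential([base, layers.Dense(2, activation='softmax')])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
_ = model(tf.zeros((1, 4)))  # forward pass to make sure all weights are built

n_frozen = len(model.trainable_variables)  # only the head's kernel and bias

base.trainable = True  # fine-tuning: unfreeze the base
# Recompile after changing `trainable`, and use a low learning rate so the
# pre-trained weights are only nudged, not overwritten.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss='sparse_categorical_crossentropy')
n_unfrozen = len(model.trainable_variables)  # now head + base weights
```

Recompiling after toggling `trainable` matters: the change only takes effect in training once `compile` is called again.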