Question - How can I use Keras with datasets that don’t fit in memory?
Answer -
You should use the tf.data API to create tf.data.Dataset objects — an abstraction over a data pipeline that can pull data from local disk, a distributed file system, GCS, and other sources, and efficiently apply transformations such as shuffling, batching, and prefetching.
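A minimal sketch of such a pipeline, using hypothetical in-memory NumPy arrays to stand in for data that would normally be streamed from disk; the shapes and batch size are illustrative choices, not anything prescribed by the API:

```python
import numpy as np
import tensorflow as tf

# Hypothetical arrays standing in for data streamed from storage.
features = np.random.rand(1000, 8).astype("float32")
labels = np.random.randint(0, 2, size=(1000,))

# Build a Dataset and chain transformations: shuffle, batch, prefetch.
dataset = (
    tf.data.Dataset.from_tensor_slices((features, labels))
    .shuffle(buffer_size=1000)
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)
)

for batch_features, batch_labels in dataset.take(1):
    print(batch_features.shape)  # (32, 8)
```

Each transformation returns a new Dataset, so arbitrary pipelines can be composed by chaining; prefetch(tf.data.AUTOTUNE) lets the pipeline load the next batch while the current one is being consumed.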
For instance, the utility tf.keras.preprocessing.image_dataset_from_directory will create a dataset that reads image data from a local directory. Likewise, the utility tf.keras.preprocessing.text_dataset_from_directory will create a dataset that reads text files from a local directory. In recent TensorFlow releases, these utilities are also exposed as tf.keras.utils.image_dataset_from_directory and tf.keras.utils.text_dataset_from_directory.
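A self-contained sketch of the text variant, which builds a throwaway class-per-subdirectory layout (the structure these utilities expect) in a temporary directory; it assumes a recent TensorFlow where the utility is exposed under tf.keras.utils, and the class names and file contents are made up for illustration:

```python
import pathlib
import tempfile
import tensorflow as tf

# Hypothetical layout: root/pos/*.txt and root/neg/*.txt, one class per subdirectory.
root = pathlib.Path(tempfile.mkdtemp())
for label in ("pos", "neg"):
    (root / label).mkdir()
    for i in range(4):
        (root / label / f"{i}.txt").write_text(f"sample {label} {i}")

# Labels are inferred from the subdirectory names.
dataset = tf.keras.utils.text_dataset_from_directory(
    str(root), batch_size=2, shuffle=False
)
print(dataset.class_names)  # ['neg', 'pos'] (sorted alphabetically)
```

The returned object is an ordinary tf.data.Dataset of (text batch, label batch) pairs, so further transformations can be chained onto it before training.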
Dataset objects can be directly passed to fit(), or can be iterated over in a custom low-level training loop.
model.fit(dataset, epochs=10, validation_data=val_dataset)
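For the custom low-level route, a Dataset can be iterated directly inside a training loop. A minimal sketch, using a hypothetical one-layer model and synthetic data purely for illustration:

```python
import numpy as np
import tensorflow as tf

# Hypothetical synthetic data and tiny model, for illustration only.
x = np.random.rand(64, 4).astype("float32")
y = np.random.randint(0, 2, size=(64, 1)).astype("float32")
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(16)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
loss_fn = tf.keras.losses.BinaryCrossentropy()
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

# Custom low-level loop: iterate over the Dataset batch by batch.
for epoch in range(2):
    for batch_x, batch_y in dataset:
        with tf.GradientTape() as tape:
            loss = loss_fn(batch_y, model(batch_x, training=True))
        grads = tape.gradient(loss, model.trainable_weights)
        optimizer.apply_gradients(zip(grads, model.trainable_weights))
```

The same Dataset object works in both styles, so you can start with fit() and drop down to a loop like this only if you need step-level control.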