kipoi.readers

Readers useful for creating new dataloaders

  • HDF5Reader

HDF5Reader

HDF5Reader(self, file_path)

Read the HDF5 file. Convenience wrapper around h5py.File

Arguments

  • file_path: File path to an HDF5 file
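
A minimal usage sketch, assuming a local HDF5 file at data.h5 (hypothetical path):

    from kipoi.readers import HDF5Reader

    reader = HDF5Reader("data.h5")
    reader.open()                    # opens the underlying h5py.File
    data = reader.load_all()         # or reader.ls(), reader.batch_iter(), ...
    reader.close()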

ls

HDF5Reader.ls(self)

Recursively list the arrays
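
For example (the return value is only printed here, since its exact structure is not documented in this section):

    from kipoi.readers import HDF5Reader

    reader = HDF5Reader("data.h5")   # hypothetical file
    reader.open()
    print(reader.ls())               # recursively lists the arrays in the file
    reader.close()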

load_all

HDF5Reader.load_all(self, unflatten=True)

Load the whole file

Arguments

  • unflatten: if True, nest/unflatten the keys. For example, an entry f['/foo/bar'] would then be accessed with two nested get calls: f['foo']['bar']
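
A sketch of the difference, assuming a hypothetical file data.h5 that contains a dataset /foo/bar; the flat-key form for unflatten=False is an assumption and may differ in detail:

    from kipoi.readers import HDF5Reader

    reader = HDF5Reader("data.h5")   # hypothetical file containing /foo/bar
    reader.open()

    nested = reader.load_all(unflatten=True)
    arr = nested['foo']['bar']       # keys are nested into dictionaries

    flat = reader.load_all(unflatten=False)
    # with unflatten=False the keys are assumed to stay flat
    # (e.g. something like 'foo/bar'); check reader.ls() for the exact names
    print(list(flat.keys()))

    reader.close()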

batch_iter

HDF5Reader.batch_iter(self, batch_size=16, **kwargs)

Create a batch iterator over the whole file

Arguments

  • batch_size: batch size
  • **kwargs: ignored; accepted only for API consistency with other dataloaders
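
A sketch of streaming the file in batches, assuming a hypothetical data.h5:

    from kipoi.readers import HDF5Reader

    reader = HDF5Reader("data.h5")
    reader.open()
    for i, batch in enumerate(reader.batch_iter(batch_size=32)):
        # each batch mirrors the (possibly nested) structure of the file,
        # with arrays sliced along the first axis into chunks of up to 32 rows
        print(i, type(batch))
    reader.close()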

open

HDF5Reader.open(self)

Open the file

close

HDF5Reader.close(self)

Close the file
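
open and close manage the underlying file handle. A sketch of the typical lifecycle, assuming a hypothetical data.h5:

    from kipoi.readers import HDF5Reader

    reader = HDF5Reader("data.h5")
    reader.open()                    # acquire the file handle
    try:
        data = reader.load_all()
    finally:
        reader.close()               # release the handle even if loading fails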

load

HDF5Reader.load(file_path, unflatten=True)

Load the data all at once (classmethod).

Arguments

  • file_path: HDF5 file path
  • unflatten: see load_all
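
This is a shortcut for open, load_all, and close in a single call; for example (hypothetical path):

    from kipoi.readers import HDF5Reader

    data = HDF5Reader.load("data.h5", unflatten=True)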

ZarrReader

ZarrReader(self, file_path)

Read the Zarr file. Convenience wrapper around zarr.group

Arguments

  • file_path: File path to a Zarr file
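
ZarrReader mirrors the HDF5Reader API. A minimal sketch, assuming a Zarr store at data.zarr (hypothetical path):

    from kipoi.readers import ZarrReader

    reader = ZarrReader("data.zarr")
    reader.open()                    # opens the underlying Zarr group
    print(reader.ls())               # recursively lists the arrays
    reader.close()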

ls

ZarrReader.ls(self)

Recursively list the arrays

load_all

ZarrReader.load_all(self, unflatten=True)

Load the whole file

Arguments

  • unflatten: if True, nest/unflatten the keys. For example, an entry f['/foo/bar'] would then be accessed with two nested get calls: f['foo']['bar']

batch_iter

ZarrReader.batch_iter(self, batch_size=16, **kwargs)

Create a batch iterator over the whole file

Arguments

  • batch_size: batch size
  • **kwargs: ignored; accepted only for API consistency with other dataloaders

open

ZarrReader.open(self)

Open the file

close

ZarrReader.close(self)

Close the file

load

ZarrReader.load(file_path, unflatten=True)

Load the data all at once (classmethod).

Arguments

  • file_path: Zarr file path
  • unflatten: see load_all
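
As with HDF5Reader, load combines open, load_all, and close, while batch_iter streams the store in chunks. A sketch, assuming a hypothetical data.zarr:

    from kipoi.readers import ZarrReader

    # load everything at once
    data = ZarrReader.load("data.zarr", unflatten=True)

    # or stream batches instead of loading everything into memory
    reader = ZarrReader("data.zarr")
    reader.open()
    for i, batch in enumerate(reader.batch_iter(batch_size=64)):
        print(i, type(batch))
    reader.close()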