BaseDataset#

class BaseDataset(filter_inputs, array_dictionary)[source]#

Basic implementation of a module that constructs a tf.data.Dataset from a dictionary of numpy arrays describing the digital phantom.

Methods:

__call__([batchsize, prefetch])

Returns a nested tf.data.Dataset, where the inner dataset represents the digital phantom per image.

Attributes:

filter_indices

Indices to obtain the filtered arrays from the original inputs

map_names

Names of simulation quantities passed as dictionary keys on construction

set_size

Number of anatomies (0th axis) of the passed simulation quantities

__call__(batchsize=1000, prefetch=5)[source]#
Returns a nested tf.data.Dataset, where the inner dataset represents the digital phantom per image. These datasets are batched and prefetched, and each yielded batch is a dictionary like {'M0': tf.Tensor(...), ...}. The keys of the dictionary are strings, while the values are Tensors, each representing a batch of iso-chromates/grid-positions of the flattened datasets.

The returned dataset is intended to be iterated as:

for batch in dataset(batchsize=500):
    # batch = {'magnetization': ..., 'trajectories': ..., 'T1': ..., ...}
    ...
Parameters:
  • batchsize (int) – batch size

  • prefetch (int) – number of batches to prefetch

Return type:

(int, DatasetV2)

Returns:

tf.data.Dataset
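The batching pattern described above can be sketched without TensorFlow. The following is a minimal, illustrative numpy sketch of how a dictionary of flattened phantom arrays is split into dictionary batches; the helper `iterate_batches` and the example quantities are hypothetical, not part of the BaseDataset API:

```python
import numpy as np

def iterate_batches(arrays, batchsize=1000):
    """Yield dictionary batches from a dict of flattened numpy arrays,
    mimicking the {'M0': tf.Tensor(...), ...} batches produced by __call__."""
    n = next(iter(arrays.values())).shape[0]
    for start in range(0, n, batchsize):
        # Each batch keeps the same keys; values are slices along axis 0
        yield {name: arr[start:start + batchsize] for name, arr in arrays.items()}

# Hypothetical flattened phantom: 2500 iso-chromates/grid-positions
phantom = {"M0": np.ones((2500,)), "T1": np.full((2500,), 0.8)}
batches = list(iterate_batches(phantom, batchsize=1000))
```

With 2500 positions and a batch size of 1000, this yields three batches of sizes 1000, 1000, and 500, each a dictionary keyed by quantity name.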

filter_indices: ndarray#

Indices to obtain the filtered arrays from the original inputs
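As an illustration of what such indices look like, here is a hypothetical numpy sketch: a filter keeps only positions with non-zero proton density, and the stored indices recover the filtered array from the original input via fancy indexing. The filter criterion shown is an assumption for the example only:

```python
import numpy as np

# Hypothetical original input quantity (e.g. proton density per grid position)
m0 = np.array([0.0, 1.0, 0.0, 0.9, 1.1])

# Indices of the positions that pass the (assumed) filter
filter_indices = np.flatnonzero(m0 > 0)

# Filtered array obtained from the original input
filtered_m0 = m0[filter_indices]
```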

map_names: Tuple = None#

Names of simulation quantities passed as dictionary keys on construction

set_size: int = None#

Number of anatomies (0th axis) of the passed simulation quantities
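A brief numpy sketch of how these two attributes relate to the input dictionary; the example arrays and the way the values are derived here are illustrative assumptions, not the class's actual constructor code:

```python
import numpy as np

# Hypothetical input dictionary of simulation quantities:
# 3 anatomies, each with 100 flattened grid positions
arrays = {"M0": np.ones((3, 100)), "T1": np.full((3, 100), 0.8)}

# map_names: the dictionary keys passed on construction
map_names = tuple(arrays.keys())

# set_size: number of anatomies, i.e. the length of the 0th axis
set_size = next(iter(arrays.values())).shape[0]
```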