Data manager#

file_io.data_manager.save(data, path, max_depth=2, max_length=100, compress_np=False, legacy_mode=False, legacy_params=None)#

Saves data using the data manager. This allows large, unbalanced data structures to be stored conveniently without memory spikes.

Parameters:
  • data – The data to be stored.

  • path – Location for data storage.

  • max_depth – The depth to which folders are created prior to storing data via pickle.

  • max_length – Maximum length of a sublist within the object to be saved.

  • compress_np – Specifies whether numpy arrays are to be compressed.

  • legacy_mode – Deprecated. Will be removed in a future version.

  • legacy_params – Deprecated. Will be removed in a future version.
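
As a minimal sketch of how compress_np might be used when the structure contains large numpy arrays (the path name, array shapes, and max_depth value below are illustrative, not prescribed by the API):

import numpy as np

import finn.file_io.data_manager as dm

#Nested structure holding large numpy arrays
data = [np.random.rand(1000, 64) for _ in range(8)]

#Create folders one level deep, then pickle the remaining substructures;
#compress_np requests that the numpy arrays be stored in compressed form
dm.save(data, "compressed_example", max_depth = 1, compress_np = True)

#Read the structure back
restored = dm.load("compressed_example")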

file_io.data_manager.load(path, legacy_mode=False)#

Loads data stored via the data_manager.

Parameters:
  • path – Location from which the data is to be read.

  • legacy_mode – Deprecated. Will be removed in a future version.

Returns:

The loaded data.

The following code example shows how to save and load data using the data manager.

import random
import numpy as np
import shutil

import finn.file_io.data_manager as dm

#Configure sample data
channel_count = 32
frequency = [random.randint(5, 50) for _ in range(channel_count)]
data_range = np.arange(0, 100)
frequency_sampling = 200

#Generate some sample data
epoch_count = 10
state_count = 2

raw_data = [[[None for _ in range(channel_count)] for _ in range(epoch_count)] for _ in range(state_count)]
for state_idx in range(state_count):
    for epoch_idx in range(epoch_count):
        for ch_idx in range(channel_count):
            genuine_signal = np.sin(2 * np.pi * frequency[ch_idx] * data_range / frequency_sampling)

            raw_data[state_idx][epoch_idx][ch_idx] = genuine_signal

#Save data
dm.save(raw_data, "test_file", max_depth = 2)

#Load data
loaded_data = dm.load("test_file")

if (np.asarray(loaded_data) == np.asarray(raw_data)).all():
    print("Data saved and loaded successfully.")
else:
    print("Error saving/loading data.")