Closed by a-t-0 1 year ago
Rename `redundancy.redunancy.py` to something like `redundancy_mechanism/redundancy_creation.py`.

Backup, in case you need to export a graph from Lava and are getting the monitors error:
```python
# pylint: disable=R0913
remove_monitors_from_get_degree(get_degree)
# pylint: disable=R0801
unique_hash = get_unique_hash(
    test_object.final_dead_neuron_names,
    has_adaptation,
    has_radiation,
    iteration,
    m,
    neuron_death_probability,
    seed,
    sim_time,
)
```
```python
from typing import Any, Generator, List

from typeguard import typechecked


@typechecked
def get_unique_hash(
    dead_neuron_names: List[str],
    has_adaptation: bool,
    has_radiation: bool,
    iteration: int,
    m: int,
    neuron_death_probability: float,
    seed: int,
    sim_time: int,
) -> int:
    """Compute a hash that uniquely identifies an experiment configuration.

    :param dead_neuron_names: Names of the neurons that have died.
    :param has_adaptation: Indicates whether the experiment simulates brain
        adaptation or not.
    :param has_radiation: Indicates whether the experiment simulates radiation
        or not.
    :param iteration: The initialisation iteration that is used.
    :param m: The number of approximation iterations used in the MDSA
        approximation.
    :param neuron_death_probability: The probability with which a neuron dies.
    :param seed: The value of the random seed used for this test.
    :param sim_time: Number of timesteps for which the experiment is run.
    """
    # pylint: disable=R0913
    # One could perform work to cluster the properties into different objects.
    # Then one could read out these separate objects and write them to file.
    if dead_neuron_names is None:
        dead_neuron_names = []
    # The set needs to be frozen, because otherwise it is mutable, and mutable
    # Python objects are not hashable.
    hash_set = frozenset(
        [
            frozenset(dead_neuron_names),
            has_adaptation,
            has_radiation,
            iteration,
            m,
            neuron_death_probability,
            seed,
            sim_time,
        ]
    )
    return hash(hash_set)
```
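The reason for the frozenset wrapper can be checked in isolation. A minimal sketch with hypothetical property values (the neuron names and numbers below are made up, not taken from an actual experiment):

```python
# Hypothetical property values, purely to illustrate the hashing behaviour.
names = ["lif_0", "lif_1"]  # assumed neuron names

# A plain set is mutable, hence unhashable: hash() raises a TypeError.
try:
    hash({frozenset(names), True, 5})
    raise AssertionError("expected a TypeError")
except TypeError:
    pass  # hash() rejects the mutable set, as expected

# Freezing the outer set (and the inner name list) makes it hashable:
digest = hash(frozenset([frozenset(names), True, False, 0, 1, 0.25, 42, 1000]))
print(isinstance(digest, int))  # True
```

Note that the inner list of names must also be frozen, because lists (like sets) are mutable and therefore cannot be members of any set.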
```python
@typechecked
def uniq(lst: List) -> Generator:
    """Yield the elements of lst, skipping adjacent duplicates.

    :param lst: A sorted list, so that duplicate elements are adjacent.
    """
    last = object()
    for item in lst:
        if item == last:
            continue
        yield item
        last = item
```
```python
@typechecked
def sort_and_deduplicate(some_list: List[Any]) -> List[Any]:
    """Sort some_list in descending order and remove its duplicates.

    :param some_list: The list to sort and deduplicate.
    """
    return list(uniq(sorted(some_list, reverse=True)))
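A quick, self-contained check of the two helpers above (definitions repeated, with the `@typechecked` decorator dropped, so the snippet runs on the stdlib alone): sorting in reverse makes duplicates adjacent, which is exactly what `uniq` assumes.

```python
from typing import Any, Generator, List


def uniq(lst: List) -> Generator:
    """Yield items of lst, skipping adjacent duplicates (assumes sorted input)."""
    last = object()
    for item in lst:
        if item == last:
            continue
        yield item
        last = item


def sort_and_deduplicate(some_list: List[Any]) -> List[Any]:
    """Sort descending, then drop the now-adjacent duplicates."""
    return list(uniq(sorted(some_list, reverse=True)))


print(sort_and_deduplicate([1, 3, 2, 3, 1]))  # [3, 2, 1]
```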
Done.
[x] Create an experiment setup/performance module, including results computation.
[x] Create an algorithm-to-SNN module, with a "to snn conversion.py" and a verify_results.py for the respective algorithm. Similarly, for the test folder, allow separate testing of the algorithm in a separate folder (allow testing the Neumann algorithm, and the SNN algorithm/conversion).
[x] Create a brain adaptation on SNN module.
[x] Create a radiation on SNN module.
[x] Create a backend simulation module.