Sangwon91 / PORMAKE

Python library for the construction of porous materials using topology and building blocks.
MIT License

Topology loading #27

Open gianmarco-terrones opened 1 month ago

gianmarco-terrones commented 1 month ago

Hello, it seems that 40 of the topologies listed in _get_topology_list cannot be loaded. I ran into this issue with the following script:

import pormake as pm

database = pm.Database()
topologies = database._get_topology_list()
topologies.sort()

failed_topo_loads = []

num_topo = len(topologies)
for _i, topo_name in enumerate(topologies):
    print(f'topology is {topo_name}. Topology {_i+1} out of {num_topo}.')

    try:
        topo = database.get_topo(topo_name)
    except Exception:
        print(f'Skipping {topo_name}')
        '''
        Error looks like this:
        >>> Topology parsing fails: ast-d
        >>> Topology loading is failed: Invalid cgd file: ast-d.

        or

        >>> Topology loading is failed: list index out of range.

        or

        >>> Topology loading is failed: zero-size array to reduction operation maximum which has no identity.

        or

        >>> Topology loading is failed: could not convert string to float: 'V1'.
        '''
        failed_topo_loads.append(topo_name)
        continue

print(f'The topologies that failed to load are {failed_topo_loads}')
print(f'The number of topologies that could not be loaded is {len(failed_topo_loads)}')

'''
The topologies that failed to load are ['ast-d', 'baz-a', 'cdh', 'cds-t', 'crt', 'css', 'ddi', 'ddy', 'dgo', 'dia-x', 'dnf-a', 'elv', 'ffg-a', 'ffj-a', 'fnx', 'ibb', 'ild', 'jsm', 'lcw_component_3', 'lcz', 'llw-z', 'mab', 'mhq', 'mmo', 'nbo-x', 'nia-d', 'rht-x', 'roa', 'scu-h', 'she-d', 'tcb', 'ten', 'tfy-a', 'tpt', 'tsn', 'utx', 'xbn', 'yzh', 'zim', 'zst']
The number of topologies that could not be loaded is 40
'''
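Until the parser is fixed, a temporary workaround is to filter the topology list down to the names that actually load. The `filter_loadable` helper below is a hypothetical sketch (it is not part of PORMAKE's API); it takes the name list plus a loader callable such as `database.get_topo`:

```python
def filter_loadable(names, load):
    """Return the sorted subset of names for which load(name) does not raise.

    `load` is any callable that raises on failure, e.g. database.get_topo.
    """
    loadable = []
    for name in sorted(names):
        try:
            load(name)
        except Exception:
            # Parser failures (invalid cgd file, index errors, ...) land here.
            continue
        loadable.append(name)
    return loadable
```

Usage against the database would then be something like `filter_loadable(database._get_topology_list(), database.get_topo)`.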
Sangwon91 commented 1 month ago

Yes, that's correct. The current CGD parser fails on those specific topologies. I can't give you an exact timeline, but I'll try to resolve this issue soon. Thank you!