m2ms / fragalysis-frontend

The React, Redux frontend built by webpack

Debug job failures (Access Control) #1047

Open alanbchristie opened 1 year ago

alanbchristie commented 1 year ago

Ruben's test failures with: -

These result in job execution failure and no uploaded compounds

alanbchristie commented 1 year ago

I've re-run the two jobs in Squonk with some fixes from Tim to verify their behaviour and they both now run to completion.

fragmenstein-combine

This job still fails with the following exception: -

Traceback (most recent call last):
  File "/code/merger.py", line 248, in combine_fragments
    v.combine(long_name='combine-' + str(i + 1))
  File "/opt/conda/lib/python3.9/site-packages/fragmenstein/victor/_victor_combine.py", line 59, in combine
    self._safely_do(execute=self._calculate_combination, resolve=self._resolve, reject=self._reject)
  File "/opt/conda/lib/python3.9/site-packages/fragmenstein/victor/_victor_safety.py", line 27, in _safely_do
    execute()
  File "/opt/conda/lib/python3.9/site-packages/fragmenstein/victor/_victor_combine.py", line 89, in _calculate_combination
    self._calculate_combination_thermo()
  File "/opt/conda/lib/python3.9/site-packages/fragmenstein/victor/_victor_combine.py", line 137, in _calculate_combination_thermo
    self.unminimized_pdbblock = self._plonk_monster_in_structure()
  File "/opt/conda/lib/python3.9/site-packages/fragmenstein/victor/_victor_plonk.py", line 62, in _plonk_monster_in_structure
    return self._plonk_monster_in_structure_minimal()
  File "/opt/conda/lib/python3.9/site-packages/fragmenstein/victor/_victor_plonk.py", line 93, in _plonk_monster_in_structure_minimal
    raise ValueError(f'Residue {self.ligand_resi} already exists in structure')
ValueError: Residue 1B already exists in structure

Consequently the output (merged.sdf) is empty.

We're using fragmenstein == 0.10 (March 8th).

Tim is in discussions with Matteo on a resolution.
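For anyone triaging in the meantime, the "residue already exists" collision can be detected up front with a standalone check. This is a hedged sketch: the column parsing below is generic PDB fixed-column handling, and `residue_ids` / `is_resi_free` are illustrative helpers, not part of Fragmenstein's API (the traceback only tells us Victor consults a `self.ligand_resi` attribute).

```python
def residue_ids(pdb_block: str) -> set:
    """Collect (chain, residue-number) pairs from ATOM/HETATM records."""
    ids = set()
    for line in pdb_block.splitlines():
        if line.startswith(('ATOM', 'HETATM')):
            # PDB fixed columns: chain ID is column 22, resSeq is columns 23-26
            ids.add((line[21], line[22:26].strip()))
    return ids

def is_resi_free(pdb_block: str, chain: str, resi: str) -> bool:
    """True if the requested ligand residue slot is not already occupied."""
    return (chain, resi) not in residue_ids(pdb_block)
```

Before attempting the combine, one could verify e.g. `is_resi_free(pdb_block, 'B', '1')` and pick an unused residue number otherwise; whether Fragmenstein exposes a knob for that choice is an assumption here.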

fragmenstein-combine-multi-scoring

This job no longer exhibits the divide-by-zero exception and appears to run to completion without error. It still does not create any output (merged.sdf is present but empty).

Can we expect this job to generate an output?
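As a quick triage aid, "present but empty" can be distinguished from "contains records" without RDKit by counting SDF record terminators. A minimal sketch (the filename merged.sdf is from the report above; `count_sdf_records` is an illustrative helper):

```python
from pathlib import Path

def count_sdf_records(path) -> int:
    """Count molecule records in an SD file via its '$$$$' record terminators."""
    text = Path(path).read_text()
    return sum(1 for line in text.splitlines() if line.strip() == '$$$$')
```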

phraenquex commented 1 year ago

Three kinds of errors for @alanbchristie .

  1. Fails because it generates no output (no valid merges). (@alanbchristie has something.)
  2. Fails because it throws unexpected (fatal) errors. (@alanbchristie and @matteoferla to generate something in the Media directory.)
  3. Fails because it throws a programmatic exception (see the comment above).
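A wrapper that tells these three kinds apart could look like the following sketch. `classify_job` and its heuristics (traceback-in-stderr means a raised exception, empty output file means no valid merges) are illustrative assumptions, not part of the Squonk job runner:

```python
import os
import subprocess

def classify_job(cmd, output_path):
    """Run a job command and bucket its outcome into the three kinds above."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        # Kinds 2 and 3: a Python traceback on stderr suggests a raised
        # (programmatic) exception; anything else is an unexpected fatal error.
        return 'programmatic' if 'Traceback' in proc.stderr else 'fatal'
    if not os.path.exists(output_path) or os.path.getsize(output_path) == 0:
        return 'no-output'  # Kind 1: ran cleanly but produced no valid merges
    return 'ok'
```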
matteoferla commented 1 year ago

It has probably been addressed already. As I understood it, there were to be no test jobs in the frontend. Can the Fragmenstein pipeline, for now, be replaced with something simple along the lines of the following (with correct outputs):

import enum
import ctypes
import warnings
import random

class ProgramError(Exception):
    pass

class ErrorTypes(enum.Enum):
    no_error = 1
    warned = 2
    bad_output = 3  # not the result we wanted
    codebase_error = 4  # the code is crap
    programmatic_error = 5  # the code did it on purpose
    segfault = 6

    def __call__(self):
        """
        Return zero if possible
        """
        cls = self.__class__
        if self is cls.no_error:
            return 0
        elif self is cls.warned:
            warnings.warn('Warning')
            return 0
        elif self is cls.bad_output:
            return 1
        elif self is cls.codebase_error:
            return 0 / 0  # ZeroDivisionError
        elif self is cls.programmatic_error:
            raise ProgramError
        elif self is cls.segfault:
            ctypes.string_at(0)  # dereference a null pointer: hard crash

    @classmethod
    def random(cls):
        choice: 'ErrorTypes' = random.choice(list(cls))
        return choice()

and made to call ErrorTypes.random() on each call? The whole discussion was about error handling, after all, so I am confused about not compartmentalising the task. (Obviously, I will address the pipeline file issue.)
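To sketch what "call it on each pipeline invocation" might look like, here is a self-contained stand-in: `Outcome`, `chaos_step` and `run_pipeline` are illustrative names, simplified from the class above so the snippet runs on its own, with the caller catching and tallying every outcome.

```python
import enum
import random
import warnings

class Outcome(enum.Enum):
    ok = enum.auto()
    warned = enum.auto()
    failed = enum.auto()

def chaos_step(rng: random.Random) -> Outcome:
    """Randomly succeed, warn, or raise, mimicking a flaky pipeline stage."""
    outcome = rng.choice(list(Outcome))
    if outcome is Outcome.warned:
        warnings.warn('simulated warning')
    elif outcome is Outcome.failed:
        raise RuntimeError('simulated failure')
    return outcome

def run_pipeline(n: int, seed: int = 42) -> dict:
    """Run n chaos steps, catching failures, and tally how each step ended."""
    rng = random.Random(seed)
    tally = {outcome: 0 for outcome in Outcome}
    for _ in range(n):
        with warnings.catch_warnings():
            warnings.simplefilter('ignore')
            try:
                tally[chaos_step(rng)] += 1
            except RuntimeError:
                tally[Outcome.failed] += 1
    return tally
```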

alanbchristie commented 1 year ago

Having re-tested, the jobs all behave as expected (i.e. there are no unhandled exceptions using the current inputs).