galaxyproject / iwc

Galaxy Workflows maintained by the Intergalactic Workflow Commission
https://iwc.galaxyproject.org

Split collections into 2 #593

Open lldelisle opened 2 weeks ago

lldelisle commented 2 weeks ago

See #583

github-actions[bot] commented 2 weeks ago

Test Results (powered by Planemo)

Test Summary

| Test State | Count |
| ---------- | ----- |
| Total      | 3     |
| Passed     | 1     |
| Error      | 1     |
| Failure    | 1     |
| Skipped    | 0     |
Errored Tests

❌ Split_collection_using_tabular.ga_0

**Execution Problem:**

 * ```
   File [/home/runner/work/iwc/iwc/workflows/data-manipulation/split-collection/group_asignment.txt] does not exist - parent directory [/home/runner/work/iwc/iwc/workflows/data-manipulation/split-collection] does exist, cwd is [/home/runner/work/iwc/iwc]
   ```

Failed Tests

❌ Split_collection_using_comma_separated_list.ga_0

**Problems**:

 * ```
   Output collection 'collection_first_group': failed to find identifier 'cat1_1' in the tool generated elements []
   ```

#### Workflow invocation details

 - **Step 1: Input Dataset Collection**:

    * step_state: scheduled

 - **Step 2: Groups**:

    * step_state: scheduled

 - **Step 3: toolshed.g2.bx.psu.edu/repos/iuc/collection\_element\_identifiers/collection\_element\_identifiers/0.0.2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           mv '/tmp/tmpp6_pn0hn/job_working_directory/000/11/configs/tmpvomaori0' '/tmp/tmpp6_pn0hn/job_working_directory/000/11/outputs/dataset_0d51c252-b5b3-4dae-b753-4d69d6231f6f.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75ca0ed11efb5c6618888f353e2" ` |
             | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | input\_collection | ` {"values": [{"id": 2, "src": "hdca"}]} ` |

      </details>
 - **Step 4: Create a dataset from text**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           times=1; yes -- '1,1,1,2,3' 2>/dev/null | head -n $times >> '/tmp/tmpp6_pn0hn/job_working_directory/000/12/outputs/dataset_122efa88-af32-488c-bd82-41dea16a24f0.dat';
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75ca0ed11efb5c6618888f353e2" ` |
             | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | token\_set | ` [{"__index__": 0, "line": "1,1,1,2,3", "repeat_select": {"__current_case__": 0, "repeat_select_opts": "user", "times": "1"}}] ` |

      </details>

 - **Step 5: Replace comma by back to line**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/86755160afbf/text_processing/find_and_replace' -o '/tmp/tmpp6_pn0hn/job_working_directory/000/13/outputs/dataset_b82ee5cc-f610-4735-90cd-76c90e209252.dat' -g    -r ',' '\n' '/tmp/tmpp6_pn0hn/files/1/2/2/dataset_122efa88-af32-488c-bd82-41dea16a24f0.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75ca0ed11efb5c6618888f353e2" ` |
             | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | find\_and\_replace | ` [{"__index__": 0, "caseinsensitive": false, "find_pattern": ",", "global": true, "is_regex": true, "replace_pattern": "\\n", "searchwhere": {"__current_case__": 0, "searchwhere_select": "line"}, "skip_first_line": false, "wholewords": false}] ` |

      </details>
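Step 5 turns the single comma-separated line from Step 4 into one group value per line via a global regex replace (`find_pattern=","`, `replace_pattern="\n"`). A minimal sketch of that transformation:

```python
# Sketch of Step 5 ("Replace comma by back to line"):
# global replace of "," with a newline, one group value per line.
text = "1,1,1,2,3\n"                 # output of "Create a dataset from text"
replaced = text.replace(",", "\n")
```

The result is a five-line dataset holding the group assignments `1, 1, 1, 2, 3`.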

 - **Step 6: Put side by side identifiers and groups**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           perl '/tmp/tmpp6_pn0hn/galaxy-dev/tools/filters/pasteWrapper.pl' '/tmp/tmpp6_pn0hn/files/0/d/5/dataset_0d51c252-b5b3-4dae-b753-4d69d6231f6f.dat' '/tmp/tmpp6_pn0hn/files/b/8/2/dataset_b82ee5cc-f610-4735-90cd-76c90e209252.dat' T '/tmp/tmpp6_pn0hn/job_working_directory/000/14/outputs/dataset_190210ea-92a6-460b-9211-ff72ccf98c19.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75ca0ed11efb5c6618888f353e2" ` |
             | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | delimiter | ` "T" ` |

      </details>
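Step 6 pastes the element identifiers next to the group column, tab-delimited (`delimiter "T"` in `pasteWrapper.pl`). A sketch of the resulting table; the identifier names are hypothetical placeholders (only `cat1_1` is known from the failure message):

```python
# Sketch of Step 6 ("Put side by side identifiers and groups").
# Identifier names below are hypothetical placeholders.
identifiers = ["cat1_1", "cat1_2", "cat1_3", "cat2_1", "cat3_1"]
groups = ["1", "1", "1", "2", "3"]   # from the "1,1,1,2,3" list
table = "\n".join(f"{i}\t{g}" for i, g in zip(identifiers, groups))
```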

 - **Step 7: Unlabelled step**:

    * step_state: scheduled

    * <details><summary>Subworkflow Steps</summary>

      - **Step 1: Input Dataset Collection**:

         * step_state: scheduled

      - **Step 2: identifier mapping**:

         * step_state: scheduled

      - **Step 3: get the first group value**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                env -i $(which awk) --sandbox -v FS='   ' -v OFS='  ' --re-interval -f '/tmp/tmpp6_pn0hn/job_working_directory/000/15/configs/tmp73gtl51n' '/tmp/tmpp6_pn0hn/files/1/9/0/dataset_190210ea-92a6-460b-9211-ff72ccf98c19.dat' > '/tmp/tmpp6_pn0hn/job_working_directory/000/15/outputs/dataset_2038971d-b463-4da9-b3b3-8c21d87b0638.dat'
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "input" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75da0ed11efb5c6618888f353e2" ` |
                  | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | code | ` "NR==1{print $2}" ` |
                  | dbkey | ` "?" ` |

           </details>
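The subworkflow's Step 3 runs awk with the program `NR==1{print $2}`, i.e. it prints the second field of the first row of the pasted table. Equivalent logic in Python (table contents assumed, as a sketch):

```python
# Sketch of the awk program NR==1{print $2}: row 1, field 2.
table = "cat1_1\t1\ncat1_2\t1\ncat1_3\t1\ncat2_1\t2\ncat3_1\t3"
first_group = table.split("\n")[0].split("\t")[1]
```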

      - **Step 4: convert to parameter**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                cd ../; python _evaluate_expression_.py
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75da0ed11efb5c6618888f353e2" ` |
                  | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | dbkey | ` "?" ` |
                  | param\_type | ` "text" ` |
                  | remove\_newlines | ` true ` |

           </details>

      - **Step 5: make filter condition**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                cd ../; python _evaluate_expression_.py
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "input" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75da0ed11efb5c6618888f353e2" ` |
                  | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | components | ` [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "c2 == \"", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "1", "select_param_type": "text"}}, {"__index__": 2, "param_type": {"__current_case__": 0, "component_value": "\"", "select_param_type": "text"}}] ` |
                  | dbkey | ` "?" ` |

           </details>
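Step 5 concatenates three text components into the filter condition. A sketch matching the `components` parameter shown in the job parameters above:

```python
# The three component_value strings from the job parameters,
# joined into the condition later handed to the Filter tool.
components = ['c2 == "', '1', '"']
cond = "".join(components)
```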

      - **Step 6: filter tabular to get only lines with first group**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                python '/tmp/tmpp6_pn0hn/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpp6_pn0hn/files/1/9/0/dataset_190210ea-92a6-460b-9211-ff72ccf98c19.dat' '/tmp/tmpp6_pn0hn/job_working_directory/000/18/outputs/dataset_44ab842e-cac2-46e6-947e-914e1a093393.dat' '/tmp/tmpp6_pn0hn/job_working_directory/000/18/configs/tmpmroapp_c' 2 "str,int" 0
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Standard Output:**

              * ```console
                Filtering with c2 == "1", 
                kept 0.00% of 5 valid lines (5 total lines).

                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75da0ed11efb5c6618888f353e2" ` |
                  | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | cond | ` "c2 == \"1\"" ` |
                  | dbkey | ` "?" ` |
                  | header\_lines | ` "0" ` |

           </details>
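One plausible reading of the `kept 0.00% of 5 valid lines` output (a sketch of a hypothesis, not a confirmed diagnosis): `filtering.py` is invoked with the column-type spec `"str,int"`, so `c2` would be cast to `int`, while the generated condition compares it to the string `"1"`; in Python that comparison is always false:

```python
# Hypothesis sketch: if c2 is cast to int per the "str,int" spec,
# the condition c2 == "1" can never match.
col_types = ["str", "int"]
row = ["cat1_1", "1"]                # identifier is a hypothetical placeholder
typed = [int(v) if t == "int" else v for v, t in zip(row, col_types)]
c2 = typed[1]
matches = (c2 == "1")                # int 1 compared to str "1"
```

If that reading is right, the filtered table is always empty regardless of the group values.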

      - **Step 7: keep only identifiers**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                perl '/tmp/tmpp6_pn0hn/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpp6_pn0hn/files/4/4/a/dataset_44ab842e-cac2-46e6-947e-914e1a093393.dat' 'c1' T '/tmp/tmpp6_pn0hn/job_working_directory/000/19/outputs/dataset_dc1dd85a-4876-4dc1-b009-8e55d3b1df27.dat'
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75da0ed11efb5c6618888f353e2" ` |
                  | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | columnList | ` "c1" ` |
                  | dbkey | ` "?" ` |
                  | delimiter | ` "T" ` |

           </details>

      - **Step 8: Split collection into 2**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "ccd2a75da0ed11efb5c6618888f353e2" ` |
                  | how | ` {"__current_case__": 0, "filter_source": {"values": [{"id": 19, "src": "hda"}]}, "how_filter": "remove_if_absent"} ` |
                  | input | ` {"values": [{"id": 2, "src": "hdca"}]} ` |

           </details>
       </details>

  </details>
  • Other invocation details

    - **history_id**: 3443cf8fc2a993fb
    - **history_state**: ok
    - **invocation_id**: 65b5a0093bcdce8d
    - **invocation_state**: scheduled
    - **workflow_id**: 50b51245ddf901fd
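Step 8's split uses `how_filter: remove_if_absent`: elements of the input collection whose identifiers are absent from the filter dataset are dropped. Since the filter step kept 0 of 5 lines, the filter dataset is empty and every element is removed. A minimal sketch (identifier names hypothetical):

```python
# Sketch of "remove_if_absent": keep only elements whose identifier
# appears in the filter dataset. Identifier names are hypothetical.
elements = ["cat1_1", "cat1_2", "cat1_3", "cat2_1", "cat3_1"]
keep = set()                         # the filter file matched 0 lines
first_group = [e for e in elements if e in keep]
```

An empty `collection_first_group` is consistent with the assertion failure `failed to find identifier 'cat1_1' in the tool generated elements []`.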

Passed Tests

✅ Split_collection_by_pattern_in_identifiers.ga_0

#### Workflow invocation details

 - **Step 1: Input Dataset Collection**:

    * step_state: scheduled

 - **Step 2: pattern**:

    * step_state: scheduled

 - **Step 3: toolshed.g2.bx.psu.edu/repos/iuc/collection\_element\_identifiers/collection\_element\_identifiers/0.0.2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           mv '/tmp/tmpp6_pn0hn/job_working_directory/000/26/configs/tmps3ybtech' '/tmp/tmpp6_pn0hn/job_working_directory/000/26/outputs/dataset_ecaeaadb-3138-4094-a833-fcf95abf84b9.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "21be76c4a0ee11efb5c6618888f353e2" ` |
             | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | input\_collection | ` {"values": [{"id": 5, "src": "hdca"}]} ` |

      </details>
 - **Step 4: Select identifiers with pattern**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           grep -P -A 0 -B 0 --no-group-separator  -i -- 'cat1' '/tmp/tmpp6_pn0hn/files/e/c/a/dataset_ecaeaadb-3138-4094-a833-fcf95abf84b9.dat' > '/tmp/tmpp6_pn0hn/job_working_directory/000/27/outputs/dataset_eedd0674-a2ba-4ba4-9163-901987ccad10.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "21be76c4a0ee11efb5c6618888f353e2" ` |
             | case\_sensitive | ` "-i" ` |
             | chromInfo | ` "/tmp/tmpp6_pn0hn/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | color | ` "NOCOLOR" ` |
             | dbkey | ` "?" ` |
             | invert | ` "" ` |
             | lines\_after | ` "0" ` |
             | lines\_before | ` "0" ` |
             | regex\_type | ` "-P" ` |
             | url\_paste | ` "cat1" ` |

      </details>
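In the passing test, Step 4 selects identifiers containing the pattern `cat1` case-insensitively (`grep -P -i -- 'cat1'`). Equivalent logic as a sketch (identifier names hypothetical):

```python
# Sketch of the grep -i 'cat1' selection over element identifiers.
identifiers = ["cat1_1", "cat1_2", "cat2_1"]
selected = [i for i in identifiers if "cat1" in i.lower()]
```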

 - **Step 5: Split collection into 2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "21be76c4a0ee11efb5c6618888f353e2" ` |
             | how | ` {"__current_case__": 0, "filter_source": {"values": [{"id": 31, "src": "hda"}]}, "how_filter": "remove_if_absent"} ` |
             | input | ` {"values": [{"id": 5, "src": "hdca"}]} ` |

      </details>
  </details>
  • Other invocation details

    - **history_id**: 50b51245ddf901fd
    - **history_state**: ok
    - **invocation_id**: 50b51245ddf901fd
    - **invocation_state**: scheduled
    - **workflow_id**: 9233b6e0ca46fef0

github-actions[bot] commented 2 weeks ago

Test Results (powered by Planemo)

Test Summary

| Test State | Count |
| ---------- | ----- |
| Total      | 3     |
| Passed     | 1     |
| Error      | 1     |
| Failure    | 1     |
| Skipped    | 0     |
Errored Tests

❌ Split_collection_using_tabular.ga_0

**Execution Problem:**

 * ```
   File [/home/runner/work/iwc/iwc/workflows/data-manipulation/split-collection/group_asignment.txt] does not exist - parent directory [/home/runner/work/iwc/iwc/workflows/data-manipulation/split-collection] does exist, cwd is [/home/runner/work/iwc/iwc]
   ```

Failed Tests

❌ Split_collection_using_comma_separated_list.ga_0

**Problems**:

 * ```
   Output collection 'collection_first_group': failed to find identifier 'cat1_1' in the tool generated elements []
   ```

#### Workflow invocation details

 - **Step 1: Input Dataset Collection**:

    * step_state: scheduled

 - **Step 2: Groups**:

    * step_state: scheduled

 - **Step 3: toolshed.g2.bx.psu.edu/repos/iuc/collection\_element\_identifiers/collection\_element\_identifiers/0.0.2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           mv '/tmp/tmpc2qy7ps4/job_working_directory/000/11/configs/tmp7ds1a737' '/tmp/tmpc2qy7ps4/job_working_directory/000/11/outputs/dataset_4a22dd7f-c9da-4190-ae86-22d3a915943f.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1424a0ee11ef87454dec7da58a7f" ` |
             | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | input\_collection | ` {"values": [{"id": 2, "src": "hdca"}]} ` |

      </details>
 - **Step 4: Create a dataset from text**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           times=1; yes -- '1,1,1,2,3' 2>/dev/null | head -n $times >> '/tmp/tmpc2qy7ps4/job_working_directory/000/12/outputs/dataset_40aac13d-fc86-4d76-a463-7140bf08072f.dat';
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1424a0ee11ef87454dec7da58a7f" ` |
             | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | token\_set | ` [{"__index__": 0, "line": "1,1,1,2,3", "repeat_select": {"__current_case__": 0, "repeat_select_opts": "user", "times": "1"}}] ` |

      </details>

 - **Step 5: Replace comma by back to line**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/86755160afbf/text_processing/find_and_replace' -o '/tmp/tmpc2qy7ps4/job_working_directory/000/13/outputs/dataset_91e2d4ca-11b1-4e37-a2b0-59ea41282e88.dat' -g    -r ',' '\n' '/tmp/tmpc2qy7ps4/files/4/0/a/dataset_40aac13d-fc86-4d76-a463-7140bf08072f.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1424a0ee11ef87454dec7da58a7f" ` |
             | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | find\_and\_replace | ` [{"__index__": 0, "caseinsensitive": false, "find_pattern": ",", "global": true, "is_regex": true, "replace_pattern": "\\n", "searchwhere": {"__current_case__": 0, "searchwhere_select": "line"}, "skip_first_line": false, "wholewords": false}] ` |

      </details>

 - **Step 6: Put side by side identifiers and groups**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           perl '/tmp/tmpc2qy7ps4/galaxy-dev/tools/filters/pasteWrapper.pl' '/tmp/tmpc2qy7ps4/files/4/a/2/dataset_4a22dd7f-c9da-4190-ae86-22d3a915943f.dat' '/tmp/tmpc2qy7ps4/files/9/1/e/dataset_91e2d4ca-11b1-4e37-a2b0-59ea41282e88.dat' T '/tmp/tmpc2qy7ps4/job_working_directory/000/14/outputs/dataset_afcb2b94-aa6a-4d26-a0eb-c8dc58d83bed.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1424a0ee11ef87454dec7da58a7f" ` |
             | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | delimiter | ` "T" ` |

      </details>

 - **Step 7: Unlabelled step**:

    * step_state: scheduled

    * <details><summary>Subworkflow Steps</summary>

      - **Step 1: Input Dataset Collection**:

         * step_state: scheduled

      - **Step 2: identifier mapping**:

         * step_state: scheduled

      - **Step 3: get the first group value**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                env -i $(which awk) --sandbox -v FS='   ' -v OFS='  ' --re-interval -f '/tmp/tmpc2qy7ps4/job_working_directory/000/15/configs/tmpxdt19hu0' '/tmp/tmpc2qy7ps4/files/a/f/c/dataset_afcb2b94-aa6a-4d26-a0eb-c8dc58d83bed.dat' > '/tmp/tmpc2qy7ps4/job_working_directory/000/15/outputs/dataset_913be485-a966-4d50-8a44-cb37b96b8d16.dat'
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "input" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1425a0ee11ef87454dec7da58a7f" ` |
                  | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | code | ` "NR==1{print $2}" ` |
                  | dbkey | ` "?" ` |

           </details>

      - **Step 4: convert to parameter**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                cd ../; python _evaluate_expression_.py
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1425a0ee11ef87454dec7da58a7f" ` |
                  | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | dbkey | ` "?" ` |
                  | param\_type | ` "text" ` |
                  | remove\_newlines | ` true ` |

           </details>

      - **Step 5: make filter condition**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                cd ../; python _evaluate_expression_.py
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "input" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1425a0ee11ef87454dec7da58a7f" ` |
                  | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | components | ` [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "c2 == \"", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "1", "select_param_type": "text"}}, {"__index__": 2, "param_type": {"__current_case__": 0, "component_value": "\"", "select_param_type": "text"}}] ` |
                  | dbkey | ` "?" ` |

           </details>

      - **Step 6: filter tabular to get only lines with first group**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                python '/tmp/tmpc2qy7ps4/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpc2qy7ps4/files/a/f/c/dataset_afcb2b94-aa6a-4d26-a0eb-c8dc58d83bed.dat' '/tmp/tmpc2qy7ps4/job_working_directory/000/18/outputs/dataset_d4e0df87-dde7-4ea8-87a8-80974571e8f9.dat' '/tmp/tmpc2qy7ps4/job_working_directory/000/18/configs/tmptvjcspcb' 2 "str,int" 0
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Standard Output:**

              * ```console
                Filtering with c2 == "1", 
                kept 0.00% of 5 valid lines (5 total lines).

                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1425a0ee11ef87454dec7da58a7f" ` |
                  | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | cond | ` "c2 == \"1\"" ` |
                  | dbkey | ` "?" ` |
                  | header\_lines | ` "0" ` |

           </details>

      - **Step 7: keep only identifiers**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                perl '/tmp/tmpc2qy7ps4/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpc2qy7ps4/files/d/4/e/dataset_d4e0df87-dde7-4ea8-87a8-80974571e8f9.dat' 'c1' T '/tmp/tmpc2qy7ps4/job_working_directory/000/19/outputs/dataset_138d7815-18cd-4af8-8daa-fe3afad05acf.dat'
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1425a0ee11ef87454dec7da58a7f" ` |
                  | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | columnList | ` "c1" ` |
                  | dbkey | ` "?" ` |
                  | delimiter | ` "T" ` |

           </details>

      - **Step 8: Split collection into 2**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "5b9b1425a0ee11ef87454dec7da58a7f" ` |
                  | how | ` {"__current_case__": 0, "filter_source": {"values": [{"id": 19, "src": "hda"}]}, "how_filter": "remove_if_absent"} ` |
                  | input | ` {"values": [{"id": 2, "src": "hdca"}]} ` |

           </details>
       </details>

  </details>
  • Other invocation details

    - **history_id**: 8bc0613ac77c6934
    - **history_state**: ok
    - **invocation_id**: b3ccdf42b5a440eb
    - **invocation_state**: scheduled
    - **workflow_id**: 462bee55b4c85682

Passed Tests

✅ Split_collection_by_pattern_in_identifiers.ga_0

#### Workflow invocation details

 - **Step 1: Input Dataset Collection**:

    * step_state: scheduled

 - **Step 2: pattern**:

    * step_state: scheduled

 - **Step 3: toolshed.g2.bx.psu.edu/repos/iuc/collection\_element\_identifiers/collection\_element\_identifiers/0.0.2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           mv '/tmp/tmpc2qy7ps4/job_working_directory/000/26/configs/tmpy31z3yo2' '/tmp/tmpc2qy7ps4/job_working_directory/000/26/outputs/dataset_aa611102-3ccf-4021-9e5c-09345e3ce2bd.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "aaba94daa0ee11ef87454dec7da58a7f" ` |
             | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | input\_collection | ` {"values": [{"id": 5, "src": "hdca"}]} ` |

      </details>
 - **Step 4: Select identifiers with pattern**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           grep -P -A 0 -B 0 --no-group-separator  -i -- 'cat1' '/tmp/tmpc2qy7ps4/files/a/a/6/dataset_aa611102-3ccf-4021-9e5c-09345e3ce2bd.dat' > '/tmp/tmpc2qy7ps4/job_working_directory/000/27/outputs/dataset_b19d144e-570e-4415-ae6e-237369ce851f.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "aaba94daa0ee11ef87454dec7da58a7f" ` |
             | case\_sensitive | ` "-i" ` |
             | chromInfo | ` "/tmp/tmpc2qy7ps4/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | color | ` "NOCOLOR" ` |
             | dbkey | ` "?" ` |
             | invert | ` "" ` |
             | lines\_after | ` "0" ` |
             | lines\_before | ` "0" ` |
             | regex\_type | ` "-P" ` |
             | url\_paste | ` "cat1" ` |

      </details>

 - **Step 5: Split collection into 2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "aaba94daa0ee11ef87454dec7da58a7f" ` |
             | how | ` {"__current_case__": 0, "filter_source": {"values": [{"id": 31, "src": "hda"}]}, "how_filter": "remove_if_absent"} ` |
             | input | ` {"values": [{"id": 5, "src": "hdca"}]} ` |

      </details>
  </details>
  • Other invocation details

    - **history_id**: 462bee55b4c85682
    - **history_state**: ok
    - **invocation_id**: 462bee55b4c85682
    - **invocation_state**: scheduled
    - **workflow_id**: dd281a73856d463b
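For context, the "Select identifiers with pattern" step in the invocation above boils down to a case-insensitive Perl-regex `grep` over the collection's element identifiers. A minimal standalone sketch with hypothetical identifiers (not the actual test data; `-P` assumes GNU grep):

```shell
# Hypothetical element identifiers piped through the same kind of
# case-insensitive Perl-regex grep the workflow step runs
printf 'cat1_1\ncat1_2\ncat2_1\n' | grep -P -i -- 'cat1'
```

Only the identifiers containing `cat1` pass through, which is how the downstream split decides which elements belong to the first group.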

github-actions[bot] commented 2 weeks ago

Test Results (powered by Planemo)

Test Summary

| Test State | Count |
| ---------- | ----- |
| Total      | 3     |
| Passed     | 1     |
| Error      | 1     |
| Failure    | 1     |
| Skipped    | 0     |
**Errored Tests**

❌ Split-collection-using-tabular.ga_0

**Execution Problem:**

 * ```
   File [/home/runner/work/iwc/iwc/workflows/data-manipulation/split-collection/group_asignment.txt] does not exist - parent directory [/home/runner/work/iwc/iwc/workflows/data-manipulation/split-collection] does exist, cwd is [/home/runner/work/iwc/iwc]
   ```

**Failed Tests**

❌ Split-collection-using-comma-separated-list.ga_0

**Problems**:

 * ```
   Output collection 'collection_first_group': failed to find identifier 'cat1_1' in the tool generated elements []
   ```

#### Workflow invocation details

 * Invocation Messages
<details><summary>Steps</summary>

 - **Step 1: Input Dataset Collection**:

    * step_state: scheduled

 - **Step 2: Groups**:

    * step_state: scheduled

 - **Step 3: toolshed.g2.bx.psu.edu/repos/iuc/collection\_element\_identifiers/collection\_element\_identifiers/0.0.2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           mv '/tmp/tmpe5mppa9j/job_working_directory/000/19/configs/tmppq_qfcfr' '/tmp/tmpe5mppa9j/job_working_directory/000/19/outputs/dataset_fc760c3e-b39f-4f21-8a26-ff8d446cb317.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "84232944a0ef11ef810f9d5a5423fa88" ` |
             | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | input\_collection | ` {"values": [{"id": 5, "src": "hdca"}]} ` |

      </details>
 - **Step 4: Create a dataset from text**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           times=1; yes -- '1,1,1,2,3' 2>/dev/null | head -n $times >> '/tmp/tmpe5mppa9j/job_working_directory/000/20/outputs/dataset_8de8ae95-7020-4bb1-99c1-a617995a8804.dat';
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "84232944a0ef11ef810f9d5a5423fa88" ` |
             | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | token\_set | ` [{"__index__": 0, "line": "1,1,1,2,3", "repeat_select": {"__current_case__": 0, "repeat_select_opts": "user", "times": "1"}}] ` |

      </details>
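The "Create a dataset from text" job above relies on the `yes … | head -n $times` idiom to emit one line a fixed number of times. A standalone sketch of the same idiom (arbitrary example line; `--` handling assumes GNU coreutils `yes`):

```shell
# Repeat a single line $times times: `yes` emits the line forever,
# `head` keeps the first $times copies (2>/dev/null hides the EPIPE
# complaint yes may raise when head exits early)
times=3
yes -- '1,1,1,2,3' 2>/dev/null | head -n "$times"
```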

 - **Step 5: Replace comma by back to line**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/86755160afbf/text_processing/find_and_replace' -o '/tmp/tmpe5mppa9j/job_working_directory/000/21/outputs/dataset_1fa66a48-6c33-47f1-a742-d55a076787d0.dat' -g    -r ',' '\n' '/tmp/tmpe5mppa9j/files/8/d/e/dataset_8de8ae95-7020-4bb1-99c1-a617995a8804.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "84232944a0ef11ef810f9d5a5423fa88" ` |
             | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | find\_and\_replace | ` [{"__index__": 0, "caseinsensitive": false, "find_pattern": ",", "global": true, "is_regex": true, "replace_pattern": "\\n", "searchwhere": {"__current_case__": 0, "searchwhere_select": "line"}, "skip_first_line": false, "wholewords": false}] ` |

      </details>

 - **Step 6: Put side by side identifiers and groups**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           perl '/tmp/tmpe5mppa9j/galaxy-dev/tools/filters/pasteWrapper.pl' '/tmp/tmpe5mppa9j/files/f/c/7/dataset_fc760c3e-b39f-4f21-8a26-ff8d446cb317.dat' '/tmp/tmpe5mppa9j/files/1/f/a/dataset_1fa66a48-6c33-47f1-a742-d55a076787d0.dat' T '/tmp/tmpe5mppa9j/job_working_directory/000/22/outputs/dataset_c4f53797-0b6e-46df-8430-ca059d70b914.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "84232944a0ef11ef810f9d5a5423fa88" ` |
             | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | delimiter | ` "T" ` |

      </details>

 - **Step 7: Unlabelled step**:

    * step_state: scheduled

    * <details><summary>Subworkflow Steps</summary>

      - **Step 1: Input Dataset Collection**:

         * step_state: scheduled

      - **Step 2: identifier mapping**:

         * step_state: scheduled

      - **Step 3: get the first group value**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                env -i $(which awk) --sandbox -v FS='   ' -v OFS='  ' --re-interval -f '/tmp/tmpe5mppa9j/job_working_directory/000/23/configs/tmpcmrd7w92' '/tmp/tmpe5mppa9j/files/c/4/f/dataset_c4f53797-0b6e-46df-8430-ca059d70b914.dat' > '/tmp/tmpe5mppa9j/job_working_directory/000/23/outputs/dataset_0a945b2a-667c-46e5-bb3f-8a672f66cf76.dat'
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "input" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "84232945a0ef11ef810f9d5a5423fa88" ` |
                  | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | code | ` "NR==1{print $2}" ` |
                  | dbkey | ` "?" ` |

           </details>

      - **Step 4: convert to parameter**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                cd ../; python _evaluate_expression_.py
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "84232945a0ef11ef810f9d5a5423fa88" ` |
                  | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | dbkey | ` "?" ` |
                  | param\_type | ` "text" ` |
                  | remove\_newlines | ` true ` |

           </details>

      - **Step 5: make filter condition**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                cd ../; python _evaluate_expression_.py
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "input" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "84232945a0ef11ef810f9d5a5423fa88" ` |
                  | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | components | ` [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "c2 == \"", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "1", "select_param_type": "text"}}, {"__index__": 2, "param_type": {"__current_case__": 0, "component_value": "\"", "select_param_type": "text"}}] ` |
                  | dbkey | ` "?" ` |

           </details>

      - **Step 6: filter tabular to get only lines with first group**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                python '/tmp/tmpe5mppa9j/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpe5mppa9j/files/c/4/f/dataset_c4f53797-0b6e-46df-8430-ca059d70b914.dat' '/tmp/tmpe5mppa9j/job_working_directory/000/26/outputs/dataset_7785f065-1c09-4bf8-bddb-a62b78db29cf.dat' '/tmp/tmpe5mppa9j/job_working_directory/000/26/configs/tmp1lkiw7pn' 2 "str,int" 0
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Standard Output:**

              * ```console
                Filtering with c2 == "1", 
                kept 0.00% of 5 valid lines (5 total lines).

                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "84232945a0ef11ef810f9d5a5423fa88" ` |
                  | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | cond | ` "c2 == \"1\"" ` |
                  | dbkey | ` "?" ` |
                  | header\_lines | ` "0" ` |

           </details>

      - **Step 7: keep only identifiers**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Command Line:**

              * ```console
                perl '/tmp/tmpe5mppa9j/galaxy-dev/tools/filters/cutWrapper.pl' '/tmp/tmpe5mppa9j/files/7/7/8/dataset_7785f065-1c09-4bf8-bddb-a62b78db29cf.dat' 'c1' T '/tmp/tmpe5mppa9j/job_working_directory/000/27/outputs/dataset_37758570-d1e9-49da-b2e6-3468e2143150.dat'
                ```
             **Exit Code:**

              * ```console
                0
                ```
             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_input\_ext | ` "tabular" ` |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "84232945a0ef11ef810f9d5a5423fa88" ` |
                  | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
                  | columnList | ` "c1" ` |
                  | dbkey | ` "?" ` |
                  | delimiter | ` "T" ` |

           </details>

      - **Step 8: Split collection into 2**:

         * step_state: scheduled

         * <details><summary>Jobs</summary>

           - **Job 1:**

             * Job state is ok

             **Traceback:**

              * ```console

                ```
             **Job Parameters:**

              *   | Job parameter | Parameter value |
                  | ------------- | --------------- |
                  | \_\_workflow\_invocation\_uuid\_\_ | ` "84232945a0ef11ef810f9d5a5423fa88" ` |
                  | how | ` {"__current_case__": 0, "filter_source": {"values": [{"id": 31, "src": "hda"}]}, "how_filter": "remove_if_absent"} ` |
                  | input | ` {"values": [{"id": 5, "src": "hdca"}]} ` |

           </details>
       </details>

  </details>
  • Other invocation details

    - **history_id**: de2d236c7be5e2a8
    - **history_state**: ok
    - **invocation_id**: 3a3ddc1218b29ebe
    - **invocation_state**: scheduled
    - **workflow_id**: 9fc91d2c728bbe01
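One possible reading of this failure, guessed from the logs rather than confirmed: the filter step ran with column types `str,int` and condition `c2 == "1"`, and kept 0.00% of 5 valid lines. If the Filter tool casts the second column to `int` per that type hint (an assumption about `filtering.py`, not verified here), the generated condition compares an int to a string, which is never true in Python, so no identifiers survive and the split collection comes out empty. A hypothetical reproduction of that type mismatch:

```python
# Hypothetical rows standing in for the identifier/group table; per the
# "str,int" type hint the second column would be parsed as an int,
# while the generated filter condition compares it to the string "1".
rows = [("cat1_1", 1), ("cat1_2", 1), ("cat2_1", 2)]

kept_str = [r for r in rows if r[1] == "1"]  # int vs str: never equal
kept_int = [r for r in rows if r[1] == 1]    # int vs int: matches

print(len(kept_str))  # 0 -- consistent with "kept 0.00%" in the log
print(len(kept_int))  # 2
```

If this reading is right, declaring the group column as `str` (or quoting consistently) would make the condition match.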

**Passed Tests**

✅ Split-collection-by-pattern-in-identifiers.ga_0

#### Workflow invocation details

 * Invocation Messages
<details><summary>Steps</summary>

 - **Step 1: Input Dataset Collection**:

    * step_state: scheduled

 - **Step 2: pattern**:

    * step_state: scheduled

 - **Step 3: toolshed.g2.bx.psu.edu/repos/iuc/collection\_element\_identifiers/collection\_element\_identifiers/0.0.2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           mv '/tmp/tmpe5mppa9j/job_working_directory/000/6/configs/tmppj63nh13' '/tmp/tmpe5mppa9j/job_working_directory/000/6/outputs/dataset_92339aae-c36c-4f94-9c56-496a7ebb0518.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "txt" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "38c200d8a0ef11ef810f9d5a5423fa88" ` |
             | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | dbkey | ` "?" ` |
             | input\_collection | ` {"values": [{"id": 1, "src": "hdca"}]} ` |

      </details>
 - **Step 4: Select identifiers with pattern**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Command Line:**

         * ```console
           grep -P -A 0 -B 0 --no-group-separator  -i -- 'cat1' '/tmp/tmpe5mppa9j/files/9/2/3/dataset_92339aae-c36c-4f94-9c56-496a7ebb0518.dat' > '/tmp/tmpe5mppa9j/job_working_directory/000/7/outputs/dataset_e45b0140-c7f3-480d-ac30-5dda3440014b.dat'
           ```
        **Exit Code:**

         * ```console
           0
           ```
        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_input\_ext | ` "input" ` |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "38c200d8a0ef11ef810f9d5a5423fa88" ` |
             | case\_sensitive | ` "-i" ` |
             | chromInfo | ` "/tmp/tmpe5mppa9j/galaxy-dev/tool-data/shared/ucsc/chrom/?.len" ` |
             | color | ` "NOCOLOR" ` |
             | dbkey | ` "?" ` |
             | invert | ` "" ` |
             | lines\_after | ` "0" ` |
             | lines\_before | ` "0" ` |
             | regex\_type | ` "-P" ` |
             | url\_paste | ` "cat1" ` |

      </details>

 - **Step 5: Split collection into 2**:

    * step_state: scheduled

    * <details><summary>Jobs</summary>

      - **Job 1:**

        * Job state is ok

        **Traceback:**

         * ```console

           ```
        **Job Parameters:**

         *   | Job parameter | Parameter value |
             | ------------- | --------------- |
             | \_\_workflow\_invocation\_uuid\_\_ | ` "38c200d8a0ef11ef810f9d5a5423fa88" ` |
             | how | ` {"__current_case__": 0, "filter_source": {"values": [{"id": 7, "src": "hda"}]}, "how_filter": "remove_if_absent"} ` |
             | input | ` {"values": [{"id": 1, "src": "hdca"}]} ` |

      </details>
  </details>
  • Other invocation details

    - **history_id**: cdfac92d550372b3
    - **history_state**: ok
    - **invocation_id**: cdfac92d550372b3
    - **invocation_state**: scheduled
    - **workflow_id**: cdfac92d550372b3

lldelisle commented 2 weeks ago

@wm75, if you want to review this.

lldelisle commented 1 week ago

@mvdbeek, in the "Split-collection-using-tabular" workflow, I extract the value of the group name I want to select during the workflow. Is there a way to use this value to rename the final output collections? I tried defining the JSON parameter as a workflow output and reusing the name of that output as ${my_variable}, but the workflow just asked me for its value when I tried to run it, which is not what I wanted...