PathoFact issues
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues

Issue #10: Snakemake: use && or || to concat CMDs in rules
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/10
2020-06-08, Valentina Galata (valentina.galata@uni.lu)

If a rule has multiple commands, use `&&` or `||` to concatenate them (where appropriate):
```bash
# execute cmd1;
# if cmd1 fails, execute cmd2, otherwise cmd2 is not executed
cmd1 || cmd2
# execute cmd1;
# if cmd1 succeeds, execute cmd2, otherwise cmd2 is not executed
cmd1 && cmd2
```

Issue #42: Installation with docker/singularity
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/42
2020-06-08, soilmicrobiome

I am excited to use this program, but I work on an HPC which does not allow conda. Is it possible to have the program available to pull with Docker/Singularity?
Thanks
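As a brief aside on issue #10 above: the `&&`/`||` semantics are easy to verify directly in a POSIX shell. A minimal demonstration using the `true`/`false` builtins (no PathoFact commands involved):

```shell
# `||` runs the right-hand command only when the left-hand command fails.
false || echo "fallback runs because the first command failed"
# `&&` runs the right-hand command only when the left-hand command succeeds.
true && echo "second command runs because the first succeeded"
# `&&` short-circuits on failure, so nothing is printed here.
false && echo "this line is never printed" || true
```

In a Snakemake `shell:` directive, chaining with `&&` ensures the rule exits non-zero (and the output is flagged as failed) as soon as any step in the chain fails.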
Issue #54: Skip prodigal and use curated annotation
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/54
2022-08-10, avilaHugo

Dear developers,
I have an extensively curated dataset of annotated genomes, but when I run PathoFact on FASTA files I lose those annotations. Is there a way to use my annotations to run the program, such as by providing a GBK or GFF file?
(Quick fix if you are a user with the same problem: use BLAST at 100% identity to match Prodigal locus tags to your annotations.)

Issue #59: Input: allow to provide a protein FAA file
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/59
2022-06-15, Valentina Galata (valentina.galata@uni.lu)

To be able to include `PathoFact` in other workflows and to use already existing annotations, it should be possible to provide a protein FAA file as input (together with the corresponding contigs FASTA).

Issue #61: Error while running run_PLASMID
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/61
2023-03-18, sabina-ll

Hi, I am trying to test for a correct installation of PathoFact using the test module. Unfortunately I get this message:
```
Error in rule run_PLASMID:
    jobid: 39
    output: test/output_test_again/PathoFact_intermediate/MGE/plasmid/PlasFlow/test_sample/group_1_plasflow_prediction.tsv
    log: test/output_test_again/logs/test_sample/group_1_plasflow_prediction.log (check log file(s) for error message)
    conda-env: /net/fs-1/home01/sale/PathoFact/PathoFact/.snakemake/conda/d6afdeea
    shell:
        PlasFlow.py --input test/output_test_again/PathoFact_intermediate/MGE/plasmid_splitted/test_sample/group_1.fasta --output test/output_test_again/PathoFact_intermediate/MGE/plasmid/PlasFlow/test_sample/group_1_plasflow_prediction.tsv --threshold 0.7 &> test/output_test_again/logs/test_sample/group_1_plasflow_prediction.log
    (exited with non-zero exit code)
```
Do you have any suggestions on how to solve this issue? (The log file is attached.)
Thanks very much in advance,
Kind regards,
Sabina

[group_1_plasflow_prediction.log](/uploads/64e74de02a8f5c80f13a8b31f53c5bd0/group_1_plasflow_prediction.log)

Issue #67: Virulence classifier: class probability cutoff?
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/67
2021-04-15, Nick Youngblut

What class probability cutoff did you use for the PathoFact RF classifier benchmarking in the [manuscript](https://microbiomejournal.biomedcentral.com/articles/10.1186/s40168-020-00993-9)? Given that the RF classifier can generate very low class probabilities for some targets (e.g., 0.1), I'm guessing that you filtered out classifications with low probabilities, but I can't find that info in the manuscript.

Issue #68: Percent identity and query coverage for toxin hits
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/68
2021-04-27, Vadim (Dani) Dubinsky

Hello Laura,
I was wondering if it is possible to obtain, in addition to the reported score and e-value for each toxin hit, also the percent identity and query coverage? Is such information available for the HMM profiles?
It might be very useful, because knowing only the score and e-value is not enough in some cases.
Thank you,
Vadim

Issue #71: The unknown task termination of multiple input genomes
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/71
2021-09-06, neptuneyt

Dear PathoFact team,
Thanks for your amazing work.
My issue is that I get an unknown job termination when I run PathoFact with multiple input genomes.
* System: Ubuntu 20.04.1 LTS
* Snakemake 5.5.4
* Python 3.6.4
Command:
```
nohup time snakemake -s Snakefile --configfile config.yaml --cores --use-conda -p &>iso.log&
```
`config.yaml`:
```
(PathoFact) [u@h@PathoFact]$ cat config.yaml
pathofact:
  sample: ["ISO_102-3","ISO_1051","ISO_108-2","ISO_10SP3-6-1","ISO_10SP3-6-3","ISO_130","ISO_138","ISO_167","ISO_174","ISO_194-2","ISO_197","ISO_20-2-1.001","ISO_20-2-1.002","ISO_32-3-3","ISO_83-1","ISO_83-3-2","ISO_95"]
  project: PathoFact_results # requires user input
  datadir: "/mnt/nfs/software/PathoFact/Test" # requires user input
  workflow: "complete" # options: "complete", "AMR", "Tox", "Vir"
  size_fasta: 10000 # adjustable to preference
  scripts: "scripts"
  signalp: "/mnt/nfs/software/signalp-5.0b/bin" # requires user input
  deepvirfinder: "submodules/DeepVirFinder/dvf.py"
  tox_hmm: "databases/toxins/combined_Toxin.hmm"
  tox_lib: "databases/library_HMM_Toxins.csv"
  tox_threshold: 40 # bitscore threshold of the toxin prediction, adjustable by user to preference
  vir_hmm: "databases/virulence/Virulence_factor.hmm"
  vir_domains: "databases/models_and_domains"
  plasflow_threshold: 0.7
  plasflow_minlen: 1000
  runtime:
    short: "00:10:00"
    medium: "01:00:00"
    long: "02:00:00"
  mem:
    normal_mem_per_core_gb: "10G"
    big_mem_cores: 10
    big_mem_per_core_gb: "30G"
```
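Given a sample list this long, a pre-flight check of the input files can save an aborted run. A hypothetical sketch (the `datadir` value and the `.fasta` naming scheme are assumptions taken from the config above, not PathoFact requirements; adjust both to the actual layout):

```shell
# Hypothetical pre-flight check: report samples whose input FASTA is missing.
# The datadir path and ".fasta" suffix are assumptions; match them to config.yaml.
datadir="/mnt/nfs/software/PathoFact/Test"
for s in ISO_102-3 ISO_1051 ISO_108-2; do
  if [ ! -f "${datadir}/${s}.fasta" ]; then
    echo "missing input: ${s}"
  fi
done
```

Running this before `snakemake` turns a mid-workflow failure into an immediate, readable report.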
Logs (partial):
```
17 HMM_VIR_report
17 HMM_correct_format_2
17 HMM_correct_format_2_vir
17 HMM_correct_format_3
17 Plasmid_aggregate
34 Prodigal
17 R_script
17 SignalPN_aggregate
17 SignalPP_aggregate
17 aggregate_RGI
17 aggregate_VirFinder
17 aggregate_VirSorter
17 aggregate_deepARG
17 aggregate_signalP
1 all
17 clean_all
17 combine_AMR
17 combine_AMR_plasmid
17 combine_PathoFact
17 filter_seq
17 format_classifier
17 format_classifier_2
17 generate_ContigTranslation
34 generate_ID
17 generate_contigID
17 generate_translation
17 mapping_file
17 merge_SignalPVir
17 run_VirSorter
17 select
17 splitcontig
17 splitplasmid
34 splitting
17 splittingsignalP
664
767 of 857 steps (89%) done
[Tue May 11 10:01:52 2021]
Finished job 326.
768 of 857 steps (90%) done
[Tue May 11 10:02:28 2021]
Finished job 270.
769 of 857 steps (90%) done
[Tue May 11 10:03:00 2021]
Finished job 278.
770 of 857 steps (90%) done
[Tue May 11 10:03:22 2021]
Finished job 342.
771 of 857 steps (90%) done
[Tue May 11 10:05:56 2021]
Finished job 246.
772 of 857 steps (90%) done
[Tue May 11 10:07:47 2021]
Finished job 294.
773 of 857 steps (90%) done
```
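When a run stalls like this, the last progress line in the log shows how far it got. A small self-contained sketch (it writes a sample log first; against a real run you would point `grep` at `iso.log` instead):

```shell
# Extract the last "X of Y steps" progress line from a Snakemake log.
# A tiny sample log is written first so the snippet is self-contained.
cat > /tmp/iso_sample.log <<'EOF'
767 of 857 steps (89%) done
Finished job 326.
773 of 857 steps (90%) done
EOF
grep 'of .* steps' /tmp/iso_sample.log | tail -n 1
```

The same one-liner against the real log reports the final progress line reached before the hang.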
It blocked at the 90% step, and when I checked the output there was no other error message.
I am looking forward to your reply. Thanks a lot!

Issue #74: PathoFact block in the DVF step
https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/74
2021-07-06, neptuneyt

Dear PathoFact team, thanks for your amazing work. My issue is that I get an unknown block at the *dvf* step when I run PathoFact with a test input genome.
* System: Ubuntu 20.04.1 LTS
* Snakemake 5.5.4
* Python 3.6.4
Snakemake log:
```
Building DAG of jobs...
Updating job 10.
Using shell: /usr/bin/bash
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 HMM_R_VIR
1 HMM_VIR_classification
1 HMM_VIR_finalformat
1 HMM_VIR_report
1 HMM_correct_format_2
1 HMM_correct_format_2_vir
1 HMM_correct_format_3
1 Plasmid_aggregate
1 R_script
1 SignalPN_aggregate
1 SignalPP_aggregate
1 aggregate_RGI
1 aggregate_VirFinder
1 aggregate_deepARG
1 aggregate_signalP
1 all
1 clean_all
1 combine_AMR
1 combine_AMR_plasmid
1 combine_PathoFact
1 filter_seq
1 format_classifier
1 format_classifier_2
2 generate_ID
1 generate_translation
1 mapping_file
1 merge_SignalPVir
1 run_VirFinder
1 select
1 splitplasmid
2 splitting
1 splittingsignalP
34
[Thu Jun 17 10:22:10 2021]
Job 35: Filter samples on length for PlasFlow predictions: PathoFact_results - ISO_DWRC1-3
Reason: Missing output files: /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/plasmid/ISO_DWRC1-3_filtered.fna
scripts/filter.pl 1000 /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/renamed/ISO_DWRC1-3_Contig_ID.fna > /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/plasmid/ISO_DWRC1-3_filtered.fna 2> /mnt/nfs/software/PathoFact/Test/PathoFact_results/logs/ISO_DWRC1-3/plasmid_filtered.log
Activating conda environment: /mnt/nfs/software/Old_PathoFact/.snakemake/conda/f490db85
[Thu Jun 17 10:22:16 2021]
Finished job 35.
1 of 34 steps (3%) done
[Thu Jun 17 10:22:16 2021]
checkpoint splitplasmid:
input: /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/plasmid/ISO_DWRC1-3_filtered.fna
output: /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/MGE/plasmid_splitted/ISO_DWRC1-3/
jobid: 29
reason: Missing output files: /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/MGE/plasmid_splitted/ISO_DWRC1-3/; Input files updated by another job: /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/plasmid/ISO_DWRC1-3_filtered.fna
wildcards: project=PathoFact_results, sample=ISO_DWRC1-3
Downstream jobs will be updated after completion.
python scripts/split.py /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/plasmid/ISO_DWRC1-3_filtered.fna 10000 /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/MGE/plasmid_splitted/ISO_DWRC1-3/
Activating conda environment: /mnt/nfs/software/Old_PathoFact/.snakemake/conda/f490db85
Wrote 5 records to group_1.fasta
Removing temporary output file /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/plasmid/ISO_DWRC1-3_filtered.fna.
Updating job 20.
Creating conda environment envs/PlasFlow.yaml...
Downloading remote packages.
Environment for envs/PlasFlow.yaml created (location: .snakemake/conda/0d925277)
[Thu Jun 17 11:22:31 2021]
Finished job 29.
2 of 33 steps (6%) done
[Thu Jun 17 11:22:32 2021]
Job 41: Executing Deep-VirFinder with 1 threads on the following sample(s): PathoFact_results - ISO_DWRC1-3
Reason: Missing output files: /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/MGE/phage/ISO_DWRC1-3/virfinder/group_1.fasta_gt1bp_dvfpred.txt
python submodules/DeepVirFinder/dvf.py -i /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/contig_splitted/ISO_DWRC1-3/group_1.fasta -o /mnt/nfs/software/PathoFact/Test/PathoFact_results/PathoFact_intermediate/MGE/phage/ISO_DWRC1-3/virfinder -c 1 &> /mnt/nfs/software/PathoFact/Test/PathoFact_results/logs/ISO_DWRC1-3/group_1.fasta_gt1bp_dvfpred.log
Activating conda environment: /mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f
```
It seems blocked at the dvf step, so I checked *group_1.fasta_gt1bp_dvfpred.log*:
```
Using Theano backend.
WARNING (theano.configdefaults): install mkl with `conda install mkl-service`: No module named 'mkl'
OMP: Info #212: KMP_AFFINITY: decoding x2APIC ids.
OMP: Info #210: KMP_AFFINITY: Affinity capable, using global cpuid leaf 11 info
OMP: Info #154: KMP_AFFINITY: Initial OS proc set respected: 0-255
OMP: Info #156: KMP_AFFINITY: 256 available OS procs
OMP: Info #157: KMP_AFFINITY: Uniform topology
OMP: Info #179: KMP_AFFINITY: 2 packages x 64 cores/pkg x 2 threads/core (128 total cores)
OMP: Info #214: KMP_AFFINITY: OS proc to physical thread map:
OMP: Info #171: KMP_AFFINITY: OS proc 0 maps to package 0 core 0 thread 0
OMP: Info #171: KMP_AFFINITY: [... remaining OS proc to package/core/thread mapping lines omitted ...]
OMP: Info #171: KMP_AFFINITY: OS proc 255 maps to package 1 core 63 thread 1
OMP: Info #250: KMP_AFFINITY: pid 2526244 tid 2526244 thread 0 bound to OS proc set 0
1. Loading Models.
model directory /mnt/nfs/software/Old_PathoFact/submodules/DeepVirFinder/models
2. Encoding and Predicting Sequences.
processing line 1
processing line 61512
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/mnt/nfs/software/Old_PathoFact/.snakemake/conda/45939b1f/lib/python3.6/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
```
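Not from the thread, but a common mitigation for hangs like this: the KMP_AFFINITY output shows the process binding across all 256 hardware threads, and capping OpenMP threading before TensorFlow loads sometimes unsticks such jobs. A hedged sketch (the values are illustrative guesses, not PathoFact settings):

```python
import os

# Cap OpenMP/MKL threading before TensorFlow is imported; the exact
# numbers are guesses to tune, not values taken from PathoFact.
os.environ["OMP_NUM_THREADS"] = "8"
os.environ["KMP_AFFINITY"] = "none"
```

Exporting the same variables in the shell (`export OMP_NUM_THREADS=8`) before launching snakemake has the same effect.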
The pipeline has been stuck at the step above for a long time (up to 7 days) and seems unlikely to finish.
Looking forward to your reply, thanks a lot.https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/75CondaVerificationError after installation2021-09-01T13:48:10+02:00AndrewCondaVerificationError after installationHi,
I followed the Installation guide step by step. But when I tried to run PathoFact I got the error:
<details>
<summary>console output for test from guide</summary>
```bash
(PathoFact) anri@anriPC:/media/anri/FAE8F3E0E8F39959/PathoFact-master$ snakemake -s test/Snakefile --use-conda --reason --cores 4 -p
Building DAG of jobs...
Executing subworkflow pathofact.
Building DAG of jobs...
Creating conda environment envs/VirSorter.yaml...
Downloading remote packages.
CreateCondaEnvironmentException:
Could not create conda environment from /media/anri/FAE8F3E0E8F39959/PathoFact-master/rules/AMR/../../envs/VirSorter.yaml:
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
Downloading and Extracting Packages
libcurl-7.69.1 | 573 KB | ########## | 100%
libedit-3.1.20170329 | 172 KB | ########## | 100%
entrez-direct-13.3 | 4.7 MB | ########## | 100%
perl-moose-2.2011 | 444 KB | ########## | 100%
perl-class-method-mo | 13 KB | ########## | 100%
perl-module-implemen | 9 KB | ########## | 100%
perl-mro-compat-0.13 | 10 KB | ########## | 100%
perl-getopt-long-2.5 | 27 KB | ########## | 100%
perl-package-stash-0 | 66 KB | ########## | 100%
perl-sub-name-0.21 | 13 KB | ########## | 100%
perl-apache-test-1.4 | 115 KB | ########## | 100%
perl-package-depreca | 10 KB | ########## | 100%
muscle-3.8.1551 | 280 KB | ########## | 100%
perl-devel-overloadi | 8 KB | ########## | 100%
perl-devel-stacktrac | 16 KB | ########## | 100%
blast-2.9.0 | 17.7 MB | ########## | 100%
perl-sub-install-0.9 | 10 KB | ########## | 100%
perl-yaml-1.29 | 41 KB | ########## | 100%
perl-moo-2.003004 | 38 KB | ########## | 100%
ncurses-6.1 | 1.3 MB | ########## | 100%
perl-params-util-1.0 | 14 KB | ########## | 100%
perl-role-tiny-2.000 | 15 KB | ########## | 100%
mcl-14.137 | 2.2 MB | ########## | 100%
perl-package-stash-x | 21 KB | ########## | 100%
perl-dist-checkconfl | 10 KB | ########## | 100%
perl-eval-closure-0. | 11 KB | ########## | 100%
libssh2-1.8.2 | 257 KB | ########## | 100%
metagene_annotator-1 | 729 KB | ########## | 100%
curl-7.69.1 | 137 KB | ########## | 100%
perl-sub-exporter-pr | 8 KB | ########## | 100%
perl-file-which-1.23 | 12 KB | ########## | 100%
perl-devel-globaldes | 7 KB | ########## | 100%
krb5-1.17.1 | 1.5 MB | ########## | 100%
perl-class-load-0.25 | 12 KB | ########## | 100%
perl-module-runtime- | 7 KB | ########## | 100%
virsorter-1.0.5 | 47 KB | ########## | 100%
perl-bioperl-1.6.924 | 2.2 MB | ########## | 100%
perl-class-load-xs-0 | 13 KB | ########## | 100%
perl-module-runtime- | 15 KB | ########## | 100%
perl-sub-quote-2.006 | 18 KB | ########## | 100%
diamond-0.9.14 | 556 KB | ########## | 100%
perl-sub-exporter-0. | 30 KB | ########## | 100%
perl-parallel-forkma | 21 KB | ########## | 100%
perl-sub-identify-0. | 12 KB | ########## | 100%
openssl-1.1.1f | 2.1 MB | ########## | 100%
perl-data-optlist-0. | 10 KB | ########## | 100%
Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
ERROR conda.core.link:_execute(699): An error occurred while installing package 'conda-forge::perl-5.26.2-h516909a_1006'.
Rolling back transaction: ...working... done
[Errno 22] Invalid argument: '/media/anri/FAE8F3E0E8F39959/PathoFact-master/.snakemake/conda/f8c49d3c/man/man3/App::Cpan.3'
()
```
</details>
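An observation on the `[Errno 22] Invalid argument` above (a hedged guess, not a confirmed diagnosis): the failing file name `App::Cpan.3` contains `:`, which NTFS/FAT volumes — e.g. an external drive mounted under `/media/...` — do not accept in file names. A minimal check of that constraint:

```python
def windows_safe(name):
    """True if a file name avoids the characters NTFS/FAT reject."""
    return not any(c in name for c in '<>:"/\\|?*')

print(windows_safe("App::Cpan.3"))  # False: ':' is rejected
print(windows_safe("App-Cpan.3"))   # True
```

If the mount turns out to be ntfs or exfat, moving the PathoFact checkout (including `.snakemake/conda`) to a native Linux filesystem should avoid the invalid-argument error.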
<details>
<summary>console output for my data</summary>
```bash
anri@anriPC:/media/anri/FAE8F3E0E8F39959/PathoFact-master$ conda activate PathoFact
(PathoFact) anri@anriPC:/media/anri/FAE8F3E0E8F39959/PathoFact-master$ snakemake -s Snakefile --use-conda --reason --cores 4 -p
Building DAG of jobs...
Creating conda environment envs/Biopython.yaml...
Downloading remote packages.
CreateCondaEnvironmentException:
Could not create conda environment from /media/anri/FAE8F3E0E8F39959/PathoFact-master/rules/Universal/../../envs/Biopython.yaml:
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
Preparing transaction: ...working... done
Verifying transaction: ...working... failed
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/data/boston_house_prices.csv'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/data/breast_cancer.csv'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/data/iris.csv'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/data/linnerud_exercise.csv'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/data/linnerud_physiological.csv'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/data/wine_data.csv'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/boston_house_prices.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/breast_cancer.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/california_housing.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/covtype.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/diabetes.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/digits.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/iris.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/kddcup99.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/lfw.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/linnerud.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/olivetti_faces.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/rcv1.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/twenty_newsgroups.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/descr/wine_data.rst'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/images/china.jpg'
specified in the package manifest cannot be found.
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/datasets/images/flower.jpg'
specified in the package manifest cannot be found.
SafetyError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/neighbors/ball_tree.cpython-36m-x86_64-linux-gnu.so'
has an incorrect size.
reported size: 538712 bytes
actual size: 241152 bytes
CondaVerificationError: The package for scikit-learn located at /home/anri/anaconda3/pkgs/scikit-learn-0.21.3-py36hcdab131_0
appears to be corrupted. The path 'lib/python3.6/site-packages/sklearn/utils/sparsefuncs_fast.cpython-36m-x86_64-linux-gnu.so'
specified in the package manifest cannot be found.
```
</details>
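For context (an addition, hedged): a `CondaVerificationError` means the cached package under `~/anaconda3/pkgs` no longer matches its `info/paths.json` manifest. A sketch of that check — the directory layout follows conda's packaging convention, while the function name is illustrative:

```python
import json
import os

def missing_paths(pkg_dir):
    """Return manifest paths from info/paths.json missing on disk."""
    with open(os.path.join(pkg_dir, "info", "paths.json")) as fh:
        manifest = json.load(fh)
    return [entry["_path"] for entry in manifest["paths"]
            if not os.path.exists(os.path.join(pkg_dir, entry["_path"]))]
```

Deleting the corrupted package directory (or running `conda clean --packages`) and re-creating the environment forces a fresh, verified download.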
What's wrong?https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/76hmm.R2021-09-06T12:31:47+02:00Matthew Macowanhmm.RHi, I just got this error, and I'm wondering if it is similar to previously closed issues [#62 (closed)](https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/62) and [#39 (closed)](https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/39).
```bash
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 16
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 HMM_R_VIR
1
[Sat Jul 31 01:59:24 2021]
rule HMM_R_VIR:
input: /fs03/mf33/Matt/sunbeam_cf_shotgun/sunbeam_output/assembly/contigs/PathoFact_results/PathoFact_intermediate/VIRULENCE/HMM_virulence/181s4_S72-co$
output: /fs03/mf33/Matt/sunbeam_cf_shotgun/sunbeam_output/assembly/contigs/PathoFact_results/PathoFact_intermediate/VIRULENCE/HMM_virulence/181s4_S72-c$
log: /fs03/mf33/Matt/sunbeam_cf_shotgun/sunbeam_output/assembly/contigs/PathoFact_results/logs/181s4_S72-contigs/hmm_results.log
jobid: 0
wildcards: project=PathoFact_results, sample=181s4_S72-contigs
Rscript --vanilla /fs03/mf33/Matt/PathoFact/.snakemake/scripts/tmp02b99mr2.hmm.R
Activating conda environment: /fs03/mf33/Matt/PathoFact/.snakemake/conda/7950189c
/fs03/mf33/Matt/PathoFact/.snakemake/conda/7950189c/lib/R/bin/R: line 240: /fs03/mf33/Matt/PathoFact/.snakemake/conda/7950189c/lib/R/etc/ldpaths: No such f$
[Sat Jul 31 01:59:50 2021]
Error in rule HMM_R_VIR:
jobid: 0
output: /fs03/mf33/Matt/sunbeam_cf_shotgun/sunbeam_output/assembly/contigs/PathoFact_results/PathoFact_intermediate/VIRULENCE/HMM_virulence/181s4_S72-c$
log: /fs03/mf33/Matt/sunbeam_cf_shotgun/sunbeam_output/assembly/contigs/PathoFact_results/logs/181s4_S72-contigs/hmm_results.log (check log file(s) for$
conda-env: /fs03/mf33/Matt/PathoFact/.snakemake/conda/7950189c
RuleException:
CalledProcessError in line 87 of /fs03/mf33/Matt/PathoFact/rules/Virulence/Virulence.smk:
Command 'source /scratch/of33/mmacowan/miniconda/bin/activate '/fs03/mf33/Matt/PathoFact/.snakemake/conda/7950189c'; set -euo pipefail; Rscript --vanilla $
File "/fs03/mf33/Matt/PathoFact/rules/Virulence/Virulence.smk", line 87, in __rule_HMM_R_VIR
File "/scratch/of33/mmacowan/miniconda/envs/PathoFact/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
```
It looks like `line 87` refers to script `hmm.R`.
The `hmm_results.log` file for the particular instance that caused the error was present but empty, however this is the contents of that file for one of the other samples:
```bash
Registered S3 methods overwritten by 'ggplot2':
method from
[.quosures rlang
c.quosures rlang
print.quosures rlang
Registered S3 method overwritten by 'rvest':
method from
read_xml.response xml2
── Attaching packages ─────────────────────────────────────── tidyverse 1.2.1 ──
✔ ggplot2 3.1.1 ✔ purrr 0.3.2
✔ tibble 2.1.1 ✔ dplyr 0.8.0.1
✔ tidyr 0.8.3 ✔ stringr 1.4.0
✔ readr 1.3.1 ✔ forcats 0.4.0
── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
✖ dplyr::filter() masks stats::filter()
✖ dplyr::lag() masks stats::lag()
Attaching package: 'reshape2'
The following object is masked from 'package:tidyr':
smiths
```
Have you encountered a similar issue? Any help would be greatly appreciated.
Thanks,
Matthewhttps://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/77Feature request: include ARO in output reports2021-09-01T14:13:10+02:00Benjamin WingfieldFeature request: include ARO in output reportsHello,
Thanks for making PathoFact, it's awesome!
I ran into a small problem matching up output reports to the CARD database. Sometimes gene names change over time. It would be easier if we had access to `BEST_ARO_HIT` in the output reports :smile:
Thanks,
BenLaura DeniesLaura Denieshttps://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/78ownHMM_library.R script fails with empty input files2021-11-12T13:27:17+01:00Susheel BusiownHMM_library.R script fails with empty input files`rule R_script` from the `Combine_Toxin_SignalP.smk` throws an error (below) if the input file is empty:
```
RuleException:
CalledProcessError in line 29 of /mnt/irisgpfs/users/sbusi/apps/PathoFact/rules/Toxin/Combine_Toxin_SignalP.smk:
Command 'source /scratch/users/sbusi/tools/miniconda3/bin/activate '/mnt/irisgpfs/users/sbusi/apps/PathoFact/.snakemake/conda/c6514c6c'; set -euo pipefail; Rscript --vanilla /mnt/irisgpfs/users/sbusi/apps/PathoFact/.snakemake/scripts/tmpxk7hk0_a.ownHMM_library.R' returned non-zero exit status 1.
File "/mnt/irisgpfs/users/sbusi/apps/PathoFact/rules/Toxin/Combine_Toxin_SignalP.smk", line 29, in __rule_R_script
File "/scratch/users/sbusi/tools/miniconda3/envs/PathoFact/lib/python3.6/concurrent/futures/thread.py", line 56, in run
```
**Failed file example:**
```
==> /work/projects/amrwd/results/PathoFact/PathoFact_space_output/PathoFact_intermediate/TOXIN/HMM_toxin/IG4SW-M_F1_L4_NORM_GC.Input_HMM_R.csv <==
Query_sequence HMM_Name Significance_Evalue Score
#Toxin
```
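A workaround sketch (an addition, not PathoFact code): skip header-only result files like the failed example above before handing them to `ownHMM_library.R`. It treats the `Query_sequence` header and the `#Toxin` marker as non-data rows:

```python
def has_hmm_hits(path):
    """True if an HMM result file has at least one data row."""
    with open(path) as fh:
        return any(line.strip()
                   and not line.startswith(("#", "Query_sequence"))
                   for line in fh)
```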
**Working file example:**
```
==> /work/projects/amrwd/results/PathoFact/PathoFact_space_output/PathoFact_intermediate/TOXIN/HMM_toxin/IIF1SW-M_F2_L1_NORM_FLT.Input_HMM_R.csv <==
Query_sequence HMM_Name Significance_Evalue Score
#Toxin
0000010215 K01114_780_95_420 3.8e-78 262.5
0000013630 K01186_172 5.1e-103 344.2
0000012319 K01186_172 1.7e-08 33.1
0000019322 K01197_243 0.29 8.8
0000018125 K03699_483 2.8e-112 374.6
0000018124 K03699_483 2.3e-71 239.7
0000010016 K03699_483 1.8e-63 213.7
0000011228 K03699_483 1.5e-62 210.6
```Laura DeniesLaura Denieshttps://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/79Adding information in the final PathoFact report2022-06-01T23:43:08+02:00ntromasAdding information in the final PathoFact reportHi!
I wonder if it would be possible to add some extra information in the PathoFact report, for example AMR database (deepARG or RGI), as this information is already in the AMR report.
Thanks !https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/80Question about execution Sna2021-10-29T14:51:39+02:00adolfoislaQuestion about execution SnaDear, I performed the installation and test/test_config.yaml executed correctly, but after configuring the config.yaml with my data and executing ~/PathoFact$ snakemake -s Snakefile --use-conda --reason --cores 64 -p I got the following error message:
Building DAG of jobs...
MissingInputException in line 33 of /home/superserver/PathoFact/Snakefile:
Missing input files for rule check_AMR_MGE:
/home/superserver/PathoFact/output_expected/PathoFact_report/AMR_MGE_prediction_CP011849_report.tsv
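A hedged aside (not from the thread): a MissingInputException like this usually means a sample name in config.yaml — here `CP011849` — does not match any input file Snakemake can find. A pre-check sketch; the helper, the `samples` list, and the `.fasta` layout are assumptions, not PathoFact conventions:

```python
import glob
import os

def missing_samples(samples, fasta_dir):
    """Config sample names with no matching FASTA file in fasta_dir."""
    present = {os.path.splitext(os.path.basename(p))[0]
               for p in glob.glob(os.path.join(fasta_dir, "*.fasta"))}
    return [s for s in samples if s not in present]
```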
regardshttps://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/82Error in rule Plasmid_aggregate2021-11-02T17:11:45+01:00Ella KantorError in rule Plasmid_aggregateHello, I have been having an issue where the program terminates with an error and I am not sure why. Here is the log file and it looks like the error is in rule Plasmid_aggregate, jobid: 1094. The command I used is `nohup snakemake -s Snakefile --use-conda --reason --cores 12 -p`
[2021-10-07T162426.322356.snakemake.log](/uploads/806d6a534564348fcf8985eb557421bb/2021-10-07T162426.322356.snakemake.log)https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/83Finished NCBI genomes and PathoFact2022-09-21T07:04:47+02:00vkisandFinished NCBI genomes and PathoFactHi,
I have used PathoFact successfully on my own WGSs containing several contigs. However, now I wanted to test it on finished genomes of similar strains downloaded from NCBI. Whitespaces are deleted from names, but the "complete" run freezes, I would say rather randomly. Re-runs may continue a bit but still never finish. "Vir" and "Tox" finished, but "AMR" froze too.
Any ideas? I increased the mem settings a bit:
mem:
  normal_mem_per_core_gb: "4G"
  big_mem_cores: 12
  big_mem_per_core_gb: "64G"
cheers,
/vhttps://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/84Error in rule run_VirSorter:2022-01-27T02:51:28+01:00yuzie0314Error in rule run_VirSorter:Dear @laura.denies,
Recently I have been trying out your tool PathoFact with the test data you provided.
I installed your tool and its dependencies according to the instructions in your ReadMe file.
However, I frequently ran into the error shown below (snakemake.log):
![image](/uploads/a8d72901d7b1ab60cf05944dd4cdcb71/image.png)
I also found the log in "test/output_test/logs/test_sample/VIRSorter_global-phage-signal.log" : [/usr/bin/bash: line 1: wrapper_phage_contigs_sorter_iPlant.pl: command not found]
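A diagnostic sketch (a hedged addition): "command not found" for the VirSorter wrapper suggests either the conda environment was not activated or its installation rolled back, so it can help to check whether the script exists anywhere under `.snakemake/conda`:

```python
import os

def find_script(root, name="wrapper_phage_contigs_sorter_iPlant.pl"):
    """Paths under root whose file name matches the wrapper script."""
    return [os.path.join(d, name)
            for d, _dirs, files in os.walk(root) if name in files]
```

An empty result for `find_script(".snakemake/conda")` would point at a broken environment creation rather than a PATH problem.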
Do you have any idea why such an issue pops up when testing your tool?
Thank You in advance.
Yu[2022-01-25T064727.536530.snakemake.log](/uploads/0884d9f32814d1f15a6d5edf434938f6/2022-01-25T064727.536530.snakemake.log)https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/88testing module fails deepARG- how to update deepARG within PathoFact?2022-09-19T15:23:39+02:00RahilRydertesting module fails deepARG- how to update deepARG within PathoFact?I'm having the same issue as issue #64, where the solution was to update deepARG 1.0.2 on Bitbucket. I am using Ubuntu to run PathoFact. I tried updating deepARG using the GitHub page but am unsure what I would need to do within PathoFact to get it updated. I updated it and moved the files into the PathoFact folder where the deepARG files are, but this did not fix the problem. I've been able to run the toxin and virulence reports successfully. Appreciate any help!
"/mnt/d/Metagenomics_trial/PathoFact/.snakemake/conda/e658ef70/bin/deeparg", line 5, in
  from deeparg.entry import main
File "/mnt/d/Metagenomics_trial/PathoFact/.snakemake/conda/e658ef70/lib/python2.7/site-packages/deeparg/entry.py", line 10, in
  import deeparg.predict.bin.deepARG as clf
File "/mnt/d/Metagenomics_trial/PathoFact/.snakemake/conda/e658ef70/lib/python2.7/site-packages/deeparg/predict/bin/deepARG.py", line 21, in
  from deeparg.predict.bin import process_blast
File "/mnt/d/Metagenomics_trial/PathoFact/.snakemake/conda/e658ef70/lib/python2.7/site-packages/deeparg/predict/bin/deeparg.py", line 21, in
  from deeparg.predict.bin import process_blast
ImportError: No module named predict.binhttps://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/91Error in rule CTDT:2023-09-13T10:12:25+02:00fconstanciasError in rule CTDT:Dear PathoFact developers,
Thanks for your effort developing this very useful tool.
Could you clarify the different branches (`orf_coinf`, `orf`, `PathoFact_v`, `master`, ...)?
I am using PathoFact with already annotated contigs, i.e. ORFs (`https://gitlab.lcsb.uni.lu/laura.denies/PathoFact/-/tree/orf_coinf`), and I prepared the necessary files (.fna, .faa, .contig) using the script: https://raw.githubusercontent.com/fconstancias/NRP72-FBT/master/functions/prepare_faa_fna_PAthoFact_orf.Rscript.R
I am using the enclosed [config.yaml](/uploads/6bab28ae40b5b022704a7bd305167531/config.yaml)
I ran the pipeline using `snakemake -s Snakefile --use-conda --reason --cores 10 -p --configfile analyses/config.yaml`
Everything went smoothly until the following error:
```
[Fri Jul 1 16:05:57 2022]
Finished job 321.
277 of 1075 steps (26%) done
[Fri Jul 1 16:05:57 2022]
Job 324: Identify features (CTDT) on the following sample(s): PathoFact_results - chicken1_l10000
Reason: Missing output files: /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/PathoFact_intermediate/VIRULENCE/classifier_virulence/chicken1_l10000/group_22_CTDT.txt
python scripts/CTDT.py --file /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/splitted/chicken1_l10000/group_22.fasta --out /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/PathoFact_intermediate/VIRULENCE/classifier_virulence/chicken1_l10000/group_22_CTDT.txt &> /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/logs/chicken1_l10000/group_22_CTDT.log
Activating conda environment: /datadrive05/Flo/tools/orf_coinf/.snakemake/conda/67994cc8
Activating conda environment: /datadrive05/Flo/tools/orf_coinf/.snakemake/conda/67994cc8
[Fri Jul 1 16:05:59 2022]
Error in rule CTDT:
jobid: 319
output: /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/PathoFact_intermediate/VIRULENCE/classifier_virulence/chicken1_l10000/group_6_CTDT.txt
log: /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/logs/chicken1_l10000/group_6_CTDT.log (check log file(s) for error message)
conda-env: /datadrive05/Flo/tools/orf_coinf/.snakemake/conda/67994cc8
shell:
python scripts/CTDT.py --file /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/splitted/chicken1_l10000/group_6.fasta --out /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/PathoFact_intermediate/VIRULENCE/classifier_virulence/chicken1_l10000/group_6_CTDT.txt &> /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/logs/chicken1_l10000/group_6_CTDT.log
(exited with non-zero exit code)
Removing temporary output file /datadrive05/Flo/tools/orf_coinf/analyses/PathoFact_results/PathoFact_intermediate/VIRULENCE/classifier_virulence/chicken1_l10000/group_49_matrix.tsv.
[Fri Jul 1 16:06:00 2022]
Finished job 66.
```
Please find enclosed the logs: [2022-07-01T155547.324708.snakemake.log](/uploads/2142c321b536feddf05ad97afc16537c/2022-07-01T155547.324708.snakemake.log) [group_6_CTDT.log](/uploads/f415d5ad31b6e24516751a131e2001bc/group_6_CTDT.log) as well as the [group_6.fasta.gz](/uploads/e507e6dd8e0187f3d67195b07a4bcad7/group_6.fasta.gz)
```
conda --version
conda 4.12.0
```
[environment.yml](/uploads/bac774322dc73e9f67996ebf53b26f01/environment.yml)
It seems specific to the `VIR` workflow since `AMR` and `Tox` worked fine on the same dataset.
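One hypothetical pre-check before re-running (a guess at the cause, not a confirmed one): descriptor scripts such as `CTDT.py` typically index residues over a fixed amino-acid alphabet, so sequences containing non-standard characters can make them fail. A minimal scan of a protein FASTA for such residues:

```python
# Hypothetical pre-check: flag FASTA records containing residues outside
# the 20 standard amino acids, which can break descriptor calculations.
STANDARD = set("ACDEFGHIKLMNPQRSTVWY")

def odd_residues(fasta_text):
    """Map each offending header to the set of non-standard characters found."""
    bad, header = {}, None
    for line in fasta_text.splitlines():
        if line.startswith(">"):
            header = line[1:]
        elif header is not None:
            extra = set(line.upper()) - STANDARD
            if extra:
                bad.setdefault(header, set()).update(extra)
    return bad

result = odd_residues(">ok\nMKV\n>weird\nMKX*\n")
print({h: sorted(s) for h, s in result.items()})  # {'weird': ['*', 'X']}
```

Running this over the enclosed `group_6.fasta` (after decompressing) might surface stop codons (`*`) or ambiguity codes that the failing group contains and the successful groups do not.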
Thanks for the support!

https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/92
How to run PathoFact on Fasta file
2022-09-19T15:07:54+02:00 harelarik

Hi,
I have a fasta file of few proteins or DNA sequences, I have removed spaces from fasta headers.
PathoFact is installed in my linux system (version- Ubuntu 18.04).
conda version: 4.8.4 snakemake version 5.5.4
**How do I run PathoFact to scan the fasta file?**
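As a side note on the header clean-up mentioned above (removing spaces from FASTA headers), here is a minimal, hypothetical sketch of that step, assuming whitespace should be collapsed to underscores so downstream tools keep the full identifier:

```python
# Minimal sketch of FASTA header clean-up: replace whitespace runs in
# header lines with underscores; sequence lines are left untouched.
def clean_fasta_headers(text, sep="_"):
    out = []
    for line in text.splitlines():
        if line.startswith(">"):
            line = sep.join(line.split())  # collapse all whitespace runs
        out.append(line)
    return "\n".join(out)

print(clean_fasta_headers(">seq 1 some description\nMKV\n"))
# >seq_1_some_description
# MKV
```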
Thank you,
Arik

https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/93
NA in MGE prediction
2022-10-17T13:56:57+02:00 ntromas

Hi PathoFact team,
I ran PathoFact with AMR in the workflow and the run ended without error. However, the AMR report contains only NA in the MGE prediction column. My guess is that there is an issue with the AMR_MGE.R script. I used the latest version, so I am not sure why I got this. DvF and VirSorter worked properly, and so did PlasFlow.
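For what it's worth, an all-NA column often points to an identifier mismatch between the tables being joined. A small illustrative sketch (the ID formats below are invented, not PathoFact's actual ones):

```python
# Invented IDs for illustration: the AMR report lists ORF-level IDs, while
# MGE tools such as PlasFlow report contig-level IDs. Joining on mismatched
# keys fills the merged column with "NA" for every row.
mge_prediction = {"contig_1": "plasmid"}

def mge_for_orf(orf_id):
    # Looking up the full ORF ID against contig-level keys yields "NA"...
    direct = mge_prediction.get(orf_id, "NA")
    # ...while stripping the ORF suffix recovers the contig ID.
    contig = orf_id.rsplit("_", 1)[0]
    return direct, mge_prediction.get(contig, "NA")

print(mge_for_orf("contig_1_orf1"))  # ('NA', 'plasmid')
```

Comparing the ID columns of the intermediate AMR and MGE tables by hand might show whether this is what AMR_MGE.R runs into.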
Thanks for the help,
Nico

https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/94
CalledProcessError in line 29 of Combine_Toxin_SignalP.smk
2022-09-19T14:59:34+02:00 Ghost User

I ran PathoFact with `snakemake -s Snakefile --use-conda --rerun-incomplete --cores 28 -p` and encountered a CalledProcessError. The run is not interrupted, but it has remained at the current step for two days now (with an average CPU usage of 15). I have seen another user experiencing a similar issue (https://gitlab.lcsb.uni.lu/laura.denies/PathoFact/-/issues/90), but all my FASTA files contain more than one contig. What effect does the CalledProcessError have on the final outcome? Thanks!
```
RuleException:
CalledProcessError in line 29 of /pathofact/PathoFact/rules/Toxin/Combine_Toxin_SignalP.smk:
Command 'source /miniconda3/bin/activate '/pathofact/PathoFact/.snakemake/conda/fb1f3485'; set -euo pipefail; Rscript --vanilla /pathofact/PathoFact/.snakemake/scripts/tmp23lfmhsd.ownHMM_library.R' returned non-zero exit status 1.
File "/protect/pathofact/PathoFact/rules/Toxin/Combine_Toxin_SignalP.smk", line 29, in __rule_R_script
File "/miniconda3/envs/PathoFact/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Removing output files of failed job R_script since they might be corrupted:
/pathofact/contigs_raw_files/PathoFact_results/PathoFact_report/Toxin_gene_library_ABC.tsv
[Sun Jul 24 10:14:54 2022]
Finished job 37616.
31250 of 33263 steps (94%) done
[Sun Jul 24 10:14:58 2022]
Finished job 289.
31251 of 33263 steps (94%) done
```

https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/95
Error in rule run_deepARG
2023-02-20T14:59:08+01:00 WoCer2019

Thanks for the nice tool!
Recently I was running PathoFact with the test data. However, I was confronted with an error shown as follows (group_1.out.mapping.ARG.log).
```
Traceback (most recent call last):
File "/data/LJ/software/pathofact/PathoFact/.snakemake/conda/a3c12cf1/bin/deeparg", line 5, in <module>
from deeparg.entry import main
File "/data/LJ/software/pathofact/PathoFact/.snakemake/conda/a3c12cf1/lib/python2.7/site-packages/deeparg/entry.py", line 9, in <module>
import deeparg.predict.bin.deepARG as clf
File "/data/LJ/software/pathofact/PathoFact/.snakemake/conda/a3c12cf1/lib/python2.7/site-packages/deeparg/predict/bin/deepARG.py", line 12, in <module>
from lasagne import layers
File "/data/LJ/software/pathofact/PathoFact/.snakemake/conda/a3c12cf1/lib/python2.7/site-packages/lasagne/__init__.py", line 13, in <module>
""")
ImportError: Could not import Theano.
Please make sure you install a recent enough version of Theano. See
section 'Install from PyPI' in the installation docs for more details:
http://lasagne.readthedocs.org/en/latest/user/installation.html#install-from-pypi
```
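This traceback means `lasagne` could not import Theano inside deepARG's Python 2.7 conda environment. A small, generic diagnostic one could run inside that environment to surface the underlying import error that lasagne otherwise hides (nothing PathoFact-specific is assumed here):

```python
# Generic diagnostic: try each module and report the raw import error,
# since lasagne replaces Theano's own error with a short message.
import importlib

def check_imports(modules):
    """Return {module_name: None} on success or {module_name: error message}."""
    status = {}
    for name in modules:
        try:
            importlib.import_module(name)
            status[name] = None
        except Exception as exc:  # keep the real reason for the failure
            status[name] = str(exc)
    return status

for mod, err in check_imports(["theano", "lasagne"]).items():
    print(mod, "OK" if err is None else "FAILED: " + err)
```

The message printed for `theano` (e.g. a missing compiler or an incompatible NumPy) is usually more actionable than the lasagne wrapper text.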
Thank you in advance.

https://git-r3lab.uni.lu/laura.denies/PathoFact/-/issues/100
Convert PathoFact AMR_MGE_report.tsv output to GFF
2023-04-14T09:18:45+02:00 theadrija

Is there any way I can convert the .tsv report of AMR_MGE to GFF?
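Not an official feature as far as I know, but a minimal sketch of such a conversion. The column names below (`Contig`, `ORF`, `Start`, `End`, `AMR_category`) are invented and would need to be adapted to the actual header of `AMR_MGE_report.tsv`:

```python
# Hypothetical TSV-to-GFF3 conversion. GFF3 uses 9 tab-separated columns:
# seqid, source, type, start, end, score, strand, phase, attributes.
import csv
import io

def tsv_to_gff3(text):
    lines = ["##gff-version 3"]
    for row in csv.DictReader(io.StringIO(text), delimiter="\t"):
        attrs = "ID={};amr_category={}".format(row["ORF"], row["AMR_category"])
        lines.append("\t".join([row["Contig"], "PathoFact", "CDS",
                                row["Start"], row["End"], ".", ".", ".", attrs]))
    return "\n".join(lines)

tsv = "Contig\tORF\tStart\tEnd\tAMR_category\nc1\tc1_orf1\t10\t250\tbeta-lactam\n"
print(tsv_to_gff3(tsv))
```

Strand and coordinates are not in every PathoFact report, so the `.` placeholders (and possibly a join against the Prodigal GFF output) would be needed for a complete feature line.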