Improving configs for OpenMS dev and DIANN versions over 1.8.1 #441

Merged 28 commits, Nov 23, 2024

Commits (28)
4085296
remove big-nodes.config
ypriverol Nov 17, 2024
db259e8
move diann 1.9.1beta to config
ypriverol Nov 17, 2024
78d35e9
quantms-utils 0.0.11 -> 0.0.12
ypriverol Nov 17, 2024
d9f9fe2
remove diann version
ypriverol Nov 18, 2024
6e4ef56
pride slurm added
ypriverol Nov 18, 2024
4b4aff0
pride slurm added
ypriverol Nov 18, 2024
26eb83f
diann 1.9.1dev -> 1.9.2
ypriverol Nov 18, 2024
a814355
diann 1.9.1dev -> 1.9.2
ypriverol Nov 18, 2024
a0bad9e
diann 1.9.1dev -> 1.9.2
ypriverol Nov 18, 2024
2c2380a
private containers login
ypriverol Nov 18, 2024
7ea1f4c
private containers login
ypriverol Nov 18, 2024
4512b41
private containers login
ypriverol Nov 18, 2024
70ce58e
private containers login
ypriverol Nov 18, 2024
2c1829c
private containers login
ypriverol Nov 18, 2024
5d77377
private containers login
ypriverol Nov 18, 2024
a626a2a
private containers login
ypriverol Nov 18, 2024
c328512
add diann_private.yml
ypriverol Nov 18, 2024
82c6318
add diann_private.yml
ypriverol Nov 18, 2024
3f2dce5
add diann_private.yml
ypriverol Nov 18, 2024
d50c0f3
Merge pull request #46 from bigbio/dev
daichengxin Nov 19, 2024
2cdd220
Merge pull request #47 from ypriverol/dev
daichengxin Nov 19, 2024
9e0487a
compatible with different formats
daichengxin Nov 19, 2024
4c6a8cf
Merge pull request #2 from daichengxin/dev
ypriverol Nov 21, 2024
440a182
quantms-utils 0.0.12 -> 0.0.13
ypriverol Nov 21, 2024
df037e2
quantms-utils 0.0.13 -> 0.0.14
ypriverol Nov 22, 2024
effa1ba
quantms-utils 0.0.12 -> 0.0.15
ypriverol Nov 21, 2024
6485495
Merge remote-tracking branch 'origin/dev' into dev
ypriverol Nov 23, 2024
c663c2f
quantms-utils 0.0.14 -> 0.0.15
ypriverol Nov 23, 2024
2 changes: 0 additions & 2 deletions .github/workflows/ci.yml
@@ -48,8 +48,6 @@ jobs:
- NXF_VER: "latest-everything"
exec_profile: "conda"
include:
- test_profile: test_latest_dia
exec_profile: "singularity"
- test_profile: test_lfq
exec_profile: "conda"
- test_profile: test_dda_id
86 changes: 86 additions & 0 deletions .github/workflows/diann_private.yml
@@ -0,0 +1,86 @@
name: Fork-specific CI for test_latest_dia

# Trigger only for your fork and the 'test_latest_dia' test profile
on:
push:
branches:
- dev
pull_request:
branches:
- dev
paths:
- "**.yml"

env:
NXF_ANSI_LOG: false

concurrency:
group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"
cancel-in-progress: true

jobs:
test:
name: Run test_latest_dia with Docker
runs-on: ubuntu-latest
env:
NXF_ANSI_LOG: false
CAPSULE_LOG: none
TEST_PROFILE: test_latest_dia
EXEC_PROFILE: docker

# Ensure this workflow only runs for your fork
if: ${{ github.repository == 'ypriverol/quantms' }}

steps:
- name: Check out pipeline code
uses: actions/checkout@v4

- name: Set up Nextflow
uses: nf-core/setup-nextflow@v2
with:
version: "24.04.2" # Or the Nextflow version you prefer

- name: Log in to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ secrets.GHCR_USERNAME }}
password: ${{ secrets.GHCR_TOKEN }}

- name: Run test_latest_dia with Docker
run: |
nextflow run ${GITHUB_WORKSPACE} -profile $TEST_PROFILE,$EXEC_PROFILE --outdir ${TEST_PROFILE}_${EXEC_PROFILE}_results

- name: Gather failed logs
if: failure() || cancelled()
run: |
mkdir failed_logs
failed=$(grep "FAILED" ${TEST_PROFILE}_${EXEC_PROFILE}_results/pipeline_info/execution_trace.txt | cut -f 2)
while read -r line ; do cp $(ls work/${line}*/*.log) failed_logs/ || true ; done <<< "$failed"

- uses: actions/upload-artifact@v4
if: failure() || cancelled()
name: Upload failed logs
with:
name: failed_logs_${{ matrix.test_profile }}_${{ matrix.exec_profile }}
include-hidden-files: true
path: failed_logs
overwrite: false

- uses: actions/upload-artifact@v4
if: always()
name: Upload results
with:
name: ${{ matrix.test_profile }}_${{ matrix.exec_profile }}_results
include-hidden-files: true
path: ${{ matrix.test_profile }}_${{ matrix.exec_profile }}_results
overwrite: false

- uses: actions/upload-artifact@v4
if: always()
name: Upload log
with:
name: nextflow_${{ matrix.test_profile }}_${{ matrix.exec_profile }}.log
include-hidden-files: true
path: .nextflow.log
overwrite: false
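The "Gather failed logs" step above greps the Nextflow trace for failed tasks and copies their work-directory logs. A standalone sketch of that logic, run against a mock trace; the two-column trace and `work/` layout here are simplifications of the real `execution_trace.txt` format, and all names are placeholders:

```shell
# Mock up a trace file and one failed task's work directory.
mkdir -p failed_logs work/ab/cdef123
printf 'status\thash\nFAILED\tab/cdef123\n' > execution_trace.txt
echo "error detail" > work/ab/cdef123/task.log

# Same pattern as the workflow step: extract the hash column for FAILED rows,
# then copy each matching work-directory log into failed_logs/.
failed=$(grep "FAILED" execution_trace.txt | cut -f 2)
while read -r line ; do cp work/${line}*/*.log failed_logs/ || true ; done <<< "$failed"
ls failed_logs
```

The `|| true` keeps the loop going when a failed task left no log behind.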
2 changes: 1 addition & 1 deletion conf/dev.config
@@ -18,7 +18,7 @@ params {

process {
withLabel: openms {
conda = "openms::openms-thirdparty=3.2.0"
conda = "bioconda::openms-thirdparty=3.2.0" // The conda package is not the nightly version
container = {"${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ? 'ghcr.io/openms/openms-executables-sif:latest' : 'ghcr.io/openms/openms-executables:latest' }"}
}
}
60 changes: 60 additions & 0 deletions conf/pride_codon_slurm.config
@@ -0,0 +1,60 @@
/*
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Nextflow config file for EMBL-EBI Codon Cluster for the SLURM login nodes
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Author: Yasset Perez-Riverol
Mail: yperez@ebi.ac.uk
URL: https://www.ebi.ac.uk/
Based on: https://github.com/nf-core/configs/blob/master/conf/ebi_codon_slurm.config
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
*/

params {
config_profile_contact = "Yasset Perez-Riverol"
config_profile_description = "The European Bioinformatics Institute HPC cluster (codon) profile for the SLURM login nodes"
config_profile_url = "https://www.ebi.ac.uk/"
}

singularity {
enabled = true
// the default is 20 minutes and fails with large images
pullTimeout = "3 hours"
autoMounts = false
runOptions = '-B /hps/nobackup/juan/pride/reanalysis:/hps/nobackup/juan/pride/reanalysis'
cacheDir = "/hps/nobackup/juan/pride/reanalysis/singularity/"
}

process {
// this is to avoid errors for missing files due to shared filesystem latency
maxRetries = 30
errorStrategy = { task.exitStatus == 0 ? "retry" : "terminate" }
cache = "lenient"
afterScript = "sleep 60"

withName:PROTEOMICSLFQ{
memory = {(mzmls as List).size() < 200 ? 72.GB * task.attempt : 250.GB * task.attempt }
cpus = {(mzmls as List).size() < 200 ? 12 * task.attempt : 24 * task.attempt }
}

withName:ASSEMBLE_EMPIRICAL_LIBRARY{
memory = {(ms_files as List).size() < 200 ? 72.GB * task.attempt : 250.GB * task.attempt}
cpus = {(ms_files as List).size() < 200 ? 12 * task.attempt : 24 * task.attempt }
}

withLabel: diann {
container = '/hps/nobackup/juan/pride/reanalysis/singularity/ghcr.io-bigbio-diann-1.9.2.sif'
}
}

executor {
name = "slurm"
queueSize = 2000
submitRateLimit = "10/1sec"
exitReadTimeout = "30 min"
jobName = {
task.name
.replace("[", "(")
.replace("]", ")")
.replace(" ", "_")
}
}
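The `jobName` closure above rewrites Nextflow task names before they are submitted, replacing brackets and spaces, presumably to keep the names SLURM-friendly. A minimal shell equivalent with a made-up task name:

```shell
# Shell equivalent of the jobName closure: translate [ ] and spaces to ( ) _.
# The task name below is a hypothetical example, not a real quantms task.
task_name='QUANTMS:PROTEOMICSLFQ [sample 1]'
job_name=$(printf '%s' "$task_name" | tr '[] ' '()_')
echo "$job_name"
```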
23 changes: 14 additions & 9 deletions conf/test_latest_dia.config
@@ -10,14 +10,6 @@
------------------------------------------------------------------------------------------------
*/

process {
resourceLimits = [
cpus: 4,
memory: '12.GB',
time: '48.h'
]
}

params {
config_profile_name = 'Test profile for latest DIA'
config_profile_description = 'Minimal test dataset to check pipeline function for the data-independent acquisition pipeline branch for latest DIA-NN.'
@@ -27,7 +19,6 @@ params {
// Input data
input = 'https://raw.githubusercontent.com/nf-core/test-datasets/quantms/testdata/dia_ci/PXD026600.sdrf.tsv'
database = 'https://raw.githubusercontent.com/nf-core/test-datasets/quantms/testdata/dia_ci/REF_EColi_K12_UPS1_combined.fasta'
diann_version = '1.9.beta.1'
min_pr_mz = 350
max_pr_mz = 950
min_fr_mz = 500
@@ -47,5 +38,19 @@ process {
withName: 'NFCORE_QUANTMS:QUANTMS:FILE_PREPARATION:THERMORAWFILEPARSER' {
publishDir = [path: { "${params.outdir}/${task.process.tokenize(':')[-1].toLowerCase()}" }, pattern: "*.log" ]
}

withLabel: diann {
container = 'ghcr.io/bigbio/diann:1.9.2' // This docker container is private, for quantms use
}

resourceLimits = [
cpus: 4,
memory: '12.GB',
time: '48.h'
]

}

singularity.enabled = false // Force the use of docker
docker.enabled = true
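The test profile above forces Docker and points the `diann` label at `ghcr.io/bigbio/diann:1.9.2`, while the SLURM profile earlier points the same label at a local SIF. A sketch of the engine-based selection pattern the quantms modules use; the engine values and the helper function are illustrative assumptions, not pipeline code:

```shell
# Sketch of quantms' container-selection pattern: singularity (without the
# docker-pull override) gets a SIF image, anything else gets the Docker tag.
select_container() {
  engine="$1"; pull_docker="${2:-false}"   # hypothetical inputs
  if [ "$engine" = "singularity" ] && [ "$pull_docker" = "false" ]; then
    echo "ghcr.io-bigbio-diann-1.9.2.sif"
  else
    echo "ghcr.io/bigbio/diann:1.9.2"
  fi
}
select_container docker        # → ghcr.io/bigbio/diann:1.9.2
select_container singularity   # → ghcr.io-bigbio-diann-1.9.2.sif
```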

6 changes: 3 additions & 3 deletions modules/local/add_sage_feat/main.nf
@@ -2,10 +2,10 @@ process SAGEFEATURE {
tag "$meta.mzml_id"
label 'process_low'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"

input:
tuple val(meta), path(id_file), path(extra_feat)
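The quantms-utils bump (0.0.11 → 0.0.15) repeats the same pin change across several modules. A hypothetical consistency check that would catch a missed file; the module layout below is a mock, not the real tree:

```shell
# Mock two module files that pin quantms-utils, then verify they agree.
mkdir -p modules/local/add_sage_feat modules/local/diannconvert
echo 'conda "bioconda::quantms-utils=0.0.15"' > modules/local/add_sage_feat/main.nf
echo 'conda "bioconda::quantms-utils=0.0.15"' > modules/local/diannconvert/main.nf

# A single line of output means every module agrees on the version.
grep -rho 'quantms-utils=[0-9.]*' modules/local | sort -u
```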
9 changes: 3 additions & 6 deletions modules/local/assemble_empirical_library/main.nf
@@ -1,15 +1,12 @@
process ASSEMBLE_EMPIRICAL_LIBRARY {
tag "$meta.experiment_id"
label 'process_low'
label 'diann'

container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://containers.biocontainers.pro/s3/SingImgsRepo/diann/v1.8.1_cv1/diann_v1.8.1_cv1.img' :
'docker.io/biocontainers/diann:v1.8.1_cv1' }"

if (params.diann_version == "1.9.beta.1") {
container 'https://ftp.pride.ebi.ac.uk/pub/databases/pride/resources/tools/ghcr.io-bigbio-diann-1.9.1dev.sif'
}

input:
// In this step the real files are passed, and not the names
path(ms_files)
@@ -18,7 +15,7 @@ process ASSEMBLE_EMPIRICAL_LIBRARY {
path(lib)

output:
path "empirical_library.tsv", emit: empirical_library
path "empirical_library.*", emit: empirical_library
path "assemble_empirical_library.log", emit: log
path "versions.yml", emit: versions

@@ -48,7 +45,7 @@
diann --f ${(ms_files as List).join(' --f ')} \\
--lib ${lib} \\
--threads ${task.cpus} \\
--out-lib empirical_library.tsv \\
--out-lib empirical_library \\
--verbose $params.diann_debug \\
--rt-profiling \\
--temp ./quant/ \\
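In the diann call above, the `--f` arguments are built by joining the staged file names (`--f ${(ms_files as List).join(' --f ')}`), and `--out-lib` now drops the `.tsv` extension to match the widened `empirical_library.*` output glob. A shell sketch of that argument construction, with placeholder file names:

```shell
# Build the repeated --f arguments the way the Groovy join does.
# run1.mzML etc. are placeholders, not real inputs.
ms_files="run1.mzML run2.mzML run3.mzML"
args=""
for f in $ms_files; do args="$args --f $f"; done
echo "diann$args --out-lib empirical_library"
```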
5 changes: 1 addition & 4 deletions modules/local/diann_preliminary_analysis/main.nf
@@ -1,15 +1,12 @@
process DIANN_PRELIMINARY_ANALYSIS {
tag "$ms_file.baseName"
label 'process_high'
label 'diann'

container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://containers.biocontainers.pro/s3/SingImgsRepo/diann/v1.8.1_cv1/diann_v1.8.1_cv1.img' :
'docker.io/biocontainers/diann:v1.8.1_cv1' }"

if (params.diann_version == "1.9.beta.1") {
container 'https://ftp.pride.ebi.ac.uk/pub/databases/pride/resources/tools/ghcr.io-bigbio-diann-1.9.1dev.sif'
}

input:
tuple val(meta), path(ms_file), path(predict_library)

6 changes: 3 additions & 3 deletions modules/local/diannconvert/main.nf
@@ -2,10 +2,10 @@ process DIANNCONVERT {
tag "$meta.experiment_id"
label 'process_medium'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"

input:
path(report)
5 changes: 1 addition & 4 deletions modules/local/diannsummary/main.nf
@@ -1,15 +1,12 @@
process DIANNSUMMARY {
tag "$meta.experiment_id"
label 'process_high'
label 'diann'

container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://containers.biocontainers.pro/s3/SingImgsRepo/diann/v1.8.1_cv1/diann_v1.8.1_cv1.img' :
'docker.io/biocontainers/diann:v1.8.1_cv1' }"

if (params.diann_version == "1.9.beta.1") {
container 'https://ftp.pride.ebi.ac.uk/pub/databases/pride/resources/tools/ghcr.io-bigbio-diann-1.9.1dev.sif'
}

input:
// Note that the files are passed as names and not paths, this prevents them from being staged
// in the directory
6 changes: 3 additions & 3 deletions modules/local/extract_psm/main.nf
@@ -2,10 +2,10 @@ process PSMCONVERSION {
tag "$meta.mzml_id"
label 'process_medium'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"


input:
6 changes: 3 additions & 3 deletions modules/local/extract_sample/main.nf
@@ -2,10 +2,10 @@ process GETSAMPLE {
tag "$design.Name"
label 'process_low'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"


input:
6 changes: 3 additions & 3 deletions modules/local/generate_diann_cfg/main.nf
@@ -2,10 +2,10 @@ process GENERATE_DIANN_CFG {
tag "$meta.experiment_id"
label 'process_low'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"

input:
val(meta)
5 changes: 1 addition & 4 deletions modules/local/individual_final_analysis/main.nf
@@ -1,15 +1,12 @@
process INDIVIDUAL_FINAL_ANALYSIS {
tag "$ms_file.baseName"
label 'process_high'
label 'diann'

container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://containers.biocontainers.pro/s3/SingImgsRepo/diann/v1.8.1_cv1/diann_v1.8.1_cv1.img' :
'docker.io/biocontainers/diann:v1.8.1_cv1' }"

if (params.diann_version == "1.9.beta.1") {
container 'https://ftp.pride.ebi.ac.uk/pub/databases/pride/resources/tools/ghcr.io-bigbio-diann-1.9.1dev.sif'
}

input:
tuple val(meta), path(ms_file), path(fasta), path(diann_log), path(library)

6 changes: 3 additions & 3 deletions modules/local/ms2rescore/main.nf
@@ -2,10 +2,10 @@ process MS2RESCORE {
tag "$meta.mzml_id"
label 'process_high'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"

// userEmulation settings when docker is specified
containerOptions = (workflow.containerEngine == 'docker') ? '-u $(id -u) -e "HOME=${HOME}" -v /etc/passwd:/etc/passwd:ro -v /etc/shadow:/etc/shadow:ro -v /etc/group:/etc/group:ro -v $HOME:$HOME' : ''
6 changes: 3 additions & 3 deletions modules/local/mzmlstatistics/main.nf
@@ -3,10 +3,10 @@ process MZMLSTATISTICS {
label 'process_very_low'
label 'process_single'

conda "bioconda::quantms-utils=0.0.11"
conda "bioconda::quantms-utils=0.0.15"
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.11--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/quantms-utils:0.0.15--pyhdfd78af_0' :
'biocontainers/quantms-utils:0.0.15--pyhdfd78af_0' }"

input:
tuple val(meta), path(ms_file)