WORKSHOP: Unlocking nf-core - customising workflows for your research
This record includes training materials associated with the Australian BioCommons workshop ‘Unlocking nf-core - customising workflows for your research’. The workshop took place over two 3-hour sessions on 18-19 May 2023.
Event description
Processing and analysing omics datasets poses many...
Keywords: Bioinformatics, Workflows, Nextflow, nf-core
WORKSHOP: Unlocking nf-core - customising workflows for your research
https://zenodo.org/records/8026170
https://dresa.org.au/materials/workshop-unlocking-nf-core-customising-workflows-for-your-research-1584ff39-e007-4422-9fd5-4e407df6b6c5
This record includes training materials associated with the Australian BioCommons workshop ‘Unlocking nf-core - customising workflows for your research’. The workshop took place over two 3-hour sessions on 18-19 May 2023.
Event description
Processing and analysing omics datasets poses many challenges to life scientists, particularly when we need to share our methods with other researchers and scale up our research. Public and reproducible bioinformatics workflows, like those developed by nf-core, are invaluable resources for the life science community.
nf-core is a community-driven effort to provide high-quality bioinformatics workflows for common analyses including RNA-seq, mapping, variant calling, and single-cell transcriptomics. A big advantage of using nf-core workflows is the ability to customise and optimise them for different computational environments, data types and sizes, and research goals.
This workshop will set you up with the foundational knowledge required to run and customise nf-core workflows in a reproducible manner. On day 1 you will learn about the nf-core tools utility and step through the code structure of nf-core workflows. Then on day 2, using the nf-core/rnaseq workflow as an example, you will explore the various ways to adjust workflow parameters, customise processes, and configure the workflow for your computational environment.
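As an illustration of the kind of customisation covered on day 2, the sketch below launches nf-core/rnaseq with a small custom configuration file from Python. It is a generic example rather than part of the workshop materials: the pinned release, profile, resource values and file paths are placeholder assumptions.

# Minimal sketch (not taken from the workshop materials): run nf-core/rnaseq with a
# custom configuration. Release, profile, resources and paths are placeholders.
import subprocess
from pathlib import Path

# A small custom config overriding resources for one process (placeholder values).
custom_config = """
process {
    withName: 'STAR_ALIGN' {
        cpus   = 8
        memory = '32.GB'
    }
}
"""
Path("custom.config").write_text(custom_config)

# Assumes Nextflow and a container engine (e.g. Singularity) are already installed.
subprocess.run(
    [
        "nextflow", "run", "nf-core/rnaseq",
        "-r", "3.11.1",                # pin a workflow release for reproducibility
        "-profile", "singularity",
        "-c", "custom.config",         # apply the custom configuration file
        "--input", "samplesheet.csv",  # placeholder sample sheet
        "--outdir", "results",
    ],
    check=True,
)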
This workshop event and accompanying materials were developed by the Sydney Informatics Hub, University of Sydney in partnership with Seqera Labs, Pawsey Supercomputing Research Centre, and Australia’s National Research Education Network (AARNet). The workshop was enabled through the Australian BioCommons - Bring Your Own Data Platforms project (Australian Research Data Commons and NCRIS via Bioplatforms Australia).
Materials
Materials are shared under a Creative Commons Attribution 4.0 International agreement unless otherwise specified and were current at the time of the event.
Files and materials included in this record:
Event metadata (PDF): Information about the event including description, event URL, learning objectives, prerequisites, technical requirements, etc.
Index of training materials (PDF): List and description of all materials associated with this event including the name, format, location and a brief description of each file.
nfcore_Schedule: Schedule for the workshop providing a breakdown of topics and timings.
nfcore_Q_and_A: Archive of questions and their answers from the workshop Slack channel.
Materials shared elsewhere:
This workshop follows the accompanying training materials that were developed by the Sydney Informatics Hub, University of Sydney in partnership with Seqera Labs, Pawsey Supercomputing Research Centre, and Australia’s National Research Education Network (AARNet).
https://sydney-informatics-hub.github.io/customising-nfcore-workshop
Melissa Burke (melissa@biocommons.org.au)
Samaha, Georgina (orcid: 0000-0003-0419-1476)
Willet, Cali (orcid: 0000-0001-8449-1502)
Hakkaart, Chris (orcid: 0000-0001-5007-2684)
Beecroft, Sarah (orcid: 0000-0002-3935-2279)
Stott, Audrey (orcid: 0000-0003-0939-3173)
Ip, Alex (orcid: 0000-0001-8937-8904)
Cooke, Steele
Bioinformatics, Workflows, Nextflow, nf-core
ARDC FAIR Data 101 self-guided
FAIR Data 101 v3.0 is a self-guided course covering the FAIR Data principles
The FAIR Data 101 virtual course was designed and delivered by the ARDC Skilled Workforce Program twice in 2020 and has now been reworked as a self-guided course.
The course structure was based on 'FAIR Data in the...
Keywords: training material, FAIR data, video, webinar, activities, quiz, FAIR, research data management
ARDC FAIR Data 101 self-guided
https://zenodo.org/records/5094034
https://dresa.org.au/materials/ardc-fair-data-101-self-guided-2d794a84-f0ff-4e11-a39c-fa8ea481e097
FAIR Data 101 v3.0 is a self-guided course covering the FAIR Data principles
The FAIR Data 101 virtual course was designed and delivered by the ARDC Skilled Workforce Program twice in 2020 and has now been reworked as a self-guided course.
The course structure was based on 'FAIR Data in the Scholarly Communications Lifecycle', run by Natasha Simons at the FORCE11 Scholarly Communications Institute. These training materials are hosted on GitHub.
contact@ardc.edu.au
Stokes, Liz (orcid: 0000-0002-2973-5647)
Liffers, Matthias (orcid: 0000-0002-3639-2080)
Burton, Nichola (orcid: 0000-0003-4470-4846)
Martinez, Paula A. (orcid: 0000-0002-8990-1985)
Simons, Natasha (orcid: 0000-0003-0635-1998)
Russell, Keith (orcid: 0000-0001-5390-2719)
McCafferty, Siobhann (orcid: 0000-0002-2491-0995)
Ferrers, Richard (orcid: 0000-0002-2923-9889)
McEachern, Steve (orcid: 0000-0001-7848-4912)
Barlow, Melanie (orcid: 0000-0002-3956-5784)
Brady, Catherine (orcid: 0000-0002-7919-7592)
Brownlee, Rowan (orcid: 0000-0002-1955-1262)
Honeyman, Tom (orcid: 0000-0001-9448-4023)
Quiroga, Maria del Mar (orcid: 0000-0002-8943-2808)
training material, FAIR data, video, webinar, activities, quiz, FAIR, research data management
HPC file systems and what users need to consider for appropriate and efficient usage
Three videos on miscellaneous aspects of HPC usage - useful reference for new users of HPC systems.
1 – General overview of different file systems that might be available on HPC. The video goes through shared file systems such as /home and /scratch, local compute node file systems (local...
Keywords: HPC, high performance computer, File systems
Resource type: video, presentation
HPC file systems and what users need to consider for appropriate and efficient usage
https://www.youtube.com/watch?v=cNW7F9V1plA&list=PLjlLx279X4yO62jHF4rd7I9iEfbnz3Ts1
https://dresa.org.au/materials/hpc-file-systems-and-what-users-need-to-consider-for-appropriate-and-efficient-usage
Three videos on miscellaneous aspects of HPC usage - useful reference for new users of HPC systems.
1 – General overview of different file systems that might be available on HPC. The video goes through shared file systems such as /home and /scratch, local compute node file systems (local scratch or $TMPDIR) and the storage file system. It outlines what users need to consider if they wish to use any of these in their workflows.
2 – Overview of the different directories that might be present on HPC. These could include /home, /scratch, /opt, /lib and /lib64, /sw and others.
3 – Overview of the message-of-the-day (MOTD) file, the message displayed to users every time they log in. It provides general help information and often notes current problems or upcoming outages.
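As a concrete illustration of the local-scratch pattern discussed in the first video, the sketch below stages an input file to node-local $TMPDIR, works on it there, and copies the result back to shared storage. It is a generic example, not part of the recorded material; the paths, and the assumption that the scheduler exports $TMPDIR, are placeholders.

# Generic staging example (not from the videos): work through node-local scratch.
# Assumes the scheduler exports TMPDIR and that the /scratch paths exist.
import os
import shutil
from pathlib import Path

shared_input  = Path("/scratch/project/input.dat")      # placeholder shared file system path
shared_output = Path("/scratch/project/results")        # placeholder results directory
local_scratch = Path(os.environ.get("TMPDIR", "/tmp"))  # node-local scratch: fast, but not shared

# 1. Copy the input from the shared file system to fast local scratch.
local_input = local_scratch / shared_input.name
shutil.copy2(shared_input, local_input)

# 2. Do the I/O-heavy work against local scratch (placeholder transformation).
local_result = local_scratch / "output.dat"
local_result.write_bytes(local_input.read_bytes()[::-1])

# 3. Copy results back to shared storage before the job ends; local scratch is purged.
shared_output.mkdir(parents=True, exist_ok=True)
shutil.copy2(local_result, shared_output / local_result.name)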
QCIF Training (training@qcif.edu.au)
Marlies Hankel
HPC, high performance computer, File systems
Basic Linux/Unix commands
A series of eight videos (each between 5 and 10 minutes long) following the content of the Software Carpentry workshop "The Unix Shell".
Sessions 1, 2 and 3 provide instructions on the minimal level of Linux/Unix commands recommended for new...
Keywords: HPC, high performance computer, Unix, Linux, Software Carpentry
Resource type: video, guide
Basic Linux/Unix commands
https://www.youtube.com/playlist?list=PLjlLx279X4yP5GodfbqQTJuJ1S9EJU3GM
https://dresa.org.au/materials/basic-linux-unix-commands
A series of eight videos (each between 5 and 10 minutes long) following the content of the Software Carpentry workshop "The Unix Shell" (https://swcarpentry.github.io/shell-novice/).
Sessions 1, 2 and 3 provide instructions on the minimal set of Linux/Unix commands recommended for new users of HPC.
1 – An overview of how to find out where a user is in the file system, how to list the files there, and how to get help on Unix commands
2 – How to move around the file system and change into other directories
3 – Explains the difference between an absolute and a relative path
4 – Overview of how to create new directories, and how to create and edit new files with nano
5 – How to use the vi editor to edit files
6 – Overview of the file viewers available
7 – How to copy and move files and directories
8 – How to remove files and directories
Further details and exercises with solutions can be found on the Software Carpentry "The Unix Shell" page (https://swcarpentry.github.io/shell-novice/).
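For reference alongside the shell-based videos, the short sketch below performs the same basic operations (finding the current directory, listing, creating, viewing, copying, moving and removing files and directories) using Python's standard library; the file and directory names are arbitrary examples.

# The basic filesystem operations covered in the videos, expressed with the Python
# standard library. File and directory names are arbitrary examples.
import shutil
from pathlib import Path

cwd = Path.cwd()
print("Current directory:", cwd)            # like `pwd`
print("Contents:", list(cwd.iterdir()))     # like `ls`

workdir = cwd / "demo_dir"
workdir.mkdir(exist_ok=True)                # like `mkdir demo_dir`

note = workdir / "notes.txt"
note.write_text("hello\n")                  # like creating a file with nano or vi
print(note.read_text())                     # like viewing the file with `cat`

copy = workdir / "notes_copy.txt"
shutil.copy2(note, copy)                    # like `cp`
copy.rename(workdir / "notes_moved.txt")    # like `mv`

shutil.rmtree(workdir)                      # like `rm -r demo_dir`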
QCIF Training (training@qcif.edu.au)
Marlies Hankel
HPC, high performance computer, Unix, Linux, Software Carpentry
Transferring files and data
A short video outlining the basics of how to use FileZilla to establish a secure file transfer protocol (SFTP) connection to an HPC system and use a drag-and-drop interface to transfer files between the HPC system and a desktop computer.
Keywords: sftp, file transfer, HPC, high performance computer
Resource type: video, guide
Transferring files and data
https://www.youtube.com/watch?v=9ABMxcKqfkQ&list=PLjlLx279X4yP3eTLu0S6nOt0HQ7XRf6WF
https://dresa.org.au/materials/transferring-files-and-data
A short video outlining the basics of how to use FileZilla to establish a secure file transfer protocol (SFTP) connection to an HPC system and use a drag-and-drop interface to transfer files between the HPC system and a desktop computer.
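FileZilla does this through a graphical interface; for comparison, the sketch below performs an equivalent SFTP upload and download programmatically with the third-party paramiko library. The hostname, username, key file and file names are placeholders, not details taken from the video.

# Equivalent SFTP transfer done programmatically with paramiko (pip install paramiko).
# Hostname, username, key file and file names below are placeholders.
import os
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    hostname="hpc.example.edu.au",                      # placeholder HPC login node
    username="your_username",                           # placeholder account name
    key_filename=os.path.expanduser("~/.ssh/id_rsa"),   # or password=... if keys are not set up
)

sftp = client.open_sftp()
sftp.put("local_data.csv", "remote_data.csv")   # upload: desktop -> HPC
sftp.get("results.tar.gz", "results.tar.gz")    # download: HPC -> desktop
sftp.close()
client.close()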
QCIF Training (training@qcif.edu.au)
Marlies Hankel
sftp, file transfer, HPC, high performance computer
Connecting to HPC
A series of three short videos introducing how to use PuTTY to connect from a Windows PC to a secure HPC (high performance computing) cluster.
1 - The very basics on how to establish a connection to HPC.
2 - How to add more specific options for the connection to HPC.
3 - How to save the...
Keywords: HPC, high performance computer, ssh
Resource type: video, guide
Connecting to HPC
https://www.youtube.com/playlist?list=PLjlLx279X4yPJBVQuIRhz1CVMfQpTuvZW
https://dresa.org.au/materials/connecting-to-hpc
A series of three short videos introducing how to use PuTTY to connect from a Windows PC to a secure HPC (high performance computing) cluster.
1 - The very basics on how to establish a connection to HPC.
2 - How to add more specific options for the connection to HPC.
3 - How to save the details and options for a connection for future use.
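PuTTY is a graphical Windows SSH client; for comparison, the sketch below opens the same kind of SSH connection from Python using the third-party paramiko library and runs a single command on the cluster. The hostname, username and key path are placeholders, not details taken from the videos.

# SSH connection analogous to a PuTTY session, done with paramiko (pip install paramiko).
# Hostname, username and key path are placeholders.
import os
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    hostname="hpc.example.edu.au",                      # placeholder HPC login node
    username="your_username",                           # placeholder account name
    key_filename=os.path.expanduser("~/.ssh/id_rsa"),   # or password=... for password login
)

# Run a command on the cluster, print its output, then disconnect.
stdin, stdout, stderr = client.exec_command("hostname && uptime")
print(stdout.read().decode())
client.close()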
QCIF Training (training@qcif.edu.au)
Marlies Hankel
HPC, high performance computer, ssh