8 materials found

Keywords: Data visualisation or Research Computing


WORKSHOP: R: fundamental skills for biologists

This record includes training materials associated with the Australian BioCommons workshop ‘R: fundamental skills for biologists’. This workshop took place over four, three-hour sessions on 1, 8, 15 and 22 June 2022.

Event description

Biologists need data analysis skills to be able to...

Keywords: Bioinformatics, Analysis, Statistics, R software, RStudio, Data visualisation

WORKSHOP: R: fundamental skills for biologists
https://dresa.org.au/materials/workshop-r-fundamental-skills-for-biologists

This record includes training materials associated with the Australian BioCommons workshop ‘R: fundamental skills for biologists’. This workshop took place over four, three-hour sessions on 1, 8, 15 and 22 June 2022.

**Event description**

Biologists need data analysis skills to be able to interpret, visualise and communicate their research results. While Excel can cover some data analysis needs, there is a better choice, particularly for large and complex datasets. R is a free, open-source software and programming language that enables data exploration, statistical analysis, visualisation and more. The large variety of R packages available for analysing biological data makes it a robust and flexible option for data of all shapes and sizes.

Getting started can be a little daunting for those without a background in statistics and programming. In this workshop we will equip you with the foundations for getting the most out of R and RStudio, an interactive way of structuring and keeping track of your work in R. Using biological data from a model of influenza infection, you will learn how to efficiently and reproducibly organise, read, wrangle, analyse, visualise and generate reports from your data in R.

Topics covered in this workshop include:
- Spreadsheets, organising data and first steps with R
- Manipulating and analysing data with dplyr
- Data visualisation
- Summarized experiments and getting started with Bioconductor

This workshop is presented by the Australian BioCommons and Saskia Freytag from WEHI, with the assistance of a network of facilitators from the national Bioinformatics Training Cooperative. Materials are shared under a Creative Commons Attribution 4.0 International agreement unless otherwise specified and were current at the time of the event.

**Files and materials included in this record:**
- Event metadata (PDF): Information about the event including description, event URL, learning objectives, prerequisites, technical requirements etc.
- Index of training materials (PDF): List and description of all materials associated with this event, including the name, format, location and a brief description of each file.
- Schedule (PDF): A breakdown of the topics and timings for the workshop
- Recommended resources (PDF): A list of resources recommended by trainers and participants
- Q_and_A (PDF): Archive of questions and their answers from the workshop Slack channel.

**Materials shared elsewhere:**
This workshop follows the tutorial ‘Introduction to data analysis with R and Bioconductor’, which is publicly available at https://saskiafreytag.github.io/biocommons-r-intro/. It is derived from material produced as part of The Carpentries Incubator project: https://carpentries-incubator.github.io/bioc-intro/

Contact: Melissa Burke (melissa@biocommons.org.au)

Keywords: Bioinformatics, Analysis, Statistics, R software, RStudio, Data visualisation
Unix Shell and Command Line Basics

The Unix environment is incredibly powerful but quite daunting to the newcomer. Command line confidence unlocks powerful computing resources beyond the desktop, including virtual machines and High Performance Computing. It enables repetitive tasks to be automated. And it comes with a swag of...

Keywords: Research Computing, Unix

Unix Shell and Command Line Basics
https://dresa.org.au/materials/unix-shell-and-command-line-basics

The Unix environment is incredibly powerful but quite daunting to the newcomer. Command line confidence unlocks powerful computing resources beyond the desktop, including virtual machines and High Performance Computing. It enables repetitive tasks to be automated. And it comes with a swag of handy tools that can be combined in powerful ways. Getting started is the hardest part, but our helpful instructors are there to demystify Unix as you get to work running programs and writing scripts on the command line.

Every attendee is given a dedicated training environment for the duration of the workshop, with all software and data fully loaded and ready to run.

We teach this course within a GNU/Linux environment, which is best characterised as a Unix-like environment. We teach how to run commands within the Bash shell. The skills you'll learn in this course are generally transferable to other Unix environments.

#### You'll learn:
- Navigate and work with files and directories (folders)
- Use a selection of essential tools
- Combine data and tools to build a processing workflow
- Automate repetitive analysis using the command line

#### Prerequisites:
The course has no prerequisites.

**For more information, please click [here](https://intersect.org.au/training/course/unix101).**

Contact: training@intersect.org.au

Keywords: Research Computing, Unix
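
For a flavour of what "combining tools" and "automating repetitive analysis" look like in practice, here is a minimal Bash sketch. The file names and column layout are hypothetical illustrations, not files from the course.

```bash
#!/bin/bash
# Build a small pipeline: take the first comma-separated column,
# sort it, and count how often each value appears.
cut -d ',' -f 1 samples.csv | sort | uniq -c | sort -rn > sample_counts.txt

# Automate a repetitive task: report the number of data rows
# (excluding the header line) in every CSV file in this directory.
for file in *.csv; do
    rows=$(( $(wc -l < "$file") - 1 ))
    echo "$file: $rows data rows"
done
```

Each command used here (cut, sort, uniq, wc) is a standard Unix tool of the kind the course introduces; the pipe (|) is what lets them be combined into a workflow.
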
From PC to Cloud or High Performance Computing

Most of you would have heard of Cloud and High Performance Computing (HPC), or you may already be using them. HPC is not the same as cloud computing. The two technologies differ in a number of ways, and have some similarities as well.

We may refer to both types as “large scale computing” - but what...

Keywords: Research Computing

From PC to Cloud or High Performance Computing
https://dresa.org.au/materials/from-pc-to-cloud-or-high-performance-computing

Most of you would have heard of Cloud and High Performance Computing (HPC), or you may already be using them. HPC is not the same as cloud computing. The two technologies differ in a number of ways, and have some similarities as well. We may refer to both types as “large scale computing”, but what is the difference? Both systems target scalability of computing, but in different ways. This webinar gives a good overview for researchers thinking of moving from their local computer to the Cloud or a High Performance Computing cluster.

#### You'll learn:
- Introduction
- HPC vs Cloud computing
- When to use HPC
- When to use the Cloud
- The Cloud – Pros and Cons
- HPC – Pros and Cons

#### Prerequisites:
The webinar has no prerequisites.

**For more information, please click [here](https://intersect.org.au/training/course/compute001).**

Contact: training@intersect.org.au

Keywords: Research Computing
Getting started with HPC using PBS Pro

Is your computer's limited power throttling your research ambitions? Are your analysis scripts pushing your laptop's processor to its limits? Is your software crashing because you’ve run out of memory? Would you like to unleash the power of the Unix command line to automate and run your analysis...

Keywords: Research Computing, HPC

Getting started with HPC using PBS Pro
https://dresa.org.au/materials/getting-started-with-hpc-using-pbs-pro

Is your computer's limited power throttling your research ambitions? Are your analysis scripts pushing your laptop's processor to its limits? Is your software crashing because you’ve run out of memory? Would you like to unleash the power of the Unix command line to automate and run your analysis on supercomputers that you can access for free?

High-Performance Computing (HPC) allows you to accomplish your analysis faster by using many parallel CPUs and huge amounts of memory simultaneously. This course provides a hands-on introduction to running software on HPC infrastructure using PBS Pro. It is the PBS Pro version of the Getting Started with HPC course.

#### You'll learn:
- Connect to an HPC cluster
- Use the Unix command line to operate a remote computer and create job scripts
- Submit and manage jobs on a cluster using a scheduler
- Transfer files to and from a remote computer
- Use software through environment modules
- Use parallelisation to speed up data analysis
- Access the facilities available to you as a researcher

#### Prerequisites:
This course assumes basic familiarity with the Bash command line environment found on GNU/Linux and other Unix-like environments. To come up to speed, consider taking our [Unix Shell and Command Line Basics](https://intersect.org.au/training/course/unix101/) course.

**For more information, please click [here](https://intersect.org.au/training/course/hpc201).**

Contact: training@intersect.org.au

Keywords: Research Computing, HPC
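
To illustrate the "job scripts" and "scheduler" steps, here is a minimal sketch of a PBS Pro job script. The job name, resource requests, module and analysis command are hypothetical placeholders, not settings taken from the course materials.

```bash
#!/bin/bash
# Illustrative PBS Pro job script; adjust resources, queues and modules
# to whatever your own cluster provides.
#PBS -N my_analysis                  # job name
#PBS -l select=1:ncpus=4:mem=8gb     # 1 node, 4 CPU cores, 8 GB of memory
#PBS -l walltime=01:00:00            # give up after 1 hour

cd "$PBS_O_WORKDIR"                  # run from the directory the job was submitted in

module load python/3.10              # hypothetical environment module

python analyse.py input.csv          # hypothetical analysis program
```

A script like this is submitted with `qsub job.sh`, monitored with `qstat`, and cancelled with `qdel <jobid>`.
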
Getting started with HPC using Slurm

Is your computer's limited power throttling your research ambitions? Are your analysis scripts pushing your laptop's processor to its limits? Is your software crashing because you’ve run out of memory? Would you like to unleash the power of the Unix command line to automate and run your analysis...

Keywords: Research Computing, HPC

Getting started with HPC using Slurm
https://dresa.org.au/materials/getting-started-with-hpc-using-slurm

Is your computer's limited power throttling your research ambitions? Are your analysis scripts pushing your laptop's processor to its limits? Is your software crashing because you’ve run out of memory? Would you like to unleash the power of the Unix command line to automate and run your analysis on supercomputers that you can access for free?

High-Performance Computing (HPC) allows you to accomplish your analysis faster by using many parallel CPUs and huge amounts of memory simultaneously. This course provides a hands-on introduction to running software on HPC infrastructure using Slurm. It is the Slurm version of the Getting Started with HPC course.

#### You'll learn:
- Connect to an HPC cluster
- Use the Unix command line to operate a remote computer and create job scripts
- Submit and manage jobs on a cluster using a scheduler
- Transfer files to and from a remote computer
- Use software through environment modules
- Use parallelisation to speed up data analysis
- Access the facilities available to you as a researcher

#### Prerequisites:
This course assumes basic familiarity with the Bash command line environment found on GNU/Linux and other Unix-like environments. To come up to speed, consider taking our [Unix Shell and Command Line Basics](https://intersect.org.au/training/course/unix101/) course.

**For more information, please click [here](https://intersect.org.au/training/course/hpc202).**

Contact: training@intersect.org.au

Keywords: Research Computing, HPC
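
The Slurm equivalent of the PBS Pro sketch above looks like this; again, the job name, resources, module and program are hypothetical placeholders rather than course settings.

```bash
#!/bin/bash
# Illustrative Slurm job script; adjust resources, partitions and modules
# to whatever your own cluster provides.
#SBATCH --job-name=my_analysis
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4        # 4 CPU cores
#SBATCH --mem=8G                 # 8 GB of memory
#SBATCH --time=01:00:00          # give up after 1 hour

module load python/3.10          # hypothetical environment module

python analyse.py input.csv      # hypothetical analysis program
```

A script like this is submitted with `sbatch job.sh`, monitored with `squeue -u $USER`, and cancelled with `scancel <jobid>`.
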
Parallel Programming for HPC

You have written, compiled and run functioning programs in C and/or Fortran. You know how HPC works and you've submitted batch jobs.

Now you want to move from writing single-threaded programs into the parallel programming paradigm, so you can truly harness the full power of High Performance...

Keywords: Research Computing, HPC

Parallel Programming for HPC
https://dresa.org.au/materials/parallel-programming-for-hpc

You have written, compiled and run functioning programs in C and/or Fortran. You know how HPC works and you've submitted batch jobs. Now you want to move from writing single-threaded programs into the parallel programming paradigm, so you can truly harness the full power of High Performance Computing.

#### You'll learn:
- OpenMP (Open Multi-Processing): a widespread method for shared memory programming
- MPI (Message Passing Interface): a leading distributed memory programming model

#### Prerequisites:
To do this course you need to have:
- A good working knowledge of HPC. Consider taking our Getting Started with HPC using PBS Pro course to come up to speed beforehand.
- Prior experience of writing programs in either C or Fortran.

**For more information, please click [here](https://intersect.org.au/training/course/hpc301).**

Contact: training@intersect.org.au

Keywords: Research Computing, HPC
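
For a rough sense of how the two models differ in day-to-day use, here is a Bash sketch of building and running each kind of program. The source file names are placeholders, and the commands assume GCC and an MPI implementation such as Open MPI are installed; none of this is taken from the course materials.

```bash
#!/bin/bash
# OpenMP (shared memory): one process running several threads on one node.
gcc -fopenmp my_openmp_program.c -o my_openmp_program
OMP_NUM_THREADS=4 ./my_openmp_program     # choose the thread count at run time

# MPI (distributed memory): several communicating processes,
# possibly spread across nodes.
mpicc my_mpi_program.c -o my_mpi_program
mpirun -np 4 ./my_mpi_program             # launch 4 MPI processes
```

On a cluster, commands like these would normally sit inside a batch job script of the kind shown in the HPC entries above.
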
Heurist Tutorials

A set of video tutorials with accompanying walkthroughs for building your first Heurist database and website. The first three tutorials show you how to get started in Heurist. The five subsequent tutorials introduce you to the five main menus in the Heurist interface.

Keywords: Heurist, Data management, Data visualisation, Digital Humanities, Databasing, website

Resource type: tutorial

Heurist Tutorials
https://dresa.org.au/materials/heurist-tutorials

A set of video tutorials with accompanying walkthroughs for building your first Heurist database and website. The first three tutorials show you how to get started in Heurist. The five subsequent tutorials introduce you to the five main menus in the Heurist interface.

Contact: michael.falk@sydney.edu.au

Authors: Johnson, Ian; Osmakov, Artem

Keywords: Heurist, Data management, Data visualisation, Digital Humanities, Databasing, website

Audience: mbr phd ecr researcher support