10 materials found

Content provider: Griffith University or QCIF


Introduction to REDCap at Griffith University

This site is designed as a companion to Griffith Library's Research Data Capture workshops. It can also be treated as a standalone, self-paced tutorial for learning to use REDCap (Research Electronic Data Capture), a secure web application for building and managing online surveys and databases.

Keywords: REDCap, survey instruments

Resource type: tutorial

URL: https://dresa.org.au/materials/introduction-to-redcap-at-griffith-university
Contact: y.banens@griffith.edu.au
Audience: mbr, phd, ecr, researcher, support
Introducing Computational Thinking

This workshop is for researchers at all career stages who want to understand the uses and the building blocks of computational thinking. This skill is useful for all kinds of problem solving, whether in real life or in computing.

The workshop will not teach computer programming per se. Instead, it will cover the thought processes involved should you want to learn to program.

Keywords: computational skills, data skills

Resource type: tutorial

URL: https://dresa.org.au/materials/introducing-computational-thinking
Contact: s.stapleton@griffith.edu.au
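As one small illustration of the ideas this workshop covers (the example below is not from the workshop itself, and the function names are my own), a problem such as "find the most frequent word in a text" can be decomposed into named steps — abstraction, decomposition, then an algorithm:

```python
# Hypothetical illustration of computational thinking: decompose the problem
# "find the most frequent word in a text" into small, named steps.
from collections import Counter

def normalise(text: str) -> str:
    # Abstraction: ignore details we decide don't matter (case, punctuation).
    return "".join(c.lower() if c.isalnum() else " " for c in text)

def tokenise(text: str) -> list[str]:
    # Decomposition: break the text into individual words.
    return text.split()

def most_frequent(words: list[str]) -> str:
    # Algorithm: count occurrences and pick the maximum.
    return Counter(words).most_common(1)[0][0]

text = "The cat sat on the mat. The mat was warm."
print(most_frequent(tokenise(normalise(text))))  # the
```

Each step can be reasoned about, tested, and reused independently — which is the point of the exercise, whether or not you ever write the code.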
Advanced Data Wrangling with OpenRefine

This online self-paced workshop teaches advanced data wrangling skills, including combining datasets, geolocating data, and “what if” exploration using OpenRefine.

Keywords: data skills, data

Resource type: tutorial

URL: https://dresa.org.au/materials/advanced-data-wrangling-with-openrefine
Contact: s.stapleton@griffith.edu.au
Audience: mbr, phd, ecr, researcher, support, professional
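To give a flavour of one task the workshop covers — combining datasets on a shared key — here is a minimal sketch in plain Python rather than OpenRefine; the file contents and column names are hypothetical:

```python
# Sketch: combine two tabular datasets on a shared key column ("site_id").
# The datasets here are invented; in practice they would be CSV files.
import csv, io

sites = io.StringIO("site_id,name\nS1,Brisbane\nS2,Cairns\n")
readings = io.StringIO("site_id,rainfall_mm\nS1,12.5\nS2,3.0\n")

# Index one dataset by its key, then look rows up while scanning the other.
by_id = {row["site_id"]: row for row in csv.DictReader(sites)}
combined = [
    {**by_id[row["site_id"]], **row}
    for row in csv.DictReader(readings)
    if row["site_id"] in by_id
]
print(combined[0])  # {'site_id': 'S1', 'name': 'Brisbane', 'rainfall_mm': '12.5'}
```

OpenRefine performs the equivalent lookup with its "add column based on another project" feature, without any code.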
Introduction to Data Cleaning with OpenRefine

Learn basic data cleaning techniques in this self-paced online workshop, using open data from data.qld.gov.au and the open-source tool OpenRefine (openrefine.org). Learn techniques to prepare messy tabular data for computational analysis. Of most relevance to HASS disciplines working with textual data in a structured or semi-structured format.

Keywords: data skills, data analysis

Resource type: tutorial

URL: https://dresa.org.au/materials/introduction-to-data-cleaning-with-openrefine
Contact: s.stapleton@griffith.edu.au (Sharron Stapleton)
Audience: mbr, phd, ecr, researcher, support, professional
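The kind of clean-up OpenRefine automates — trimming whitespace, collapsing duplicate spaces, unifying capitalisation so near-duplicate values cluster together — can be sketched in a few lines of plain Python (the sample values below are invented):

```python
# Sketch of common tidy-up transforms for messy tabular values.
def clean(value: str) -> str:
    value = " ".join(value.split())   # trim + collapse internal whitespace
    return value.title()              # unify capitalisation

messy = ["  gold coast", "Gold  Coast ", "GOLD COAST"]
print(sorted({clean(v) for v in messy}))  # ['Gold Coast']
```

Three superficially different strings collapse to one canonical value — the same effect as OpenRefine's "trim whitespace" and clustering operations, but applied by hand.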
10 Reproducible Research things - Building Business Continuity

The idea that you can duplicate an experiment and get the same conclusion is the basis for all scientific discoveries. Reproducible research is data analysis that starts with the raw data and offers a transparent workflow to arrive at the same results and conclusions. However, not all studies are replicable, due to a lack of information on the process. Therefore, reproducibility in research is extremely important. Researchers genuinely want to make their research more reproducible, but sometimes don't know where to start and often don't have the time to investigate or establish methods by which reproducible research can speed up everyday work. We aim for the philosophy "Be better than you were yesterday". Reproducibility is a process, and we highlight that there is no expectation to go from beginner to expert in a single workshop. Instead, we offer some steps you can take along the reproducibility path, following our Steps to Reproducible Research self-paced program.

Keywords: reproducibility, data management

Resource type: tutorial, video

Video: https://www.youtube.com/watch?v=bANTr9RvnGg
Tutorial: https://guereslib.github.io/ten-reproducible-research-things/
URL: https://dresa.org.au/materials/9-reproducible-research-things-building-business-continuity
Contact: a.miotto@griffith.edu.au; s.stapleton@griffith.edu.au (Sharron Stapleton); i.jennings@griffith.edu.au (Isaac Jennings)
Audience: masters, phd, ecr, researcher, support
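Two of the small habits typically recommended on the reproducibility path — fixing random seeds and recording the computing environment alongside results — can be sketched in a few lines (the field names in the provenance record below are illustrative, not a standard):

```python
# Sketch: make a "random" analysis repeatable and record its provenance.
import json, platform, random, sys

random.seed(42)                      # same seed -> same "random" sequence
sample = [random.randint(0, 9) for _ in range(5)]

provenance = {
    "python": sys.version.split()[0],
    "platform": platform.platform(),
    "seed": 42,
    "result": sample,
}
print(json.dumps(provenance, indent=2))
```

Saving such a record next to each output file means that, months later, you (or a reviewer) can see exactly which environment and seed produced a result, and regenerate it.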
Data Storytelling

Nowadays, more information is created than our audiences could possibly analyse on their own! A study by Stanford professor Chip Heath found that when recalling speeches, 63% of people remember stories and how they made them feel, but only 5% remember a single statistic. So you should convert your insights and discoveries from data into stories to share with non-experts in a language they understand. But how? This tutorial helps you construct stories that incite an emotional response and create meaning and understanding for the audience by applying data storytelling techniques.

Keywords: data storytelling, data visualisation

URL: https://dresa.org.au/materials/data-storytelling
Contact: m.yamaguchi@griffith.edu.au; a.miotto@griffith.edu.au
Audience: support, masters, phd, researcher
HPC file systems and what users need to consider for appropriate and efficient usage

Three videos on miscellaneous aspects of HPC usage - a useful reference for new users of HPC systems.

1 – General overview of the different file systems that might be available on HPC. The video goes through shared file systems such as /home and /scratch, local compute node file systems (local scratch or $TMPDIR) and the storage file system. It outlines what users need to consider if they wish to use any of these in their workflows.
2 – Overview of the different directories that might be present on HPC. These could include /home, /scratch, /opt, /lib and /lib64, /sw and others.
3 – Overview of the message-of-the-day file and the message that is displayed to users every time they log in. This displays info about general help and often current problems or upcoming outages.

Keywords: HPC, high performance computer, file systems

Resource type: video, presentation

URL: https://dresa.org.au/materials/hpc-file-systems-and-what-users-need-to-consider-for-appropriate-and-efficient-usage
Contact: QCIF Training (training@qcif.edu.au)
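The staging pattern the first video describes — copy input from the shared file system to fast node-local scratch ($TMPDIR), compute there, then copy results back — can be sketched as follows. The paths are stand-ins (ordinary temporary directories) so the sketch runs anywhere, not only on an HPC node:

```python
# Sketch of staging work through node-local scratch. 'shared' stands in for
# a shared file system such as /scratch; 'local' for node-local $TMPDIR.
import os, shutil, tempfile
from pathlib import Path

shared = Path(tempfile.mkdtemp())
local = Path(tempfile.mkdtemp(dir=os.environ.get("TMPDIR", tempfile.gettempdir())))

(shared / "input.txt").write_text("raw data\n")

shutil.copy(shared / "input.txt", local / "input.txt")   # stage in
result = (local / "input.txt").read_text().upper()       # compute on local scratch
(shared / "result.txt").write_text(result)               # stage result back out
print((shared / "result.txt").read_text().strip())  # RAW DATA
```

Doing I/O-heavy work on local scratch avoids hammering the shared file system; the one thing to remember is that local scratch is wiped when the job ends, so results must be copied back before the job finishes.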
Basic Linux/Unix commands

A series of eight videos (each between 5 and 10 minutes long) following the content of the Software Carpentry workshop "The Unix Shell" (https://swcarpentry.github.io/shell-novice/).

Sessions 1, 2 and 3 provide instructions on the minimal level of Linux/Unix commands recommended for new users of HPC.

1 – An overview of how to find out where a user is in the file system, list the files there, and how to get help on Unix commands
2 – How to move around the file system and change into other directories
3 – Explains the difference between an absolute and a relative path
4 – Overview of how to create new directories, and how to create and edit new files with nano
5 – How to use the vi editor to edit files
6 – Overview of the file viewers available
7 – How to copy and move files and directories
8 – How to remove files and directories

Further details and exercises with solutions can be found on the Software Carpentry "The Unix Shell" page (https://swcarpentry.github.io/shell-novice/).

Keywords: HPC, high performance computer, Unix, Linux, Software Carpentry

Resource type: video, guide

URL: https://dresa.org.au/materials/basic-linux-unix-commands
Contact: QCIF Training (training@qcif.edu.au)
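For readers who know Python but not the shell, the file and directory operations these sessions cover (pwd, mkdir, cp, mv, ls, rm) have rough equivalents in the Python standard library. This sketch mirrors them in a throwaway temporary directory:

```python
# Rough Python equivalents of the shell commands covered in the sessions.
import shutil, tempfile
from pathlib import Path

base = Path(tempfile.mkdtemp())                       # a scratch area to play in

print(Path.cwd())                                     # pwd
(base / "project" / "data").mkdir(parents=True)       # mkdir -p project/data
(base / "notes.txt").write_text("hello\n")            # create a file (cf. nano)
shutil.copy(base / "notes.txt",
            base / "project" / "notes.txt")           # cp notes.txt project/
shutil.move(str(base / "notes.txt"),
            str(base / "project" / "data" / "notes.txt"))  # mv notes.txt project/data/

listing = sorted(p.name for p in (base / "project").iterdir())
print(listing)                                        # ls project -> ['data', 'notes.txt']
shutil.rmtree(base)                                   # rm -r
```

The shell versions are terser and compose better in pipelines, which is why the videos teach them first; the Python mapping is just a familiar reference point.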
Transferring files and data

A short video outlining the basics of using FileZilla to establish a secure file transfer protocol (SFTP) connection to HPC, providing a drag-and-drop interface for transferring files between the HPC and a desktop computer.

Keywords: sftp, file transfer, HPC, high performance computer

Resource type: video, guide

URL: https://dresa.org.au/materials/transferring-files-and-data
Contact: QCIF Training (training@qcif.edu.au)
Connecting to HPC

A series of three short videos introducing how to use PuTTY to connect from a Windows PC to a secure HPC (high performance computing) cluster.

1 - The very basics of how to establish a connection to HPC.
2 - How to add more specific options for the connection to HPC.
3 - How to save the details and options for a connection for future use.

Keywords: HPC, high performance computer, ssh

Resource type: video, guide

URL: https://dresa.org.au/materials/connecting-to-hpc
Contact: QCIF Training (training@qcif.edu.au)
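On macOS, Linux, or Windows with OpenSSH, the outcome of the third video — saving connection details for reuse — is achieved with an entry in ~/.ssh/config rather than a PuTTY saved session. The host alias, hostname and username below are placeholders, not a real cluster:

```
# ~/.ssh/config -- hypothetical entry; substitute your cluster's details
Host myhpc
    HostName hpc.example.edu.au
    User jsmith
    Port 22
```

After saving this, running `ssh myhpc` connects with all of those options applied, just as double-clicking a saved session does in PuTTY.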