Register training material
11 materials found

Licence: YouTube or other-open


AWS Ramp-Up Guide: Academic Research

AWS Ramp-Up Guides offer a variety of resources to help you build your skills and knowledge of the AWS Cloud. Each guide features carefully selected digital training, classroom courses, videos, whitepapers, certifications, and more. AWS now offers four ramp-up guides that help academic...

Keywords: machine learning, AWS, cloud, cloud computing, training material, HPC training, HPC, training registry, training partnerships

AWS Ramp-Up Guide: Academic Research https://dresa.org.au/materials/aws-ramp-up-guide-academic-research

AWS Ramp-Up Guides offer a variety of resources to help you build your skills and knowledge of the AWS Cloud. Each guide features carefully selected digital training, classroom courses, videos, whitepapers, certifications, and more. AWS now offers four ramp-up guides that help academic researchers who use AI, ML, Generative AI, and HPC in their research activities, as well as covering the essential AWS knowledge for statistician researchers and research IT professionals. The guides help learners decide where to start, and how to navigate, their learning journey; some resources will be more relevant than others depending on each learner's specific research tasks.

AI, ML, Generative AI ramp-up guide (page 2) is for academic researchers who are exploring AWS AI, ML, and Generative AI tools to improve efficiency and productivity in their research tasks. It introduces seven components on AI and ML and ten components on Generative AI, starting with an introduction to AI and covering AWS AI/ML services such as Amazon SageMaker. The Generative AI content covers planning a Generative AI project; responsible AI practices; security, compliance, and governance for AI solutions; and how to get started with Amazon Bedrock. Recommended prerequisites: basic understanding of Python.

High Performance Computing ramp-up guide (page 3) is designed for academic researchers who want to use HPC on AWS. It introduces eleven essential components of High Performance Computing on AWS, starting with an overview of HPC on AWS and continuing with topics including AWS ParallelCluster and research HPC workloads on AWS Batch. Recommended prerequisites: complete AWS Cloud Essentials.

Statistician Researcher ramp-up guide (page 4) is catered specifically to researchers in statistics and quantitative analysis. It covers topics such as building with Amazon Redshift clusters, getting started with Amazon EMR, Machine Learning for Data Scientists, authoring visual analytics using Amazon QuickSight, batch analytics on AWS, and Amazon Lightsail for Research. Recommended prerequisites: complete AWS Cloud Essentials.

Research IT ramp-up guide (page 5) is an extension of the Foundational Researcher Learning Plan that enables research IT leaders and professionals to dive deeper into specific topics: fundamentals, management capabilities and implementing guardrails, cost optimization for research workloads, platforms for research and research partners, and AWS Landing Zone and AWS Control Tower for research. Recommended prerequisites: Foundational Researcher Learning Plan.

emmarrig@amazon.com
machine learning, AWS, cloud, cloud computing, training material, HPC training, HPC, training registry, training partnerships
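The Bedrock topics in the AI/ML guide build on the "basic understanding of Python" prerequisite. As an illustrative sketch only (not material from the guide), a text model on Amazon Bedrock can be invoked with boto3 roughly as follows; the region, model ID and request body here are assumptions, so check the Bedrock documentation for the models and formats available to your account.

```python
# Hedged sketch: call a text-generation model on Amazon Bedrock via boto3.
# Region, model ID and request body are assumptions, not taken from the guide.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # hypothetical model choice
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Summarise the FAIR data principles."}),
)

# The response body is a stream of JSON produced by the chosen model.
print(json.loads(response["body"].read()))
```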
Amazon Braket - Knowledge Badge Readiness Path

This Learning Path helps you build knowledge and technical skills to use Amazon Braket. This Learning Path presents domain-specific content and includes courses, knowledge checks, a pre-assessment and a knowledge badge assessment. This path is a guide and presents learning in a structured order,...

Keywords: quantum, cloud, AWS, cloud computing

Amazon Braket - Knowledge Badge Readiness Path https://dresa.org.au/materials/amazon-braket-knowledge-badge-readiness-path

This Learning Path helps you build knowledge and technical skills to use Amazon Braket. It presents domain-specific content and includes courses, knowledge checks, a pre-assessment and a knowledge badge assessment. The path is a guide that presents learning in a structured order; it can be followed as presented, or you can select the content that is most beneficial to you.

Intended audience: quantum-curious developers, solutions architects and enterprise technology evaluators who want to program quantum computers and explore their potential applications.

Learning objectives. After completing this learning path, you will be able to:
- Summarize the key benefits of Amazon Braket
- Explain the key concepts of Amazon Braket
- Explain the typical use cases for Amazon Braket
- Explain how to run Amazon Braket on an on-demand simulator and a QPU
- Illustrate the business value of quantum technology with Amazon Braket
- List the key stages of quantum program development
- Describe how to plan the journey through the key features of Amazon Braket
- Create Amazon Braket quantum tasks using the Amazon Braket SDK and third-party plugins
- Identify the Amazon Braket resources for building on top of existing Amazon Braket deployments
- Differentiate between local and on-demand simulators based on appropriate use cases and project needs
- Examine QPU properties using both the AWS console and the Amazon Braket SDK
- Identify the QPU access paradigms available on Amazon Braket
- Express the pricing scheme for QPUs and estimate costs prior to running tasks
- Find and parse quantum task performance
- Access AWS Management Console interfaces for monitoring and managing quantum tasks, jobs, and their costs
- Differentiate between quantum tasks and hybrid jobs
- Describe the concepts of Braket Pulse
- Explain how to create Analog Hamiltonian Simulation programs
- Use error mitigation with Amazon Braket

AWS Knowledge Badge: to verify your knowledge, or to identify any gaps you might have, take the knowledge badge assessment. Score 80% or higher and earn an AWS Knowledge badge that you can share with your network. The assessment is based on the courses in the learning path, so we recommend completing those courses as needed. Already have some knowledge of Amazon Braket? Go directly to the assessment and test your knowledge; the score report will identify your areas of strength and direct you to the courses where you can close any knowledge gaps.

emmarrig@amazon.com
quantum, cloud, AWS, cloud computing
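As a taste of what the "Create Amazon Braket quantum tasks using the Amazon Braket SDK" objective involves, here is a minimal sketch (not part of the learning path itself) that builds a Bell-state circuit and runs it on the free local simulator:

```python
# Minimal Amazon Braket SDK sketch: a Bell-state circuit on the local simulator.
# Requires: pip install amazon-braket-sdk
from braket.circuits import Circuit
from braket.devices import LocalSimulator

circuit = Circuit().h(0).cnot(0, 1)   # entangle qubits 0 and 1
device = LocalSimulator()             # runs locally, no AWS charges
result = device.run(circuit, shots=1000).result()

# Expect roughly equal counts of '00' and '11'
print(result.measurement_counts)
```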
AWS Foundational Researcher Learning Plan

Foundational Researcher Learning Plan is designed for researchers and research IT professionals who want to become more proficient in optimizing research on AWS. Learn how to use the right storage medium, remove heavy lifting with managed services, and reproduce research with containers and...

Keywords: cloud, cloud computing, AWS, training material

AWS Foundational Researcher Learning Plan https://dresa.org.au/materials/aws-foundational-researcher-learning-plan The Foundational Researcher Learning Plan is designed for researchers and research IT professionals who want to become more proficient in optimizing research on AWS. Learn how to use the right storage medium, remove heavy lifting with managed services, and reproduce research with containers and software-defined infrastructure. The plan suits academic researchers building skills for job roles such as cloud architect, DevOps engineer, operations staff, developer and business decision maker, as well as anyone interested in AWS cloud storage, storage administration, application development, data science, machine learning (ML) and the ML process, and artificial intelligence. emmarrig@amazon.com cloud, cloud computing, AWS, training material
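For a concrete flavour of "using the right storage medium", the sketch below (an illustration, not part of the learning plan) uploads and lists research output in Amazon S3 with boto3; the bucket and key names are placeholders.

```python
# Hedged sketch: store and list research output in Amazon S3 with boto3.
# Bucket and key names are placeholders, not from the learning plan.
import boto3

s3 = boto3.client("s3")

# Upload a local result file into an object key under a per-experiment prefix.
s3.upload_file("results.csv", "my-research-bucket", "experiment-1/results.csv")

# Objects can later be listed or retrieved from any compute environment.
listing = s3.list_objects_v2(Bucket="my-research-bucket", Prefix="experiment-1/")
for obj in listing.get("Contents", []):
    print(obj["Key"], obj["Size"])
```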
ARDC Your first step to FAIR

This workshop gives a brief overview of the FAIR principles, including a method to make a one-file dataset FAIR.

Keywords: training material, FAIR, data, workshop

ARDC Your first step to FAIR https://dresa.org.au/materials/ardc-your-first-step-to-fair-1ee3dc3c-23b0-4287-b96c-c120c5697932 This workshop gives a brief overview of the FAIR principles, including a method to make a one-file dataset FAIR. contact@ardc.edu.au Stokes, Liz (type: Editor) Martinez, Paula Andrea (type: Editor) Russell, Keith (type: Editor) training material, FAIR, data, workshop
Programming and tidy data analysis in R

A workshop to expand the skill-set of someone who has basic familiarity with R. Covers programming constructs such as functions and for-loops, and working with data frames using the dplyr and tidyr packages. Explains the importance of a "tidy" data representation, and goes through common steps...

Keywords: R, Tidyverse, Programming

Resource type: tutorial

Programming and tidy data analysis in R https://dresa.org.au/materials/programming-and-tidy-data-analysis-in-r A workshop to expand the skill-set of someone who has basic familiarity with R. Covers programming constructs such as functions and for-loops, and working with data frames using the dplyr and tidyr packages. Explains the importance of a "tidy" data representation, and goes through common steps needed to load data and convert it into a tidy form. To be taught as a hands-on workshop, typically as two half-days. Developed by the Monash Bioinformatics Platform and taught as part of the Data Fluency program at Monash University. License is CC-BY-4.0; you are free to share and adapt the material so long as attribution is given. Paul Harrison paul.harrison@monash.edu R, Tidyverse, Programming. Audience: PhD students, early career researchers, researchers.
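The workshop itself teaches this reshaping in R with tidyr and dplyr; as a rough analogue only (an assumption, not workshop material), the same wide-to-tidy conversion can be sketched in Python with pandas:

```python
# Hedged pandas analogue of the tidyr-style reshaping taught in the workshop:
# convert a "wide" table into a tidy "long" one, one observation per row.
import pandas as pd

wide = pd.DataFrame({
    "sample": ["A", "B"],
    "day1": [1.2, 0.8],
    "day2": [1.9, 1.1],
})

# Each row of the result holds one sample, one day, and one measurement.
tidy = wide.melt(id_vars="sample", var_name="day", value_name="measurement")
print(tidy)
```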
Linear models in R

A workshop on linear models in R. Learning to use linear models provides a foundation for modelling, estimation, prediction, and statistical testing in R. Many commonly used statistical tests can be performed using linear models. Ideas introduced using linear models are applicable to many of the...

Keywords: R, statistics

Resource type: tutorial

Linear models in R https://dresa.org.au/materials/linear-models-in-r A workshop on linear models in R. Learning to use linear models provides a foundation for modelling, estimation, prediction, and statistical testing in R. Many commonly used statistical tests can be performed using linear models. Ideas introduced using linear models are applicable to many of the more complicated statistical and machine learning models available in R. To be taught as a hands-on workshop, typically as two half-days. Developed by the Monash Bioinformatics Platform and taught as part of the Data Fluency program at Monash University. License is CC-BY-4.0; you are free to share and adapt the material so long as attribution is given. Paul Harrison paul.harrison@monash.edu R, statistics. Audience: PhD students, early career researchers, researchers.
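The workshop is taught with R's lm(); as a rough analogue only (not workshop material), the same kind of linear fit can be sketched in Python with statsmodels' formula interface:

```python
# Hedged analogue of fitting a simple linear model, similar in spirit to
# lm(y ~ x) in R, using statsmodels' formula API. Data are made up.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({"x": [1, 2, 3, 4, 5], "y": [2.1, 3.9, 6.2, 8.1, 9.8]})

fit = smf.ols("y ~ x", data=df).fit()  # model y as a linear function of x
print(fit.params)    # intercept and slope estimates
print(fit.pvalues)   # t-test p-values, as reported by summary(lm(...)) in R
```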
Introduction to R

An introduction to R, for people with zero coding experience.

To be taught as a hands-on workshop, typically as two half-days.

Developed by the Monash Bioinformatics Platform and taught as part of the Data Fluency program at Monash University. License is CC-BY-4.0. You are free to share and...

Keywords: R

Resource type: tutorial

Introduction to R https://dresa.org.au/materials/introduction-to-r An introduction to R, for people with zero coding experience. To be taught as a hands-on workshop, typically as two half-days. Developed by the Monash Bioinformatics Platform and taught as part of the Data Fluency program at Monash University. License is CC-BY-4.0; you are free to share and adapt the material so long as attribution is given. Paul Harrison paul.harrison@monash.edu R. Audience: PhD students, early career researchers, researchers.
HPC file systems and what users need to consider for appropriate and efficient usage

Three videos on miscellaneous aspects of HPC usage - useful reference for new users of HPC systems.

1 – General overview of different file systems that might be available on HPC. The video goes through shared file systems such as /home and /scratch, local compute node file systems (local...

Keywords: HPC, high performance computer, File systems

Resource type: video, presentation

HPC file systems and what users need to consider for appropriate and efficient usage https://dresa.org.au/materials/hpc-file-systems-and-what-users-need-to-consider-for-appropriate-and-efficient-usage Three videos on miscellaneous aspects of HPC usage - a useful reference for new users of HPC systems.
1 – General overview of the different file systems that might be available on HPC. The video goes through shared file systems such as /home and /scratch, local compute node file systems (local scratch or $TMPDIR) and the storage file system, and outlines what users need to consider if they wish to use any of these in their workflows.
2 – Overview of the different directories that might be present on HPC. These could include /home, /scratch, /opt, /lib and /lib64, /sw and others.
3 – Overview of the message-of-the-day file and the message that is displayed to users every time they log in. This displays information about general help and often current problems or upcoming outages.
QCIF Training (training@qcif.edu.au) HPC, high performance computer, File systems
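Below is a minimal sketch of the workflow pattern described in video 1: write heavy intermediate I/O to node-local scratch ($TMPDIR) and copy only the results you need to keep back to the shared file system. It assumes the scheduler exports $TMPDIR; all paths and filenames are placeholders, not taken from the videos.

```python
# Hedged sketch: use node-local scratch for intermediate files, then copy the
# results back to the shared file system. Paths are placeholders.
import os
import shutil

tmpdir = os.environ.get("TMPDIR", "/tmp")          # node-local scratch, if the scheduler provides it
workfile = os.path.join(tmpdir, "intermediate.dat")

with open(workfile, "w") as f:                     # heavy I/O stays on fast local disk
    f.write("intermediate results\n")

results_dir = os.path.expanduser("~/results")      # shared /home file system
os.makedirs(results_dir, exist_ok=True)
shutil.copy(workfile, os.path.join(results_dir, "intermediate.dat"))  # keep only what matters
```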
Basic Linux/Unix commands

A series of eight videos (each between 5 and 10 minutes long) following the content of the Software Carpentry workshop "The Unix Shell".

Sessions 1, 2 and 3 provide instructions on the minimal level of Linux/Unix commands recommended for new...

Keywords: HPC, high performance computer, Unix, Linux, Software Carpentry

Resource type: video, guide

Basic Linux/Unix commands https://dresa.org.au/materials/basic-linux-unix-commands A series of eight videos (each between 5 and 10 minutes long) following the content of the Software Carpentry workshop ["The Unix Shell"](https://swcarpentry.github.io/shell-novice/). Sessions 1, 2 and 3 provide instructions on the minimal level of Linux/Unix commands recommended for new users of HPC.
1 – An overview of how to find out where a user is in the file system, list the files there, and how to get help on Unix commands
2 – How to move around the file system and change into other directories
3 – Explains the difference between an absolute and a relative path
4 – Overview of how to create new directories, and to create and edit new files with nano
5 – How to use the vi editor to edit files
6 – Overview of file viewers available
7 – How to copy and move files and directories
8 – How to remove files and directories
Further details and exercises with solutions can be found on the Software Carpentry "The Unix Shell" page (https://swcarpentry.github.io/shell-novice/). QCIF Training (training@qcif.edu.au) HPC, high performance computer, Unix, Linux, Software Carpentry
Transferring files and data

A short video outlining the basics of using FileZilla to establish a secure file transfer protocol (SFTP) connection to an HPC system, providing a drag-and-drop interface for transferring files between the HPC system and a desktop computer.

Keywords: sftp, file transfer, HPC, high performance computer

Resource type: video, guide

Transferring files and data https://dresa.org.au/materials/transferring-files-and-data A short video outlining the basics of using FileZilla to establish a secure file transfer protocol (SFTP) connection to an HPC system, providing a drag-and-drop interface for transferring files between the HPC system and a desktop computer. QCIF Training (training@qcif.edu.au) sftp, file transfer, HPC, high performance computer
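FileZilla is a graphical tool; as a scripted alternative not covered in the video, the same SFTP transfer can be sketched in Python with the paramiko library. The hostname, username and file paths below are placeholders, and key-based authentication is assumed.

```python
# Hedged sketch: SFTP upload/download to an HPC system using paramiko.
# Hostname, username and paths are placeholders; key-based auth is assumed.
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()                              # trust hosts in ~/.ssh/known_hosts
client.connect("hpc.example.edu.au", username="jbloggs")    # hypothetical host and user

sftp = client.open_sftp()
sftp.put("local_data.csv", "data/local_data.csv")           # upload into the remote home directory
sftp.get("data/results.csv", "results.csv")                 # download a result file
sftp.close()
client.close()
```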
Connecting to HPC

A series of three short videos introducing how to use PuTTY to connect from a Windows PC to a secure HPC (high performance computing) cluster.

1 - The very basics on how to establish a connection to HPC.
2 - How to add more specific options for the connection to HPC.
3 - How to save the...

Keywords: HPC, high performance computer, ssh

Resource type: video, guide

Connecting to HPC https://dresa.org.au/materials/connecting-to-hpc A series of three short videos introducing how to use PuTTY to connect from a Windows PC to a secure HPC (high performance computing) cluster.
1 - The very basics on how to establish a connection to HPC.
2 - How to add more specific options for the connection to HPC.
3 - How to save the details and options for a connection for future use.
QCIF Training (training@qcif.edu.au) HPC, high performance computer, ssh