12 materials found

Filters: Licence CC-BY-4.0; Keywords: AI, software publishing, or HPC


Pawsey: AWS Quantum 101 Using Amazon Braket
https://dresa.org.au/materials/pawsey-aws-quantum-101-using-amazon-braket

Join us as AWS Quantum Specialists introduce quantum simulators and gate-based quantum computers, before turning to more advanced topics.

Contact: training@pawsey.org.au
Keywords: Pawsey Supercomputing Centre, AWS, quantum, HPC

OpenCL
https://dresa.org.au/materials/opencl-3eabb316-794d-4f46-959a-725be3ae1bde

Supercomputers make use of accelerators from a variety of hardware vendors, using devices such as multi-core CPUs, GPUs and even FPGAs. OpenCL is a way for your HPC application to make effective use of heterogeneous computing devices and to avoid code refactoring for new HPC infrastructure.

Topics covered in this course are:
- Introduction to OpenCL
- How to build and run applications on Setonix with OpenCL and MPI
- Matrix multiplication with OpenCL, fully explained line by line
- How to debug OpenCL applications and kernels
- Measuring performance with OpenCL Events and open source tools
- Memory management
- Coarse and fine-grained shared memory
- Strategies for building optimised OpenCL kernels
- Optimising IO performance with asynchronous operations

Contact: training@pawsey.org.au
Keywords: OpenCL, supercomputing, CPUs, GPUs, FPGAs, HPC

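For readers unfamiliar with the host-plus-kernel structure this course works through, the sketch below shows a minimal OpenCL vector addition in C. It is not taken from the course materials: the kernel, array size and build command are illustrative assumptions, most error checking is omitted, and build details on Setonix will differ.

```c
/* Minimal OpenCL vector-add sketch (illustrative only; error checking omitted).
 * Typical Linux build:  gcc vadd.c -lOpenCL -o vadd
 */
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <stdio.h>

/* Kernel source: each work-item adds one pair of elements. */
static const char *src =
    "__kernel void vadd(__global const float *a,"
    "                   __global const float *b,"
    "                   __global float *c) {"
    "    int i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Pick the first platform and device the runtime reports. */
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Copy inputs to device buffers, build the kernel, set its arguments. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    /* Launch one work-item per element, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %f (expected %f)\n", c[10], a[10] + b[10]);
    return 0;
}
```

The course goes well beyond this pattern, covering build configuration with MPI, profiling with OpenCL Events, shared-memory management and kernel optimisation.
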
PCon Preparing applications for El Capitan and beyond
https://dresa.org.au/materials/pcon-preparing-applications-for-el-capitan-and-beyond

As Lawrence Livermore National Laboratory (LLNL) prepares to stand up its next supercomputer, El Capitan, application teams prepare to pivot to another GPU architecture. This talk presents how the LLNL application teams made the transition from distributed-memory, CPU-only architectures to GPUs. They share institutional best practices and discuss new open-source software products, both as tools for porting and profiling applications and as avenues for collaboration across the computational science community. Join LLNL's Erik Draeger and Jane Herriman, who presented this talk at Pawsey's PaCER Conference in September 2023.

Contact: training@pawsey.org.au, Pawsey Supercomputing Research Centre
Keywords: GPUs, supercomputing, HPC, PaCER

WEBINAR: Pro tips for scaling bioinformatics workflows to HPC
https://dresa.org.au/materials/webinar-pro-tips-for-scaling-bioinformatics-workflows-to-hpc

This record includes training materials associated with the Australian BioCommons webinar ‘Pro tips for scaling bioinformatics workflows to HPC’. This webinar took place on 31 May 2023.

Event description

High Performance Computing (HPC) infrastructures offer the computational scale and efficiency that life scientists need to handle complex biological datasets and multi-step computational workflows. But scaling workflows to HPC from smaller, more familiar computational infrastructures brings with it new jargon, expectations and processes to learn. To make the most of HPC resources, bioinformatics workflows need to be designed for distributed computing environments and to carefully manage the varying resource requirements and data scales typical of biological analyses.

In this webinar, Dr Georgina Samaha from the Sydney Informatics Hub, Dr Matthew Downton from the National Computational Infrastructure (NCI) and Dr Sarah Beecroft from the Pawsey Supercomputing Research Centre help you navigate the world of HPC for running and developing bioinformatics workflows. They explain when you should take your workflows to HPC and highlight the architectural features you should make the most of to scale your analyses once you’re there. You’ll hear pro tips for dealing with common pain points like software installation, optimising for parallel computing and resource management, and you’ll find out how to get access to Australia’s national HPC infrastructures at NCI and Pawsey.

Materials

Materials are shared under a Creative Commons Attribution 4.0 International agreement unless otherwise specified and were current at the time of the event.

Files and materials included in this record:
- Event metadata (PDF): Information about the event including description, event URL, learning objectives, prerequisites, technical requirements etc.
- Index of training materials (PDF): List and description of all materials associated with this event, including the name, format, location and a brief description of each file.
- Pro-tips_HPC_Slides: A PDF copy of the slides presented during the webinar.

Materials shared elsewhere:
A recording of this webinar is available on the Australian BioCommons YouTube channel: https://youtu.be/YKJDRXCmGMo

Contact: Melissa Burke (melissa@biocommons.org.au)
Keywords: Bioinformatics, Workflows, HPC, High Performance Computing

WEBINAR: AlphaFold: what's in it for me?
https://dresa.org.au/materials/webinar-alphafold-what-s-in-it-for-me

This record includes training materials associated with the Australian BioCommons webinar ‘AlphaFold: what’s in it for me?’. This webinar took place on 18 April 2023.

Event description

AlphaFold has taken the scientific world by storm with its ability to accurately predict the structure of any protein in minutes using artificial intelligence (AI). From drug discovery to enzymes that degrade plastics, this promises to speed up and fundamentally change the way that protein structures are used in biological research. Beyond the hype, what does this mean for structural biology as a field (and as a career)?

Dr Craig Morton, Drug Discovery Lead at the CSIRO, is an early adopter of AlphaFold and has decades of expertise in protein structure and function, protein modelling, protein-ligand interactions and computational small-molecule drug discovery, with a particular interest in anti-infective agents for the treatment of bacterial and viral diseases. Craig joins this webinar to share his perspective on the implications of AlphaFold for science and structural biology. He gives an overview of how AlphaFold works, ways to access AlphaFold, and some examples of how it can be used for protein structure/function analysis.

Materials

Materials are shared under a Creative Commons Attribution 4.0 International agreement unless otherwise specified and were current at the time of the event.

Files and materials included in this record:
- Event metadata (PDF): Information about the event including description, event URL, learning objectives, prerequisites, technical requirements etc.
- Index of training materials (PDF): List and description of all materials associated with this event, including the name, format, location and a brief description of each file.

Materials shared elsewhere:
A recording of this webinar is available on the Australian BioCommons YouTube channel: https://youtu.be/4ytn2_AiH8s

Contact: Melissa Burke (melissa@biocommons.org.au)
Keywords: Bioinformatics, Machine Learning, Structural Biology, Proteins, Drug discovery, AlphaFold, AI, Artificial Intelligence, Deep learning

WEBINAR: Where to go when your bioinformatics outgrows your compute
https://dresa.org.au/materials/webinar-where-to-go-when-your-bioinformatics-outgrows-your-compute

This record includes training materials associated with the Australian BioCommons webinar ‘Where to go when your bioinformatics outgrows your compute’. This webinar took place on 19 August 2021.

Bioinformatics analyses are often complex, requiring multiple software tools and specialised compute resources. “I don’t know what compute resources I will need”, “My analysis won’t run and I don’t know why” and “Just getting it to work” are common pain points for researchers. In this webinar, you will learn how to understand the compute requirements for your bioinformatics workflows. You will also hear about ways of accessing compute that suit your needs as an Australian researcher, including Galaxy Australia and the cloud and high-performance computing services offered by the Australian Research Data Commons, the National Computational Infrastructure (NCI) and Pawsey. We also describe the bioinformatics and computing support services available to Australian researchers. This webinar was jointly organised with the Sydney Informatics Hub at the University of Sydney.

Materials are shared under a Creative Commons Attribution 4.0 International agreement unless otherwise specified and were current at the time of the event.

Files and materials included in this record:
- Event metadata (PDF): Information about the event including description, event URL, learning objectives, prerequisites, technical requirements etc.
- Index of training materials (PDF): List and description of all materials associated with this event, including the name, format, location and a brief description of each file.
- Where to go when your bioinformatics outgrows your compute - slides (PDF and PPTX): Slides presented during the webinar.
- Australian research computing resources cheat sheet (PDF): A list of resources and useful links mentioned during the webinar.

Materials shared elsewhere:
A recording of the webinar is available on the Australian BioCommons YouTube channel: https://youtu.be/hNTbngSc-W0

Contact: Melissa Burke (melissa@biocommons.org.au)
Keywords: Computational Biology, Bioinformatics, High performance computing, HPC, Galaxy Australia, Nectar Research Cloud, Pawsey Supercomputing Centre, NCI, NCMAS, Cloud computing

WEBINAR: High performance bioinformatics: submitting your best NCMAS application
https://dresa.org.au/materials/webinar-high-performance-bioinformatics-submitting-your-best-ncmas-application

This record includes training materials associated with the Australian BioCommons webinar ‘High performance bioinformatics: submitting your best NCMAS application’. This webinar took place on 20 August 2021.

Bioinformaticians are increasingly turning to specialised compute infrastructure and efficient, scalable workflows as their research becomes more data intensive. Australian researchers who require extensive compute resources to process large datasets can apply for access to national high performance computing facilities (e.g. Pawsey and NCI) to power their research through the National Computational Merit Allocation Scheme (NCMAS). NCMAS is a competitive, merit-based scheme and requires applicants to carefully consider how the compute infrastructure and workflows will be applied.

This webinar provides life science researchers with insights into what makes a strong NCMAS application, with a focus on the technical assessment, and into how to design and present effective and efficient bioinformatics workflows for the various national compute facilities. It is followed by a short Q&A session.

Materials are shared under a Creative Commons Attribution 4.0 International agreement unless otherwise specified and were current at the time of the event.

Files and materials included in this record:
- Event metadata (PDF): Information about the event including description, event URL, learning objectives, prerequisites, technical requirements etc.
- Index of training materials (PDF): List and description of all materials associated with this event, including the name, format, location and a brief description of each file.
- High performance bioinformatics: submitting your best NCMAS application - slides (PDF and PPTX): Slides presented during the webinar.

Materials shared elsewhere:
A recording of the webinar is available on the Australian BioCommons YouTube channel: https://youtu.be/HeFGjguwS0Y

Contact: Melissa Burke (melissa@biocommons.org.au)
Keywords: Computational Biology, Bioinformatics, High Performance Computing, HPC, NCMAS

Accelerating skills development in Data science and AI at scale
https://dresa.org.au/materials/accelerating-skills-development-in-data-science-and-ai-at-scale

At the Monash Data Science and AI platform, we believe that upskilling our research community and building a workforce with data science skills are key to accelerating the application of data science in research. To achieve this, we create and leverage new and existing training capabilities within and outside Monash University.

In this talk, we will discuss the principles and purpose of establishing collaborative models to accelerate skills development at scale. We will talk about our approach to identifying gaps in the existing skills and training available in data science, key areas of interest as identified by the research community, and the various sources of training available in the marketplace. We will provide insights into the collaborations we currently have and intend to develop in the future, both within the university and nationally.

The talk will also cover our approach as outlined below:
- A combined survey of gaps in skills and training for data science and AI
- Providing seats to partners
- Sharing associate instructors, helpers and volunteers
- Developing combined training materials
- Publishing a repository of open source training materials
- Train-the-trainer activities
- Establishing a network of volunteers to deliver training in their local regions

Industry plays a significant role in making invaluable training available to the research community, whether through self-learning platforms like AWS Machine Learning University or instructor-led courses like the NVIDIA Deep Learning Institute. We will discuss how we leverage our partnerships with industry to bring this training to our research community.

Finally, we will discuss how we map our training to the ARDC skills roadmap and how the ARDC platforms project “Environments to accelerate Machine Learning based Discovery” has enabled collaboration between Monash University and the University of Queensland to develop and deliver training together.

Contact: contact@ardc.edu.au
Keywords: AI, machine learning, eresearch skills, training, train the trainer, volunteer instructors, training partnerships, training material

Monash University - University of Queensland training partnership in Data science and AI
https://dresa.org.au/materials/monash-university-university-of-queensland-training-partnership-in-data-science-and-ai

We describe the peer network exchange for training that has recently been created via an ARDC-funded partnership between Monash University and the University of Queensland, under the umbrella of the Queensland Cyber Infrastructure Foundation (QCIF). As part of a training program in machine learning, visualisation and computing tools, we have established a series of over 20 workshops over the year in which either Monash or QCIF hosts the event for some 20-40 of their researchers and students, while around 5 places are offered to participants from the other institution. In the longer term we aim to share material developed at one institution and have trainers present it at the other.

In this talk we will describe the many benefits we have found in this approach, including access to a wider range of expertise in several rapidly developing fields, upskilling of trainers, faster identification of emerging training needs, and peer learning for trainers.

Contact: contact@ardc.edu.au
Keywords: data skills, training partnerships, data science, AI, training material

Training resources for sharing and reuse
https://dresa.org.au/materials/training-resources-for-sharing-and-reuse

This presentation outlines the work completed during a consultancy for ARDC by Dr Paula Martinez to develop new, and publish existing, national skills materials for reuse by the sector. She was responsible for the work package targeted at co-developing national skills materials with a strong emphasis on sharing and reuse. This was a very collaborative project, with the opportunity to work with different target audiences, topics and support expertise.

To accommodate a short timeline, we defined the scope to six topics:
1) Containers in Research
2) Data Governance
3) Software Citation and Licensing
4) FAIR Data 101
5) Metadata for Training Materials
6) Machine Learning Resources

You can watch the video on YouTube here: https://youtu.be/10Yv_BFa-mw

Contact: contact@ardc.edu.au
Keywords: FAIR training material, training material, guides, software citation, software publishing, containers, software licensing, training materials checklist, research data governance

Software publishing, licensing, and citation
https://dresa.org.au/materials/software-publishing-licensing-and-citation

A short presentation for reuse; includes speaker notes. It covers making software citable using a code repository, an ORCID and a licence.

Cite as
Liffers, Matthias. (2021, July 12). Software publishing, licensing, and citation. Zenodo. https://doi.org/10.5281/zenodo.5091717

Contact: ARDC, https://ardc.edu.au/contact-us/
Keywords: software citation, software publishing, software registry, software repository, research software

Resource type: presentation

ARDC Guide to making Software Citable
https://dresa.org.au/materials/ardc-guide-to-making-software-citable

A short guide to making software citable using a code repository, an ORCID and a licence.

Cite as
Liffers, Matthias, & Honeyman, Tom. (2021). ARDC Guide to making software citable. Zenodo. https://doi.org/10.5281/zenodo.5003989

Contact: ARDC, https://ardc.edu.au/contact-us/
Keywords: software citation, software publishing, software registry, software repository, research software

Resource type: guide