Date: 5 June 2024 @ 10:30 - 11:50

Timezone: Eastern Time (US & Canada)

Language of instruction: English

Google's 2017 research paper "Attention Is All You Need" introduced the transformer, a new neural network architecture. From that paper the modern Large Language Model was born, and we're now living in the thick of a new era brought on by companies like OpenAI, Mistral and Anthropic. But where does this cutting-edge technology come from? What are its roots? What are its problems?

This talk explores the history of procedural generation in text and games, from the I Ching to transformer-based language models and beyond. The talk will emphasize the current state of the art in text-based language models, and include demonstrations of how to run language models locally on your own hardware.

Level: Introductory

Length: 1.5 Hours

Format: Lecture

Prerequisites: None




Compute Ontario Summer School is a series of online courses on Advanced Research Computing, High Performance Computing, Research Data Management, and Research Software. It runs from June 3 to June 21, 2024. The courses are delivered each workday from 9:00am to 4:30pm (EDT), with a lunch break, in two parallel streams. Pick and choose the course(s) you want to attend. Registration is free. Please register early, as courses have limited capacity. The Summer School is jointly delivered by SHARCNET, SciNet, and the Centre for Advanced Computing, in collaboration with the Alliance and RDM experts from across Ontario and Canada.

Keywords: Machine Learning, AI

