Can a Neural Network write poetry? Leveraging next sentence prediction

RCC Workshop

By Research Computing Center

Date and time

Thursday, July 16, 2020 · 12 - 2pm PDT

Location

Online

About this event

Can a Neural Network write poetry? Leveraging next sentence prediction with transformers BERT and GPT-2

Abstract: This workshop provides a basic introduction to two cutting-edge deep learning frameworks for generating text in the style of a custom source corpus.

Attendees will learn 1) how to set up and use pre-existing deep learning models on the university’s high-performance computing (HPC) cluster; 2) how to train BERT and GPT-2 models on any corpus of documents, optimizing parameters for the best results; 3) how to leverage these models to create predictive text in any genre (including poetry); and 4) methods and strategies for evaluating accuracy and variation in the output.
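
For a sense of what this looks like in practice, the short sketch below loads a pre-trained GPT-2 model and samples a continuation from a prompt using the Hugging Face transformers library. The library choice, model name, prompt, and sampling parameters are illustrative assumptions, not the workshop's own materials (those are in the GitHub repository linked below).

    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # Load the small pre-trained GPT-2 checkpoint and its tokenizer.
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    prompt = "Shall I compare thee to"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    # Sample a continuation; top-k / nucleus sampling trades repeatability for variety.
    output_ids = model.generate(
        input_ids,
        max_length=60,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Sampling (do_sample with top-k/top-p) rather than greedy decoding is what gives the generated text its variety; tightening those parameters makes the output more conservative and repetitive.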

Objectives

Attendees will:

  • Learn how deep learning frameworks for generating text can be set up and deployed
  • Learn how to use pre-trained models for BERT and GPT-2 (see the minimal sketch after this list)
  • Learn how to train and optimize the models based on a custom corpus to produce output in any genre, style or mode of composition (poetry, prose, drama, technical, etc.)
  • Learn strategies for evaluating accuracy and creating variation in the output
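
As a minimal sketch of the second objective, and of the next-sentence-prediction idea in the workshop title, the example below uses BERT's next-sentence-prediction head to score how plausibly one line follows another. It assumes the Hugging Face transformers library, and the two sentences are placeholders rather than workshop data.

    import torch
    from transformers import BertTokenizer, BertForNextSentencePrediction

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
    model.eval()

    first = "The moon hangs low over the quiet harbor."
    second = "Its light writes silver lines across the water."

    # Encode the sentence pair; BERT treats them as segments A and B.
    inputs = tokenizer(first, second, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # column 0 = "B follows A", column 1 = "B is random"

    probs = torch.softmax(logits, dim=1)
    print(f"P(second line follows first) = {probs[0, 0].item():.3f}")

A score like this is one rough way to check whether generated lines hang together as a sequence, which connects to the evaluation strategies discussed in the session.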

Level: Intermediate

Duration: 2 hours

Prerequisites: Participants are expected to use their own computer running a Mac, Linux, or Windows operating system. Prior experience programming in Python and working in a UNIX/Linux environment is recommended but not required.

GitHub repository: https://github.com/rcc-uchicago/BERT+GPT2_tutorial_Summer2020
