FNLP: Lab Exercises

All labs except Lab 0 (week 1) are conducted during the timetabled lab sessions, which take place in odd weeks (1, 3, 5, 7, 9, 11). They are designed so that confident students can also do them in their own time. Lab 0 should be done in your own time (i.e., there is no timetabled lab session for it), some time before the beginning of week 2. Lab 1 and the subsequent labs will appear here in due course.
 
  • Week 1: Lab 0 + introduction.pdf. For the labs, we will be using Jupyter notebooks.
    Those who are unfamiliar with Jupyter notebooks can use Lab 0 to familiarise themselves with notebooks and perhaps refresh their use of Python.  [Solutions]
    Not in person: this lab is for you to do on your own; there will be no lab session on this day.
  • Week 3: Lab 1.  PyTorch Tensors and Naive Bayes. This lab has been tested on two notebook platforms, Notable and Google Colab (CPU server type), and we recommend using one of the two for your labs.  You should be able to access Notable via the Books & Course Tools section of FNLP's course page on Learn; from there you can upload this file and requirements.txt and begin. Alternatively, you can create a new notebook by uploading the notebook file (lab1-naivebayes-and-tensors.ipynb) to Google Colab and starting a "CPU" type server; you will then need to upload the requirements.txt file before starting. We are unable to provide support for people having difficulty running the labs in other set-ups. (A minimal sketch of the Naive Bayes idea appears after this list.) [Solutions]
  • Week 5: Lab 2. Char-RNN for Shakespeare. This lab will guide you through making a basic character-level RNN for generating Shakespearean text in PyTorch.  You will learn how to implement an RNN from scratch, and how to use PyTorch optimizers and learning rate schedulers in a custom training loop in order to train this RNN on a text dataset. You will evaluate performance with perplexity and loss. Finally, you will briefly explore sampling methods to generate novel text from this RNN. (The training-loop sketch below illustrates these pieces.) [Solutions]
  • Week 7: Lab 3. In this lab, you will perform character-level text generation again, but this time using a Transformer instead of an RNN. We will work with a small dataset of names. (See the Transformer sketch below.) [Solutions]
Solutions to each lab will appear during the week following that lab.
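
For a flavour of Lab 1, here is a minimal sketch of multinomial Naive Bayes written with PyTorch tensor operations. The toy documents, labels, and variable names are illustrative assumptions, not the lab's actual data or code.

```python
import torch

# Toy bag-of-words counts (hypothetical data): 4 documents over a
# 3-word vocabulary, with binary class labels.
X = torch.tensor([[2., 1., 0.],
                  [1., 2., 0.],
                  [0., 1., 2.],
                  [0., 0., 3.]])
y = torch.tensor([0, 0, 1, 1])
n_classes = 2

# Log class priors: log P(c) = log(count(c) / n_docs).
log_prior = torch.log(torch.bincount(y).float() / len(y))

# Per-class word counts with add-one (Laplace) smoothing,
# then log likelihoods log P(w | c).
counts = torch.stack([X[y == c].sum(dim=0) for c in range(n_classes)]) + 1.0
log_lik = torch.log(counts / counts.sum(dim=1, keepdim=True))

# Class scores (log posterior up to a constant) for a new document;
# argmax picks the predicted class.
doc = torch.tensor([1., 0., 2.])
scores = log_prior + doc @ log_lik.T
print(scores.argmax().item())
```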
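
In the same spirit, here is a minimal sketch of the Lab 2 ingredients: a character-level RNN trained in a custom loop with a PyTorch optimizer and learning rate scheduler, perplexity computed as the exponential of the cross-entropy loss, and temperature sampling for generation. The tiny corpus, architecture, and hyperparameters are placeholders, not the lab's actual settings.

```python
import torch
import torch.nn as nn

text = "to be or not to be"            # placeholder corpus
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
data = torch.tensor([stoi[ch] for ch in text])
inputs, targets = data[:-1], data[1:]  # predict the next character

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.RNN(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
sched = torch.optim.lr_scheduler.StepLR(opt, step_size=50, gamma=0.5)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    logits = model(inputs.unsqueeze(0))          # (1, T, vocab)
    loss = loss_fn(logits.squeeze(0), targets)   # mean NLL per character
    opt.zero_grad()
    loss.backward()
    opt.step()
    sched.step()                                 # decay the learning rate

# Perplexity is the exponential of the average cross-entropy loss.
print("perplexity:", loss.exp().item())

# Temperature sampling: divide logits by a temperature before the
# softmax, then draw the next character from the distribution.
with torch.no_grad():
    idx = data[:1].unsqueeze(0)
    for _ in range(20):
        nxt_logits = model(idx)[:, -1] / 0.8     # temperature 0.8
        nxt = torch.multinomial(nxt_logits.softmax(-1), 1)
        idx = torch.cat([idx, nxt], dim=1)
print("".join(chars[i] for i in idx[0].tolist()))
```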
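
Finally, a minimal sketch of the Lab 3 idea: next-character prediction with a Transformer encoder and a causal attention mask in place of an RNN. The toy name list, model sizes, and layer counts are assumptions for illustration; training would proceed exactly as in the RNN sketch above, with cross-entropy against the shifted sequence.

```python
import torch
import torch.nn as nn

names = ["emma", "olivia", "ava"]              # placeholder dataset
chars = sorted(set("".join(names))) + ["."]    # "." marks end of name
stoi = {ch: i for i, ch in enumerate(chars)}
vocab = len(chars)

class CharTransformer(nn.Module):
    def __init__(self, vocab, d_model=32, max_len=16):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        self.pos = nn.Embedding(max_len, d_model)   # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                           dim_feedforward=64,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, vocab)

    def forward(self, x):
        T = x.size(1)
        h = self.embed(x) + self.pos(torch.arange(T))
        # Causal mask: position t may only attend to positions <= t.
        mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        return self.head(self.encoder(h, mask=mask))

model = CharTransformer(vocab)
xs = torch.tensor([[stoi[c] for c in "emma"]])  # one toy example
logits = model(xs)                              # (1, 4, vocab) next-char scores
print(logits.shape)
```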

License
All rights reserved, The University of Edinburgh.