Question Generation using Natural Language Processing


Price: $24.99

This course focuses on using state-of-the-art Natural Language Processing techniques to solve the problem of question generation in edtech. Open any middle school textbook and at the end of every chapter you will find assessment questions: MCQs, True/False questions, fill-in-the-blanks, match the following, and so on. In this course, we will see how to take any text content and generate these assessment questions using NLP techniques. The course is a very practical application of NLP, putting everything from basic algorithms like word vectors (word2vec, GloVe, etc.) to recent advances like BERT, OpenAI GPT-2, and T5 transformers to real-world use. We will use NLP libraries such as spaCy, NLTK, AllenNLP, and Hugging Face Transformers. All sections are accompanied by easy-to-use Google Colab notebooks, which you can run for free in the cloud and use to train models on the free GPUs provided by Google.

Prerequisites: This course focuses on the practical use of the algorithms. Each algorithm is introduced at a high level, but the mathematics behind it is not the focus. A high-level understanding of deep learning concepts such as the forward pass, backpropagation, optimizers, and loss functions is expected. Strong Python programming skills and basic knowledge of Natural Language Processing and PyTorch are assumed.

Course outline (illustrative code sketches for several of these steps follow the outline):

- Generate distractors (wrong choices) for MCQ options. Students will use several approaches, such as WordNet, ConceptNet, and Sense2vec, to generate distractors for MCQ options.
- Generate True/False questions using pre-trained models such as Sentence-BERT, a constituency parser, and OpenAI GPT-2. Students will learn to use the constituency parser from AllenNLP to split a sentence, use GPT-2 to generate sentences with alternate endings, and filter them with Sentence-BERT.
- Generate MCQs from any content by training a T5 transformer model with the Hugging Face library. Students will understand the T5 transformer algorithm and use the SQuAD dataset to train a question generation model with the Hugging Face Transformers library and PyTorch Lightning.
- Generate fill-in-the-blank questions. Students will learn to use a Python keyword extraction library to extract keywords, use the flashtext library for fast keyword matching, and visualize fill-in-the-blanks using HTML ElementTree in Colab.
- Generate match-the-following questions. Students will learn to use a Python keyword extraction library to extract keywords, use the flashtext library for fast keyword matching, and use BERT for word sense disambiguation (WSD).
- Deploy question generation models to production. Students will deploy transformer models like T5 in a serverless fashion by converting them to ONNX format and applying quantization, then create lightweight Docker containers with FastAPI and deploy them on Google Cloud Run.
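As a taste of the distractor-generation step, here is a minimal sketch of the WordNet approach using NLTK: look up a word's synset, move one level up to its hypernym, and offer sibling terms (co-hyponyms) as wrong choices. The example word "cheetah" and the cutoff of three distractors are illustrative choices, not taken from the course material.

```python
# Minimal WordNet-based distractor sketch (illustrative, not the course's exact code).
import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

def wordnet_distractors(word, max_distractors=3):
    """Return sibling words (co-hyponyms) of `word` as candidate distractors."""
    synsets = wn.synsets(word, pos=wn.NOUN)
    if not synsets:
        return []
    synset = synsets[0]                      # take the most common sense
    hypernyms = synset.hypernyms()           # move one level up the hierarchy
    if not hypernyms:
        return []
    distractors = []
    for hyponym in hypernyms[0].hyponyms():  # siblings share the same hypernym
        name = hyponym.lemmas()[0].name().replace("_", " ")
        if name.lower() != word.lower() and name not in distractors:
            distractors.append(name)
    return distractors[:max_distractors]

print(wordnet_distractors("cheetah"))  # e.g. other big cats from the same hypernym
```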
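The True/False step can be sketched in the same spirit: take the opening of a sentence, let GPT-2 propose alternate endings, and use Sentence-BERT to rank how far each candidate drifts from the original. The hand-picked split point, the "gpt2" and "all-MiniLM-L6-v2" checkpoints, and the sampling settings below are assumptions for illustration; in the course the split point comes from AllenNLP's constituency parser.

```python
# Sketch: GPT-2 alternate endings filtered with Sentence-BERT (assumed checkpoints).
from transformers import GPT2LMHeadModel, GPT2Tokenizer
from sentence_transformers import SentenceTransformer, util

original = "The cheetah is the fastest land animal in the world."
prefix = "The cheetah is"                      # assumed split point for this sketch

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
gpt2 = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer(prefix, return_tensors="pt")
outputs = gpt2.generate(
    **inputs,
    max_length=20,
    do_sample=True,
    top_p=0.9,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,
)
candidates = [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# Rank candidates by similarity to the original sentence;
# the least similar ones make the most plausible "False" statements.
sbert = SentenceTransformer("all-MiniLM-L6-v2")
emb_orig = sbert.encode(original, convert_to_tensor=True)
emb_cand = sbert.encode(candidates, convert_to_tensor=True)
scores = util.cos_sim(emb_orig, emb_cand)[0]

for sentence, score in sorted(zip(candidates, scores.tolist()), key=lambda x: x[1]):
    print(f"{score:.2f}  {sentence}")
```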
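For the T5 question-generation step, the course trains its own model on SQuAD with PyTorch Lightning. The sketch below only illustrates the inference-time convention of highlighting the answer span in the context; the community checkpoint "valhalla/t5-base-qg-hl" is a stand-in assumption, not the course's model.

```python
# Sketch: T5 question generation at inference time (assumed public checkpoint).
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "valhalla/t5-base-qg-hl"        # assumed checkpoint, not from the course
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# The answer span is wrapped in <hl> markers so the model knows what to ask about.
context = ("generate question: <hl> Mount Everest <hl> is Earth's highest mountain "
           "above sea level, located in the Himalayas.")

inputs = tokenizer(context, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```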
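Fill-in-the-blanks is largely a keyword-matching exercise. Here is a minimal sketch with flashtext that assumes the keywords have already been chosen; in the course they are extracted automatically with a Python keyword-extraction library first.

```python
# Sketch: turning keywords into fill-in-the-blank items with flashtext
# (the keyword list is hard-coded here; normally it is extracted automatically).
from flashtext import KeywordProcessor

text = ("Photosynthesis is the process by which green plants use sunlight, "
        "water and carbon dioxide to create their own food.")
keywords = ["photosynthesis", "sunlight", "carbon dioxide"]   # assumed keywords

processor = KeywordProcessor(case_sensitive=False)
for kw in keywords:
    processor.add_keyword(kw, "_____")        # map each keyword to a blank

blanked = processor.replace_keywords(text)
print(blanked)                 # sentence with the keywords replaced by blanks
print("Answers:", keywords)    # the removed keywords serve as the answer key
```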
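Finally, for the deployment step, here is a minimal FastAPI sketch of the kind of service the course wraps in a Docker container for Google Cloud Run. The checkpoint name and request schema are assumptions for illustration, and the sketch skips the ONNX export and quantization the course performs before serving.

```python
# Sketch: serving a question-generation model behind FastAPI (assumed checkpoint).
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import T5ForConditionalGeneration, T5Tokenizer

MODEL_NAME = "valhalla/t5-base-qg-hl"        # assumed checkpoint, not the course's model

app = FastAPI()
tokenizer = T5Tokenizer.from_pretrained(MODEL_NAME)
model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME)

class QGRequest(BaseModel):
    context: str            # passage with the answer wrapped in <hl> markers

@app.post("/generate")
def generate_question(req: QGRequest):
    inputs = tokenizer("generate question: " + req.context,
                       return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs, max_length=64, num_beams=4)
    question = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return {"question": question}

# Run locally with:  uvicorn main:app --host 0.0.0.0 --port 8080
```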
