Text2Poetry

This NLP-based project is a Text-to-Poetry Generator that takes any definition, fact, or similar statement related to chemistry, physics, or biology as input and generates a poem based on it. The aim of the project is to make learning science concepts interesting for students so that they can remember these concepts for a long time. Complex science concepts can be difficult to remember and become a burden for students; this project helps them learn science in a fun way and reduces their stress. The implementation uses a pre-trained GPT-2 model together with a suitable API.

Steps involved in the implementation of this project (a minimal training sketch follows the list):

  1. Data Collection
  2. Data Cleaning
  3. Data Preprocessing
  4. Model Training
  5. Evaluation
  6. API development
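
The training pipeline behind these steps can be sketched roughly as follows. This is a minimal, illustrative example using the Hugging Face transformers and datasets libraries; the file name science_poetry.csv, the concept and poetry column names, and the hyperparameters are assumptions for demonstration, not the repository's exact code.

```python
# A minimal fine-tuning sketch, assuming a CSV with "concept" and "poetry" columns
# (file name, column names, and hyperparameters are placeholders).
import pandas as pd
from datasets import Dataset
from transformers import (
    GPT2Tokenizer,
    GPT2LMHeadModel,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Steps 1-2: load the concept/poetry pairs and drop empty or duplicate rows.
df = pd.read_csv("science_poetry.csv")        # hypothetical dataset file
df = df.dropna().drop_duplicates()

# Step 3: join each concept and its poem into one training string and tokenize.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token     # GPT-2 has no pad token by default
df["text"] = df["concept"] + "\n" + df["poetry"] + tokenizer.eos_token

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

dataset = Dataset.from_pandas(df[["text"]], preserve_index=False)
dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Step 4: fine-tune the pre-trained GPT-2 language model on the poem corpus.
model = GPT2LMHeadModel.from_pretrained("gpt2")
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="text2poetry-gpt2",            # illustrative settings
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_strategy="epoch",
)
Trainer(model=model, args=args, train_dataset=dataset, data_collator=collator).train()

# Save the fine-tuned model so the user interface can load it later.
model.save_pretrained("text2poetry-gpt2")
tokenizer.save_pretrained("text2poetry-gpt2")
```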

Tools and technologies used (a user-interface sketch follows the list):

  1. YouData.ai - for the dataset, which contains two columns: one for the concept/theory and the other for the corresponding poem
  2. Pandas - for exploratory data analysis and cleaning
  3. GPT-2 Tokenizer - for tokenization
  4. GPT-2 Language Model - for model training
  5. Streamlit - for user interface development
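
A minimal Streamlit interface along these lines might look like the sketch below; the model directory, prompt format, and sampling parameters are assumptions for illustration rather than the repository's actual app code. Saved as app.py, it would be started with `streamlit run app.py`.

```python
# A minimal Streamlit UI sketch, assuming a fine-tuned model saved to
# "text2poetry-gpt2" (directory name and prompt format are assumptions).
import streamlit as st
from transformers import GPT2LMHeadModel, GPT2Tokenizer

@st.cache_resource
def load_model():
    tokenizer = GPT2Tokenizer.from_pretrained("text2poetry-gpt2")
    model = GPT2LMHeadModel.from_pretrained("text2poetry-gpt2")
    return tokenizer, model

st.title("Text2Poetry")
concept = st.text_area("Enter a science concept, definition, or fact:")

if st.button("Generate poem") and concept:
    tokenizer, model = load_model()
    inputs = tokenizer(concept + "\n", return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=120,
        do_sample=True,          # sampling gives more varied, "poetic" output
        top_p=0.95,
        temperature=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    poem = tokenizer.decode(outputs[0], skip_special_tokens=True)
    st.write(poem)
```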

Screenshots of the application: Screenshot (139), Screenshot (140), Screenshot (141).