[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
Transformer-based models implemented in TensorFlow 2.x (using Keras).
Grammar test suite for masked language models
Sample tutorials for training Natural Language Processing Models with Transformers
The source code used for paper "Empower Entity Set Expansion via Language Model Probing", published in ACL 2020.
Score masked language models on grammar test suites
Recent Advances in Vision-Language Pre-training!
Comparing Selective Masking Methods for Depression Detection in Social Media
Code to reproduce experiments from the paper "Continual Pre-Training Mitigates Forgetting in Language and Vision" https://arxiv.org/abs/2205.09357
BERT Attention Visualization is a web application powered by Streamlit, offering intuitive visualization of attention weights generated by BERT-based models.
Code for "Using Masked Language Model Probabilities of Connectives for Stance Detection in English Discourse"
Final assignment for the "Gestione dell'Informazione" ("Search Engines") course @ UniMoRe
Measuring Biases in Masked Language Models for PyTorch Transformers. Support for multiple social biases and evaluation measures.
Transformer pre-training with the MLM objective: an encoder-only model implemented and trained from scratch on a Wikipedia dataset.
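Several of the repositories above pre-train or probe models with the masked-language-modeling objective. As a rough illustration of that objective (not code from any listed repo), the sketch below implements the standard BERT-style corruption rule: roughly 15% of positions are selected as prediction targets, of which 80% are replaced with a `[MASK]` token, 10% with a random vocabulary token, and 10% are left unchanged. The function name, `vocab` argument, and token-string representation are illustrative assumptions.

```python
import random

MASK_TOKEN = "[MASK]"  # placeholder token; real tokenizers define their own

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=None):
    """BERT-style MLM corruption (illustrative sketch).

    Selects ~mask_prob of positions as prediction targets; of those,
    80% become MASK_TOKEN, 10% a random vocab token, 10% stay as-is.
    Returns (corrupted_tokens, labels); labels[i] is None where the
    model has no prediction target at position i.
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # original token is the target
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK_TOKEN
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: keep the original token unchanged
    return corrupted, labels
```

During training, a cross-entropy loss is computed only at positions where `labels[i]` is not `None`.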
[CHIL 2024] Interpretation of Intracardiac Electrograms Through Textual Representations
A Context Aware Approach for Generating Natural Language Attacks.
Data pipelines for both TensorFlow and PyTorch!
Unscrambles shuffled letters in a word sequence.
Code for "ELLEN: Extremely Lightly Supervised Learning For Efficient Named Entity Recognition" (LREC-COLING 2024)
A transformer-based language model trained on politics-related Twitter data. This repo is the official resource of the paper "PoliBERTweet: A Pre-trained Language Model for Analyzing Political Content on Twitter", LREC 2022