Sentence Pair Modeling
6 papers with code • 0 benchmarks • 0 datasets
Sentence pair modeling compares two sentences and determines the relationship between them based on their internal representations.
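As a rough illustration of the task, the toy sketch below encodes each sentence independently with a shared bag-of-embeddings encoder, combines the two representations, and classifies their relation. The vocabulary size, dimensions, and three-way label set are illustrative assumptions, not taken from any of the papers listed here.

import torch
import torch.nn as nn

class SentencePairClassifier(nn.Module):
    def __init__(self, vocab_size=1000, dim=64, num_labels=3):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)      # shared toy sentence encoder
        self.classifier = nn.Linear(dim * 4, num_labels)   # [u; v; |u - v|; u * v] -> relation label

    def forward(self, sent_a, sent_b):
        u = self.embed(sent_a)                              # representation of sentence A
        v = self.embed(sent_b)                              # representation of sentence B
        features = torch.cat([u, v, (u - v).abs(), u * v], dim=-1)
        return self.classifier(features)                    # relation logits (e.g. NLI labels)

model = SentencePairClassifier()
a = torch.randint(0, 1000, (2, 7))   # batch of two token-id sequences
b = torch.randint(0, 1000, (2, 7))
print(model(a, b).shape)             # torch.Size([2, 3])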
Most implemented papers
ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations
ZEN, a Chinese text encoder pre-trained with enhanced n-gram representations, is shown to obtain reasonable performance even when trained on a small corpus, which is important for applying pre-training techniques to scenarios with limited data.
Character-based Neural Networks for Sentence Pair Modeling
Sentence pair modeling is critical for many NLP tasks, such as paraphrase identification, semantic textual similarity, and natural language inference.
Neural Network Models for Paraphrase Identification, Semantic Textual Similarity, Natural Language Inference, and Question Answering
In this paper, we analyze several neural network designs (and their variations) for sentence pair modeling and compare their performance extensively across eight datasets, including paraphrase identification, semantic textual similarity, natural language inference, and question answering tasks.
Scalable Attentive Sentence-Pair Modeling via Distilled Sentence Embedding
In this paper, we introduce Distilled Sentence Embedding (DSE) - a model that is based on knowledge distillation from cross-attentive models, focusing on sentence-pair tasks.
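A hedged sketch of the distillation idea follows: a cross-attentive teacher scores sentence pairs, and a lightweight bi-encoder student is trained to reproduce those scores from independently computed sentence embeddings. The teacher_score stand-in, the cosine-similarity student head, and the MSE objective are illustrative assumptions rather than the exact DSE architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

student = nn.EmbeddingBag(1000, 64)                 # stand-in sentence embedding encoder
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def teacher_score(sent_a, sent_b):
    # Hypothetical cross-attentive teacher: in practice this would be a
    # fine-tuned BERT-style model attending over both sentences jointly.
    return torch.rand(sent_a.size(0))

for step in range(100):
    a = torch.randint(0, 1000, (8, 10))             # toy token-id batches
    b = torch.randint(0, 1000, (8, 10))
    with torch.no_grad():
        target = teacher_score(a, b)                # soft labels from the teacher
    u, v = student(a), student(b)                   # independent sentence embeddings
    pred = F.cosine_similarity(u, v)                # cheap pairwise score at inference time
    loss = F.mse_loss(pred, target)                 # distillation objective
    optimizer.zero_grad(); loss.backward(); optimizer.step()

The payoff of this setup is that sentence embeddings can be precomputed and compared cheaply at inference time, while the expensive cross-attentive model is only needed during training.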
Augmented SBERT: Data Augmentation Method for Improving Bi-Encoders for Pairwise Sentence Scoring Tasks
Cross-encoders achieve higher performance on pairwise sentence scoring but are often too slow for practical use; bi-encoders, on the other hand, require substantial training data and fine-tuning over the target task to achieve competitive performance. Augmented SBERT uses the cross-encoder to label additional input pairs and augment the bi-encoder's training data.
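A minimal sketch of that augmentation recipe, assuming the sentence-transformers library: a cross-encoder pseudo-labels extra ("silver") sentence pairs, which are then added to the bi-encoder's fine-tuning data. The model names and toy pairs are placeholders, and the paper's pair-sampling strategies are omitted.

from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, CrossEncoder, InputExample, losses

# 1) Score unlabeled pairs with a slow but accurate cross-encoder.
cross_encoder = CrossEncoder("cross-encoder/stsb-roberta-base")
unlabeled_pairs = [("A man is playing guitar.", "Someone plays an instrument."),
                   ("A dog runs in the park.", "The stock market fell today.")]
silver_scores = cross_encoder.predict(unlabeled_pairs)

# 2) Turn the pseudo-labeled ("silver") pairs into bi-encoder training examples.
silver_examples = [InputExample(texts=list(pair), label=float(score))
                   for pair, score in zip(unlabeled_pairs, silver_scores)]

# 3) Fine-tune the bi-encoder on the augmented data (gold pairs would be added here too).
bi_encoder = SentenceTransformer("all-MiniLM-L6-v2")
loader = DataLoader(silver_examples, shuffle=True, batch_size=2)
loss = losses.CosineSimilarityLoss(bi_encoder)
bi_encoder.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=10)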
Once is Enough: A Light-Weight Cross-Attention for Fast Sentence Pair Modeling
Transformer-based models have achieved great success on sentence pair modeling tasks, such as answer selection and natural language inference (NLI).