RobustEmbed: Robust Sentence Embeddings Using Self-Supervised Contrastive Pre-Training
Published in Findings of the Association for Computational Linguistics: EMNLP 2023
The paper proposes RobustEmbed, a self-supervised sentence embedding framework designed to improve both generalization on standard text representation tasks and robustness under adversarial scenarios. By generating high-risk adversarial perturbations and leveraging a novel contrastive objective, RobustEmbed learns high-quality sentence embeddings.
Recommended citation: Javad Rafiei Asl, Eduardo Blanco, and Daniel Takabi. "RobustEmbed: Robust Sentence Embeddings Using Self-Supervised Contrastive Pre-Training." In Findings of the Association for Computational Linguistics: EMNLP 2023. https://aclanthology.org/2023.findings-emnlp.305