Fair-Sentence-Transformers is an inference-time attention calibration method designed to mitigate positional and language biases in state-of-the-art embedding models. It redistributes attention more evenly across document segments, improving the discoverability of content that appears later in a document or is written in lower-resource languages.
In plain terms, Fair-Sentence-Transformers helps AI models understand long texts by ensuring that all parts of the text, not just the beginning, receive comparable weight. It corrects biases that favor early passages or high-resource languages, making search results fairer and more complete.
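The core idea of redistributing attention can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the actual Fair-Sentence-Transformers implementation: it blends a model's attention weights with a uniform distribution (the `alpha` parameter and `calibrate_attention` name are hypothetical) so that later segments receive a larger share of attention while each row still sums to one.

```python
import numpy as np

def calibrate_attention(attn: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """Illustrative attention calibration (hypothetical helper).

    Blends each row of an attention matrix with a uniform distribution,
    pulling weight away from over-attended early positions and toward
    under-attended later ones. Rows remain valid distributions.
    """
    seq_len = attn.shape[-1]
    uniform = np.full_like(attn, 1.0 / seq_len)
    # Convex combination keeps rows normalized: (1-a)*attn + a*uniform
    return (1.0 - alpha) * attn + alpha * uniform

# Example: a front-heavy attention pattern over three segments.
attn = np.array([[0.7, 0.2, 0.1],
                 [0.6, 0.3, 0.1],
                 [0.5, 0.3, 0.2]])
calibrated = calibrate_attention(attn, alpha=0.3)
# The last segment's weight increases in every row, and each row still sums to 1.
```

In a real system, a calibration like this would be applied inside the encoder's attention layers at inference time rather than to a standalone matrix; the sketch only shows the redistribution arithmetic.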
attention calibration, bias mitigation for embeddings, fair embedding models