sentence_transformers/losses/CachedMultipleNegativesRankingLoss.py (+2 −2)

```diff
@@ -144,15 +144,15 @@ def __init__(
             - ``"hard_negatives"``: Applies ``hardness_strength * stop_grad(cos_sim)`` only to the logits of
               explicit hard negatives, leaving in-batch negatives unpenalized. Only active when explicit
               negatives are provided. As used in
-              `Lan et al. 2025 <https://huggingface.co/papers/2509.20354>`_ (EmbeddingGemma).
+              `Schechter Vera et al. 2025 <https://huggingface.co/papers/2509.20354>`_ (EmbeddingGemma).
             - ``"all_negatives"``: Applies ``hardness_strength * stop_grad(cos_sim)`` to every negative logit,
               both in-batch negatives and explicit hard negatives, leaving only the positive unpenalized.
               Combines the effect of ``"in_batch_negatives"`` and ``"hard_negatives"``.

         hardness_strength: Strength of the hardness weighting. The meaning depends on ``hardness_mode``:

             - For ``"in_batch_negatives"``: acts as ``alpha`` in the hardness penalty, `Lan et al. 2025 <https://huggingface.co/papers/2503.04812>`_ uses 9.
-            - For ``"hard_negatives"``: acts as ``alpha`` in the hardness penalty, `Lan et al. 2025 <https://huggingface.co/papers/2509.20354>`_ uses 5.
+            - For ``"hard_negatives"``: acts as ``alpha`` in the hardness penalty, `Schechter Vera et al. 2025 <https://huggingface.co/papers/2509.20354>`_ uses 5.

             Must be non-negative. Ignored when ``hardness_mode`` is ``None``.
```
sentence_transformers/losses/MultipleNegativesRankingLoss.py (+2 −2)

```diff
@@ -93,15 +93,15 @@ def __init__(
             - ``"hard_negatives"``: Applies ``hardness_strength * stop_grad(cos_sim)`` only to the logits of
               explicit hard negatives, leaving in-batch negatives unpenalized. Only active when explicit
               negatives are provided. As used in
-              `Lan et al. 2025 <https://huggingface.co/papers/2509.20354>`_ (EmbeddingGemma).
+              `Schechter Vera et al. 2025 <https://huggingface.co/papers/2509.20354>`_ (EmbeddingGemma).
             - ``"all_negatives"``: Applies ``hardness_strength * stop_grad(cos_sim)`` to every negative logit,
               both in-batch negatives and explicit hard negatives, leaving only the positive unpenalized.
               Combines the effect of ``"in_batch_negatives"`` and ``"hard_negatives"``.

         hardness_strength: Strength of the hardness weighting. The meaning depends on ``hardness_mode``:

             - For ``"in_batch_negatives"``: acts as ``alpha`` in the hardness penalty, `Lan et al. 2025 <https://huggingface.co/papers/2503.04812>`_ uses 9.
-            - For ``"hard_negatives"``: acts as ``alpha`` in the hardness penalty, `Lan et al. 2025 <https://huggingface.co/papers/2509.20354>`_ uses 5.
+            - For ``"hard_negatives"``: acts as ``alpha`` in the hardness penalty, `Schechter Vera et al. 2025 <https://huggingface.co/papers/2509.20354>`_ uses 5.

             Must be non-negative. Ignored when ``hardness_mode`` is ``None``.
```
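The docstrings above describe adding ``hardness_strength * stop_grad(cos_sim)`` to negative logits before the cross-entropy step of the ranking loss. A minimal sketch of the ``"in_batch_negatives"`` mode is below; the function name, the ``scale`` default, and the exact masking of the positive are assumptions for illustration, not the library's actual implementation:

```python
import torch
import torch.nn.functional as F


def hardness_weighted_mnrl(anchors, positives, alpha=9.0, scale=20.0):
    """Illustrative sketch: MultipleNegativesRankingLoss with an additive
    hardness penalty on in-batch negative logits (names/defaults assumed)."""
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    cos_sim = a @ p.T  # (batch, batch); the diagonal holds the positive pairs
    logits = scale * cos_sim
    # alpha * stop_grad(cos_sim): harder (more similar) negatives get a larger
    # additive penalty, so the softmax pushes on them harder. detach() plays
    # the role of stop_grad, keeping the penalty out of the gradient.
    penalty = alpha * cos_sim.detach()
    # Zero the diagonal so the positive logit stays unpenalized.
    penalty = penalty - torch.diag(torch.diag(penalty))
    logits = logits + penalty
    labels = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, labels)
```

The ``"hard_negatives"`` mode would instead apply the same penalty only to the columns corresponding to explicitly provided hard negatives, leaving in-batch negatives untouched.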