Is there any way to use nonseparable losses with XGBoost (or other boosting libraries)? #11921
Unanswered
alekfrohlich asked this question in Q&A
Replies: 1 comment 2 replies
Hi, it's not entirely clear to me what exactly the loss function is here ("nonseparable losses"?). At a high level, XGBoost works with pointwise gradients: as long as you can derive the pointwise gradient and hessian for each sample, and the hessian is positive (possibly after some transformation), it should work with XGBoost.
I am dealing with the following loss function: $\frac{1}{n(n-1)}\sum_{i\neq j} f(x_i, y_j)^2 - \frac{2}{n}\sum_{i=1}^n f(x_i, y_i)$. I would like to find a function $f$ that minimizes this loss. It is possible to take a functional-gradient point of view on the loss; however, XGBoost doesn't seem to support nonseparable losses. Any suggestions?
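As a sketch of the functional-gradient view (my own reading, treating each pair prediction $f_{ij} = f(x_i, y_j)$ as a separate output over all pairs $(i, j)$), the per-pair partial derivatives of the loss are:

```latex
L = \frac{1}{n(n-1)}\sum_{i\neq j} f_{ij}^2 - \frac{2}{n}\sum_{i=1}^n f_{ii},
\qquad
\frac{\partial L}{\partial f_{ij}} = \frac{2\,f_{ij}}{n(n-1)} \quad (i \neq j),
\qquad
\frac{\partial L}{\partial f_{ii}} = -\frac{2}{n}.
```

Note that under this pairing the second derivative of the diagonal terms is zero, so the positive-hessian condition would need some workaround.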