
DANN for Regression #30

Open
Farahani1 opened this issue Mar 8, 2019 · 7 comments

Comments


Farahani1 commented Mar 8, 2019

Hi,
I have modified this project to try domain-adversarial learning on a regression problem: I replaced the label predictor with a regressor, but this only makes the regression accuracy worse. Do you have any idea why this happens?
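(For context on the setup being discussed: the core DANN ingredient that must survive swapping the classifier head for a regressor is the gradient reversal layer between the shared feature extractor and the domain classifier. A minimal NumPy sketch with manual gradients — a hypothetical toy model for illustration, not this repository's code:)

```python
import numpy as np

def grl_forward(features):
    # Gradient Reversal Layer: identity in the forward pass
    return features

def grl_backward(upstream_grad, lam=1.0):
    # ...but flips (and scales) the gradient in the backward pass, so the
    # feature extractor is trained to *confuse* the domain classifier
    return -lam * upstream_grad

# Toy shared feature extractor: f = W @ x
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
x = rng.normal(size=4)
f = W @ x

# Domain head: scalar logit d = v . f; suppose dL_dom/dd = g
v = rng.normal(size=3)
g = 0.7
grad_f_domain = grl_backward(g * v, lam=0.5)  # gradient reaching the features
grad_W_domain = np.outer(grad_f_domain, x)    # gradient on the shared weights

# Without the GRL the feature gradient would be g * v; the GRL flips its sign,
# which is what pushes the features toward domain invariance. The regression
# head's gradient passes through unchanged (no GRL on that branch).
```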

@jaehongyoon
jaehongyoon commented Mar 8, 2019 via email

Well, it is not right to directly apply the loss functions of a classification problem to a regression problem, since in regression you are not calculating the posterior of y. Can you specify the loss you used? Instead, I would suggest using Monte Carlo sampling to estimate the entropy of y and using that as a loss: sample y according to the latent variables for the given X and build an empirical p(y) (I use KDE for its simplicity and robustness).

@Farahani1 Farahani1 reopened this Mar 8, 2019

@Farahani1 (Author)

I used Mean Square Error of the output (y) as the loss function for the regression.
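(A sketch of what jaehongyoon's suggestion could look like: estimate the entropy of y by Monte Carlo, scoring sampled y values under a Gaussian KDE built from the same samples. Function name and bandwidth are illustrative assumptions, not code from this repository:)

```python
import numpy as np

def kde_entropy(samples, bandwidth=0.3):
    """Monte Carlo entropy estimate H(y) ~ -mean(log p_hat(y_i)),
    where p_hat is a Gaussian KDE built from the samples themselves
    (leave-one-out to reduce bias)."""
    y = np.asarray(samples, dtype=float)
    n = len(y)
    diffs = y[:, None] - y[None, :]
    k = np.exp(-0.5 * (diffs / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    np.fill_diagonal(k, 0.0)          # exclude each point's own kernel
    p_hat = k.sum(axis=1) / (n - 1)   # empirical density at each sample
    return -np.mean(np.log(p_hat + 1e-12))

rng = np.random.default_rng(0)
y_samples = rng.normal(loc=0.0, scale=1.0, size=2000)
h = kde_entropy(y_samples)
# for N(0,1) the analytic entropy is 0.5*log(2*pi*e) ~ 1.42,
# so h should come out close to that (up to KDE smoothing bias)
```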


jaehongyoon commented Mar 8, 2019 via email

@Farahani1 (Author)

I had not considered the point you mentioned, but now I think I understand what you mean. I will rework my project following your guidance. Thanks for your help, @jaehongyoon.


jaehongyoon commented Mar 8, 2019 via email

@Farahani1 (Author)

Would you please explain more about your idea of Monte Carlo sampling and latent variables? Do you mean that I should use Expectation Maximization? And what exactly do you mean by the posterior: the conditional probability of y given x?

@jaehongyoon
Copy link

jaehongyoon commented Mar 10, 2019 via email
