Generalize the type signature for y. #141

Open · wants to merge 2 commits into base: master

Conversation

@emsal0 commented Feb 28, 2020

This is the first step of the work I'm laying out in issue #140 for supporting vector-valued GPs.

I'd be interested in discussing what the next steps are after this; specifically, I'd like to begin work on supplementing the likelihood and mean entities. What pattern should be employed so that the current Likelihood and Mean types won't work (i.e., error out) for y such that !(Y <: AbstractVector)? Is that even the right path to take?

Instead of just AbstractVector, the type signature for y now also includes AbstractMatrix.

This has no effect on functionality yet, as y values that don't conform to AbstractVector will raise an error at the GPA() call.
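
For reference, here is a minimal Julia sketch of the kind of change being described. The names (Observations, fit_gpa) are hypothetical stand-ins for illustration, not this package's actual API:

```julia
# Hypothetical sketch, not the package's actual code: the observation field
# accepts either a vector or a matrix of targets, but the constructor still
# rejects anything that isn't an AbstractVector for now.
struct Observations{Y<:Union{AbstractVector,AbstractMatrix}}
    y::Y
end

function fit_gpa(x::AbstractMatrix, y::Union{AbstractVector,AbstractMatrix})
    # The wider signature is accepted at the type level, but non-vector
    # targets error out until the vector-valued machinery exists.
    y isa AbstractVector{<:Real} ||
        error("vector-valued y is not supported yet; pass an AbstractVector{<:Real}")
    return Observations(y)
end
```
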
@chris-nemeth (Member)

Has this been tested against any benchmarks or test scripts? I think extending the GP to the vector-valued setting requires some extra work on the kernel. If I remember correctly, you need to introduce a block diagonal matrix and then take a Kronecker product between the new matrix and the kernel matrix. There's a paper here on this topic.
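
For context, one common construction for multi-output GPs (the intrinsic coregionalisation model) takes a Kronecker product between a D×D output-coupling matrix B and the N×N kernel matrix K. A small illustrative sketch, with hypothetical names, assuming that construction:

```julia
using LinearAlgebra

# Full covariance over D outputs at N inputs: B ⊗ K, a (D*N)×(D*N) matrix
# whose (i, j) block equals B[i, j] * K.
multioutput_cov(K::AbstractMatrix, B::AbstractMatrix) = kron(B, K)

# Toy usage: squared-exponential kernel over 3 inputs, 2 coupled outputs.
sqexp(x, x′) = exp(-0.5 * (x - x′)^2)
X = [0.0, 0.5, 1.0]
K = [sqexp(xi, xj) for xi in X, xj in X]   # 3×3 kernel matrix
B = [1.0 0.3; 0.3 1.0]                     # 2×2 output-coupling matrix
Σ = multioutput_cov(K, B)                  # 6×6 joint covariance
```
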

@emsal0 (Author) commented Mar 11, 2020

The changes in this PR don't change anything functionally; they only generalize the type signature for y, and the call to GPA will still error out if y is not <: AbstractVector{Real}.

I'll give the paper a read. Are there any general guidelines for writing test scripts/benchmarks for the implementation as I write more of it?

Thanks

@thomaspinder (Member)

In general, the unit tests you write, @emsal1863, should ensure that any features you add work correctly and, where relevant, that nothing else has been broken as a result of your changes. The former can usually be done by evaluating an inequality arising from your implementation for which you know what "the truth" should be; a simple example of such tests can be found in the GP unit test. For the latter point, it is probably sufficient to ensure that the behaviour of any existing code you change is appropriately captured by an existing unit test.
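
For instance, a sketch of both kinds of test using Julia's Test standard library; fit_gpa here is the hypothetical constructor from the earlier sketch, not the package's actual GPA:

```julia
using Test

@testset "y type generalisation" begin
    # "Known truth" check: a valid vector of targets is accepted unchanged.
    y = [0.1, -0.4, 2.0]
    @test fit_gpa(rand(2, 3), y).y == y

    # Error-path check: matrix-valued targets are rejected for now.
    @test_throws ErrorException fit_gpa(rand(2, 3), rand(3, 2))
end
```
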

@emsal0 (Author) commented Mar 28, 2020

I just included a unit test for the error case in GPA().

@emsal0 (Author) commented Apr 11, 2020

> Has this been tested against any benchmarks or test scripts? I think extending the GP to the vector-valued setting requires some extra work on the kernel. If I remember correctly, you need to introduce a block diagonal matrix and then take a Kronecker product between the new matrix and the kernel matrix. There's a paper here on this topic.

@chris-nemeth, a couple of questions for the near future:

What would the contents of the block diagonal matrix be?

Also, would additional work need to be done on the MCMC sampler?
