Commit 129b550
[Spark][Build] Fix filtering of internal deps from final Spark published jar to account for Spark-versioned builds (delta-io#5542)
#### Which Delta project/connector is this regarding?
- [X] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)
## Description
When trying to compile the examples for
delta-io#5530, I ran into this issue because
we changed the artifact naming for non-default Spark versions. It's hard
to test this without updating `examples/build.sbt`, which we should do
eventually; for now, just make this fix.
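
For context, here is a minimal sketch of the kind of pom filtering this fix concerns. This is not the actual change from the commit; the module name `delta-spark-internal`, the `-3.5` suffix format, and the use of `pomPostProcess` are assumptions for illustration. The idea is that when building against a non-default Spark version, internal artifactIds gain a version suffix, so an exact-name match no longer catches them, while a prefix match does:

```scala
// build.sbt sketch (hypothetical): drop internal modules from the published pom.
// Assumptions: internal deps are filtered via pomPostProcess, and
// Spark-versioned builds append a suffix such as "-3.5" to artifactIds.
import scala.xml.{Elem, Node}
import scala.xml.transform.{RewriteRule, RuleTransformer}

// Hypothetical internal module that must not appear in the published pom.
val internalArtifacts = Set("delta-spark-internal")

pomPostProcess := { node: Node =>
  new RuleTransformer(new RewriteRule {
    override def transform(n: Node): Seq[Node] = n match {
      case e: Elem if e.label == "dependency" =>
        val artifactId = (e \ "artifactId").text
        // An exact match misses Spark-versioned names like
        // "delta-spark-internal-3.5_2.13", so match on prefix instead.
        if (internalArtifacts.exists(artifactId.startsWith)) Seq.empty
        else Seq(n)
      case other => Seq(other)
    }
  }).transform(node).head
}
```

Prefix matching is one simple way to cover both the default and the suffixed artifact names; the real build may key off the module's organization or a computed name instead.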
## How was this patch tested?
Compiled examples locally for
delta-io#5530
## Does this PR introduce _any_ user-facing changes?
No
1 file changed: +4 −1 (one line removed, four lines added, around lines 586–592 of the original file).