
Commit 790939a

Bugfix of cutpoint calculations in ordinal regression notebook (#564)

* Draft update of BNN notebook
* Pre-commit fixes
* Address reviewer comments
* Fixed min-max scaling in constrainedUniform
* Cleanup and fixes

1 parent 904fb16 commit 790939a

File tree

2 files changed (+10, -8 lines)


examples/generalized_linear_models/GLM-ordinal-regression.ipynb

Lines changed: 5 additions & 4 deletions
@@ -375,8 +375,8 @@
 " pt.concatenate(\n",
 " [\n",
 " np.ones(1) * min,\n",
-" pt.extra_ops.cumsum(pm.Dirichlet(\"cuts_unknown\", a=np.ones(N - 2)))\n",
-" * (min + (max - min)),\n",
+" pt.extra_ops.cumsum(pm.Dirichlet(\"cuts_unknown\", a=np.ones(N - 2))) * (max - min)\n",
+" + min,\n",
 " ]\n",
 " ),\n",
 " )"
@@ -387,7 +387,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"The above function, (brainchild of Dr Ben Vincen and Adrian Seyboldt), looks a little indimidating, but it's just a convenience function to specify a prior over the cutpoints in our $Y_{latent}$. The Dirichlet distribution is special in that draws from the distribution must sum to one. The above function ensures that each draw from the prior distribution is a cumulative share of the maximum category greater than the minimum of our ordinal categorisation. "
+"The above function, (brainchild of Dr Ben Vincent and Adrian Seyboldt), looks a little indimidating, but it's just a convenience function to specify a prior over the cutpoints in our $Y_{latent}$. The Dirichlet distribution is special in that draws from the distribution must sum to one. The above function ensures that each draw from the prior distribution is a cumulative share of the maximum category greater than the minimum of our ordinal categorisation. "
 ]
 },
 {
@@ -3146,7 +3146,8 @@
 " [\n",
 " np.ones(1) * min,\n",
 " pt.extra_ops.cumsum(pm.Dirichlet(f\"cuts_unknown_{group}\", a=np.ones(N - 2)))\n",
-" * (min + (max - min)),\n",
+" * (max - min)\n",
+" + min,\n",
 " ]\n",
 " ),\n",
 " )"
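The fix above replaces the scaling factor `(min + (max - min))`, which simplifies to `max` and silently drops the lower bound, with the affine map `* (max - min) + min`. A minimal NumPy sketch (not part of the commit; the variable names and bounds are illustrative) of why the new map is the right one:

```python
import numpy as np

rng = np.random.default_rng(0)

lo, hi, N = 2.0, 10.0, 7  # hypothetical bounds and category count

# Stand-in for a pm.Dirichlet draw: N - 2 positive shares summing to one
shares = rng.dirichlet(np.ones(N - 2))

# Old (buggy) scaling: (lo + (hi - lo)) == hi, so the offset lo is lost
# and early cutpoints can fall below the intended minimum.
old_cuts = np.cumsum(shares) * (lo + (hi - lo))

# Fixed affine map: cumulative shares stretched over (lo, hi]
new_cuts = np.cumsum(shares) * (hi - lo) + lo
```

With the old scaling, any cumulative share below `lo / hi` yields a cutpoint under the intended minimum; the fixed map sends share 0 to `lo` and share 1 to `hi` exactly.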

examples/generalized_linear_models/GLM-ordinal-regression.myst.md

Lines changed: 5 additions & 4 deletions
@@ -197,14 +197,14 @@ def constrainedUniform(N, min=0, max=1):
 pt.concatenate(
 [
 np.ones(1) * min,
-pt.extra_ops.cumsum(pm.Dirichlet("cuts_unknown", a=np.ones(N - 2)))
-* (min + (max - min)),
+pt.extra_ops.cumsum(pm.Dirichlet("cuts_unknown", a=np.ones(N - 2))) * (max - min)
++ min,
 ]
 ),
 )
 ```

-The above function, (brainchild of Dr Ben Vincen and Adrian Seyboldt), looks a little indimidating, but it's just a convenience function to specify a prior over the cutpoints in our $Y_{latent}$. The Dirichlet distribution is special in that draws from the distribution must sum to one. The above function ensures that each draw from the prior distribution is a cumulative share of the maximum category greater than the minimum of our ordinal categorisation.
+The above function, (brainchild of Dr Ben Vincent and Adrian Seyboldt), looks a little indimidating, but it's just a convenience function to specify a prior over the cutpoints in our $Y_{latent}$. The Dirichlet distribution is special in that draws from the distribution must sum to one. The above function ensures that each draw from the prior distribution is a cumulative share of the maximum category greater than the minimum of our ordinal categorisation.

 ```{code-cell} ipython3
 :tags: [hide-output]
@@ -554,7 +554,8 @@ def constrainedUniform(N, group, min=0, max=1):
 [
 np.ones(1) * min,
 pt.extra_ops.cumsum(pm.Dirichlet(f"cuts_unknown_{group}", a=np.ones(N - 2)))
-* (min + (max - min)),
+* (max - min)
++ min,
 ]
 ),
 )
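Assembled, the corrected helper computes the transform below. This is a NumPy-only sketch: the notebook's real `constrainedUniform` returns a `pm.Deterministic` built from `pm.Dirichlet` and `pt.extra_ops.cumsum`, so the plain-NumPy Dirichlet draw and the function and argument names here are stand-ins for checking the transform's endpoint behaviour.

```python
import numpy as np

def constrained_uniform_sketch(N, min=0.0, max=1.0, seed=None):
    """NumPy analogue of the notebook's constrainedUniform: returns
    N - 1 ordered cutpoints, the first pinned at `min`, the rest
    spread over (min, max] via cumulative Dirichlet shares.
    (Shadowing the min/max builtins mirrors the notebook's signature.)"""
    rng = np.random.default_rng(seed)
    shares = rng.dirichlet(np.ones(N - 2))  # stand-in for pm.Dirichlet
    return np.concatenate(
        [
            np.ones(1) * min,
            np.cumsum(shares) * (max - min) + min,  # the corrected map
        ]
    )

cuts = constrained_uniform_sketch(N=7, min=2.0, max=10.0, seed=0)
```

The resulting cutpoints are strictly increasing, start at `min`, and the last one lands on `max` because the Dirichlet shares sum to one.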
