There is a shape-mismatch bug in dot_likelihood() within infer_actively_pymdp/pymdp.maths.py, where np.squeeze(X) removes dimensions incorrectly. For example, when X has shape (1, 25, 1), the result has shape (25,) instead of the expected (25, 1), causing shape inconsistencies in downstream calculations.
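The over-squeezing can be reproduced with plain NumPy, independent of pymdp (a minimal sketch):

```python
import numpy as np

# shape produced inside dot_likelihood after the keepdims=True sum
X = np.zeros((1, 25, 1))

# np.squeeze with no axis drops *every* singleton dimension
print(np.squeeze(X).shape)          # (25,)

# restricting squeeze to axis=0 drops only the leading one
print(np.squeeze(X, axis=0).shape)  # (25, 1)
```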
Recreation
import numpy as np
from pymdp.maths import dot_likelihood

A = np.expand_dims(np.eye(25), axis=-1)  # observation model, shape (25, 25, 1)
obs = np.eye(1, 25).flatten()            # one-hot observation, shape (25,)
LL = dot_likelihood(A, obs)              # shape (25,) -- expected (25, 1)
Suggested fix
Given the code
def dot_likelihood(A, obs):
    s = np.ones(np.ndim(A), dtype=int)
    s[0] = obs.shape[0]
    X = A * obs.reshape(tuple(s))
    X = np.sum(X, axis=0, keepdims=True)
    LL = np.squeeze(X)

    # check to see if `LL` is a scalar
    if np.prod(LL.shape) <= 1.0:
        LL = LL.item()
        LL = np.array([LL]).astype("float64")
    return LL
change LL = np.squeeze(X) to LL = np.squeeze(X, axis=0), so that only the leading singleton dimension introduced by keepdims=True is removed.
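For reference, here is a standalone sketch of the function with the proposed one-line change applied (reproduced outside pymdp so the shapes can be checked directly):

```python
import numpy as np

def dot_likelihood_fixed(A, obs):
    # broadcast obs along the leading (observation) axis of A
    s = np.ones(np.ndim(A), dtype=int)
    s[0] = obs.shape[0]
    X = A * obs.reshape(tuple(s))
    X = np.sum(X, axis=0, keepdims=True)

    # proposed fix: squeeze only the leading axis kept by keepdims=True
    LL = np.squeeze(X, axis=0)

    # check to see if `LL` is a scalar
    if np.prod(LL.shape) <= 1.0:
        LL = np.array([LL.item()]).astype("float64")
    return LL

A = np.expand_dims(np.eye(25), axis=-1)    # shape (25, 25, 1)
obs = np.eye(1, 25).flatten()              # one-hot observation, shape (25,)
print(dot_likelihood_fixed(A, obs).shape)  # (25, 1) instead of (25,)
```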
Additional context
This issue was encountered while running get_joint_likelihood() inside fpi.py.
It caused unexpected shape mismatches in calc_free_energy(), breaking Active Inference computations.
Fixing this bug is important for maintaining shape consistency in pymdp's probability calculations, and the proposed change remains backward-compatible while preventing unexpected errors.
Could you provide more context around the overarching call to FPI where this error happened?
The code recreation you provided makes me think you're trying to compute the log likelihood of a discrete observation under a discrete generative model with a single latent factor:
import numpy as np
from pymdp.maths import dot_likelihood
A = np.expand_dims(np.eye(25), axis=-1)
obs = np.eye(1, 25).flatten()
LL = dot_likelihood(A, obs)
However, if that's in general what you're trying to do, the correct way would be to remove the np.expand_dims(..., axis=-1) from the A matrix and simply do
import numpy as np
from pymdp.maths import dot_likelihood
A = np.eye(25)
obs = np.eye(1, 25).flatten()
LL = dot_likelihood(A, obs)
Is there a reason you have that extra lagging singleton dimension on the A matrix in your example? I assume it was to recreate a situation with similarly-shaped arrays in the context of a larger call to fpi, but having more context on that situation would help me understand the problem here.
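The maintainer's point can be checked with plain NumPy as well. The sketch below mirrors dot_likelihood's arithmetic (it is not the pymdp source) to show that a 2D A matrix already yields the intended 1D likelihood vector, while the extra lagging singleton dimension is what triggers the over-squeeze:

```python
import numpy as np

def dot_likelihood_sketch(A, obs):
    # same arithmetic as pymdp's dot_likelihood, reproduced for shape-checking
    s = np.ones(np.ndim(A), dtype=int)
    s[0] = obs.shape[0]
    X = np.sum(A * obs.reshape(tuple(s)), axis=0, keepdims=True)
    return np.squeeze(X)

obs = np.eye(1, 25).flatten()  # one-hot observation

# with the extra lagging singleton dimension, squeeze over-collapses
A_3d = np.expand_dims(np.eye(25), axis=-1)
print(dot_likelihood_sketch(A_3d, obs).shape)  # (25,) -- expected (25, 1)

# without it, a 2D A matrix gives the intended 1D likelihood directly
A_2d = np.eye(25)
print(dot_likelihood_sketch(A_2d, obs).shape)  # (25,)
```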