for (o, g) in zip(vectors_out, conj.(gradient[2:end]))
    o .= g
end
return cost[]
@@ -115,7 +115,7 @@ Run the belief propagation algorithm, and return the final state and the informa

 ### Keyword Arguments
 - `max_iter::Int=100`: the maximum number of iterations
-- `tol::Float64=1e-6`: the tolerance for convergence
+- `tol::Float64=1e-6`: the tolerance for convergence; convergence is checked via the infidelity of messages in consecutive iterations. For complex numbers, converged messages may differ only by a global phase factor.
 - `damping::Float64=0.2`: the damping factor for the message update: updated-message = damping * old-message + (1 - damping) * new-message
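The damping rule and the phase-invariant infidelity check described above can be sketched as follows. This is a minimal illustration in Python/NumPy, not the package's implementation; the names `damped_update` and `message_infidelity` are hypothetical helpers introduced here.

```python
import numpy as np

def damped_update(old, new, damping=0.2):
    # updated-message = damping * old-message + (1 - damping) * new-message
    return damping * old + (1 - damping) * new

def message_infidelity(m1, m2):
    # 1 - |<m1, m2>|^2 / (<m1, m1><m2, m2>): invariant under a global phase,
    # so two complex messages differing only by a phase count as converged.
    overlap = np.vdot(m1, m2)
    return 1 - abs(overlap) ** 2 / (np.vdot(m1, m1).real * np.vdot(m2, m2).real)

old = np.array([1.0 + 0.0j, 0.0 + 1.0j])
new = np.exp(1j * 0.3) * old  # the same message up to a phase factor
mixed = damped_update(old, new)
assert message_infidelity(old, new) < 1e-12  # converged despite the phase
```

Using the infidelity rather than a plain norm difference is what allows the iteration to stop even when complex messages only agree up to a phase.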