I am using HawkesExpKern to infer parameters on a simulated process with known parameters. It works reasonably well(*) with least-squares as the goodness-of-fit measure, but it struggles with likelihood: it errors out under most solvers, and with svrg it fails to converge.
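Since the full reproduction script is not included here, below is a minimal sketch of the kind of setup described, using tick's SimuHawkesExpKernels and HawkesExpKern; the parameter values, step size, seed, and variable names (sample_simulation, sample_hawkes_learner_loglik) are illustrative, not the ones actually used.

```python
import numpy as np
from tick.hawkes import SimuHawkesExpKernels, HawkesExpKern

# Simulate a 1-node Hawkes process with known (illustrative) parameters
decay = 2.0
baseline = np.array([0.4])
adjacency = np.array([[0.5]])
sample_simulation = SimuHawkesExpKernels(adjacency=adjacency, decays=decay,
                                         baseline=baseline, end_time=10000,
                                         seed=1398, verbose=False)
sample_simulation.simulate()

# Least-squares goodness-of-fit: works reasonably well
sample_hawkes_learner_ls = HawkesExpKern(decay, gofit='least-squares',
                                         solver='agd')
sample_hawkes_learner_ls.fit(sample_simulation.timestamps)

# Likelihood goodness-of-fit: fails to converge with svrg,
# errors out with most other solvers
sample_hawkes_learner_loglik = HawkesExpKern(decay, gofit='likelihood',
                                             solver='svrg', step=1e-3)
sample_hawkes_learner_loglik.fit(sample_simulation.timestamps)
```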
If I repeat the same process using AGD as the solver, it errors out as follows:
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[12], line 1
----> 1 sample_hawkes_learner_loglik.fit(sample_simulation.timestamps)

File /.venv/lib/python3.8/site-packages/tick/hawkes/inference/base/learner_hawkes_param.py:210, in LearnerHawkesParametric.fit(self, events, start)
    207 coeffs_start = np.ones(model_obj.n_coeffs)
    209 # Launch the solver
--> 210 coeffs = solver_obj.solve(coeffs_start)
    212 # Get the learned coefficients
    213 self._set("coeffs", coeffs)

File /.venv/lib/python3.8/site-packages/tick/solver/base/first_order.py:283, in SolverFirstOrder.solve(self, x0, step)
    280 if self.prox is None:
    281     raise ValueError('You must first set the prox using '
    282                      '``set_prox``.')
--> 283 solution = Solver.solve(self, x0, step)
    284 return solution

File /.venv/lib/python3.8/site-packages/tick/solver/base/solver.py:109, in Solver.solve(self, *args, **kwargs)
    107 def solve(self, *args, **kwargs):
    108     self._start_solve()
--> 109     self._solve(*args, **kwargs)
    110     self._end_solve()
    111     return self.solution

...

    120 r"""loss(Model self, ArrayDouble const & coeffs) -> double"""
--> 121 return _hawkes_model.Model_loss(self, coeffs)

RuntimeError: The sum of the influence on someone cannot be negative. Maybe did you forget to add a positive constraint to your proximal operator
What makes it even stranger is that I can find the maximum through brute force. This is the plot of the likelihood function (computed with the score method of the class): the maximum sits a bit further away from the simulation parameters, but it does exist.
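The brute force is essentially a grid scan with score. A sketch of it, continuing the reproduction snippet above and assuming score accepts explicit events plus baseline/adjacency overrides (as in recent tick versions; grid bounds are arbitrary):

```python
import numpy as np

# Evaluate the log-likelihood on a (baseline, adjacency) grid for the
# 1-node example above and locate its maximum by brute force
mus = np.linspace(0.05, 1.0, 50)
alphas = np.linspace(0.05, 0.95, 50)
grid = np.array([[sample_hawkes_learner_loglik.score(
                      sample_simulation.timestamps,
                      baseline=np.array([mu]),
                      adjacency=np.array([[alpha]]))
                  for alpha in alphas] for mu in mus])
i, j = np.unravel_index(grid.argmax(), grid.shape)
print("brute-force maximum at baseline=%.2f, adjacency=%.2f"
      % (mus[i], alphas[j]))
```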
I must add that it seems to be an issue with the solver (or with what is being passed to it), since the score method appears to be working correctly for the likelihood-based learner.
Actually, optimizing the log-likelihood of Hawkes processes with gradient descent is very hard due to the shape of the objective (very flat near the optimum, very steep near the boundaries). Hence the classical optimization algorithms (AGD, SVRG, etc.) that rely on the gradient Lipschitz assumption have a high chance of failing (see https://arxiv.org/abs/1807.03545).
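For reference, the textbook (not tick-specific) Hawkes log-likelihood is

$$\log L \;=\; \sum_{i} \log \lambda(t_i) \;-\; \int_0^T \lambda(t)\,dt,$$

where $\lambda(t)$ is the conditional intensity. The $\log \lambda(t_i)$ terms have unbounded gradients whenever the intensity at an event approaches zero, i.e. near the boundary of the admissible parameter set, which is exactly where the gradient Lipschitz assumption breaks down.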
You can try several hacks to make it work:
Fit with least squares and use the obtained point as a starting point
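A minimal sketch of this warm start, continuing the reproduction snippet above. The fit method's start argument and the coeffs attribute both appear in the traceback; the assumption here is that the least-squares and likelihood learners share the same coefficient layout for the same decays.

```python
# Warm-start the likelihood-based learner from the least-squares solution
sample_hawkes_learner_loglik = HawkesExpKern(decay, gofit='likelihood',
                                             solver='agd')
sample_hawkes_learner_loglik.fit(sample_simulation.timestamps,
                                 start=sample_hawkes_learner_ls.coeffs)
print(sample_hawkes_learner_loglik.baseline,
      sample_hawkes_learner_loglik.adjacency)
```

Starting already inside the well-behaved region should keep the intensity at the events strictly positive, which is precisely what the RuntimeError above complains about.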