add start of hawkes MLE #161
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.

```
@@          Coverage Diff          @@
##            main    #161   +/-  ##
=====================================
+ Coverage   74.8%   74.8%  +0.1%
=====================================
  Files         12      12
  Lines       1938    1942     +4
=====================================
+ Hits        1449    1453     +4
  Misses       489     489
```
One open question is the estimation strategy. While we could use pure Python and MLE to recover the parameters, that will likely be quite slow and inefficient. Three strategies could get us out of this:
Alternatively, I think this would be a case where moving to a Bayesian estimator might kill two birds with one stone. I also don't know yet how to do inference on the parameters in MLE using a permutation approach. Any permutation-based inference will be seriously expensive, since we have to deal with randomizing over space, time, and mark. But, if we implement this log likelihood in stan/pymc, we could get estimation (only) using their MAP estimators, or also support inference using sampling.
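For reference, before any stan/pymc port, the purely temporal log likelihood is cheap to sketch in plain numpy. This is only a sketch under explicit assumptions: an exponential triggering kernel, and `hawkes_loglik` / its `(mu, alpha, beta)` parametrization are hypothetical names, not part of this PR.

```python
import numpy as np

def hawkes_loglik(params, times, T):
    """Log-likelihood of a 1-d Hawkes process with exponential kernel,
    lambda(t) = mu + alpha * beta * sum_{t_i < t} exp(-beta * (t - t_i)).

    `times` must be sorted event times in (0, T]. The excitation sum is
    computed with the standard O(n) recursion
    A_i = exp(-beta * (t_i - t_{i-1})) * (1 + A_{i-1}).
    """
    mu, alpha, beta = params
    if mu <= 0 or alpha < 0 or beta <= 0:
        return -np.inf  # outside the parameter space
    times = np.asarray(times, dtype=float)
    n = times.size
    A = np.zeros(n)
    for i in range(1, n):
        A[i] = np.exp(-beta * (times[i] - times[i - 1])) * (1.0 + A[i - 1])
    log_intensities = np.log(mu + alpha * beta * A).sum()
    # compensator: integral of lambda(t) over [0, T]
    compensator = mu * T + alpha * (1.0 - np.exp(-beta * (T - times))).sum()
    return log_intensities - compensator
```

Maximizing this (e.g. by handing the negative log likelihood to `scipy.optimize.minimize`) recovers `(mu, alpha, beta)`; a spatiotemporal version would add a spatial kernel over `s - s_i` to both the intensity and the compensator.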
This adds the start of a spatiotemporal Hawkes process estimator. I'm trying to finish the single-dimension case first.
A Hawkes process is a kind of spatially-and-temporally dependent point process. Points that arrive at time t at location s make other arrivals near s and shortly after t more likely. Thus, Hawkes processes are sometimes called self-exciting.
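Concretely, the self-excitation is usually expressed through the conditional intensity (standard form; the triggering kernel $g$ is whatever function we pick, e.g. exponential in time and Gaussian in space):

$$\lambda(s, t) = \mu(s) + \sum_{i:\, t_i < t} g(s - s_i,\; t - t_i)$$

where $\mu(s)$ is the background rate and each past event $(s_i, t_i)$ adds a bump of intensity that decays with distance and elapsed time.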
A few implementations already exist elsewhere, and I'm hoping to provide both an MLE estimator and a simulator.
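The simulator side is also small in the temporal case. Below is a sketch of Ogata's thinning algorithm under the same exponential-kernel assumption; `simulate_hawkes` is a hypothetical name, not this PR's API.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=None):
    """Simulate a 1-d Hawkes process on (0, T] via Ogata's thinning.

    lambda(t) = mu + alpha * beta * sum_{t_i < t} exp(-beta * (t - t_i));
    alpha < 1 keeps the process subcritical.
    """
    rng = np.random.default_rng(seed)
    times = []
    t = 0.0
    while True:
        past = np.asarray(times)
        # The intensity at the current time dominates the (decaying)
        # intensity until the next accepted event, so it is a valid bound.
        lam_bar = mu + alpha * beta * np.exp(-beta * (t - past)).sum()
        t += rng.exponential(1.0 / lam_bar)
        if t > T:
            break
        lam_t = mu + alpha * beta * np.exp(-beta * (t - past)).sum()
        if rng.uniform() < lam_t / lam_bar:  # accept with prob lam_t / lam_bar
            times.append(t)
    return np.asarray(times)
```

A simulator like this would also let us sanity-check the MLE by fitting simulated data with known parameters.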