A novel family of non-parametric cumulative based divergences for point processes

Sohan Seth, Park Il, Austin Brockmeier, Mulugeta Semework, John Choi, Joseph Francis, Jose Principe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Hypothesis testing on point processes has several applications such as model fitting, plasticity detection, and non-stationarity detection. Standard tools for hypothesis testing include tests on mean firing rate and time-varying rate function. However, these statistics do not fully describe a point process, and thus the tests can be misleading. In this paper, we introduce a family of non-parametric divergence measures for hypothesis testing. We extend the traditional Kolmogorov–Smirnov and Cramér–von Mises tests to point processes via stratification. The proposed divergence measures compare the underlying probability structure and are, thus, zero if and only if the point processes are the same. This leads to a more robust hypothesis test. We prove consistency and show that these measures can be efficiently estimated from data. We demonstrate an application of the proposed divergence as a cost function to find optimally matched spike trains.
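To illustrate the idea described in the abstract, here is a minimal, hypothetical sketch of a stratified Kolmogorov–Smirnov-type divergence between two sets of spike trains. It is not the paper's exact construction: it stratifies realizations by spike count, compares pooled spike-time empirical CDFs within each shared stratum, and weights each stratum by its empirical probability; strata present in only one sample contribute their count-probability mismatch. The function names and the pooling simplification are assumptions for illustration.

```python
import numpy as np

def ecdf_ks(x, y):
    """Two-sample Kolmogorov-Smirnov statistic between 1-D samples x and y."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])
    fx = np.searchsorted(x, grid, side="right") / len(x)
    fy = np.searchsorted(y, grid, side="right") / len(y)
    return np.max(np.abs(fx - fy))

def stratified_ks_divergence(trains_p, trains_q):
    """Illustrative stratified divergence between two samples of spike
    trains (each train is a 1-D array of spike times). Stratify by spike
    count; within each stratum shared by both samples, compare pooled
    spike-time CDFs with a KS statistic, weighted by the stratum's average
    empirical probability. Counts seen in only one sample (or empty
    trains) contribute only their probability mismatch."""
    counts_p = np.array([len(t) for t in trains_p])
    counts_q = np.array([len(t) for t in trains_q])
    div = 0.0
    for n in set(counts_p) | set(counts_q):
        wp = np.mean(counts_p == n)  # empirical P(count == n) under sample P
        wq = np.mean(counts_q == n)
        if n == 0 or wp == 0.0 or wq == 0.0:
            div += abs(wp - wq)
            continue
        xs = np.concatenate([t for t, c in zip(trains_p, counts_p) if c == n])
        ys = np.concatenate([t for t, c in zip(trains_q, counts_q) if c == n])
        div += 0.5 * (wp + wq) * ecdf_ks(xs, ys)
    return div
```

By construction the sketch is zero for identical samples and positive when the spike-time distributions differ, e.g. when one sample is a time-shifted copy of the other.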
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 23 (NIPS 2010)
Number of pages: 9
Publication status: Published - 2010
