The field of statistical relational learning aims to unify logic and probability in order to reason and learn from data. Perhaps the most successful paradigm in the field is probabilistic logic programming (PLP): the extension of logic programming with stochastic primitives. While many systems offer inference capabilities, the more significant challenge is that of learning meaningful and interpretable symbolic representations from data. In that regard, inductive logic programming and related techniques have paved much of the way over the last few decades, but a major limitation of this exciting landscape is that only discrete features and distributions are handled. Many disciplines, however, express phenomena in terms of continuous models. In this paper, we propose a new computational framework for inducing probabilistic logic programs over continuous and mixed discrete-continuous data. Most significantly, we show how to learn these programs while making no assumptions about the true underlying density. Our experiments show the promise of the proposed framework.
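To make the setting concrete, the following is a minimal sketch (in plain Python, not the paper's system) of the kind of mixed discrete-continuous model a probabilistic logic program can describe: a discrete probabilistic fact, a continuous attribute whose distribution depends on it, and a logical rule over both. All names, probabilities, and distribution parameters here are illustrative assumptions, not taken from the paper.

```python
import random

def sample_world(rng):
    # Discrete probabilistic fact: machine m1 works with probability 0.8,
    # akin to a PLP clause "0.8 :: works(m1)."  (illustrative numbers)
    works = rng.random() < 0.8
    # Continuous attribute: temperature drawn from a Gaussian whose mean
    # depends on the discrete state, akin to "temp(m1) ~ gaussian(Mu, 5)."
    mu = 20.0 if works else 80.0
    temp = rng.gauss(mu, 5.0)
    # Logical rule over the mixed state: "alarm(m1) :- temp(m1) > 50."
    alarm = temp > 50.0
    return works, temp, alarm

rng = random.Random(0)
samples = [sample_world(rng) for _ in range(10000)]
# Monte Carlo estimate of P(alarm); with these parameters temp > 50 is
# essentially only reached when the machine does not work, so the estimate
# sits close to P(not works) = 0.2.
p_alarm = sum(a for _, _, a in samples) / len(samples)
print(p_alarm)
```

Inference here is by forward sampling; the paper's contribution is the converse direction, inducing such programs (clauses, probabilities, and densities) from data without assuming the density's parametric form.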
Title: Learning Probabilistic Logic Programs over Continuous Data
Title of host publication: Inductive Logic Programming
Subtitle of host publication: 29th International Conference, ILP 2019, Plovdiv, Bulgaria, September 3–5, 2019, Proceedings
Editors: Dimitar Kazakov, Can Erten
Number of pages: 16
Publication status: Published - 5 Jun 2020
Series: Lecture Notes in Computer Science
Conference: 29th International Conference on Inductive Logic Programming (ILP 2019)
Location: Plovdiv, Bulgaria
Period: 3 Sep 2019 → 5 Sep 2019