Abstract
The field of statistical relational learning aims at unifying logic and probability to reason and learn from data. Perhaps the most successful paradigm in the field is probabilistic logic programming: the enabling of stochastic primitives in logic programming, which is now increasingly seen to provide a declarative background to complex machine learning applications. While many systems offer inference capabilities, the
more significant challenge is that of learning meaningful and interpretable symbolic representations from data. In that regard, inductive logic programming and related techniques have paved much of the way for the last few decades. Unfortunately, a major limitation of this exciting landscape is that much of the work is limited to finite-domain discrete probability distributions. Recently, a handful of systems have been extended to represent and perform inference with continuous distributions. The problem, of course, is that classical solutions for inference are either restricted to well-known parametric families (e.g., Gaussians) or resort to sampling strategies that provide correct answers only in the limit. When it comes to learning, moreover, inducing representations remains entirely open, other than “data-fitting” solutions that force-fit points to the aforementioned parametric families. In this paper, we take the first steps towards inducing probabilistic logic programs for continuous and mixed discrete-continuous data, without being pigeonholed to a fixed set of distribution families. Our key insight is to leverage techniques from piecewise polynomial function approximation theory, yielding a principled way to learn and compositionally construct density functions. We test the framework and discuss the learned representations.
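The abstract's central idea, approximating density functions with piecewise polynomials rather than fixed parametric families, can be illustrated with a minimal one-dimensional sketch. Note this is our own illustration under simplifying assumptions (the function names and the histogram-based least-squares fitting are not taken from the paper, which induces such densities within probabilistic logic programs):

```python
import numpy as np

def fit_piecewise_density(samples, n_pieces=4, degree=2):
    """Fit a piecewise polynomial approximation to a 1-D density.

    Each piece is a least-squares polynomial fit to a histogram
    estimate of the density on one sub-interval; the pieces are then
    renormalised so that they jointly integrate to 1.
    """
    lo, hi = samples.min(), samples.max()
    edges = np.linspace(lo, hi, n_pieces + 1)
    hist, bin_edges = np.histogram(samples, bins=50, range=(lo, hi), density=True)
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])

    pieces = []
    for a, b in zip(edges[:-1], edges[1:]):
        mask = (centers >= a) & (centers <= b)
        coeffs = np.polyfit(centers[mask], hist[mask], degree)
        pieces.append((a, b, coeffs))

    # Total mass of all pieces (integrate each polynomial over its interval).
    mass = sum(np.polyval(np.polyint(c), b) - np.polyval(np.polyint(c), a)
               for a, b, c in pieces)
    return [(a, b, c / mass) for a, b, c in pieces]

def density(pieces, x):
    """Evaluate the fitted piecewise density at x (0 outside the support)."""
    for a, b, c in pieces:
        if a <= x <= b:
            return max(float(np.polyval(c, x)), 0.0)
    return 0.0
```

Because each piece is just a coefficient vector over an interval, such representations compose naturally, which is what makes them attractive as learnable density primitives.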
Original language  English
Number of pages  11
Publication status  Published - 2018
Event  Workshop on Hybrid Reasoning and Learning (HRL 2018), at the 16th International Conference on Principles of Knowledge Representation and Reasoning (KR 2018), Tempe, United States
Duration  28 Oct 2018 → 28 Oct 2018
Internet address  https://www.hybridreasoning.org/kr2018_ws/
Workshop
Workshop  Workshop on Hybrid Reasoning and Learning (HRL 2018) 

Abbreviated title  HRL 2018 
Country/Territory  United States 
City  Tempe 
Period  28/10/18 → 28/10/18 
Projects
Towards Explainable and Robust Statistical AI: A Symbolic Approach
15/06/18 → 14/09/19
Project: Research