TY - JOUR

T1 - Learning Sparse Additive Models with Interactions in High Dimensions

T2 - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics

AU - Tyagi, Hemant

AU - Kyrillidis, Anastasios

AU - Papailiopoulos, Dimitris

AU - Gärtner, Bernd

AU - Krause, Andreas

PY - 2016/5/2

Y1 - 2016/5/2

N2 - A function f: R^d → R is referred to as a Sparse Additive Model (SPAM) if it is of the form f(x) = ∑_{l∈S} ϕ_l(x_l), where S ⊂ [d] and |S| ≪ d. Assuming the ϕ_l's and S to be unknown, the problem of estimating f from its samples has been studied extensively. In this work, we consider a generalized SPAM that allows for second-order interaction terms: for some S_1 ⊂ [d] and S_2 ⊂ ([d] choose 2), the function f is assumed to be of the form f(x) = ∑_{p∈S_1} ϕ_p(x_p) + ∑_{(l,l′)∈S_2} ϕ_{l,l′}(x_l, x_{l′}). Assuming ϕ_p, ϕ_{l,l′}, S_1, and S_2 to be unknown, we provide a randomized algorithm that queries f and exactly recovers S_1 and S_2. Consequently, this also enables us to estimate the underlying ϕ_p and ϕ_{l,l′}. We derive sample complexity bounds for our scheme and extend our analysis to the situation where the queries are corrupted with noise, either stochastic or arbitrary but bounded. Lastly, we provide simulation results on synthetic data that validate our theoretical findings.

AB - A function f: R^d → R is referred to as a Sparse Additive Model (SPAM) if it is of the form f(x) = ∑_{l∈S} ϕ_l(x_l), where S ⊂ [d] and |S| ≪ d. Assuming the ϕ_l's and S to be unknown, the problem of estimating f from its samples has been studied extensively. In this work, we consider a generalized SPAM that allows for second-order interaction terms: for some S_1 ⊂ [d] and S_2 ⊂ ([d] choose 2), the function f is assumed to be of the form f(x) = ∑_{p∈S_1} ϕ_p(x_p) + ∑_{(l,l′)∈S_2} ϕ_{l,l′}(x_l, x_{l′}). Assuming ϕ_p, ϕ_{l,l′}, S_1, and S_2 to be unknown, we provide a randomized algorithm that queries f and exactly recovers S_1 and S_2. Consequently, this also enables us to estimate the underlying ϕ_p and ϕ_{l,l′}. We derive sample complexity bounds for our scheme and extend our analysis to the situation where the queries are corrupted with noise, either stochastic or arbitrary but bounded. Lastly, we provide simulation results on synthetic data that validate our theoretical findings.

M3 - Article

VL - 51

SP - 111

EP - 120

JO - Journal of Machine Learning Research: Workshop and Conference Proceedings

JF - Journal of Machine Learning Research: Workshop and Conference Proceedings

SN - 1938-7228

ER -