Abstract
Species range maps (SRMs) are essential tools for research and policy-making in ecology, conservation, and environmental management. However, traditional SRMs rely on the availability of environmental covariates and high-quality species location observation data, both of which can be challenging to obtain due to geographic inaccessibility and resource constraints. We propose a novel approach combining millions of citizen science species observations with textual descriptions from Wikipedia, covering habitat preferences and range descriptions for tens of thousands of species. Our framework maps locations, species, and text descriptions into a common space, facilitating the learning of rich spatial covariates at a global scale and enabling zero-shot range estimation from textual descriptions. Evaluated on held-out species, our zero-shot SRMs significantly outperform baselines and match the performance of SRMs obtained using tens of observations. Our approach also acts as a strong prior when combined with observational data, resulting in more accurate range estimation with less data. We present extensive quantitative and qualitative analyses of the learned representations in the context of range estimation and other spatial tasks, demonstrating the effectiveness of our approach.
| Original language | English |
|---|---|
| Pages | 1-12 |
| Number of pages | 12 |
| Publication status | Published - 7 Dec 2024 |
| Event | The Thirty-Eighth Annual Conference on Neural Information Processing Systems (conference number 38), Vancouver Convention Center, Vancouver, Canada, 10 Dec 2024 → 15 Dec 2024. https://neurips.cc/Conferences/2024 |
Conference
| Conference | The Thirty-Eighth Annual Conference on Neural Information Processing Systems |
|---|---|
| Abbreviated title | NeurIPS 2024 |
| Country/Territory | Canada |
| City | Vancouver |
| Period | 10/12/24 → 15/12/24 |
| Internet address | https://neurips.cc/Conferences/2024 |
Keywords
- databases
- machine learning
- species range estimation
- zero-shot learning
- few-shot learning
- implicit networks