Edinburgh Research Explorer

Joint learning of object and action detectors

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Open Access permissions: Open

Original language: English
Title of host publication: International Conference on Computer Vision (ICCV 2017)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 2001-2010
Number of pages: 10
ISBN (Electronic): 978-1-5386-1032-9; Print on Demand (PoD) ISBN: 978-1-5386-10
ISBN (Print): 978-1-5386-1033-6
DOIs
Publication status: Published - 25 Dec 2017
Event: 2017 IEEE International Conference on Computer Vision - Venice, Italy
Duration: 22 Oct 2017 - 29 Oct 2017
http://iccv2017.thecvf.com/

Conference

Conference: 2017 IEEE International Conference on Computer Vision
Abbreviated title: ICCV 2017
Country: Italy
City: Venice
Period: 22/10/17 - 29/10/17
Internet address: http://iccv2017.thecvf.com/

Abstract

While most existing approaches to detection in videos focus on objects or human actions separately, we aim to jointly detect objects performing actions, such as a cat eating or a dog jumping. We introduce an end-to-end multitask objective that jointly learns object-action relationships. We compare it with different training objectives, validate its effectiveness for detecting object-action pairs in videos, and show that both object detection and action detection benefit from this joint learning. Moreover, the proposed architecture can be used for zero-shot learning of actions: our multitask objective leverages the commonalities of an action performed by different objects, e.g. a dog and a cat jumping, enabling detection of an object's actions without training on those object-action pairs. In experiments on the A2D dataset [50], we obtain state-of-the-art results on segmentation of object-action pairs. We finally apply our multitask architecture to detect visual relationships between objects in images of the VRD dataset [24].
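The core multitask idea in the abstract can be illustrated with a minimal sketch. This is not the paper's architecture: the function names, the linear heads, and the factorized pair score below are illustrative assumptions, showing only how a shared feature can feed separate object and action losses, and how factorized scores allow ranking object-action pairs never seen together in training.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multitask_loss(feat, W_obj, W_act, obj_label, act_label):
    """Hypothetical multitask objective: sum of object and action
    cross-entropy losses computed from one shared feature vector."""
    p_obj = softmax(feat @ W_obj)   # object-class probabilities
    p_act = softmax(feat @ W_act)   # action-class probabilities
    return -np.log(p_obj[obj_label]) - np.log(p_act[act_label])

def pair_score(feat, W_obj, W_act, obj, act):
    """Score an (object, action) pair as the product of the two
    marginals; unseen pairs get a score without joint training,
    which is the intuition behind the zero-shot setting."""
    return softmax(feat @ W_obj)[obj] * softmax(feat @ W_act)[act]
```

Because the pair score factorizes, a "cat jumping" pair can be ranked even if training only ever showed cats eating and dogs jumping, as long as both the object and the action were seen individually.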



ID: 41520494