Supporting shared analysis for mobile investigators

Chris Baber, James Cross, Fan Yang, Paul Smith

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


This paper reports the development of applications that support automatic annotation of images as a result of activity by investigators in the field. The work reported in this paper initially involved manual annotation of images collected by people in the field (using a wearable computer and web-cam as an automatic capture device). However, it became apparent that this could be combined with automated capture of information related to the activity of the person, i.e., as the investigator moves around a scene, the activity and location are logged and used to mark an image; the marking is layered in terms of different types of activity. The potential use of such a system, and its role in collaboration, is discussed in this paper. It is suggested that, in this context, synchronous collaboration is often not very useful. Rather, the investigators need to be able to relate recovered objects to their location and state on recovery.
Original language: English
Title of host publication: Proceedings of the international workshop on annotation for collaboration
Subtitle of host publication: methods, tools and practices, La Sorbonne, Paris, France, 2005, November 23-24
Editors: Jean-Francois Boujut
Place of Publication: Paris
Publisher: Centre National de la Recherche Scientifique
Publication status: Published - 2005
Event: International Workshop on Annotation for Collaboration - Methods, Tools and Practices - La Sorbonne, Paris, France
Duration: 23 Nov 2005 - 24 Nov 2005


Workshop: International Workshop on Annotation for Collaboration - Methods, Tools and Practices

