
CodeEye: Haptic interface for perceiving ship engine data through multisensory perception

Research output: Non-textual form › Digital or Visual Products

CodeEye is an on-board interface for monitoring ship engine performance, realised initially as a GUI but intended as a platform for experimenting with the haptic modality. The interface is designed for lay people rather than engineers, so it must be perceivable intuitively. Ship engine rooms are also noisy and dark, conditions that favour communicating information through the skin rather than the eyes.

Different sensory modalities have different communicative potential. Vision excels at picking up discrete, precise differences, such as alphanumerical data and shades of colour. Sound becomes more useful when an alert must be given from far away, or in conditions where seeing is difficult, such as fog or darkness. Touch is likewise a useful substitute when we cannot rely on our eyes, and it has a further advantage: it appears to be the most 'trustworthy' sense. The flip side is that touch also carries an inherent 'alert' function, so if overused it could become stressful.

Notably, ship engineers traditionally came to understand their engines, after years of experience, by listening to their sounds and laying hands on their outer surfaces to sense what was going on inside. Ship engines are huge and opaque, making it very difficult to diagnose the health of their internals precisely and to pinpoint parts that are about to cause things to go wrong.
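One way to picture the data-to-touch mapping described above is as a function from a scalar engine reading to a vibration intensity. The sketch below is purely illustrative, not part of the original work: the function name, the thresholds, and the linear ramp are all assumptions. It keeps the tactile channel silent while a reading is nominal, reflecting the point above that an overused 'alert' sense becomes stressful.

```python
def reading_to_haptic_intensity(value: float, nominal: float, critical: float) -> float:
    """Map a scalar engine sensor reading (e.g. a bearing temperature)
    to a vibrotactile intensity in [0.0, 1.0].

    Hypothetical scheme: no vibration at or below the nominal level,
    full vibration at or above the critical level, and a linear ramp
    in between, so touch is reserved for genuinely abnormal readings.
    """
    if critical <= nominal:
        raise ValueError("critical threshold must exceed nominal")
    if value <= nominal:
        return 0.0          # healthy reading: keep the skin channel quiet
    if value >= critical:
        return 1.0          # emergency: strongest vibration
    return (value - nominal) / (critical - nominal)  # graded warning
```

For example, with a nominal bearing temperature of 80 °C and a critical one of 120 °C, a reading of 100 °C would drive the actuator at half intensity.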
Original language: English
Publication status: Published - Mar 2015
Event: Between Craft and Code: Making Sense of Data Materialisation - University of Portsmouth, Portsmouth, United Kingdom
Duration: 16 Mar 2015 – 30 Apr 2015
http://creativespace.cci.port.ac.uk/2015/02/11/between-craft-and-code-making-sense-of-data-materialisation/

Documents

  • IMG_0629

    Final published version, 2.27 MB, image/jpeg

  • IMG_0628

    Final published version, 2.12 MB, image/jpeg


ID: 4959820