Developing an onboard interface for monitoring ship engine performance, initially through a GUI, with plans to experiment with a haptic modality. The interface is designed for lay people, not engineers, and needs to be intuitively perceivable. Ship engine rooms are also noisy and dark places, both conditions that favour communicating information through the skin rather than the eyes.

Different sensory modalities have different communicative strengths. Vision is very good at picking up discrete and precise differences, such as alphanumeric data and shades of colour. Sound becomes more useful when an alert has to be given from far away, or in conditions where it is difficult to see, such as fog or darkness. Touch is likewise useful as a substitute sense in situations where we cannot rely on our eyes, and it has a further advantage: it seems to be the most 'trustworthy' of the senses. The flip side is that touch carries a certain 'alert' function, so if overused it could become stressful.

Notably, this is how experienced ship engineers traditionally came to understand their engines: by listening to the sounds they make, and by laying hands on their outer surfaces to get a sense of what is going on inside. Ship engines are huge and opaque, making it very difficult to diagnose the health of their internals and to pinpoint the parts that are about to cause things to go wrong.