What extra sensory perceptions would you like? Seeing behind your back? Smelling odorless gases like carbon monoxide? How about seeing in the dark? Sensors already exist that can do these things. All that is needed is a way to input what they sense into our brains. The most common way to input information from external sensors is visually. We can use our eyes to see distant airplanes or weather clouds on a radar scope. We can read how much carbon monoxide is in the air we breathe by looking at a meter.
Suppose we need to sense things without using our eyes. Most often when we cannot see, we use our fingers to get information. Blind people use a cane to feel their way around. Sometimes they tap their cane and listen for echoes to sense a barrier.
Another way to sense data about our environment is with our tongue. Suppose a ten-by-ten grid of electrodes were placed on the tongue and small voltages were used to create various patterns of sensation. Just as bumps on paper can form thousands of words for people trained to read Braille, the hundred electrodes on the tongue can allow trained people to sense data from sonar, radar, toxin detectors, or any other instrument.
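To make the idea concrete, here is a minimal sketch of how a single sensor reading might be mapped onto such a grid. The layout is purely illustrative: it assumes columns encode bearing and rows encode range for, say, a sonar contact. It is not the actual encoding used by any real tongue-display system.

```python
# Hypothetical sketch: encode a sonar contact (bearing, range) as an
# activation pattern on a 10x10 electrode grid. The column/row mapping
# below is an illustrative assumption, not a real device's encoding.

def encode_contact(bearing_deg, range_m, max_range_m=100.0, size=10):
    """Return a size x size grid of 0/1 electrode activations.

    Columns represent bearing (-90 to +90 degrees, left to right);
    rows represent range (nearer contacts toward the bottom row).
    """
    grid = [[0] * size for _ in range(size)]
    # Map bearing onto a column index, clamped to the grid.
    col = int((bearing_deg + 90) / 180 * (size - 1))
    col = max(0, min(size - 1, col))
    # Map range onto a row index (row 0 = farthest, last row = nearest).
    row = int((1 - min(range_m, max_range_m) / max_range_m) * (size - 1))
    row = max(0, min(size - 1, row))
    grid[row][col] = 1
    return grid

# A contact roughly dead ahead at close range activates an electrode
# near the bottom center of the grid.
pattern = encode_contact(bearing_deg=0.0, range_m=10.0)
```

With training, a user would learn to read such patterns the way a Braille reader learns dot clusters: a buzz low and centered means "obstacle close, straight ahead."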
At the Institute for Human and Machine Cognition (IHMC), Anil Raj is the principal investigator of research titled Adaptive Human/Machine Multi-sensory Prostheses. His team is working on TSAS, the Tactile Situation Awareness System. The research explores how electrodes on the tongue or in a body suit can allow users to receive input from external devices. Such input is desirable when your hands and eyes are already too busy, or when they cannot be used at all.