You experience an external event through your sensory input channels. Traditionally, there are considered to be five sensory input channels.
Visual – what you see – pictures, use of colour, decoration. You may have a preference for taking in information visually and for material that’s presented in a visual manner.
Auditory – what you hear – sounds, voices, music. You may have a preference for processing information that’s presented to you verbally.
Kinesthetic – what you feel. You may have a preference for touch and for experiencing things personally, and you may learn better by trying things out and doing them.
Olfactory – what you smell – aroma. This is linked strongly to mood and memories because the sense of smell goes directly into the limbic system of the brain, faster than any of the other senses. Women are thought to be 1000 times more sensitive to aromas than men.
Gustatory – what you taste – responses linked to food and drink.
There are other sensory inputs beyond these five: around 20 have been identified, and they are often overlooked. They include our responses to pressure, coordination and equilibrium, how we relate one thing to another, and how we perceive time.

Before you make an internal representation of an event, you filter the information, running it through your internal processing filters. These filters sift the information that comes in through your senses to create your own unique perspective. As much as 94% of what we “see” is thought to be seen in the mind’s eye.
Do you remember this dress from early 2015? It was a bit of an internet sensation. Some people saw the dress as white and gold, some saw it as blue and black, and others saw anything in between.
How do you see it?
This is all about primal biology and the way human eyes and brains have evolved to see colour in sunlight. In a sunlit world, light enters the eye through the lens. The light hits the retina at the back of the eye, where pigments fire up neural connections to the visual cortex. This is the part of the brain that processes those signals into an image. Your brain figures out what colour light is bouncing off whatever your eyes are looking at and essentially subtracts that colour from the object. Different wavelengths correspond to different colours.
This image hits some kind of perceptual boundary. Human beings evolved to see in daylight, but daylight itself changes colour. The chromatic axis varies from the pinkish red of dawn, up through the blue-white of noon, and back down to reddish twilight. When you look at the dress, your visual system is trying to discount the chromatic bias of that daylight axis. People either discount the blue, in which case they see white and gold, or they discount the gold, in which case they see blue and black. It may be that those who see white and gold are assuming the dress is lit by strong bluish daylight, while those who see blue and black are compensating for a warmer dawn or dusk chromatic bias.
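The “discounting” described above can be sketched as a simple illuminant-correction step, in the spirit of von Kries adaptation: divide each colour channel by the assumed colour of the light, then rescale. The RGB values below are illustrative assumptions chosen to make the point, not measurements from the actual photo:

```python
def discount_illuminant(pixel, illuminant):
    """Divide each RGB channel by the assumed illuminant colour
    (a von Kries-style adaptation), then rescale so the brightest
    channel maps back to 255."""
    adapted = [p / i for p, i in zip(pixel, illuminant)]
    scale = 255 / max(adapted)
    return tuple(round(a * scale) for a in adapted)

# A bluish-grey pixel, standing in for the ambiguous dress fabric
# (an illustrative value, not taken from the real image).
pixel = (120, 130, 160)

# Assume bluish daylight: the blue cast is discounted, leaving near-white.
print(discount_illuminant(pixel, (180, 190, 230)))

# Assume warm dawn/dusk light: the warm cast is discounted, leaving blue.
print(discount_illuminant(pixel, (230, 200, 170)))
```

The same ambiguous pixel comes out near-white under one assumed light and distinctly blue under the other, which is the whole trick: the disagreement is about the assumed illuminant, not the pixel itself.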
Here we have something else that hits some kind of perceptual boundary.
What do you hear? Do you hear Laurel or do you hear Yanny?
How you hear the audio file depends on how you respond to different frequencies. People who are more sensitive to the higher frequencies tend to hear Yanny, whilst those who pick up the lower frequencies tend to hear Laurel. What you hear is right for you, and you may be surprised that somebody else hears it the other way yet can’t hear it your way.
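The frequency-dependence described above can be sketched with a toy signal: mix a low tone and a high tone, then pass the mix through a simple one-pole low-pass filter, which plays the role of an ear that attenuates high frequencies. The sample rate, tone frequencies, and filter constant here are arbitrary assumptions for illustration, not properties of the actual recording:

```python
import math

RATE = 8000  # samples per second (assumed for this toy example)

def tone(freq, n=RATE):
    """Generate n samples of a unit-amplitude sine wave at freq Hz."""
    return [math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

def low_pass(signal, alpha=0.05):
    """One-pole low-pass filter: each output sample is a weighted blend of
    the current input and the previous output. Small alpha = heavier
    smoothing, i.e. stronger attenuation of high frequencies."""
    out, prev = [], 0.0
    for x in signal:
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out

def rms(signal):
    """Root-mean-square level: a rough measure of perceived loudness."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

low = tone(200)    # low-frequency component (think "Laurel")
high = tone(3000)  # high-frequency component (think "Yanny")

# Through the low-pass "ear", the low tone survives far better than the
# high tone, so the listener effectively hears the low component.
print(f"low tone after filter:  {rms(low_pass(low)):.3f}")
print(f"high tone after filter: {rms(low_pass(high)):.3f}")
```

Two listeners receiving the identical waveform but weighting the frequency bands differently end up with different dominant components, which is one plausible account of why the clip splits people so cleanly.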
So your sensory input channels give you a representation of the world that is perfectly adequate for you, but it’s fundamentally flawed due to how you’re sensing the world and how you’re filtering the information.