The everyday experience of our surrounding environment consists of a plethora of stimuli. Each of our five senses (sight, hearing, touch, smell, and taste) is responsible for receiving qualitatively different information from the world around us. This information is then integrated into a composite picture of our environment.
Each sensory system functions optimally in a given situation, but together they enhance the probability of object recognition. Given the separation of sensory streams, and the fact that sensory input can be congruent as well as incongruent, our brain's ability to integrate disparate and complex sensory information into a single percept is remarkable.
Despite the individual differences between each of our senses, we are able to maintain an amazingly coherent and unified perception of our surroundings. Although most sensory stimuli we encounter are either crossmodal (integrating two senses) or multimodal (integrating more than two senses) in character, the vast majority of both behavioral and imaging studies to date have used unimodal stimuli (e.g., olfactory, visual, or auditory stimuli). The assumption underlying the use of unimodal stimuli is that our brain processes multimodal and unimodal stimuli in a similar fashion. However, recent evidence shows that this assumption is incorrect. Most of our everyday percepts are created by multiple sensory systems receiving different sensations. Using multimodal stimulation to fully understand the brain is thus both ecologically relevant and necessary.
Using the olfactory system as a template, we are currently trying to elucidate the mechanisms by which the brain integrates these stimuli, using both behavioral and imaging paradigms.