
Simultaneity judgment using olfactory–visual, visual–gustatory, and olfactory–gustatory combinations

  • Naomi Gotow,

    Affiliation Human Informatics Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki, Japan

  • Tatsu Kobayakawa

    kobayakawa-tatsu@aist.go.jp

    Affiliation Human Informatics Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki, Japan

Abstract

Vision is a physical sense, whereas olfaction and gustation are chemical senses. Active sensing might function in vision, olfaction, and gustation, whereas passive sensing might function in vision and olfaction but not gustation. To investigate whether each of these sensory properties affects synchrony perception, participants in this study performed simultaneity judgment (SJ) for three cross-modal combinations using visual (red LED light), olfactory (coumarin), and gustatory (NaCl solution) stimuli. We calculated the half-width at half-height (HWHH) and point of subjective simultaneity (PSS) on the basis of the temporal distributions of simultaneous response rates in each combination. Although HWHH did not differ significantly among the three cross-modal combinations, it was higher in cross-modal combinations involving one or two chemical stimuli than in the combinations of two physical stimuli reported in a previous study. The PSS of the olfactory–visual combination was approximately equal to the point of objective simultaneity (POS), whereas the PSSs of the visual–gustatory and olfactory–gustatory combinations receded significantly from the POS. To generalize these results as specific to the chemical senses with regard to synchrony perception, we need to determine whether the same phenomena are reproduced when SJ is performed for various cross-modal combinations using visual, olfactory, and gustatory stimuli other than red LED light, coumarin, and NaCl solution.

Introduction

Stability of perception in everyday life is preserved by integration of multimodal information. Perception of synchrony in cross-modal combinations plays an important role in maintaining perceptual stability in a continually changing environment.

When researchers examine synchrony perception in cross-modal combinations, they generally perform a simultaneity judgment (SJ) task, in which participants report whether two stimuli are presented simultaneously (e.g., [1–5]), or a temporal order judgment (TOJ) task, in which participants report which sensory modality's stimulus was perceived earlier (e.g., [6–10]). In these tasks, cross-modal combinations are presented at varying stimulus onset asynchronies (SOAs). Conventionally, the sensory modalities used in these tasks have been limited to the physical senses (vision, audition, and tactile sensation), and no studies of this kind have focused on the chemical senses (olfaction and gustation).

As methods for exploring the environment surrounding an organism, two concepts have been proposed in engineering research: “active sensing” and “passive sensing” [11,12]. Active sensing is defined as searching the environment in a manner that purposefully investigates the properties of an object. To illustrate with interactions between sensory modalities and the environment: in active sensing, the organism watches, gazes at, or carefully observes the appearance of an object with its eyes; strains to hear or listens to the sound from an object with its ears; sniffs the odor that an object gives off through its nose; and moves its hand (or paw) over the surface of an object in order to determine its texture. By contrast, passive sensing means that the organism notices the environment unexpectedly. For example, even if an organism does not set out to investigate the properties of an object purposefully, it can see the object’s appearance, hear sound from the object, smell the odor that the object gives off, or incidentally touch the surface of the object with a part of its body. Cognitive aspects, such as how the organism directs its attention toward the object, thus shape the interaction between sensation and environment.

As mentioned above, we consider both active sensing and passive sensing to be involved in vision, audition, olfaction, and tactile sensation. On the other hand, in contrast to the other four senses, gustation may be a sensory modality specialized for active sensing. In order for an organism to perceive taste in everyday life, it needs to consume food. As soon as the organism takes food into its oral cavity, it might involuntarily direct its attention to the food.

In chemical sense research, reliable and precise measurement is assured by rigidly controlling gaseous and liquid stimuli. Evans and colleagues [13] proposed the necessary conditions for measuring olfactory evoked potentials with high precision: (1) olfactory stimuli must be inserted into an air flow as a pulse in order to prevent stimulation of the trigeminal nerve system by tactile sensation; (2) the olfactory stimulus should reach 70% of its maximum concentration within 50 milliseconds; and (3) the air should be at greater than 50% humidity and approximately body temperature. The difficulty of controlling chemical stimuli can be overcome by developing stimulus presentation apparatus that satisfies all of these proposals (for olfactory stimuli, [14,15]; for gustatory stimuli, [16,17]).

As mentioned above, although vision is a physical sense, olfaction and gustation are chemical senses. Furthermore, although both passive and active sensing seem to function in vision and olfaction, gustation is likely to function only through active sensing. In this study, we investigated whether these properties of the sensory modalities affect the perception of synchrony in cross-modal combinations. We performed SJ for three combinations of cross-modal stimuli (i.e., olfactory–visual, visual–gustatory, and olfactory–gustatory combinations) with a within-subject design. We used red LED light, coumarin, and NaCl solution as the visual, olfactory, and gustatory stimuli, respectively. We determined the temporal distribution of simultaneous response rates in each cross-modal combination for each participant, and calculated approximations on the assumption that these temporal distributions were Gaussian. Using the coefficients of these approximations, we compared the half-width at half-height (HWHH) [18] among the three cross-modal combinations. Because the HWHH represents the spread of the temporal distribution of the simultaneous response rate, we defined its value as the temporal resolution of synchrony perception. Furthermore, we compared the point of subjective simultaneity (PSS) [19,20] with the point of objective simultaneity (POS, i.e., SOA = 0 milliseconds) [21] in each cross-modal combination. The PSS is the SOA value corresponding to the peak of the temporal distribution of the simultaneous response rate.

Methods

Participants

This study was conducted in accordance with the revised version of the Declaration of Helsinki. All procedures in this study were approved by the ethical committee for ergonomic experiments of the National Institute of Advanced Industrial Science and Technology, Japan. We explained the experiments to each participant in advance of the study, and informed them of their right to cease participation even after their initial agreement to participate; informed written consent was acquired from all subjects. Ten female volunteers without subjective olfactory and gustatory disorders, aged 20–25 (mean age ± standard deviation (SD) = 22.7 ± 1.7 years old), participated in the experiment.

Stimulus presentation

Visual stimulus.

In accordance with previous studies [22,23], a green LED light was used as a fixation point and to provide notice of stimulus presentation. Therefore, we selected red LED light, the complementary color of green, as the visual stimulus. The luminous body (diameter of 0.24 cm, 57.5 cd/m2) derived through an optical fiber was placed about 150 cm in front of the participant. The duration of the visual stimulus was 400 milliseconds per trial.

Olfactory stimulus.

The odorant was selected on the basis of the following criteria: (1) no stimulation of the trigeminal nerve on the olfactory mucosa, and (2) no unpleasant feeling during smelling. We presented the smell of cherry tree leaves (68.4 mM coumarin [Wako Pure Chemical Industries, Tokyo, Japan] dissolved in propylene glycol) as the olfactory stimulus, using an olfactory stimulator developed by Kobal and colleagues (“Olfactometer OM4”: Burghart Instruments, Wedel, Germany). The olfactory stimulus was inserted into an air flow as a pulse. In order to conduct real-time monitoring of stimulus presentation, a high-speed ultrasonic gas sensor [24,25] was placed at the outlet of the olfactory stimulator. The perceived intensity of the olfactory stimulus was approximately ‘moderate’ (3) on a 6-point magnitude scale (odorless: 0, barely detectable: 1, weak: 2, moderate: 3, strong: 4, very strong: 5) [26]. The duration of stimulus presentation was 400 milliseconds, and the flow rate was 7.5 liters per minute. The temperatures of the air and olfactory stimulus were adjusted to be equivalent to the temperature in the nasal cavity, i.e., about 36°C. Before starting the measurement, two experimenters confirmed that the perceived intensity of the olfactory stimulus and the temperatures of the air and olfactory stimulus at the outlet of the stimulator were appropriate for performing SJ. Additionally, white noise was presented at all times during the measurement in order to prevent the participant from detecting the timing of stimulus presentation on the basis of the noise of switching between air and the olfactory stimulus.

Gustatory stimulus.

The tastant was selected on the basis of the following criteria: (1) hydrophilic, and (2) no unpleasant feeling during tasting. We presented a salt solution (600 mM sodium chloride dissolved in deionized water) as the gustatory stimulus, using an improved version of the gustatory stimulator developed by Kobayakawa and colleagues [16,17]. The perceived intensity of the gustatory stimulus was approximately ‘moderate’ (3) on the 6-point magnitude scale. The duration of stimulus presentation was 500 milliseconds, and the flow rate was 120 milliliters per minute. The temperatures of the deionized water and gustatory stimulus were adjusted to be equivalent to the temperature of the tongue, i.e., about 36°C. Before starting the measurement, two experimenters confirmed that the perceived intensity of the gustatory stimulus and the temperatures of the deionized water and gustatory stimulus at the stimulus presentation unit (a Teflon tube in which a small hole of 0.7 × 0.3 cm was drilled into the side) were appropriate for performing SJ.

Procedure

The experiment was performed in a small room (295 cm in width × 400 cm in depth × 240 cm in height) shielded from outside noise. The door of the room was closed during measurement; a video camera and intercom were placed inside the room so that the experimenter could monitor and communicate with participants from outside.

Sessions consisting of 93 trials were conducted four times for each cross-modal combination. Each participant took part in the experiment over 5 or 6 days, and performed one or two SJ sessions per day. When two SJ sessions were performed in a day, we arranged to present different combinations of cross-modal stimuli.

In all sessions, we placed the luminous body of a green LED light derived from an optical fiber about 150 cm in front of the participant. The luminous body of the green LED light was adjacent to the luminous body of the red LED used as the visual stimulus. The green light was turned on for 7 seconds per trial as a fixation point and a notice of stimulus presentation. In the olfactory–visual and olfactory–gustatory combinations, the olfactory stimulus could be presented in either the expiratory or the inspiratory phase, resulting in a difference in perceived intensity; therefore, the participant was instructed to hold their breath while the green light was turned on. Additionally, in order to fix the position of their chin, the participant was also asked to hold in their mouth the Teflon tube for presentation of the gustatory stimulus, even in sessions that did not include a gustatory stimulus.

We regarded the gustatory stimulus in the visual–gustatory and olfactory–gustatory combinations, and the visual stimulus in the olfactory–visual combination, as the standard stimulus, and the other stimulus in each combination as the comparison stimulus. The presentation timing of the standard stimulus in each trial varied within ±500 milliseconds, centered 3 seconds after the green light was turned on. The inter-stimulus interval was about 20 seconds. Furthermore, in each combination, we prepared 31 steps from −1900 milliseconds (comparison stimulus first: negative sign) to 1900 milliseconds (standard stimulus first: positive sign) as the SOA between the standard and comparison stimuli. These SOAs were controlled automatically by a personal computer. Each SOA was incorporated randomly into a sequence of presented stimuli three times per session, and we prepared six different sequences in which the same SOA was never repeated successively. We prevented the same sequence from being used repeatedly between sessions in each cross-modal combination.
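As a sketch of the sequence-construction constraint described above, the following Python snippet builds one 93-trial session (31 SOAs × 3 presentations) in which the same SOA never occurs twice in a row. The even spacing of the 31 steps is an assumption for illustration only; the actual step values used in the study are not listed here.

```python
import random

# 31 SOA steps from -1900 ms to +1900 ms (even spacing is an assumption
# for illustration; the study's exact step values are not given here).
soas = [round(-1900 + i * (3800 / 30)) for i in range(31)]

def make_sequence(soas, repeats=3, rng=random):
    """Return a shuffled trial sequence containing each SOA `repeats`
    times, in which the same SOA never occurs twice in a row."""
    while True:  # rejection sampling; succeeds quickly for this list size
        seq = soas * repeats
        rng.shuffle(seq)
        if all(a != b for a, b in zip(seq, seq[1:])):
            return seq

sequence = make_sequence(soas)  # one 93-trial session
```

Rejection sampling is the simplest way to enforce the no-successive-repeat constraint; with 31 distinct SOAs repeated three times, a valid shuffle is found after only a handful of attempts on average.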

The participant was asked whether two stimuli, each belonging to a different modality, were presented simultaneously, and was informed that a quick judgment was not required. The participant was asked to indicate ‘1’ with their index finger if they perceived the two stimuli as synchronous, or ‘2’ with their index and middle fingers if they perceived them as asynchronous. They displayed neither number with their fingers if they did not perceive one or both stimuli while the green light was turned on. We observed and recorded the participant’s finger responses at all times using a video camera placed in the small room. Furthermore, during measurement we always conducted real-time monitoring of stimulus presentation.

Analysis

Calculation of stimulus arrival time points.

Based on a record of the real-time monitoring mentioned above, we calculated the time point at which the presented stimulus arrived at the receptor (see [27] for details).

Calculation of simultaneous response rate and approximation.

We calculated actual SOA values, using the record of real-time monitoring of stimulus presentation. These values were classified into 27 time windows, and the simultaneous response rates were calculated for every time window in each cross-modal combination. Trials in which the participant did not express a judgment with their fingers, as well as trials in which the actual value of the SOA was ≤ −2,050 milliseconds or > 2,050 milliseconds, were excluded from analysis. Out of 3,720 trials acquired for each combination, we analyzed 3,715 (adoption rate of 99.9%) for the olfactory–visual combination, 3,621 (97.3%) for the visual–gustatory combination, and 3,627 (97.5%) for the olfactory–gustatory combination.
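The binning step described above can be sketched as follows. The exact window boundaries are our assumption (27 equal-width windows spanning −2,050 to 2,050 milliseconds), chosen only to illustrate the computation; the paper does not state how the windows were delimited.

```python
import numpy as np

def response_rates(actual_soa_ms, responses, n_windows=27, limit=2050):
    """Bin actual SOA values into time windows and compute the
    simultaneous response rate (fraction of 'simultaneous' judgments)
    per window. Trials with no response (None) and trials with
    SOA <= -limit or SOA > limit are excluded, as in the paper."""
    soa, resp = [], []
    for s, r in zip(actual_soa_ms, responses):
        if r is not None and -limit < s <= limit:
            soa.append(s)
            resp.append(1 if r == "simultaneous" else 0)
    soa, resp = np.asarray(soa), np.asarray(resp)
    edges = np.linspace(-limit, limit, n_windows + 1)   # assumed equal widths
    idx = np.clip(np.digitize(soa, edges) - 1, 0, n_windows - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    rates = np.array([resp[idx == k].mean() if (idx == k).any() else np.nan
                      for k in range(n_windows)])
    return centers, rates
```

The returned window centers and rates form the temporal distribution to which the Gaussian approximation is subsequently fitted.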

Based on the simultaneity judgment responses acquired from participants, we calculated the inter-participant averages of the simultaneous response rates (i.e., values obtained by dividing the number of trials that each participant judged as "simultaneous" by the total number of trials) for all time windows in each cross-modal combination. Furthermore, following previous studies [28,29], we assumed a Gaussian distribution for the temporal distributions of simultaneous response rates and calculated ‘a’, ‘b’, and ‘c’ in y = a×exp{−(t−b)²/(2×c²)} by the method of least squares.
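A minimal sketch of this fitting step, using `scipy.optimize.curve_fit` as one common nonlinear least-squares routine (the paper does not name the software it used); the data below are synthetic and serve only to show the procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, a, b, c):
    # y = a * exp(-(t - b)^2 / (2 * c^2))
    return a * np.exp(-(t - b) ** 2 / (2 * c ** 2))

# Illustrative data: window centers (seconds) and mean simultaneous
# response rates; real data would come from the SJ task.
t = np.linspace(-2.0, 2.0, 27)
y = gaussian(t, 0.9, 0.15, 0.45)  # synthetic, noiseless example

# Least-squares estimates of a (peak height), b (PSS, s), c (spread, s).
(a, b, c), _ = curve_fit(gaussian, t, y, p0=[1.0, 0.0, 0.5])
```

Here coefficient `b` is the SOA at the distribution's peak (the PSS) and `c` governs the spread from which the HWHH is obtained.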

Comparison using HWHH and PSS.

We calculated the temporal distributions of simultaneous response rates for every participant in each cross-modal combination. We assumed a Gaussian distribution for these temporal distributions and calculated approximations by the least-squares method. The PSS corresponds to coefficient “b” of the approximation mentioned above, and the HWHH is derived from coefficient “c”.

The HWHH [18] is calculated by bisecting the interval between the two SOA values corresponding to one half of the peak of the distribution. To determine whether the HWHH differed among cross-modal combinations, we conducted a one-way repeated-measures analysis of variance (ANOVA) for HWHH with cross-modal combination as a within-subject factor. Multiple comparisons among combinations using Ryan’s method were to be performed if the ANOVA yielded a significant result.
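For a Gaussian approximation, the bisection described above has a closed form: setting y = a/2 gives the two half-height points at b ± c√(2 ln 2), so the HWHH follows directly from the fitted coefficient "c". A small sketch:

```python
import math

def hwhh(c):
    """Half-width at half-height of the Gaussian a*exp(-(t-b)^2/(2c^2)).
    Setting y = a/2 gives (t - b)^2 = 2 * c^2 * ln 2, so the half-height
    points lie at b +/- c*sqrt(2*ln 2); the half-width is that offset."""
    return abs(c) * math.sqrt(2 * math.log(2))

# e.g. a fitted spread of c = 0.45 s gives an HWHH of roughly 0.53 s
```

This makes explicit that HWHH is proportional to the fitted spread (HWHH ≈ 1.18c), so comparing HWHH across combinations is equivalent to comparing the fitted Gaussian widths.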

To determine whether the PSS was equal to the POS, we conducted a one-sample t-test for each cross-modal combination.
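The one-sample t-test against the POS (SOA = 0 milliseconds) can be sketched as follows; the per-participant PSS values below are hypothetical and serve only to show the computation.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant PSS values (ms) for one combination (n = 10).
pss = np.array([310, 420, 275, 390, 350, 410, 295, 360, 330, 405])

# Test whether the mean PSS differs from the POS (SOA = 0 ms).
t_stat, p_value = stats.ttest_1samp(pss, popmean=0.0)
```

A significant result indicates that the group-mean PSS has receded from objective simultaneity, as reported for the two combinations involving the gustatory stimulus.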

Results

Calculation of simultaneous response rate and approximation

Temporal distributions of simultaneous response rates and approximate curves for each cross-modal combination are shown in Fig 1. Approximations were as follows.

Fig 1. Temporal distributions of simultaneous response rates and approximate curves in each cross-modal combination.

We calculated actual stimulus onset asynchrony (SOA) values, using a record of real-time monitoring of stimulus presentation. The actual SOA values were classified into 27 time windows, and the simultaneous response rates were calculated for every time window in each cross-modal combination. Temporal distributions of simultaneous response rates (filled circular dots) and approximate curves (solid line) for olfactory–visual, visual–gustatory, and olfactory–gustatory combinations are shown in (a), (b), and (c), respectively. We assumed that the temporal distributions of simultaneous response rates were Gaussian, and calculated approximations by the least-squares method. The error rates of approximations for olfactory–visual, visual–gustatory, and olfactory–gustatory combinations were 0.4%, 0.1%, and 0.3%, respectively.

https://doi.org/10.1371/journal.pone.0174958.g001

In the equations above, the simultaneous response rate and time point (in seconds) are represented by “y” and “t”, respectively. The error rates of the approximations (the value obtained by dividing the sum of squares of the differences between the actual simultaneous response rates and the theoretical values derived from the approximation by the sum of squares of the actual simultaneous response rates) for the olfactory–visual, visual–gustatory, and olfactory–gustatory combinations were 0.4%, 0.1%, and 0.3%, respectively. Additionally, Pearson product-moment correlation coefficients between the actual simultaneous response rates and the theoretical values derived from the approximations were r > 0.99 in all cross-modal combinations. These results support the validity of fitting the temporal distributions of simultaneous response rates with a Gaussian distribution.
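The error-rate metric defined above (residual sum of squares divided by the sum of squares of the actual rates) can be written compactly as:

```python
import numpy as np

def approximation_error_rate(actual, fitted):
    """Sum of squared residuals between actual simultaneous response
    rates and the fitted (theoretical) values, divided by the sum of
    squares of the actual rates, expressed as a percentage."""
    actual = np.asarray(actual, dtype=float)
    fitted = np.asarray(fitted, dtype=float)
    return 100 * np.sum((actual - fitted) ** 2) / np.sum(actual ** 2)
```

A perfect fit yields 0%; the values reported above (0.1–0.4%) correspond to residuals that are tiny relative to the observed rates.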

Comparison using HWHH and PSS

We calculated the temporal distribution of simultaneous response rates for each cross-modal combination for every participant (n = 10). We assumed a Gaussian distribution for these temporal distributions and calculated the approximations by the least-squares method. The error rates of the approximations were 0.7–5.5% (mean ± SD = 2.61 ± 1.16%) across all sessions. Additionally, Pearson product-moment correlation coefficients between the actual simultaneous response rates and the theoretical values derived from the approximations were 0.95–0.99 (mean ± SD = 0.98 ± 0.01) across all sessions.

The HWHH of each cross-modal combination is shown in Fig 2(a). One-way repeated-measures ANOVA for HWHH did not demonstrate a significant main effect of combination (F (2, 18) = 3.01, p < 0.1). This result indicates that HWHH did not differ among cross-modal combinations.

Fig 2. HWHH and PSS of each cross-modal combination.

Half-width at half-height (HWHH) of each cross-modal combination is shown in (a). One-way repeated-measures analysis of variance (ANOVA) for HWHH with cross-modal combination as a within-subject factor did not demonstrate significance. The point of subjective simultaneity (PSS) of each cross-modal combination is shown in (b). In the olfactory–visual combination, stimulus onset asynchrony (SOA) values with a positive sign represent trials in which the visual stimulus led the olfactory stimulus. In the visual–gustatory combination, SOA values with a negative sign represent trials in which the visual stimulus led the gustatory stimulus. In the olfactory–gustatory combination, SOA values with a negative sign represent trials in which the olfactory stimulus led the gustatory stimulus. One-sample t-tests comparing the PSS with the point of objective simultaneity (POS) revealed significant differences for the visual–gustatory combination (t (9) = 4.83, p < 0.001) and the olfactory–gustatory combination (t (9) = 9.49, p < 0.001). Error bars: standard error (n = 10). *** p < 0.001, * p < 0.05.

https://doi.org/10.1371/journal.pone.0174958.g002

The PSS of each cross-modal combination is shown in Fig 2(b). One-sample t-tests comparing the PSS with the POS demonstrated significant differences for the visual–gustatory combination (t (9) = 4.83, p < 0.001) and the olfactory–gustatory combination (t (9) = 9.49, p < 0.001). This result indicates that the PSS of combinations involving the gustatory stimulus receded significantly from the POS.

Discussion

Validity of gustatory and olfactory stimulation devices

Needless to say, precise temporal control of stimuli is indispensable for SJ or TOJ experiments. Stimuli for the physical senses, i.e., vision, audition, and touch, are relatively easy to control temporally. On the other hand, chemical stimuli for olfaction or gustation must be handled using special techniques. We have already reported the total stimulus system for this chemical SJ experiment [27]; here, we discuss the essential features of the olfactory and gustatory stimulators.

We used an “Olfactometer OM4” (Burghart Instruments) for the olfactory stimulus, with real-time monitoring using a high-speed ultrasonic gas sensor that we developed. The rise time to 70% of maximum concentration was less than 20 milliseconds throughout all experiments. This performance satisfied the criteria proposed by Evans and colleagues [13] for measuring chemosensory event-related potentials, and was sufficient for this SJ experiment. Based on the kinetics of odor retention in the nasal cavity, we presented a 400-millisecond odor stimulus, followed by an approximately 20-second rinse. The air flow rate was 7.5 liters per minute, so the inside of the nasal cavity was washed by more than 2 liters of fresh air during every trial, which was sufficient to wash out the odorant.

We used a gustatory stimulator that we developed to measure gustatory event-related magnetic fields and potentials. As described above, the size of the stimulus area was 0.7 × 0.3 cm. Miller [30] reported that the average density of taste buds at the tip of the tongue is 116 per cm2, equivalent to ~25 taste buds in a 0.7 × 0.3 cm area. In addition, based on subjective comments obtained from our participants after the end of each SJ session, they detected the taste easily. Regarding the rise time for gustation, the calculated time for the taste solution to cover this area was about 19 ± 2.5 milliseconds, and this performance also satisfied the criteria proposed by Evans and colleagues [13]. We have already measured both event-related potentials [31] and magnetic fields [16,17] using this taste stimulator. In addition, we showed that activation of the primary gustatory area increased linearly with the log of NaCl concentration [32], but did not respond to water alone (used as a control condition).

Thus, the stimulators for chemosensation (olfaction and gustation) would be appropriate for SJ measurement.

Temporal resolution of simultaneous response rate in each cross-modal combination

HWHH did not differ among cross-modal combinations, and the simultaneous response rate reached 0% at an SOA that was approximately 600 milliseconds away from the PSS in all cross-modal combinations. Fujisaki and Nishida [18], who performed SJ using visual, audio, and tactile stimuli, reported that the simultaneous response rate reached 0% at SOAs that were 100–200 milliseconds away from the PSS. Comparison between the results of this study and those of Fujisaki and Nishida [18] revealed that the temporal resolution of synchrony perception was lower in cross-modal combinations involving one or two chemical stimuli than in combinations of physical stimuli.

Two causes might explain the results we observed. One possibility is that the HWHH of the temporal distribution exhibited a higher value in cross-modal combinations involving one or two chemical stimuli than in combinations of physical stimuli because the chemical senses (i.e., olfaction and gustation) have lower temporal resolution than the physical senses (i.e., audition, vision, and tactile sensation). Fujisaki and Nishida [18] reported that the cross-modal combinations of visual and audio stimuli and of visual and tactile stimuli exhibited significantly lower temporal resolution than the cross-modal combination of audio and tactile stimuli. They concluded that these results reflect the fact that temporal resolution in the central processing mechanism is lower for visual information than for audio and tactile information. Furthermore, they argued that their results verified the hypothesis that the temporal resolution of synchrony perception in cross-modal combinations depends on the sensory modality with the lower temporal resolution. The second possibility is that the transduction mechanisms in odor and taste receptors might be involved. In olfaction, vaporized odorants are separated into molecules. After the resultant molecules are dissolved in the viscous liquid of the olfactory mucosa, they act on cells bearing olfactory receptors. In gustation, food taken into the oral cavity is suspended in saliva and secretions from von Ebner’s glands, and is ultimately separated into molecules and ions. These molecules and ions act on the surface membrane (microvilli) of the taste cells that constitute the taste buds. Thus, it takes time for a chemical stimulus to change the electric potential of the cell membrane. If the time required for this process is not constant, its variance might reduce the temporal resolution of the chemical senses.

PSS of each cross-modal combination

The PSS increased in the following order: olfactory–visual, visual–gustatory, and olfactory–gustatory combinations. The PSS of the olfactory–visual combination was approximately equal to the POS. This finding is consistent with the results of a previous study [18] that performed SJ using visual, auditory, and tactile stimuli. On the other hand, the PSSs of the visual–gustatory and olfactory–gustatory combinations greatly receded from the POS. Such a result has never been observed in simple SJ using combinations of physical stimuli. We inferred that these results might reflect the fact that gustation is specialized for active sensing.

Titchener [33] described the relationship between information processing and attention as follows: “The object of attention comes to consciousness more quickly than the objects which we are not attending to.” Previous studies on the perception of synchrony between sensory inputs [34–38] reported that stimuli to which participants direct their attention are processed more rapidly than stimuli to which they do not, a phenomenon called the prior entry effect. In some studies of SJ and TOJ using combinations of cross-modal stimuli, participants were asked to direct their attention to one or the other of two stimuli, or to both, and the PSS was compared among these conditions. On the other hand, Yates and Nicholls [39] suggested that even if the experimental procedure of TOJ is performed extremely carefully, the possibility of response bias cannot be completely eliminated. Here, response bias is defined as the participant’s tendency, when unable to judge the order of two stimuli in a cross-modal combination, to report that the sensory modality to which they were directing their attention was perceived earlier than the modality to which they were not. Yates and Nicholls [39] speculated that SJ might be less prone to response bias than TOJ. Although it is harder for SJ than for TOJ to produce the prior entry effect, this effect is still observed even in SJ [39,40]. A previous study in which TOJ was performed using visual and tactile stimuli [41] reported that the PSS under conditions in which the participant directed their attention to only the visual stimulus, to both the visual and tactile stimuli, or to only the tactile stimulus receded from the POS in the direction of the visual stimulus first by 22, 53, and 155 milliseconds, respectively. Results similar to those of Spence and colleagues [41] were observed in TOJ using audio and visual stimuli [42]: Zampini and colleagues reported that the PSS shifted when participants directed their attention to a specific sensory modality. Based on the above, we inferred that a prior entry effect might result from the participant involuntarily directing more attention to the gustatory stimulus than to the other stimulus in each combination, so that the PSS in SJ with the visual–gustatory and olfactory–gustatory combinations shifted in the direction of the gustatory stimulus first. In everyday life, consumers must take foods into their oral cavities in order to perceive their tastes. We consider this action to be the trigger that activates active sensing in gustation. When the participant does not know when the gustatory stimulus will arrive at their tongue, as in this study, they might need to activate active sensing of gustation by involuntarily directing their attention to their tongue.

In the visual–gustatory and olfactory–gustatory combinations, although both PSSs shifted in the direction of the visual or olfactory stimulus first, the intervals between the POS and PSS differed between these two cross-modal combinations. In other words, we considered that the prior entry effect was produced more strongly in the olfactory–gustatory combination than in the visual–gustatory combination. These results might be explained by the ease with which olfactory information is confused with gustatory information in everyday life. For example, olfactory disorder patients who complain of subjective gustatory disorders can be classified into two groups: those who perform below the normal range on a gustatory test, and those who perform normally. Patients in the latter group are defined clinically as having a “flavor disorder” [43,44]. Kitano and colleagues [44] reported that about half of olfactory disorder patients with subjective gustatory disturbance suffer from a flavor disorder. In everyday life, although even healthy people might confuse input of olfactory information to the olfactory mucosa with input of gustatory information to the taste cells, as observed in patients with flavor disorder, they do not confuse input of visual information to the retina with input of gustatory information to the taste cells. We speculated that because participants were likely to confuse olfactory information with gustatory information, they might direct more attention to the gustatory stimulus in the olfactory–gustatory combination than in the visual–gustatory combination when performing SJ. On the basis of this speculation, because greater attention to the gustatory stimulus accelerates gustatory information processing, the olfactory stimulus must lead by a larger margin than the visual stimulus for the two stimuli to be perceived as simultaneous. As a result, the interval between the POS and PSS in the olfactory–gustatory combination might become larger than that in the visual–gustatory combination.

Neural processing mechanism for simultaneity judgment between cross-modal stimuli

The brain regions involved in processing information from each sensory modality have gradually been identified by non-invasive measurements such as functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. For vision, after a stimulus received by the retina arrives at the thalamus through the optic nerve, it is projected onto primary visual cortex. Processing then branches into the dorsal visual pathway, which responds to motion and spatial vision, and the ventral visual pathway, which responds to form vision, and ultimately reaches orbitofrontal cortex [45–49]. For olfaction, after a stimulus received by the olfactory mucosa arrives at the olfactory bulb through the olfactory nerve, it is projected onto piriform cortex, and then branches off to orbitofrontal cortex and thalamus [50–54]. For gustation, after a stimulus received by a taste cell in the taste buds of the fungiform papillae, which are distributed on the anterior one-third of the tongue, arrives at the thalamus through the chorda tympani nerve, it is projected onto the primary gustatory area, and ultimately reaches orbitofrontal cortex [16,17,55–57]. Psychophysical and neuroimaging studies using cross-modal stimuli have identified orbitofrontal cortex as a brain region concerned with interaction among sensory modalities [58–61]. Furthermore, single-neuron recordings in macaques verified that visual, olfactory, and gustatory information converges in orbitofrontal cortex [62]. Thus, this brain region functions both in unimodal processing of visual, olfactory, and gustatory stimuli and in bimodal (or multimodal) processing of olfactory–visual, visual–gustatory, and olfactory–gustatory combinations. In the near future, we should investigate whether orbitofrontal cortex is activated when performing SJ.

Effect of orthonasal and retronasal olfaction on simultaneity judgment

Olfaction is the only dual sensory modality that perceives both odorants in the external world and those in the body (i.e., the mouth) [63]. Orthonasal olfaction occurs when an odorant molecule is delivered via the nares to the olfactory epithelium, whereas retronasal olfaction occurs when an odorant molecule is delivered from the oral cavity via the nasopharynx and posterior choanae to the olfactory epithelium in the olfactory cleft [64]. Orthonasal and retronasal olfaction are processed in different brain regions [65,66]. Small and colleagues [51] measured brain activity by functional magnetic resonance imaging during orthonasal and retronasal presentation of four odors (butanol, farnesol, lavender, and chocolate). The results revealed that the activated brain regions depended on the route of odor presentation only when chocolate, a food odor, was used. More specifically, orthonasal presentation increased brain activity in the insula/operculum, thalamus, hippocampus, amygdala, and caudolateral orbitofrontal cortex, whereas retronasal presentation increased activity in the perigenual cingulate and medial orbitofrontal cortex. Differences between orthonasal and retronasal olfaction have also been described in previous studies of event-related potentials [67–69], detection [70,71], identification [72,73], and perceived intensity [74,75]. In this study, the olfactory stimulus was presented orthonasally. Based on these previous studies, we might obtain different results regarding simultaneity judgment for cross-modal combinations involving an olfactory stimulus, depending on whether it is presented orthonasally or retronasally.

Conclusion

Vision is a physical sense, whereas olfaction and gustation are chemical senses. When we approach the properties of sensory modalities from a different standpoint, we might reasonably suppose that both active and passive sensing function in vision and olfaction, whereas only active sensing functions in gustation. In order to examine the effect of each sensory property on synchrony perception, we had our participants perform SJ using three cross-modal combinations of olfactory–visual, visual–gustatory, and olfactory–gustatory stimuli. We used red LED light, coumarin, and NaCl solution as visual, olfactory, and gustatory stimuli, respectively. We determined the temporal distribution of simultaneous response rates in each cross-modal combination for each participant, and fitted approximations on the assumption that these temporal distributions were Gaussian. Using the coefficients of these approximations, we compared the HWHH among the cross-modal combinations and compared the PSS with the POS in each combination. The results revealed no significant differences in HWHH between any pair of cross-modal combinations. The HWHHs of the three cross-modal combinations were higher than those of the cross-modal combinations of physical stimuli obtained in a previous study [18], so we considered that cross-modal combinations involving chemical stimuli might have a low temporal resolution of synchrony perception. The PSS increased in the following order: olfactory–visual, visual–gustatory, and olfactory–gustatory combinations. The PSS of the olfactory–visual combination was approximately equal to the POS, as in the case of SJ using three physical stimuli (visual, auditory, and tactile stimuli). On the other hand, the PSSs of the visual–gustatory and olfactory–gustatory combinations receded significantly from the POS.
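The Gaussian-fitting procedure described above can be sketched in a few lines. The following is a minimal illustration, not the authors' actual analysis code: the variable names, the 50-ms SOA grid, and the synthetic response rates are assumptions made for demonstration. The PSS is taken as the fitted peak location, and the HWHH follows from the fitted standard deviation as σ·√(2 ln 2).

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, pss, sigma):
    """Gaussian model of the simultaneous-response rate as a function of SOA (ms)."""
    return amp * np.exp(-((soa - pss) ** 2) / (2.0 * sigma ** 2))

def fit_sj_curve(soas, rates):
    """Fit the Gaussian and return (PSS, HWHH).

    PSS is the fitted peak location; HWHH = sigma * sqrt(2 ln 2),
    i.e. half of the full width measured at half the peak height.
    """
    p0 = (rates.max(), soas[np.argmax(rates)], 100.0)  # rough initial guess
    (amp, pss, sigma), _ = curve_fit(gaussian, soas, rates, p0=p0)
    return pss, abs(sigma) * np.sqrt(2.0 * np.log(2.0))

# Synthetic demonstration: response rates peaking at SOA = +60 ms (hypothetical data)
soas = np.arange(-400, 401, 50, dtype=float)
rates = gaussian(soas, amp=0.9, pss=60.0, sigma=120.0)
pss, hwhh = fit_sj_curve(soas, rates)
print(round(pss), round(hwhh))  # peak near +60 ms; HWHH ≈ 120 * 1.177 ≈ 141 ms
```

A positive fitted PSS here would mean the first-listed stimulus of the pair must lead by about that interval to be perceived as simultaneous, which is how the shifts away from the POS in the gustatory combinations are read.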

This study is the first report using the method established by Gotow and Kobayakawa [27] to perform SJ for olfactory–visual, visual–gustatory, and olfactory–gustatory combinations. Therefore, in order to generalize these results as specific to the chemical senses (especially gustation) in regard to synchrony perception, we need to verify whether the same phenomena will be observed when SJ is performed for various cross-modal combinations using visual, olfactory, and gustatory stimuli other than red LED light, coumarin, and NaCl solution.

Supporting information

S1 File. The data underlying the findings in this study.

Time points of each stimulus onset, actual SOA values, and responses acquired from participants (i.e., “simultaneous” or “successive”) in olfactory–visual, visual–gustatory, and olfactory–gustatory combinations are shown. We calculated the time points of each stimulus onset and the actual SOA values, using the record of real-time monitoring of stimulus presentation.

https://doi.org/10.1371/journal.pone.0174958.s001

(XLSX)

Author Contributions

  1. Conceptualization: NG TK.
  2. Data curation: NG TK.
  3. Formal analysis: NG TK.
  4. Funding acquisition: TK.
  5. Investigation: NG TK.
  6. Methodology: NG TK.
  7. Project administration: TK.
  8. Software: TK.
  9. Supervision: TK.
  10. Validation: NG.
  11. Visualization: NG.
  12. Writing – original draft: NG.
  13. Writing – review & editing: NG TK.

References

  1. Foucher JR, Lacambre M, Pham BT, Giersch A, Elliott MA. Low time resolution in schizophrenia: Lengthened windows of simultaneity for visual, auditory and bimodal stimuli. Schizophr Res. 2007;97(1–3):118–127. pmid:17884350
  2. Navarra J, Vatakis A, Zampini M, Soto-Faraco S, Humphreys W, Spence C. Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration. Cogn Brain Res. 2005;25(2):499–507.
  3. Stevenson RA, Fister JK, Barnett ZP, Nidiffer AR, Wallace MT. Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance. Exp Brain Res. 2012;219(1):121–137. pmid:22447249
  4. van Eijk RL, Kohlrausch A, Juola JF, van de Par S. Temporal interval discrimination thresholds depend on perceived synchrony for audio-visual stimulus pairs. J Exp Psychol Hum Percept Perform. 2009;35(4):1254–1263. pmid:19653763
  5. Vatakis A, Navarra J, Soto-Faraco S, Spence C. Audiovisual temporal adaptation of speech: Temporal order versus simultaneity judgments. Exp Brain Res. 2008;185(3):521–529. pmid:17962929
  6. Boenke LT, Deliano M, Ohl FW. Stimulus duration influences perceived simultaneity in audiovisual temporal-order judgment. Exp Brain Res. 2009;198(2–3):233–244. pmid:19590862
  7. Laasonen M, Service E, Virsu V. Crossmodal temporal order and processing acuity in developmentally dyslexic young adults. Brain Lang. 2002;80(3):340–354. pmid:11896646
  8. Sugano Y, Keetels M, Vroomen J. Adaptation to motor-visual and motor-auditory temporal lags transfer across modalities. Exp Brain Res. 2010;201(3):393–399. pmid:19851760
  9. Virsu V, Lahti-Nuuttila P, Laasonen M. Crossmodal temporal processing acuity impairment aggravates with age in developmental dyslexia. Neurosci Lett. 2003;336(3):151–154. pmid:12505615
  10. Keetels M, Vroomen J. No effect of auditory-visual spatial disparity on temporal recalibration. Exp Brain Res. 2007;182(4):559–565. pmid:17598092
  11. Nelson ME, MacIver MA. Sensory acquisition in active sensing systems. J Comp Physiol A Neuroethol Sens Neural Behav Physiol. 2006;192(6):573–586. pmid:16645885
  12. Snyder JB, Nelson ME, Burdick JW, Maciver MA. Omnidirectional sensory and motor volumes in electric fish. PLoS Biol. 2007;5(11):e301. pmid:18001151
  13. Evans W, Kobal G, Lorig T, Prah J. Suggestions for collection and reporting of chemosensory (olfactory) event-related potentials. Chem Senses. 1993;18(6):751–756.
  14. Kobal G. Pain-related electrical potentials of the human nasal mucosa elicited by chemical stimulation. Pain. 1985;22(2):151–163. pmid:4047701
  15. Kobal G, Hummel C. Cerebral chemosensory evoked potentials elicited by chemical stimulation of the human olfactory and respiratory nasal mucosa. Electroencephalogr Clin Neurophysiol. 1988;71(4):241–250. pmid:2454788
  16. Kobayakawa T, Endo H, Ayabe-Kanamura S, Kumagai T, Yamaguchi Y, Kikuchi Y, et al. The primary gustatory area in human cerebral cortex studied by magnetoencephalography. Neurosci Lett. 1996;212(3):155–158. pmid:8843096
  17. Kobayakawa T, Ogawa H, Kaneda H, Ayabe-Kanamura S, Endo H, Saito S. Spatio-temporal analysis of cortical activity evoked by gustatory stimulation in humans. Chem Senses. 1999;24(2):201–209. pmid:10321821
  18. Fujisaki W, Nishida S. Audio-tactile superiority over visuo-tactile and audio-visual combinations in the temporal resolution of synchrony perception. Exp Brain Res. 2009;198(2–3):245–259. pmid:19499212
  19. Lewald J, Guski R. Auditory-visual temporal integration as a function of distance: No compensation for sound-transmission time in human perception. Neurosci Lett. 2004;357(2):119–122. pmid:15036589
  20. Vatakis A, Spence C. Evaluating the influence of frame rate on the temporal aspects of audiovisual speech perception. Neurosci Lett. 2006;405(1–2):132–136. pmid:16854524
  21. van Eijk RL, Kohlrausch A, Juola JF, van de Par S. Temporal order judgment criteria are affected by synchrony judgment sensitivity. Atten Percept Psychophys. 2010;72(8):2227–2235. pmid:21097865
  22. Navarra J, Alsius A, Velasco I, Soto-Faraco S, Spence C. Perception of audiovisual speech synchrony for native and non-native language. Brain Res. 2010;1323:84–93. pmid:20117103
  23. Vroomen J, Keetels M, de Gelder B, Bertelson P. Recalibration of temporal order perception by exposure to audio-visual asynchrony. Cogn Brain Res. 2004;22(1):32–35.
  24. Toda H, Kobayakawa T. High-speed gas concentration measurement using ultrasound. Sens Actuators A Phys. 2008;144(1):1–6.
  25. Toda H, Saito S, Yamada H, Kobayakawa T. High-speed gas sensor for chemosensory event-related potentials or magnetic fields. J Neurosci Methods. 2005;152(1–2):91–96. pmid:16257056
  26. Saito S. Measurement method for olfaction. [in Japanese] In: Oyama T, Imai S, Wake T, editors. Sensory and Perceptual Psychology Handbook New Edition. Tokyo: Seishin Shobo; 1994. pp. 1371–1382.
  27. Gotow N, Kobayakawa T. Construction of measurement system for simultaneity judgment using olfactory and gustatory stimuli. J Neurosci Methods. 2014;221:132–138.
  28. Fujisaki W, Shimojo S, Kashino M, Nishida S. Recalibration of audiovisual simultaneity. Nat Neurosci. 2004;7(7):773–778. pmid:15195098
  29. Zampini M, Guest S, Shore DI, Spence C. Audio-visual simultaneity judgments. Percept Psychophys. 2005a;67(3):531–544.
  30. Titchener EB. Lectures on the elementary psychology of feeling and attention. New York: Macmillan Company; 1908.
  31. Miller IJ Jr. Variation in human fungiform taste bud densities among regions and subjects. Anat Rec. 1986;216(4):474–482. pmid:3799995
  32. Mizoguchi C, Kobayakawa T, Saito S, Ogawa H. Gustatory evoked cortical activity in humans studied by simultaneous EEG and MEG recording. Chem Senses. 2002;27(7):629–634. pmid:12200343
  33. Kobayakawa T, Saito S, Gotow N, Ogawa H. Representation of salty taste stimulus concentrations in the primary gustatory area in humans. Chemosens Percept. 2008;1(4):227–234.
  34. Shimojo S, Miyauchi S, Hikosaka O. Visual motion sensation yielded by non-visually driven attention. Vision Res. 1997;37(12):1575–1580. pmid:9231224
  35. Shore DI, Spence C, Klein RM. Visual prior entry. Psychol Sci. 2001;12(3):205–212. pmid:11437302
  36. Vibell J, Klinge C, Zampini M, Spence C, Nobre AC. Temporal order is coded temporally in the brain: Early event-related potential latency shifts underlying prior entry in a cross-modal temporal order judgment task. J Cogn Neurosci. 2007;19(1):109–120. pmid:17214568
  37. Yates MJ, Nicholls ME. Somatosensory prior entry. Atten Percept Psychophys. 2009;71(4):847–859. pmid:19429963
  38. Zampini M, Bird KS, Bentley DE, Watson A, Barrett G, Jones AK, et al. ‘Prior entry’ for pain: Attention speeds the perceptual processing of painful stimuli. Neurosci Lett. 2007;414(1):75–79. pmid:17197082
  39. Yates MJ, Nicholls ME. Somatosensory prior entry assessed with temporal order judgments and simultaneity judgments. Atten Percept Psychophys. 2011;73(5):1586–1603. pmid:21487928
  40. Spence C, Parise C. Prior-entry: A review. Conscious Cogn. 2010;19(1):364–379. pmid:20056554
  41. Spence C, Shore DI, Klein RM. Multisensory prior entry. J Exp Psychol Gen. 2001;130(4):799–832. pmid:11757881
  42. Zampini M, Shore DI, Spence C. Audiovisual prior entry. Neurosci Lett. 2005b;381(3):217–222. pmid:15896473
  43. Fujii M, Fukazawa K, Hashimoto Y, Takayasu S, Umemoto M, Negoro A, et al. Clinical study of flavor disturbance. Acta Otolaryngol Suppl. 2004;553:109–112.
  44. Kitano M, Kobayashi M, Imanishi Y, Sakaida H, Majima Y. Clinical analysis of hyposmia-associated taste dysfunction. [in Japanese] Nihon Jibiinkoka Gakkai Kaiho. 2009;112(3):110–115. pmid:19364046
  45. Dogan M, Ozsoy E, Doganay S, Burulday V, Firat PG, Ozer A, et al. Brain diffusion-weighted imaging in diabetic patients with retinopathy. Eur Rev Med Pharmacol Sci. 2012;16(1):126–131. pmid:22338559
  46. Green MF, Glahn D, Engel SA, Nuechterlein KH, Sabb F, Strojwas M, et al. Regional brain activity associated with visual backward masking. J Cogn Neurosci. 2005;17(1):13–23. pmid:15701236
  47. Lanyon LJ, Giaschi D, Young SA, Fitzpatrick K, Diao L, Bjornson BH, et al. Combined functional MRI and diffusion tensor imaging analysis of visual motion pathways. J Neuroophthalmol. 2009;29(2):96–103. pmid:19491631
  48. Mullen KT, Thompson B, Hess RF. Responses of the human visual cortex and LGN to achromatic and chromatic temporal modulations: An fMRI study. J Vis. 2010;10(13):13. pmid:21106678
  49. Rolls ET. The functions of the orbitofrontal cortex. Brain Cogn. 2004;55(1):11–29. pmid:15134840
  50. Savic I, Gulyas B. PET shows that odors are processed both ipsilaterally and contralaterally to the stimulated nostril. Neuroreport. 2000;11(13):2861–2866. pmid:11006955
  51. Small DM, Gerber JC, Mak YE, Hummel T. Differential neural responses evoked by orthonasal versus retronasal odorant perception in humans. Neuron. 2005;47(4):593–605. pmid:16102541
  52. Suzuki Y, Critchley HD, Suckling J, Fukuda R, Williams SC, Andrew C, et al. Functional magnetic resonance imaging of odor identification: The effect of aging. J Gerontol A Biol Sci Med Sci. 2001;56(12):M756–M760. pmid:11723149
  53. Wang J, Eslinger PJ, Smith MB, Yang QX. Functional magnetic resonance imaging study of human olfaction and normal aging. J Gerontol A Biol Sci Med Sci. 2005;60(4):510–514. pmid:15933393
  54. Weismann M, Yousry I, Heuberger E, Nolte A, Ilmberger J, Kobal G, et al. Functional magnetic resonance imaging of human olfaction. Neuroimaging Clin N Am. 2001;11(2):237–250.
  55. Jacobson A, Green E, Murphy C. Age-related functional changes in gustatory and reward processing regions: An fMRI study. Neuroimage. 2010;53(2):602–610. pmid:20472070
  56. Veldhuizen MG, Albrecht J, Zelano C, Boesveldt S, Breslin P, Lundström JN. Identification of human gustatory cortex by activation likelihood estimation. Hum Brain Mapp. 2011a;32(12):2256–2266.
  57. Veldhuizen MG, Douglas D, Aschenbrenner K, Gitelman DR, Small DM. The anterior insular cortex represents breaches of taste identity expectation. J Neurosci. 2011b;31(41):14735–14744.
  58. De Araujo IET, Rolls ET, Kringelbach ML, McGlone F, Phillips N. Taste-olfactory convergence, and the representation of the pleasantness of flavour, in the human brain. Eur J Neurosci. 2003;18(7):2059–2068. pmid:14622239
  59. Gottfried JA, Dolan RJ. The nose smells what the eye sees: Crossmodal visual facilitation of human olfactory perception. Neuron. 2003;39(2):375–386. pmid:12873392
  60. Ohla K, Toepel U, le Coutre J, Hudry J. Visual-gustatory interaction: Orbitofrontal and insular cortices mediate the effect of high-calorie visual food cues on taste pleasantness. PLoS One. 2012;7(3):e32434. pmid:22431974
  61. Small DM, Voss J, Mak YE, Simmons KB, Parrish T, Gitelman D. Experience-dependent neural integration of taste and smell in the human brain. J Neurophysiol. 2004;92(3):1892–1903. pmid:15102894
  62. Rolls ET, Baylis LL. Gustatory, olfactory, and visual convergence within the primate orbitofrontal cortex. J Neurosci. 1994;14(9):5437–5452. pmid:8083747
  63. Rozin P. “Taste-smell confusions” and the duality of the olfactory sense. Percept Psychophys. 1982;31(4):397–401. pmid:7110896
  64. Leon EA, Catalanotto FA, Werning JW. Retronasal and orthonasal olfactory ability after laryngectomy. Arch Otolaryngol Head Neck Surg. 2007;133(1):32–36. pmid:17224519
  65. Iannilli E, Bult JH, Roudnitzky N, Gerber J, de Wijk RA, Hummel T. Oral texture influences the neural processing of ortho- and retronasal odors in humans. Brain Res. 2014;1587:77–87. pmid:25175838
  66. Shepherd GM. Smell images and the flavour system in the human brain. Nature. 2006;444:316–321. pmid:17108956
  67. Heilmann S, Hummel T. A new method for comparing orthonasal and retronasal olfaction. Behav Neurosci. 2004;118(2):412–419. pmid:15113268
  68. Landis BN, Frasnelli J, Reden J, Lacroix JS, Hummel T. Differences between orthonasal and retronasal olfactory functions in patients with loss of the sense of smell. Arch Otolaryngol Head Neck Surg. 2005;131(11):977–981. pmid:16301369
  69. Welge-Lüssen A, Husner A, Wolfensberger M, Hummel T. Influence of simultaneous gustatory stimuli on orthonasal and retronasal olfaction. Neurosci Lett. 2009;454(2):124–128. pmid:19429068
  70. Botezatu A, Pickering GJ. Determination of ortho- and retronasal detection thresholds and odor impact of 2,5-dimethyl-3-methoxypyrazine in wine. J Food Sci. 2012;77(11):S394–S398. pmid:23057415
  71. Voirol E, Daget N. Comparative study of nasal and retronasal olfactory perception. Lebensmittel-Wissenschaft Technol. 1986;19(4):316–319.
  72. Gagnon L, Ismaili ARA, Ptito M, Kupers R. Superior orthonasal but not retronasal olfactory skills in congenital blindness. PLoS ONE. 2015;10(3):e0122567. pmid:25822780
  73. Sun BC, Halpern BP. Identification of air phase retronasal and orthonasal odorant pairs. Chem Senses. 2005;30(8):693–706. pmid:16177226
  74. Koza BJ, Cilmi A, Dolese M, Zellner DA. Color enhances orthonasal olfactory intensity and reduces retronasal olfactory intensity. Chem Senses. 2005;30(8):643–649. pmid:16141290
  75. Lee J, Halpern BP. High-resolution time–intensity tracking of sustained human orthonasal and retronasal smelling during natural breathing. Chemosens Percept. 2013;6(1):20–35.