Cogn Process DOI 10.1007/s10339-012-0533-1
SHORT REPORT
Verifying properties of concepts spontaneously requires sharing resources with same-modality percept

Nicolas Vermeulen • Betty Chang • Olivier Corneille • Gordy Pleyers • Martial Mermillod
Received: 11 June 2012 / Accepted: 18 December 2012
© Marta Olivetti Belardinelli and Springer-Verlag Berlin Heidelberg 2013
Abstract In the present experiments, participants had to verify properties of concepts but, depending on the trial condition, concept-property pairs were presented via headphones or on the screen. The results showed that participants took longer and were less accurate at verifying conceptual properties when the channel used to present the CONCEPT-property pair and the type of property matched in sensory modality (e.g., LEMON-yellow on screen; BLENDER-loud in headphones) compared to when properties and channel did not match (e.g., LEMON-yellow in headphones; BLENDER-loud on screen). Such interference is consistent with theories of embodied cognition holding that knowledge is grounded in modality-specific systems (Barsalou in Behav Brain Sci 22:577–660, 1999). When the resources of one modality are burdened during the task, processing costs are incurred in a conceptual task (Vermeulen et al. in Cognition 109:287–294, 2008).

Keywords Embodied cognition · Concepts · Load · Resources · Simulation · Knowledge · Attention · Perception

N. Vermeulen (&) · O. Corneille · G. Pleyers
Research Institute for Psychological Sciences, Université Catholique de Louvain (UCL), 10 Place Cardinal Mercier, 1348 Louvain-la-Neuve, Belgium
e-mail: [email protected]

N. Vermeulen
Belgian National Fund for Scientific Research, Brussels, Belgium

B. Chang
Department of Psychology, University of Sheffield, Western Bank, Sheffield S10 2TP, UK

M. Mermillod
Laboratoire de Psychologie et Neurocognition, Université Pierre Mendès-France (CNRS UMR 5105), 38040 Grenoble, France

M. Mermillod
Institut Universitaire de France, 103, bd Saint-Michel, 75005 Paris, France
Introduction

Theories of grounded cognition propose that knowledge directly depends on simulations in sensory-motor systems rather than on abstract amodal symbols (Barsalou 1999). There is a large body of evidence on the role of sensory systems during knowledge access. In conceptual designs, research has shown that people are faster and more accurate in deciding whether a property (e.g., rustling) is a typical feature of a concept (e.g., LEAVES) when the property verified on the previous trial belongs to the same modality as the current property (e.g., BLENDER-loud) rather than a different modality (e.g., LEMON-sour) (Pecher et al. 2003). This research was also extended to the verification of affective properties (e.g., ORPHAN-hopeless, SPIDER-black) (Vermeulen et al. 2007). Importantly, in perceptual-conceptual designs, research shows that sensorily related perceptual and conceptual representations share sensory systems during online processing. For instance, in one study, participants had to localize an auditory, visual, or tactile stimulus prior to verifying the sensory properties of a concept (e.g., judging whether a BANANA is yellow) (van Dantzig et al. 2008). Verification responses were faster when the modality of the previously localized stimulus matched the modality of the verified property, demonstrating the facilitative influence of perception on concept processing. Conversely, research has shown that
specific types of interference (i.e., inhibition) are observed when a sensory load is imposed upon participants as they perform conceptual tasks in parallel, multitasking designs. Studies using primarily a visual load support the idea that viewing or imagining a visual scene impairs the representation of visually related concepts (more so than hearing a description of the same scene). In a study by Mayer and Moreno (1998), for instance, participants who viewed animations depicting the formation of lightning while listening to a corresponding narration showed better comprehension than participants who viewed the same animations while reading a text consisting of the same words as the narration. In order to examine the perceptual-conceptual interference hypothesis more directly, Vermeulen et al. (2008) manipulated both the type of sensory load (auditory/visual) and the modality of the processed conceptual properties (auditory/visual). This study showed that under a high (but not low) sensory load (i.e., three to-be-remembered sounds or pictures), visual and auditory property verification took longer under an intra-modal sensory load (i.e., responding whether a BLENDER can be loud while memorizing three sounds) than under an inter-modal sensory load (i.e., performing the same task while memorizing three pictures). This interaction was interpreted as reflecting a possible overload of a "shared" sensory capacity drawn on by processes that were both on-line (during the manipulation of perceptual information) and off-line (during the conceptual representation of sensory properties).

Interestingly, recent research has shown that grounded cognition effects may even occur unintentionally (or spontaneously) during simple word processing (Hauk et al. 2004; Vermeulen et al. 2009). In other words, even when participants are not explicitly requested to represent the sensory properties of concepts, they nevertheless show evidence of sensory simulation of those modal properties.

In sum, the literature reviewed above supports two conclusions: (1) intentional knowledge representation (i.e., property verification) involves simulation in the sensory systems; (2) this simulation may involve modality-specific resources, with the resources involved in processing visual (or auditory) information also used for representing concepts that have a strong visual (or auditory) component. However, to our knowledge, no prior study has examined whether property verification performance is influenced by the channel used to display the stimuli. Indeed, all previously published studies presented the concept-property associations onscreen, and none examined whether presenting the associations via headphones could influence performance through a reduction in available modal resources. If conceptual processing involves simulation in modality-specific sensory-motor systems, then property verification should be disrupted more by a simultaneous load in the
same modality that encodes the conceptual property of interest. In order to increase the modal load, prior to each property verification trial, participants had to make a sensory judgment in the visual or the auditory channel (deciding whether a sound or a picture was of low or high intensity), forcing them to switch between channels. We hypothesized that the verification of auditory properties of concepts would be slower and less accurate when the concepts were presented through the auditory channel than through the visual channel. The reverse would be true for visual properties of concepts (faster and more accurate verification in the auditory channel than in the visual channel).
Study 1

Method

Subjects and design

Twenty volunteers (17 women; mean age 20.60 years, SD = 3.07) from the Catholic University of Louvain (Belgium) participated in fulfilment of a course requirement. The study conformed to a 2 (Perceptual judgment: Auditory vs. Visual) × 2 (Conceptual property: Auditory vs. Visual) × 2 (Channel: Auditory vs. Visual) full within-subjects design.

Materials

Eighty critical concept-property associations were developed based on Vermeulen et al. (2007, 2008). Forty associations involved a visual property (e.g., CHEDDAR-orange), and forty involved an auditory property (e.g., BLENDER-loud). An additional 130 filler associations were used to conceal the experimental aims: 48 true associations that were neither visual nor auditory (CUCUMBER-bland) and 82 false associations (ANIMAL-mineral). Concepts and properties of many false trials were semantically associated to ensure that participants would process the conceptual properties on these trials (CAR-asphalt). Half of the 80 experimental and 130 filler associations were presented visually on the computer screen and half were presented auditorily through headphones. Thus, the channel of presentation of the concept-property associations was either visual or auditory, with 20 visual-property, 20 auditory-property, and 65 filler associations presented onscreen, and 20 visual-property, 20 auditory-property, and 65 filler associations presented through headphones. For auditory judgments, a target sound (i.e., "DING") of low and high intensity was created. For visual judgments, a target picture (i.e., a square) of low and high intensity was created. Two hundred and ten perceptual judgment/conceptual verification pairs were then constructed by coupling specific perceptual judgments with specific CONCEPT-property associations. Half of the pairs involved an auditory judgment, and the other half involved a pictorial judgment (50 % of which were of low versus high intensity in both cases).

Procedure

Participants were tested in a computer room. Seated in front of a monitor, they donned headphones and read computer-presented instructions. They were then informed that the first task consisted of assessing the intensity of a sound or a picture presented, respectively, via the headphones or onscreen (perceptual judgment). They were to respond by pressing "1" on the keypad for low-intensity stimuli and "3" for high-intensity stimuli. This task was implemented in order to increase the load on specific modalities. The second task, interspersed with the first, involved verifying whether a property was usually true of its associated CONCEPT. Participants were asked to answer as quickly and accurately as possible on both tasks.

Each trial started with a perceptual judgment (Fig. 1). A sound or a picture was presented for 500 ms, followed by a blank screen. Once the response was produced, the concept-property verification item was presented onscreen or over the headphones. For visual presentations, a fixation stimulus ("*") was presented for 500 ms and replaced by two vertically aligned lines of text for the concept-property verification pair. The first line contained the CONCEPT word in upper case, and the second line contained the property in lower case; both lines were printed in Arial Bold 20. The two lines appeared simultaneously and remained onscreen until the participant made a "true" ("C" key) or "false" ("B" key) judgment. For auditory presentations, a blank screen was presented for 500 ms, replaced by the auditory presentation of the association, and followed by a blank screen that remained until a response was given. The next trial started 1,000 ms after the conceptual verification response was made. The experiment was divided into four blocks of 52 or 53 trials. Before beginning, participants were presented with an example of each perceptual judgment (e.g., "this is a low intensity picture") and completed training trials.
Fig. 1 Examples of trials used in the present experiment
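To make the trial structure concrete, the following Python sketch (ours, not the authors' materials) builds a trial list with the counterbalancing described above: 40 visual-property and 40 auditory-property critical associations plus 130 fillers, each set split evenly between the visual and auditory channels, with each verification trial coupled to a low- or high-intensity auditory or visual perceptual judgment. Item labels, field names, and the random seed are placeholders.

```python
import random

random.seed(1)  # arbitrary; the actual randomization scheme is not reported

def make_items(prefix, modality, n):
    # Placeholder concept-property pairs; the real items came from
    # Vermeulen et al. (2007, 2008), e.g., CHEDDAR-orange, BLENDER-loud.
    return [{"item": f"{prefix}_{i}", "property_modality": modality}
            for i in range(n)]

visual_items = make_items("VIS", "visual", 40)
auditory_items = make_items("AUD", "auditory", 40)
fillers = make_items("FILLER", "none", 130)  # 48 true + 82 false in the paper

trials = []
for items, n_onscreen in ((visual_items, 20), (auditory_items, 20),
                          (fillers, 65)):
    random.shuffle(items)
    for k, item in enumerate(items):
        # Half of each item set is shown onscreen, half over headphones.
        channel = "visual" if k < n_onscreen else "auditory"
        trials.append(dict(item, channel=channel))

# Couple each verification trial with a preceding perceptual judgment:
# a sound ("DING") or a picture (a square), of low or high intensity.
judgments = [{"judgment": j, "intensity": i}
             for j in ("auditory", "visual") for i in ("low", "high")] * 53
random.shuffle(judgments)
random.shuffle(trials)
for trial, judgment in zip(trials, judgments):  # zip stops at 210 trials
    trial.update(judgment)

assert len(trials) == 210
```

Because the 212 judgment slots are trimmed to 210 trials, the judgment modalities and intensities here are only approximately balanced; the exact 50/50 split reported in the paper would require assigning counts per design cell explicitly.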
Results

Response times (RTs) on CONCEPT-property trials to which participants responded accurately on both the perceptual and the conceptual judgments were retained for analysis. RTs shorter than 300 ms or longer than 3,000 ms were removed from the analysis.

For property verifications, we conducted a 2 (Perceptual Judgment: Auditory vs. Visual) × 2 (Conceptual Property: Auditory vs. Visual) × 2 (Presentation Channel: Auditory vs. Visual) MANOVA on response times (RTs) and accuracy. This analysis revealed that RTs for auditory presentations (M = 1,870 ms; SE = 38) were slower than RTs for visual presentations (M = 1,409 ms; SE = 61), F(1, 19) = 144.31, p < .001, MSE = 58,909.87. Importantly, a Channel × Property interaction emerged, F(1, 19) = 4.77, p < .05, MSE = 13,285.17, with slower property verifications when the presentation channel and the modality of the conceptual property matched (M = 1,660 ms; SE = .56) than when they mismatched (M = 1,620 ms; SE = .47) (see Fig. 2a). Channel also interacted with Judgment, F(1, 19) = 11.89, p < .01, MSE = 11,880.42, with slower property verifications when Judgment and Channel modality matched (M = 1,670 ms; SE = .51) than when they mismatched (M = 1,610 ms; SE = .50) (Fig. 2b). No other effects were significant.

Accuracy analyses revealed a main effect of conceptual Property, F(1, 19) = 13.19, p < .01, MSE = .023, with more accurate verification for visual (M = .913; SE = .02) than auditory (M = .826; SE = .03) properties. A main effect of Judgment was also observed, F(1, 19) = 5.39, p < .05, MSE = .008, with more accurate property verifications preceded by auditory (M = .886; SE = .02) than visual (M = .853; SE = .02) judgments. Importantly, a Channel × Property interaction emerged, F(1, 19) = 4.90, p < .05, MSE = .007, showing less accurate property verifications when Channel and Property matched in modality (M = .855; SE = .02) than when they mismatched (M = .884; SE = .02) (Fig. 2a). Channel also interacted with Judgment, F(1, 19) = 16.29, p < .001, MSE = .013, with less accurate property verifications when Judgment and Channel matched (M = .833; SE = .02) than when they mismatched (M = .907; SE = .02) (Fig. 2b). A Judgment × Property interaction was also observed, F(1, 19) = 5.59, p < .05, MSE = .009, showing less accurate property verifications when the prior Judgment and the Property matched in sensory modality (M = .852; SE = .03) than when they mismatched (M = .888; SE = .02) (Fig. 2c). Interestingly, as Fig. 2c shows, the interaction between Judgment and Property was present for visual properties, t(19) = 3.44, p = .003, but not for auditory properties, t(19) < 1, ns. No other significant effects were observed.

Fig. 2 Performance of the participants for property verifications as a function of presentation channel and property modality (a), presentation channel and modality of perceptual judgment (b), and property modality and modality of perceptual judgment (c)
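The pipeline just described — accuracy-conditioned trial selection, RT trimming, and a 2 × 2 × 2 within-subjects analysis — can be sketched in Python. This is our illustrative reconstruction, not the authors' code: the CSV file and column names are hypothetical, and where the paper reports a MANOVA on RTs and accuracy, the sketch fits a univariate repeated-measures ANOVA on RTs with statsmodels' AnovaRM as a stand-in.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

trials = pd.read_csv("study1_trials.csv")  # hypothetical trial-level file

# Keep trials answered correctly on both tasks, and drop RTs outside
# the 300-3,000 ms window stated in the paper.
keep = (trials["judgment_correct"].astype(bool)
        & trials["verification_correct"].astype(bool)
        & trials["rt_ms"].between(300, 3000))
clean = trials[keep]

# One mean RT per participant and design cell (2 Judgment x 2 Property
# x 2 Channel), since AnovaRM expects a balanced within-subjects layout.
cells = (clean
         .groupby(["subject", "judgment", "property", "channel"],
                  as_index=False)["rt_ms"]
         .mean())

res = AnovaRM(cells, depvar="rt_ms", subject="subject",
              within=["judgment", "property", "channel"]).fit()
print(res)  # F tests, including the Channel x Property interaction
```

The same pipeline applies to the accuracy measure and, with the judgment factor dropped, to Study 2 below.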
In order to examine whether the effect would also be observed under low load, we ran a second study in which the secondary perceptual judgment task was omitted from the design. Based on the literature reviewed above (van Dantzig et al. 2008; Vermeulen et al. 2008; but see also Vermeulen et al. 2009), we expected no interaction effect to appear in this single-task setting. A facilitation effect might even appear (van Dantzig et al. 2008) when the channel and the to-be-verified properties belong to the same modality (e.g., MIXER-loud via headphones): conversely to Study 1, facilitation should appear if the task does not tax participants' modal resources (Vermeulen et al. 2008).

Study 2

Method

Subjects and design

Nineteen volunteers (10 women; mean age 26.16 years, SD = 3.15) from the Catholic University of Louvain (Belgium) participated on a voluntary basis. The study conformed to a 2 (Conceptual property: Auditory vs. Visual) × 2 (Channel: Auditory vs. Visual) full within-subjects design.

Materials

The same concept-property associations as in Study 1 were used.

Procedure

Participants were tested individually. Seated in front of a monitor, they donned headphones and read computer-presented instructions. The task was exactly the same as in Study 1, except that the secondary perceptual judgment task was omitted: participants performed only the concept-property verification task.

Results

Response times (RTs) on CONCEPT-property trials to which participants responded accurately were retained for analysis. RTs shorter than 300 ms or longer than 3,000 ms were removed from the analysis. We conducted a 2 (Conceptual Property: Auditory vs. Visual) × 2 (Presentation Channel: Auditory vs. Visual) MANOVA on response times (RTs) and accuracy (Table 1). This analysis revealed that RTs for auditory presentations (M = 1,898 ms; SE = 31) were slower than RTs for visual presentations (M = 1,406 ms; SE = 58), F(1, 18) = 145.12, p < .001, MSE = 4,612,979.41. Neither the effect of conceptual property nor the Channel × Property interaction emerged, Fs(1, 18) < 1, ns. Collectively, the analysis showed that the main effect of channel was equivalent for visual properties (Visual channel: M = 1,395 ms, SE = .64; Auditory channel: M = 1,889 ms, SE = .32) and auditory properties (Visual channel: M = 1,417 ms, SE = .55; Auditory channel: M = 1,908 ms, SE = .37).

Accuracy analyses revealed only a main effect of conceptual Property, F(1, 18) = 10.04, p < .01, MSE = .114, with more accurate verification for visual (M = .922; SE = .02) than auditory (M = .845; SE = .03) properties. Neither the effect of channel, F(1, 18) < 1, ns, nor the Channel × Property interaction, F(1, 18) = 1.39, p = .25, was significant. Collectively, the analysis showed that the main effect of conceptual property was equivalent in the visual channel (Visual properties: M = .934, SE = .02; Auditory properties: M = .845, SE = .04) and the auditory channel (Visual properties: M = .911, SE = .02; Auditory properties: M = .845, SE = .03).

Table 1 Mean performances (SE) of the participants for property verifications as a function of presentation channel, property modality, and modality of perceptual judgment

Perceptual   Dependent   Visual channel                       Auditory channel
judgment     variable    Visual props      Auditory props     Visual props      Auditory props
Visual       RT          1,467 ms (74)     1,415 ms (63)      1,834 ms (36)     1,851 ms (49)
Visual       Accuracy    .835 (.03)        .800 (.04)         .920 (.02)        .855 (.03)
Auditory     RT          1,421 ms (75)     1,334 ms (55)      1,896 ms (44)     1,899 ms (40)
Auditory     Accuracy    .965 (.02)        .885 (.03)         .930 (.02)        .765 (.04)
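As a quick arithmetic check (ours, not reported in the paper), the Study 1 match and mismatch RT means for the Channel × Property interaction can be recovered from the Table 1 cells, assuming they are simple unweighted averages over judgment modality:

```python
# RT cells from Table 1, keyed by (channel, property modality); each list
# holds the visual-judgment and auditory-judgment cell means in ms.
rt = {
    ("visual", "visual"): [1467, 1421],
    ("visual", "auditory"): [1415, 1334],
    ("auditory", "visual"): [1834, 1896],
    ("auditory", "auditory"): [1851, 1899],
}
match = rt[("visual", "visual")] + rt[("auditory", "auditory")]
mismatch = rt[("visual", "auditory")] + rt[("auditory", "visual")]
print(sum(match) / len(match))        # 1659.5  -> reported M = 1,660 ms
print(sum(mismatch) / len(mismatch))  # 1619.75 -> reported M = 1,620 ms
```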
General discussion

Participants took longer and were less accurate at verifying conceptual properties when the channel used to present the property and the type of property matched in sensory modality. This finding supports the embodied cognition account, which proposes that conceptual processing is supported by simulations in modality-specific sensory-motor systems (Barsalou 1999). It is also consistent with evidence that sensory modalities rely on independent attentional resources (Rees et al. 2001). If resources are modality-specific (Duncan et al. 1997), then sharing modality resources between the presentation channel and property verification should yield a processing cost.

The findings also revealed less efficient property verification when the channel used for presenting property information matched the modality of the preceding perceptual judgment. We believe that, after the judgment, participants may have inhibited the perceptual modality they had just used in order to maximize attention to both channels for the upcoming CONCEPT-property association. Accordingly, orienting attention back to the inhibited modality was effortful and time-consuming, resulting in less efficient property verifications when the perceptual judgment and the presentation channel matched in modality. Although speculative, this account seems reasonable in the context of a task that was complex and involved an equal probability of receiving the concept-property pairs via the visual or the auditory channel. Interestingly also, the interaction between Judgment and Property was only significant for visual properties, which could be related to the typical visual dominance found in the literature (e.g., Spence et al. 2001). This visual dominance may increase the likelihood of observing influences in this modality.

We also ran a second, "baseline" study in which participants had to verify properties of concepts presented onscreen or via headphones, without the secondary perceptual judgment task. The results showed only main effects of channel and property, confirming visual dominance in humans (Spence et al. 2001). It might still be objected that our interference effect (i.e., switching benefits) appeared in Study 1 only because participants invoke modality-specific representations when the modalities are activated by the interspersed perceptual task. It is important to note, however, that even in the absence of a perceptual task, a channel is always activated by the presentation of the CONCEPT-property pairs (onscreen or via the headset). Thus, channel activation was always present independently of the perceptual judgment participants had to perform. The absence of a Channel × Property interaction in Study 2 may instead mean that a sufficient amount of modal load
is needed in order to observe modal congruency costs or modality-switching benefits (as found in Study 1). This interpretation is in line with previous work (Vermeulen et al. 2008; Riou et al. 2011) showing, for instance, that modal representation of concepts is hampered only when a high modal working memory load (e.g., three pictures or three sounds) is imposed on participants while they verify properties; no effect appeared under low load (one item to store). Moreover, some studies even show a facilitation effect (modality-switching costs, or congruency benefits) when perceptual judgments precede property verification trials that are always presented onscreen (van Dantzig et al. 2008). Interestingly, Study 2 showed, for visual properties at least, a non-significant trend toward a facilitation effect when property and channel matched.

To sum up, the present findings demonstrate that the verification of conceptual properties such as loud or yellow depends upon the availability of attentional resources in the same modality. Participants were less efficient at verifying properties when the channel used for the CONCEPT-property presentation and the to-be-verified property matched in sensory modality. The difficulty of the task or, more generally, the load imposed upon participants as they perform a conceptual task was indeed the likely and predicted cause of the interference effect (Vermeulen et al. 2008).
References

Barsalou LW (1999) Perceptual symbol systems. Behav Brain Sci 22:577–660

Duncan J, Martens S, Ward R (1997) Restricted attentional capacity within but not between sensory modalities. Nature 387:808–810

Hauk O, Johnsrude I, Pulvermüller F (2004) Somatotopic representation of action words in human motor and premotor cortex. Neuron 41:301–307

Mayer RE, Moreno R (1998) A split-attention effect in multimedia learning: evidence for dual processing systems in working memory. J Educ Psychol 90:312–320

Pecher D, Zeelenberg R, Barsalou LW (2003) Verifying different-modality properties for concepts produces switching costs. Psychol Sci 14:119–124

Rees G, Frith C, Lavie N (2001) Processing of irrelevant visual motion during performance of an auditory attention task. Neuropsychologia 39:937–949

Riou B, Lesourd M, Brunel L, Versace R (2011) Visual memory and visual perception: when memory improves visual search. Mem Cogn 39:1094–1102

Spence C, Nicholls MER, Driver J (2001) The cost of expecting events in the wrong sensory modality. Percept Psychophys 63:330–336

van Dantzig S, Pecher D, Zeelenberg R, Barsalou LW (2008) Perceptual processing affects conceptual processing. Cogn Sci 32:579–590
Vermeulen N, Niedenthal PM, Luminet O (2007) Switching between sensory and affective systems incurs processing costs. Cogn Sci 31:183–192

Vermeulen N, Corneille O, Niedenthal PM (2008) Sensory load incurs conceptual processing costs. Cognition 109:287–294

Vermeulen N, Mermillod M, Godefroid J, Corneille O (2009) Unintended embodiment of concepts into percepts: sensory activation boosts attention for same-modality concepts in the attentional blink paradigm. Cognition 112:467–472