COGNITION AND EMOTION, 2015 http://dx.doi.org/10.1080/02699931.2015.1020051

The time course of attentional modulation on emotional conflict processing

Pingyan Zhou1,2, Guochun Yang1,3, Weizhi Nan1,3, and Xun Liu1

1 Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
2 School of Psychology, Beijing Normal University, Beijing, China
3 University of Chinese Academy of Sciences, Beijing, China


(Received 20 April 2014; accepted 12 February 2015)

Cognitive conflict resolution is critical to human survival in a rapidly changing environment, whereas emotional conflict processing seems particularly important for human social interaction. This study examined whether the time course of attentional modulation on emotional conflict processing differs from that on cognitive conflict processing during a flanker task. Results showed that emotional N200 and P300 effects, like those for colour conflict processing, appeared only during the relevant task. However, the emotional N200 effect preceded the colour N200 effect, indicating that emotional conflict can be identified earlier than cognitive conflict. Additionally, a significant emotional N100 effect revealed that differences in emotional valence could be perceived during early processing on the basis of coarse aspects of the input. The present data suggest that emotional conflict processing is modulated by top-down attention, similar to cognitive conflict processing (as reflected by the N200 and P300 effects), but that emotional conflict processing holds a temporal advantage at two distinct processing stages.

Keywords: Emotional conflict; Cognitive conflict; Top-down attention; N200; N100; P300.

Conflict between opposing sources of information slows and hampers performance, so detecting and resolving such conflict is essential. Resolving emotional conflict in particular is closely tied to human social adaptation and emotion regulation, and it is vital for maintaining coherent goal-directed behaviour. Valence-induced emotional conflict was first investigated with the emotional facial Stroop task developed by Stenberg, Wiking, and Dahl (1998). In that study, participants were required to determine the valence of emotional words superimposed

onto emotional faces. Results showed that reaction times (RTs) were longer for emotional incongruent compared with emotional congruent trials, indicating that facial expressions likely interfered with emotional word evaluation (Beall & Herbert, 2008; Egner, Etkin, Gale, & Hirsch, 2008; Etkin, Egner, Peraza, Kandel, & Hirsch, 2006; Haas, Omura, Constable, & Canli, 2006; Stenberg et al., 1998; Zhu, Zhang, Wu, Luo, & Luo, 2010). Additional research has attempted to dissociate the neural systems involved in resolving emotional conflict

Correspondence should be addressed to: Xun Liu, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Chaoyang District, Beijing 100101, China. E-mail: [email protected] © 2015 Taylor & Francis


versus cognitive conflict during the emotional facial Stroop task (Egner et al., 2008; Ochsner, Hughes, Robertson, Cooper, & Gabrieli, 2008). In Egner et al.'s (2008) studies, participants were required to identify the gender or facial expression displayed in a series of pictures. Results showed that emotional and gender conflict shared a common conflict-detection mechanism, but not the same conflict-resolution mechanism. However, most previous studies have investigated the time course of emotional versus cognitive conflict separately (Dong, Yang, & Shen, 2009; Fan et al., 2007; Heil, Osman, Wiegelmann, Rolke, & Hennighausen, 2000; Horstmann, Borgstedt, & Heumann, 2006; Kopp, Rist, & Mattler, 2007; Lichtenstein-Vidne, Henik, & Safadi, 2012; Liu, Xiao, & Shi, 2013; Schirmer & Kotz, 2003; Tillman & Wiens, 2011; Xue & Qiu, 2013; Zhu et al., 2010). Results from previous studies assessing emotional conflict have been inconclusive. For instance, Schirmer and Kotz (2003) suggested that emotional conflict occurs with the emergence of the N400. Conversely, Zhu et al. (2010) found the N170 amplitude to be more negative for emotional than cognitive conflict. A time course analysis of emotional conflict processing could help clarify these inconsistent findings. Furthermore, very few studies have recorded event-related potentials (ERP) to directly contrast the neural time course of emotional versus cognitive conflict processing (Alguacil, Tudela, & Ruz, 2013). Alguacil et al. (2013) suggested that processing of emotional and cognitive conflicts shares a common early perceptual mechanism but relies on different later conflict-resolution mechanisms. However, they did not directly contrast the time advantage between cognitive control and affective control. Therefore, we adapted the cognitive and affective versions of a flanker task developed by Ochsner et al. (2008) and directly contrasted the neural time course of emotional versus colour conflict processing by recording ERP within the same participants. Several studies suggest that cognitive conflict processing is modulated by attention (Kimura, Katayama, & Murohashi, 2005; Mao & Wang, 2008; Wang, Wang, Cui, Tian, & Zhang, 2002;


Wang, Wang, Kong, Cui, & Tian, 2001). For instance, Wang and colleagues had participants determine whether a second stimulus (S2) was the same as the first stimulus (S1) presented. Results indicated that the N270, reflecting endogenous conflict processing, was elicited when S2 was in conflict with S1, and N270 amplitude differences were enhanced during task-relevant compared with task-irrelevant conflict (Kimura et al., 2005; Mao & Wang, 2008; Wang et al., 2001, 2002). It is also well documented that emotional stimuli are preferentially processed over non-emotional stimuli through biased allocation of attention (Anderson, 2005; Reeck & Egner, 2011; Vuilleumier, Armony, Driver, & Dolan, 2001). It is unclear whether the interaction between attention and emotional conflict processing differs from the interaction between attention and cognitive conflict processing. Thus, by addressing this question from two aspects, the current study explored the time course of attentional modulation on emotional versus cognitive conflict processing. First, to investigate the interaction between top-down attention and emotional versus cognitive conflict processing, we adopted a classic stimulus–response compatibility (SRC) task, the Eriksen flanker task (Eriksen & Eriksen, 1974), because it conveniently allows the role of top-down attention to be manipulated. The typical flanker task is used to examine cognitive conflict. Responses to a target stimulus are typically slower and less accurate when the flankers and target are incongruent relative to when they are congruent; this has been described as the flanker effect (a kind of SRC effect) (Botvinick, Braver, Barch, Carter, & Cohen, 2001; Eriksen & Eriksen, 1974; Mattler, 2006). An affective version of the flanker task has also been adopted to examine emotional conflict (Chen et al., 2014; Dong et al., 2009; Horstmann et al., 2006; Lichtenstein-Vidne et al., 2012). Here, an emotional target is flanked by two congruent or incongruent facial expressions. Valence incongruence between a target and flankers slows performance (Alguacil et al., 2013; Liu et al., 2013; Ochsner et al., 2008). Thus, the current study comprised two experiments employing male and female faces displaying neutral, happy and


fearful expressions as stimuli. The reason for including a gender dimension, along with a colour/emotion dimension, was to generate two levels of processing (task-relevant and task-irrelevant) so that top-down attentional modulation could be examined. During the colour–gender task, when participants were required to identify the colour of the target, colour processing was task-relevant and gender processing was task-irrelevant. When the task was to identify the gender of the target, gender processing and colour processing were task-relevant and task-irrelevant, respectively. If colour or gender conflict processing were modulated by top-down attention, the colour or gender SRC effect would be larger during the relevant task than during the irrelevant task. During the emotion–gender task, stimuli were grey scale faces displaying either happy or fearful expressions. When participants were required to identify the emotion of the target, emotion processing was task-relevant and gender processing was task-irrelevant. Conversely, when the task was to identify the gender of the target, gender processing was task-relevant and emotion processing was task-irrelevant. We investigated whether emotion or gender conflict processing was modulated by top-down attentional feedback. If emotion is processed outside of attention and/or consumes little or no attentional capacity (Moors & De Houwer, 2006), the SRC effects should not differ significantly between the relevant and irrelevant tasks. Alternatively, if emotional or gender conflict processing is under attentional control, we should observe a larger SRC effect during the relevant task than during the irrelevant task. Two ERP components are sensitive to the modulation of top-down attention. First, the N200 can be modulated by attention and task difficulty (Veen, Cohen, Botvinick, Stenger, & Carter, 2001). Previous studies suggest that negative ERP components can be elicited around 200 ms (N200) post-stimulus onset when the target shows a different attribute from the flankers (Heil et al., 2000; Kopp et al., 2007; Purmann, Badde, Luna-Rodriguez, & Wendt, 2011; Veen & Carter, 2002a, 2002b). The N200 is most strongly associated with the presence of conflict (Botvinick et al., 2001; Botvinick, Cohen, & Carter, 2004;

Haas et al., 2006; Weissman, Giesbrecht, Song, Mangun, & Woldorff, 2003) and has been localised to the anterior cingulate cortex (Folstein & Van Petten, 2008; Purmann et al., 2011; Veen & Carter, 2002a, 2002b). The N200 amplitude is sensitive to conflict level (Botvinick, Nystrom, Fissell, Carter, & Cohen, 1999; Forster, Carter, Cohen, & Cho, 2011; Veen & Carter, 2002b; Veen et al., 2001), and the N200 latency is associated with stimulus discrimination time (Näätänen & Picton, 1986; Ritter, Simson, Vaughan, & Friedman, 1979). Second, the central-parietal P300, a more positive-going wave in the incongruent trials than in the congruent trials, emerges by 600 ms post-stimulus onset (Coderre, Conklin, & van Heuven, 2011; Perlstein, Larson, Dotson, & Kelly, 2006; West & Alain, 1999). Previous studies suggest that the P300 amplitude can be enhanced during task-relevant conditions compared with task-irrelevant conditions (Hajcak, MacNamara, & Olvet, 2010; Squires, Donchin, Herning, & McCarthy, 1977). The P300 amplitude also reflects resource allocation during visual tracking tasks, with the amplitude being more positive during a harder task (Isreal, Chesney, Wickens, & Donchin, 1980; Johnson, 1988; Kopp et al., 2007; Luck, 2005). In the current study, if colour or gender conflict processing were modulated by top-down attention during the colour–gender task, the N200 and P300 potentials should have larger amplitudes during the colour incongruent condition than during the colour congruent condition, and colour N200 and P300 effects should be enhanced during task-relevant conflict compared with task-irrelevant conflict; the same should be the case for gender conflict. If emotional or gender conflict processing were modulated by top-down attention during the emotion–gender task, the N200 and P300 potentials should display larger amplitudes for incongruent than for congruent trials, and emotion or gender N200 and P300 effects should be more pronounced during task-relevant conflict than during task-irrelevant conflict. Second, to directly contrast the time course of emotional versus cognitive conflict processing, three time windows (200–250 ms, 250–300 ms and 300–350 ms) were selected to examine the

N200 effects for colour, emotion and gender tasks. Previous research has revealed that the neural networks involved in cognitive conflict resolution differ from those involved in the resolution of emotional conflict (Egner et al., 2008; Etkin et al., 2006; Haas et al., 2006; Ochsner et al., 2008). Other studies suggest that emotional stimuli can automatically capture attention even when presented outside the focus of attention (Vuilleumier et al., 2001; Vuilleumier & Schwartz, 2001). Thus, we predicted that the perception and resolution of emotional conflict would occur earlier than those of cognitive conflict, given the biological significance of emotional stimuli. The anterior N100 was selected to examine differences between emotional and cognitive conflicts during early stages of perceptual processing. The N100 is a negative ERP component occurring at approximately 100–150 ms post-stimulus onset (Luck, 2005). The N100 is related to focusing attention on a task-relevant stimulus prior to perceptual evaluation, reflecting different degrees of attentional allocation in order to respond to different stimuli (Johnson & Olshausen, 2003; Luck & Yard, 1995). Thus, we predicted that an emotional N100 effect would appear earlier than a colour N100 effect, and that the emotional N200 effect would appear during an earlier time window than the colour N200 effect.

METHODS

Stimuli

Stimuli were 12 faces (six females), generated with FaceGen Modeller 3.4. Three different expressions were created for each face: neutral, happy and fearful. To prevent participants from using visible teeth to identify expressions, both happy and fearful faces were depicted with an open mouth. Coloured faces were converted to grey scale in Photoshop and masked with an oval. All faces were adjusted to have equal skin luminance. The grey scale faces were then artificially painted in red and blue. Prior to the study, another 24 participants rated the grey scale faces on two 5-point scales, with 1 being most


negative/masculine and 5 being most positive/feminine. A one-way analysis of variance (ANOVA) revealed that emotional valence ratings were significantly different across the three expressions (F(1, 23) = 403.48, MSE = 0.04, p < .01). Post-hoc comparisons indicated that ratings for all three expressions were significantly different from each other. A paired t-test revealed that gender ratings were also significantly different (t(1, 23) = 16.92, MSE = 0.05, p < .01).
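These stimulus-validation statistics are simple enough to reproduce with standard tools. The following Python/SciPy sketch only illustrates the kind of analysis described (a one-way test across the three expressions and a paired t-test on the gender ratings); the variable names and placeholder data are assumptions rather than material from the study, and because the raters judged all stimuli, a repeated-measures ANOVA would match the original design even more closely.

```python
# Minimal sketch of the stimulus-rating checks (illustrative; not the authors' code).
# Assumes one mean valence rating per rater (n = 24) for each expression, and one
# mean gender rating per rater for the male and female face sets.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)                      # placeholder data for illustration
valence = {
    "fearful": rng.normal(1.8, 0.3, 24),
    "neutral": rng.normal(3.0, 0.3, 24),
    "happy":   rng.normal(4.2, 0.3, 24),
}
gender = {
    "male":   rng.normal(1.6, 0.3, 24),
    "female": rng.normal(4.3, 0.3, 24),
}

# One-way ANOVA across the three expressions (the paper used a within-rater design,
# so a repeated-measures ANOVA would be the closer analogue).
F, p = stats.f_oneway(valence["fearful"], valence["neutral"], valence["happy"])
print(f"valence: F = {F:.2f}, p = {p:.3g}")

# Paired t-test on gender ratings, since the same raters judged both face sets.
t, p = stats.ttest_rel(gender["male"], gender["female"])
print(f"gender: t = {t:.2f}, p = {p:.3g}")
```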

Participants

Seventeen healthy college students (10 women; aged 21.55 ± 1.65 years) participated in this study. All signed informed written consent forms approved by the Ethical Committee of the Institute of Psychology, Chinese Academy of Sciences. All participants were right-handed and had normal or corrected-to-normal vision. Each participant was paid an honorarium for his/her time.

Tasks and procedures

The flanker tasks were adopted from a previous study (Zhou & Liu, 2013) and modified for the present ERP experiment. All participants completed two tasks, the colour–gender task and the emotion–gender task, in a counterbalanced order. During the colour–gender task, stimuli were a row of three coloured faces displaying a neutral expression (Figure 1A). Participants' task was to identify either the colour or the gender of the central face while ignoring the surrounding distracters. The identities of the flanker and target faces were randomly selected from a pool of 12 faces (6 men and 6 women), and the selection of each identity as the target or flanker was counterbalanced. The colour and gender tasks were counterbalanced across blocks within the experiment. Based on the relationship between the target and flankers on the colour and gender dimensions, there were four different stimulus conditions: colour different/gender different, colour different/gender same, colour same/gender different and colour same/gender same.
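To make this factorial structure concrete, the short Python sketch below enumerates the four target–flanker conditions and assembles one balanced, shuffled block of 72 trials. The condition labels and trial counts come from the text; the data layout and the assumption that the four cells were equally frequent within a block are illustrative, not details reported by the authors.

```python
# Illustrative sketch of the 2 x 2 target-flanker condition structure described above.
# Condition labels follow the paper; the trial-list format is hypothetical.
import itertools
import random

TARGET_DIM = ("colour same", "colour different")   # "emotion" replaces "colour" in the emotion-gender task
GENDER_DIM = ("gender same", "gender different")

def build_block(n_trials=72, seed=None):
    """Return one block with equal numbers of trials in each of the four conditions."""
    conditions = list(itertools.product(TARGET_DIM, GENDER_DIM))   # 4 conditions
    per_condition = n_trials // len(conditions)                     # 18 trials each (assumed split)
    block = [{"target_flanker": t, "gender": g}
             for t, g in conditions for _ in range(per_condition)]
    random.Random(seed).shuffle(block)
    return block

block = build_block(seed=1)
print(len(block), block[0])        # 72 trials, e.g. {'target_flanker': 'colour same', ...}
```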


Figure 1. Experimental designs for the colour–gender task (A) and the emotion–gender task (B).

Prior to the task, participants were trained to identify the task-relevant dimension (colour or gender) of the face by pressing the left (F) or right (J) key on a keyboard. Key mappings were counterbalanced, resulting in four task procedures. Participants were randomly assigned to one of the four task procedures. In the first procedure, participants were required to identify either the colour red or a female face based on a centrally located face by pressing the left (F) key on a keyboard. Participants also determined the colour blue or a male face by pressing the right (J) key. Since both red male and blue female faces might lead to conflicting responses, these pairings were not used as the central target stimuli for the present procedure (Figure 1A). The second procedure was similar to the first, except that key mappings were counterbalanced between the left and right hands. For example, participants were required to identify either blue or male by pressing

the left (F) key and to identify either red or female by pressing the right (J) key. Red male and blue female faces were not used as the central target stimuli for this second procedure. For the third procedure, participants were required to identify either red or male for the central face by pressing the left (F) key and to identify either blue or female by pressing the right (J) key. Here, red female and blue male pairings were not used as the central target stimuli. The fourth procedure was similar to the third, except that key mappings were counterbalanced between the left and right hands. Similar to the third procedure, red female and blue male faces were not used as the central target stimuli for the fourth procedure. Post-study debriefing suggested that no participant was aware of this combination of target properties. During the task, the central face (target) was flanked by two faces (distractors). Each face

subtended a visual angle of 2.80° × 3.65° in width and height at a viewing distance of 60 cm. The visual angle between the centre of the target face and the centre of each flanker face was 3.08°. To prevent participants from adopting strategies to visually filter out the flankers, the flankers and target were randomly shifted horizontally at the same time within a visual angle of about 4.21°. All stimuli were displayed on a dark background. Following a central fixation of 150–250 ms, target and flankers were simultaneously displayed for 1000 ms. Each trial ended with a blank screen for 1250–1350 ms. Participants were instructed to respond as quickly and accurately as possible. The task session contained eight blocks, each including 72 trials. There were equal numbers of compatible trials and incompatible trials, resulting in 36 trials for each condition in one block. At the beginning of each block, participants were instructed whether to perform the colour or gender judgement, the order of which was counterbalanced across participants. The emotion–gender task procedure was similar to that of the colour–gender task, except that the emotion dimension replaced the colour dimension (Figure 1B). Stimuli were a row of three grey scale faces depicting emotional expressions. Neutral faces were not used. Thus, there were four stimulus types: emotion different/gender different, emotion different/gender same, emotion same/gender different and emotion same/gender same.
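The trial timeline just described (a jittered fixation, a 1000 ms stimulus array, and a jittered blank interval, over eight 72-trial blocks) can be summarised in a small scheduling sketch. The Python below is a schematic reconstruction from the reported durations only; it assumes no particular stimulus-presentation package and is not the authors' experiment script.

```python
# Sketch of the trial timeline described above (durations in ms, taken from the text).
# Schematic reconstruction, not the original presentation code.
import random

FIXATION_MS = (150, 250)     # central fixation, jittered
STIMULUS_MS = 1000           # target + flankers shown simultaneously
BLANK_MS    = (1250, 1350)   # inter-trial blank screen, jittered
N_BLOCKS, TRIALS_PER_BLOCK = 8, 72

def trial_schedule(seed=0):
    """Yield (block, trial, fixation_ms, blank_ms) for every trial in one task session."""
    rng = random.Random(seed)
    for block in range(N_BLOCKS):
        for trial in range(TRIALS_PER_BLOCK):
            yield (block, trial,
                   rng.uniform(*FIXATION_MS),
                   rng.uniform(*BLANK_MS))

total_trials = sum(1 for _ in trial_schedule())
print(total_trials)   # 576 trials per task session
```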

Electroencephalogram recording and analysis

The electroencephalogram (EEG) was recorded continuously with Ag–AgCl electrodes from 64 scalp locations placed according to the 10–20 system and referenced to the left mastoid. Horizontal and vertical electrooculograms (EOG; Kim et al., 2006) were recorded at the outer canthi of both eyes and above and below the right eye to monitor eye blinks and movements. Both EEG and EOG were sampled at 1000 Hz with a 0.05–100 Hz band-pass filter, using a Neuroscan NuAmps digital amplifier system (Neuroscan Labs, Sterling, VA, USA). All scalp electrodes


were referenced to the left mastoid online and re-referenced offline to the average of the left and right mastoids. The impedance of all electrodes was kept below 5 kΩ. Offline, ocular artefacts were removed using a regression procedure implemented in the Neuroscan software (Semlitsch, Anderer, Schuster, & Presslich, 1986). The EEG data were segmented into 800 ms epochs with a 100 ms pre-target baseline. Trials contaminated with artefacts exceeding ±90 µV were excluded from averaging. Across the two experiments, accepted trials were averaged for all four stimulus types. The averaged ERP waveforms were digitally low-pass filtered at 30 Hz (24 dB/octave, zero phase shift). Time windows for the N100, N200 and P300 components were identified with the following protocol. The peak latencies for all conditions were detected and the mean of these latencies was calculated. For the N100, a time window of 75–125 ms post-stimulus onset (25 ms before and after the mean peak latency) was chosen at frontocentral electrodes Fz, FCz, FC4, FC3, F4, F3, Cz, C4 and C3. For the N200, time windows of 200–250 ms, 250–300 ms and 300–350 ms post-stimulus onset were used at electrodes Fz, FCz and Cz, based on previous studies (Eimer, Holmes, & McGlone, 2003; Wang, Li, Zheng, Wang, & Liu, 2014) and the grand-averaged ERP waveforms of the current study. For the P300, a time window of 300–500 ms post-stimulus onset was chosen at electrodes CPz, Pz and POz. Separate repeated-measures ANOVAs were performed on the mean amplitudes of the N100, N200 and P300, respectively. Factors were Time Window, Type (cognitive vs. emotional), Task (attend colour or emotion vs. gender), Electrode, Gender Congruency (congruent vs. incongruent) and Target Congruency (colour or emotion congruent vs. incongruent). One participant's data were excluded due to excessive ocular artefacts and errors. RTs beyond three standard deviations were excluded from further analysis. The significance level was set at p < .05. Bonferroni corrections were used for multiple pair-wise comparisons.
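For readers who want to see these preprocessing and measurement steps in one place, the sketch below re-expresses them with MNE-Python: mastoid re-referencing, 800 ms epochs with a 100 ms pre-target baseline, rejection of trials exceeding ±90 µV, 30 Hz low-pass filtering of the averages, and mean amplitudes in the component windows. The original analysis was carried out in Neuroscan software, so the choice of MNE, the file name, the mastoid channel labels and the event codes here are assumptions made for illustration only.

```python
# Hedged reconstruction of the ERP pipeline described above, using MNE-Python.
# Parameters come from the text; file name, channel labels and event codes are assumed.
import mne

raw = mne.io.read_raw_cnt("subject01.cnt", preload=True)          # hypothetical Neuroscan recording
raw.set_eeg_reference(["M1", "M2"])                                # re-reference to averaged mastoids (assumed labels)

events, event_id = mne.events_from_annotations(raw)               # trigger codes are study-specific
epochs = mne.Epochs(
    raw, events, event_id=event_id,
    tmin=-0.1, tmax=0.7,                                           # 800 ms epochs, 100 ms pre-target baseline
    baseline=(None, 0),
    reject=dict(eeg=90e-6),                                        # drop trials exceeding +/- 90 uV
    preload=True,
)

# Average per stimulus type, then low-pass the ERPs at 30 Hz as in the paper.
evokeds = {cond: epochs[cond].average().filter(l_freq=None, h_freq=30.0)
           for cond in event_id}

def mean_amplitude(evoked, ch, tmin, tmax):
    """Mean amplitude (in microvolts) of one channel within a time window."""
    data = evoked.copy().pick([ch]).crop(tmin, tmax).data
    return data.mean() * 1e6

# Example: N200 mean amplitude at FCz in the 200-250 ms window for one condition
# (the condition label is hypothetical):
# print(mean_amplitude(evokeds["emotion_incongruent"], "FCz", 0.200, 0.250))
```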


RESULTS


Behavioural results

RTs were analysed using a 2 (Type: cognitive vs. emotional) × 2 (Task: attend colour or emotion vs. gender) × 2 (Gender Congruency: congruent vs. incongruent) × 2 (Target Congruency: colour or emotion congruent vs. incongruent) repeated-measures ANOVA. Results indicated that the interactions between Task and Target Congruency, F(1, 15) = 22.37, MSE = 136.60, p < .01, and between Type, Task and Target Congruency, F(1, 15) = 22.61, MSE = 81.93, p < .01, were significant. Post-hoc comparisons indicated that the colour SRC effect was significant when participants attended to colour and disappeared when attention was diverted

towards face gender (Figure 2A). The emotion SRC effect was significant when emotion was either task-relevant or task-irrelevant. However, the former effect was greater than the latter (Figure 2B). The interactions between Task and Gender Congruency, F(1, 15) = 32.54, MSE = 62.39, p < .01, and between Type, Task and Gender Congruency, F(1, 15) = 5.19, MSE = 69.68, p < .05, were significant. Post-hoc comparisons indicated that the gender SRC effect was significant only during the relevant task both in the colour–gender task and the emotion–gender task. The interaction between Gender Congruency and Target Congruency was not significant, F(1, 15) = 0.75, MSE = 121.22, n.s., nor was the interaction between Type, Task, Gender Congruency and Target Congruency, F(1, 15) = 0.55, MSE =

Figure 2. RTs and accuracy for the colour–gender task (A) and the emotion–gender task (B). GDCD: colour different/gender different; GSCD: colour different/gender same; GDCS: colour same/gender different; GSCS: colour same/gender same. GDED: emotion different/gender different; GSED: emotion different/gender same; GDES: emotion same/gender different; GSES: emotion same/gender same. Error bars reflect standard error of the mean.


109.36, n.s. Results also revealed significant main effects of Type, F(1, 15) = 53.90, MSE = 5169.43, p < .01, Gender Congruency, F(1, 15) = 23.98, MSE = 121.83, p < .01 and Target Congruency, F(1, 15) = 35.61, MSE = 159.66, p < .01. The main effect of Task approached significance, F(1, 15) = 3.78, MSE = 2096.86, p = .07. Incongruence between the target and flankers (regardless of colour, emotion or gender) slowed performance. All other interactions were nonsignificant. An analysis of the corresponding accuracy rates revealed that interactions between Task and Target Congruency, F(1, 15) = 7.64, MSE = 0.00, p < .05, and between Type, Task and Target Congruency, F(1, 15) = 5.21, MSE = 0.00, p < .05, were significant. Post-hoc comparisons indicated that accuracy rates were lower for incongruent conditions than for congruent conditions (regardless of colour or emotion) only when the stimuli were task-relevant (Figure 2). The main effects of Gender Congruency, F(1, 15) = 10.31, MSE = 0.00, p < .01, and Target Congruency, F(1, 15) = 7.85, MSE = 0.00, p < .05, were significant. Interactions between Task and Gender Congruency, F(1, 15) = 1.20, MSE = 0.00, n.s., and between Type, Task and Gender Congruency, F(1, 15) = 0.62, MSE = 0.00, n.s., were not significant. Accuracy rates were lower for gender incongruent conditions than for gender congruent conditions when gender was either task-relevant or task-irrelevant during both experiments. All other main effects and interactions were non-significant. In addition, a paired t-test revealed that RTs when attending towards facial emotion were significantly longer than RTs when attending towards facial colour, t(1, 15) = 12.09, MSE = 0.15, p < .01.
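As a sketch of how this behavioural analysis could be implemented, the Python code below trims RTs beyond three standard deviations and then runs the 2 × 2 × 2 × 2 repeated-measures ANOVA with statsmodels. The column names, the input file, and the per-participant trimming are assumptions; the paper does not report which analysis software was used.

```python
# Minimal sketch of the behavioural analysis described above (3-SD trimming followed by
# a 2 x 2 x 2 x 2 repeated-measures ANOVA). Column names and file layout are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

trials = pd.read_csv("trial_level_rts.csv")   # hypothetical file: one row per correct trial

def trim_3sd(df):
    """Drop trials whose RT lies more than 3 SDs from that participant's mean (assumed criterion)."""
    m, s = df["rt"].mean(), df["rt"].std()
    return df[(df["rt"] - m).abs() <= 3 * s]

clean = trials.groupby("subject", group_keys=False).apply(trim_3sd)

anova = AnovaRM(
    clean,
    depvar="rt",
    subject="subject",
    within=["type", "task", "gender_congruency", "target_congruency"],
    aggregate_func="mean",            # average trials within each design cell first
).fit()
print(anova.anova_table)
```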

ERP patterns

N100

Results for the N100 component revealed a three-way interaction between Type, Task and Target Congruency, F(1, 15) = 6.65, MSE = 9.25, p < .05, and a four-way interaction between Type, Task, Electrode and Target Congruency, F(8,


120) = 2.19, MSE = 0.15, p < .05. Post-hoc comparisons indicated that the emotion N100 wave was more negative for emotion incongruent compared with emotion congruent trials at five electrodes (Fz, FCz, Cz, F4 and F3) only when emotion was attended (Figure 3); colour incongruent N100 waves did not differ from colour congruent waves in either the task-relevant or the task-irrelevant condition. Neither the interaction between Task and Gender Congruency, F(1, 15) = 0.69, MSE = 9.26, n.s., nor the interaction between Type, Task, Electrode and Gender Congruency, F(8, 120) = 0.93, MSE = 0.25, n.s., was significant. The three-way interactions between Type, Electrode and Gender Congruency, F(8, 120) = 0.81, MSE = 0.10, n.s., and between Task, Electrode and Gender Congruency, F(8, 120) = 1.12, MSE = 0.17, n.s., were not significant. Gender incongruent N100 waves did not differ from gender congruent waves when gender was either task-relevant or task-irrelevant in both experiments. The main effects of Gender Congruency, F(1, 15) = 0.00, MSE = 8.93, n.s., and Target Congruency, F(1, 15) = 1.09, MSE = 3.75, n.s., were not significant. The main effect of Electrode was significant, F(8, 120) = 38.36, MSE = 8.96, p < .01. The mean amplitude of the N100 appeared largest at the FCz electrode in both experiments. All other main effects and interactions were non-significant.

N200

Results revealed a four-way interaction between Window, Task, Electrode and Target Congruency, F(4, 60) = 8.68, MSE = 0.10, p < .01, and a five-way interaction between Window, Type, Task, Electrode and Target Congruency, F(4, 60) = 2.73, MSE = 0.14, p < .05. Post-hoc comparisons indicated that the colour incongruent condition elicited a more negative ERP deflection (N200) than did the colour congruent condition only when colour was task-irrelevant in the 200–250 ms time window, which is inconsistent with previous studies. Colour N200 effects were significantly larger during task-relevant conflict than during task-irrelevant conflict at all three


Figure 3. The grand-averaged N100 waveforms in response to emotion different/gender different (GDED), emotion different/gender same (GSED), emotion same/gender different (GDES) and emotion same/gender same (GSES) conditions during the emotion–gender task at electrodes (Fz, Cz, F4 and F3). The ERPs appeared only when emotion was attended to.


electrodes (Fz, FCz and Cz) in the 250–300 ms time window. Colour N200 effects occurred at electrodes Fz, FCz and Cz only when colour was task-relevant in the 300–350 ms time window. Emotion N200 effects occurred at electrodes Fz, FCz and Cz only when emotion was task-relevant in all three time windows (Figure 4). The fourway interactions between Window, Type, Task and Gender Congruency, F(2, 30) = 3.94, MSE = 1.33, p < .05, and between Window, Type, Electrode and Gender Congruency, F(4, 60) = 2.45, MSE = 0.11, p < .05 were significant. Posthoc comparisons indicated that the gender N200 effect was not significant for two experiments in the 200–250 ms time window; in the 250–300 ms time window, the gender N200 effect was significant at all three electrodes (Fz, FCz and Cz) during the colour–gender task and at two electrodes (Fz and Cz) during the emotion–gender task only when gender was attended to. In the 300– 350 ms time window, the gender N200 effect was not significant during the colour–gender task and was significant at all three electrodes (Fz, FCz and Cz) during the emotion–gender task only when gender was attended to. Neither a four-way interaction between Type, Task, Electrode and Gender Congruency, F(4, 60) = 0.20, MSE = 0.44, n.s., nor a five-way interaction between Window, Type, Task, Electrode and Gender Congruency, F(4, 60) = 0.77, MSE = 0.07, n.s., was significant. A four-way interaction between Window, Type, Task and Electrode reached statistical significance, F(4, 60) = 8. 68, MSE = 0.10, p < .01. The main effect of Electrode was significant, F(2, 30) = 3.85, MSE = 38.24, p < .05. Post-hoc comparisons indicated that the N200 amplitude appeared to be largest at the Cz electrode during both experiments for the 200– 250 ms time window. The N200 amplitude appeared to be largest at the Fz and FCz electrodes for the colour–gender task and the emotion–gender task, respectively, during the 250–300 ms time window. The N200 amplitude appeared to be the largest at the FCz electrode for both experiments during the 300–350 ms time window. Incongruence between the target and flankers, regardless of colour, gender or emotion,


enhanced the N200 amplitude, as revealed by main effects of Gender Congruency, F(1, 15) = 7.24, MSE = 23.95, p < .05, and Target Congruency, F(1, 15) = 15.86, MSE = 18.11, p < .01.

P300

A three-way interaction between Task, Electrode and Target Congruency was significant, F(2, 30) = 10.69, MSE = 0.05, p < .01. The main effect of Type was not significant, F(1, 15) = 0.32, MSE = 77.73, n.s. Post-hoc comparisons indicated that the colour P300 effect occurred at electrode CPz only when colour was task-relevant; similarly, incongruence between the target and flankers for emotion enhanced the P300 amplitude only during the task-relevant condition at electrode CPz. These results indicate that more attentional resources were allocated to resolve colour or emotional conflict in task-relevant conditions than in task-irrelevant conditions. The two-way interaction between Type and Gender Congruency, F(1, 15) = 0.00, MSE = 2.32, n.s., was not significant, nor was the interaction between Task and Gender Congruency, F(1, 15) = 0.40, MSE = 3.44, n.s. The four-way interaction between Type, Task, Electrode and Gender Congruency was not significant, F(2, 30) = 1.22, MSE = 0.09, n.s.; thus, the gender P300 effect was not significant whether gender was task-relevant or task-irrelevant, in both experiments. The main effect of Task approached significance, F(1, 15) = 3.98, MSE = 17.81, p = .06. The main effects of Gender Congruency, F(1, 15) = 1.49, MSE = 3.14, n.s., and Target Congruency, F(1, 15) = 0.69, MSE = 7.52, n.s., were not significant. The P300 amplitude appeared to be largest at the CPz electrode in both experiments, as revealed by a main effect of Electrode, F(2, 30) = 26.47, MSE = 64.63, p < .01, and a three-way interaction between Type, Task and Electrode, F(4, 60) = 6.16, MSE = 0.85, p < .01.
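To connect these component effects to the difference-wave topographies in Figure 4, the following sketch shows how incongruent-minus-congruent (SRC) difference waves and their window-mean amplitudes could be computed from the condition averages, again using MNE-Python. It assumes the `evokeds` dictionary from the earlier preprocessing sketch and hypothetical condition labels; it is an illustration, not the authors' analysis code.

```python
# Sketch: congruency (SRC) difference waves and their mean amplitudes in the three
# N200 windows, plus a difference topography comparable to Figure 4.
# Condition labels are hypothetical; `evokeds` comes from the preprocessing sketch above.
import mne

def src_effect(evokeds, incongruent, congruent):
    """Incongruent-minus-congruent difference wave for one task."""
    return mne.combine_evoked([evokeds[incongruent], evokeds[congruent]],
                              weights=[1, -1])

emotion_diff = src_effect(evokeds, "emotion_incongruent", "emotion_congruent")

# Mean amplitude of the difference wave at FCz in each N200 window (values in uV).
for tmin, tmax in [(0.200, 0.250), (0.250, 0.300), (0.300, 0.350)]:
    amp = emotion_diff.copy().pick(["FCz"]).crop(tmin, tmax).data.mean() * 1e6
    print(f"{int(tmin * 1000)}-{int(tmax * 1000)} ms: {amp:.2f} uV")

# Topography of the difference wave near its peak latency (cf. Figure 4).
emotion_diff.plot_topomap(times=[0.258])
```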

DISCUSSION

The current study investigated whether emotional conflict processing is modulated by top-down


Figure 4. The grand-averaged N200 waveforms at midline electrode FCz and topographic maps of scalp voltage. (A) ERP waves in response to colour different/gender different (GDCD), colour different/gender same (GSCD), colour same/gender different (GDCS) and colour same/gender same (GSCS) conditions during the colour–gender task and topographic maps of scalp voltage at 294 ms and 286 ms, obtained by subtracting two congruent conditions from two incongruent conditions at peak differences post-stimulus onset (shown separately for relevant tasks). (B) ERP waves in response to emotion different/gender different (GDED), emotion different/gender same (GSED), emotion same/gender different (GDES) and emotion same/gender same (GSES) conditions during the emotion–gender task and topographic maps of scalp voltage at 258 ms and 320 ms for the relevant tasks, separately.


attention and directly contrasted the time course associated with emotional versus cognitive conflict processing. Behavioural data indicated that response times for the emotion congruent condition were faster than for the emotion incongruent condition, while accuracy rates for the congruent condition were higher than for the incongruent condition, indicating a robust emotional conflict effect. This effect was enhanced during the relevant task as compared with the irrelevant task. ERP analyses revealed a significant emotion N200 effect at midline electrodes only when emotion was attended. Furthermore, colour and gender N200 effects, an emotion P300 effect (a more positive wave for incongruent than for congruent trials), and colour and gender P300 effects were observed during task-relevant conflict. These results indicate that emotional conflict processing, like colour and gender conflict processing, was modulated by top-down attention. However, the emotion N200 effect appeared 50 ms earlier than either the colour or the gender N200 effect. Furthermore, the emotion N100 effect was significant at 75–125 ms post-stimulus onset, whereas neither the colour nor the gender N100 effect was significant. These results suggest that emotional conflict processing appears to be under top-down control but can be identified rather early. The current findings reveal that top-down attention gates the processing of emotional conflict, as reflected by emotion N200 and P300 effects during task-relevant conflict. This replicates the primary effect observed in our previous study (Zhou & Liu, 2013) and is in line with studies showing that attention towards a stimulus attribute increases neural activity within areas specialised for processing that particular attribute (Corbetta, Miezin, Dobmeyer, Shulman, & Petersen, 1991; Eriksen & Hoffman, 1974; Moran & Desimone, 1985; Nobre, Allison, & McCarthy, 1998; Treue & Maunsell, 1996; Wang et al., 2001). Desimone and Duncan suggested that multiple stimuli processed simultaneously compete with each other for limited resources, and they proposed the biased competition model of attention to explain this


phenomenon (Desimone & Duncan, 1995; Pessoa, Kastner, & Ungerleider, 2003; Reynolds, Chelazzi, & Desimone, 1999; Reynolds & Desimone, 2003). This model suggests that competition is biased to a stimulus in two ways. One is referenced in the present study: attentional top-down feedback mechanisms that facilitate stimulus processing emerge when attention is present. The stimulus could win competition and receive preferential processing in spite of limited resources. Previous studies have proposed that the biased competition model is suitable for spatial location (Semlitsch et al., 1986), stimulus attributes (Corbetta et al., 1991; Liu, Stevens, & Carrasco, 2007), emotion (Pessoa, Kastner, & Ungerleider, 2002), as well as cognitive conflict processing (Wang et al., 2001). However, our data extend this theory to emotional conflict processing. This suggests that emotional conflict could be resolved adequately with enough attention during a task-relevant condition but not when attention is diverted away from an emotion. The present study directly contrasts the time course associated with emotional versus cognitive conflict processing for the first time. Our data revealed that the emotion N200 effect emerged 200 ms post-stimulus onset. However, colour and gender N200 effects appeared 250 ms poststimulus onset. These results suggest that emotional conflict can be detected earlier compared to cognitive conflict. This is in line with other components of emotion processing that receive preferential processing through biased attention allocation (Eastwood, Smilek, & Merikle, 2001; Fox, 2002; Pourtois, Grandjean, Sander, & Vuilleumier, 2004; Vuilleumier, 2005; Zhu et al., 2010). However, compared to colour conflict processing, emotional conflict processing slows performance (reflected by longer RTs). This suggests that emotional conflict resolution is more difficult than cognitive conflict resolution. Humans are likely more cautious at communicating important social information, especially the potential benefits or dangers based on whether emotional conflict could be resolved efficiently. Dong et al. (2009) found that the N200 effect appears 300–400 ms post-stimulus onset, which is


a little later than what we observed in the present study (Dong et al., 2009). This discrepancy might be due to the target being flanked by four distracters in Dong et al.'s (2009) study as opposed to two distracters in the current study. This is in line with previous studies showing that N200 latency is delayed with increased task difficulty (Näätänen & Picton, 1986; Ritter et al., 1979). In the 200–250 ms time window, results showed that the emotion congruency effect was significant during task-relevant conflict and disappeared during task-irrelevant conflict, indicating that emotional conflict processing was modulated by top-down attention. In the same time window, the N200 amplitude was more negative for colour incongruent compared with colour congruent trials, but only when colour was task-irrelevant. This contrasts with previous studies (Anllo-Vento & Hillyard, 1996; Hillyard & Münte, 1984), which found that colour processing is modulated by top-down attention. Our previous study also showed that colour conflict processing was under top-down attentional control (Zhou & Liu, 2013). Slowed performance for colour incongruent trials during task-irrelevant conflict might reflect early perceptual processing of colour rather than colour conflict. This is in line with Bindra et al.'s study showing that RTs for "different" judgements were significantly longer than for "same" judgements made with high discrimination difficulty (Bindra, Donderi, & Nishisato, 1968). During the colour–gender task, discrimination difficulty was higher in the task-irrelevant condition than in the task-relevant condition; thus, RTs for different judgements increased compared with same judgements. Therefore, we conclude that emotion N200 effects appeared 50 ms earlier than cognitive N200 effects. Our data also revealed that the anterior N100 amplitude difference was enlarged for emotional incongruent compared with emotional congruent trials, but not for colour or gender trials. Previous studies found that fearful faces could be discriminated from neutral faces at 100 ms post-stimulus onset (Eimer & Holmes, 2002; Palermo & Rhodes, 2007). However, fearful faces were discriminated from happy faces only after 300 ms post-

stimulus onset (Williams, Palmer, Liddell, Song, & Gordon, 2006). Thus, the emotion N100 effect in the present study indicates that differences in emotional valence, not emotional conflict, could be identified at an early stage of perceptual processing based on coarse aspects of the input. The fronto-centrally distributed N100 in the current study is in line with the hypothesis that the orbitofrontal cortex serves as a rapid detector and predictor of emotional information on the basis of rough input (Luo, Feng, He, Wang, & Luo, 2010; Rolls, 2004). The present results suggest that visual search during the emotional flanker task first moved from an overview of the whole display to the specific target, after which participants focused their attention on the target for valence identification. These results are in line with the feature integration theory of visual search (Treisman, 1991). However, previous studies observed that an emotional incongruent condition could evoke a more negative N170 amplitude than an emotional congruent condition, indicating that emotional conflict can be identified at an early perceptual processing stage (Scott, O'Donnell, Leuthold, & Sereno, 2009; Zhu et al., 2010), which contrasts with the current study. This disparity could arise because emotional meaning from words is combined with information from facial expressions more rapidly in a face–word Stroop task than information from surrounding facial expressions is integrated in the flanker task used in the current study. In conclusion, our data revealed that emotional conflict processing was modulated by top-down attention, similar to cognitive conflict processing; however, emotional conflict processing occurs earlier. Additionally, emotional conflict processing slows performance. These findings suggest that humans are more careful when facing emotional conflict than when facing cognitive conflict. Only when emotional conflict is identified quickly and resolved cautiously can behavioural harmony be maintained when time is limited.

Acknowledgement

The authors would like to thank Yuzhong Wu, Yamei Huang and Lei Liu for their valuable suggestions on a revision of this manuscript.


Disclosure statement

No potential conflict of interest was reported by the authors.

Funding


This work was supported by the National Key Technologies R&D Program of China [grant number 2012BAI36B01], the CAS/SAFEA International Partnership Program for Creative Research Teams [grant number Y2CX131003], and the National Natural Science Foundation of China [grant numbers 31070987 and 31271194].

ORCID

Xun Liu http://orcid.org/0000-0003-1366-8926

REFERENCES Alguacil, S., Tudela, P., & Ruz, M. (2013). Cognitive and affective control in a flanker word task: Common and dissociable brain mechanisms. Neuropsychologia, 51, 1663–1672. doi:10.1016/j.neurop sychologia.2013.05.020 Anderson, A. K. (2005). Affective influences on the attentional dynamics supporting awareness. Journal of Experimental Psychology: General, 134, 258–281. doi:10.1037/0096-3445.134.2.258 Anllo-Vento, L., & Hillyard, S. A. (1996). Selective attention to the color and direction of moving stimuli: Electrophysiological correlates of hierarchical feature selection. Attention, Perception, & Psychophysics, 58(2), 191–206. doi:10.3758/BF03211875 Beall, P. M., & Herbert, A. M. (2008). The face wins: Stronger automatic processing of affect in facial expressions than words in a modified Stroop task. Cognition & Emotion, 22, 1613–1642. doi:10.1080/ 02699930801940370 Bindra, D., Donderi, D. C., & Nishisato, S. (1968). Decision latencies of “same” and “different” judgments. Attention, Perception, & Psychophysics, 3(2), 121–136. doi:10.3758/BF03212780 Botvinick, M., Braver, T. S., Barch, D. M., Carter, C. S., & Cohen, J. D. (2001). Conflict monitoring and cognitive control. Psychological Review, 108, 624.


Botvinick, M., Cohen, J. D., & Carter, C. S. (2004). Conflict monitoring and anterior cingulate cortex: An update. Trends in Cognitive Sciences, 8, 539–546. doi:10.1016/j.tics.2004.10.003 Botvinick, M., Nystrom, L. E., Fissell, K., Carter, C. S., & Cohen, J. D. (1999). Conflict monitoring versus selection-for-action in anterior cingulate cortex. Nature, 402, 179–181. doi:10.1038/46035 Chen, T. L., Kendrick, K. M., Feng, C. L., Yang, S. Y., Wang, X. G., Yang, X., … Luo, Y. J. (2014). Opposite effect of conflict context modulation on neural mechanisms of cognitive and affective control. Psychophysiology, 51, 478–488. doi:10.1111/psyp.12165 Coderre, E., Conklin, K., & van Heuven, W. J. B. (2011). Electrophysiological measures of conflict detection and resolution in the Stroop task. Brain Research, 1413, 51–59. doi:10.1016/j.brainres.2011. 07.017 Corbetta, M., Miezin, F. M., Dobmeyer, S., Shulman, G. L., & Petersen, S. E. (1991). Selective and divided attention during visual discriminations of shape, color, and speed: Functional anatomy by positron emission tomography. The Journal of Neuroscience, 11, 2383–2402. Desimone, R., & Duncan, J. (1995). Neural mechanisms of selective visual attention. Annual Review of Neuroscience, 18(1), 193–222. doi:10.1146/annurev. ne.18.030195.001205 Dong, G., Yang, L., & Shen, Y. (2009). The course of visual searching to a target in a fixed location: Electrophysiological evidence from an emotional flanker task. Neuroscience Letters, 460(1), 1–5. doi:10.1016/ j.neulet.2009.05.025 Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Attention, Perception, & Psychophysics, 63, 1004–1013. doi:10.3758/BF03194519 Egner, T., Etkin, A., Gale, S., & Hirsch, J. (2008). Dissociable neural systems resolve conflict from emotional versus nonemotional distracters. Cerebral Cortex, 18, 1475–1484. doi:10.1093/cercor/bhm179 Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. Neuroreport, 13, 427–431. doi:10.1097/00001756-20020 3250-00013 Eimer, M., Holmes, A., & McGlone, F. P. (2003). The role of spatial attention in the processing of facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive, Affective, & Behavioral Neuroscience, 3(2), 97–110. doi:10.3758/CABN.3.2.97


Eriksen, B. A., & Eriksen, C. W. (1974). Effects of noise letters upon the identification of a target letter in a nonsearch task. Attention, Perception, & Psychophysics, 16(1), 143–149. doi:10.3758/BF03203267 Eriksen, C. W., & Hoffman, J. E. (1974). Selective attention: Noise suppression or signal enhancement? Bulletin of the Psychonomic Society, 4, 587– 589. doi:10.3758/BF03334301 Etkin, A., Egner, T., Peraza, D. M., Kandel, E. R., & Hirsch, J. (2006). Resolving emotional conflict: A role for the rostral anterior cingulate cortex in modulating activity in the amygdala. Neuron, 51, 871–882. doi:10.1016/j.neuron.2006.07.029 Fan, J., Kolster, R., Ghajar, J., Suh, M., Knight, R. T., Sarkar, R., & McCandliss, B. D. (2007). Response anticipation and response conflict: An event-related potential and functional magnetic resonance imaging study. The Journal of Neuroscience, 27, 2272–2282. doi:10.1523/JNEUROSCI.3470-06.2007 Folstein, J. R., & Van Petten, C. (2008). Influence of cognitive control and mismatch on the N2 component of the ERP: A review. Psychophysiology, 45(1), 152–170. Forster, S. E., Carter, C. S., Cohen, J. D., & Cho, R. Y. (2011). Parametric manipulation of the conflict signal and control-state adaptation. Journal of Cognitive Neuroscience, 23, 923–935. doi:10.1037/0033-295X. 111.4.931 Fox, E. (2002). Processing emotional facial expressions: The role of anxiety and awareness. Cognitive, Affective, & Behavioral Neuroscience, 2(1), 52–63. doi:10.3758/ CABN.2.1.52 Haas, B. W., Omura, K., Constable, R. T., & Canli, T. (2006). Interference produced by emotional conflict associated with anterior cingulate activation. Cognitive, Affective, & Behavioral Neuroscience, 6(2), 152– 156. doi:10.3758/CABN.6.2.152 Hajcak, G., MacNamara, A., & Olvet, D. M. (2010). Event-related potentials, emotion, and emotion regulation: An integrative review. Developmental Neuropsychology, 35(2), 129–155. doi:10.1080/8756 5640903526504 Heil, M., Osman, A., Wiegelmann, J., Rolke, B., & Hennighausen, E. (2000). N200 in the Eriksentask: Inhibitory executive processes? Journal of Psychophysiology, 14(4), 218–225. doi:10.1027// 0269-8803.14.4.218 Hillyard, S. A., & Münte, T. F. (1984). Selective attention to color and location: An analysis with event-related brain potentials. Attention, Perception, & Psychophysics, 36(2), 185–198. doi:10.3758/BF03202679

Horstmann, G., Borgstedt, K., & Heumann, M. (2006). Flanker effects with faces may depend on perceptual as well as emotional differences. Emotion, 6(1), 28–39. doi:10.1037/1528-3542.6.1.28 Isreal, J. B., Chesney, G. L., Wickens, C. D., & Donchin, E. (1980). P300 and tracking difficulty: Evidence for multiple resources in dual‐task performance. Psychophysiology, 17, 259–273. doi:10.1111/j.1469-8986.198 0.tb00146.x Johnson, R. (1988). The amplitude of the P300 component of the event-related potential: Review and synthesis. Advances in Psychophysiology, 3, 69–137. Johnson, J. S., & Olshausen, B. A. (2003). Time course of neural signatures of object recognition. Journal of Vision, 3(7), 4. doi:10.1167/3.7.4 Kim, S. J., Lyoo, I. K., Hwang, J., Chung, A., Hoon Sung, Y., Kim, J., … Renshaw, P. F. (2006). Prefrontal grey-matter changes in short-term and long-term abstinent methamphetamine abusers. The International Journal of Neuropsychopharmacology, 9(02), 221–228. doi:10.1017/S1461145705005699 Kimura, M., Katayama, J., & Murohashi, H. (2005). Neural correlates of preattentive and attentive processing of visual changes. Neuroreport, 16, 2061– 2064. doi:10.1097/00001756-200512190-00019 Kopp, B., Rist, F., & Mattler, U. (2007). N200 in the flanker task as a neurobehavioral tool for investigating executive control. Psychophysiology, 33, 282–294. doi:10.1111/j.1469-8986.1996.tb00425.x Lichtenstein-Vidne, L., Henik, A., & Safadi, Z. (2012). Task relevance modulates processing of distracting emotional stimuli. Cognition & Emotion, 26(1), 42–52. doi:10.1080/02699931.2011.567055 Liu, T., Stevens, S. T., & Carrasco, M. (2007). Comparing the time course and efficacy of spatial and feature-based attention. Vision Research, 47(1), 108–113. doi:10.1016/j.visres.2006.09.017 Liu, T. R., Xiao, T., & Shi, J. N. (2013). Neural correlates of conflict control on facial expressions with a flanker paradigm. PLoS ONE, 8(7), e69683. doi:10.1371/journal.pone.0069683 Luck, S. J. (2005). An introduction to the event-related potential technique. Cambridge, MA: MIT Press. Luck, S. J., & Yard, S. A. H. (1995). The role of attention in feature detection and conjunction discrimination: An electrophysiological analysis. International Journal of Neuroscience, 80, 281–297. doi:10.3109/00207459508986105 Luo, W., Feng, W., He, W., Wang, N.-Y., & Luo, Y.-J. (2010). Three stages of facial expression processing: ERP study with rapid serial visual presentation. COGNITION AND EMOTION, 2015


Neuroimage, 49, 1857–1867. doi:10.1016/j.neuroimage.2009.09.018 Mao, W., & Wang, Y. (2008). The active inhibition for the processing of visual irrelevant conflict information. International Journal of Psychophysiology, 67(1), 47–53. doi:10.1016/j.ijpsycho.2007.10.003 Mattler, U. (2006). Distance and ratio effects in the flanker task are due to different mechanisms. The Quarterly Journal of Experimental Psychology, 59, 1745–1763. doi:10.1080/17470210500344494 Moors, A., & De Houwer, J. (2006). Automaticity: A theoretical and conceptual analysis. Psychological Bulletin, 132, 297. Moran, J., & Desimone, R. (1985). Selective attention gates visual processing in the extrastriate cortex. Frontiers in Cognitive Neuroscience, 229, 342–345. Näätänen, R., & Picton, T. (1986). N2 and automatic versus controlled processes. Electroencephalography and Clinical Neurophysiology, 38, 169–186. Nobre, A. C., Allison, T., & McCarthy, G. (1998). Modulation of human extrastriate visual processing by selective attention to colours and words. Brain, 121, 1357–1368. doi:10.1093/brain/121.7.1357 Ochsner, K. N., Hughes, B., Robertson, E. R., Cooper, J. C., & Gabrieli, J. D. E. (2008). Neural systems supporting the control of affective and cognitive conflicts. Journal of Cognitive Neuroscience, 21, 1841–1854. doi:10.1080/17470910701401973 Palermo, R., & Rhodes, G. (2007). Are you always on my mind? A review of how face perception and attention interact. Neuropsychologia, 45(1), 75–92. doi:10.1016/j.neuropsychologia.2006.04.025 Perlstein, W. M., Larson, M. J., Dotson, V. M., & Kelly, K. G. (2006). Temporal dissociation of components of cognitive control dysfunction in severe TBI: ERPs and the cued-Stroop task. Neuropsychologia, 44, 260–274. doi:10.1016/j.neuropsychologia.2005.05.009 Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15(1), 31–45. doi:10.1016/S0926-6410(02)00214-8 Pessoa, L., Kastner, S., & Ungerleider, L. G. (2003). Neuroimaging studies of attention: From modulation of sensory processing to top-down control. The Journal of Neuroscience, 23, 3990–3998. Pourtois, G., Grandjean, D., Sander, D., & Vuilleumier, P. (2004). Electrophysiological correlates of rapid spatial orienting towards fearful faces. Cerebral Cortex, 14, 619–633. doi:10.1093/cercor/bhh023


Purmann, S., Badde, S., Luna-Rodriguez, A., & Wendt, M. (2011). Adaptation to frequent conflict in the Eriksen flanker task. Journal of Psychophysiology, 25(2), 50–59. doi:10.1027/0269-8803/a000041 Reeck, C., & Egner, T. (2011). Affective privilege: Asymmetric interference by emotional distracters. Front Psychology, 2(232), 1–7. Reynolds, J. H., Chelazzi, L., & Desimone, R. (1999). Competitive mechanisms subserve attention in macaque areas V2 and V4. The Journal of Neuroscience, 19, 1736–1753. Reynolds, J. H., & Desimone, R. (2003). Interacting roles of attention and visual salience in V4. Neuron, 37, 853–863. doi:10.1016/S0896-6273(03)00097-7 Ritter, W., Simson, R., Vaughan, H., & Friedman, D. (1979). A brain event related to the making of a sensory discrimination. Science, 203, 1358–1361. doi:10.1126/science.424760 Rolls, E. T. (2004). The functions of the orbitofrontal cortex. Brain and Cognition, 55(1), 11–29. doi:10.1016/S0278-2626(03)00277-X Schirmer, A., & Kotz, S. A. (2003). ERP evidence for a sex-specific Stroop effect in emotional speech. Journal of Cognitive Neuroscience, 15, 1135–1148. doi:10.1080/ 14640747708400601 Scott, G. G., O’Donnell, P. J., Leuthold, H., & Sereno, S. C. (2009). Early emotion word processing: Evidence from event-related potentials. Biological Psychology, 80(1), 95–104. doi:10.1016/j.biopsycho.2008.03.010 Semlitsch, H. V., Anderer, P., Schuster, P., & Presslich, O. (1986). A solution for reliable and valid reduction of ocular artifacts, applied to the P300 ERP. Psychophysiology, 23, 695–703. doi:10.1111/j.14698986.1986.tb00696.x Squires, K. C., Donchin, E., Herning, R. I., & McCarthy, G. (1977). On the influence of task relevance and stimulus probability on event-relatedpotential components. Electroencephalography and Clinical Neurophysiology, 42(1), 1–14. doi:10.1016/ 0013-4694(77)90146-8 Stenberg, G., Wiking, S., & Dahl, M. (1998). Judging words at face value: Interference in a word processing task reveals automatic processing of affective facial expressions. Cognition & Emotion, 12, 755– 782. doi:10.1080/026999398379420 Tillman, C. M., & Wiens, S. (2011). Behavioral and ERP indices of response conflict in Stroop and flanker tasks. Psychophysiology, 48, 1405–1411. doi:10.1111/j.1469-8986.2011.01203.x


Treisman, A. (1991). Search, similarity, and integration of features between and within dimensions. Journal of Experimental Psychology: Human Perception and Performance, 17, 652–676. doi:10.1037/0096-1523.17. 3.652 Treue, S., & Maunsell, R. H. J. (1996). Attentional modulation of visual motion processing in cortical areas MT and MST. Nature, 382, 539–541. doi:10.1038/ 382539a0 Veen, V., & Carter, C. S. (2002a). The anterior cingulate as a conflict monitor: fMRI and ERP studies. Physiology and Behavior, 77, 477–482. doi:10.1016/S00319384(02)00930-7 Veen, V., & Carter, C. S. (2002b). The timing of action-monitoring processes in the anterior cingulate cortex. Journal of Cognitive Neuroscience, 14, 593– 602. doi:10.1006/cogp.1998.0703 Veen, V., Cohen, J. D., Botvinick, M. M., Stenger, V. A., & Carter, C. S. (2001). Anterior cingulate cortex, conflict monitoring, and levels of processing. Neuroimage, 14, 1302–1308. doi:10.1006/nimg.2001.0923 Vuilleumier, P. (2005). How brains beware: Neural mechanisms of emotional attention. Trends in Cognitive Sciences, 9, 585–594. doi:10.1016/j.tics.2005.10.011 Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829–841. doi:10.1016/ S0896-6273(01)00328-2 Vuilleumier, P., & Schwartz, S. (2001). Emotional facial expressions capture attention. Neurology, 56(2), 153– 158. doi:10.1212/WNL.56.2.153 Wang, H., Wang, Y., Kong, J., Cui, L., & Tian, S. (2001). Enhancement of conflict processing activity in human brain under task relevant condition. Neuroscience Letters, 298(3), 155–158. doi:10.1016/ S0304-3940(00)01757-2

Wang, K., Li, Q., Zheng, Y., Wang, H., & Liu, X. (2014). Temporal and spectral profiles of stimulus– stimulus and stimulus–response conflict processing. Neuroimage, 89, 280–288. doi:10.1016/j.neuroimage. 2013.11.045 Wang, Y., Wang, H., Cui, L., Tian, S., & Zhang, Y. (2002). The N270 component of the event-related potential reflects supramodal conflict processing in humans. Neuroscience Letters, 332(1), 25–28. doi:10.1016/S0304-3940(02)00906-0 Weissman, D. H., Giesbrecht, B., Song, A. W., Mangun, G. R., & Woldorff, M. G. (2003). Conflict monitoring in the human anterior cingulate cortex during selective attention to global and local object features. Neuroimage, 19, 1361–1368. doi:10.1016/ S1053-8119(03)00167-8 West, R., & Alain, C. (1999). Event-related neural activity associated with the Stroop task. Cognitive Brain Research, 8(2), 157–164. doi:10.1016/S09266410(99)00017-8 Williams, L. M., Palmer, D., Liddell, B. J., Song, L., & Gordon, E. (2006). The ‘when’ and ‘where’ of perceiving signals of threat versus non-threat. Neuroimage, 31(1), 458–467. doi:10.1016/j.neuroimage.2005.12.009 Xue, S., & Qiu, J. (2013). Neural time course of emotional conflict control: An ERP study. Neuroscience Letters, 541, 34–38. doi:10.1016/j.neulet. 2013.02.032 Zhou, P., & Liu, X. (2013). Attentional modulation of emotional conflict processing with flanker tasks. PLoS ONE, 8(3), e60548. doi:10.1371/journal.pone.0060548 Zhu, X., Zhang, H., Wu, T., Luo, W., & Luo, Y. (2010). Emotional conflict occurs at an early stage: Evidence from the emotional face–word Stroop task. Neuroscience Letters, 478(1), 1–4. doi:10.1016/j.neulet.2010.04.036

