Emotional intelligence (EI) continues to be a hot topic in the I/O and HR communities. Some are big fans, some (including me) are more skeptical.
Last year's article in IOP by Cherniss et al. and the accompanying commentaries provided a great overview of the current situation. The bottom line: while many seem to agree on a definition of EI, measurement is all over the place, with different methods measuring different things and a distinct lack of convergent and discriminant validity.
EI has been shown to have important relationships with a variety of criteria, but many researchers and practitioners have been left wondering: what exactly is being measured?
In the latest Journal of Organizational Behavior, O'Boyle et al. attempt to shed light on the situation, primarily by looking at whether measures of EI add predictive validity above and beyond cognitive ability and five-factor personality measures.
You may remember a similar study published back in early 2010 by Joseph and Newman. So why the need for this one? The current authors point out that in addition to updating the data and using more current estimates of other relationships, "our data set includes 65 per cent more studies that examine the relationship between EI and job performance, with an N that is over twice as large." I don't know why I found this so funny--there's nothing wrong with trying to get a better handle on things--I guess it just reminded me of packaging on dishwasher detergent or something. In case you're curious, their final sample included 43 effect sizes relating EI to job performance.
ANYWAY, what did they find? In the words of the authors: "We found that all three streams of EI correlated with job performance. Streams 2 and 3 incrementally predicted job performance over and above cognitive intelligence and the FFM. In addition, dominance analyses showed that when predicting job performance, all three streams of EI exhibited substantial relative importance in the presence of the FFM and intelligence." [Stream 1 = ability measures, Stream 2 = self report, Stream 3 = mixed models]
Let's do the numbers: the correlation between EI and job performance varied depending on which "stream" of EI research was analyzed, from about .24 to about .30. So if you're considering using a measure of EI, any of the reputably built measures should add validity to your process (and in fact these numbers may be low since the job performance measures were primarily task-based). Since the numbers were pretty similar, it seems they support a similar construct, right? Not so fast.
When they looked at how the streams related to personality measures and cognitive ability, the relationships were all over the place. In their words, "For all six correlates (FFM and cognitive ability), we found significant Q-values indicating that the three streams relate to other personality and cognitive ability measures differently...These differences in how the EI streams related to other dispositional traits provide a contrasting perspective to the assertion that the various measures of EI assess the same construct."
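(If you haven't run a meta-analysis lately, here's roughly what a Q-test is doing under the hood. This is purely my illustrative sketch in Python, not the authors' code, and the effect sizes and sample sizes are made up.)

```python
# Illustrative homogeneity (Cochran's Q) test on a set of correlations.
# Hypothetical data only -- not from the O'Boyle et al. study.
import numpy as np
from scipy.stats import chi2

r = np.array([0.18, 0.25, 0.31, 0.22, 0.35])   # study-level correlations
n = np.array([120, 85, 200, 150, 95])          # study sample sizes

z = np.arctanh(r)          # Fisher's r-to-z transformation
w = n - 3                  # inverse-variance weights (var of z ~ 1/(n-3))
z_bar = np.sum(w * z) / np.sum(w)

Q = np.sum(w * (z - z_bar) ** 2)   # Q statistic: weighted spread around the mean effect
df = len(r) - 1
p = chi2.sf(Q, df)                 # small p -> effect sizes are heterogeneous

print(f"Q = {Q:.2f}, df = {df}, p = {p:.3f}")
```

A significant Q simply says the study-level effects vary more than sampling error alone would predict, which is what you'd expect if the different "streams" aren't tapping the same thing.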
What about the incremental validity? It ranged from pretty much nothing (stream 1) to a respectable .07 (stream 3) with stream 2 in the middle at .05. The stream 1 measures seem to have much more overlap with cognitive ability measures, and since ability measures dominate the prediction of performance, well...there you go.
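(For anyone who finds ".07 incremental validity" abstract: it's the bump in R-squared you get when EI is added to a regression that already contains cognitive ability and the FFM. Here's a toy sketch with simulated data, just to show the mechanics; the variable names and effect sizes are my own inventions, not the study's.)

```python
# Toy illustration of incremental validity (delta R^2) via hierarchical regression.
# All data are simulated -- this is not the authors' analysis.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
cognitive = rng.normal(size=n)                 # stand-in for cognitive ability
ffm = rng.normal(size=(n, 5))                  # stand-in for the five FFM traits
ei = 0.4 * cognitive + rng.normal(size=n)      # EI partly overlapping with g
perf = 0.5 * cognitive + 0.2 * ffm[:, 0] + 0.15 * ei + rng.normal(size=n)

X_base = np.column_stack([cognitive, ffm])     # step 1: ability + personality
X_full = np.column_stack([X_base, ei])         # step 2: add EI

r2_base = LinearRegression().fit(X_base, perf).score(X_base, perf)
r2_full = LinearRegression().fit(X_full, perf).score(X_full, perf)

print(f"delta R^2 from adding EI: {r2_full - r2_base:.3f}")
```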
So does that mean employers should avoid stream 1 measures (e.g., MSCEIT)? Again, not so fast. The authors point out that this type of EI measure may be more resistant to social desirability and faking effects--it primarily seems to lose its value when you're already using a cognitive ability measure. (Although again, we're talking task performance here)
All of which leaves me still wondering: what exactly is being measured? More to come I suppose; after all, EI is a newbie in the history of I/O constructs. The authors themselves point out that "much more work is still needed" on the construct validity of EI.
On the other hand, the more practical among you may be thinking, "Who cares? It works!" And to them, I would only say, remember what Kurt Lewin said: "There is nothing so practical as a good theory." Without the theory...
You can see an in-press version here. It's good stuff.
Oh, and on a totally unrelated point, I highly recommend upgrading your Adobe Reader to version X if you haven't already. Highlighting with bookmarking at the click of a button.