It's time once again for the monthly research round-up. So let's dive right in:
The June International Journal of Selection and Assessment doesn't disappoint; let's take a look:
- More evidence of the link between personality variables and counterproductive work behaviors (CWBs); this time with concurrent data in China
- Dovetailing nicely with a post I've been working on regarding promotional testing, this research indicates some interesting characteristics of internal test takers
- Why are supervisors open to behavioral interviews but shun discussion of "structure"? Looks like how we communicate about them plays a big role.
- More research on self-efficacy, this time teasing apart the concept a bit.
- Always a popular topic: applicant reactions to selection mechanisms. This time with a sample from Saudi Arabia.
- Speaking of applicant reactions...how about another study? This one comparing U.S. and Vietnamese college students. By the way, not surprisingly, work samples came out winners in both of these studies.
- Next, a fascinating study of a hidden bonus to unproctored Internet testing (UIT): despite the cheating element, it likely increases your candidate pool and, ultimately, performance outcomes
- Speaking of response distortion, here's another study, this time of military cadet selection using personality inventories
- Okay, one more on inflation. This time a study of Chinese applicants--no difference compared to American samples.
- Back in March I wrote about a study Jeremy Bernerth published in J.A.P. that got a lot of attention. This time, Bernerth studied ethnic differences and found minority status was negatively related to credit scores.
Moving on to the summer issue of Personnel Psychology:
- The "file drawer problem" is the hypothesis that nonsignificant results are less likely to get published. According to this study, that appears unlikely. But IMHO looking at all correlations is different from looking at the correlations key to one's hypothesis(es)...
- Back to faking (that may be this post's theme!), can response elaboration reduce faking on biodata items? This study suggests so. Although I'm left wondering...what was the impact on validity?
- Speaking of biodata, there are various ways of keying these items. This research suggests the best method depends on your sample size, although rational keying performed the worst.
How about the May issue of Journal of Applied Psychology?
- Well, this is interesting...Chad Van Iddekinge and his colleagues have provided an updated meta-analysis on the criterion-related validity of integrity tests. What did they find? Well, the results appear to be less promising than those published previously (e.g., corrected r=.18 for job performance). Much like SIOP's research journal, this time J.A.P. published several commentaries in response to the study that...well, let's just say a debate ensued about the analysis...
- The Dark Triad. It sounds like something in a Dan Brown novel. But in this meta-analysis the authors show that personality characteristics that make up this triad (Machiavellianism, narcissism, and psychopathy) explain some variance in CWBs.
- Why are some people more proactive in seeking career goals than others? It's an important and under-researched question. In this study the authors show that part of the explanation lies in "future work selves"--that is, people's hopes and aspirations as they relate to work.
- Think self-reports of CWBs are biased? Perhaps not, according to this new study.
- Interested in what causes proactive customer service behavior? According to this multi-national study, self-efficacy is key (along with service climate).
- Why do some leaders engage in more self-interested behavior than others? Perhaps not surprisingly, it appears due in part to the strength of their moral identity.
The May issue of the Journal of Applied Social Psychology has a couple of gems...
- Hey, look, turns out being sensitive to your subordinates pays off. Talk about a lesson that needs frequent repeating...
- And that's it. Oh, wait, just this little study about using Facebook profiles to predict job performance...that I wrote about before...available in FULL right now...
Okay, getting to the end...The May/June issue of HRM:
- An interesting study of adverse impact in promotion decisions for managers in a Fortune 500 retailer. The authors compared three methods (top-down assessment, assessment centers, and multisource appraisal) and the results demonstrate how complex these analyses are!
- Speaking of complex. Think that successful job postings on the web are just fancy graphics? Think again--they still involve some classic factors like the labor market, firm reputation, and compensation incentives. The more things change...
- Identifying future leaders. There are few other issues that are as important for most organizations. Yet how exactly to do it eludes many. These authors propose a model that focuses on four main features: analytical ability, learning agility, drive, and emergent leadership.
Finally, a few from PARE:
- Does item order impact response anxiety? Not according to this study.
- What's that? How do we use a new jackknife procedure for eliminating items and improving structural equation modeling? You're in luck.
- Looks like a lot of research relies on beta weights when interpreting and reporting multiple linear regression results. But there's so much more...
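To make that last point concrete, here's a minimal sketch (using made-up data, not anything from the study) of why beta weights alone can mislead when predictors are correlated: two other commonly recommended indices--zero-order correlations and structure coefficients--can tell a very different story about the same predictor.

```python
import numpy as np

# Hypothetical data: two correlated predictors, both related to y
rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)   # x2 correlates ~.6 with x1
y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

def standardize(v):
    return (v - v.mean()) / v.std()

X = np.column_stack([standardize(x1), standardize(x2)])
z_y = standardize(y)

# Standardized beta weights via ordinary least squares
betas, *_ = np.linalg.lstsq(X, z_y, rcond=None)

# Zero-order correlations of each predictor with y
r_xy = np.array([np.corrcoef(X[:, j], z_y)[0, 1] for j in range(2)])

# Structure coefficients: each predictor's correlation with y-hat
y_hat = X @ betas
struct = np.array([np.corrcoef(X[:, j], y_hat)[0, 1] for j in range(2)])

print("beta weights:        ", betas.round(3))
print("zero-order r:        ", r_xy.round(3))
print("structure coeffs:    ", struct.round(3))
```

Running this, x2's beta weight comes out small even though its zero-order correlation and structure coefficient are substantial--its predictive contribution is being absorbed by the correlated x1. Reporting only the betas would make x2 look unimportant.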