Thursday, June 12, 2008

Unproctored internet testing: Safe for some tests?

One of the biggest trends in personnel assessment is the movement toward on-line testing. Many organizations are experimenting with so-called unproctored Internet testing (UIT), where candidates are allowed to take the exams whenever, and wherever, they want.

Benefits? Extremely convenient for the candidate. Fewer administrative resources needed by the employer.

Costs? Bye-bye exam security, hello cheating opportunities. Not only is your test out for everyone to see, but you have no real way of knowing (sans biometric verification) who is taking the test.

Some organizations have decided the benefits outweigh the risks, and a new study in the June 2008 issue of the International Journal of Selection and Assessment may provide support for their position.

In it, the authors looked at over 800 applicants from nine European countries who took a test of perceptual speed in an unproctored setting, then followed up with a proctored parallel version. Results? Not only was there no evidence of cheating, but the opposite effect emerged--people did better in the proctored setting.

Now before everyone throws out their proctored exams, note that this is a type of test that might be hard to cheat on--at least in one way. Because it is a perceptual speed test, there are no "right" answers that can be looked up, and it requires very quick responses. So the only way to cheat would be to have someone take the test for you. Implication: some types of tests may be better suited to unproctored delivery than others.
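For readers who want to picture what this kind of check looks like, here is a minimal sketch of a paired comparison of unproctored versus proctored scores. The candidate data and the paired t-test below are my own illustration of the general approach, not the authors' actual data or analysis:

# Hypothetical sketch: do candidates score higher unproctored than proctored?
# The paired scores below are invented for illustration only.
from scipy import stats

# Each tuple: (unproctored score, proctored score) for the same candidate
paired_scores = [
    (42, 45), (38, 41), (51, 50), (44, 47), (39, 43),
    (47, 49), (35, 36), (48, 52), (41, 44), (45, 46),
]

uit = [u for u, _ in paired_scores]
proctored = [p for _, p in paired_scores]

# If cheating were widespread, unproctored scores should run noticeably higher.
# A paired t-test checks whether the mean difference departs from zero.
t_stat, p_value = stats.ttest_rel(uit, proctored)
mean_diff = sum(u - p for u, p in paired_scores) / len(paired_scores)

print(f"Mean (UIT - proctored) difference: {mean_diff:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A negative mean difference points the same direction as the study's finding:
# candidates did better under proctoring, not worse.

A positive, significant difference would be the red flag; the study reported the reverse pattern.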

This topic is a source of much debate in the assessment community, and there is by no means consensus on the right way to go. But studies like this help!

Take a deep breath, because there's a lot more in this issue:

- The preliminary employment interview as a predictor of assessment center outcomes (fascinating look at how the AC may only make sense for mid-range interview scorers)

- A comparison of the common-item and random-groups equating designs using empirical data (for you IRT fans out there)

- The influence of external recruitment practices on job search practices across domestic labor markets: A comparison of the United States and China

- Beneath the surface: Uncovering the relationship between extraversion and organizational citizenship behavior through a facet approach (a more nuanced look at the relationship shows extraversion can predict OCBs)

- Comparing personality test formats and warnings: Effects on criterion-related validity and test-taker reactions (another good one...the personality test added predictive validity beyond the ability test, but there was no validity difference between forced-choice and Likert formats, nor between warning and no-warning conditions; forced-choice formats and warnings may produce negative candidate reactions)

- Applicant selection expectations: Validating a multidimensional measure in the military (describes development of a new measure of applicant perception of the selection process)

- Selecting for creativity and innovation: The relationship between the innovation potential indicator and the team selection inventory
