Celebrating 10 years of the science and practice of matching employer needs with individual talent.
Thursday, June 07, 2007
March '07 issue of Journal of Business and Psychology
1) A study by Topor, Colarelli, and Han of 277 HR professionals found that evaluations of hypothetical job applicants were influenced both by the traits used to describe the applicant and by the assessment method used to measure those traits. Specifically, candidates described as conscientious were rated highest, as were those assessed using an interview--and the combination of the two was rated highest of all. (The other traits studied were intelligence and agreeableness; the other assessment methods were a paper-and-pencil test and an assessment center.)
How does this stack up against known evidence? Could be better. In general (and there are variations depending on the job), intelligence has been shown to be the best predictor of job performance. The value of a particular assessment method, on the other hand, depends not only on the job, but on the nature of the instrument as well as the construct being measured. But the research on measuring personality using interviews is relatively new, and certainly not as established as evidence supporting paper-and-pencil measures of cognitive ability (or personality, for that matter). Another example of HR folk not knowing the evidence as well as they should.
2) A meta-analysis by Williams, McDaniel, and Ford of compensation satisfaction research. The authors looked at 213 samples from 182 studies and came up with some interesting results.
- First, measures of satisfaction with direct pay (e.g., pay level, pay raises) were highly related to one another, while satisfaction with benefits showed only modest correlations with direct pay (suggesting people see direct pay and benefits as distinct compensation categories).
- Second, both the perception of the pay raise as well as the objective amount of the pay raise were important in predicting pay raise satisfaction. This fits well with what we know about the importance of procedural justice.
- Third, a modest, negative relationship was found between the costs employees pay for their benefits and benefit satisfaction. This suggests benefit costs may not be a particularly important factor when trying to lure candidates. It also suggests that asking employees to pick up more of their benefit costs may be a viable strategy, as long as attention is paid to other forms of compensation (which apparently have a larger impact on overall satisfaction).
Wednesday, June 06, 2007
Recruiting metrics

Although there are some strong feelings out there about the usefulness of metrics, the fact is that traditional measures of recruiting activity are still in wide use, both as measures of success and for planning purposes.
The two benchmarks that were specifically asked about were:
- Number of fills per recruiter
- Number of hours per recruitment
Unfortunately a lot of the benchmark recruiting metrics out there are not free. Several organizations offer detailed benchmarking data, but ya gotta pay for it. Examples include:
- Staffing.org's Recruiting Metrics and Performance Benchmark Report ($425)
- Society for Human Resource Management (SHRM)'s Human Capital Benchmarking Service ($425 for non-members)
- The Saratoga Institute publishes a Human Capital Effectiveness Report
Even so, I've gathered what data I could get my hands on, so let's take them one at a time.
Fills per recruiter
How many fills, or placements, does a typical recruiter make? Of course it depends on the time frame and what type of placement we're talking about, which makes this statistic challenging to nail down. But here's what I found:
- IPMA-HR's 2006 Recruiting and Selection Benchmarking Report: The median number of recruitments per HR staffer was in the 4-6 category, closely bordering on the 7-10 category, so six is probably a good overall estimate given this data.
The other sources I found were more anecdotal. For example:
- JBN Consulting notes a typical recruiter makes 0.75 placements per month
- Cluff & Associates state: "A dedicated recruiter in a corporate environment can average 8 fills per month if activity levels warrant. Agency recruiters, however, are likely to generate 2 or 3 placements per month."
- A Google search on the topic brings up quite a few recruiter job advertisements, which gives us some idea of the expected workload. In general, 1-3 fills per month seems a standard expectation.
Conclusion? The typical number of fills per recruiter seems to vary quite a bit depending upon the type of job, job market, etc. For example, Dave Lefkow noted a few years back that 100 fills a month for customer service jobs was considered successful, while 5-10 analyst/developer fills a month might be acceptable. If I had to pick an "average" number, I'd go with the IPMA-HR figure of six (although keep in mind that figure comes from public sector organizations).
Hours per recruitment
Fortunately there's a lot more information about benchmarks for how long it takes to conduct a recruitment (here used synonymously with "time to fill"), although here too the figure varies with the type of recruitment:
- Surveys conducted by the Employment Management Association (affiliated with SHRM) consistently show an average time to fill of 42-45 days. Source
- 2004 Workforce Diagnostic System Benchmark by the Saratoga Institute indicated 48 days.
- 2006 SHRM metric for same-industry hires: 37 days (from opening of requisition to offer acceptance). Source. SHRM also offers customized reports (for a fee), available here; samples indicate median times to fill of 35 and 40 days, depending on industry.
- 2006 Corporate Executive Board survey indicated 51 days Source: The Economist
- 2006 IPMA-HR benchmark report: average of 49 days between vacancy announcement and first day on the job, with a low of 44 for labor/maintenance and a high of 57 for public safety
- Staffing.org and HR Metrics Consortium's 2003 Recruiting Metrics and Performance Benchmark: 70 days (this includes time between offer acceptance and showing up to work). Source
Conclusion? Again, depends on the job (and how you define the metric), but we can estimate around 40 days between vacancy announcement and offer acceptance, with another 10-20 days between offer acceptance and first day on the job.
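For the curious, here's the back-of-envelope arithmetic behind that estimate as a quick Python sketch. Which benchmarks stop at offer acceptance versus first day on the job is my own rough reading of each source's definition.

```python
# Rough aggregation of the time-to-fill benchmarks cited above. Which
# figures stop at offer acceptance (vs. first day on the job) is my
# rough reading of each source's definition.
to_acceptance = {
    "EMA/SHRM surveys (midpoint of 42-45)": 43.5,
    "Saratoga Institute 2004": 48,
    "SHRM 2006 same-industry": 37,
    "SHRM custom samples (midpoint of 35/40)": 37.5,
    "Corporate Executive Board 2006": 51,
}

mean_days = sum(to_acceptance.values()) / len(to_acceptance)
print(f"Mean time to offer acceptance: {mean_days:.0f} days")  # ~43

# IPMA-HR (49 days) and Staffing.org (70 days) measure through the first
# day on the job, implying an extra week or three after acceptance.
for name, days in {"IPMA-HR 2006": 49, "Staffing.org 2003": 70}.items():
    print(f"{name}: implies {days - mean_days:.0f} days acceptance-to-start")
```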
If you have additional data, please share! Inquiring minds want to know...
Tuesday, June 05, 2007
Feds host virtual career fair
The event will kick off at 11 a.m. on June 5th with an overview of federal employment by the Director of OPM, Linda Springer. Subsequent panel sessions will focus on finding and applying for jobs, insight from new federal employees, student programs, IT jobs, opportunities in medicine and public health, how to host a career fair, and several other topics.
According to the news release, videos of all twelve panels will be available through both OPM and the Partnership for Public Service for the remainder of the year. The event is specifically targeted at the 600 colleges and universities around the U.S. that participate in the Call to Serve initiative.
Kudos to these organizations for being creative about highlighting opportunities in the public sector and taking advantage of technology to get the word out.
Monday, June 04, 2007
Summer 2007 Personnel Psychology + free content

First off is the aptly titled A review of recent developments in integrity test research, by Berry, Sackett, and Wiemann, the fifth in a series of articles on the topic. This is an extensive review of research on integrity tests since the last review, which was done in 1996. There's a lot here, so I'll just hit some of the many highlights:
- It appears that integrity tests can vary in their cognitive load depending on which facets are emphasized in the overall score.
- It is likely that aspects of the situation impact test scores (in addition to individual differences); more research is needed in this area.
- Although there have been no significant legal developments in this area since the last review, concerns have been raised over integrity tests being used to identify mental disorders. The authors do not seem concerned, as these tests (e.g., Reid Report, Employee Reliability Index) were not designed for that purpose and thus likely do not violate EEOC Guidelines.
- Research on subgroup scores (e.g., Ones & Viswesvaran, 1998) indicates no substantial differences on overt integrity tests; no research has addressed personality-based tests.
- Test-takers do not seem to have particularly positive reactions to integrity tests, although this appears to depend upon the type of test, items on the test, and response format.
Next, Raymond, Neustel, and Anderson investigate certification exams and whether re-taking the same exam or a parallel form results in different score increases. Using a sample of examinees taking ARRT certification exams in computed tomography (N=79) and radiography (N=765), the authors found no significant difference in score gains between the two types of tests. This suggests exam administrators may wish to re-think the importance of alternate forms for certification, particularly given the cost of development (estimated by the authors at between $50K and $150K). The authors do point out that the generalizability of these results is likely limited by test type and examinee characteristics.
Third, Henderson, Berry, and Matic investigate the usefulness of strength and endurance measures for predicting firefighter performance on physically demanding suppression and rescue tasks. Using a sample of 287 male and 19 female fire recruits hired by the city of Milwaukee, the authors found that both types of measures (particularly strength measures such as the lat pull-down and bench press) predicted a variety of criteria, including a roof ladder placement exercise, axe chopping, and a "combat" test. The authors suggest continued gathering of data to support the use of these types of tests (while acknowledging the ever-present gender differences), and discuss several problems with simulated suppression and rescue tasks, now used by many municipalities in light of previous legal challenges to pure strength and endurance measures.
Lastly, LeBreton et al. discuss an alternate way of demonstrating the value of variables in I/O research. Traditionally researchers have focused on incremental validity--essentially, the amount of "usefulness" a variable adds over the variables already in the equation. (This lets you determine, for example, whether a personality test would help you predict job performance above and beyond the test(s) you already use.) Instead, the authors present the idea of relative importance, which shifts the focus to the importance of each variable in the equation. Fascinating stuff (and far more than I can describe here), and something I'd like to see more of. I believe the authors are correct in stating that it would be much easier to talk to managers about how useful each test in a battery is than about the fact that, overall, the battery explains 35% of the variance in performance. The article includes a fascinating re-analysis of Mount, Witt, and Barrick's 2000 study of the use of biodata with clerical staff.
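To make the contrast concrete, here's a toy Python sketch using simulated data (the numbers are invented, not from the article). It compares incremental validity with one technique from the relative-importance family, Johnson's relative weights; the article covers these methods in far more depth.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated predictors (say, a cognitive test and a personality scale --
# hypothetical numbers, not from the article) and a performance criterion.
n = 5000
cog = rng.standard_normal(n)
pers = 0.5 * cog + rng.standard_normal(n)         # correlated predictors
perf = 0.4 * cog + 0.3 * pers + rng.standard_normal(n)
X = np.column_stack([cog, pers])

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1 - np.var(y - A @ b) / np.var(y)

# Incremental validity: Delta R^2 for personality over the cognitive
# test alone. The answer changes if you reverse the order of entry.
print("Delta R^2:", r_squared(X, perf) - r_squared(X[:, :1], perf))

# Relative importance via Johnson's relative weights: partitions the
# full model's R^2 among correlated predictors into one share apiece.
R = np.corrcoef(X, rowvar=False)                  # predictor correlations
rxy = np.array([np.corrcoef(X[:, j], perf)[0, 1] for j in range(2)])
vals, vecs = np.linalg.eigh(R)
lam = vecs @ np.diag(np.sqrt(vals)) @ vecs.T      # R^(1/2): loadings on Z
beta = np.linalg.solve(lam, rxy)                  # y regressed on Z
weights = (lam ** 2) @ (beta ** 2)                # one share per predictor
print("Relative weights:", weights, "sum =", weights.sum())  # sum == R^2
```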
----
This issue also includes reviews of several books, including the third edition of Ployhart, Schneider and Schmitt's Staffing Organizations (conclusion: good but not great), Weekley and Ployhart's Situational Judgment Tests (conclusion: good as long as you already know what you're doing), and Griffith and Peterson's A Closer Examination of Applicant Faking Behavior (conclusion: good for researchers, not so good for managers).
---
But wait, there's more...the Spring 2007 issue, which had some interesting stuff as well, is free right now! So get those articles while you can. Hey, it's worth surfing over there just for McDaniel et al.'s meta-analysis of situational judgment tests!
Saturday, June 02, 2007
Google buys FeedBurner; introduces Street View
First, they continued their buying spree and purchased FeedBurner, a popular feed syndication service (and the one I use). This will allow Google to advertise in new ways, and also gives them something of an end-to-end blog platform now that they own both Blogger and FeedBurner. More information about the purchase is available on FeedBurner's blog, and they have this FAQ. Presumably this means a marriage of their analytic tools, which hopefully is good news for anyone who publishes content.
Implication for us? If you already publish a blog (or other type of feed) or are thinking about it, you'll want to keep close tabs on what this purchase will mean for publishers (e.g., ad distribution, fees, etc.).
Second, Google has introduced Street View, an add-on to Google Maps that allows for 360-degree viewing at the street level. You can even follow arrows that will lead you down the street. Right now it's available only in San Francisco, Las Vegas, Denver, New York, and Miami.
Why do we care? Seems to me a good way to preview the local area for potential applicants. I'm all about the realistic job previews. Also, probably a good opportunity for us to see how our area is being presented--what do candidates see when plopped down in front of our buildings? Would you want to be there?
Friday, June 01, 2007
Becoming passive employers

Right now, job search is static. Someone searches for a job, and either a vacancy exists or it doesn't. But what if we were a little more creative?
What if instead of getting "zero results for your search", the candidate received something like:
There are no current openings that match your search. However, the following positions exist that may have openings in the future.
What followed would be a detailed description of current positions in the organization that matched the search criteria--jobs people actually hold. And you would allow people to submit a job interest request so they would be notified when that job (or a similar one) became open. Yes, some systems already have job interest requests, but too often they're based on broad job titles and fail to provide the rich information a job seeker needs (e.g., who they will work with, learning opportunities).
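To make the idea concrete, here's a minimal sketch of what a job interest request might look like under the hood. Everything here--names, fields, the matching rule--is hypothetical, not any actual ATS's API.

```python
# Hypothetical sketch of a job interest request matcher. Names, fields,
# and the matching rule are all illustrative.
from dataclasses import dataclass

@dataclass
class InterestRequest:
    email: str
    keywords: set       # e.g., {"attorney", "litigation"}
    locations: set      # e.g., {"Boston"}

@dataclass
class Opening:
    title: str
    location: str
    description: str

def who_to_notify(opening, requests):
    """Return e-mail addresses of seekers whose interests match a new opening."""
    text = (opening.title + " " + opening.description).lower()
    return [r.email for r in requests
            if opening.location in r.locations
            and any(k.lower() in text for k in r.keywords)]

requests = [InterestRequest("seeker@example.com", {"attorney"}, {"Boston"})]
job = Opening("Staff Attorney", "Boston", "Litigation role at a small firm.")
print(who_to_notify(job, requests))  # ['seeker@example.com']
```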
What else could we do with this feature? We could profile the individuals who currently hold the job. Okay, maybe not everyone, but a sample. At the very least we could provide a basic job description (and not a boring one).
This idea fits with a concept I think we all need to focus more on. In addition to seeking passive candidates, we should be passive employers. Passive job seekers aren't looking for a job, but they could be. Passive employers don't have that particular opening--but they could. But unless you tell candidates that, how will they know? How do they know that a perfect match exists in your organization, and if they just had waited another week to search, they would have seen it?
Why do we make applicants the servants of the ATS, not the other way around?
Let's take this a step further. Let's say I'm an attorney in Seattle looking to relocate to Boston. I know I'd like to work for a smallish firm with decent billable hours, and co-workers who know their stuff and are good at their jobs but still value work-life balance.
How the HECK am I supposed to find that firm? Sure, I can look for current vacancies on job boards. Or maybe I just happen to know someone who works for such a firm and they have an opening. Or I might be able to find some information through a Google search or services such as Vault or Hoover's (although that information is very limited, you still have to know the company name, and information on public sector agencies is anemic). But that'll only get me so far. Then what?
There is no general database of employer qualities to search through (sites like Jobfox are trying a similar idea, but it's still based on vacancies). No easy way to punch in the above criteria and have a system spit out, "Here are all the firms that meet your criteria. Here are the ones that currently have openings, and here are the ones that don't now but may in the future."
People search is getting more and more sophisticated. What about employer search? If we expect applicants to take an active role in managing their career, we should give them the information they need to do it. We can, and should, do better.
Wednesday, May 30, 2007
Evidence-based recruitment and assessment

Pfeffer and Sutton's Hard Facts, Dangerous Half-Truths and Total Nonsense is one of those business books that's both entertaining and enlightening. As you know, there is a LOT of "fluff" out there--this is not one of those books. The authors are all about using good data and experimentation to discover what really works, not just what sounds good or what someone else recommended.
Case in point: the "War for Talent." This phrase gained popularity in the late 90's through several employees of McKinsey Consulting and their book of the same name. Those authors argued that the best performing companies had a deep commitment (obsession?) with finding and promoting talented individuals and offered data that claimed to support a link between this mindset and firm performance. But as Pfeffer and Sutton point out, a closer look at the data raises some eyebrows. Specifically, talent management practices were measured AFTER performance measures, resulting in a classic case of correlation-causation confusion.
Certainly Pfeffer and Sutton aren't the only ones to raise concerns about a talent obsession, but they do so in a very accessible and thorough manner. They highlight three poor decision-making practices that apply to talent management (as well as many other issues):
- Casual benchmarking (for example, the failure of "Shuttle by United" to copy Southwest Airlines' success, or U.S. automotive companies attempting to copy Toyota's). We see this in our field when folks want to know "how other people are recruiting" or "what test everyone else is using." Good information to have, but look before you leap.
- Doing what (seems to have) worked in the past (for example, using incentive pay in your new organization because it seems to have worked at another one). The best example of this is managers who just know that their interview questions about favorite books and deserted-island companions work--even though they don't have any data to support that view. In my experience about 20% of managers are good interviewers (and I place a lot of the blame on HR).
- Following deeply held yet unexamined ideologies (for example, equity incentives, the so-called "first-mover advantage", and merit pay for teachers). In our area this includes things like believing applicant tracking systems always result in improvement, or that integrity tests are more discriminatory than other types of tests.
So how do we apply these lessons to recruitment and assessment? Here are just a few ways:
1. Be a critical thinker. We know we're supposed to eye HR metrics with some skepticism, but do we? Do we adopt "best practices" without thinking about how our organization might differ in important ways? Are we lured by shiny new pieces of technology without asking ourselves whether we might be better off without it? On the flip side, do we resist new ways of doing things without even considering the possibilities?
2. Know the evidence. HR is not guesswork--we know a lot about what works and what doesn't. Every HR practitioner and manager should read Rynes, Brown, and Colbert's "Seven common misperceptions about human resource practices", with a more detailed analysis here.
3. Push back when you hear something that sounds too simple or too good to be true--it probably is. Two examples: behavioral interviewing does not solve all of our assessment problems, and social networking sites will not solve all our recruiting problems.
4. Model evidence-based decision making. Make it clear that you are making decisions based on the best data you could find or gather, and that this is an expectation for everyone. Rather than rushing into a decision, take the extra time to gather whatever information you can get your hands on--as long as it doesn't lead to paralysis by analysis.
5. Do experiments whenever possible. Include an assessment instrument as a research-only tool and see if it predicts performance. Try out different advertising methods and mediums and track applicant numbers and quality. Did you know Yahoo! typically runs about 20 experiments at any time, changing things like colors and the location of text? We can't all be Yahoo!, but we can all be experimenters.
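On that last point, the statistics behind a simple two-version ad experiment aren't exotic. Here's a minimal sketch that compares applicant conversion rates for two versions of a posting with a two-proportion z-test (the counts are made up for illustration):

```python
# Compare two job-ad versions by applicant conversion rate using a
# two-sided, two-proportion z-test. Counts below are invented.
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic and two-sided p-value for a difference in proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approx.
    return z, p_value

# Ad version A: 1,000 views, 48 applications; version B: 1,000 views, 71.
z, p = two_proportion_z(48, 1000, 71, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # small p -> the versions likely differ
```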
Some of my favorite quotes from the book...
"If doctors practiced medicine the way many companies practice management, there would be far more sick and dead patients, and many more doctors would be in jail."
"The fundamental problem is that few companies, in their urge to copy--an urge often stimulated by consultants who, much as bees spread pollen across flowers, take ideas from one place to the next--ever ask the basic question of why something might enhance performance."
"Instead of being interested in what is new, we ought to be interested in what is true."
"There is really only one way around this reluctance to confront the hard facts, and that is to consciously and systematically understand the psychological propensity to want to both deliver and hear good news and to actively work against it."
Tuesday, May 29, 2007
Must-reads on OFCCP compliance
First post here.
Second post here.
(Bolded date may indicate 2006 but these are 2007 posts)
Highly recommended for employers covered by OFCCP laws/regulations.
Monday, May 28, 2007
LinkedIn's JobsInsider
What is it? It's a free browser plug-in/toolbar (for IE or Firefox) that serves several purposes. One is simply providing a quicker way to access LinkedIn content. But the much cooler feature is that when you're looking at jobs on Monster, CareerBuilder, HotJobs, Craigslist, SimplyHired, Dice, or Vault, a separate window comes up that notifies you if anyone in your LinkedIn network works for the organization and allows you to contact them to help make the right connections.
Let's look at an example. I went to SimplyHired and looked up jobs working for Apple in Sacramento, CA. When I click on any of the jobs that come up, the JobsInsider window pops up and tells me 207 people in my LinkedIn network work for Apple, and two are friends of my connections. I can click on the link and it takes me to a description of those people. Click on any one person and it tells you how you're linked to that person.
Not only that, but (at least with SimplyHired), when job search results come up, you can click on "who do I know?" for each position to have LinkedIn search your network.
Pretty nifty, huh? So why do we care, other than it being a nifty little piece of technology?
For one, it's another reason to be a member of LinkedIn--at least if you're interested in being contacted by applicants. Given the choice between pursuing a job somewhere where I don't know anyone and a place where I can make a contact, I'll take the latter.
Second, it's a good way to double- (or triple-) check credentials of applicants. Most of these networking sites strongly encourage you to put in your educational background and job experience. If what's listed here doesn't match the resume or application they submitted to you, that's something to follow up on. Could be a simple explanation, could not be.
Finally, another reason to care about this is it's likely a sign of things to come. With meta-people search sites like ZoomInfo out there, and ones like Spock coming on board, we need to be very comfortable with our on-line identities and understand how they link to other people.
One last cool feature of the LinkedIn toolbar. When you open an e-mail in Gmail, Yahoo! Mail, MSN Hotmail, or AOL, you automatically have the option to get someone's LinkedIn information or invite them to your network. In fact there's even a tool that will do the same thing for your Outlook mail.
Happy Memorial Day!
Friday, May 25, 2007
The recruitosphere vs. the assessosphere

There are so many recruiting blogs that there are lists devoted to keeping track of them, and new ones are announced practically every day.
There are so few blogs focused on assessment that I have a hard time thinking of them. The only 100% "true" blog (besides this one) is Jamie Madigan's SelectionMatters--and he just started back up again (thank goodness) after a long hiatus. Okay, Dr. Michael Mercer has one but it doesn't allow comments. Yes, Michael Harris does a great job with EASI-HR Blog, but assessment is just one of the topics he covers. Ditto for Alice Snell's Taleo Blog. Charles Handler has the awesome Rocket-Hire newsletter, but that's periodic and doesn't allow for (as far as I know) reader comments.
Then there's PIOP, a message board that's been around for a while, but it's focused primarily on students. Perhaps the recently established iocareers.com will go somewhere, but it too seems focused on students--which isn't a bad thing, but it limits the reach.
So what's up? Why the difference? I have some theories:
1. Recruiters like to hear/read themselves. Yes, that's undoubtedly true. But do you know any assessment professionals that DON'T?
2. Recruiters use them to market themselves. I think this is a pretty compelling possibility. Still, if true, why aren't more assessment professionals doing the same? There are plenty of assessment consulting outfits.
3. Recruiters are earlier adopters of technology. Assessment professionals are, IMHO, a conservative bunch when it comes to technology adoption. They've seen things come and go, and don't want to jump wholeheartedly into a possible flash in the pan. Prime example: SIOP's excellent news articles are not syndicated.
4. Recruiters are more focused on the business side of things, assessment folks more on research. This seems likely true of I/O Ph.D.'ers, but what about the rest of the assessment community? Are we still stuck trying to apply law and rules rather than creating networks and sharing best practices?
5. Assessment folks don't know about the blogosphere. I think there is some truth to this, as well. As a group, assessment professionals just don't seem to have taken to electronic forms of information sharing (the IPMAAC listserv being a notable exception), so I don't see why blogs would be any different. They seem to prefer conferences.
6. Assessment folks don't see the value of blogs. And some in the recruiting field don't either.
7. Recruiters are more extroverted than assessment types. I have absolutely no evidence to support this claim, which hasn't stopped me in the past and won't stop me this time.
Is it just me or is something (not) going on here?
Thursday, May 24, 2007
Technorati Gets a Fresh Coat
According to this post, changes include:
1 - Simplified search. It's less technical now and easier to find what you're looking for.
2 - Enhanced results--searches now include posts, blogs, photos, videos, podcasts, etc.
3 - Improved user interface (and it's still being tweaked), including a ticker at the top that tracks popular searches.
Why do we care about blog search engines?
- They're a great way to keep up on the latest news, research, articles, etc. in your field of interest. Blogs are oftentimes updated more frequently than other forms of media and are very targeted.
- They're a good way to identify potential applicants--people who are passionate about their interests and keep up to date on developments.
- For those of you considering starting a blog, or trying to differentiate your blog from others out there, they can be used to survey the existing landscape and make comparisons.
- They can be used to see what's popular--blogs and searches.
- They can, sometimes, be used to gather information about applicants you're considering for hire. You may be able to find work samples or opinions/thoughts expressed by people you're considering--and you can use that information to follow up/verify in person.
- They're a great way to make connections and become part of the larger community.
Oh, and don't forget about other blog search engines (some people have a preference), such as Google Blog Search.
Wednesday, May 23, 2007
2007 SIOP Conference: Highlights, Part 3
Employment interview structure and discrimination litigation verdicts: A quantitative review
Pool, McEntee, and Gomez analyzed 31 federal court cases from 1990 to 2005 (27 claims of disparate treatment, 7 of adverse impact; some cases involved both) to see if there was a relationship between the amount of interview structure and verdicts in employment discrimination cases. Most cases (73%) were brought under Title VII and involved promotional decisions (65%). Race discrimination was the most common allegation (47%), and the vast majority of cases (84%) involved a single plaintiff. For both types of claims, the strongest factors associated with a victory for the defendant (the employer) were interviewers who were familiar with job requirements and a guide for conducting the interview. In disparate treatment claims, defendants were more likely to prevail if they also had standardized questions and identical interviews for each applicant. In disparate impact cases, defendants fared better when they had evidence of validity (which makes sense given the burden shifting in these cases). These results are similar to Williamson et al.'s 1997 study, but good data to have--see, we're not just saying standardize those interviews because we're sadistic HR folks.
Recruiting through the stages: Which recruiting practices predict when?
This meta-analysis by Uggerslev and Fassina of 101 studies looked at the impact that various "recruitment predictors" (e.g., job-person fit, job/organizational attraction) had on various outcome criteria (e.g., job pursuit intention, acceptance intentions). Results depended somewhat on the criterion, but perceived fit between the individual and the job/organization was the strongest predictor across the board. The only predictor that matched perceived fit was job characteristics, which tied with it in predicting acceptance intentions. The strength of the correlations varied, from a low of .15 between perceived fit and job choice to .47 between perceived fit and recommendation intentions. So how do we use this? The authors suggest efforts to increase the appearance of a good fit between the values and goals of applicants and those of the organization may pay off (I'm thinking, say, by focusing on aesthetics and message customization, or by clearly indicating what you're looking for).
Meta-analysis on the relationship between Big Five and academic success
Okay, so it's not directly about recruitment or assessment, but it's still interesting. The title pretty much says it all--the presenters (Trapmann, Hell, Hirn, and Schuler) were looking here at the relationship between Big Five personality traits and academic success. Results? As you might expect, it depends what you mean by "success." Neuroticism was related to academic satisfaction (hey, that's why they're neurotic, right?) while Conscientiousness correlated with grades and retention. The other three factors (Extraversion, Openness, and Agreeableness) were not related to success.
That's probably the end of my review of 2007 SIOP presentations, unless I manage to obtain more presentations. Stay tuned for reviews from the upcoming IPMAAC conference!
Monday, May 21, 2007
Red--it's not just for bulls anymore

Using several different experiments, the researchers found that even a brief glimpse of the color red can lower scores on achievement tasks. For example, one of the experiments involving nearly 300 U.S. and German high school and undergraduate students found that simply looking at a red participant number (versus black or green) prior to completing an IQ test resulted in a performance decrease.
The authors hypothesize that the color red evokes an anxiety response which in turn interferes with the ability to complete the task. Where does the anxiety come from? Some possibilities, according to the authors, include:
- Evolution: we may be hardwired to respond to red (think of the association between red and aggression in nature)
- Daily life: red is often associated with warnings or commands (e.g., stop lights, stop signs, dash lights)
- School: who didn't cringe a little when they saw red marks on their essays or tests in school? Maybe you even have a supervisor who does this?
Lesson: be careful with using red in testing material. There's enough error being introduced in testing situations already without color adding more.
So...did this article make you nervous?
Hat tip: SIOP.org
Thursday, May 17, 2007
EEOC Meeting Focuses on Employment Testing and Screening
Several issues were discussed, including potential problems with specific screening methods (e.g., cognitive ability tests, credit checks), how the EEOC can better serve employers, and steps employers need to take in order to meet professional and legal guidelines (e.g., gathering validity evidence, investigating alternative methods with less adverse impact). Not for the first time, speakers emphasized that the Uniform Guidelines on Employee Selection Procedures need to be updated.
Speakers included EEOC staff members, plaintiffs in two of the more discussed recent cases (EEOC v. Dial Corp. and EEOC v. Ford Motor Co.), attorneys, and professionals in the field of assessment, including James Outtz and Kathleen Lundquist, who have frequently been retained as expert witnesses in employment discrimination cases.
Said Richard Tonowski from the EEOC:
"A mature technology of testing promises readily-available methods that serve as a check against both traditional forms of discrimination as well as the workings of unconscious bias. If that is the promise, then the threat comes from institutionalizing technical problems not yet fully addressed, the undermining of equal employment opportunity under the guise of sound selection practice, and the unintended introduction of new problems that will require resolution to safeguard test-takers and test-users."
Personality testing was mentioned prominently as an increasingly common practice among employers, but it appears (contrary to my earlier fears) that the focus was on those tests that could be considered "medical tests" under the ADA (such as the original MMPI), which leaves out many products, including the HPI, 16PF, and PCI.
Hopefully I'll have the slides from the presentation to post soon. In the meantime, check out this excellent summary from an attendee, and you can view the EEOC press release here. Statements of the speakers, along with their bios, can be found here, and it looks like the meeting transcript will be available there as well.
Wednesday, May 16, 2007
Targeted job board: I/O Careers
Membership is limited to "individuals who are serious about the field of I/O Psychology" but during the beta of the site, posting jobs is FREE. (After that it's $250 a pop).
Check it out.
Tuesday, May 15, 2007
Court approves Googling employee...sort of

As part of the appeal of MSPB's decision, the individual claimed that his "guaranteed right to fundamental fairness" was violated when the deciding official Googled his name and came across information regarding his work history (essentially termination) with a previous employer. The individual claimed this unduly influenced the decision to remove him.
The court disagreed. The three-judge panel found that because information regarding the previous job loss did not influence the official's decision to remove the appellant, it did not show prejudice. Additionally, they found no due process violation.
The unanswered question here is what would have happened if the prior work history information HAD influenced the termination decision. And what if it had been a hiring or promotion situation, rather than a termination where voluminous information about bad behavior already existed? It will be interesting to see how this area of law evolves.
The appeals court decision (nonprecedential) is here.
A good string of articles about using the Internet to gather background check information can be found here.
Saturday, May 12, 2007
Recruiting Trends Survey from DirectEmployers Association
Data gathered from 47 companies indicated:
- 55% of hires were made from online sources (up 8% from last year).
- Employee referrals were the largest single source (21% of hires), followed closely by the organization's website and general job boards.
- Employee referrals also generated the highest quality candidates (82% rated favorable), but niche job boards and search firms tied for second, with campus recruiting a very close third. General job boards were rated favorable by only 22% of respondents.
- The largest percentage, by far, of recruitment/advertising budget went to general job boards (34%). Referrals, the source of the highest quality candidates, received 6% of the budget.
- Putting these numbers together, the best source value (cost per hire) belonged, by a large margin, to referrals, followed by the organization's web site and, perhaps surprisingly, social networking technology. (A rough sketch of the arithmetic follows this list.)
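The source-value arithmetic is simple enough to sketch out. The budget shares below echo the report's figures; the totals and the job boards' share of hires are made-up round numbers for illustration:

```python
# Back-of-envelope cost-per-hire by source. Budget shares (34% and 6%)
# and referrals' hire share (21%) come from the report; the totals and
# job boards' hire share are invented round numbers for illustration.
budget_total = 100_000   # hypothetical annual recruiting spend
hires_total = 100        # hypothetical number of hires

share = {                # source: (share of hires, share of budget)
    "general job boards": (0.19, 0.34),   # hire share assumed
    "employee referrals": (0.21, 0.06),
}

for source, (hire_pct, budget_pct) in share.items():
    cost_per_hire = (budget_pct * budget_total) / (hire_pct * hires_total)
    print(f"{source}: ${cost_per_hire:,.0f} per hire")
# Referrals come out far cheaper per hire -- exactly the report's point.
```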
Comments and follow-up conversations indicated a growing frustration with general job boards (especially for IT jobs) as well as a growing reliance on sources of passive candidates, such as social networks, blogs, and search engine optimization.
Read the full report for a much more detailed analysis and insights. Thanks to Rocket-Hire for making this available.
Thursday, May 10, 2007
The Daily Show does Video Resumes
So it was with great amusement that I saw a piece yesterday on video resumes...check it out...
Wednesday, May 09, 2007
Job ads of the future?

Looking for ways to snazz up your postings?
Then read this post over at jobs2web. Check out the graphic.
How close are your postings and/or career portals to this? Are they even in the same ballpark?
How hard would it be to add things like:
- links to a webinar/job preview video
- RSS feed
- subscribe to similar jobs
Answer: not hard. Let's hurry up and get there!
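On the RSS item in that list, it really is not hard--here's a minimal sketch that turns a list of postings into a valid RSS 2.0 feed using only Python's standard library (the URLs and field names are placeholders):

```python
# Minimal sketch: generate an RSS 2.0 feed of job postings with the
# standard library. URLs and field names are placeholders.
import xml.etree.ElementTree as ET

def jobs_rss(jobs):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Current Openings"
    ET.SubElement(channel, "link").text = "https://example.org/jobs"
    ET.SubElement(channel, "description").text = "New job postings"
    for job in jobs:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = job["title"]
        ET.SubElement(item, "link").text = job["url"]
        ET.SubElement(item, "description").text = job["summary"]
    return ET.tostring(rss, encoding="unicode")

print(jobs_rss([{"title": "Staff Attorney",
                 "url": "https://example.org/jobs/123",
                 "summary": "Small firm, sane billable hours."}]))
```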
Tuesday, May 08, 2007
2007 SIOP Conference: Highlights, Part 2
Legal risks and defensibility factors for employee selection procedures
Posthuma, Roehling, and Campion analyzed nearly 600 federal district court cases and came up with some very interesting results:
- Employers are most likely to win (by far) when defending tests of math or mechanical ability. Employers also fare well when defending assessments of employment history and interviews.
- Employers did worst when defending physical ability tests and medical examinations. Tests of verbal ability and job knowledge were also more likely to result in a plaintiff win.
Predicting Internet job search behavior and turnover
Using longitudinal survey data from a sample of 110 nurses in Texas, Posthuma et al. found (among other things) that Internet job search behavior was related to turnover--folks weren't just surfing for fun. This suggests organizations need to pay close attention to job searching behavior among employees; not necessarily to curtail it, but to figure out why high performers want to leave.
Gender differences in career choice influences
After analyzing survey data from nearly 1,400 fourth-year medical students from two U.S. schools, Behrend et al. found a gender difference in career preferences: specifically, female medical students valued "opportunities to provide comprehensive care" when choosing a specialty much more than men did. This is consistent with other work that has shown women to be more "relationship-oriented" than men when it comes to choosing a career.
Portraying an organization's culture through properties of a recruitment website
In this study of 278 undergraduate students, Kroustalis and Meade found that inclusion of pictures on a website that were intended to portray a certain organizational culture did so--but only for certain cultural characteristics. Specifically, pictures that implied a culture of either innovation or diversity had the intended effect--but pictures representing a team orientation did not. Interestingly, "employee testimonials" designed to emphasize these cultural aspects failed to do so for any of the three aspects studied. Finally, individuals who perceived a greater fit between themselves and the organization (in terms of the three cultural aspects) reported being more attracted to the organization.
Recruiting solutions for adverse impact: Race differences in organizational attraction
Last but definitely not least, Lyon and Newman gathered data from nearly 600 university students on their reactions to 40 hypothetical job postings...and came away with some very interesting results. For example:
- Conscientious individuals were more likely to apply to postings that explicitly stated a preference for conscientious applicants.
- Conscientious individuals were more likely to apply to postings that described the company as results-oriented.
- Black applicants with higher cognitive ability were more likely to respond to ads seeking conscientious individuals while White applicants with higher cognitive ability were less likely to do so.
- When a company was described as innovative, Black applicants high on conscientiousness were more likely to apply; this was not the case for White applicants.