It's always a pleasure welcoming other blogs to the fold. Particularly when they're written by intelligent people with a perspective that should be shared.
So I'm very happy to say that Dr. Dennis Doverspike of the University of Akron and his colleagues have started HRLitehouse. In their own words,
"In deciding to create this blog, our goal was to do our small part to attempt to contribute to the ongoing conversation on the management of people at work. In order to do so, we hope to share our views through short reports and commentary on critical issues and current research in the areas of human resource management, personnel recruitment and testing, and organizational business consulting and coaching."
The I/O Psychology program at the University of Akron is one of the top programs in the country, consistently ranked in the top 10. It's refreshing to see an academic institution take the plunge into this form of knowledge sharing--and it will probably generate a decent amount of social networking for their students as well (see I/O careers for an example of what can be done).
So welcome! Here's the feed.
Thursday, July 17, 2008
Broadband adoption in U.S.: A mixed bag
A new study from the Pew Internet and American Life Project shows that 55% of all Americans now have a broadband connection at home, up from 47% in early 2007. Poorer Americans, however, saw no increase during this period: their access rates remain under 50%, compared to more than 80% for upper-income Americans.
Why does this matter? In this age of bandwidth-greedy job preview videos, Java-filled interactive career websites, and realistic job assessments, a high-speed connection is increasingly a necessity. The good news is that more than half of Americans can engage in these experiences at home. The bad news? Not only is access to some of these sites likely to have an adverse impact on certain groups (see below), it will also reduce an organization's ability to draw an applicant pool containing the most diverse backgrounds.
With that in mind, consider these findings:
* While 70% of those age 18-29 reported having broadband at home, only 50% of those age 50-64 did.
* While 57% of White respondents had broadband at home, only 43% of Black respondents did (let's see, four-fifths of 57% is...--see the quick check after this list). On a more positive note, 56% of English-speaking Hispanic respondents had this access.
* 79% of those with at least a college education had home broadband access; only 40% of high school grads did.
* 60% of suburban respondents and 57% of urban respondents had this access; only 38% of rural respondents did.
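For anyone rusty on the math behind that four-fifths aside, here's a back-of-the-envelope sketch of the 80% rule check, treating the Pew home-broadband rates above as stand-in selection rates (an illustration of the calculation only, not anything from the Pew report):

```python
# Four-fifths (80%) rule check, using the Pew broadband rates as
# stand-in "selection rates" for an online-only hiring process.
white_rate = 0.57   # White respondents with home broadband
black_rate = 0.43   # Black respondents with home broadband

impact_ratio = black_rate / white_rate   # minority rate / majority rate
threshold = 0.80 * white_rate            # four-fifths of the majority rate

print(f"Impact ratio: {impact_ratio:.2f}")     # ~0.75, below the 0.80 guideline
print(f"Four-fifths of 57%: {threshold:.1%}")  # ~45.6%, above the 43% Black rate
```

In other words, an online-only process with these access rates would flunk the four-fifths rule of thumb.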
"But people can always go to a library," is a response I often hear. That may be true, although not everyone lives within easy access of a library. But libraries aren't open 24/7. And many times they're busy during peak hours. And many aren't exactly a Starbucks cafe. Do you really want to create these barriers?
So what can we do about it? Here are some ideas:
* Make sure your careers site has a low-bandwidth alternative (a toy sketch follows this list)
* Consider offering a staffed on-site computer center that operates during off-peak hours (e.g., 6-8am, 5-7pm, weekends)
* Think long and hard about whether you're adopting bandwidth-hogging features because they're there or because they'll actually add value.
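To make the first idea concrete, here's a toy sketch of a careers page with a text-only fallback (entirely hypothetical--the Flask app, route, and template names are my own invention for illustration):

```python
# Hypothetical sketch: offer a text-only fallback next to the rich careers page.
from flask import Flask, request, render_template

app = Flask(__name__)

@app.route("/careers")
def careers():
    # Candidates opt in via a prominent "low-bandwidth version" link
    # on the landing page that appends ?view=lowband.
    if request.args.get("view") == "lowband":
        return render_template("careers_lowband.html")  # plain HTML, no video or applets
    return render_template("careers.html")              # full multimedia experience

if __name__ == "__main__":
    app.run()
```

The key design point is that the fallback is a first-class page, not an afterthought buried three clicks deep.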
For more details, check out the report.
Tuesday, July 15, 2008
Silver tsunami or gradual graying?
There are strong opinions about the upcoming "retirement wave" of the Baby Boom generation--some think employers are in big trouble because they will be forced to replace a huge number of workers from an inadequate talent supply, while others claim it's an overreaction. Some recent news adds credence to the latter viewpoint, at least in one sector.
According to a recent article published by the Partnership for Public Service, in 2006 and 2007 there were fewer retirements in the federal government than projected. Not only that, but the Office of Personnel Management (OPM) adjusted their retirement projections downward.
Why the change? It's the economy, stupid--or at least that's the theory. The price of gas, the housing market crash, and the reduction in pension value have all contributed to people hanging in there a little longer.
One important caveat: it depends on the job. And here's where we still have reason for concern:
"The ones most able to ignore the current economic problems are those who possess specialized and marketable skills, or who are in the higher-income brackets and can afford to weather the downturn, experts say."
This includes people in high-demand occupations like IT, as well as those with substantial management experience.
It's also quite possible we just haven't seen the tip of the tsunami yet. But it's looking more and more likely that if economic conditions persist, we may actually experience more of a gradual graying.
Wednesday, July 09, 2008
Detecting liars is not a skill
One of the most popular pieces of folk wisdom is that some people are better at detecting liars than others. When it comes to selection, some people think they can tell when a candidate is lying about their history or competencies. And if the organization conducts background checks and/or polygraphs, detecting deception becomes particularly important.
Yet according to a new meta-analysis published in the most recent issue of Psychological Bulletin, we may all be pretty much the same when it comes to lie detection. In the words of the authors:
"Although researchers have suggested that people differ in the ability to detect lies, psychometric analyses of 247 samples reveal that these ability differences are minute."
Where there do appear to be differences is in being able to successfully tell a lie--some people are plain better at it than others.
The article is followed by two commentaries that are critical of this study and a reply by the authors.
What about people's overall ability to detect a lie? Check out this study from 2006 by the same authors. Short answer: we're not very good.
Monday, July 07, 2008
Grading the EEOC
The Government Accountability Office (GAO) recently released a report critical of the U.S. Equal Employment Opportunity Commission's (EEOC) ability to handle its private-sector workload.
Amid the critique are some gems for those of you itching to know more about the EEOC...
* Wondered where the EEOC's offices are? Check out page 18
* Curious how a complaint makes its way through the process? Check out page 20
* Need another workforce planning model? (c'mon, you know you do) Check out page 24
There are lots of other great metrics in here, and I'm sure it's all the buzz at the EEOC.
Thursday, July 03, 2008
NEOGOV acquires Sigma
Not exactly breaking news, but for any of you out there that didn't already know, NEOGOV, a significant player in the public sector ATS space, has acquired Sigma Data Systems, a venerable ATS provider that had been purchased not all that long ago by CPS.
Sigma's strength has always been its data analysis capabilities, and presumably this will be folded into NEOGOV's product. It's an interesting move by NEOGOV, and we'll see what impact this has on its rivalry with JobAps, the other major ATS vendor that focuses on the public sector.
You can read more about it here.
Tuesday, July 01, 2008
A review of situational judgment tests
In the latest issue of Personnel Review, Dr. Filip Lievens and colleagues provide an empirical review of situational judgment tests (SJTs), focusing on studies from 1990-2007.
SJTs, sometimes referred to as low-fidelity simulations, present test takers with a scenario and ask them to select the appropriate response. Candidates may be asked to select what they "should" do, what they "would" do, the best response, the worst response, or some combination of the above. Here's an example:
You have been assigned lead responsibility for two weeks in the absence of your supervisor. On your first day in this role, one of your new direct reports comes into your office and complains that they were sexually harassed by the security guard when they entered the building. They ask that the situation be kept confidential. What would be your first action in response to this situation?
1. Contact the security guard and conduct an interview to obtain all the facts.
2. Assure the direct report you will look into the situation but cannot guarantee confidentiality.
3. Contact your supervisor to obtain instruction on next steps.
4. Conduct informal interviews with your other direct reports to determine if they have been harassed.
SJTs have some great benefits, and this article points them out. First, they can be valid predictors of performance--particularly when based on job analysis. Second, they show incremental validity beyond cognitive ability and personality tests, making them a valuable addition. Third, group differences tend to be reduced compared to ability tests, particularly when the cognitive load is low. Fourth, applicant perceptions of SJTs tend to be positive. And fifth, SJTs allow you to test large candidate groups simultaneously. I would add that they allow for all kinds of scoring possibilities as well (e.g., +1 for correct response, -1 for incorrect).
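To make the scoring point concrete, here's a minimal sketch of two common SJT scoring schemes (the answer keys below are invented for illustration, not drawn from the article):

```python
# Minimal sketch of two SJT scoring schemes (keys invented for illustration).
responses = [2, 3, 1, 2]        # candidate's chosen option per scenario
best_key  = [2, 3, 2, 2]        # SME-keyed "best" response per scenario
worst_key = [4, 1, 3, 4]        # SME-keyed "worst" response per scenario

# Scheme 1: +1 for picking the best response, 0 otherwise
score_simple = sum(r == b for r, b in zip(responses, best_key))

# Scheme 2: +1 for picking the best, -1 for picking the worst, 0 otherwise
score_penalty = sum((r == b) - (r == w)
                    for r, b, w in zip(responses, best_key, worst_key))

print(score_simple, score_penalty)   # 3 and 3 for this candidate
```

Weighted variants (e.g., partial credit based on SME consensus) drop into the same structure.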
SJTs aren't without drawbacks--two major ones to be exact. The first is they can be susceptible to faking, practice, and coaching effects--although how they're built plays a large role in how big these effects are. The second is that we don't always know exactly what SJTs are measuring--is it job knowledge? Personality? Cognitive ability? The authors point out that more research is needed.
Overall, a very good review of a test method that every assessment professional should have in their tool belt. You can read an in-press version here.
Wednesday, June 25, 2008
Supreme Court clarifies ADEA burden shifting
On June 19th the U.S. Supreme Court made several employment-related decisions. Of most interest for us is their decision in Meacham v. Knolls.
The case involved workers over 40 who were suing over their layoffs. They alleged they lost their jobs due to age discrimination, in violation of the Age Discrimination in Employment Act (ADEA). Although they claimed both disparate treatment and disparate impact, the important issue here is the latter--employment decisions that may not intentionally discriminate but have that effect.
How the court ruled is closely tied to its 2005 decision in Smith v. City of Jackson, in which it held that adverse impact cases could be brought under the ADEA, but employers could prevail if they could show (per the language of the ADEA) that the employment decision was based on a "reasonable factor other than age" (RFOA).
So what was the decision? The court made it clear that the employer in these cases bears both the burden of production and the burden of persuasion that the employment decisions were based on a RFOA. This is similar to other adverse impact discrimination cases, such as those brought under Title VII, where an employer must show their practice was "job related and consistent with business necessity."
So what does this mean? It doesn't create a new requirement. It reinforces that all employment decisions--hiring, firing, and everything in between--should be based on logical, non-discriminatory reasons. The fact that the employer may face a slightly easier hurdle in ADEA disparate impact cases compared to, say, race or gender cases, is practically insignificant.
Important note: the plaintiffs in this case provided expert testimony that ratings on "flexibility" and "criticality" both left managers the most discretion and were tied most strongly to the layoff outcomes. Words like these are often invoked in age discrimination cases (a jury can easily see how they might be a proxy for "young"), and employers are wise to consider carefully, in hiring and firing situations, whether rating factors are tied to benchmarks and can be shown to be important for success on the job.
Thursday, June 19, 2008
Scare applicants into applying
What if instead of convincing applicants to apply based on your brand, your benefits, etc., you scared them? That appears to be the strategy of North Carolina's Office of State Personnel.
Their latest recruiting video is called "We are here"--and no, it's not a documentary of aliens trapped on earth, although I'll forgive you for mistaking it for one.
It is, bar none, the strangest professionally made recruiting video I've ever seen. I really don't know how to describe it, so do this--go check it out and let me know what you think. It's one of those things you'll want to show your co-workers.
(by the way, I do have to give OSP kudos for their other profiled video, which describes the development of a SAS-based knowledge management system called NC WORKS).
Wednesday, June 18, 2008
Staffing.org's 2008 benchmark report
Staffing.org just released its 2008 Recruiting Metrics and Performance Benchmark Report. From all accounts it's a good source of data, gathered from over 1,000 organizations. It's also $400.
Fortunately, they're releasing details in dribbles through their newsletter. For example:
- Employee referrals are still the most popular recruiting source, followed closely by large job boards. Some of the less used sources include mass media and the military.
- Organizations are increasingly using combination structures--in other words, part of their staffing function is centralized, part is decentralized. The numbers vary greatly depending on industry, with transportation being much more centralized while education is much less so.
- Cost-per-hire varied wildly, from $2,000 (retailing, hospitality) to $16,000 (pharma/biotech).
- Same goes for time-to-start, with retailing at 4 weeks, government at 12 weeks, and an average around 7-8 weeks (a quick sketch of how both metrics are computed follows this list).
- 11% of employers report poor performance among new hires; 20% report superior performance.
- Competition for talent and candidate quality were the two most important issues reported.
- More than 80% of respondents have adopted an ATS but few have a "talent management suite."
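For reference, the two headline metrics above are simple ratios; here's a minimal sketch with made-up figures (definitions vary across organizations, so check how Staffing.org operationalizes them before benchmarking against the report):

```python
from datetime import date

# Made-up example figures -- not from the report.
total_recruiting_costs = 120_000   # advertising, agency fees, recruiter time, etc.
hires = 15
cost_per_hire = total_recruiting_costs / hires   # $8,000 here

# Time-to-start: requisition opened to the new hire's first day, in weeks.
opened, started = date(2008, 3, 3), date(2008, 4, 28)
time_to_start_weeks = (started - opened).days / 7   # 8.0 weeks

print(f"Cost per hire: ${cost_per_hire:,.0f}; time-to-start: {time_to_start_weeks:.1f} weeks")
```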
You can sign up for their newsletter here. It's one of the shorter, more digestible ones.
Monday, June 16, 2008
2008 IPMAAC Conference: Presentations
With memories of last week's IPMAAC conference fresh in my head (and what a great conference it was!), I thought I would mention that presentation slides have already started to appear at the website.
Here's a sample of what's already up:
Police recruiting and retention: "It's Showtime"
Implementing an assessment program for executive candidates
And that's just the tip of the iceberg. Expect many more to pop up in the next few weeks.
Interested in becoming a member? Go here.
Thursday, June 12, 2008
Unproctored internet testing: Safe for some tests?
One of the biggest trends in personnel assessment is the movement toward online testing. Many organizations are experimenting with so-called unproctored Internet testing (UIT), where candidates are allowed to take the exams whenever, and wherever, they want.
Benefits? Extremely convenient for the candidate. Fewer administrative resources needed by the employer.
Costs? Bye-bye exam security, hello cheating opportunities. Not only is your test out for everyone to see, but you have no real way of knowing (sans biometric verification) who is taking the test.
Some organizations have decided the benefits outweigh the risks, and a new study in the June 2008 issue of the International Journal of Selection and Assessment may provide support for their position.
In it, the authors looked at over 800 applicants from nine European countries who took a test of perceptual speed in an unproctored setting, then followed this up with a proctored parallel version. Results? Not only was there no evidence of cheating, they found the opposite effect--people did better in the proctored setting.
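How would you check for cheating in a design like this? One simple approach is a paired comparison of each person's unproctored and proctored scores: a large unproctored advantage is a red flag. Here's a minimal sketch with invented data (the study's actual analyses were more elaborate):

```python
# Paired comparison of unproctored vs. proctored scores (invented data).
from statistics import mean, stdev

unproctored = [24, 31, 19, 27, 22, 29, 25, 30]
proctored   = [26, 33, 20, 28, 24, 30, 27, 31]   # proctored parallel form

diffs = [u - p for u, p in zip(unproctored, proctored)]
d_mean, d_sd = mean(diffs), stdev(diffs)
t = d_mean / (d_sd / len(diffs) ** 0.5)   # paired t statistic

# A negative mean difference means people did *better* proctored --
# the pattern the study reported, i.e., no sign of cheating.
print(f"Mean difference: {d_mean:.2f}, paired t: {t:.2f}")
```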
Now before everyone throws out their proctored exams, note that this is a type of test that might be hard to cheat on--at least in one way. Because this is a perceptual speed test, there are no "right" answers that can be looked up. It also required very quick responses. So the only way to cheat would be to have someone take the test for you. Implication: it may make more sense to use certain UITs than others.
This topic is a source of much debate in the assessment community, and there is by no means consensus on the right way to go. But studies like this help!
Take a deep breath, because there's a lot more in this issue:
- The preliminary employment interview as a predictor of assessment center outcomes (fascinating look at how the AC may only make sense for mid-range interview scorers)
- A comparison of the common-item and random-groups equating designs using empirical data (for you IRT fans out there)
- The influence of external recruitment practices on job search practices across domestic labor markets: A comparison of the United States and China
- Beneath the surface: Uncovering the relationship between extraversion and organizational citizenship behavior through a facet approach (a more nuanced look at the relationship shows extraversion can predict OCBs)
- Comparing personality test formats and warnings: Effects on criterion-related validity and test-taker reactions (another good one...personality test added predictive validity beyond ability test but no validity difference between forced-choice and Likert scales, nor between warning and no-warning conditions; forced-choice and warnings may produce negative candidate reactions)
- Applicant selection expectations: Validating a multidimensional measure in the military (describes development of a new measure of applicant perception of the selection process)
- Selecting for creativity and innovation: The relationship between the innovation potential indicator and the team selection inventory
Tuesday, June 10, 2008
EEOC informal discussion letters
If you're an HR professional in the U.S., chances are you've been to the EEOC's webpage many times. You may even subscribe to their feed. But do you know about their informal discussion letters?
These memos are written by EEOC legal staff, and while they are not "official opinions" of the Commission, they offer insight into several important issues. Consider some of the recent letter titles:
- Background checks of peace officers and the ADA (certain documents may be evaluated post-offer)
- Title VII and ADEA: Job Advertisements (you can "encourage" certain groups to apply, but "seeking" them is probably not a good idea; "journeyman" probably okay)
- ADA: Disability-Related Inquiries; Hiring (screening people out based on medical information must be shown to be job-related and consistent with business necessity)
- Title VII: Use of Conviction Records in Hiring (person convicted for auto-stripping could probably be rightfully denied a tow truck license or job)
- Americans with Disabilities Act: Periodic Testing (questionable whether periodic medical exams of all city bus drivers would be legal)
Thursday, June 05, 2008
GINA signed into law...did anybody notice?
It's not often that we have a new federal statute dealing with employment discrimination. So I was a little surprised that the recent passage and signing into law (on May 21) of the Genetic Information Nondiscrimination Act (GINA) hasn't gotten more press.
Perhaps it's because employers don't see it as a big issue--at least not yet. This is one of the few instances of the law being proactive. Many employers may not see how this relates to them, but consider the details:
- The law prohibits employers (generally as defined under the CRA of 1964) from failing to hire, or terminating, someone because of genetic information. This is what most people think of, and the language is similar to other statutes that prohibit discrimination (e.g., it also includes compensation discrimination).
- The law also prohibits employers from considering genetic test information from family members of the applicant/employee.
- It does NOT prohibit practices that result in an adverse impact based on genetic information. The law does, however, specify that this will be reviewed 6 years from when the law goes into effect, which is November 21, 2009.
- Finally, it prohibits discrimination based on "the manifestation of a disease or disorder in family members of such individual." This may be the least known aspect of the law, similar to the ADA provision that prohibits employers from considering a record or perception of a disability.
Remedies in most situations track those under Title VII, and enforcement of the law will be overseen by the EEOC.
For one of the better summaries, check this out.
Monday, June 02, 2008
Taking a look at VisualCV
Last week I spoke with a rep over at VisualCV. She had been nice enough to put together an employer site for me, so we chatted about the site's capabilities as well as some other details about the company.
What is VisualCV? As you would suspect from the name, the site offers job seekers the ability to create a visually appealing version of their resume/CV.
Here's an example of what a job seeker's VisualCV might look like. As you'll see, users have the option to add pictures, videos, and files, and I can tell you from playing on the employer side of things, it's a simple point-and-click affair.
The website has been open since February and according to the rep already has around 10,000 resumes and profiles of about 350 employers. The fact that the service is free (for now) should help raise those numbers.
In terms of search capability, right now you can only search for people by name. So this would be handy if you already knew someone had a VisualCV, but not much help if you're trying to generate names. The plan is to expand search capability in the coming months.
In terms of contacting individuals, there is no charge for doing so (a big difference from other databases like LinkedIn) unless you're doing something like an e-mail blast.
All in all, definitely worth checking out. And if you'd like to read more, I'm certainly not the first to post about 'em. For more information, check out Joel's and Amybeth's posts.
Thursday, May 29, 2008
Predicting turnover

According to many surveys (e.g., salary.com's recent one), a familiar set of external factors is what people report as primary motivators driving them to change employers.
But these are all factors outside of the employee. What about aspects of employees themselves that might contribute to turnover? We know that people are changing jobs more frequently these days (every 2-3 years in the U.S.), and there seems to be a persistent dissatisfaction among the Gen Xers with their careers, but what about someone's personality? Might there be individual differences between people when it comes to changing jobs?
You bet, according to a new study published in the Summer 2008 issue of Personnel Psychology. After meta-analyzing 86 studies, author Ryan Zimmerman found that personality factors, particularly emotional stability and agreeableness, play a big role in predicting turnover. Emotional stability best predicted intent to quit, while agreeableness best predicted actual turnover.
In fact, personality traits predicted turnover better than did non-self-report measures such as job complexity and job characteristics.
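If you're curious what using personality to predict turnover might look like in your own data, here's a bare-bones hypothetical sketch using scikit-learn (the data and variable names are invented, and this is not the meta-analytic method, which aggregates correlations across studies):

```python
# Hypothetical sketch: predicting turnover from personality scores.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: emotional stability, agreeableness (standardized); invented data.
X = np.array([[ 0.5,  0.2], [-1.1, -0.8], [ 0.9,  1.0],
              [-0.3, -1.2], [ 1.2,  0.4], [-0.7,  0.1]])
y = np.array([0, 1, 0, 1, 0, 1])   # 1 = left the organization

model = LogisticRegression().fit(X, y)
print(model.coef_)                                 # direction/strength of each trait
print(model.predict_proba([[-0.5, -0.5]])[:, 1])   # estimated quit probability
```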
Implications? Many initiatives designed to reduce turnover may disappoint because it's not the job, it's the person. The next time you design an exit interview or turnover study, make sure to add this reason for why the person left: It had nothing to do with the job, it was just me.
This also provides more support for using personality tests to predict important outcomes.
...
The other study in this issue we should look at provides some support for all you O*NET fans out there. You know...O*NET? The replacement for the Dictionary of Occupational Titles? Developed by the Department of Labor? A fount of job analysis knowledge? If you don't know it, you should.
Anyway, in this study the authors used O*NET data to estimate literacy requirements across a wide variety of occupations and compared them to scores on the National Adult Literacy Survey (NALS). Results? O*NET did well--quite well in fact, with correlations around .80.
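That .80 figure is just a Pearson correlation computed across occupations; here's a minimal sketch of the calculation with invented numbers:

```python
# Pearson correlation between O*NET-derived literacy requirements and
# NALS literacy scores across occupations (numbers invented).
from statistics import mean

onet = [3.1, 4.2, 2.5, 5.0, 3.8, 4.6]   # O*NET literacy requirement ratings
nals = [278, 310, 251, 334, 295, 322]   # NALS scores for the same occupations

mx, my = mean(onet), mean(nals)
cov = sum((x - mx) * (y - my) for x, y in zip(onet, nals))
var_x = sum((x - mx) ** 2 for x in onet)
var_y = sum((y - my) ** 2 for y in nals)

r = cov / (var_x * var_y) ** 0.5
print(f"r = {r:.2f}")   # the study reported correlations around .80
```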
What does this mean? It means that occupational requirements listed in O*NET just got a big boost in terms of their validity. When it comes to job analysis, don't leave O*NET out.
Tuesday, May 27, 2008
Another member of the blog family
I consider this blog to be part of a fairly small family (with a few notable exceptions, e.g., Selection Matters) in that I focus on trying to put personnel psychology research into layperson terms. But I think I may have found a lost member of my blogging family.
The Association of Test Publishers (ATP) has an I-O Division, and lo and behold, they have a blog! And it's been up since January! I feel so...out of touch.
Anyway, check out some of their recent posts:
- The Validity-Diversity Dilemma (yes, we both blogged about this)
- Input Needed on "Model Guidelines" Revision
- Economic Study of Impact of Pre-Employment Assessment
So, belated welcome to the blogosphere! Like what you see? Pick up the feed.
Friday, May 23, 2008
How good are you at test accommodation?
How good is your organization at accommodating individuals with disabilities when test-time rolls around?
A recent article in Diversity Executive magazine highlights the work of Certiport, a software certification outfit, and the different test accommodations they offer, including:
- Voice recognition software
- Test assistants/surrogates
- Separate, larger rooms
- Extended test times
The article also points out some disturbing facts, like the 70% unemployment rate of individuals with disabilities and the fact that 2 out of 3 of these unemployed individuals would like to work.
Makes me wonder (for the millionth time) about the true nature of the looming "talent shortage"...
Wednesday, May 21, 2008
New blogs to watch
Here are a couple of new blogs to head over and check out:
HR Recruiting Alert; here are a few recent articles:
- Recruitment strategies for older workers
- EEOC warns of bias in 2 common hiring practices
- Who won this case: Objective interviewing or biased process?
The other one to check out is HR-Worldview.
Like it? Here's the feed.
Now if I could just add 5 hours to my day to read all this great stuff!
Monday, May 19, 2008
B = f(P, E)
One of the most famous axioms in social psychology is what's sometimes called "Lewin's equation" (after the famous psychologist Kurt Lewin): behavior is a function of both the person and the environment. This equation is good to keep in mind when looking at all kinds of human behavior, including recruitment and assessment.
Research presented in the May 2008 issue of the Journal of Applied Social Psychology addresses this equation. Let's take a look at it and see if it helps us answer an age-old question: What's more important--the observer or what's being observed?
Tell me if this situation sounds familiar. A hiring manager insists on hiring someone based on something they saw in the person's resume (e.g., the candidate graduated from a particular college), even though the person did not do well on a structured, validated assessment. The first study shows that HR is not immune to this phenomenon. In it, HR managers were presented with two types of information about a candidate: preliminary information (like a resume) and performance on an assessment center. The managers were then asked to rate the candidate. Results? Managers were unable to exclude the preliminary information, even though they had better information (the assessment center results) in front of them.
The second article looks at the legitimacy perceptions of promotion decisions and how they relate to information on deservedness (candidate performance) and entitlement (affirmative action). Participants felt that both deservedness and entitlement were related to legitimacy, but there was a gender effect--female participants felt increased resentment when the male candidate was promoted.
The third article is a fascinating take on how people perceive discrimination. Specifically, the authors looked at ambiguous situations and the impact of how "prototypical" the person doing the discriminating is. What they found was that the amount of control perceivers felt they had over discrimination in their lives moderated the influence of the prototype effect. In other words, whether a white male (the prototype) was seen as acting in a discriminatory fashion depended a great deal on the perceiver. As in research on stress, control was found to have a significant effect on perceptions.
So given these three articles, what's more important--the observer or what's being observed? The research above gives us a clear answer, and one that validates the wisdom of Kurt Lewin: both.