Tuesday, October 31, 2006

Unusual job advertisements south of the border

While anti-discrimination laws in the U.S. are visible and enforced, often at significant cost to employers, in Mexico there have been examples of U.S. companies advertising jobs in a blatantly discriminatory manner.

Consider:

- An automotive supplier advertised for a "female ... aged 20 to 28 ... preferably single ... with excellent presentation."

- A Chicago law firm advertised for a real estate attorney -- a male one. According to a firm recruiter in Mexico, clients prefer male attorneys.

Although Mexico has anti-discrimination laws, they are rarely enforced, and lawsuits are long and expensive propositions. In addition, the scarcity of jobs serves to discourage complaints.

Reminds me of a job advertisement from Hong Kong featured in Cascio's book:

"Obedient Young Secretary
Very obedient young woman required by American Director for position as Secretary/Personal Assistant. Must be attractive and eager to submit to authority, have good typing and filing skills and be free to travel. Knowledge of Mandarin an advantage. Most important, she should enjoy following orders without question and cheerfully accept directions. Send handwritten resume on unlined paper and recent photo to..."


Wonder which job analysis procedure they were using.

Monday, October 30, 2006

Rocket-Hire annual survey results

Rocket-Hire conducts an annual survey on the usage of web-based screening and assessment tools. We get a preview of this year's results in an article Dr. Charles Handler wrote for ERE.

Respondents:
136 "people professionals" who read ERE--a mixture of recruiters, HR execs, and hiring managers from a variety of companies.

Some interesting results:
1. There was a direct relationship between company size and use of applicant tracking systems: 56% of smaller companies used them, compared with 100% of companies with more than 5,000 employees.

2. Compared with 2005, there was a drastic increase in the use of personality measures (65% usage) and a sizable jump for "measures of fit" (53%). I would be curious to see which instruments are being used.

3. 42% of respondents used some measure of cognitive ability--this seems very high to me and I'd be curious to see which organizations were more likely to use these.

4. Only 21% reported using simulations and 16% reported using biodata.

5. The job level least likely to involve on-line assessment was Executive/Vice President. (Just interviews, I'm assuming?)

6. The type of job most likely to involve on-line assessment was Managerial/Supervisory; Professional and Customer Service jobs were also frequently mentioned.

7. The types of job least likely to involve on-line assessment included Retail, Manufacturing/Labor, and Consulting/Advising. Seems like there's a lot of opportunity there.

8. Only 30% of assessment users collect metrics regarding the quality of these tools but those that do are MUCH more likely to see them as having a positive impact on their organization (89% vs. 50%).

9. "Qualifications" (experience, education, etc.) are by far the most popular screening/assessment tool under consideration for future use. Another reason why having
good information out there about the use of T&E measures is so important.

10. The most common obstacles mentioned to the adoption of on-line screening and assessment had to do with lack of faith in ROI. Lots of opportunity here for assessment professionals to
clearly articulate the value of valid selection tools and tie their usage to business goals.

Full results will be available mid-November and can be obtained by e-mailing Rocket-Hire at info@rocket-hire.com.


10 tests to consider

An article on Inc.com describes 10 tests (including cost from select vendors) that are, in their words, "extensively validated, highly respected."

The list includes several that I would also recommend folks consider, including the Watson-Glaser and Wonderlic for cognitive ability and the NEO, 16PF, HPI, and OPQ for personality. (By the way, if you're going to invent a test, make sure you consider the acronym; in fact you may want to come up with the acronym first)

But, as pointed out on the EASI-HR blog, you need to think carefully about the situation before running out and buying one. Do you have documentation showing the competency/KSA measured by the test is required for the job? What are you going to do about pass points? These are just some of the questions that need to be addressed.

By no means should that dissuade the use of tests. But like any other high-stakes purchasing decision, conduct a risk assessment and cost/benefit analysis before making the leap. And use resources like
Buros test reviews to check out what the experts have to say.

Sunday, October 29, 2006

Introducing Selection Search

Thanks to Google Co-op there's a new addition to this page.

It's called Selection Search and it's located on the right navigation bar.

What is it? It's a search engine that searches only quality sites that have information on personnel selection and assessment. Sites like IPMAAC, OPM, and SHRM.

Try it--I think you'll like it. And if you have suggestions for websites that should be included, let me know!

Saturday, October 28, 2006

Are you discriminating against obese applicants?

Like height or attractiveness, weight is one of the least talked about but most common judgments made about people that impact hiring decisions.

A recent
article discussing the sixth circuit decision in EEOC v. Watkins points out that although the circuit courts are split on the idea of weight being protected under ADA, there are two practical tips every U.S. employer should follow:

1. Remember that even if being overweight or obese isn't protected under the ADA (e.g., because the plaintiff is unable to show it has a physiological cause), being PERCEIVED as disabled is. And we
know that this is the most commonly claimed violation under ADA.

2. Being overweight or obese may be related to another underlying physical condition, so don't assume simply because someone is heavy that you don't have to accommodate them or you are free to discriminate against them.

And I would like to add one of my own:

3. Don't forget about other laws, such as state discrimination laws. Some states (e.g., California) have laws that are broader (for the applicant) than the ADA.

Lessons? Don't make assumptions about what people can and can't do. If you have doubts, ask: "Are you able to perform the essential functions of this job with or without reasonable accommodation?" and ask it of ALL candidates. Or, have each candidate do some type of work sample exercise. And of course make sure the essential functions of the job have been identified and communicated to applicants.

Like all criteria used in making hiring decisions, your decision should come back to: Is it job-related?

Friday, October 27, 2006

Presentations available from 2006 IPMA training conference

Many presentations from the 2006 IPMA training conference are now available.

Some of the highlights:

- Recruitment: Filling hard-to-fill positions by the always-entertaining Harry Brull from PDI

- Demographic and market cycles: The economic effects of an aging baby boom generation by Doug Robinson. You have to read this one just to see what the average age of a motorcycle purchaser is (hint: it's not 18).

- Legal update by Richard Whitmore of the omnipresent Liebert, Cassidy and Whitmore. If you haven't been following legal developments in HR, this is a GREAT summary.

- Building bridges across generations by Rosie Rodriguez of the University of Connecticut Health Center.

Thursday, October 26, 2006

New TIP issue

Okay, so it's taken me so long to post about the new issue of TIP that you could legitimately claim that it isn't "new" anymore. And to you, I say: feh.

I will say this is one of the best issues I've seen in a while. Highlights include:

1. Great
article by Wayne Cascio titled "The new human capital equation" where he weaves statistics about recruiting into a conversation about work life balance and becoming an employer of choice.

2. The always-interesting Frank Schmidt
writes about--wait for it--meta-analysis and cognitive ability. He makes some great points, though, including the fear about discussing group differences, the danger in looking to courts to provide us with answers, and the inability to put research findings into practice.

3. A
too brief piece by Robert Hogan on the relationship between character and personality.

4. A
perspective on the Burlington Northern v. White case by legal guru Art Gutman.

5. The always interesting Good Science--Good Practice column by Marcus Dickson and Jamie Madigan, with a great note from Ken Lahti about a study finding that in high-volume selection situations (5,000 hires/year) the efficiency ROI gains from unproctored internet testing outweigh the ROI lost from omitting a proctored cognitive ability test. A thought-provoking finding with direct application to large employers. Savings plus reduced adverse impact on the front end? Sounds good to me!

6. On a lighter note,
retrospectives from two pioneers in the field--Bob Guion and Paul Sackett. Two gems from Guion: "Career choices are often more the result of circumstances than of careful planning" and "There are more ways, Horatio, to solve organizational problems than are dreamt of in your philosophies and theories." Makes me think of LA Story.

7. Last but not least, some
results from the 2006 SIOP member survey. Things that stood out for me: SIOP needs to do a better job at being THE trusted authority in the area and being a strong advocate to policy makers. Just a couple thoughts out of the gate: e-newsletters, certification courses, test reviews, and more press releases.

Enjoy!

New IPMAAC conference reimbursement policy

For anyone who's wanted to go to the IPMAAC conference but doesn't have the dough, they just implemented a new policy (see section C-6) whereby presenters are given a break on fees. Should help a little for those of us who aren't independently wealthy.

Monday, October 23, 2006

Google points the way toward good hiring practices


Not content to rule search, Google is conducting an applied class in smart HR by adjusting their hiring practices to focus more on job-relatedness and candidate service.

Historically Google has put candidates through the wringer, with up to six interviews (not uncommon for high-tech) and long wait periods between interviews and the final word. They've also focused heavily on academic credentials--GPA and college degrees. Earlier this year, they brought in a new HR director who is shaking things up and helping the company hire more volume but retain quality.

After a survey of current employees and linkage to performance measures (read: criterion-related validity study), they've made some changes:

1 - Fewer interviews. Average has gone from 6.2 at the beginning of the year to 5.1 in June.

2 - Targeted interview feedback based on pre-defined factors.

3 - Multiple scores rating a candidate's KSAs.

4 - Biodata (e.g., "Have you ever turned a profit at your own non-tech side business?"). BTW, many of the survey questions were biodata as well, such as age they first used a computer, how many foreign languages they spoke, and how many patents they have. Would love to see those correlations! Wonder how they broke down the jobs...

5 - Personality (e.g., how assertive the person is)

6 - Work style (e.g., prefer to manage others or do work yourself?)

7 - Higher standards. Google is looking for overqualified people. (Take that, City of New London, CT)

The head of HR gets it: he says Google tries "to strike the right balance between letting candidates get to know Google, letting us get to know them, and moving quickly."

Thanks to
EASI blog for the heads up.

Updated: Is your website accessible?

Here's a case you may not be familiar with. Target is being sued in federal court for violating the Americans with Disabilities Act (ADA) and two California laws because their website is not accessible to the blind.

On September 6, the judge
ruled that the class action suit could go forward and that the laws do apply to the website.

Target failed to set up its website so screen-reading software could vocalize text. The site also contains images and graphical features that are not accessible.

I'm not an expert in website accessibility, but it does not sound like it is terribly difficult to implement this technology.

Is your website accessible? Are you unintentionally turning away qualified applicants with a disability?

UPDATE: Information about how to make your blog accessible to the blind can be found here. Tips for how to make websites more accessible can be found here. The most thorough resource on making websites accessible to people with disabilities can be found here.

The costs of assessment


Recently, Todd Rogers over at ERE wrote an article about the "cost" of assessment. The article isn't about ROI formulas or utility analysis, but focuses on turning off high quality candidates with bureaucracy and a mountain of application forms and requirements.

Yes, to the extent possible, assessment should be streamlined. And candidates should be given good reasons why they're being put through the paces. But let's not confuse content with process. Let's break things down:

1) Remember folks, EVERYTHING we do to narrow down the candidate pool is, legally, an "assessment." This includes minimum qualifications, phone interviews, resume reviews, etc. We're not just talking applications and written tests.

2) Good assessment takes time.
We know from studying thousands of candidates that assessments that are put together well and gather significant information plain work better. Unstructured interviews and casual phone screens have NOT been shown to be reliable, and they are definitely not as legally defensible as structured interviews or targeted skills testing. They will work every once in a while, just as phrenology might, but over the long haul they are not the best tools in our toolbelt. Not only that, but employers sifting through thousands of applicants aren't going to have individual conversations with all of them.

3) That said, there's no reason to put candidates through hell (other than because we're evil test developers). On-line testing, good ATS packages like
Neogov, and clear job previews and requirements help things run smoothly.

4) A rigorous assessment process, in addition to likely being more valid and defensible, sends several messages to candidates: we've done our homework, we know what it takes to succeed in these jobs, and being hired here means something. Far from being a turnoff, people usually respect this and are likely to spread the word, probably to those passive candidates we're all after. You don't have to be Microsoft or Toyota to establish a reputation as a selective employer.

5) The more you allow someone to tell YOU about THEM, the greater the risk of falsification and spin. On-line profiles are fine and dandy, but caveat emptor.

6) An example is given in the article about a friend who routinely "beats" a drug test. This tells me the organization isn't doing their job in making sure the tests are valid--it says nothing about the usefulness of a good drug screening process.

Last, let's be careful with our terms. The phrase "psychological test" is used to refer to anything from a traditional interview to the
MMPI, but most people in the assessment field would think of the latter, which is a very different type of selection mechanism and involves different legal considerations. Finally, very few people go through anything approaching "psychoanalysis" as part of a hiring process, so in general let's avoid that term unless we're talking about individual interviews for public safety positions or the like (and even then it makes most folks think of Freud).

I'll be the first to agree that lengthy forms doth not good assessment make. But rather than throw the baby out with the bathwater, let's focus on defining the competencies we need, reaching out to the most qualified, marketing ourselves properly, and putting together a solid selection system. In the end, that will have the biggest payoff for both the organization and the job seeker.

Saturday, October 21, 2006

Blue Chip Expert



Comes now Blue Chip Expert, which, according to Business 2.0, is "A MySpace for Job Seekers." The site focuses on free agents or contract workers, particularly "high end" ones such as software engineers and creative directors. When an employer successfully hires through Blue Chip, the site gets a (not insignificant) cut but also gives everyone in the referral chain a piece of the pie. So if I'm in the database and I get Mo Rocca to enter his data, and XYZ Company hires Mo, I get a portion of the contract.

Strikes me kinda like a pyramid scheme. No, scheme's not the right word. Pyramid plan. No, wait. Structured social networking career pathing. That's it.

All kidding aside, this is a great example of niche recruiting efforts mixed with Web 2.0. This is the future of things, and it remains to be seen how the current big players (Monster, etc.) respond.

Friday, October 20, 2006

Economist issue highlights "the search for talent"


The October 7 issue of The Economist included a special report on "the battle for brainpower." You can read the front article here.

The article, as you would expect from The Economist, focuses on international aspects of talent search. Some gems I gathered:

- A recent
Corporate Executive Board (CEB) survey of international senior HR managers found that 62% worried about company-wide talent shortages.

- Hiring managers in another CEB survey reported that candidate quality has declined by 10% since 2004 while average time to fill has increased (to 51 days).

- While China has twice as many engineering graduates as the U.S., only 10% are "equipped to work for a Western multinational."

- Talented people, in many cases, need organizations less than organizations need them, particularly with the current job market and the shift in the means of production to computers and knowledge work.

- Candidates are taking advantage of websites like
vault.com to research organizations and get the "real scoop." (Expect this trend to continue and for smart organizations to take advantage of this trend by offering more authentic information of their own)

- Paying a premium for stars has a (not so hidden) benefit: it attracts other talented individuals.

- Boosting your current workers' employability goes a long way toward enhancing your reputation and becoming a magnet for talent.

On this last point, the articles repeatedly emphasize that your actions post-hire, such as training and performance management, are as important, if not more important, to sustaining an organization's long-term ability to attract, hire, and retain talent.

A good reminder that organizations are systems and smart recruitment and assessment considers the entire
spectrum of HR management.

Wednesday, October 18, 2006

Practice guidelines for personnel selection?


I was listening to a piece on NPR this afternoon that got me thinking about best practices in personnel selection and assessment.

The news story was about the apparent undue influence that drug manufacturer Eli Lilly had on the creation of what are called
"practice guidelines" that are essentially best practices for the diagnosis and treatment of medical conditions.

This started the wheels turning...why aren't there any guidelines for personnel selection?

Yes, I know about the
Uniform Guidelines. But (1) they're very much out of date, (2) they're not concise, and (3) they're not written for the lay HR person. This may be somewhat unfair criticism, as they were not meant to be "personnel selection for dummies", but them's the facts.

Yes, I also know about
SIOP Principles. But many of the criticisms leveled at the Uniform Guidelines also apply to this document: it's not easy to read and it doesn't serve as practical guidance for your average HR person. Nor was it meant to.

So where does that leave us? With a lot of books in the business section on how to hire, hiring the best, etc. etc. Most of these books are quite lengthy and cover way too much.

What I want is something people can (1) easily understand, and (2) point to as THE current, professionally agreed upon set of standards for planning, developing, and administering good assessment instruments. And there are two documents that come close to my vision.

The first is the Department of Labor's
guide on testing and assessment. It's easy to read, very well organized, and relatively brief. And it accurately summarizes best practices and even manages to weave in a little research. Very nice.

The second is SIOP's
FYI on testing. Also well organized and easy to read, but perhaps a bit lengthy.

Now neither of these documents is required reading. No one is going to check to see if you're testing in accordance with them. But we should be. We should be holding selection processes to a standard, and one that pretty much the entire profession agrees upon. We should put something out there that says: don't use a three point un-anchored rating scale when rating an interview response. Do a job analysis--don't argue, just do it--and here's how it needs to be done. And here's how you write a good multiple choice test.

Where is this document? Where is this help for the lone HR person trying to figure out if they're doing things the right way? And why haven't we put something together by now?

Video games and assessment


Everyone has ideas for things they'd like to do one day and they just haven't gotten around to it yet. For me, one area I'm interested in is the application of video game technology to recruitment and assessment.

This is
not a new idea, but it deserves much more attention than it receives. Why? Because (1) more than 45 million American households own video game machines, and (2) it's a technology with boundless potential for engaging students, teaching competencies, offering realistic job previews, and assessing job-related competencies.

The Federation of American Scientists just released a
report on the role video games can play in education, which is discussed in this AP article.

As far as I know, the only organization doing serious work in this area is the Army (not coincidentally, they also have a great recruiting website), which distributed a "first person shooter" video game (free, no less).

It's time we moved beyond static recruiting and situational videos and seriously start to explore how interactive computer simulations can assist with recruitment and assessment efforts. I'm all eyes/ears if anyone knows of other applied work in this area.

Tuesday, October 17, 2006

OPM's Hiring Toolkit

Been to the Office of Personnel Management (OPM)'s website recently? They have a great resource called the Hiring Toolkit that is an example of how to do HR right. It's very easy to navigate but contains all the essential information. It takes advantage of technology to present information but doesn't stray from the fundamentals.

Bravo. Can't wait to see this website develop as more resources are added.

The recruitment and assessment team


Recently Lou Adler posted an article on ERE about how to create the ideal corporate recruiting team.

Great idea, I think, with one major reservation.

Only one position is devoted to assessment--the "Assessments leader", devoted to "skills-based" assessments. None of the other roles appear to cross over with traditional assessment activities, such as job analysis or developing job knowledge tests.

So in the spirit of building a fantasy sports team, and with an acknowledgement to Mr. Adler for suggesting that a large amount of resources be devoted to recruitment and assessment, I suggest my own assessment roles. In line with Adler's proposition, these do not have to be performed by one individual, and in larger organizations you may have more than one person devoted to each role. (In fact, if you pigeonhole people into these roles they may burn out.)

Each of these roles requires the following competencies:
1. Reading comprehension
2. Written communication
3. Analytical skill
4. Knowledge of assessment best practice
5. Knowledge of relevant laws and rules
6. Creativity
7. Working well with a variety of individuals
8. Facilitation skill
9. Interviewing ability
10. Attention to detail

That said, I present my fantasy team:

1. Job Analysis Expert - this person should be familiar with a variety of ways of conducting job analysis that meet the requirements of the
Uniform Guidelines as well as the needs of the organization.

2. Written Test Developer - Although on-line training and experience measures seem to be the current golden child, written measures of job knowledge, situational judgment, biodata, and personality (to name a few) aren't going away. This role requires fluency in item analysis.

3. Interview Developer - Interviews are the most commonly used form of assessment, as well as the most accessible by stakeholders. This person needs to work well with hiring supervisors and be able to create a variety of questions and rating methods.

4. Simulation Developer - Simulation, work sample, performance, high-fidelity--whatever you want to call them, these tests share with interviews the one-two-three punch of typically working well, being accepted by candidates, and having less adverse impact than written tests. This role requires a lot of creativity, flexibility, and a certain amount of risk taking.

5. Systems Specialist - The marriage of IT and assessment is here to stay. This person needs to remain constantly on top of the latest developments in applicant tracking and computer/internet-based assessment tools. They must work hand-in-hand with the rest of the team to help decide how and when to use technology to enhance assessment activities--and when not to.

6. Statistical Expert - Every assessment team needs at least one person who can speak fluently about item analysis, regression, correlation coefficients, standard error, significance testing, and factor analysis (to name a few). And it's not enough to know these things--you have to be able to communicate them to lay audiences in a way that doesn't scream "ivory tower."

7. Reference checker - Some people are good at checking references, some are not. A good reference checker is assertive, excels at listening to others, and is very perceptive.

8. Outreach specialist - This person is responsible for preaching the gospel of sound assessment and doing things like attending conferences, making presentations, and producing newsletters. Role-specific skills include oral presentation skill, agreeableness, openness to experiences, and assertiveness. A primary outcome is establishing the reputation of the assessment group to attract highly qualified individuals and reinforce the value of their work.

9. Recruitment Liaison - Recruitment and assessment functions are often treated separately when they are fundamentally intertwined. This person communicates constantly with the recruitment side of the house to talk about needs for planning purposes, gathers customer satisfaction information (both internal and external), and is available for consultation.

10. Trainer - Last but definitely not least. The best trainers have both content knowledge and the ability to engage and teach their audience. This person must be able to present information in a way that is accessible and usable. They must have skills in the area of curriculum development and be fluent with presentation technology. A sense of humor is a must, as is the ability to constantly gauge the audience and switch directions when needed.

I think that about covers it. All we need now is an assessment olympics.

Sunday, October 15, 2006

Fun with salary statistics


Yes, I just used 'statistics' and 'fun' in the same sentence. Don't let that stop you from reading on.

An
article came out recently that talked about using U.S. Bureau of Labor Statistics information to compare salaries across location. According to the article, nurses get paid the most in San Jose, CA. Correctional officers make the most in Vineland, NJ.

"Well, sure they get paid a lot in San Jose--it costs a lot to live there," you might be saying. (You might also be saying, "What are you doing blogging on a Sunday?" or "Go Tigers!") Yes, this is true, and it's what I thought too. But surely there must be a way to compare jobs across the country taking cost of living into account...So I proceeded to attempt to find a tool to do just that...and failed.

So I started thinking...

What would you need to do to figure this out? Well, first you'd need salary information by occupation by geographic region. Fine, the BLS has
data we can use to do that (click on "Metropolitan Area Cross-Industry Estimates"). And because we're smart cookies, we'll use the median annual salary rather than the mean because medians give a better picture of the "typical" number.

Next, we need cost of living data. You can get this from several places, including Yahoo, but I prefer
Sperling's Best Places which offers all kinds of nifty information broken out by city.

Now all we need to do is match them up. Here's where our first challenge comes in. Cost of living data is broken down by city, salary data by metropolitan area. So what I did was average (and here I used the mean) the cost of living of the cities that make up each metropolitan area. No, it's not perfect, but it'll do.

Second challenge: how do we use these numbers? There are lots of ways, but as I always say, when in doubt, divide something by something else. In this case, I divided median annual salary by cost of living. This gives us some sense of the "true" salary when compared to where the job is. The higher the cost of living, the lower the number.

So what happens when we do this? Well, first of all, we wonder why someone hasn't done this before. But then, we do a sample job to see how this all works out. For my example I chose (purely at random) "Human Resources, Training, and Labor Relations Specialists." Let's start with how BLS displays the salary data. When we look at the BLS results, and these are available for all occupational groups, we see that salary is highest in these locations:

1. Framingham, MA ($80,070 mean annual wage)
2. Bridgeport-Stamford-Norwalk, CT ($76,200)
3. San Francisco-San Mateo-Redwood City, CA ($71,900)
4. Durham, NC ($69,260)
5. Wilmington, DE ($68,210)

What happens if we use medians? Pretty much the same results, except the #5 spot goes to Hartford, CT and Wilmington slips to #7. Must be some HR specialists in Wilmington making bank.

Finally, what happens when we adjust median salary for cost of living? Let's call that the Bryan Index. Here are our top contenders (which I limited to approximately the 30 top salary cities--this stuff is time consuming, after all):

1. Niles-Benton Harbor, MI (median salary $63,150, Bryan Index of 842)
2. Durham, NC ($69,260 and 700)
3. Kennewick-Richland-Pasco, WA ($60,390 and 647)
4. Longview, WA ($62,750 and 644)
5. Wilmington, DE ($68,210 and 640)

So Durham and Wilmington hang in there--they have a good combination of high salary and relatively low cost of living. But the list has three new contenders, who are there because their cost of living is so competitive. Framingham drops to #10 due to its high cost of living.
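For anyone who wants to replicate this at home, here's a minimal sketch of the calculation in Python. The median salaries are the BLS figures cited above; the cost-of-living values (100 = national average) are back-derived for illustration, not Sperling's actual numbers.

```python
# "Bryan Index" sketch: median annual salary divided by a cost-of-living
# index, where 100 = the national average. Higher index = more buying
# power for the typical worker in that metro area.
# Salaries are BLS medians from the post; COL values are illustrative.
metros = {
    "Niles-Benton Harbor, MI": (63150, 75.0),
    "Durham, NC": (69260, 98.9),
    "Kennewick-Richland-Pasco, WA": (60390, 93.3),
    "Longview, WA": (62750, 97.4),
    "Wilmington, DE": (68210, 106.6),
}

def bryan_index(median_salary, cost_of_living):
    """Cost-of-living-adjusted salary, rounded to a whole number."""
    return round(median_salary / cost_of_living)

# Rank metro areas from best to worst adjusted salary.
ranked = sorted(
    ((bryan_index(salary, col), metro) for metro, (salary, col) in metros.items()),
    reverse=True,
)
for index, metro in ranked:
    print(f"{metro}: {index}")
```

With these illustrative cost-of-living numbers, the ranking reproduces the top five above: a high-salary metro like Wilmington slips behind cheaper areas once its cost of living is factored in.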

This type of information is very useful in many ways. Aside from being a great source of salary survey data, it helps us answer this question: Are folks really underpaid or is that just the perception? Not to mention: Wow, we pay pretty well all things considered--are we doing anything with that information, like, I dunno, advertising it?

This is usually where someone says, "Hey Bryan, there is something that does this. Just go to www.youdidntgooglehardenough.com" Oh well. I had fun with statistics.

Friday, October 13, 2006

Blogger ed


Another reason to use blogs: they can be an effective educational tool.

According to an article in the Seattle Times, teachers are increasingly using blogs to supplement instruction, organize material, communicate with parents, encourage discussion, and promote deeper learning.

Results?
1 - More engagement, at least on the part of some students and parents.
2 - Enhanced writing skills.
3 - Kids use current technology, helping to prepare them for the workplace.

How is this relevant to recruitment and assessment?
1 - Practical application of writing skills using technology results in a more qualified talent pool.
2 - Use of technology for self-expression may result in more positive perceptions of computer-based testing.
3 - Focuses kids on creating intelligible and appropriate writing, not TM-speak (c u l8r!)
4 - Use of blogs leads to the expectation that others will do the same, including employers. This in turn puts pressure on organizations to create blogs--and to do them right.

With all the controversy surrounding education, it's just plain nice to see some solutions that work.

New issue of Human Performance


The latest version of Human Performance (v.19, no. 3) is out and has some articles relevant to assessment.

Before I get into 'em, allow me to express some disappointment that all but one of these was conducted using college students. That said...

The first, by Nicholas Vasilopoulos and others, focuses on forced-choice personality tests. The authors found, among other things, that individuals with high cognitive ability scores were able to inflate their scores (applicant vs. honest condition) on forced-choice scales, while individuals scoring lower on cognitive ability were not. Also, only in the forced-choice/applicant condition were personality scores (using Goldberg's 300-item measure) predictive of GPA. Lastly, personality scores did not add very much to the prediction of GPA after accounting for cognitive ability. Bottom line? Smart folks can figure out forced-choice personality tests. These personality tests may not get you a whole lot if you are already measuring cognitive ability. Also, they may result in higher adverse impact than you might anticipate.

The second, by Todd Thorsteinson, looked at the effect of framing when judging item difficulty. Mirroring a task similar to the Angoff procedure, he had participants in one condition rate the likelihood that an average test taker would get the item correct; those in the other condition rated the likelihood the item would be answered INcorrectly. (Note he used "average," not "minimally qualified.") Result? Those in the first condition (most similar to the traditional Angoff method) produced lower critical scores (i.e., judged the test to be more difficult). Is this good or bad? Probably good, since courts generally favor lower pass points when there is any doubt. But it's a reminder that how we frame tasks for our stakeholders matters.

The third, by De Meijer et al., examined ethnic score differences on a cognitive ability test, personality test, assessment center, interview, and final employment recommendation among a sample of applicants to a police academy in The Netherlands. Results? Score differences between the majority group and first-generation minority groups were comparable to previous findings (e.g., d around 1.0 for cognitive ability), but score differences between the majority group and second-generation minority groups were much smaller (about half). On the cognitive ability and personality tests, most of the difference was explained by language proficiency. Implications? Efforts to increase language proficiency may pay huge dividends in reducing the adverse impact traditionally associated with some of the most valid forms of assessment. Whether this is provided by our educational system or by employers is probably one of the biggest issues facing society.

The fourth, by Buhner et al., provides additional support for the idea of using working memory tests, particularly in predicting multi-tasking performance. Interestingly, the authors found that if the goal is to predict speed, measures that target "coordination" (building relationships between information) work best, whereas if the goal is to predict errors, measuring "storage in the context of processing" (what we usually think of as short-term memory) works best.

Heady stuff for a Friday! Have a great weekend!

Thursday, October 12, 2006

The candidate experience

Attended a very good webinar this morning (along with nearly 300 other people) called "The candidate experience: Black hole or north star?" put on by Gerry Crispin and Mark Mehler from careerxroads.com. It was good because it was short and very focused.

The subject? A candidate's perception of applying for jobs with your organization over the Internet. The presenters outlined what they believe to be the most important aspects of the candidate experience:

1. Target: Does the job seeker feel that the employer is looking for THEM? Does the bulletin/ad/etc. describe in detail the type of person they're looking for?

2. Engage: Does the job seeker understand why they should join the organization and why people stay? Does the site profile key employees?

3. Inform: Does the site provide information the candidate will want? Can they see data on, for example, the number of female engineers currently in the organization and how many are in senior management positions?

4. Respect: Does the site outline privacy policies clearly? Is the candidate respected, or treated like a piece of data?

They brought up some other very good points, such as the fact that only two-thirds of applicants to the "100 Best Companies to Work For" received acknowledgements of their application, that many acknowledgements are canned, pre-programmed versions, and that candidates rarely receive the information they're really after (i.e., did I get rejected?).

The white paper that addresses the same issues is also a good read. So rarely do we see things from the other side's perspective and think about the impact that perceptions and emotions have--a small but important example is the web of negativity that can result when someone has a bad application experience. Chances are they know people, who know people, who know people, and so on, and so on.

Bottom line for me: Embrace technology, but don't forget the basics of treating people with respect and fairness.

What not to wear...HR edition

In every couple there are shows that one person likes and the other doesn't. One such show for my wife and me used to be "What Not to Wear," a reality-type show on BBC America (and elsewhere) in which two fashion experts do makeovers of British women. I say "used to" because recently I've gotten hooked.

While much of the show is devoted to an analysis of what the woman wears, how she wears her hair, etc., the heart of each episode is the psychological analysis that the two hosts perform on their victim, er, guest. The person is deconstructed (including standing in front of a mirror wearing only her underwear) and provided with brutally honest feedback regarding her body (although always focusing on her strengths) and her fashion tastes. Not surprisingly, after a drastic wardrobe, hair, and makeup change, the person looks 100 times better. The more interesting part of the show is watching the emotional reaction these reserved women have when seeing themselves and observing the reactions of others.

So this got me to thinking...when in life do others provide us with this type of brutally honest feedback? Sure, if you had a life coach or personal shopper they might perform this type of service. But what about in the work world? A career coach might do something similar. But what about thinking bigger picture? What if an organization brought someone in to offer this type of analysis, on a voluntary basis, for its workforce? Imagine an assessment expert going in and offering a bevy of high quality selection instruments for anyone interested in feedback. The consultant compiles the results, then provides VERY honest feedback to individuals--"You need to work on your reading comprehension ability"--mixed, of course, with the person's high scores and advice on how to leverage and sustain those as well.

Could this work? Has it worked? Would this just be a recipe for misuse by the organization? (not if results were confidential)

Wednesday, October 11, 2006

9th Circuit rules against UPS


Yesterday the Ninth Circuit Court of Appeals (covering the states of Alaska, Arizona, California, Hawaii, Idaho, Nevada, Oregon, and Washington) ruled in favor of a class of workers applying for "package-car" drivers with United Parcel Service (UPS).

Essentially, the court ruled that UPS failed to show that a "qualification standard" used by UPS in its screening process--requiring applicants to meet DOT hearing standards mandated only for drivers of vehicles with a gross vehicle weight of 10,001 pounds or greater--was job related and consistent with business necessity as required by the Americans with Disabilities Act (ADA).

This is consistent with other rulings in the realm of safety (e.g., Chevron v. Echazabal): select-out decisions must be based on an individualized assessment of the individual--it's not enough to make general claims about people with disabilities not being able to be as safe as those without disabilities. Of course, it didn't help that UPS was unable to find a study acceptable to the court that provided statistical evidence that deaf or hard-of-hearing individuals drove significantly worse than hearing individuals.

There's a lot in this case and it's worth a read. I'm looking forward to reading other people's (read: lawyers') interpretations, but here are some goodies I pulled out:

- Unlike individual or pattern-and-practice discrimination suits, this case involved a facially discriminatory policy. As a result, the burden-shifting protocol was not necessary and the only question was whether the employer discriminated based on the presence of the disability (which it admitted to doing).

- Although the ADA prohibits discriminating against "qualified" individuals, the parts that speak to employer defense make no mention of "qualified." In the context of this case, the result was:

"...when a plaintiff challenges a categorical "qualification standard," the plaintiff does not have the burden of establishing that that qualification standard excludes "qualified individuals with disabilities." Rather, to establish statutory standing, the plaintiff has the burden of establishing that she meets other qualifications, unrelated to the challenged standard. In addition, the plaintiff has the burden to prove that the challenged qualification standard "screen[s] out or tend[s] to screen out an individual with a disability or a class of individuals with disabilities." § 12112(b)(6). The burden then shifts to the employer to establish the business necessity defense."

- Employers attempting to defend such a safety standard must show, per the Morton v. UPS decision, that "substantially all [individuals with the disability] present a higher risk of accidents than [non-disabled applicants]."

- Content-validity evidence (e.g., SME judgments of KSA importance) is insufficient in this type of case. The court said:

"Neither the subjective beliefs of deaf drivers nor the ways in which deaf drivers compensate for their hearing loss was considered. In addition, [this study is] not relevant to the question at hand, as it merely demonstrated the existence of "hearing critical" tasks but did not establish a link between those tasks and safety."

UPS was using a standard that was never intended to apply to its package delivery trucks. They were called on it and were unable to come up with evidence to support it. So now they have an injunction (upheld by the 9th Circuit) that requires them to come up with "some form of individualized assessment" to make more refined decisions. Stay tuned, true believers--the story may not be over, as it could be appealed to the Supreme Court.

IPMAAC accepting conference proposals


IPMAAC has posted its Call for Proposals for next year's conference in St. Louis. The conference is titled "The Gateway to Excellence in Assessment" and is being held June 10-13. The IPMAAC conference is, for my money, the best value for scientist-practitioners. Next year's conference will feature talks by some of the best minds in the field, including Wayne Cascio, Nancy Tippins, and Bob Hogan.

Monday, October 09, 2006

Employers of illegal immigrants get ICEd


U.S. Immigration and Customs Enforcement (ICE), an investigative unit within the Department of Homeland Security, announced that two temporary labor companies, the president of these companies, and two of their corporate officers pled guilty today in U.S. District Court to conspiring to provide hundreds of illegal aliens to work for a national air cargo firm.

Immigration laws are the only ones relating to hiring (that I know of) that can result in criminal penalties, and this case is a good example. Although the judge has yet to determine the actual sentence, the maximum penalty for each individual is 10 years in prison and a fine of $250,000. The companies face a maximum punishment of five years probation, and fines of $500,000 or twice the gain they received from the crime. The plea agreement for the president of the two labor companies also calls for him to forfeit $12 million, which represents the proceeds of the crime and the property used to commit the crime.

Ouch. Make sure you follow
best practices when hiring. This includes possibly participating in DHS's Basic Pilot Program, which allows employers to use SSA and DHS databases to electronically verify employment authorization.

Grab 'em when you see 'em


When was the last time you gave your card or contact information to someone who provided you with excellent service?

This is a recruiting technique I like to call "grab 'em when you see 'em," and it has been used very effectively.

This technique is something that is very familiar to most recruiters, but probably not to assessment specialists--after all, isn't our job to test people, not get them in the door?

Yes and no. While assessment folks are expected to know how to analyze jobs and create tests, they are part of the greater HR community and should have their eyes and ears open at all times for talent. Assessment units have a tendency to slip into their own world, to their detriment as well as the organization's. Only by integrating themselves into the fabric of the organization can they truly be effective. Reaching out to potential employees is one way to get there.

Friday, October 06, 2006

Modular testing is the way to go


One of the best presentations I went to at this year's IPMAAC conference was made by Eric Palmer from the City of Fort Worth, TX.

In the presentation (which unfortunately isn't up but Eric is very good about sharing), titled "Modular Testing: Cheaper, Faster, and Smarter Assessment", Eric described a system that I think every mid- to large-size organization should investigate.

Modular testing focuses on testing candidates once, for a wide range of KSAs, rather than making people test each time a job comes up. The candidate decides which test modules they want to take--examples might include Reading Comprehension, Analytical Skill, Mathematics, etc. In the City of Fort Worth, candidates end up with what they call a "KSA profile," which lists their standardized scores in each category.

When an opening comes up, the system knows (having been previously programmed based on job analysis information) to pull out candidates based on their scores in the system. A clerical job might require high levels of Typing and Word Processing knowledge. An analyst position might require higher Analytical and Written Expression scores. Point is, the system knows to spit out a list of the most qualified candidates based on their scores compared to the position-specific KSA profile.
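The matching logic described above can be sketched in a few lines of Python. To be clear, the candidate names, module names, scores, and cutoffs below are all invented for illustration--this is not Fort Worth's actual system:

```python
# Minimal sketch of modular-testing matchmaking: each candidate carries a
# standing "KSA profile" of standardized module scores; each opening defines
# minimum scores per module, based on job analysis. All values are invented.
candidates = {
    "Ana":   {"Typing": 65, "Word Processing": 60, "Analytical": 45},
    "Ben":   {"Typing": 40, "Analytical": 70, "Written Expression": 68},
    "Carla": {"Typing": 72, "Word Processing": 66, "Analytical": 55},
}

clerical_profile = {"Typing": 60, "Word Processing": 55}
analyst_profile  = {"Analytical": 60, "Written Expression": 60}

def qualified(candidates, profile):
    """Return candidates whose scores meet every minimum in the position profile."""
    return [
        name for name, scores in candidates.items()
        if all(scores.get(ksa, 0) >= cut for ksa, cut in profile.items())
    ]

print(qualified(candidates, clerical_profile))  # Ana and Carla make the clerical list
print(qualified(candidates, analyst_profile))   # only Ben makes the analyst list
```

The payoff is that candidates test once and the same profiles get reused across openings; only the position-specific cutoffs change.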

What a great idea. It's not new, but Eric showed an example of how something like this works in practice. How come more organizations, particularly in the public sector, aren't doing this? I know the State of California is investigating doing this for its IT jobs, but I don't hear many people talk about it. This is, IMHO, where we should all be.

Thursday, October 05, 2006

vFlyer - creating free attractive job announcements


This is just so dang cool I don't know what to do with myself.

Joel Cheesman over at Cheezhead
brings our attention to a little website called vFlyer. vFlyer is in the business of taking your advertising content and turning it into something pretty.

"What's so cool about that?" you say.

Well...

1) Most organizations create really, really, really, ...really... ugly ads. So it's helpful just to have a template to get the creative juices flowing.

2) They will automatically post your job bulletin to a variety of job boards.

3) It's very easy to navigate--someone obviously put some effort into the interface.

4) You can customize the fields that you want shown.

5) You can easily upload photos to show on the bulletin.

6) ...wait for it....it's free.

Check it out.

Wednesday, October 04, 2006

SIOP Leading Edge Consortium

Feel like going to Charlotte, North Carolina? In, say, 3 weeks? I've got just the thing.

SIOP's
Fall Consortium is titled "Talent attraction, development, and retention" and has some very interesting speakers and topics lined up, including:

- Innovative practices among some of the "best companies to work for", including presentations by folks from Genentech, Nike, Microsoft, and Starbucks

- Strategic talent management, including presentations from PepsiCo and Johnson & Johnson

- Emerging practices from Bank of America and leading research institutions

- A keynote from the VP and Chief Learning Officer of Home Depot

aaaaannnnddd....

- Two presentations on recruitment and assessment over the Internet

Looks pretty great! If you go, I'd be happy to post your observations.

Fairness in selection


So I got the Autumn issue of Personnel Psychology a few days ago and have had time to digest it.

One of the articles (by Schleicher et al.) is on fairness perceptions of the selection process--specifically, an individual's opportunity to perform (OTP) (i.e., having the opportunity to demonstrate your competencies). What the authors found (using a fairly large sample of applicants to a U.S. government agency) is that OTP was strongly related to how fair applicants felt the overall selection process was. As we know, procedural fairness perceptions are related to all kinds of important things, like intention to accept job offers, recommending the employer to others, and likelihood of filing a complaint. In short, how you treat job applicants matters.

So how can we increase OTP? Some of the authors' recommendations include:
- Include performance tests
- Use both structured and unstructured interviews during selection procedures (Ed: just make sure you can defend them)
- Provide adequate time and resources and prevent environmental distractions
- Tailor assessment tools to the likely candidate population (Ed: ditto)
- Provide clear instructions before the assessments.

Seems like basic stuff, but it's amazing how many hiring processes fail to follow these fundamental guidelines.

Correlation graphing tool

Michael Harris, over at the EASI blog, has provided a great reference for situations where you need to illustrate what a correlation coefficient means--to students, stakeholders, whoever.

The tool can be found here and allows the user to model N values ranging from 5 to 100 and r values from -1.0 to +1.0.

It also allows you to generate as many samples as you want to illustrate how points might be distributed AND analyze the leverage that particular data points have on the coefficient AND allows you to move each data point and see the impact! Check it out.
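For readers who want to tinker without the website, here's a rough sketch of what such a tool does under the hood: generate N points with a target correlation r, then watch how a single extreme point moves the sample coefficient. This is my own toy version, not the EASI tool:

```python
# Generate n (x, y) pairs with a chosen population correlation r, then
# compute the sample Pearson coefficient and demonstrate outlier leverage.
import math
import random

def correlated_sample(n, r, seed=42):
    """Draw n points where y = r*x + sqrt(1 - r^2)*noise, so Corr(x, y) = r."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        x = rng.gauss(0, 1)
        noise = rng.gauss(0, 1)
        pts.append((x, r * x + math.sqrt(1 - r * r) * noise))
    return pts

def pearson_r(pts):
    """Sample Pearson correlation coefficient."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    syy = sum((y - my) ** 2 for _, y in pts)
    return sxy / math.sqrt(sxx * syy)

pts = correlated_sample(100, 0.5)
print(round(pearson_r(pts), 2))   # somewhere near 0.5 (sampling error applies)
pts.append((5, -5))               # one extreme point...
print(round(pearson_r(pts), 2))   # ...drags the coefficient down noticeably
```

Two good classroom lessons fall out of this: with small N, the sample r bounces around a lot, and a single high-leverage point can substantially change the coefficient--exactly what the interactive tool lets you see by dragging points.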

Tuesday, October 03, 2006

Tyson Foods settles OFCCP discrimination claims


Tyson Foods, headlining recently with the U.S. Supreme Court decision in Ash v. Tyson Foods, is in the news again.

This time it's due to a
settlement with the Department of Labor's OFCCP (Office of Federal Contract Compliance Programs).

The OFCCP, which enforces anti-discrimination laws for certain federal contractors, claims to have found several instances of discrimination--both gender and racial--after conducting compliance reviews from 2002-2004.

So what's the cost? $1.5 mil in backpay. But wait, that's not all. The company is also hiring 476 class members and undertaking "extensive self-monitoring measures for two years."

Let this be a lesson to everyone covered by Executive Order 11246--don't count the OFCCP out. Particularly in light of the recently issued
Internet Applicant rule and the impending creation of a database that will make it much easier to track employers covered by EO 11246.

Supreme Court refuses to hear appeal of damages

On Monday, the U.S. Supreme Court issued an order denying petition for review by the defendants in an age discrimination (ADEA) lawsuit.

In this case, the EEOC successfully sued Sidley Austin Brown & Wood, an international law firm, for downgrading and forcing out partners under an age-based retirement policy (based on reaching age 65).

The firm appealed the 7th Circuit decision, which upheld the EEOC's right to seek individual relief (money, reinstatement) even though none of the individual partners filed claims of discrimination.

Upshot? Another win for the EEOC and a reminder that employment decisions, whether to hire, promote, demote, or terminate, should be based on the person's ability to do the job, not their chronological age.

Polygraphs upheld for FBI and Secret Service applicants


U.S. District Judge Emmet Sullivan ruled Monday that the FBI and Secret Service may continue to use polygraph tests to screen applicants.

Most private sector employers are prohibited from requiring applicants to take a polygraph under the
Employee Polygraph Protection Act but federal, state, and local government agencies are exempted.

This story may not be over, as the case could be appealed, but I wouldn't hold your breath.

Sunday, October 01, 2006

Behavioral or situational questions?


Whether to use behavioral or situational interview questions is a source of constant debate. Behavioral questions are all the rage, yet research has consistently shown that both types of questions, when part of a structured interview, have value.

The September issue of the Journal of Occupational and Organizational Psychology reports the results of a
study that looked at which type of question predicted performance for a sample of managerial positions (N=157).

Results? Behavioral questions bested situational questions (r = .32 versus r = .09). This result has been found before--namely, for higher-level positions, behavioral questions tend to predict better. Why? One reason is that the constructs measured using these types of questions are a better match for the KSAs you're looking for in managers (e.g., personality traits). It also makes intuitive sense--the more experienced someone is, the more likely they are to have a history of experiences they can draw on that illustrate their decision-making style, supervision style, and so on.

My take: both types of questions can work. Look to your job analysis to determine what type of question makes sense given what you're looking for. Don't be afraid to mix up question types, and as always, combine interviews with other types of assessment.