Thursday, March 15, 2007

Q&A #4: Joe Murphy

This is the fourth in a series of Q&As I'm doing with thought leaders in the area of recruitment and selection.

This edition features Joe Murphy, VP and Co-Founder of Shaker Consulting Group. Brief bio: Joe is a former principal at SHL USA, Inc. and co-founder of Olsen, Stern, Murphy & Hogan, Inc. He has published articles in a variety of professional publications and presented seminars on staffing metrics at a number of conferences. Joe is very active in the recruiting space (e.g., ERE) as well as an active member of SHRM, having conducted the 2004 survey of assessment practices and a national web cast on Quality of Hire.

I've been meaning to post about Shaker's Virtual Job Tryouts for a while because I like what I've seen, so this gives me a chance to highlight it. Joe's thought-provoking responses are longer than those of previous Q&A participants, but well worth the read.

(Note: as always, some links are provided by yours truly)

BB: What do you think are the primary recruitment/assessment issues that employers are struggling with today?

JM: Education. Both for consumers of assessment products and providers of assessment resources. The retailer SYMS uses the phrase: “an educated consumer is our best customer.” This is also true in the field of assessment.

As recently as last year, a VP of HR asked me if using assessment was legal in the US. I was a bit shocked.

I walked the exhibit hall at the EMA conference in San Diego and the SHRM conference in DC last year and stopped into many booths that offered some form of testing or assessment. I asked a few simple questions that I thought might trigger a good dialogue and allow me to gauge the quality of the resource. I varied my probing based upon the responses I received.

  1. What do you do to establish job relevance of your assessment?
  2. What steps do you take to conform to the Uniform Guidelines on Employee Selection Procedures?
  3. Describe your approach to job analysis.
  4. What criteria have been used in your validation work?
  5. Do you have a sample validation technical report I could examine?
  6. What norm groups do you have?

With the exception of a few providers, those questions brought blank stares or “What do you mean?”-type responses. One of my favorite responses was: “Well, we didn’t use prison inmates like one firm did.”

My takeaways were:

  • Many assessment providers have no meaningful results or best practices to share
  • Training personnel working in an exhibitor’s booth on assessment fundamentals is not valued
  • And last, but most telling: assessment providers do not expect a well-educated assessment consumer

I have been involved in objective candidate evaluation practices, as a user or provider, for over 25 years. In my experience, HR practitioners have limited knowledge regarding the use of objective candidate evaluation methods. This is most evident when exploring one word: Valid.

Validity is not an easy construct to wrap your arms around. However, many assessment consumers fall prey to the generic offering “Our test is valid.” It never dawns on many to ask: “Valid for what?” The answer to that question is far more important than just being valid.

I strongly recommend that anyone considering the use of assessment go to a nearby university and take two courses: one on Tests and Measures, and another on Personality Theory or Measurement. These courses will raise the caliber of users significantly.

BB: What is an example of an innovative or creative recruitment/assessment practice that you've seen recently?

JM: A focus on the nature of the candidate experience has been leading companies to be innovative. There are a number of companies that have taken the resume out of the front end screening and candidate evaluation process. The resume is then only used as context for the structured interview. By using scorable applications or biodata questionnaires, companies have become more objective at putting candidates into “Yes” and “No” piles. This also sends a nice message: “No resume needed to apply here.”

At TalUncon [Ed: Joe's link] in Redwood City, CA, Gerry Crispin was talking about competency-based interviews captured on video. A candidate could receive a set of questions, record their responses, and post them to a YouTube-like location for the recruiter. This raises questions about what is really being evaluated, but it certainly can provide side-by-side comparisons.

We like to think our Virtual Job Tryout™ is out on the leading edge. This approach to objective candidate evaluation transforms the candidate experience into an interactive, informative two-way exchange. The candidate learns about the job by performing work samples, and making choices on how they might handle specific interactions. Recruiters learn about work style and job related competencies, all from one experience.

BB: What is an area of research that you think deserves increased attention?

JM: Three things have been gnawing at me recently:

  1. Attitudes and beliefs around un-proctored, on-line assessment.

I believe industry has to accept that there are sound and effective methods to administer and interpret un-proctored assessments.

  2. Why do organizations place such low expectations on HR to be measurement-oriented?

I believe we hide behind the term “soft costs” as an alternative to doing the rigorous work of developing the infrastructure and discipline of measurement required for the task at hand.

  3. How can the misguided opinion of one executive prevent an organization from realizing the gain available from objective candidate evaluation methods?

I believe we need to find better methods of documenting and presenting the benefits of assessments for executive audiences.

Here are three examples that illustrate my points.

One of our clients has the enviable, yet challenging, scale of a 500-to-1 applicant-to-hire ratio. Their current methods of screening have been deemed both unfair and un-scalable. Unfair because the recruiter wrestles with looking at 50 candidates and making a hiring decision, wondering what talent was in the 450 with only a resume in the database. Un-scalable because their growth plans simply make this semi-manual process, and the number of recruiters it would require, unreasonable. The marketplace demands a faster and more objective data-gathering process from the candidate. Relying on resume searches or forcing the candidate into an on-site experience adds steps and negates cycle-time reduction initiatives.

I heard a participant at a conference on HR metrics openly state she went into HR because she was “bad with numbers.” The unfortunate part is that being light on that competency was OK with the company’s executives. The term “soft costs” shows up a great deal in HR discussions. Some might say that term is used because you can’t measure it. If that is the case, will your boss mind if you double your soft costs next year? There are sound methods to make the intangible more tangible. At the other end of the measurement continuum, one recruiter I met at an EMA conference stated her company captured and reported on 27 variables that made up their cost-per-hire figure. It can be done.
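That 27-variable cost-per-hire figure comes down to simple arithmetic: sum the tracked cost components and divide by the number of hires. A minimal sketch of the idea (the component names and dollar figures below are invented for illustration, not the recruiter's actual variables):

```python
# Hypothetical component-based cost-per-hire metric.
# Component names and dollar amounts are invented for this sketch;
# a real model (like the 27-variable one mentioned) tracks many more.

def cost_per_hire(cost_components, hires):
    """Total tracked staffing costs divided by number of hires."""
    if hires <= 0:
        raise ValueError("hires must be positive")
    return sum(cost_components.values()) / hires

components = {
    "job_board_postings": 12_000.00,
    "agency_fees": 45_000.00,
    "recruiter_time": 80_000.00,
    "assessment_licenses": 9_000.00,
    "candidate_travel": 6_500.00,
}

print(round(cost_per_hire(components, hires=50), 2))  # 3050.0
```

The point is less the formula than the discipline: each component becomes a measurable line item rather than an unexamined "soft cost."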

The now-retired CEO of one company had a bad experience with an assessment center for executive development early in his career. As he rose through the ranks, he was able to forbid the further use of assessment in the organization. This edict lasted several decades. Last year, objective candidate evaluation was implemented at a new plant start-up. The VP of HR described the financial implications as one of the highest-ROI projects in the company last year. Objective candidate evaluation has economic value.

BB: As someone who has their own consulting business, have you noticed any changes or patterns in the types of requests you're getting from clients?

JM: I see two camps. The novice user is still seeking the silver bullet. They want the omnibus assessment that can be used for many positions and predict any job behavior. This is exacerbated by providers who lead under-educated HR practitioners to believe such an assessment exists.

Practitioners with a passion for process improvement get excited about taking measurement more seriously and seek more job-relevant approaches to objective candidate evaluation. Project teams with Six Sigma training see the application and value of assessment clearly. The experienced user has achieved some degree of success and has learned that the more job-relevant the assessment, the more accurate and valuable the results. Sound analytics help a user see both the limitations and value of assessments.

BB: Have you read any books or articles lately that you would recommend to the professional community?

JM: I am in the middle of reading Susan Conway’s The Think Factory (Wiley 2007) [Ed: Joe's link]. Her mission is to bring Lean and Six Sigma practices to Information Worker (I-Worker) process models. Reading it from a recruiting perspective (one of my major frames of reference), I continually see how formal and objective data-gathering methods such as job analysis, assessment, and predictor-criteria correlation analysis fit right into her framework. The hiring process is a perfect example of her definition of I-Work: gathering and/or transforming data and making decisions.

I have been pushing forth the concept of measuring the financial impact of waste and re-work in staffing processes for years. Conway addresses these issues, re-work in particular, as being significant drains on profitability. The high turnover in many entry level jobs causes re-work to the tune of millions of dollars. Yet company presidents have looked me in the eye and said, “That’s just the way it is in our business.” Responses like that change quickly when someone owns the budget for staffing waste.

Susan also addresses issues of efficiency and effectiveness in a manner that reinforces that the real value driver is the latter. While cost-per-hire and cycle-time factors are important to manage, the high gains come from increases in effectiveness. This translates into making hiring decisions based on candidate data that correlates to performance outcomes.

I can’t wait to finish the book and apply a range of her assertions and models in communicating what we do. The alignment is fascinating.

BB: Is there anything else you think recruiters/assessment professionals should be focused on right now?

JM: I conducted a small survey (N = 558) with SHRM in February of 2004. 85% of respondents stated they had not conducted an ROI analysis of their selection practices. This forced me down two paths:

  1. Why do organizations allow so many resources to be invested in a business process without demanding some measure of return or contribution?
  2. Why do staffing and HR professionals jeopardize their credibility by not holding themselves accountable with sound economic measures of performance?

The most common metrics in staffing are based upon cost and time. When you report cost, you get pressure to make it smaller. In essence, budget constraints on staffing functions might be directly attributable to the nature of the reporting.

I suggest, if not compel, recruiters to partner with other in-house disciplines such as finance and process improvement. Use these combined resources to focus on value metrics, and invest in the rigor and discipline of capturing business-process data that can be used to document the contribution of recruiting to the top line and the bottom line.

If a tree falls in the woods and nobody hears it, does it make a sound?

If a recruiter adds value but nobody knows it, does it make a difference?


Indeed. Thank you Joe!