A while back I mentioned RecruitmentRevolution, a UK site focused on temporary employment that allows previous employers to input reference scores for use by future employers. Creative idea, if ya ask me. I've often wondered if someday there will be a general database of verified work history that employers could easily check.
Now along comes TalentSpring with a similar idea. This time it's not previous employers, it's peers. TalentSpring uses something it calls a "Merit Score." From the website:
"TalentSpring creates accurate merit scores by using the votes from candidates. Advanced mathematics are used to detect inaccurate votes and remove them while still accurately ranking candidates. The top resume in an industry receives a merit score of 2,000 and the most entry level candidate receives a merit score of 1,000."
How does it work?
"The voting process used to generate the Merit Score rankings is very simple. Voters are shown a series of resume pairs. With each pair the voter is asked which of the two candidates is most likely to be brought into an interview for the typical job opening in this job category. It is that simple - is Candidate A or Candidate B better in this job category. There is no worrying about previous pairs or what resumes are going to show up next. Each pair is considered in isolation."
And who's voting?
"Your resume is voted on by other people seeking to be ranked in the same category you are. Just as you are voting on other candidates in the same job category you are in. Since TalentSpring offers quite a few job categories to choose from, on occasion you may be voting on (and be voted on by) candidates in related job categories. For example, a C++ programming candidate might end up voting on Java programmers."
What about accuracy?
"We know when people are voting outside the "normal" range and remove these votes from the ranking calculations. We think that the ability to accurately vote is a skill that recruiters are interested in because it reflects both your understanding of the position you are interested in and your attention to detail. That is why we calculate and post your voting score as part of your Candidate Overview."
So what do you think? I was disappointed by who's doing the ranking--I assumed by "peers" they meant one's actual co-workers. Now that would be interesting, given that those types of peer ratings are at least partially trustworthy. I wonder how accurate ratings from competing job hunters will be. With no control for subject matter expertise, accuracy rests solely on (what I assume is) the detection of statistical abnormalities. Not particularly encouraging. In addition, choosing one candidate over another within a broad job category may prove an impossible task, since even jobs in the same category can vary substantially in the competencies/KSAOs they require.
BUT, that said, it is encouraging to see steps being taken in a different direction. If we could combine some of these approaches, we might slowly make our way toward a database that employers could feel good about. Of course, that means a whole other discussion on rating formats...
Hat tip.