The competencies required in the modern workplace seem to change on a daily basis. Whereas computer skills were once a "nice to have," at least moderate computer proficiency is now a "must have" for many (if not most) jobs. Whereas frequent collaboration with others was once an occasional requirement, it is now a prerequisite for many occupations.
Often these changes are necessary to take advantage of new technologies and keep up with competitors, and for the most part the transition works. Sometimes, though, we plan for one shift in competencies when we should have gone in another direction.
The military offers an excellent demonstration. In an article from The Economist titled "Armies of the future," the author describes the vision of former U.S. Defense Secretary Donald Rumsfeld:
The army's idea of its “future warrior” was a kind of cyborg, helmet stuffed with electronic wizardry and a computer display on his visor, all wirelessly linked to sensors, weapons and comrades. New clothing would have in-built heating and cooling. Information on the soldier's physical condition would be beamed to medics, and an artificial “exoskeleton” (a sort of personal brace) would strengthen his limbs.
At first, with initial successes in Afghanistan and Iraq, this vision seemed validated. But as the nature of the warfare changed, so did the requirements for soldiers. The current commander of the war in Iraq, General David Petraeus, has co-authored a new manual on counter-insurgency. According to the article:
Counter-insurgency, [the manual] says, is “armed social work”. It requires more brain than brawn, more patience than aggression. The model soldier should be less science-fiction Terminator and more intellectual for “the graduate level of war”, preferably a linguist, with a sense of history and anthropology.
This has huge implications for how the military will recruit, assess, and train soldiers. We can argue about whether this shift could have been predicted, but there are some clear lessons here:
1 - Job analysis is important, and it isn't something you do once and file away. Not that we needed more evidence of this, but the example above vividly demonstrates how critical it is to carefully study a job (and keep studying it) to inform recruitment and assessment. Of course, simply doing a job analysis doesn't guarantee success; I'm sure a not-insignificant amount of thought went into the "future warrior" idea.
2 - Look before you leap. Plan for how you will select new people and train existing staff. The article doesn't go into this in depth, but one big implication of such a significant shift in competency requirements is what to do with existing staff. We've all been there: we implement a new software program, or we require everyone to start writing or presenting more, and sometimes it works, sometimes it doesn't. Is our success due to the skill level of existing staff? Our careful planning? Or dumb luck?
When it comes to ensuring people succeed at what they do, fortunately we don't have to rely on luck. But we do have to devote time, resources, and careful attention to doing recruitment and assessment right. The stakes are high, whether that means saving lives or helping an organization succeed.