Bob Sutton has an intriguing post, A Surprising Study of Infant Mortality Rates: Evidence-Based Management Meets Evidence-Based Medicine. One of the surprising conclusions:
One kind of collaboration was linked to higher mortality rates. When front-line employees became more involved in unit governance — doing things like being involved in decisions about who was hired and fired, the creation of new positions, scheduling, and budget allocation decisions – mortality rates WENT UP.
[…] I would also speculate that the staff who were involved in those decisions might have been distracted from their jobs – taking care of sick little babies – and that in some cases (although they were given lots of information) they may have been given a greater voice in decisions that they lacked expertise about.
It’s critical to consider where and how people collaborate. I bet some of those folks were not interested in budget allocation, especially if budgeting in hospitals is the sham it is in many of the organizations I work with. (The senior managers have already determined the budget. Why make managers do more work?)
I don’t know the answer to how much collaboration is the right amount. If the collaboration is about how we as a group work together, what work we do, how we know when we’re done–that’s all ripe for collaboration. Even for hiring, the team needs to be involved in the interviewing and in the hire/don’t-hire decision, but not necessarily in the hiring strategy, the planning, the offer, or the first-day logistics. If it’s about paperwork, get someone else to do it. It really depends on who makes the decision.
If the decision is already made, don’t involve the team. If the team can’t influence the decision, don’t involve the team. But if the team can influence or make the decisions, and the decisions affect how they do their work, then that decision is likely ripe for collaboration.
It’s certainly worth thinking about.