Trench warfare

One new experience I've had recently is IT operations being split slightly unusually: technology-specific departments have their own IT staff (and knowledge repositories), but a centralized IT function still exists to handle internal network management, new desktop provisioning and so on. There have been obvious, predictable issues, but what interests me more are the subtler knock-on effects. I should state here that by "change" I don't mean altering production systems; I'm referring to the deployment of new systems into production and the changes needed for that to happen: DNS record additions, IP allocation, firewall changes, etc.

There are, of course, obvious issues. The resulting tribalism is plain to see, and when one group causes problems for the other, anyone wishing to solve their immediate problem simply flows around the roadblock. Rather than battling jointly against the problems at hand, infighting takes over, and as each person stakes their claim to be the recognized go-to point of knowledge (to the exclusion of all others, of course), communication breaks down.

Seeing these things happen is no surprise, but first-hand experience has revealed some aspects of the outcome that were, to me, quite unexpected.

Because resistance from the other party occurs for every change regardless of significance, the knock-on effect is that each change gets blown up to be larger than it actually is. This creates a tendency to dig in to a stance which isn't yet defensible. By that I mean that when unduly challenged on whether a change is even needed, you can easily find yourself having to nail your colors to the mast, even if the idea is only partially expected to work. The ability to try new things suffers, as you end up in a situation where ideas put forward need to be 100% guaranteed to work. The issue is that in IT there are often no guarantees, and meaningful tests and trial runs are hampered by the same opposition. The net result is that if an idea shows any sign of not being immediately perfect, the other party will likely refuse to undertake it.

As an IT function, this is in no way healthy. Part of being agile as an IT business function is the ability to roll forward with change, knowing that not all solutions are perfect first time or work without tweaking. Indeed, more time is often spent trying to ensure everything is perfect first time than would be spent initially failing and then resolving the issues found. Introducing unneeded barriers to very minor infrastructure changes results not in better final solutions, but in bullishness behind potentially bad ideas. The initiating party finds it easier to flow around the roadblocking party by outsourcing, or by writing exceptions to overcome pointless push-back. This creates fragile infrastructure full of temporary fixes to problems which should never have occurred in the first place, and which wouldn't have been solved so badly if experimentation had been allowed for in the initial development.

It's not just the initial parties that suffer. If an outside entity is somewhat trusted for their opinion by both parties, they can be drawn into a dispute to act as a sort of judge and jury. In all likelihood they will not have all the facts, and are effectively forced to pass judgment out of hand. Often, to avoid being dragged in, they will simply make a quick decision which they hope both parties will respect by virtue of the fact that it wasn't the other party that came up with it.

Again, I'm not saying you should not test your solutions before attempting to deploy them, but tribalism should not turn deployment to production into a roadblock in the release cycle.
