Enterprise architecture started as a happily unreconstructed techy activity. When that didn’t always work, a certain Maoist self-criticism kicked in, with an exaltation of “the business” above all else, and taboos on even thinking about IT. Today’s Open Group sessions threatened to take that reaction to its logical extreme. Fortunately, it didn’t quite end up that way.
The trouble with realising that getting anywhere with IT involves changing the rest of the organisation as well, is that it gets you out of your assigned role. Because the rest of the organisation is guaranteed to have different perspectives on how it wants to change (or not), what the organisation’s goals are and how to think about its structure, communication is likely to be difficult. Cue frustration on both sides.
That can be addressed by going out of your way to go to “the business”, talk its language, worry about its concerns and generally go as native as you can. This is popular to the point of architects getting as far away from dirty, *dirty* IT as possible in the org chart.
So when I saw the sessions on “business architecture”, my heart sank. More geeks pretending to be suits, like a conference hall full of dogs trying to walk on their hind legs, and telling each other how it’s the future.
When we got to the various actual case reports in the plenary and business transformation track, however, it turned out that EA self-negation is not quite what’s happening in reality. Yes, speaker after speaker emphasised the need to talk to other parts of the organisation in their own language, and the need to provide only relevant information to them. Tom Coenen did a particularly good job of stressing the importance of listening while the rest of the organisation does the talking.
But, crucially, that doesn’t negate that – behind the scenes – architects still model. Yes, for their own sake, and solely in order to deliver the goals agreed with everyone else, but even so. And, yes, there are servers full of software artefacts in those models, because they are needed to keep the place running.
This shouldn’t be surprising. Enterprise architects are not hired to decide what the organisation’s goals are, what its structure should be or how it should change. Management does that. EA can merely support by applying its own expertise in its own way, and worry about the communication with the rest of the organisation both when requirements go in and a roadmap comes out (both iteratively, natch).
And ‘business architecture’? Well, there still doesn’t appear to be a consensus among the experts on what it means, or how it differs from EA. If anything, it appears to be a description of an organisation using a controlled vocabulary that looks as close as possible to non-domain-specific natural language. That could help with intra-disciplinary communication, but the required discussion about concepts and the words to refer to them makes me wonder whether having a team who can communicate as well as they can model might not be quicker and more precise.
I tend to agree on the issues of “Business Architecture”. As an EA, I find that more often than not I am also a management consultant, working on organizational, structural and operational issues within the “business”.
I point out to people that EA is a self-repairing process (properly done, which it isn’t always), and as such improves upon its understanding of the organization. This level of discipline, tracking, management and modelling doesn’t exist within most business units; in fact, the concepts of same are typically lost on “Business Managers” – well, if not the concepts, then certainly the practice.
I am reminded (aren’t anecdotes wonderful) of working for a Telco whose main line of business was GSM (mobile phones). We were discussing Business Continuity and its implications with the business. Their “core systems” performed three functions: billing, activation of new services, and customer service on existing accounts. They were 100% focused on billing because, as I recall, they billed $50M a week. I pointed out that this was flawed thinking. How so? Their “feeder systems”, which picked up call details etc. from the telephony network, were resilient, dual-site clusters in different locations to the billing (and the exchanges kept files for several billing cycles anyway). So the probability of losing the “billable data” was effectively zero; the real issue was loss of the platform on which to process it. Rather than spending several million on a warm standby for this, the suggestion was that an agreement be put in place with either another business unit using similar hardware, or the vendor of the platform, to “fast track” a replacement from manufacturing when needed.
Why? Because, knowing that the data was there and that the ability to process it could be secured, the only issue on billing was that it would be delayed. Ergo, the billing cost would be the cost of delayed revenue – effectively the “interest” on $50M for a week or two. Much, much less than $50M; in fact about $50,000, if my memory serves me. However, compare this with the cost of “lost activations”. (Activation could still occur if this system was offline, with engineers logging directly into the HLRs (Home Location Registers) – effectively the authentication servers of the telephone network. But there was no back channel to propagate this back into the billing/activation system; in fact there was no ability to audit the differences per se, until we conducted an audit and found thousands of discrepancies, i.e. people who ultimately weren’t being billed for the service they were getting – and I am sure they would be rushing to tell us.)
The number of new activations per day was in the order of 1200–1500. The target spend over the life of the contract per client was around $2000. So, for each day that we couldn’t activate (cleanly), we would be losing $3M in revenue. (Can’t activate? The client walks down the street to a competitor and we never see that revenue, ever… – worse, the word gets out, and panic ensues amongst existing clients.) Now, hot-standbying activation versus billing were orders of magnitude different in terms of processing power. We were “dual-siting” the activation component (in the DR plans) separately from the billing, and there was then a panic about the “bandwidth necessary” to keep two sites in sync. When you consider that 1500 activations (and maybe another 3000 account changes) a day was one event every 20 seconds or so, and that the links to the telephone network to implement these changes were 9600-baud lines, the actual rate of data change was minuscule compared to the 2Gb pipes that the “business” thought they needed. (Age old – tell me what you need, not what you think you want…)
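The back-of-the-envelope arithmetic above can be sketched in a few lines. The figures come from the anecdote; the annual interest rate is an assumption chosen purely to illustrate how a ~$50,000 cost of delay falls out of $50M delayed by two weeks:

```python
SECONDS_PER_DAY = 24 * 60 * 60

# Billing: the data itself survives, so the exposure is only the cost of
# delay -- roughly the interest forgone on a week's billings, delayed ~2 weeks.
weekly_billings = 50_000_000
annual_rate = 0.026           # assumed rate; the anecdote implies ~$50k
delay_weeks = 2
billing_exposure = weekly_billings * annual_rate * (delay_weeks * 7 / 365)

# Activations: lost outright rather than delayed, so the full contract
# value walks down the street to the competitor.
activations_per_day = 1500
contract_value = 2000         # target spend over the life of a contract
activation_exposure_per_day = activations_per_day * contract_value

# Sync bandwidth: ~4500 change events a day is one event every ~19 seconds,
# trivial next to the 9600-baud lines feeding the network itself.
events_per_day = 1500 + 3000
seconds_per_event = SECONDS_PER_DAY / events_per_day

print(f"billing exposure: ~${billing_exposure:,.0f}")
print(f"activation exposure: ${activation_exposure_per_day:,} per day")
print(f"one sync event every {seconds_per_event:.0f} seconds")
```

The point the numbers make is the asymmetry: the “small” activation stream dwarfs the “big” billing figure once you ask what is actually lost versus merely delayed.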
The bottom line? It’s not always the “big money” that is most important and, as highlighted, assuming that the people responsible for that big money see the even bigger picture is often flawed thinking.
Hi Wilbert,
Interesting post. Your observation that:
“Enterprise architects are not hired to decide what the organisation’s goals are, what its structure should be or how it should change. Management does that.”
is certainly true of many organizations, but it may not apply to universities as a whole.
In my experience, managers of support service units within the university do have a lot to say about the structure of their units, the role definitions (job descriptions) of staff who work for them, the mission of their units within the organization, and so forth.
But university administrators do not have much say about the role of faculty. Although I’m going outside my direct experience here, I would say that this is also true of many professional organizations (law firms, hospitals) where the professional employees have learned to play their role through many years of schooling prior to entering the workforce.
It would make sense, therefore, if technocrats in some industries are given free rein to define and automate the roles of some employees. An example of this might be the Automatic Teller Machine in the banking industry.
On the other hand, I shudder to think what the response of research faculty would be if would-be technocrats at the university tried to introduce the idea of the Automatic Professor Machine!
And, in most research universities the faculty have the political clout to protect their interests since they outrank IT technocrats in the university’s pecking order.
Unfortunately, the professoriate may be able to resist even good ideas for change because they feel their interests are threatened by technocrats with expertise in areas relevant to the objective of improving instruction and/or reducing its cost, areas such as IT, cognitive psychology, educational psychology, and so forth.
Of course, entrepreneurial educational startup firms may be in a position to challenge and disrupt the incumbent institutions with innovative new technologies, new business models, and new job roles. We may be seeing this today with the rise of MOOCs.
In any event, if you’re interested in more on how different organizational forms respond to innovation, check out my blog post on:
Mintzberg’s Taxonomy of Organizational Forms
http://innovationmemes.blogspot.com/2012/11/mintzbergs-taxonomy-of-organizational_24.html
Cheers,
Fred
Hi Fred,
Universities – especially large research ones – are indeed a special case. They seem more like loose conglomerates that sort of share a brand and maybe a couple of services, than a single organisation with a clear governance structure.
As you say, many groups within those universities have considerable clout, and a bunch of IT guys pretending to be business people, coming in to rationalise organisational structures in the name of efficiency seems a somewhat Quixotic undertaking. They can do it within their own domain, but the trouble with IT is that it is infrastructure that reaches all parts of the organisation…
The Mintzberg taxonomy article looks interesting – very much in line with the cybernetics that colleagues at CETIS are working with!
Thanks!
Wilbert