Should Technology be at the Heart of IT?

While working with senior IT management a while back, an example served to highlight different approaches to managing the IT function, or at least sub-components of it. What became apparent was the influence of the background and experience of management, as people tend to play to their strengths. The example was a discussion around a new position that had been created in the IT governance team, relating to process and ITIL.
 
What struck me about the approach was that the technology was already embedded, and the person was being recruited to manage the technology and build process around it. In other words, technology was at the core of the business objective, with administrative and procedural components mapped around it. Is this the right way to approach IT management? The obvious risk is that the function is left at the mercy of the given technology: should the technology be replaced within the organisation, skills will need to be refocused, and possibly infrastructure and other components radiating outwards. Documentation and other sound governance elements will also need to be revisited.
I don’t believe this model to be entirely wrong; there may be instances where building a function around a technology is advantageous, especially when core business objectives revolve around that technology.
 
From a maturity, reuse and risk management perspective, I believe the process should be at the core of the function, with technology acting as an enabler, especially given the current rate of change. I mention risk because in the Information Security space this is analogous to the development of an InfoSec Policy, which starts at a sufficiently high level to be technology agnostic and then plugs in current and future technologies through lower-level standards and sub-policies. If technology comes before process, the process constantly has to be updated as the technology matures.
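To make the layering concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the policy wording, the area names, the structures are my own invention, not drawn from any standard); the point is only the shape: the policy layer stays technology agnostic, while the standards layer binds it to whatever technology is current.

    # Hypothetical sketch: a technology-agnostic policy layer with
    # technology-specific standards plugged in underneath.
    POLICY = {
        "remote-access": "All remote access to corporate systems must be "
                         "authenticated and encrypted.",
    }

    # Lower-level standards bind each policy statement to current technology.
    STANDARDS = {
        "remote-access": {
            "technology": "IPsec VPN",
            "controls": ["two-factor tokens", "AES-256 tunnel encryption"],
        },
    }

    def replace_technology(area, technology, controls):
        """Swap the enabling technology without touching the policy."""
        STANDARDS[area] = {"technology": technology, "controls": controls}

    # A later migration (e.g. to an SSL VPN) leaves POLICY unchanged:
    replace_technology("remote-access", "SSL VPN",
                       ["client certificates", "TLS encryption"])

When the technology is replaced, only the standard is rewritten; the policy statement, and everything that references it, survives the change.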
 
So, is the decision to place technology or process at the centre a product of the skills and experience of management, driven by those more in touch with the technology (or those who shout the loudest), and is it therefore destined to be tactical rather than strategic?

The price of 24/7 uptime?

Many sectors of the economy, especially those supported by technology (e-commerce and others), are under pressure to increase service delivery. The proliferation of access and mobility means clients want to reach their products, accounts and services at any time of day or night.

An expectation of 100% uptime has an impact on operational tasks. In the past it was acceptable to take systems down during planned maintenance windows to reboot, resolve issues, install patches, and so on. How does an organisation continue to perform essential maintenance when clients could be accessing the systems at any time? Technology solutions are available, and are more accessible in the redundant and virtual world. However, the pressure remains, and it can affect the governance of operational tasks such as change management: in order to keep a system up, one may feel the need to reduce the level or depth of testing prior to implementing a change. This should be a key area of focus for those monitoring or auditing the system development and change management processes.
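As an illustration of the redundancy approach, the sketch below (Python, against an entirely hypothetical infrastructure API; drain, patch, healthy and reinstate are placeholders, not a real library) rolls maintenance through a pool of redundant nodes one at a time, so the service as a whole stays up:

    import time

    NODES = ["app-01", "app-02", "app-03"]

    def rolling_maintenance(nodes, drain, patch, healthy, reinstate):
        """Patch one node at a time so the pool keeps serving traffic."""
        for node in nodes:
            drain(node)               # stop routing new requests to the node
            time.sleep(30)            # let in-flight requests complete
            patch(node)               # reboot, patch, fix: the downtime work
            if not healthy(node):     # test BEFORE it takes live traffic again
                raise RuntimeError(node + " failed post-patch checks")
            reinstate(node)           # rejoin the load-balanced pool

Note that the health check between patching and reinstating is exactly the testing step that uptime pressure tempts teams to squeeze, which is why the change records deserve audit attention.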

What are the symptoms of such a scenario? We could analyse the issues and incidents that affect a system and see whether the root causes are related to changes. For example, a high number of emergency changes right after a scheduled change may indicate that the planned changes were not adequately designed or tested. Requiring staff to work longer hours to resolve issues is one thing; when the client experience is affected, it is more serious. We will wait to see whether further details arise, but the issues experienced this weekend by the National Australia Bank could be related to this pressure:

http://www.news24.com/World/News/No-cash-after-computer-crash-20101127
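A simple version of that symptom check could even be scripted. In the sketch below (Python; the record fields and the 72-hour window are my own assumptions, not drawn from any framework), we count emergency changes raised shortly after each scheduled change:

    from datetime import timedelta

    WINDOW = timedelta(hours=72)  # arbitrary "right after" window

    def emergency_follow_ups(changes):
        """changes: dicts with 'id', 'type' ('scheduled' or 'emergency')
        and 'time' (a datetime). Returns a follow-up count per scheduled
        change; consistently high counts suggest the planned changes were
        not adequately designed or tested."""
        scheduled = [c for c in changes if c["type"] == "scheduled"]
        emergencies = [c for c in changes if c["type"] == "emergency"]
        return {
            s["id"]: sum(1 for e in emergencies
                         if s["time"] <= e["time"] <= s["time"] + WINDOW)
            for s in scheduled
        }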

Combined Assurance, or just collaboration?

In South Africa the King III Report and Code was released just over a year ago. The Code addresses corporate governance and, in this latest release, places an emphasis on IT governance (which is good; we'll come back to this later).
Another catchphrase to gain prominence from the Report is 'Combined Assurance'. The concept isn't new, but in these economic times, when efficiencies are sought to manage costs, it is not surprising that the role of the assurance providers is being scrutinised. Those who don't know better may feel that the likes of Internal Audit, Risk (Operational, Market, Credit, etc.) and Compliance fulfil very similar functions, or at least appear to from an execution perspective. If business perceives this to be the case, it will most certainly ask more questions about alignment and avoiding duplication. Enter Combined Assurance: the phrase business has been looking for to tell the assurance providers that it is time to spend more time talking to each other and less time talking to their hard-working staff. Correct on one level, but risky on another. This example is oversimplified, of course; I am setting the stage for future discussions that will explore the topic from various angles.
From a pure risk management perspective the idea of Combined Assurance makes sense: collaboration is key to improving the effectiveness of a risk management model, enabled by a clear definition of responsibilities, dependencies and relationships. In the IT space the concept of a multi-layered line-of-defence (or 'defence-in-depth') strategy has been around for some time. The objective there is less about efficiency and more about ensuring that if a control fails, or something falls between the cracks, another control is in place. This is vital when dealing with a constantly evolving threat landscape, although it may not directly address the issue of independence.
A risk is that too much focus on efficiency may come at the expense of effectiveness. By concentrating on how best to avoid duplication, something may be missed, or an additional layer of monitoring that really needs to be there could be removed. Perception may be the only thing that needs to change, or it may not. Either way, the topic is certainly worth exploring, and business asking questions should not be the only driver: combined assurance makes logical business sense.