


June 1995 Computer Audit Update

                          No-one   Data Owner/    Central Computer
                            %      End User %        Service %
Access rights:
  to central data            4         42               84
  to other data             18         79               24
Integrity:
  of central data            5         34               86
  of other data             13         84               18

Table 3: Perceived responsibilities for aspects of data management. (Multiple choices were possible. Totals do not add up to 100%)


Keith Hearnden is a Senior Lecturer at the Centre for Hazard and Risk Management, Loughborough University. The research on which this article is based was commissioned by IBM (UK) Ltd and the Computing Software and Services Association.

© Copyright Keith Hearnden 1995

CLIENT/SERVER - STRIKING A BALANCE

John Edler

Client/Server architecture has been strongly promoted in recent years by the vendors of computer hardware and software. Whilst there are many benefits to adopting this approach, there are some drawbacks that are not obvious at the time of systems purchase and project initiation. The effects of these drawbacks also grow with the system, and it is therefore particularly important to have a balanced view if an investment in such a system is to live up to expectations.

In most organizations, general managers are influenced by three drivers to change from conventional systems architectures. The three main sources of pressure are: competition, as more organizations use IT successfully to achieve competitive advantage; technical staff, whose interests are fed by technical innovation; and vendor marketing, which is discussed in more detail below.

The message frequently delivered to general management by all of these sources is that the price of computer hardware has dropped dramatically, which means that more sophisticated things can be achieved through IT. Client/Server is promoted as one of the techniques that has come of age now that sufficient processing power is affordable and available. The case is supported by the argument that we have progressed from managing data to managing information, and that Client/Server empowers end users by making information readily available.

This article considers the apparent advantages of Client/Server and the apparent disadvantages, and draws on a case study for the advantages and disadvantages actually experienced over the first two to three years of a sizeable project.

© 1995 Elsevier Science Ltd

Clearly there are any number of vendors who will put the case for Client/Server, and this paper attempts to redress the balance. By definition, therefore, it has been necessary to focus on some of the difficulties; in so doing it is the author’s intention to put all of the issues before the would-be investor, not to dismiss the benefits that this approach can deliver.

The market dynamics

In considering any ‘new’ approach in IT, it is worth bearing in mind that change is almost always driven by product vendors. An analysis of the history of computing shows that the price of computer hardware is always dropping and that the cost of a given unit of computing power halves roughly once every three years. This is undoubtedly the result of competition and, by and large, benefits the customer. Given this situation and the vendors’ growth aspirations, however, they are under considerable pressure to introduce any and every technique that will consume more processing power. Change is essential to the vendors’ businesses, and whilst this may often be in their customers’ interests, it is important to bear this pressure in mind when considering Client/Server or indeed any other new introduction.

Client/Server - a model

In recent years we have been asked to choose between the convenience of PCs, which lack serious data processing power, and the ability of centralized systems to process and control large volumes of data, but at the expense of flexibility and a hospitable environment for users of the system.

In the Client/Server model we have two distinct elements, the client system and the server system, and these are usually, but not always, separate pieces of hardware. The strengths of Client/Server come from the way in which processing is shared by these two environments.

A transaction is split into two elements. The front end of the transaction resides on a PC and makes use of the friendly features that it provides, whilst the number-crunching part of the transaction and volume data management take place on a more powerful system.

As systems users make requests on the front end client system, it passes these on to the server. The server produces a result, and the response is sent to the client system, which in turn formats the response and passes it to the user.
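In outline, the split works as follows. This is a minimal sketch in modern Python for illustration only: the data, function names and in-process call are invented stand-ins for what would, in practice, be a networked exchange between two machines.

```python
# Sketch of the client/server split: the server owns the data and does
# the number crunching; the client formats requests and responses for
# the user. All names and figures here are illustrative.

# -- Server side: volume data management and processing --
SALES = {"north": [120, 95, 143], "south": [88, 102, 75]}

def server_handle(request: dict) -> dict:
    """Produce a raw result for a client request."""
    region = request["region"]
    return {"region": region, "total": sum(SALES[region])}

# -- Client side: the user-facing front end --
def client_query(region: str) -> str:
    """Format the user's request, pass it to the server, format the reply."""
    response = server_handle({"region": region})  # in practice, over a network
    return f"Total sales for {response['region']}: {response['total']}"
```

The essential point is that neither side does the other's job: the server never formats output for the user, and the client never touches the raw data.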

It is important to note that in a true Client/Server based system we have two live executing programmes awaiting requests and sending responses. It is here that Client/Server offers most of its advantages, and it is also from here that most of its drawbacks arise. At any time both systems are open. This means that all security access and control on these systems is left open, and therefore unused, in the interests of freedom and speed of access.
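Because both programmes are permanently live, any control that is retained has to be applied per request rather than at logon. One way a server application can compensate is to vet each incoming request itself. A minimal sketch of the idea follows; the rights table and names are hypothetical and do not represent any particular product's mechanism.

```python
# Sketch: with OS logon controls bypassed in the interests of speed,
# the server application itself must vet every incoming request.
# The rights table below is purely illustrative.
ACCESS_RIGHTS = {"alice": {"read"}, "bob": {"read", "write"}}

def handle_request(user: str, action: str) -> str:
    """Allow only actions the rights table explicitly grants."""
    if action in ACCESS_RIGHTS.get(user, set()):
        return "OK"
    return "DENIED"
```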

The advantages of Client/Server

The case for Client/Server put by product vendors revolves around the following advantages:

• Reduced hardware expenditure.

The effect of using PC hardware for a substantial part of the job is that hardware is purchased in much smaller increments than a centralized system. PCs also deliver front end processing (e.g. graphics) much more cost-effectively than large platforms.

• Reduced development costs.

A further effect of using PCs is that software such as Windows provides an excellent tool-kit for developing front ends. Similarly, because the server system concentrates on processing and managing data, tasks to which it is ideally suited, reduced development costs for server based systems are also achievable.


• Reduced development time.

For the same reasons as development costs can be reduced, so too can development times. This can be particularly valuable for applications delivering significant competitive advantage. Where new systems deliver a competitive edge, benefits to the organization are greatest during the early life of the system and are therefore influenced by the speed with which implementation can be achieved.

• Improved usability and end user productivity.

Clearly the ability to use PC front end software as a means of access to systems is an encouragement to experienced PC users.

• Empowerment of users.

Client/Server provides powerful data processing which is independent of the front end software. Using this capability behind flexible tools such as spreadsheets gives substantial power to users of these systems. In addition two users may utilize the same processing capability on the server but use entirely different front ends of their own making to achieve different effects.

• Enables business process re-engineering.

Client/Server uses a point of familiarity, the PC, as a stable point in change. By using a known front end, new systems can be introduced rapidly and with minimized disruption.

These are the benefits put forward by vendors to promote the case for Client/Server. It is not their role to put forward the shortcomings.

The apparent drawbacks of Client/Server

The potential drawback usually identified before the start of a Client/Server project is security. The concerns can be summarized as follows:

• Server access controls are neutralized.

This is a function of freedom of access. There are a number of compensations built into the more comprehensive network software products.

• Multiple systems are available to each user.

In a Client/Server environment, users can have access to several different servers and systems. This requires careful management if the advantages are not to be lost in a melee of different logons, passwords etc.

• Multiple users have access to a system.

This is the complement of the point above. In an environment where many different access methods are used, this adds a layer of complexity to the requirements of server based systems.

• Transaction integrity.

As no one piece of software carries out an entire transaction, recovery from an unplanned halt can be very complex and requires careful design to ensure that complete and incomplete transactions can be identified and recovered.

• Systems backup.

It is vitally important that there is a comprehensive strategy for backing up data.

• Development strategy.

The imposition of a formal development strategy does go some way to dulling the freedom and flexibility that Client/Server promises. Allowing end users complete freedom does, however, pose a significant risk, and a pragmatic balance needs to be found.
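The transaction integrity concern above is commonly addressed by recording the state of every transaction in a log, so that after an unplanned halt the incomplete transactions can be identified and rolled back or replayed. A minimal sketch of the idea, in Python for illustration; this is the general logging technique, not any particular product's recovery scheme.

```python
# Sketch: a state log that lets recovery code distinguish complete
# transactions from incomplete ones after an unplanned halt.
log: list[tuple[str, str]] = []  # (transaction id, state), appended in order

def begin(txn_id: str) -> None:
    """Record that a transaction has started, before any work is done."""
    log.append((txn_id, "STARTED"))

def commit(txn_id: str) -> None:
    """Record that a transaction finished completely."""
    log.append((txn_id, "COMMITTED"))

def incomplete_transactions() -> set[str]:
    """Transactions started but never committed: roll back or replay these."""
    started = {t for t, s in log if s == "STARTED"}
    committed = {t for t, s in log if s == "COMMITTED"}
    return started - committed
```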

Advantages actually achieved

Experience suggests that many of the apparent advantages of Client/Server are actually being delivered. Study of one implementation, now some three years old and providing an environment of approximately 150 PCs, shows that the following advantages have actually been achieved as the system reaches a state of maturity:


Hardware costs have been reduced in the sense that purchasing increments have been greatly reduced: smaller unit costs are involved in expansion and proliferation of the system. As most organizations now use PCs widely, not all end users will need new hardware, and better utilization of existing equipment can also be achieved.

Systems have been both simpler and quicker to build although the need for considerable integration testing effort has to be balanced out against these advantages.

Training has been greatly eased as users have enjoyed one interface with systems and the simplicity of Windows.

Considerable improvement in the effective use of systems has been achieved through uniformity and easy to use presentational methods such as Windows front ends.

Client/Server has been found extremely valuable in presenting an integrated face to a number of older mainframe based systems which actually lack commonality. In environments with a long history of mainframe development this can be a valuable mechanism for revitalizing ageing software.

These advantages are very easily delivered early in the life cycle of a Client/Server based system, but as the size of the system increases and the population of end users grows, a trade-off occurs. The initial advantages, which are overwhelming for a small number of users, become less and less pronounced, and the disadvantages discussed below start to outweigh them. The achievements of Client/Server are therefore strongly linked to the size of the population to be served: the larger the end user community, the less appealing Client/Server appears to become.

Disadvantages actually experienced

The experience of organizations using Client/Server based systems is that the disadvantages are limited to the security issues discussed above and escalating systems management costs. It is, however, the magnitude of these disadvantages, and in some cases the disregard of them at project initiation, that has been at the root of disappointing results.

In practice the security issues can be quite considerable. A Client/Server system deliberately sets out to strip away the protective shell that sits around most conventional systems. In adopting this new architecture we give any end user complete freedom of access, complete freedom to change software, introduce new software, even modify applications in an undisciplined and unstructured fashion! In practice access to software such as File Manager has to be controlled and it is the reintroduction of controls that have traditionally been delivered by the operating system that adds the first overhead onto Client/Server.

We have to consider carefully the impact of having, say, File Manager freely available to all users of the system. Not only does everyone get a ‘warts and all’ view but the control and educational issues should not be underestimated.

Everybody will of course reintroduce disciplines to reduce or nullify these effects. But then we have reinvented the wheel. Experience suggests that a very considerable effort goes into replacing the disciplines that have been stripped out by discarding the old conventional environment provided by the operating system.

There are three influences that result in escalating systems management overheads. These are:

• The size of the user population.

• The complexity of the system.

• The age of the system.


What is difficult to predict is that these influences interact and result in a cubic effect on support effort and costs.

The first issue is demonstrated by the amount of effort necessary to achieve a systems upgrade. Most of us are used to applying an upgrade in, say, software or memory just once, and that upgrade is then effective for any number of users. Even a major upgrade on a large mainframe based system will be achieved in, say, eight or ten hours, and this can upgrade several hundred or several thousand end users. In addition we know that when we test that upgrade, it is verified and consistent across the user population.

To upgrade a few PCs is very straightforward, but the effort required to upgrade, say, the memory of each of one hundred PCs, or to install new software in the same population, is quite different. To upgrade one hundred end user systems in, say, a ten hour window allows only a few minutes per device, and that includes locating each PC. In practice, even using several teams of support staff, upgrading becomes a major consideration in an environment of more than about one hundred devices. The prospect of upgrading a thousand end user systems simultaneously poses major management and logistics problems.
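The arithmetic behind that observation is worth making explicit: a ten hour window is 600 minutes, which across one hundred machines leaves six minutes per machine, and across a thousand machines well under one. A trivial calculation, in Python for illustration:

```python
def minutes_per_device(window_hours: float, devices: int) -> float:
    """Share an upgrade window evenly across a fleet of end user systems."""
    return window_hours * 60 / devices

# A ten hour window across 100 PCs leaves 6 minutes per machine,
# including the time taken simply to locate each one; across 1000
# machines the figure drops below a minute.
```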

In the view of one Systems Manager, referring to a system with 150 PCs:

“Faced with a major change, say a new release of Windows and Novell and a memory upgrade, it is much easier to ship in preconfigured systems”.

Whilst this illustrates the effort involved, the cost of such an approach will soon become prohibitive.

It is at this point that the paradox at the heart of Client/Server comes to light. If Client/Server systems remained in a steady state, the advantages would almost certainly balance out the disadvantages, but the constant movement of products makes this an impossible dream.

Nor does the customer have the option to freeze their system. Every time a new piece of hardware is delivered it comes with the latest BIOS, new drivers etc., and before long inconsistency is delivered into the environment without the customer having any control or choice in the matter. Recent quotes from a systems manager illustrate the point:

“The components don’t happily co-exist because we want them to”

“There are far too many changes for us to cope with. Even a simple environment of just two or three products rapidly gets beyond management”

New product releases force upgrades to the installed systems, and upgrades are logistically taxing. As the end user population grows, this becomes the overwhelming factor governing success. As systems age, upgrades become more extensive as, for example, we reach the limit of installed memory and are forced into hardware upgrades to all installed systems to accommodate new software. And as these influences compound, the support effort rapidly grows beyond manageable proportions.

The actual system administration issues which grow out of change and out of multiple suppliers cover the following different systems components:

• hardware;

• applications software;

• operating systems;

• network hardware;

• network software;

• other protocols (e.g. 3270).

At any given time, the vendor providing any of these products may be introducing an updated version of their product, and with it they introduce potential incompatibilities which will echo throughout the system.

The management issues that arise from maintaining a constantly changing environment with this number of variables very quickly grow beyond any reasonable level of control.
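The scale of that problem is easy to demonstrate: if each component type is present at even a handful of versions across the estate, the number of distinct combinations that must coexist grows multiplicatively. A small illustration; the version counts below are invented purely for the example.

```python
from math import prod

# Hypothetical counts of versions simultaneously in live use for each
# component type in the list above; the figures are invented.
versions_in_use = {
    "hardware": 3,
    "applications software": 4,
    "operating systems": 2,
    "network hardware": 2,
    "network software": 3,
    "other protocols": 2,
}

def distinct_combinations(versions: dict[str, int]) -> int:
    """Each combination of component versions is a potential incompatibility."""
    return prod(versions.values())
```

With these modest figures the estate already contains 288 distinct combinations to keep compatible, and every vendor release multiplies the total again.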

Summary

The lessons learned from Client/Server to date can be summed up as follows:

Client/Server can be a very attractive approach to delivering new systems but the attractions may be swiftly outweighed as the end user population grows.

Security and discipline are hard to achieve in the end user environment.

To balance these:

New systems can be delivered very swiftly, with very attractive front ends, using products such as Windows which encourage end users.

Client/Server is therefore likely to provide an attractive option for those systems where time to implementation and speed of delivery is key to the success of the project. In addition, Client/Server will be at its most successful where a system or a project is contained within a comparatively small end user community. Finally, it is those applications that require little or no formality that are likely to produce the biggest benefits. Applications such as electronic mail lend themselves very well to the Client/Server approach, whereas those systems that are subject to strict audit are likely to test security and discipline to a high degree.

Postscript

What has started to emerge is the important role that systems suppliers have to play in the success of Client/Server. There is some value to a ‘One Stop’ approach, buying all products from a single source, but it is limited. A dealer providing a bundle of hardware and software products has little control over the originators of those products. There is, however, an opportunity for the service providers to play a far more positive role with a service that provides preconfigured systems with an undertaking to maintain compatibility. To the author’s knowledge no such service is yet available in the UK; it would go a long way to fulfilling the promise of Client/Server.

THE AUDIT OF INFORMATION

Hugh Parkes

Today’s audit professional faces an enormous challenge: how to identify the issues that matter amidst a sea of competing priorities. In an increasingly automated world, the auditor has to decide how much time to allocate to the technological environment, the application and system software environment, the communication environment and the information environment. This last - the information environment - is the focus of this article. It is also the context where the greatest value can be provided by an auditor and yet, paradoxically, it is the least discussed in contemporary audit literature, and, apparently, the least understood by many auditors.

So what is information? How do we need to go about auditing it in the future? How will it change what we do now? How will it help the wider audit profession to bridge the assorted expectation canyons we currently face? How will we provide new information-linked services to our customers? And how will information help us to make a dollar?

The value of information

Information is an intangible asset: it has a definite value, but is hard to touch and sometimes
