High-Performance Computing and Insurance Actuarial Modelling
Published: Nov 2008
Abstract
This white paper discusses the pressures that exist in the insurance industry for more sophisticated,
more timely actuarial modelling. It examines the role of regulators and business drivers. It describes
how the Microsoft® Windows® High-Performance Computing Server 2008 platform can help insurers
of all sizes remove performance bottlenecks, improve decision-making and satisfy the demands of
regulators and rating agencies. It also examines how global professional services firm Towers Perrin
uses the flexibility and performance offered by Microsoft to enhance products and services for end-
users.
This is a preliminary document and may be changed substantially prior
to final commercial release of the software described herein.
The information contained in this document represents the current view
of Microsoft Corporation on the issues discussed as of the date of
publication. Because Microsoft must respond to changing market
conditions, it should not be interpreted to be a commitment on the part
of Microsoft, and Microsoft cannot guarantee the accuracy of any
information presented after the date of publication.
This White Paper is for informational purposes only. MICROSOFT MAKES
NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE
INFORMATION IN THIS DOCUMENT. Complying with all applicable
copyright laws is the responsibility of the user. Without limiting the
rights under copyright, no part of this document may be reproduced,
stored in or introduced into a retrieval system, or transmitted in any
form or by any means (electronic, mechanical, photocopying, recording,
or otherwise), or for any purpose, without the express written
permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks,
copyrights, or other intellectual property rights covering subject matter
in this document. Except as expressly provided in any written license
agreement from Microsoft, the furnishing of this document does not
give you any license to these patents, trademarks, copyrights, or other
intellectual property.
© 2008 Microsoft Corporation. All rights reserved.
Microsoft, Active Directory, Excel, Visual Basic, Visual Studio, Windows,
Windows Server, and the Windows logo are trademarks of the Microsoft
group of companies.
All other trademarks are property of their respective owners.
Contents
Contributors and acknowledgements
High-Performance Computing and Insurance Actuarial Modelling
High-Performance Computing background
    Microsoft and High-Performance Computing
    Introducing High-Performance Computing with Windows Server
Why do insurance companies need High-Performance Computing?
Benefits of High-Performance Computing
    Remove performance bottlenecks
    Better decision-making
    Regulators and rating agencies
    Auditability and manageability
    Other applications
Actuarial software and HPC
    Introduction
    Towers Perrin
Implementing High Performance Computing
    Example #1 – Windows HPC Server 2008 applied to Excel 2007
    Example #1(a) – Excel-HPCS Adapter for Legacy XLLs
    Example #2 – HPCS and Legacy VBA Code
    Example #3 – A Windows HPCS “Call to Action” for Systems Developers
Summary
Appendix 1 – more information
Appendix 2 – improvements in Microsoft Office Excel 2007
Contributors and acknowledgements
Michael Newberry, Product Manager, High-Performance Computing, Microsoft
Gordon Ejsmond-Frey, Insurance Industry Manager for EMEA, Microsoft
David Dorfman, Solution Specialist, High-Performance Computing, Microsoft
Van Beach, Senior Consultant, Towers Perrin
Brian Heale, Product Director for Towers Perrin’s Software Solutions
Brian Sentance, Chief Executive Officer, Xenomorph
High-Performance Computing and Insurance Actuarial Modelling
The essence of insurance is risk management. The better an insurer can quantify risks and manage
assets, the more accurately it can price its products and the better use it can make of its
reserves.
Insurers face immense competitive, regulatory and stakeholder pressures. This in turn is driving a
need for improved risk management, better financial performance, more robust quantitative
analysis and greater transparency. Simultaneously, regulators, rating agencies and investors are
placing increasing demands on insurers to manage these aspects of their business in a controlled and
auditable manner. Rating agencies and regulators demand robust enterprise risk management
(ERM) programmes that are linked to sophisticated risk analysis and economic capital models.
Investors expect greater transparency, control and governance. Effective financial modelling
solutions are a critical component of any insurer’s response.
To meet the growing demands insurers need to take a holistic approach to actuarial/risk
management (i.e., ERM) and be able to consolidate market, insurance, credit and operational risk
management to provide a complete picture of risk across the organisation. This enterprise view of
risk analysis is complex and requires a number of organisational, modelling and technology
responses to be in place. Considering the demands that lie ahead, most insurers recognise that
today’s fragmented modelling solutions will not meet tomorrow’s needs.
Actuarial/risk management is not only focused on protection from adverse events and
circumstances. Particularly in insurance, the process needs to find and exploit opportunities to
assume risk profitably. Actuarial modelling solutions must help insurers to capitalise on available
opportunities to improve business performance.
A priority for insurers will be to gain access to faster, more powerful modelling systems to support
their responses to the new demands, both to comply with external regulators and market analysts
and to run a more nimble, agile business. Better decision-making can lead to revenue growth,
greater profits and value creation.
Improved accuracy is also needed to prevent catastrophic problems, either from too-rapid growth or
from the failure to carry sufficient reserves. This is why regulators and rating agencies are paying
more and more attention to risk management and the way in which insurers do it.
Regulations such as Solvency II, individual state regulation in the US and new international
accountancy standards show that regulators are switching from deterministic formulae to principles-
based modelling and model assurance. As this transition reaches its conclusion, the demand for
computing power in insurers will grow. It also means that the rules governing reserve setting and
other projections are evolving.
At the same time, actuarial modelling itself is becoming much hungrier for raw computer processing
power. Policies are more sophisticated, for example when they include embedded guarantees.
Models are bigger and more complex. For example, a stochastic-on-stochastic 30-year projection
requires hundreds of millions of individual policy valuations.
Microsoft’s research suggests that a very large majority of IT people in insurance firms run models in
batches overnight or over the weekend. This is hardly the near-real-time analysis that insurers need.
These changes have led to a bottleneck. There is a mismatch between what insurers need from their
computer models and what those models are actually delivering.
The answer lies in distributed computing, where complex, time-consuming model runs are broken
down into small calculations which are parcelled out to many different computers. It’s divide and
conquer applied to computing. A job that takes 200 hours on a single processor could run in as little
as two hours on 100 processors.
This is where Microsoft comes in. Microsoft’s High-Performance Computing platform builds on
Windows HPC Server 2008 (HPC Server) – an enhanced version of the tried and tested Microsoft
server software which facilitates distributed computing.
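The divide-and-conquer pattern described above can be sketched in a few lines. The example below uses Python’s standard thread pool purely as a stand-in for a cluster’s compute nodes, and the per-policy valuation function is invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def value_policy(policy_id: int) -> float:
    """Stand-in for an expensive, independent per-policy valuation."""
    return sum((policy_id % 7 + k) ** 0.5 for k in range(1_000))

def value_portfolio(policy_ids, workers: int = 4) -> float:
    # Each valuation is independent of the others, so the work can be
    # parcelled out to workers -- on a real cluster, to separate nodes.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(value_policy, policy_ids))
```

Because the valuations are independent, adding workers scales the throughput: on 100 nodes the same pattern turns a 200-hour sequential run into roughly a two-hour one, scheduler overhead permitting.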
Installations run from 16 processors to support a small insurer or individual actuarial department, to
mammoth data centres with over 4,000 processors which might be used by the very largest insurers.
The Microsoft solution supports the whole range of requirements.
Actuarial software from providers such as Towers Perrin now includes support from the Microsoft
High-Performance Computing platform. This allows actuaries to use familiar, trusted software to
create their models. It also allows IT managers to set up processor farms to run them using familiar,
easy-to-manage Microsoft software.
High-Performance Computing background
The history of high-performance computing begins in the Second World War. Alan Turing and the
code breakers at Bletchley Park built electromechanical and electronic machines – culminating in the
Colossus proto-computer – to speed up the decryption of German ciphers. The process was simply
too time-consuming to do manually. The stakes were high and time was short.
Since then, spy agencies have continued to use supercomputers. Other applications have evolved:
weather forecasting, engineering, climate simulation, weapons research and so on. But access to the
most powerful machines required a top secret clearance or membership of an academic or
professional elite.
All that is changing. Microsoft’s mission of putting a computer on every desktop and in every home
has contributed to a dramatic fall in the price of computing power. Moore’s Law – the observation
that the number of transistors on a chip (and with it processing power) doubles roughly every two
years – has held true for over 40 years1. The first microprocessor had 2,300 transistors. Today’s
processors have over a billion.
The result has been a dramatic drop in price for high-performance computing. A Cray
supercomputer in 1991 delivered 10 gigaflops of processing power and cost around $40m. Today
you can buy a multiprocessor, multi-core 64-bit server running Windows that delivers the same
horsepower for around $4,000.
The story doesn’t end with a single computer. In principle, two computers should complete the same
job in half the time and one hundred should get it done even faster. In practice, parcelling up a piece
of software between different computers – known as parallel or distributed processing – is a
challenge. Further challenges lie in coordinating the processors, managing assignments and dealing
with hardware and software failures in order to maximise performance. This is the science of high-
performance computing.
Microsoft and High-Performance Computing
Microsoft has set out to democratise High-Performance Computing (HPC) and make it as widely
available as possible. A supercomputer on every desk, perhaps. In particular, Windows HPC
technology has been designed to:
- Leverage existing user skills, such as familiarity with Microsoft® Windows®, Windows-based applications and Microsoft® Excel®
- Build upon existing .NET developer skills through tight integration of HPC functionality within Microsoft® Visual Studio®
- Enable business user productivity through easy-to-use applications for cluster management and job scheduling
- Provide extensive command-line interfaces for power users and developers who need more flexibility and control
- Offer support for HPC industry-standard interfaces such as MPI (Message Passing Interface) for clustering
1 Moore’s Law: http://www.intel.com/technology/mooreslaw/index.htm
- Provide an easy-to-program job scheduler interface for integrating desktop applications directly to the cluster
- Offer integrated cluster setup, deployment and monitoring capabilities
- Build on Microsoft® Active Directory® to offer powerful, built-in cluster security management functionality
Combining these goals with Microsoft partners’ ability to deliver out-of-the-box business
functionality based on Windows HPC technology means that the insurance industry can take
advantage of high-performance computing in a way that was simply impractical or too expensive
before.
Introducing High-Performance Computing with Windows Server
Microsoft High-Performance Computing is delivered through Windows HPC Server 2008, which in
turn includes two components:
- Windows Server 2008 HPC Edition is Microsoft’s 64-bit operating system offered specifically for HPC use. It offers the power and integration of the Windows Server family at a value point designed for distributed, multiple-server computing.
- Microsoft HPC Pack 2008 is Microsoft’s integrated cluster suite, offering the necessary tools for deploying, maintaining and operating a compute cluster, including comprehensive deployment and management tools, a service-oriented job scheduler, a message passing interface (MPI) and support for high-speed networking fabrics.
In addition, we need to add further components to complete the full picture of the Microsoft HPC
offering:
- Microsoft Visual Studio
- Microsoft .NET Framework
- Microsoft SQL Server
[Diagram: Windows HPC Server 2008 combines Windows Server 2008 HPC Edition with Microsoft
HPC Pack 2008. The pack supplies the job scheduler, resource manager, cluster management,
message passing interface and high-speed networking; the combination delivers a secure, reliable,
tested, integrated out-of-the-box solution that supports high-performance hardware (x64, high-
speed interconnects), maximizes investment in Windows administration skills and tools, and makes
cluster operation as easy and secure as a single system.]
Microsoft Visual Studio is the most advanced and widely-used integrated development environment
(IDE) on the market, and has a wealth of features designed to ensure it is the IDE of choice for
developing and debugging distributed HPC software. The Microsoft .NET Framework should need no
introduction as Microsoft’s strategic managed code programming model for Windows and web
application development and deployment.
Windows HPC Server 2008 (HPCS) provides an integrated application platform for deploying, running
and managing high-performance computing applications. For customers who need to solve complex
computational problems, HPCS accelerates time-to-insight by providing an HPC platform that is easy
to deploy, operate and integrate with existing infrastructure. HPCS lets researchers and financial
analysts focus on their work with minimal IT administration. HPCS operates on
a cluster of servers that includes a single head node and one or more compute nodes (see above
diagram). The head node controls and mediates access to the cluster resources and is the single
point of management, deployment and job scheduling for the compute cluster. HPCS uses the
existing corporate Active Directory infrastructure for security, account management and overall
operations management using tools such as Microsoft System Center Operations Manager and
Configuration Manager.
Why do insurance companies need High-Performance Computing?
Initially, regulators and credit rating agencies drove the demand for high-performance
computing in the insurance sector. The extra processing power allows companies to run more
accurate and more frequent risk calculations.
Increasingly, however, companies want better risk management information. Fast, frequent,
detailed projections allow them to manage risks more effectively, use capital more efficiently, set
prices more accurately and target their portfolios better. HPC leads to robust, timely analysis to
support quarterly, monthly and even daily decision-making.
A third driver is the growing complexity of the models and products themselves. As insurance
products become more sophisticated, these models have become much more demanding. In
addition, companies want to run ever-more complex ‘what-if’ calculations including intensive
stochastic-on-stochastic projections. It is no longer acceptable to wait several days to get an answer.
As these and other risk management demands for integrated analysis become more important, we
will see a move from desktop modelling systems towards technology capable of running and
aggregating multiple model outputs, automatically feeding data in from multiple sources and
integrating with a wider architecture. The diagram below represents one possible risk-management
modelling infrastructure that would typically meet the requirements of most large-scale insurers.
Technology is the enabler that can bring together all the elements and provide the services and
computing power required. A complete infrastructure must also be able to access and connect with
other enterprise systems within the organisation. In recent years, insurers have begun to invest in
developing standard architectures for their organisations, allowing all components to talk to each
other in a standard way. As you can see from the above diagram, technology plays a major part in the
infrastructure to ensure that risk and actuarial modelling development and production runs can be
controlled properly. Insurers need to know that they are running the right version of the right model
with the right data and at the right time.
For companies listed in the USA, Sarbanes-Oxley has additional implications for the way in which
the models themselves are constructed and managed. Demonstrating effective controls doesn’t, in
itself, require lots of performance. However, HPC goes hand in hand with more centralised control
and management of models. This, in turn, helps companies with Sarbanes-Oxley compliance.
HPC allows forward-thinking insurance companies to progress from compliance to competitive
advantage and from desktop to distributed computing.
Benefits of High-Performance Computing
Remove performance bottlenecks
Individual computers struggle to complete calculations quickly enough. When companies, or even
individual actuaries, run a model on a desktop computer, it can take hours or even days to complete.
Stochastic models attempt to estimate the probability of different possible outcomes based on the
random variation of different inputs over time. For example, an insurer might want to calculate its
future liabilities as a result of variations in mortality rates, interest rates or stock market
performance. Nested or stochastic-on-stochastic modelling takes the concept further. A typical
application is to calculate a range of outputs for a given year and then use each one as the starting
point for the next year’s calculations. A 30-year projection might require hundreds of millions of
individual calculations. HPC is adept at this kind of calculation.
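A stochastic-on-stochastic run can be sketched as two nested Monte Carlo loops. The projection function and every parameter below are invented for illustration; the point is the multiplicative growth in work:

```python
import random

def project_liability(drift: float, years: int) -> float:
    """Toy single-path liability projection (illustrative only)."""
    value = 100.0
    for _ in range(years):
        value *= 1.0 + random.gauss(drift, 0.02)
    return value

def nested_projection(outer: int = 100, inner: int = 100,
                      years: int = 30, seed: int = 42) -> list:
    random.seed(seed)
    means = []
    for _ in range(outer):                  # outer 'real-world' scenarios
        drift = random.gauss(0.03, 0.01)    # each fixes economic conditions
        paths = [project_liability(drift, years) for _ in range(inner)]
        means.append(sum(paths) / inner)    # inner valuation per scenario
    return means

# Even this toy run is outer * inner * years = 300,000 path-years; realistic
# scenario and policy counts push the total into the hundreds of millions,
# which is why such runs are split across a cluster.
```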
It’s common practice to leave such calculations running overnight or over the weekend. While this
may be a good use of ‘dead time’, the jobs may not have finished when everyone gets back to work.
Worse, it is not possible to monitor or restart failed jobs in a well-managed way. Similarly, it is
difficult to queue and prioritise multiple jobs.
Windows Server 2008 HPC Edition allows companies to divide and conquer, splitting complex jobs
into smaller calculations and dividing them across multiple computers. The Microsoft HPC Pack 2008
schedules and allocates jobs efficiently and deals with failures and restarts.
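The failure-handling behaviour described here can be sketched as a retry-aware job runner. This is a drastic simplification of what the HPC Pack scheduler does, with an invented worker interface, but it shows the idea of resubmitting failed jobs automatically rather than discovering the failure on Monday morning:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_with_retries(jobs, worker, max_retries: int = 2, workers: int = 4):
    """Run independent jobs, resubmitting failures up to max_retries times."""
    results = {}
    attempts = {job: 0 for job in jobs}
    pending = list(jobs)
    while pending:
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(worker, job): job for job in pending}
            pending = []
            for fut in as_completed(futures):
                job = futures[fut]
                try:
                    results[job] = fut.result()
                except Exception:
                    attempts[job] += 1
                    if attempts[job] <= max_retries:
                        pending.append(job)   # resubmit the failed job
                    else:
                        results[job] = None   # give up; flag for review
    return results
```

A real scheduler layers priorities, queues and node health monitoring on top of this basic loop.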
Better decision-making
HPC allows insurance companies to run models faster. For example, a complex multi-variable
stochastic-on-stochastic projection might take a few days on a single PC but a few minutes or hours
on a cluster of computers (depending on how many computers were available).
This means that insurers can do two things. First, they can run more detailed models. For example,
instead of taking a sampling of policies, they can run calculations across their entire portfolio.
Second, they can run the models more frequently. Running more variations on the same model will
improve accuracy and running a model on demand with rapid results will improve responsiveness.
The ability to run sophisticated models is especially important where policies include embedded
guarantees, such as guaranteed minimum withdrawal or death benefits (regardless of the
performance of the underlying investments). This means that insurers need to understand the
impact of a range of possible outcomes on the profitability of their products. It also means that they
need accurate and fast information in order to set up hedging strategies to mitigate risk.
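To make the guarantee example concrete, the sketch below prices a guaranteed minimum maturity benefit by Monte Carlo under a toy geometric Brownian motion model. Every parameter is invented for illustration; real models are far richer (fees, policyholder behaviour, dynamic lapses):

```python
import math
import random

def guarantee_cost(s0: float = 100.0, guarantee: float = 100.0,
                   rate: float = 0.03, vol: float = 0.20,
                   years: int = 10, paths: int = 20_000,
                   seed: int = 1) -> float:
    """Discounted expected shortfall the insurer must fund at maturity."""
    random.seed(seed)
    drift = (rate - 0.5 * vol * vol) * years
    sdev = vol * math.sqrt(years)
    total = 0.0
    for _ in range(paths):
        # terminal fund value under one simulated market path
        s_t = s0 * math.exp(drift + sdev * random.gauss(0.0, 1.0))
        total += max(guarantee - s_t, 0.0)   # top-up owed to policyholder
    return math.exp(-rate * years) * total / paths
```

Repricing this across millions of policies every day, and then re-solving the hedge, is exactly the workload that gets farmed out to a cluster.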
Access to better-quality information closer to real time can only improve decision-making. For
example, companies can:
- Project and compare different segments of the market
- Increase return on capital by prioritising profitable parts of the insurance portfolio
- Select appropriate risk transfer options to improve the risk profile and minimise capital requirements
- Evaluate portfolios with a view to acquisition or disposal
- Improve risk management through the use of securitisation or other alternative risk finance vehicles
- Protect underwriting margins by varying rates or altering risk acceptance criteria
Better information and insight through HPC can help companies improve profitability, optimise their
underwriting and fine-tune their portfolios.
Avoiding disaster is the other side of this coin. More than 60 percent of insurance company
insolvencies are due to deficient loss reserves and rapid growth, according to A.M. Best. Better
information and more responsive analysis can help companies ensure that they maintain sufficient
reserves and manage their growth prudently.
Regulators and rating agencies
Insurance companies operate under a high level of scrutiny. Regulators, investors and rating
agencies look for accurate reporting and assurance about the quality of the modelling used to
generate it. Regulations and reporting standards such as Solvency II, laws in individual U.S. states or
International Financial Reporting Standards demand such visibility. Investors expect greater
transparency, control and governance, and, for US-listed companies, Sarbanes-Oxley is at the tip of
the spear.
Rating agencies exert similar pressures: they are, in effect, quasi-regulators, because credit
ratings impact share price, cost of capital and market perception. They are pushing insurers for more
information about their risk management activity, including modelling approaches and standards.
With several layers of oversight at industry, state, national and supranational levels, companies
must try to anticipate future requirements and comply with the superset of all the regulations that
apply to them. It’s a challenging environment. Sophisticated on-demand modelling can help
companies meet all these different requirements and is likely to become an obligation.
Companies that can prove that they have robust, timely modelling accrue a commercial benefit. In
particular, insurers may be able to reduce the amount of capital they need to hold. The cost of doing
the modelling is much less than the revenue this can generate.
Auditability and manageability
Regulators and rating agencies don’t just want the modelling done. They want to know how it is
done. For example, Pillar III of the Solvency II regulations requires that insurance companies are
transparent about their modelling process. Ad hoc individual spreadsheets and weakly-managed
models will not pass muster.
Instead of manual processes, companies need to use technology to manage their models.
Companies need an infrastructure, such as Microsoft’s, to manage, schedule and prioritise model
runs. At the same time, intranet software such as SharePoint can be used to manage files and access
to them.
This doesn’t mean the abolition of Excel. It is a ubiquitous analysis and reporting tool. Excel 2007
includes a multi-threaded calculation engine which can take advantage of multiple processors on a
desktop computer.
At the same time, Excel Services and Excel Web Access allow companies to create web-accessible
spreadsheets using Microsoft Office SharePoint technology that allows Excel data, such as model
outputs, to be used in a consistent, well-managed way across the whole business.
The result is that companies can manage model runs on a high-performance cluster and distribute
the results in a controlled way to end-users who can use Excel, a familiar tool, to analyse and present
the results.
Other applications
While modelling using high-performance computing techniques will become ubiquitous, the
technology has other applications in the insurance industry:
Implementation of hedging strategies. Where companies sell policies with built-in options (for
example an annuity with a guaranteed minimum return but upward-only bonus that depends on the
stock market), the insurer will want to implement a stop-loss hedging strategy against the potential
liability. Companies can use HPC to calculate these strategies across an entire portfolio on a day-to-
day basis.
Asset liability matching. The treasury department in a bank typically answers the question “Do I
have enough money to pay out what I need to pay out tomorrow?” but in insurance, it involves
actuarial calculations as well as financial ones. Matching assets to liabilities in this way is another use
for HPC.
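A minimal sketch of such a matching check, with invented cashflows and a flat discount rate (real ALM uses full yield curves and stochastic liability cashflows produced by the actuarial models):

```python
def funding_gaps(asset_cfs, liability_cfs, rate: float = 0.03):
    """Per-period net cashflow and its present value (toy ALM check)."""
    gaps = [a - l for a, l in zip(asset_cfs, liability_cfs)]
    pv = sum(g / (1.0 + rate) ** (t + 1) for t, g in enumerate(gaps))
    return gaps, pv

# Example: can projected asset income cover projected claims each year?
gaps, pv = funding_gaps([50.0, 50.0, 120.0], [40.0, 60.0, 100.0])
```

The HPC element comes from recomputing the liability cashflows themselves under thousands of economic scenarios before a check like this is even possible.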
Catastrophe modelling. Insurers covering perils such as storms, floods and droughts are having to
build ever more complex models to understand their exposure and to identify any concentration of
risk. For example, Lloyd’s of London has recently announced that it will model its environmental risk
exposure on the basis of two catastrophes per year with a total loss of $50bn or more. Swiss Re, the
world’s largest reinsurer, is also creating global catastrophe models to help it to
price risks. HPC is needed to accommodate the wide range of risk scenarios and numbers of
individual risks, and to enable insurers to respond rapidly as events such as hurricanes impact their
portfolios.
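A toy compound-Poisson catastrophe model illustrates the shape of these calculations. The frequency and severity distributions here are invented placeholders; real cat models are built from event catalogues and exposure data:

```python
import math
import random

def poisson(lam: float) -> int:
    """Knuth's method for sampling an event count."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_annual_losses(trials: int = 10_000, freq: float = 2.0,
                           sev_mean: float = 10.0, seed: int = 7):
    """Return (mean annual loss, 1-in-100-year loss) for a toy portfolio."""
    random.seed(seed)
    totals = []
    for _ in range(trials):
        events = poisson(freq)               # catastrophe events this year
        totals.append(sum(random.expovariate(1.0 / sev_mean)
                          for _ in range(events)))   # severity per event
    totals.sort()
    return sum(totals) / trials, totals[int(0.99 * trials)]
```

The tail estimate is what drives the trial count: resolving a 1-in-100 or 1-in-250-year loss across millions of individual risks is a natural HPC workload.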
Actuarial software and HPC
Introduction
One of the main aims of the Windows HPC platform is to make high-performance computing
available to business users and technologists alike and, in so doing, help to increase the
productivity of all users. Users work on a Windows desktop, using familiar Windows applications,
such as Excel and familiar third-party actuarial software, scaling seamlessly across a Windows
cluster. All this can be done using a single set of security credentials. Users require no additional
skills or training to benefit from the power of distributed computing.
From a technologist’s viewpoint, the Windows HPC platform offers integrated access to
high-performance computing using the Microsoft Visual Studio IDE, tightly integrated security and
management features as mentioned above, powerful command-line scripting and conformance to
parallel computing industry standards such as MPI through the Microsoft Message Passing Interface
(MS-MPI), together with a parallel debugger.
Towers Perrin
Towers Perrin produces MoSes actuarial software. It lets insurers model their policies, liabilities and
assets as well as external variables such as interest rates and mortality rates. It is used by around
400 companies worldwide, and many users run it on a single desktop computer. As models increase
in complexity, a single run can take a couple of days to complete.
Towers Perrin looks to the Windows HPC platform to remove this performance bottleneck. It is
possible to run the same model on ten or one hundred nodes with a matching increase in
performance. A model that used to take days to complete now runs in a matter of hours.
Another factor is the ability to schedule and manage the models. Microsoft’s software takes care of
this so that users have better control of when models run and better understanding of when they
will be finished. The software also alerts users if something goes wrong. This is a significant
improvement compared with turning up on Monday morning to find that the model failed on
Saturday night and needs to be re-run.
The HPC-compatible version of MoSes was launched in 2008, followed closely by Towers Perrin’s
new RiskAgility modelling engine, which also uses Windows HPC as standard.
Implementing High Performance Computing
Example #1 – Windows HPC Server 2008 applied to Excel 2007
Whilst we have seen some of the business cases for using HPC in insurance in the preceding sections,
most of these cases share a common theme: they will be front-ended (or at least prototyped) with
the “lingua franca” of financial markets, Microsoft Office Excel. The tight integration that is possible
between Windows and Microsoft Office Excel 2007 has the potential to “democratize” access to high
between Windows and Microsoft Office Excel 2007 has the potential to “democratize” access to high
performance computing. Through spreadsheet access, high performance computing can be
delivered directly to the trading, portfolio and risk managers who need it.
Microsoft Office Excel has become a mission critical application for most insurers, where business
users find that the spreadsheet is the only environment that can quickly bring together position
data, complex instrument data, market data and pricing models, and at the same time support the
most complex analysis under a wide variety of market scenarios. Whilst Excel has been an
overwhelming success in insurance, these same business users have been pushing Excel to the limit
and demanding enterprise performance and reliability. In summary, this increasingly “mission
critical” usage of Excel requires:
• Guaranteed execution of mission-critical calculations.
• Improved performance by completing parallel iterations on models or long-running
calculations.
• Improved productivity of employees and resources by moving complex calculations off
desktops.
• Transparent access to centralized spreading reports and analysis.
Excel add-ins are the primary way in which models are deployed and used in insurance. Complex
spreadsheets used to price derivatives, simulate market movements or optimize portfolio
composition are compute-intensive and, as we have seen, it is invariably desirable to speed such
calculations up greatly. Given the addition of the new multi-threaded calculation engine of Excel
2007 (see Appendix 2 for more detail), users will notice dramatic calculation speed improvements as
Excel can now distribute its cell calculation load in parallel across the available cores and processors
on the machine it is running on. In addition, the greatly improved memory-management features of
Excel 2007 will ensure that less time is taken accessing physical disk storage for data intensive
calculation runs.
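The principle can be illustrated with a short sketch (Python here purely for illustration; Excel’s actual engine is native code, and `cell_formula` below is a made-up stand-in for a compute-heavy cell calculation):

```python
from concurrent.futures import ThreadPoolExecutor

def cell_formula(x):
    # Stand-in for one independent, compute-heavy cell calculation
    return sum((x * i) % 97 for i in range(1, 10_000))

def recalculate(cell_inputs, workers=4):
    # Independent cells can be evaluated concurrently, one task per
    # worker, mirroring how the engine spreads the calculation load
    # across the available cores on the machine
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(cell_formula, cell_inputs))
```

The key precondition is the one Excel’s engine also relies on: the cells being recalculated must be independent of one another, so the dependency graph determines how much parallelism is actually available.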
There are many cases, however, where the parallelization of the calculation load on one machine will
not be sufficient. In these scenarios, Excel 2007 can be combined with Windows HPC Server in order
to deliver calculation scalability and fault tolerance across many machines containing multiple
processors and processor cores. In order to achieve this, user-defined functions can be created and
installed on 64-bit servers running HPCS. Using the new multithreaded calculation engine of Excel
2007, simultaneous calls can be made to the cluster servers to perform remote calculations. Since
calculations may be performed in parallel by multiple servers, many complex spreadsheet
calculations can be performed far quicker than before whilst, at the same time, the load on the local
client machine can be significantly reduced. Hence traders and risk managers can achieve faster time
to market with new ideas, and at the same time not bring their local desktop to a standstill as a big
spreadsheet is calculated.
With this solution, organizations are able to move formulas out of the Excel workbooks and store
them on a set of servers. Additionally, since the processing is moved off the desktop, organizations
have the option to lock down access to the formulas used in the calculations and provide users with
visibility of the results only. This scenario requires creating a local, client-side User-Defined Function
(UDF) as an XLL that will schedule jobs on the head node via the web services API. Additionally, a
server version of the UDF (or its functional equivalent) will need to be created, which will live on the cluster servers.
The “server” UDF will perform the calculations and return the results back to Excel.
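That client/server split can be sketched as follows (illustrative Python, not the real XLL interface; `submit_to_cluster` stands in for the job-scheduling call to the head node’s web services API, and all names are hypothetical):

```python
# "Server" UDF: the actual calculation, deployed on the cluster nodes.
def server_udf_discount(notional, rate, years):
    return notional / ((1.0 + rate) ** years)

# "Client" UDF: a thin proxy that a workbook would call in place of the
# real formula. submit_to_cluster stands in for scheduling a job on the
# head node via the web services API and collecting the result.
def client_udf_discount(notional, rate, years, submit_to_cluster=None):
    if submit_to_cluster is None:
        # Local fallback so the proxy can be exercised without a cluster
        submit_to_cluster = lambda fn, *args: fn(*args)
    return submit_to_cluster(server_udf_discount, notional, rate, years)
```

The workbook only ever references the client-side proxy; the formula itself lives on the servers, which is how access to the underlying calculation logic can be locked down while users still see results.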
Example #1(a) – Excel-HPCS Adapter for Legacy XLLs
As mentioned in Example #1 above, the integration of Windows HPC Server 2008 (HPCS) with Excel
2007 requires some coding work to be done to build new “server-side” UDF functions and to wrap
existing legacy add-ins (often developed as XLLs) so that they can be accessed in HPCS from Excel. As
a result of its desire to make the application of the Windows HPC Platform open and productive
for all users, Microsoft has developed a tool called the Excel-HPCS Adapter. The Excel-HPCS Adapter
allows a developer to automatically create a cluster-enabled add-in from an existing XLL and
provides at runtime a system for the distributed execution of XLL function calls on the cluster,
typically resulting in accelerated performance for supported scenarios.
The primary benefits of the Excel-HPCS Adapter include:
• No custom development required to adapt existing XLL libraries for compute cluster
deployment, provided the XLL UDFs meet the deployment criteria.
• The source code for the original XLL is not required, as the automated proxy generation
process analyzes the public interface of the XLL binary component itself to create a
compatible compute cluster proxy component.
• Multi-user support, where many different users can each run their own spreadsheets and call
the same server-side UDFs on the compute cluster server.
• Multi-XLL support, where each spreadsheet calls UDFs in multiple XLL libraries, each of which
can be distributed to the server. Mixed deployment (i.e. spreadsheets with both local and
remote XLLs) is also supported.
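The idea behind the automated proxy generation can be sketched in miniature (Python for illustration only; the real adapter inspects the exported function table of the XLL binary, whereas this sketch inspects a module’s public callables):

```python
import types

def make_cluster_proxy(library, dispatch):
    # Wrap every public callable of an existing library in a forwarding
    # stub, without needing the library's source code. dispatch stands
    # in for remote execution of the call on the compute cluster.
    proxy = types.SimpleNamespace()
    for name in dir(library):
        member = getattr(library, name)
        if callable(member) and not name.startswith("_"):
            setattr(proxy, name,
                    lambda *args, _fn=member, **kw: dispatch(_fn, *args, **kw))
    return proxy
```

With `dispatch` bound to a local call, the proxy behaves exactly like the original library; substituting a job-submitting dispatcher redirects every call to the cluster with no change to the calling spreadsheet.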
The diagram above illustrates the Excel-HPCS Adapter architecture in more detail. It should also be
pointed out that certain deployment criteria must be met by the legacy XLL before this tool can be
applied to it. Those readers interested in reading more about this productivity initiative for Excel
add-ins should follow the link below:
Example #2 – HPCS and Legacy VBA Code
When designing Excel Services, Microsoft took the decision that “unmanaged” (non-.NET) code was
not appropriate for secure, server-based spreadsheet access. Whilst the technical reasons for this
approach are understandable and valid, it does, however, leave a productivity gap for those users
who depend on functionality contained within “unmanaged” Visual Basic® for Applications
(VBA) code. Whilst the Windows HPC platform allows for many configurations of high performance
clustering behind Excel 2007, it is possible and practical to combine legacy versions of Excel and
Excel spreadsheets with HPCS. In this scenario, the legacy version of Excel would be installed on
each compute node of the HPCS cluster.
An application or command-line script would then marshal the jobs (calculations within spreadsheet
workbooks) to be undertaken by Excel within the cluster and specify where the results are to be
placed. Obviously, this approach is only appropriate where the desired calculation can be
parallelized. However, this is often the case where VBA code has been used to both control and
implement Monte-Carlo simulations for risk management and pricing purposes.
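The marshal-and-aggregate pattern can be sketched like this (illustrative Python; threads stand in for the cluster’s compute nodes, and the payoff formula is a made-up example of an embarrassingly parallel simulation):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_batch(seed, paths):
    # One job: an independent batch of Monte Carlo scenarios, analogous
    # to one workbook calculation dispatched to one compute node
    rng = random.Random(seed)
    total = sum(max(rng.gauss(100.0, 15.0) - 100.0, 0.0) for _ in range(paths))
    return total, paths

def monte_carlo(total_paths, batches, workers=4):
    # Marshal the batches out to the workers, then aggregate the
    # partial sums into a single estimate
    per_batch = total_paths // batches
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(run_batch, range(batches), [per_batch] * batches)
    paid, count = (sum(col) for col in zip(*parts))
    return paid / count
```

Because each batch uses its own seeded random stream and touches no shared state, the batches can be scheduled on any node in any order, which is exactly what makes this class of VBA-driven simulation a good fit for the cluster.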
Example #3 – A Windows HPCS “Call to Action” for Systems Developers
Given the levels of Excel usage within insurance, it is perhaps unsurprising that some of the
preceding examples have focused on the use of Windows HPC Server 2008 with Excel. This also
reflects Microsoft’s belief that Excel is one of the main ways in which business users can begin to
harness and benefit from the computational power of HPC.
However, Excel is only one way of accessing HPC, and in this regard Microsoft has implemented very
tight integration of HPC with the Microsoft Visual Studio 2005 IDE. The .NET Framework and features
such as parallel debugging of distributed calculations can help systems developers and independent
software vendors to deliver HPC-enabled applications faster than ever before. As we have seen, the
market need is there for HPC-enabled applications and Microsoft is making this transition to
distributed architecture computing as simple and straightforward as possible.
Summary
The insurance industry relies on complex actuarial models to calculate risk, price products and make
the best use of capital reserves. Near-real time model runs and more detailed models produce
better business outcomes. At the same time, regulators, ratings agencies and financial reporting
standards are pushing them towards ever-more sophisticated and well-managed modelling.
The price/performance ratio of PCs and servers allows clusters of them to act as virtual
supercomputers. Insurers can use this kind of High Performance Computing (HPC) to run models
faster and in more detail, leading to better, more timely results.
Windows HPC Server 2008 makes it easier for companies such as Towers Perrin to take advantage
of this new-found, relatively low-cost computing power.
Appendix 1 – more information
Product information
www.microsoft.com/hpc
www.towersperrin.com/riskagility
Improving Performance in Microsoft Office Excel 2007
http://msdn2.microsoft.com/en-us/library/aa730921.aspx
Microsoft Office Excel 2007
Microsoft Office Excel Services Technical Overview:
http://office.microsoft.com/search/redir.aspx?AssetID=XT102058301033&CTT=5&Origin=H
A102058281033
http://office.microsoft.com/excel
Blogs
http://blogs.msdn.com/hpc/
http://blogs.msdn.com/excel
http://blogs.msdn.com/cumgranosalis
Appendix 2 – improvements in Microsoft Office Excel 2007
Microsoft has made significant investments towards the improvement of Excel in the 2007 release.
With Excel 2007, users will experience a redesigned interface that makes it easier and faster to
create spreadsheets that match the growing needs of the market. Here are some of the key features
of Excel 2007 that can greatly improve the data analysis capabilities of spreadsheet users and add-in
developers:
• Expanded capacity for spreadsheets. Previous releases of Excel were limited to 256 columns
by 65,536 rows. Users can now create spreadsheets with columns going to XFD (that’s
16,384 columns!) and include up to 1,048,576 rows.
• Maximised memory usage. Previous releases of Excel could only take advantage of 1GB of
memory. That limit has been removed, and Excel can now take advantage of the maximum
amount of memory addressable by the installed 32-bit version of Windows.
• A multi-threaded calculation engine. By default, Excel will maximise the utilisation of all
CPUs on a machine. For customers running multi-core, multi-CPU, or hyper-threaded
processors, linear improvements can be seen in performance, assuming parallelisation of
calculations and the use of components and environments that are multi-thread capable.
• Continued compatibility with existing Excel XLL add-ins, with the addition of an extended
Excel XLL C API to support the functionality outlined above, plus new features such as
support for strings up to 32,767 characters in length.
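As an aside, the new column limit can be checked with a little arithmetic: Excel letters its columns in a bijective base-26 scheme (A=1 … Z=26, AA=27, …), under which column 16,384 comes out as XFD. A quick sketch:

```python
def column_letters(index):
    # Convert a 1-based column index to Excel's letter scheme
    # (bijective base 26: A=1 ... Z=26, AA=27, ...)
    letters = ""
    while index > 0:
        index, rem = divmod(index - 1, 26)
        letters = chr(ord("A") + rem) + letters
    return letters

print(column_letters(16384))  # XFD, the last column in Excel 2007
```

The same routine gives IV for column 256, the last column in the earlier releases mentioned above.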
For more information on Excel 2007, visit: http://office.microsoft.com/excel; for more detail on
developing add-ins for Excel 2007 using the updated Excel XLL C API, see:
http://msdn2.microsoft.com/en-us/library/aa730920.aspx