
Life Insurance Systems - Testing Values and the Triangulation Approach

Spotlight Issue 4 - 2012

Back in the late 1990s, I worked for a large, multi-national life insurance company that had four separate administration systems and had illustration software on three different platforms. Like most companies at the time, there was a desire to somehow consolidate all business on a single administrative system and illustration/inforce platform.

I was a managing actuary in the company's actuarial department at the time, reporting to the chief actuary. There were two other managing actuaries, and we were in the process of assimilating a statutory entity into our existing book of business. We thought about how best to arrange the department into three sub-departments, one for each of the managing actuaries to manage. Of course, the first two sub-departments would be product development/pricing and financial reporting/reinsurance, but what about the third?

We thought hard about this issue for some time, and slowly but surely, the idea of having a systems-oriented branch of the actuarial department began to make a lot of sense, especially given the numerous administrative and illustration/inforce platforms in-house. Also, for as long as I had been in the profession (since 1982), there had always been some big systems initiative going on as companies implemented new products, upgraded systems or converted to a new platform altogether. Y2K was a few years away and everyone was freaking out about that. Illustration systems were also beginning to increase in complexity as companies started to demand more functionality.

When I discussed this idea with the chief actuary and COO, they both liked the idea and the COO came up with the department name, Actuarial Systems.

Systems Testing

Along with the birth of the new branch of the actuarial department came a new approach to systems testing that affected the entire company. Rather than rely on the existing Model Office approach, where department specialists were shuttled off to serve time in Model Office (now referred to in most companies as the QA department), the idea was to build a sub-department within each department impacted by systems (new business, client services, commissions, accounting, actuarial, agency, underwriting, etc.). Each sub-department was tasked with proactively ensuring the success of future systems efforts. The proactive part meant that each sub-department was responsible for drafting its own test conditions, verifying the accuracy of its area's test results and coordinating with QA on testing and cycle run expectations. QA became a smaller hub than before, responsible for managing and coordinating each systems project and running test cycles.

This idea worked quite well. Meetings between the QA project lead and the sub-department representatives not only gave everyone present the chance to discuss their agendas and needs, but also allowed for optimal coordination, since each department knew what was going on in the other departments. This prevented, for example, changes from being put into the system that helped one department but inadvertently broke code for another.

Triangulation

As the head of the newly formed Actuarial Systems area, I was really excited about the opportunity and really liked what I saw going on in the company at the time as we also began the adoption of the new systems testing approach. Up to that point, the actuarial area had used simple spreadsheets (originally in Lotus and later converted over to Excel) to test illustration software and administrative system values. However, we faced a little bit of a dilemma as the complexity of products, calculations, test conditions and regulatory issues continued to ratchet up. Most of these spreadsheets tested isolated portions of functionality, so there ended up being a lot of separate spreadsheets for each product.

Furthermore, there was a desire within the upper echelons of the company to ascertain that at-issue illustrated values and at-issue administrative values from the new business system matched. This idea was also extended to having our inforce ledger systems match the values produced by the administrative system for inforce cases.

This led us to develop more thorough and sophisticated testing tools in Excel, far beyond the size and complexity of the prior tools that were in play when the testing effort was more unstructured. Our goal was to have an independent set of tools for verifying the accuracy of the at-issue or inforce illustration against the at-issue or inforce case on the administrative system. One of the members of the actuarial department labeled this approach a "triangulation" of systems. This basically meant that the optimal outcome would be for all three items (administrative system, illustration and Excel tool) to match one another.

Using Excel as our testing tool was a no-brainer at the time. Excel is part of the MS Office suite of products and had many advantages that other similar software did not offer.

Other software packages were not as easy to program, maintain or debug. With Excel keeping all of its formulas in a cell-based format, it was easy to spot where bugs lay when comparing against illustration and/or administrative output values, as the information services department had developed structured system output for the various items being tested. Of course, there were a good number of times when the Excel program itself had to be tweaked or corrected. The ultimate goal was for the systems and the testing tool to match.

One of the benefits of using Excel was that it was supported by the information services department, given that it was part of the MS Office suite of packages and readily understood by our mainframe programmers. At a prior company, I had learned a little APL (a tool used mainly by actuaries, whose acronym stands for "A Programming Language"), but it had significant pitfalls that made it untenable for use within the company. The programmers did not know how to read it (and did not care to learn), and an APL program is very difficult to understand for someone inheriting another person's work, since there are countless ways to program the same functionality. This could happen if your APL programmer left the company and his or her programs were inherited by a remaining member of the staff. Also, APL was symbol driven, and there were (at the time) a few hundred symbols one had to learn in order to master it. Nonetheless, APL is a very powerful tool and, as such, is still in use in some actuarial departments.

Triangulation - Vendors

We also tried the triangulation approach with software vendors, with great success. This came in handy when we were switching illustration providers: we had our existing illustration platform, our Excel spreadsheets for existing products and the new vendor's illustration platform.

Under the old paradigm, where the only items being compared are the old and new illustration platforms, it is easy for each party, when value discrepancies arise, to say, "We don't match and we think our values are correct, so fix your software." This can lead to finger pointing and numerous volleys back and forth between the parties, can make a good relationship go south in a hurry, and wastes precious time.

Having a third testing tool in the mix alleviates a lot of this headache. Generally, when two of the three systems match, we look at the third system and ask where the divergence originated. This does not mean that the two systems that match are correct, mind you - they may both be wrong with the same bug. That, too, is easy to ascertain in many instances.
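To make the rule concrete, here is a minimal sketch of the diagnostic logic (written in Python rather than a spreadsheet, with made-up values and a hypothetical half-cent tolerance, purely for illustration):

# Minimal sketch of the triangulation rule. The three sources, the sample
# values and the tolerance are all hypothetical; in practice the inputs would
# be the same illustrated value (e.g., an end-of-year account value) pulled
# from the administrative system, the illustration and the Excel tool.

TOLERANCE = 0.005  # treat values within half a cent as a match

def diagnose(admin, illustration, excel_tool):
    """Report which leg of the triangle, if any, diverges from the other two."""
    legs = {"admin system": admin, "illustration": illustration, "Excel tool": excel_tool}
    names = list(legs)
    matches = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
               if abs(legs[a] - legs[b]) <= TOLERANCE]
    if len(matches) == 3:
        return "all three legs match"
    if len(matches) == 1:
        a, b = matches[0]
        outlier = (set(names) - {a, b}).pop()
        return (f"{a} and {b} match; look at {outlier} first "
                "(keeping in mind the matching pair could share the same bug)")
    return "no clean two-way match; investigate all three legs"

print(diagnose(10432.17, 10432.17, 10431.02))
# -> "admin system and illustration match; look at Excel tool first ..."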

However, as mentioned earlier, having the Excel tool rapidly accelerates the debugging process, since one can easily pinpoint where the divergence is occurring, given Excel's cell-based layout, and then determine how to proceed. Excel also allows the user to change formulas in cells, columns or rows in order to experiment and then readily see the impact of those changes.

This contrasts with the illustration and administrative platforms, whose output is just numbers on printer paper or a screen. How can one (especially an actuary) determine which set of values is correct when they are just values on a screen and the formulas are buried in the code? Actuaries are further tasked with living up to the actuarial motto: "The work of science is to substitute facts for appearances and demonstrations for impressions."

This is one of the main advantages of Excel: values and their corresponding formulas exist in the same location, within the cell. The aforementioned flexibility of Excel - being able to change cell formulas and spot the impact of the change immediately - is another important quality.

Excel also comes into play in developing demonstrations to show one or both parties how the answer from one (or none) of the other two systems may not be entirely correct. These demos can be sent to the other party with (or without) the formulas in the cells. Once again, another major advantage of Excel comes to the fore: just about everyone in the actuarial, accounting, programming or other finance-related fields knows how to use and read Excel at some basic level.

Of course, Excel has its limitations, mainly with respect to capacity and computational speed, but I've found that I can still build fairly large Excel spreadsheets without running out of memory. Also, even if a spreadsheet takes a few minutes to run (as can be the case when there are numerous 7702/7702A recalculations for a particular test case), it can save hours of back and forth between you and the vendor in trying to determine who is right.

A large percentage of life and annuity companies use a triangulation approach of sorts. Most companies use Excel or in some cases APL as the third leg of their triangle. The triangulation approach existed before we adopted it back in the late 1990s, so please don't think I am taking credit for its existence!

Triangulation - At the Vendor's Level

The triangulation approach can also be used successfully at the vendor level. While the engine developers are programming the software calculation engine, the vendor's actuarial staff busy themselves with developing an Excel tool for products that are high in complexity and/or have numerous bells and whistles. Sometimes the Excel tool is provided by the client company, which benefits both parties tremendously, as any mistakes in the Excel tool uncovered by the vendor can be communicated back to the client.

Most clients enjoy this aspect as the extra pairs of eyes may find bugs that were unknown up to that point, allowing the client to debug and further enhance the accuracy of the software.

When an Excel tool is not provided by the client (which is the client's choice), we develop our own testing tool, using one of our templates, for the more complex and demanding products. Each company has its own idiosyncrasies in how it develops products and how much thought it gives to system implementation.

Most small and medium-sized companies, for example, are short on resources and available capital, so their focus is generally on developing a product that fits as well as possible with their existing administrative system. This is to avoid the significant expense that can be associated with modifying the administrative, statutory valuation and/or illustration systems to handle a certain feature.

Large companies, while still budget conscious, generally have more available capital, more demands from their top producers and fewer constraints on developing state-of-the-art products that require moderate to extensive system modification to support. As a result, their products tend to have more bells and whistles, which help the sales force promote them over products from companies that lack such features. They also tend to have significantly more granular computations that smaller and medium-sized companies either don't have the resources to build in or don't wish to commit the necessary capital to.

The Excel tool (provided by the client or developed in-house) comes in handy in numerous situations, such as the verification of 7702/7702A values (which includes recalculations, MEC avoidance under the Necessary Premium Test, the recapture ceiling, etc.), accumulation values across all ledgers, premium solves, verification of maximum available loan and/or withdrawal values, cost basis, gain, net outlay, dividend allocation on par products, withdrawals switching over to loans at basis, preferred loans, indexed product bucket allocation, targeting shadow account values at specific durations to ensure that the policy stays inforce, and so on.

Generally, the Excel tool comes into play when there is a discrepancy between the other two legs of the triangle. This is almost always for cases at the high end of complexity (for example, a case where, due to face increases, the guidelines and 7-pays are recalculated several times, and a divergence in one of the later recalculations leads to divergences in all the remaining 7702/7702A recalculations).
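As a hedged illustration of why an early error cascades, the sketch below (again in Python, with invented numbers standing in for recalculated guideline or 7-pay limits) simply finds the first recalculation at which the two systems part ways, since every later comparison builds on that point:

# Hypothetical sketch: locate the first divergent recalculation between two
# systems. The limits below are invented; in practice they would be the
# guideline or 7-pay premiums produced by the illustration engine and by the
# Excel testing tool at each material change (e.g., each face increase).

TOLERANCE = 0.01  # one cent

def first_divergence(engine_values, tool_values):
    """Return the index of the first recalculation where the two systems disagree."""
    for i, (a, b) in enumerate(zip(engine_values, tool_values)):
        if abs(a - b) > TOLERANCE:
            return i
    return None

engine_limits = [5210.55, 6104.32, 6980.10, 7415.77]
tool_limits = [5210.55, 6104.32, 6991.48, 7427.15]

idx = first_divergence(engine_limits, tool_limits)
print("all recalculations match" if idx is None else f"first divergence at recalculation #{idx}")
# -> "first divergence at recalculation #2"; the later mismatch at #3 is
#    likely just fallout from the same error.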

It can also work the other way. As the engine developers make use of the spreadsheet in cases where a discrepancy arises, they may spot a bug in the spreadsheet itself; they can then notify the actuarial area, and one of the actuaries will handle the debugging.

The use of the spreadsheet to resolve discrepancies has reduced the time taken to analyze and debug the illustration software and/or vendor testing tool. Think of the back and forth that can occur when there are only two legs on the triangle. It is like a tennis match where the ball can continue to go back and forth across the net. With the use of the spreadsheet, we can stop the volleying and get to the heart of the matter quickly (like a line judge walking out on the court after a couple of volleys and grabbing the tennis ball as it crosses the net!).

The goal is not to use Excel to build another illustration platform. Face and income solves, for example, are generally not included. Nor are other functional items such as cost indices, which are not generally complicated to calculate.

Not that these items couldn't be built in, but the goal of the spreadsheet is to determine whether one or both of the other legs of the triangle are correct and to provide background support for the correction that needs to take place. If the spreadsheet itself is not correct for a specific test case, it is almost always debugged as well.

The Future

The proverb "May you live in interesting times" really rings true for most of us, especially on the technology side. At one of my first jobs, back in 1984, one of the actuaries demonstrated a personal computer for the department. We sat and watched the actuary load the spreadsheet program onto the PC (this was before hard drives!) and were amazed by some of the functionality we witnessed. It took a little while for the PC to catch on, but before long we noticed more and more PCs in the department. Soon everyone had one.

At home, we had one PC for many years. Now, everyone in my family has at least one, not to mention all the other PC-like gadgets (the iPad, for example) that are proliferating. Sitting at an airport gate can make my head turn as I see the numerous gadgets being used here and there.

Much like the PC itself, MS Office components like Excel have come into widespread use. In addition, we have witnessed advances not only in computer software but also in the amount of capacity that computers can handle. I recall working back in 1983 on a DEC minicomputer with 64K of memory and thinking it was pretty neat at the time.

I've witnessed the same kind of tremendous growth in insurance software at the illustration and administrative level. Actuaries and programmers are making great use of the ever-increasing memory and computing capacity, which has led to fairly complex products being developed and administered.

As life insurance and annuity products continue to ratchet up in complexity while competitors try to one-up each other, the need to test these products and ascertain that they pass muster with respect to output accuracy, user-friendliness and computational speed will grow right along with them.

Triangulation is one method used to accelerate the process of resolving discrepancies in values across multiple systems. Time always seems to be of the essence nowadays, with speed to market and shorter product shelf lives being the main drivers for resolving such issues quickly and accurately.

 

William "Bill" Aquayo
SVP, Actuarial Research
FSA, MAAA, CFA, ChFC, CLU
Insurance Technologies

 
