On that note, I’ve been thinking a lot about my own experiences with technology-driven change in the reinsurance industry, where I worked as an analyst from 2017 until recently.
During those three short years, I observed a radical shift in data analysis methodology. Excel-based models, which had seemed top-of-the-line, were suddenly too slow and too rigid; integration with third-party data sources, once a luxury, became the norm; and analysts began using scripts to accomplish many labor-intensive tasks that had typically been performed by hand or in spreadsheets.
I added the emphasis. I assume she’s referring to models implemented in Excel that had been considered top-of-the-line, but even in 2007 I can’t think of any serious numerical computing in Excel that would have been seen as top-of-the-line. It was always second best, at best.
Lots of things to engage with in the piece, though…
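The “scripts replacing spreadsheets” point is easy to picture. Here is a minimal, hypothetical sketch of the kind of script that displaces a pricing spreadsheet — the column names, figures, and `loss_ratios` helper are all invented for illustration, not taken from the piece:

```python
# Hypothetical example: aggregate loss ratios by line of business
# from bordereau-style records, the sort of summary an analyst
# might otherwise compute by hand in a spreadsheet.
import csv
import io
from collections import defaultdict

# Invented sample data standing in for an exported CSV.
SAMPLE = """line_of_business,premium,incurred_losses
Property,1000000,650000
Property,500000,200000
Casualty,750000,900000
"""

def loss_ratios(rows):
    """Sum premium and incurred losses per line, then divide."""
    premium = defaultdict(float)
    losses = defaultdict(float)
    for row in rows:
        lob = row["line_of_business"]
        premium[lob] += float(row["premium"])
        losses[lob] += float(row["incurred_losses"])
    return {lob: losses[lob] / premium[lob] for lob in premium}

ratios = loss_ratios(csv.DictReader(io.StringIO(SAMPLE)))
print(ratios)
```

Once the logic lives in code rather than in cell formulas, it can be versioned, tested, and rerun against next quarter’s data without retyping anything — which is presumably the shift the author observed.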
Think COBOL is dead? About 95 percent of ATM swipes use COBOL code, Reuters reported in April, and the 58-year-old language even powers 80 percent of in-person transactions. In fact, Reuters calculates that there are still 220 billion lines of COBOL code in production today, and that COBOL systems handle $3 trillion in commerce every day. Back in 2014, the prevalence of COBOL drew some concern from the trade newspaper American Banker.
I created an Excel spreadsheet for group health benefits underwriters (something to do with pricing) about 20 years ago. Last I heard, it’s still in heavy use.
Also, at least online, I don’t understand why some people are moving to Python. We’ve been using PHP for ages, and it’s not getting older, it’s getting better. The latest version brought execution speed up to be comparable to compiled programs. Hopefully I’ll be retired before PHP is.