IRAnalytics


Revolutionizing Analytics

Large financial companies have a problem: determining which of their customers are good customers and which are bad. For a long time there has been a solution to this problem (albeit unrefined in its current state). It's called predictive modeling.

Predictive modeling is a process in which an analyst studies historical data that the company has collected about its customers (data like balance transactions, salary, address, and credit history). Because the data is historical, the analyst already knows which of these customers turned out to be "good" and which turned out to be "bad"; they can look for patterns in this old data that are consistent with positive or negative customer behavior, and then use those patterns as a template to try to predict which of the company's existing customers will also turn out to be "good" or "bad".
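Under the hood, this is a classification problem. Here's a minimal sketch of the idea in Python with scikit-learn (the file and column names are hypothetical, purely for illustration; this is not IRAnalytics' actual implementation):

  import pandas as pd
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split

  # Historical customers whose outcome is already known ("is_good": 1 or 0).
  history = pd.read_csv("historical_customers.csv")
  features = ["salary", "avg_balance", "num_transactions", "credit_score"]

  # Hold out some history to check how well the learned patterns generalize.
  X_train, X_test, y_train, y_test = train_test_split(
      history[features], history["is_good"], test_size=0.2, random_state=0
  )

  model = LogisticRegression(max_iter=1000)
  model.fit(X_train, y_train)
  print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

  # Apply the learned patterns to existing customers with unknown outcomes.
  current = pd.read_csv("current_customers.csv")
  current["p_good"] = model.predict_proba(current[features])[:, 1]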

For the last few decades these financial companies have had only two options for producing predictive models, both of which are slow and expensive.
 
 

Choice 1: Outsourcing

The first is outsourcing, which takes weeks to months per model. The process is simple, though:
 
  • Decide what your target is
  • Hand data and target to an outsourced analytics company
  • Receive exactly one model (up to a month or two later)

The drawbacks are easy enough to identify: the company has no control over the process, the turnaround is long, and the cost is recurring and very expensive.
 
 

Choice 2: In-House

The second choice is to purchase the tools and hire a PhD statistician to run them for you. A model is produced in just 1-2 days, but the process is laborious.

Data Preparation:
  • Load data
  • Run a custom script to display fields in order, along with pertinent data summaries
  • Print out the list of fields
  • Hand-annotate the print-outs (3-8 hrs)
  • Run scripts, one feature at a time, to make the necessary transformations (could take many hours, depending on the size of your data set; a sketch of such a script follows this list)
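To give a feel for those one-off scripts, here's a rough sketch of the kind of per-feature transformation pass an analyst would run (pandas; the field names are hypothetical):

  import numpy as np
  import pandas as pd

  df = pd.read_csv("historical_customers.csv")

  # One pass per feature: impute missing values, then transform.
  df["salary"] = df["salary"].fillna(df["salary"].median())
  df["salary_log"] = np.log1p(df["salary"])  # tame the long tail

  df["credit_score"] = df["credit_score"].clip(300, 850)
  df["score_band"] = pd.cut(
      df["credit_score"],
      bins=[300, 580, 670, 740, 850],
      labels=["poor", "fair", "good", "excellent"],
      include_lowest=True,
  )

  df.to_csv("prepared_customers.csv", index=False)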
 
Create Predictive Models:
  • Decide what your target is
  • Declare the target and other parameters in a script, then run the script on your data (build your model; could take more than an hour)
  • Export the results to Excel
  • Define columns and formulas to extract the pertinent results (could take more than an hour)
  • Configure charts and graphs in Excel to visualize the success or failure of your one model (could take more than an hour; a sketch of this loop follows the list)
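As a sketch of that build-then-export loop (scikit-learn and pandas; the column names are hypothetical, and writing .xlsx assumes an Excel writer such as openpyxl is installed):

  import pandas as pd
  from sklearn.ensemble import GradientBoostingClassifier
  from sklearn.metrics import roc_auc_score

  df = pd.read_csv("prepared_customers.csv")
  features = ["salary_log", "credit_score", "avg_balance"]
  target = "is_good"  # declare the target

  # Build the model, then score every customer with it.
  model = GradientBoostingClassifier().fit(df[features], df[target])
  df["score"] = model.predict_proba(df[features])[:, 1]
  print(f"training AUC: {roc_auc_score(df[target], df['score']):.3f}")

  # Export so the charts and lift tables can be built by hand in Excel.
  df[["customer_id", target, "score"]].to_excel("model_results.xlsx", index=False)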
 
The upside is having control in-house and spending less time, but the cost is still high and the process for determining success is crude at best. Not to mention the difficulty of trying to find a qualified PhD statistician.

Choice 3: IRAnalytics

IRAnalytics puts a GUI on this process, enabling regular analysts to do in hours the same work that once took days or months. The process is once again simple.
 
Data Preparation
  • Load data
  • Use the data characterization UI to make the necessary transformations in the system (could take up to an hour, depending on the size of your data set)

 
Creating Predictive Models
  • Build model
  • View model results
 
And then you can:
  • Tweak your model and build again
  • Build new models
  • Set up a series of subtly different model experiments, then kick off an overnight build (a sketch of such a batch follows this list)
  • Compare those models with each other the next morning (graphs, charts, stats, key differentiators, etc.)
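For illustration, an overnight batch like that might look something like this sketch (scikit-learn; the parameter grid and column names are hypothetical):

  import pandas as pd
  from sklearn.ensemble import GradientBoostingClassifier
  from sklearn.model_selection import cross_val_score

  df = pd.read_csv("prepared_customers.csv")
  features = ["salary_log", "credit_score", "avg_balance"]

  # Kick off the whole grid before leaving for the night.
  results = []
  for depth in (2, 3, 4):
      for rate in (0.05, 0.1):
          model = GradientBoostingClassifier(max_depth=depth, learning_rate=rate)
          auc = cross_val_score(model, df[features], df["is_good"],
                                scoring="roc_auc", cv=5).mean()
          results.append({"max_depth": depth, "learning_rate": rate, "auc": auc})

  # Next morning: one table comparing every experiment side by side.
  print(pd.DataFrame(results).sort_values("auc", ascending=False))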




Design Lead: Vaughn Aldredge
Project Management: Marc Pottier
 
 
  
 