Keys To Making Big Data Affordable
Wednesday, August 17, 2016
CUToday (August 17, 2016) - The only way small credit unions can afford Big Data is through collaboration, according to the Minnesota CU Network, which has a new offering to help not-so-big cooperatives dig much deeper into their member information.
MnCUN has partnered with OnApproach to white label the company’s data analytics solution for its member credit unions.
John Ferstl, VP of network services at the league, said the program is designed to be affordable for credit unions with assets of $50 million and up.
“Today, to do data analytics well and on your own, you have to be a credit union with at least $600 million in assets,” said Ferstl. “We see the big banks and big retailers using Big Data and they are threatening to come into our space and do our job better than we do.”
Ferstl contends there is no real data analytics solution in the marketplace for CUs that is agnostic to all the systems credit unions have implemented.
Credit unions now run a wide range of platforms, from mobile banking to mortgages to online banking, and they struggle to pull data from all of them.
“The systems that route to these credit union platforms are primarily provided by the core system providers and they do not play well with other systems,” explained Ferstl. “With this being the case, many credit unions are limited to only warehousing the data that is captured by their core system. Unfortunately, the good and really powerful data is captured outside the core.”
Ferstl explained that the solution offered through the league takes OnApproach’s M360 data analytics platform and makes it available through a service bureau.
“There is the base layer, a SQL database, that captures all of the credit union’s data for them, and then pulls SQL reporting from it via a variety of tools,” said Ferstl. “We take that information, do all the data processing for the credit union, and then kick the reporting back to the CU.”
What this does, according to Ferstl, is eliminate the credit union’s need to buy data analytics hardware and software, handle licensing, and take on the expense of a SQL programmer and a data scientist. Ferstl added that the system has an open API capability, which allows simple linking to new business areas without paying a connection charge.
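The architecture Ferstl describes, where data from multiple channels lands in one SQL database and reports are run against the combined view, can be sketched in a few lines. The table name, columns, and figures below are purely illustrative assumptions, not OnApproach M360's actual schema:

```python
import sqlite3

# Hypothetical sketch: a tiny in-memory warehouse combining records
# from several source systems. Schema and data are invented for
# illustration; they do not reflect the M360 platform's design.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transactions (
    member_id INTEGER,
    source    TEXT,   -- 'core', 'mobile', 'online', ...
    amount    REAL,
    tx_date   TEXT
);
INSERT INTO transactions VALUES
    (1, 'core',   500.00, '2016-07-01'),
    (1, 'mobile',  42.50, '2016-07-03'),
    (2, 'core',   120.00, '2016-07-02'),
    (2, 'online',  75.25, '2016-07-05');
""")

# A simple cross-platform report: total activity per member,
# regardless of which channel captured the transaction.
rows = conn.execute("""
    SELECT member_id, COUNT(*) AS tx_count, ROUND(SUM(amount), 2) AS total
    FROM transactions
    GROUP BY member_id
    ORDER BY member_id
""").fetchall()

for member_id, tx_count, total in rows:
    print(member_id, tx_count, total)
```

In the service-bureau model described above, a credit union would not run queries like this itself; the league's bureau would run the processing and send the finished reports back.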
Ferstl said the tough situation facing small credit unions is that doing Big Data on their own generally means years of staff hours to build their own database, at a cost of around $2 million to $3 million.
“Going with the league’s solution will be a small fraction of that cost,” said Ferstl.
Ferstl noted that the league can go back as far as seven years and collect data.
“Now you can do trending and analysis on all that data,” he said.
Ferstl added that participating credit unions that don’t have several years of data on a certain product can benefit from trending models produced from an aggregate of similar CUs that have signed on with the service bureau.
“For example, a small CU that does not have enough loss history to build their own appropriate ALM model,” said Ferstl. “But if I can take data from a large number of similar-size, similar-portfolio credit unions in Minnesota and go back seven years, we can build an ALM model that is appropriate for their credit union.”
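The pooling idea in Ferstl's example, where a small credit union with thin loss history borrows an aggregate rate built from similar-size, similar-portfolio peers, can be sketched as follows. All peer names and dollar figures are invented for illustration:

```python
# Hypothetical sketch of pooled loss-rate modeling: aggregate seven
# years of peer data rather than relying on one small CU's thin
# history. Every figure below is made up for illustration.
peers = [
    # (credit_union, total_loans_dollars, charge_offs_dollars) over 7 years
    ("CU A", 40_000_000, 280_000),
    ("CU B", 55_000_000, 330_000),
    ("CU C", 48_000_000, 360_000),
]

total_loans = sum(loans for _, loans, _ in peers)
total_losses = sum(losses for _, _, losses in peers)

# Pooled rate: aggregate losses over aggregate loans, which weights
# larger peers more heavily than a simple average of individual rates.
pooled_rate = total_losses / total_loans
print(f"Pooled 7-year loss rate: {pooled_rate:.4%}")
```

A real model would segment by loan type, vintage, and economic conditions; the point of the sketch is only that the aggregate across peers is a more stable estimate than any single small portfolio can supply.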
The data analytics CUSO OnApproach said it is looking to expand the offering, which it called as important to credit unions as ATM sharing and shared branching, to other providers across the country.