Brian Fitzpatrick: Contamination Is the Problem

Posted by Michael Bates on June 26, 2017

PERSON OF THE WEEK: Brian K. Fitzpatrick is president and CEO of LoanLogics Inc., which focuses on loan quality management and performance analytics technologies for the mortgage industry.

We recently checked in with him to get a sense of how technology, data and loan origination are intertwined in today’s lending environment.

Q: With the abundance of new technology being introduced to the mortgage industry in support of fully electronic originations, is there a lack of trust that has affected adoption? Why?

Fitzpatrick: There is a general lack of trust among lenders, which stems from the industry’s huge data integrity problem. Loan officers and mortgage companies collect a lot of information from borrowers, including asset, income and bank account information – and then they collect more information so they can confirm what the borrower said on the application. Soon you’re dealing with an increasing number of partners, documents and data. The more information that is gathered, the more likely you are to get conflicting information, and then your data starts to cross-contaminate.

Simply moving toward electronic originations does not necessarily fix a lender’s data integrity problem. For example, instead of using smart docs, many lenders are taking documents that were signed electronically and turning them into “dumb documents,” which are just images of electronically signed documents. Basically, you end up with these hybrid types of documents, which makes it harder to verify and validate the data on them unless you have the proper tools.

Q: Do some lenders fall back on manual operations simply because they don’t trust technology? How pervasive do you believe this is?

Fitzpatrick: Yes, many lenders are clinging to manual processes because they can’t trust their technology to ensure data integrity. You only have to look at the MBA’s statistics on per-loan costs to see how pervasive it is. The average loan underwriter is stuck working on approximately 1.5 loans per day because they can’t always trust the accuracy of the data – they are spending most of their time reviewing documents and all the data in those documents. However, a growing number of lenders are starting to see that technology can help by showing all the data at once, so they can more easily identify the risks.

Q: What misconceptions do lenders currently hold about technology that add to the problem?

Fitzpatrick: Right now, there’s a perception that the digital mortgage experience and point-of-sale technologies will solve everybody’s problems. While these technologies may make the borrower’s experience better, in many cases they’re making the industry’s data integrity problem even worse.

While lenders are providing a digital mortgage experience to borrowers on the front end, many are still using an older system on the back end. The older system uses a different data model and rules, so lenders can’t validate or verify all the data that borrowers are submitting. At the extreme, some lenders are even taking the information the borrower submits and retyping it into their LOS. It’s like putting lipstick on a pig – plus it invites bad data.

Another misconception is that the LOS is the sole source of truth. There is a distinction between a source of truth that spans multiple systems and documents and a system of record that ultimately can house that information. I can take garbage and put it in my LOS, so the information is in the file, but it is still garbage. That’s why there is such a huge need for lenders to continually validate and verify data across systems and documents, and then to update that information in their LOS. We’ve worked on loans from lenders using all the various LOS vendors. They are plagued equally by data defects that are propagated through the origination process.
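Fitzpatrick’s point about validating data across systems and documents can be sketched in a few lines of code. This is a hypothetical illustration only: the field names and sample records are assumptions for the example, not LoanLogics’ actual data model or any real LOS schema.

```python
# Hypothetical sketch: flag fields where document-extracted data and the
# LOS record disagree, so a reviewer can resolve the conflict before the
# bad value propagates through the origination process.

def find_data_defects(los_record: dict, document_data: dict) -> list:
    """Return (field, los_value, doc_value) tuples for every mismatch."""
    defects = []
    for field, doc_value in document_data.items():
        los_value = los_record.get(field)
        if los_value != doc_value:
            defects.append((field, los_value, doc_value))
    return defects

# Illustrative records: the borrower's name was retyped with a "y".
los_record = {"borrower_name": "Bryan Fitzpatrick", "loan_amount": 300000}
document_data = {"borrower_name": "Brian Fitzpatrick", "loan_amount": 300000}

print(find_data_defects(los_record, document_data))
# → [('borrower_name', 'Bryan Fitzpatrick', 'Brian Fitzpatrick')]
```

The key design point is that the check runs continually as new documents arrive, rather than trusting whatever value happened to land in the LOS first.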

Q: What can technology vendors do to ease those concerns?

Fitzpatrick: Lenders need a certain level of comfort when it comes to data. The key is to make sure the data is accurate by validating and verifying it both in real time and throughout the mortgage process.

For example, if a borrower must show he has $30,000 in funds for a down payment, the lender is going to ask for a bank statement. Today, it’s very easy to create a fake paper bank statement that looks real; you can’t tell the difference between the fake statement and the real statement. But if the borrower gives the lender permission to check the data electronically through the borrower’s bank, nobody can tamper with the information.

Could someone tinker with the data once it’s in the system? Sure, but that’s why you continue to validate and verify the data. Underwriters are suspicious by nature – after all, it’s their job to assess risk.
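The down-payment example above amounts to a simple sufficiency check once the balances arrive electronically. A minimal sketch, assuming the balances have already been pulled through a permissioned bank-data connection (the function and figures here are illustrative, not a real bank API):

```python
# Hypothetical sketch: verify a stated down-payment amount against account
# balances retrieved electronically with the borrower's permission, rather
# than trusting a paper statement that could be faked.

def verify_funds(stated_amount: float, bank_balances: list) -> bool:
    """True if the borrower's combined balances cover the stated amount."""
    return sum(bank_balances) >= stated_amount

# Borrower states $30,000 for the down payment; the bank reports two accounts.
print(verify_funds(30000.0, [22500.0, 9100.0]))  # → True (31,600 covers it)
print(verify_funds(30000.0, [12000.0, 9100.0]))  # → False (21,100 falls short)
```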

Q: What is it about the mortgage process that consistently creates loan defects, and what should lenders be doing about that?

Fitzpatrick: This gets back to what I was saying earlier about the increasing amount of data from various sources that goes into the loan file. The main culprits behind loan defects are simple human error, the retyping of information, and corrections made during a closing that never make their way back to the system of record. Because the mortgage process is so disconnected, with many different partners, many different handoffs of data and many different processes, it’s feasible that people are retyping information several times during the course of the transaction.

For example, my wife and I recently refinanced our home. When we got the appraisal back, I noticed that the order was correct, but my name was spelled with a “y” – plus the street name and my wife’s name were spelled wrong. Of course, I didn’t misspell my name or my wife’s name and the name of my street when I applied for the loan. Somebody had to retype it into a different system when actually ordering the appraisal, or someone at the appraisal company did it. Every time that happens, it increases the likelihood of contaminated data in the file.

Q: Is the constant stream of new regulations and investor demands for better quality loans to blame, and if so, why can’t technology seem to keep up with them?

Fitzpatrick: Blaming new mortgage regulations and other requirements is like saying the county health department is bad for the restaurant business. Before we had health inspectors, a lot more people were getting sick. When we created regulations, restaurants were forced to improve their equipment. New technologies and processes to ensure food safety played a role. It not only made consumers healthier, but restaurants were also able to avoid problems that could have put them out of business.

Regulations are also making our industry easier for consumers to understand. My wife, who formerly ran compliance for a large lender pre-TRID, recently saw TRID documents for the first time and was amazed at how simple they were to read and understand.

I don’t agree with the idea that technology cannot keep up with our industry’s regulations and requirements. If a lender is relying on their LOS for compliance – remembering that the LOS is a system of record, not the source of truth – then yes, you will have problems because of the rampant data contamination that is pervasive throughout the origination process.

But there are tools that enable lenders to pull data from any source, validate and verify it, and run it through an automated rules engine that can be called from any system, whether it’s an LOS or a point-of-sale system. There will always be those who blame their circumstances on industry regulations, but that’s not the issue. The issue is data integrity and consistency of rules and workflow to automate processes across the enterprise. If we focus on solving that problem, everything else will fall into place.
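The rules-engine idea described above can be sketched as a small, system-agnostic function: any caller, whether an LOS or a point-of-sale system, passes in loan data and gets back a list of failed rules. The rules below are illustrative assumptions for the example, not actual compliance or investor rules.

```python
# Hypothetical sketch of an automated rules engine callable from any system.
# Each rule is a (name, predicate) pair run against the loan data.

RULES = [
    ("loan_amount_positive", lambda loan: loan["loan_amount"] > 0),
    ("ltv_within_limit",
     lambda loan: loan["loan_amount"] / loan["property_value"] <= 0.97),
    ("borrower_name_present", lambda loan: bool(loan.get("borrower_name"))),
]

def run_rules(loan: dict) -> list:
    """Evaluate every rule; return the names of the rules that failed."""
    return [name for name, rule in RULES if not rule(loan)]

loan = {"borrower_name": "Jane Doe",
        "loan_amount": 285000,
        "property_value": 300000}

print(run_rules(loan))  # → [] — all rules pass (LTV = 0.95)
```

Because the engine is just a function over loan data, the same rules produce the same answers no matter which system calls it, which is the consistency-of-rules point Fitzpatrick is making.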
