In today's era of strict regulatory compliance, banks, financial institutions, and other B2B businesses must verify the identity of every new business customer, a process known as Know Your Business (KYB).
When choosing between business identity verification vendors, it’s important to run a data test to ensure the product experience, outputs and performance all align with your current and future compliance needs. This guide explains how to prepare for a data test, conduct a data test, and evaluate the test’s results when choosing a KYB vendor.
The importance of data tests
Any KYB compliance evaluation should provide insightful and tailored recommendations that demonstrate the solution's projected value based on the data test findings. Data tests evaluate:
- Data quality: Is the data returned reputable, accurate, and useful?
- Data coverage: Does the vendor thoroughly search all possible, reputable data sources?
- Data freshness: New businesses are formed every day; is the vendor's data up to date?
Further, you need to decide which metrics you will use to evaluate each data test (e.g., hours saved or a reduction in the number of required analysts).
By emphasizing data quality, coverage, and freshness in testing KYB vendors, organizations can mitigate compliance risks and maximize conversion. A data test can assess the recency and relevancy of the returned outputs, and whether the fields align with your industry, regulatory requirements, and business objectives. The results of the test can also help determine where the KYB provider should be within the onboarding flow and vendor stack.
Preparing for the data test
Defining the ideal output
The old adage holds: "Measure twice, cut once." The same is true of KYB data tests. To get the most value out of a data test, first meet with key stakeholders to determine what your business needs in a KYB vendor. When gathering your internal team before a data test, example questions to ask include:
- What would our auto-approval metrics look like for these specific attributes or rulesets?
- Which steps do our current process or team of analysts suspect are the highest friction points? Examples include document verification/upload, address lookup, website identification, and industry classification.
- What are the critical requirements for delivering the above use cases?
- Does the vendor have all of the necessary data and data sources to achieve the critical requirements?
- Can the vendor map out which products and features align with our critical requirements?
- What are the auto-approval rules or attributes that you are looking to assess?
A data test loses all usefulness if a key requirement isn’t tested for. When this happens, the data test needs to be set up and run again, which can both negate the effectiveness of the data test and delay the process of choosing a vendor. Be sure to check in with everyone involved in regulatory compliance to ensure your vendor sets up the test in the most effective way.
Also, work with your vendor on this. At Middesk, we generally have our customers walk us through their onboarding flow. Together we discuss pain points, bottlenecks, and applicant drop-off points, and ultimately create a data test strategy that addresses all of these factors.
Defining parameters and criteria for success
When setting up the data test, the next step is to clearly define what you want to see. This includes the following:
- Compliance rulesets: What rules will you use to approve, reject, or send a potential customer into "needs review" status?
- Competitor benchmarks: The auto-approval or overall approval rates that competitors in your industry rely on
- Other relevant metrics: Are you looking to save analysts’ time or replace other vendors in your customer approval stack?
Having a clear understanding of these three criteria will ensure that you can clearly evaluate the results of the KYB data test.
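To make a compliance ruleset concrete, here is a minimal sketch of how approve/reject/"needs review" rules might be expressed in code. The field names and thresholds are hypothetical illustrations, not any vendor's actual API schema or a recommended policy.

```python
from dataclasses import dataclass

# Hypothetical applicant attributes for illustration only; a real ruleset
# would draw on the attributes your vendor actually returns.
@dataclass
class Applicant:
    registration_verified: bool  # business found in state registries
    address_matched: bool        # submitted address matches filings
    watchlist_hit: bool          # appears on a sanctions/watchlist

def decide(a: Applicant) -> str:
    """Apply a simple ruleset: approve, reject, or flag for manual review."""
    if a.watchlist_hit:
        return "reject"
    if a.registration_verified and a.address_matched:
        return "approve"
    return "needs review"

print(decide(Applicant(True, True, False)))   # approve
print(decide(Applicant(False, True, False)))  # needs review
```

Writing rules out this explicitly, even on paper, makes it easier to confirm during the data test that each vendor returns every attribute the ruleset depends on.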
Moving on to next steps
Once you've defined the desired results, the next step is to determine:
- What are the specific goals of the data test?
- What information outputs must the data test provide?
- What key requirements must the data meet?
- Who will be involved with the data test?
- How do my company’s current growth stage, product status, and onboarding bottlenecks affect the data test?
- How will you judge the results? Manual review?
Conducting the data test
The scientific method requires running the same test repeatedly under the same conditions to reach a trustworthy result. It's nearly impossible to run identical data tests with different KYB vendors, because each vendor relies on different sources and provides different data outputs, but it is vital to get as close to an apples-to-apples comparison as possible.
Ideally, the information you provide to each vendor for their data test will represent your business's current traffic. At Middesk, we recommend accounting for your business's target customer size, industry, and financial resources. For example, you'll want to include a realistic percentage of registered businesses vs. sole proprietors that you expect to serve. You'll also want to include applicants that represent both targeted and unqualified customers.
It's tempting to test a new vendor against historic data. However, both your auto-approval rules and customer demographics change over time, so comparing a new vendor's outputs with historic data doesn't produce an apples-to-apples comparison.
When it comes to a data test:
- Provide each vendor with the same random and realistic sample to run
- If you serve sole proprietorships and small businesses, ensure your sample includes a comparable percentage of each
- Include applicants that would represent both qualified and unqualified customers
- Provide recent applicants to test the vendor’s data freshness
By submitting the same samples to each vendor, you can evaluate how the vendors stack up against one another and how well each meets your business objectives.
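The sampling guidance above can be sketched as a small stratified-sampling routine: draw one random sample whose mix of entity types mirrors your real traffic, then submit that same sample to every vendor. The applicant pool, field names, and 80/20 split below are hypothetical assumptions for illustration.

```python
import random

# Hypothetical applicant pool: 80% registered businesses, 20% sole
# proprietors, standing in for your actual recent traffic.
applicants = (
    [{"id": i, "entity_type": "registered"} for i in range(800)]
    + [{"id": i, "entity_type": "sole_prop"} for i in range(800, 1000)]
)

def stratified_sample(pool, key, size, seed=42):
    """Draw a random sample whose strata proportions mirror the full pool."""
    rng = random.Random(seed)
    strata = {}
    for record in pool:
        strata.setdefault(record[key], []).append(record)
    sample = []
    for group in strata.values():
        # Size each stratum proportionally so the sample keeps the pool's mix
        n = round(size * len(group) / len(pool))
        sample.extend(rng.sample(group, n))
    return sample

sample = stratified_sample(applicants, "entity_type", 100)
# This one sample is then submitted, unchanged, to every vendor under test.
```

Fixing the random seed makes the draw reproducible, so you can document exactly which applicants every vendor received.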
Evaluating the results of a data test
If you’ve created an apples-to-apples comparison with your data tests, evaluating the results should now surface any issues and allow you to compare and contrast vendors.
Return to the questions you asked your team when preparing for the data test. Did the data test answer your questions? Are there areas for improvement?
To facilitate this process, we recommend:
- Comparing the data test results with any incumbents and all potential new KYB vendors
- Evaluating any use case and workstream recommendations provided by each potential KYB vendor; your vendor should be able to show you how their process solves the pain points you identified in the preparation stage
- Evaluating any efficiency and conversion gains resulting from each solution
- Identifying any areas where vendors fell short on the data test’s goals
From there, we recommend basing your choice of KYB vendor on the following:
- The vendor’s data test meets critical technical and product requirements
- The product's value proposition solves existing and future compliance pain points
- The vendor’s recommendations (and/or customized use cases) align with your current needs and future product roadmap