Your Credit History Isn't the Only Thing With a Score

Ratings on all kinds of things increasingly affect people’s lives


You might know what your credit score is. But what about your money-laundering risk score, your insurance risk score, or the score a court might use to predict your likelihood of committing a crime?

Key Takeaways

  • Companies and governments increasingly are relying on computerized scoring using both public and private data to make important decisions about how they treat individuals.
  • Scores influence how much medical care people receive, whether they’ll be jailed or set free, or what advertisements they’re targeted with. 
  • Unlike with a credit score, people are usually unaware that they’re even being rated, let alone what information is being used or how the scores work. 
  • The GAO recommended Congress consider regulations to allow consumers to see their data and correct mistakes.

Hospitals, colleges, banks, insurance companies, and even the criminal justice system increasingly score people in ways that are often unknown to those being scored and unregulated by federal consumer laws, according to a report released Thursday by the Government Accountability Office. The watchdog agency, which said it could not determine even how many such scores are in use or exactly what they are used for, said consumers would benefit if the government set rules for these other scores the same way it does for credit scores.

These lesser-known scores can be hugely influential: Much the way a credit score can determine whether you’re approved for a loan, other kinds of scores can help determine whether a credit card transaction is flagged as fraudulent, whether a healthcare provider reaches out to you with special services, or whether you’ll be sent to jail or let go if you’re arrested.

“Unlike traditional credit scores, these scores may not be subject to consumer protection laws that seek to assure fair and transparent treatment,” the GAO said in its report. “Consumers are generally unaware of how they're scored. We urged Congress to consider a consumer right to view and correct this data and more.”

The GAO did not mention specific companies in its report, but such scores are numerous across businesses and government agencies. For example, in addition to calculating consumer credit scores, FICO offers a “medication adherence score” to help healthcare providers assess how likely a patient is to take their prescription.

If you’re arrested in New Jersey, whether you’ll go home or be sent to jail is not up to whether you pay bail, but rather a computer calculation that scores your likelihood of fleeing or committing more crimes while awaiting trial. 

In order to comply with banking laws, financial institutions use automated tools to determine whether a client may be involved in terrorism financing or money laundering.

And one company, unnamed in the report, scored Hispanic/Latino consumers on “cultural integration” and used that information to target marketing and advertising.

While the report acknowledges that consumers may sometimes benefit from such scoring—when it helps companies detect fraud and identity theft, for example—the lesser-known scoring also raises major concerns about privacy and transparency. Consumers often do not know what information is being used to create the scores, how they’re calculated, or even that they exist, the GAO said. Of 49 websites offering scoring services that the GAO reviewed, only two offered consumers a way to find out their scores—assuming consumers even knew such a score existed.

Then there’s the issue of privacy: The GAO found that scores were often created using public records such as court and property records, information gleaned from sources like social media and newspapers, and private data such as store loyalty card activity, Internet search terms people have used, and websites they’ve visited. And organizations often use predictive modeling, machine learning, and other analytical techniques to generate the scores in ways that are unknown to the people being scored.

Have a question, comment, or story to share? You can reach Diccon at dhyatt@thebalance.com.

