Behavioral Credit Scoring
Citation: 101 Geo L.J. 807 (2013)
In 2011, Kevin Rose, the influential founder of the website Digg, caused a stir when he took to his video blog to share a “random idea.” “This might be potentially the dumbest . . . least vetted idea that I’ve ever thrown out there,” Rose said, “But . . . what if we could make credit cards a little bit more social?” Rose suggested a system in which trusted friends would invite each other into an extended credit network sponsored by a bank. The more friends you could invite, and the better their credit, the lower your own interest rate would be and the higher your available credit. Unlike delinquency on a debt to a large, faceless bank, where the only real consequence of missed payments would be a slight credit-score reduction, Rose argued, borrowers in the newly formed credit circle would be loath to miss payments because doing so would let down their trusted friends and the network. Peer pressure and tight social bonds would provide far greater enforcement than the coercive power of a large institution ever could. It would be, in a sense, a return to a preindustrial small-town ideal in which credit was extended on a handshake, based on one’s standing in a close-knit community.
As our commercial and social lives are increasingly mediated through digital technologies, companies large and small have begun to make credit “social,” and not only in the seemingly benign way proposed by Rose. By collecting and mining the enormous wealth of personal data generated by the mundane tasks of daily existence—shopping, reading, socializing—credit card companies, large banks, and a host of start-up companies in the data-collection and lending fields are on the cusp of a revolution in the way they determine and price risk in credit markets. In the coming years, who you know, where you shop, and what you read may dramatically affect your access to credit. Although much scholarly attention has been paid to the privacy implications of online data mining and aggregation, or “dataveillance,” for use in targeted behavioral advertising, relatively little attention has been focused on the adoption of these techniques by lenders. And although the efficiency and accuracy justifications for total access to consumer information may be at their strongest when determining credit risk, these practices also raise unique concerns regarding our privacy expectations in digital space. The heightened potential for discrimination facilitated by online tracking deserves closer attention. This Note seeks to address some of these concerns.