The lack of online privacy is affecting your chances of getting loans and jobs

Authors: Mary Chaffee and Nikita Ostrander

The way corporations use the data they harvest from people’s digital wakes makes the internet feel booby-trapped. Having your job prospects or access to credit depend on unknowable interpretations of your browsing history, interests, or purchases takes control out of people’s hands. When you can’t control what data is shared, that’s a violation of privacy. When you can’t control how that data is used, it erodes individual rights to self-determination by making the consequences of your actions unknowable. In short, it’s predatory behavior. In the image, we have illustrated this by showing how your digital wake is gobbled up by different corporations whether or not you notice.

When people spend time on the internet, their data are harvested by Google, Facebook, and a slew of other companies. Not only are user data sold to advertisers, they are also used by financial institutions, other lenders, and employers to decide whether users are worthy candidates for the funds or jobs they apply for. The mindset behind this mass extraction of user data, or “big data,” rests on two main assumptions: more information is always better, and data are more objective than human beings. Unfortunately, both assumptions have flaws that show up in practice.

These days, with the prominence of social media and overall internet usage in the US, many of us are beginning to realize that these “free” services come with a real cost: our personal data. According to Kim Komando in “How to Stop Your Smartphone from Tracking Your Every Move, Sharing Data and Sending Ads” in USA Today, “‘Targeted advertising’ is a massive phenomenon. Companies are eager to flood your screen with ads, which are primarily influenced by your day-to-day habits. Facebook, Apple, Microsoft, Amazon, Google and many others make money off mobile ads, and they need this information to power their data-mining machines.” Companies use personal data to target advertisements at users and to tailor their social media and internet experiences. In many cases, the people with your data “know you more than you know you.” It is a form of clever manipulation that keeps people on these sites and gets them to click “buy” buttons when they otherwise wouldn’t. Because of this, data on users’ demographics and browsing tendencies are extremely valuable to these companies.

Advertisements aren’t the only thing user data are used for. Banks and businesses have begun to use “e-scores” to evaluate people based on things like their browsing history, their zip codes, and their recent purchases. User data are fed to an algorithm that evaluates eligibility for loans or other services and spits out an e-score that can make or break a person’s chances. According to Cathy O’Neil in Weapons of Math Destruction, “Many of their pseudo-scientific models attempt to predict our creditworthiness, giving each of us so-called e-scores. These numbers, which we rarely see, open the doors for some of us, while slamming them in the face of others. Unlike the FICO scores they resemble, e-scores are arbitrary, unaccountable, unregulated, and often unfair…” This practice becomes problematic when people are refused loans simply because their neighborhood marks them as “risky borrowers,” leaving them unable to lift themselves out of that situation.
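
To make the mechanics concrete, here is a minimal sketch of what such a scoring pipeline could look like. Everything in it is an assumption invented for illustration: the features, weights, zip-code table, and approval threshold are not drawn from any real lender’s model, which would be proprietary and far more complex.

```python
# Illustrative only: a toy "e-score" with invented features and weights.
# It shows how proxy data (zip code, browsing history, purchases) can
# drive a loan decision without ever measuring ability to repay.

from dataclasses import dataclass

# Hypothetical lookup: average repayment rate observed in each zip code.
# Using this as a feature means an applicant is judged by their neighbors.
ZIP_REPAYMENT_RATE = {
    "10001": 0.93,
    "60620": 0.71,  # a lower-income neighborhood drags every resident's score down
}

# Hypothetical weights a lender might attach to browsing categories.
BROWSING_WEIGHTS = {
    "payday_loans": -0.30,
    "job_boards": -0.10,
    "luxury_goods": +0.05,
    "investing": +0.15,
}

@dataclass
class Applicant:
    zip_code: str
    browsing_categories: list[str]
    recent_purchase_total: float  # dollars spent online last month

def e_score(applicant: Applicant) -> float:
    """Return a score in roughly [0, 1]; higher means 'safer' to the lender."""
    score = ZIP_REPAYMENT_RATE.get(applicant.zip_code, 0.80)        # neighborhood proxy
    for category in applicant.browsing_categories:
        score += BROWSING_WEIGHTS.get(category, 0.0)                # browsing proxy
    score += min(applicant.recent_purchase_total / 10_000, 0.1)     # spending proxy
    return max(0.0, min(1.0, score))

def loan_decision(applicant: Applicant, threshold: float = 0.75) -> str:
    return "approve" if e_score(applicant) >= threshold else "deny"

if __name__ == "__main__":
    a = Applicant("10001", ["investing"], 2_500.0)
    b = Applicant("60620", ["payday_loans", "job_boards"], 800.0)
    print(loan_decision(a), loan_decision(b))  # approve deny
```

Notice that none of these inputs measures the individual applicant’s actual ability to repay: two people with identical incomes could receive opposite decisions purely because of where they live and what they browsed.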

The lack of privacy on the internet is what enables companies to produce these e-scores and other judgements of people from largely irrelevant kinds of data. In the case of banks and creditworthiness, algorithms that incorporate arbitrary data like zip codes and internet history reinforce harmful biases that lock people in poverty along racial lines. Algorithms are biased because data are biased, and data are biased because they’re observed, collected, and interpreted by human beings. A UC Berkeley study titled “Consumer-Lending Discrimination in the FinTech Era” showed that minority borrowers still pay 11 to 17 percent more over the lifetime of the loan than white borrowers do. The study also found that credit risk analysis accounts for only about 70 percent of pricing variation in lending. The authors speculate that the remainder reflects some amount of “strategic pricing” based on other information, which may include the kinds of information gobbled up by corporations following your digital wake.
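
The proxy problem can be made concrete with a toy simulation. In the sketch below, every number is invented for illustration and none of it comes from the Berkeley study: two groups of applicants have identical individual reliability, but one group is assumed to be more likely to live in zip codes with lower historical repayment rates. The model never sees group membership, yet approval rates diverge sharply because the zip-code feature stands in for it.

```python
# Illustrative simulation (invented distributions and weights): even when a
# protected attribute is never an input, a correlated proxy such as zip code
# reproduces the disparity.

import random
random.seed(0)

def simulate_applicant(group: str) -> dict:
    # Assumption for illustration: group B applicants are more likely to live
    # in zip codes with lower average repayment rates, regardless of their
    # own reliability, which is drawn from the same distribution for both groups.
    zip_rate = random.gauss(0.90, 0.03) if group == "A" else random.gauss(0.78, 0.03)
    individual_reliability = random.gauss(0.85, 0.05)
    return {"zip_rate": zip_rate, "reliability": individual_reliability}

def e_score(app: dict) -> float:
    # The model only sees the zip-code proxy and the individual signal,
    # never the group label itself.
    return 0.6 * app["zip_rate"] + 0.4 * app["reliability"]

scores = {"A": [], "B": []}
for group in scores:
    for _ in range(10_000):
        scores[group].append(e_score(simulate_applicant(group)))

for group, s in scores.items():
    approval_rate = sum(x >= 0.85 for x in s) / len(s)
    print(f"group {group}: mean score {sum(s)/len(s):.3f}, approval rate {approval_rate:.1%}")
```

Dropping the protected attribute from the inputs does not remove the bias; as long as a correlated proxy like zip code remains, the model reconstructs it.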

Citations:

Bartlett, R., Morse, A., Stanton, R., & Wallace, N. (2019). Consumer-Lending Discrimination in the FinTech Era. NBER. doi: 10.3386/w25943

O’Neil, C. (2016). Collateral Damage: Landing Credit. In Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (pp. 141-160). New York: Penguin Books.

Komando, K. (2019, March 07). How to stop your smartphone from tracking your every move, sharing data and sending ads. Retrieved December 04, 2020, from https://www.usatoday.com/story/tech/columnist/komando/2019/02/14/your-smartphone-tracking-you-how-stop-sharing-data-ads/2839642002/
