What’s Inside That Black Box: What Regulating Data Privacy and Policing Drunk Driving Have in Common

By Jerri-Lynn Scofield, who has worked as a securities lawyer and a derivatives trader. She is currently writing a book about textile artisans.

Writing in today’s NYT (I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too.), Kashmir Hill outlined how to get access to your secret consumer scores:

As consumers, we all have “secret scores”: hidden ratings that determine how long each of us waits on hold when calling a business, whether we can return items at a store, and what type of service we receive. A low score sends you to the back of the queue; high scores get you elite treatment.

In the run-up to the implementation of the California Consumer Privacy Act on 1st January 2020, some companies that compile and sell consumer data are making it easier for US consumers to access their data (for background, see California Privacy Law Looms). Prior to California’s action, the EU implemented its General Data Protection Regulation in 2018. The NYT notes:

…Some companies have decided to honor the laws’ transparency requirements even for those of us who are not lucky enough to live in Europe or the Golden State.

“We expect these are the first of many laws,” said Jason Tan, the chief executive of Sift. The company, founded in 2011, started making files available to “all end users” this June, even where not legally required to do so — such as in New York, where I live. “We’re trying to be more privacy conscious. We want to be good citizens and stewards of the internet. That includes transparency.”

How to Get Your Data

First things first. The NYT provides information on how you can get those data, and I include it here for readers who want to go that route:

There are many companies in the business of scoring consumers. The challenge is to identify them. Once you do, the instructions on getting your data will probably be buried in their privacy policies. Ctrl-F “request” is a good way to find it. Most of these companies will also require you to send a photo of your driver’s license to verify your identity. Here are five that say they’ll share the data they have on you.

  • Sift, which determines consumer trustworthiness, asks you to email privacy@sift.com. You’ll then have to fill out a Google form.

  • Zeta Global, which identifies people with a lot of money to spend, lets you request your data via an online form.

  • Retail Equation, which helps companies such as Best Buy and Sephora decide whether to accept or reject a product return, will send you a report if you email returnactivityreport@theretailequation.com.

  • Riskified, which develops fraud scores, will tell you what data it has gathered on your possible crookedness if you contact privacy@riskified.com.

  • Kustomer, a database company that provides what it calls “unprecedented insight into a customer’s past experiences and current sentiment,” tells people to email privacy@kustomer.com.

Despite its jocular tone, the NYT article reveals the extent to which companies compile and analyze data gleaned from our on-line transactions. But that fact shouldn’t come as any great surprise to readers of this site (or anyone who’s been paying attention, for that matter).

I’m not going to linger on that point.

Instead I want to say, okay, you have the data. So what? That raw data alone doesn’t really get you very far.

The more pressing concern: What do the companies do with the data? In other words, how do they transform raw data into my secret score? Just what exactly is inside the black box?

This is exactly the question asked by Laura Antonini, the policy director at the Consumer Education Foundation (CEA), and a co-author of a June report with CEA president Harvey Rosenfield. According to the Grey Lady,

[The CEA] wants the Federal Trade Commission to investigate secret surveillance scores “generated by a shadowy group of privacy-busting firms that operate in the dark recesses of the American marketplace.” The report named 11 firms that rate shoppers, potential renters and prospective employees.

“I don’t really care that these data analytics companies know I made a return to Victoria’s Secret in 2009, or that I had chicken kebabs delivered to my apartment, but how is this information being used against me when you generate scores for your clients?” Ms. Antonini said. “That is what consumers deserve to know. The lack of the information I received back is the most alarming part of this.”

In other words, most of these companies are just showing you the data they used to make decisions about you, not how they analyzed that data or what their decision was.

Omnipotent, Mysterious Black Boxes

Outsourcing a crucial decision to a mysterious black box should raise a large red flag – especially when juxtaposed against another NYT article, this one from yesterday, These Machines Can Put You in Jail. Don’t Trust Them.

The NY Times conducted an extensive investigation of the use of breathalyzers to measure the concentration of alcohol in the blood. One cannot refuse to take such a test in any state; and if the machine registers a blood alcohol level of greater than 0.08%, one’s virtually certain to be convicted of a drunk driving offense. Over to the NYT:

But those tests — a bedrock of the criminal justice system — are often unreliable, a New York Times investigation found. The devices, found in virtually every police station in America, generate skewed results with alarming frequency, even though they are marketed as precise to the third decimal place.

Judges in Massachusetts and New Jersey have thrown out more than 30,000 breath tests in the past 12 months alone, largely because of human errors and lax governmental oversight. Across the country, thousands of other tests also have been invalidated in recent years.

The machines are sensitive scientific instruments, and in many cases they haven’t been properly calibrated, yielding results that were at times 40 percent too high. Maintaining machines is up to police departments that sometimes have shoddy standards and lack expertise. In some cities, lab officials have used stale or home-brewed chemical solutions that warped results. In Massachusetts, officers used a machine with rats nesting inside.

Technical experts have found serious programming mistakes in the machines’ software. States have picked devices that their own experts didn’t trust and have disabled safeguards meant to ensure the tests’ accuracy.

The Times interviewed more than 100 lawyers, scientists, executives and police officers and reviewed tens of thousands of pages of court records, corporate filings, confidential emails and contracts. Together, they reveal the depth of a nationwide problem that has attracted only sporadic attention.

A county judge in Pennsylvania called it “extremely questionable” whether any of his state’s breath tests could withstand serious scrutiny. In response, local prosecutors stopped using them. In Florida, a panel of judges described their state’s instrument as a “magic black box” with “significant and continued anomalies.”

Even some industry veterans say the machines should not be de facto arbiters of guilt. “The tests were never meant to be used that way,” said John Fusco, who ran National Patent Analytical Systems, a maker of breath-testing devices.
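To see why a 40 percent calibration error matters so much, consider a quick hypothetical sketch. The 40 percent figure is the Times’; the “true” blood alcohol values below are illustrative, not from the investigation:

```python
# Illustrative only: how a calibration bias can push a measured BAC
# across the 0.08% per se legal threshold. The 40% over-reading comes
# from the Times investigation; the true-BAC values are hypothetical.
LEGAL_LIMIT = 0.08  # percent BAC

def measured_bac(true_bac: float, bias: float) -> float:
    """Reading from a machine that over-reports by `bias` (0.40 = 40%)."""
    return true_bac * (1 + bias)

for true_bac in (0.05, 0.06, 0.07):
    reading = measured_bac(true_bac, bias=0.40)
    verdict = "OVER" if reading > LEGAL_LIMIT else "under"
    print(f"true {true_bac:.3f}% -> machine reads {reading:.3f}% ({verdict} the limit)")
```

A sober-enough driver at a true 0.06% would register 0.084% on such a machine, i.e., legally drunk on the strength of the instrument alone.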

I could go on, but won’t. You get the point.

Data Privacy Protection: What Is to Be Done

Now, obviously, machines and algorithms have their place.

But they need to be understood, maintained, and regulated. Any time a decision is outsourced to a black box, and oversight surrendered … watch out!

Cave! Hic dragones.

Back to that June CEA report. #REPRESENT, a public interest group created by the CEA, has asked the Federal Trade Commission to investigate and stop illegal surveillance scoring.

According to a June account in The Hill, Advocates push FTC crackdown on secret consumer scores:

The complaint comes as lawmakers are increasingly scrutinizing major technology companies over their handling of user data. Facebook and Google have received the brunt of Washington’s attention because of their massive size and ability to microtarget advertisements based on their users’ behavior.

But #REPRESENT is hoping to shine a light on a part of the world of unregulated data collection that has received relatively little attention and has the potential to enable companies to discriminate against consumers on a massive scale.

“The ability of corporations to target, manipulate and discriminate against Americans is unprecedented and inconsistent with the principles of competition and free markets,” the complaint reads. “Surveillance scoring promotes inequality by empowering companies to decide which consumers they want to do business with and on what terms, weeding out the people who they deem less valuable. Such discrimination is as much a threat to democracy as it is to a free market.”

The surveillance score resembles the much-maligned credit score. Over to The Hill:

But unlike credit scores, there’s no transparency for consumers, and Rosenfield and Antonini argue that companies are using them to engage in illegal discrimination while users have little recourse to correct false information about them or challenge their ratings.

Jerri-Lynn here. Ha! I wouldn’t hold out the credit scoring system as a model for anything. Off the top of my head, a smattering of its defects: it’s far from transparent; credit reports frequently contain errors, which are time-consuming to correct; and the system is vulnerable to hacking (see Biometric ID Fairy: A Misguided Response to the Equifax Mess that Will Only Enrich Cybersecurity Grifters and Strengthen the Surveillance State).

So, by all means, request your data. But far, far more is needed to protect data privacy throughout the United States.

15 comments

  1. ambrit

    The trend to “credentialize” American daily life has an end game; the Chinese Social Score. When I remember that the ‘Great Firewall of China’ was built with a lot of help from American technology firms, I don’t worry. The Fix is in.
    Living off the “grid” may not be very easy, but it appears to be the fate chosen for the Deplorable class in America. The basic idea is age old. Wells foresaw it and wrote about it in his “The Time Machine.”
    How’s the textile book coming?

    1. Generalfeldmarschall von Hindenburg

      I think it was Gordon White (occult podcast dude & author) who came up with the term, ‘world without sin’. That’s what’s being prepared. If your car detects you going a tick over the posted limit, the fine will automatically be deducted from your account. The Meddling Mothers of the 1980s gave us a moral panic that was probably devised in the planning rooms of Big Insurance. While it’s all well and good to have sanctions for dangerous behavior, this has gone well beyond that.

    2. Stadist

The USA, and many other western countries, already have credential systems similar to the Chinese Social Score. Criminal records and credit scores act in a very similar manner; the only difference is that the Chinese system apparently aims to collect everything in one database, while the American system is essentially a privatized, decentralized version where the information sits in different databases but is accessible for purchase. Whether the information is aggregated into a single database or spread across different databases that are still easily accessible is only a semantic distinction if the practical end result is the same.

      1. ambrit

        The unique danger from the Chinese version of the “Permanent Record” is that in their version, a close to unitary political class controls the implementation. In the West so far, the implementation is competed for among loosely allied elites, plural. I realize that it can be seen as a matter of semantics, but it is a crucial difference.

  2. chuck roast

    I’ll make a long and continuing story short here…
    I recently made application to a local 501C3 for a residential rental. Their primary business is renting and maintaining historical residential units. They wanted my social security number for screening purposes, but I refused to give it to them.
    I did the following: filled out the application, gave them the application fee, gave them a monthly statement from my financial provider demonstrating solvency, gave them an entirely unsolicited letter from former and current landlord, gave them my professional CV, showed them a computer program called Whitepages widely used by landlords that requires no SS#. It determined that I had no criminal record, liens, foreclosures, or legal issues outstanding. In addition, three locals (including my landlord) visited the 501C3 offices in support of my application. Not only did they reject my application, they lifted my membership and kicked me out of the organization. All because I refused to give them my SS# on the grounds of privacy.
    It is extremely unwise of them to screw around like this with a committed class warrior.
    My next step is to get a legal case brief prepared on Privacy and Housing Discrimination.
    Onward!

    1. ambrit

      Nail them to the wall! I found out that Social Security will not let you have access to your online records, (I try not to use words like access as verbs,) if you have a ‘freeze’ on your name at the credit reporting agencies. Face to face is then required.
      I’m beginning to wonder about the feasibility of creating “Burner Identities” for daily use.

    2. Late Introvert

      “It is extremely unwise of them to screw around like this with a committed class warrior.”

      Nice. Good luck. That sounds like a battle that you can win and will thoroughly annoy and deflate people who don’t know the law very well. But given USA! USA!, maybe not. Either the judge will be corrupt or they (the 501C3) will blame libruls and vote Trump harder.

    4. danny

      @Chuck roast,
      You’d lose that battle in California. I presume that means you’d lose it in every other state.

Landlords can require you to provide a social security number to run a credit check. Smaller mom-and-pop outfits may be a bit looser with their application rules, but larger organizations generally aren’t. Consistency in process and requirements is a best practice that helps mitigate legal risk to such an organization. If they accepted other forms of information in lieu of a credit check from you, they’d effectively need to do the same with every other applicant.

      Save your money and time.

  3. Ken

    … breathalyzers to measure the concentration of alcohol in the blood. One cannot refuse to take such a test in any state….

    Not exactly. Washington State RCW 46.20.308: Prior to administering a breath test pursuant to this section, the officer shall inform the person of his or her right under this section to refuse the breath test, and of his or her right to have additional tests administered by any qualified person of his or her choosing as provided in RCW 46.61.506.

    But…who knows the possible problems with the breath analyzer??? I do know one person who knows that their body chemistry will trip the breath test but show clear on a blood test. Or so he says….

    1. Michael Quinlan

https://www.irishtimes.com/news/girl-s-drink-drive-study-helps-breathe-new-life-into-science-1.1232935
I can’t find whether anyone investigated her findings or if it was brushed under the carpet.

  4. smoker

Looking at that linked Consumer Education Foundation (CEA) June report pdf file, the letter lists HireVue as one of the Data Analytics Companies Generating Secret Surveillance Scores. That’s the same company linked a week and a half ago, on Links/October 23: A Face Scanning Algorithm Increasingly Decides Whether You Deserve the Job Washington Post.

    Surprise! So much for HireVue’s implication that candidate culling was a matter of unbiased AI Psychological assessment:

    Nathan Mondragon, HireVue’s chief industrial-organizational psychologist, told The Post the standard 30-minute HireVue assessment includes half a dozen questions but can yield up to 500,000 data points, all of which become ingredients in the person’s calculated score.

    Perhaps those interview podcasts are actually secondary to HireVue’s main business.

    As someone who has very recently been slandered with an utter lie on one of the criminal online reputation score sites (I highly suspect by some powerful for profit ‘healthcare’ entity I’ve locked heads with over their criminal behavior and someone’s wellbeing), and can’t afford to do, nor have the time to do, a thing about it, I hope they’re all locked up at the end of the day. I am absolutely positive that millions have been thrown into utter terror (I was) – reputations destroyed – about what unfounded slander these sleazy, immoral reputation brokers can perpetrate on someone they’ve never even known or communicated with.

    That Congress (particularly Silicon Valley/Bay Area Reps), which should have done something about this decades ago, has any ratings of confidence at all, is amazing. Meanwhile, the suicide rates keep exploding, that’s what happens many times when a person’s life is destroyed, particularly when the destroying factor was an utter lie and they have no affordable recourse.

  5. HotFlash

There’s so much information in the metadata of our shopping lives that lays our lives bare. Think you might be redlined b/c you bought Blue Magic Conditioner? And don’t forget that reporters have been tipped off to major govt actions by pizza orders. WaPo (1998) here; more instances of the info you can get from DC pizza orders are here.

Re the algos themselves, I would have to point out that Google’s promise of targeted ads is specious at best. I went online over the weekend looking for a replacement for my broken crockpot liner, and have since been hounded by companies that think I want parts for commercial food prep machines (they really, really want to sell me Hamilton Beach spigots, of all things), including on NC. I clicked on a few of them, to 1.) screw up their algos even more, 2.) give Yves a few denarii, and 3.) purely out of spite. If they can’t get that right, I hesitate to think how accurate their other analyses — social? political? — are.

    And finally, to all those people who assure me that if I have done nothing wrong, then I have nothing to hide, I respond, “I am sure Anne Frank’s parents didn’t think they had done anything wrong, either.”

    1. ambrit

      I quote the line that Bogart uses to the District Attorney in the film “Maltese Falcon”; “Everybody has something to conceal.”
      Errol Morris’s film “The Thin Blue Line” is an excellent primer on the contortions that ‘Truth’ is forced to take.
      A Louisiana State copper once told my wife, after she had tried to talk him out of giving her a ticket, “Maam. We have a ticket for every occasion.”
      The Fix is in.

