What’s Your Threat Score?

Yves here. Pre-crime has arrived. And if you think it’s hard getting your credit record corrected, that’s a piece of cake compared to fixing errors in your threat score. Plus it’s not hard to imagine how threat scores could be used to deter certain types of activity, say, visiting certain parts of the world or watching documentaries deemed to be radical.

By Sarah Burris. Originally published at Alternet

Police have found a new way to legally incorporate surveillance and profiling into everyday life. Just when you thought we were making progress raising awareness surrounding police brutality, we have something new to contend with. The Police Threat Score isn’t calculated by a racist police officer or a barrel-rolling cop who thinks he’s on a TV drama; it’s a computer algorithm that steals your data and calculates your likelihood of risk and threat for the fuzz.

Beware is the new stats-bank that helps officers analyze “billions of data points, including arrest reports, property records, commercial databases, deep Web searches and…social-media postings” to ultimately come up with a score that indicates a person’s potential for violence, according to a Washington Post story. No word yet on whether this metadata includes photos and facial recognition software. For example, would an ordinary person, yet to commit a crime, be flagged when seen wearing a hoodie in a gated Florida community?
The company tries to paint itself as a savior to first responders, claiming it wants to help them “understand the nature of the environment they may encounter during the window of a 911 event.” Think of it like someone pulling your credit score when you apply for a job. Except in this instance, you never applied for the job and they’re pulling your credit score anyway because they knew you might apply. It’s that level of creepiness.

Remember the 2002 Tom Cruise movie Minority Report? It’s set in 2054, a futuristic world where the “pre-crime” unit arrests people based on a group of psychics who can see crimes before they happen. Only, it’s 2016 and we’re not using psychics, we’re using computers that mine data. According to the Post piece, law enforcement in Oregon are under federal investigation for using software to monitor Black Lives Matter hashtags after uprisings in Baltimore and Ferguson. How is this new software any different? In fact, this is the same kind of technology the NSA has been using since 9/11 to monitor online activities of suspected terrorists—they’re just bringing it down to the local level.

According to FatalEncounters.org, a site that tracks deaths by cop, there were only 14 days in 2015 in which a law enforcement officer did not kill someone. So, leaving judgment up to the individual hasn’t been all that effective in policing. But is letting a machine do it any better? Using these factors to calculate a color-coded threat level doesn’t seem entirely practical. Suppose a person doesn’t use social media or own a house but was once arrested when he was 17 for possession of marijuana. The absence of data might lend itself to a high threat level. The same can be said for online metadata that might filter in extracurricular interests. Could a person who is interested in kinky activity in the bedroom be tagged as having a tendency toward violence?

The Fresno, Calif. police department is taking on the daunting task of being the first to test the software in the field. Understandably, the city council and citizens voiced their skepticism at a meeting. “One council member referred to a local media report saying that a woman’s threat level was elevated because she was tweeting about a card game titled ‘Rage,’ which could be a keyword in Beware’s assessment of social media,” the Post reported.
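The council member’s anecdote illustrates how context-free keyword matching produces false positives. Intrado does not disclose how Beware actually works, so the sketch below is purely hypothetical: the keyword list, weights, and thresholds are all invented for illustration, not drawn from the product.

```python
# Hypothetical sketch of naive keyword-based threat scoring.
# Beware's real algorithm is secret; every keyword, weight, and
# threshold here is made up to show the context problem.

VIOLENT_KEYWORDS = {"rage": 10, "kill": 25, "gun": 15}

def score_posts(posts):
    """Sum keyword weights across posts, ignoring all context."""
    total = 0
    for post in posts:
        for word in post.lower().split():
            total += VIOLENT_KEYWORDS.get(word.strip(".,!?'\""), 0)
    return total

def color(score):
    # Arbitrary cutoffs mapping a number to Beware-style colors.
    if score >= 25:
        return "red"
    if score >= 10:
        return "yellow"
    return "green"

# A tweet about the card game trips the same rule as a real threat:
gamer = ["Anyone up for a game of Rage tonight?"]
print(color(score_posts(gamer)))  # prints "yellow" -- a false positive
```

The point of the sketch is that nothing in it can tell a card game from a threat; the match is on the string, not the meaning.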

While you might now be rethinking playing that Mafia game on Facebook, it isn’t just your personal name that can raise a flag. Fresno Councilman Clinton Olivier, a libertarian-leaning Republican, asked for his name to be run through the system. He came up as a “green” which indicates he’s safe. When they ran his address, however, it popped up as “yellow” meaning the officer should beware and be prepared for a potentially dangerous situation. How could this be? Well, the councilman didn’t always live in this house; someone else lived there before him and that person was likely responsible for raising the threat score.
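Councilman Olivier’s green-person/yellow-address split suggests the score is keyed to an address’s incident history, which never resets when the occupants change. Again, Beware’s internals are secret; this is a hypothetical sketch of that failure mode, with every address and threshold invented:

```python
# Hypothetical: an address-keyed score that never resets when
# residents move. Nothing here reflects Beware's actual design.

from collections import defaultdict

incident_log = defaultdict(list)  # address -> list of past incidents

def record_incident(address, kind):
    incident_log[address].append(kind)

def address_color(address):
    # Arbitrary cutoffs: the count of historical incidents at the
    # address decides the color, regardless of who lives there now.
    n = len(incident_log[address])
    if n >= 5:
        return "red"
    if n >= 1:
        return "yellow"
    return "green"

# A prior resident generates the history...
for _ in range(3):
    record_incident("123 Elm St", "disturbance")

# ...and the flag follows the house, not the person living there today.
print(address_color("123 Elm St"))  # prints "yellow"
```

Absent a mechanism that ties incidents to people rather than parcels, every new occupant inherits the previous occupant’s score.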

Think what a disastrous situation that could be. A mother of a toddler could move into a new home with her family, not knowing that the house was once the location of an abusive patriarch. The American Medical Association has calculated that as many as 1 in 3 women will be impacted by domestic violence in their lifetimes, so it isn’t an unreasonable hypothetical. One day the child eats one of those detergent pods and suddenly the toddler isn’t breathing. Hysterical, the mother calls 911, screaming. She can’t articulate what has happened, only that her baby is hurt. Dispatch sends an ambulance, but the address is flagged as “red” for its prior decade of domestic violence calls. First responders don’t know someone new has moved in. The woman is performing CPR while her husband waits at the door for the ambulance. What happens when the police arrive?

It’s a scenario that can be applied to just about any family and any situation. Moving into an apartment that previously was a marijuana grow-house; buying a house that once belonged to a woman who shot her husband when she found him with his mistress in the pool. Domestic violence calls are among the most dangerous for police officers. Giving police additional suspicion that may not be entirely accurate probably won’t reduce the incidents of accidental shootings or police brutality.

The worst part, however, is that none of these questions and concerns can be answered, because Intrado, the company that makes Beware, doesn’t reveal how its algorithm works. Chances are slim that they ever will, since it would also be revealed to its competitors. There’s no way of knowing the accuracy level of the data set given in the search. Police are given red, yellow or green to help them make a life-changing or life-ending decision. It seems a little primitive, not to mention intrusive.

“It is deeply disturbing that local law enforcement agencies are unleashing the sophisticated tools of a surveillance state on the public with little, if any, oversight or accountability,” Ryan Kiesel of the Oklahoma ACLU told me. “We are in the middle of a consequential moment in which the government is unilaterally changing the power dynamic between themselves and the people they serve. If we are going to preserve the fundamental right of privacy, it is imperative that we demand these decisions are made as the result of a transparent and informed public debate.”

While mass shootings are on the rise, violent crime and homicides have fallen to historic lows. You wouldn’t know that watching the evening news, however. Is now really the time to increase the chances of violent actions at the hands of the police, all while intruding on our civil liberties under the guise of safety?


47 comments

  1. Clive

    At my TBTF, we were “treated” earlier this week to a death-by-Powerpoint presentation about how the Risk function has got all sorts of wonderful metrics about how this- or that- factor was present in employees who committed fraud. Things like age, sex, length of service, job tenure, region, role and so on. They didn’t say, but I’d bet that ethnicity, marital status and number of dependants as well as income and indebtedness are also tracked and similarly scored.

    I was horrified at this on several levels. Firstly, this sort of data-as-profiling is pseudoscience. The trap the researchers fall into is that, just because an individual hits the right (or in this case, the “wrong”) score absolutely does not mean they are going to actually commit any wrongdoing. It’s a big-data version of guilt by association.

    This was my minor (by comparison) gripe. My big problem with it was, the crimes being targeted were crude “hand in the till” sorts of pilfering. Cashiers embezzling a few thousand, loan officers writing fraudulent business to non-existent borrowers — that sort of thing. White collar crime, but pretty unsophisticated.

    As we NC readers know all too well the real wrongdoing happens in the C-suite where customers, investors and taxpayers are scammed out of hundreds of millions or billions. You can be sure that the perps of those crimes won’t appear on any behavioural proclivity heuristic analysis exercise.

    1. craazyman

      you better watch it Clive. Posting comments on NC is a BIG RED (no pun intended) FLAG. In fact, you’re probably already guilty of something, no doubt. Probably “Refusal to be Vigilant about the Probability of Perpetration in the Workplace”. If you have long hair and a mustache then your score just hit 10000. But you probably don’t. I bet you’re clean cut like me. We’re safe for now, but we have to keep the barbershop visits on a regular schedule and shave at least every 3 days.

        1. Clive

          The worst part was, everyone else was ooh-ing and ahh-ing about how this was really clever stuff and generally A Good Thing. If they considered it otherwise, they kept it quiet. Yves did a great summary and definition of self-censorship a little while back but I can’t remember which post it was on so can’t plagiarise quote it here, but if they viewed this as the kind of menacing, sinister development I took it to be, they hid it especially well. The audience was a load of “data guys” (they were almost all guys, which probably didn’t help) from various walks of life in a TBTF (mostly in bulk payment transaction monitoring and credit risk modelling) so it was a case of preaching to the converted.

          But really, yes, we are sleepwalking our way into totalitarianism.

          1. fajensen

            The worst part was, everyone else was ooh-ing and ahh-ing about how this was really clever stuff and generally A Good Thing.

            Most IT people that I know have authoritarian tendencies to a varying degree. I think it comes from all the data-flow modelling they do. People become a little bit like data to them, something that programmers must do something to or with, something that sucks resources, must be managed because it doesn’t do anything useful on its own, and must be padded or pared down to fit the internal design of the machine the programmer has, because otherwise the processing is inefficient (those 8-bit reads really kill 128-bit reads and murder cache efficiency ;-).

          2. rfdawn

            The new “because database” unreasoning is here to assist the existing “because markets” rule. My home address has two variants that will live forever in (at least) two commercial databases. You might think that one of those variants must be wrong. But you would be wrong. Databases are always right, even when they contradict each other about the simplest facts. Resistance is futile.

      1. Clive

        The thought had occurred to me! Even talking to the likes of you probably whacks my pre-crime FICO score up by a few points. I’d better read a few Daily Mail articles to try and reset the baseline. If that doesn’t work, I can always watch Fox News for half an hour. Fortunately, if I try to grow a moustache I end up looking like an escapee from the Village People so that’s never been an issue. But you’re right, I think a visit to the barbershop would still be a wise precaution.

        1. scott

          Having a Harper’s subscription for a decade has me on so many liberal mailing lists ….posting immunity?

        2. Praedor

          I think what you need is to get a web history going of regular visits to Faux News websites and a subscription to the Wall St Journal. We all know that people that frequent these sites or read that rag NEVER commit crimes…at least, none that the government is willing to do anything about (*COUGH* Goldman-Sachs, Bank-o-Murica, the Bundy’s *COUGH*).

            1. OpenThePodBayDoorsHAL

              We’ve had pre-crime “enforcement”, not just monitoring, for years, under Clinton, Bush and then turbocharged under Obama. Should be termed “enforcement with extreme prejudice” as the administration of this pre-crime “justice” comes in the form of HawkEye missile explosions, with a fully 4% accuracy rate in applying said justice to the correct projected future criminal (talk about holding the scales while wearing a blindfold…).
              Maybe Raytheon can develop a system where the missile hovers for a moment and reads a summary execution notice by loudspeaker to the client before blowing him/her to a red mist (that includes anyone unlucky enough to be standing nearby). Due process!
              But they hate us for our freedoms…really, no, no, really..

              1. fajensen

                A disturbing lack of Faith – Everyone around any strike location are “Enemy Combatants”, You Know This to be True because The Great Obama made it thus.

                … of course us hicks would think that someone was painting bulls-eyes around every shot to fix the numbers, but, we are not lawyers so what do we know about justice?

      2. craazyboy

        I’m worried blog handles could be a problem too. I’m considering changing mine to

        Obedient_Buttkisser_Of_The_Glorious_Authoritarianism

        That might help my score.

        But I’ll still have to stop saying bad things about ZIRP and QE. And Republicans. And Hillary!. And the MIC. And Corporations. And OCare. I guess we can still talk about the weather. Weather isn’t climate, so that should be ok. Or that the Green Bay Packers beat the Redskins. Sports is always a safe topic. Plus football fans should rate high in non-violence scoring. Maybe even half as high as gun owners.

        1. craazyman

          keep your hair short, shave and watch football on Sunday. If you watch baseball too, you should be fine.

          If Bernie Sanders wins the presidency, you can read George Orwell books and talk about them in public. Go Bernie!

          On a serious note, sometimes good things happen. I recall for years potential employers would ask for your credit report if you were interviewing for a job. That always incensed me. What the hell business is it of theirs? What kind of derisive pile of fetid scorn is that to heap on a “job seeker”? The other day I was waiting for the subway train and saw an advertisement poster there on the platform, a public service ad, it’s now illegal in New York for a potential employer to ask for your credit report, evidently, and the ad had some strong language about how wrong it was when they could. That kind of made me smile. Actually, it did.

        2. giantsquid

          Best avoid talk of the Packers. They’re owned by the People of Green Bay and a non-profit to boot, the only non-profit, community-owned professional team in the U.S.A. And I’m fairly certain they wear pinko long johns under those ecoterrorist green uniforms.

      3. GuyPatterson

        Just wanted to let you know your comments routinely make my day (perhaps even my life, draw your own conclusions about my life and trajectory)

  2. Gabriel

    Something I picked up from Andrew Cockburn’s “Kill Chain” is that a very attractive feature of boondoggles involving “algorithms” (btw, Criswell predicts: “neural networks” will follow when “algorithm” loses its shine) is that any problems or scandals only serve to create continued demand for the “algorithm-making sector” that produced the garbage in the first place. If, say, some high schooler gets tasered because of the “Intrado” corporation’s algorithm flagging him as an imminent shooter, this’ll just create more demand for algorithm-making, either by Intrado or a competitor, since clearly the flawed version cannot be left in place. The more flawed the original product, in fact, the greater necessarily will be the demand for future revisions—there’s “no downside”, as they say.

  3. digi_owl

    “One council member referred to a local media report saying that a woman’s threat level was elevated because she was tweeting about a card game titled ‘Rage,’ which could be a keyword in Beware’s assessment of social media,” the Post reported.

    No surprise there. These kinds of systems suck at context.

  4. Synoia

    Do the algorithms include profession, and professional history?

    Would they predict (snark) unlawful violence by a Police Officer or Police Dept? Or corruption by one of our ever-so-pure elected officials?

    I can see a self-inflicted wound in this profiling. Independent AI with full data access might just have some significant benefits.

    1. scraping_by

      When the subject is AI, remember its boosters always talk about the Intelligence, while those who have to live with it deal with the Artificial part.

      All attempts to replace messy, expensive humans with cheap, clean electronics will result in getting less. The present example of using a database to reproduce cops who know the people in their beat and judge them on experience is yet another false economy.

  5. fresno dan

    According to FatalEncounters.org, a site that tracks deaths by cop, there were only 14 days in 2015 in which a law enforcement officer did not kill someone….THAT WE KNOW ABOUT
    FIFY

    Oh, and originally being from Fresno, I of course have the tattoos and electronic implants for tracking purposes already in me – of course, this was years ago before transistors, so they just sewed a 1962 RCA radio onto me. What makes it so bad is that it doesn’t even have FM…

  6. cnchal

    The other day, optimander left a great link to this article: http://www.bloomberg.com/graphics/2015-paul-ford-what-is-code/ Out of its 38,000-plus words, this paragraph explains the economics of it:

    Data management is the problem that programming is supposed to solve. But of course now that we have computers everywhere, we keep generating more data, which requires more programming, and so forth. It’s a hell of a problem with no end in sight. This is why people in technology make so much money. Not only do they sell infinitely reproducible nothings, but they sell so many of them that they actually have to come up with new categories of infinitely reproducible nothings just to handle what happened with the last batch. That’s how we ended up with “big data.” I’ve been to big-data conferences and they are packed.

    Lots of innocent people are already being killed by police and this garbage barge threat assessment software will ensure that ever more are killed. The gullibility of police chiefs to fall for these sucker deals is astounding. No respect for any tax dollars from them.

    Has the time arrived when calling for budget cuts to police departments puts one on their threat list?

  7. jfleni

    The best way to deal with this “yuppie-nerd” nonsense is to post and repeat “Expletive Deleted” in the bluntest and rudest way possible when you find it. Even the dullest will get the message!

  8. Thure

    Wow!

    Predictive analysis based on an opaque sample set of de-contextualized data.

    Now, that’s real bizarre fortune telling and a recipe for self-fulfilling prophecies. This is very extreme profiling and ripe for all kinds of abuse.

    What happened to our Constitutional rights regarding probable cause and unlawful search and seizure?

    How can any of this ever hold up in court? It would seem the police would have to expose the algorithms and data that led to the action.

    1. perpetualWAR

      Have you been in our courtrooms lately?
      If not, you’d be surprised by how little upholding “the law” matters there.

      1. Jim Haygood

        “Criminal justice today is for the most part a system of pleas, not a system of trials.”

        — “Justice” Anthony M. Kennedy, May 2012

        There is no airing of evidence to “hold up in court” any more. That’s a scene out of a Norman Rockwell painting.

        These days, you either plead out to a lesser charge (over 90% do), or you face a litany of piled-on charges and the maximum sentence if convicted, for having wasted the conviction machine’s precious time with the tedious anachronism of a “trial.”

        The Sixth Amendment — “the right to a speedy and public trial, by an impartial jury” — is stone dead, killed by Anthony M. Kennedy and his fellow hacks in black.

      2. tegnost

        Only people who’ve never been to court or been involved in court cases think that it’s all fair. All court does is arrange a new starting point after a problem (crime, divorce, injury). It’s a way for society to move on: this is the accepted story of what happened, this is the redress, it’s over, get on with your life. It’s not really about justice, even if justice may happen.

  9. Corey

    Well, at least we can be sure that the putative savings from the reduction in crime resulting from all this will be passed on to the consumer, right? Right?

    OK, I’ll go sit down now.

  10. flora

    This is pseudoscience, like physiognomy or phrenology. But, hey, it’s done by a computer and computers are never wrong. /s
    Computer code used as snake oil.

    1. scraping_by

      In the early 80’s, a company was created selling some financial product, annuities I think, and their only point of difference was the salesman would pitch using a Macintosh to show the mark, er, prospect the sales presentation. According to the advertising, that made them a ‘high tech’ financial company.

      But I assume the police have all grown too sophisticated to credit everything that shows up on a monitor as revealing mystical knowledge. Right?

  11. griffen

    Curious and even curios-er. Something about safety, something about liberty, that’s a quote running through my empty mind.

  12. OIFVet

    Good lord, I thought I had left 1980s Eastern Europe behind. This makes the Stasi look like a bunch of enthusiastic amateurs.

    1. fajensen

      STASI at least created gainful employment for millions of people, this “neo-STASI” leaves nothing on the table for anyone.

  13. JTMcPhee

    When one puts this, http://www.bloomberg.com/graphics/2015-paul-ford-what-is-code/, with this, https://en.wikipedia.org/wiki/Butlerian_Jihad, and with the content of this post and all the stuff that it illuminates a bit, and all the dreck that is AI (The NExt Big Thing?) and the wonderful wistfulness of so many for the advent of Skynet/IOT, maybe one understands what Frank Herbert was channelling when way back in the Hippie Days he coined up the notion of the Butlerian Jihad (and just typing the word, let alone posting it, likely adds more “threat points” to my avatar and persona and fragile little person as registered the hit pile in the Panopticonium…)

    Too bad nobody is coding a “vulnerability index” that will identify, assign strengths to and publish the degree to which we fokking stuopid humans, or various subsets of us, are vulnerable to the destabilizations and compressions and explosions and vast wastings that the “successful” among us are adding to every picofemtosecond of every day…Financialization, combustion, messing with genetic material, nanobots and “Terminator’ autonomous killing machines http://www.thedoctorwhosite.co.uk/dalek/, novel combinations of carbon atoms. playing with nuclear fireworks (at least there’s one wetware-based multi-variable index for that: http://www.cbsnews.com/news/doomsday-clock-moves-two-minutes-closer-to-midnight/)

    Excuse me, folks, time to go assume the position, http://www.gettyimages.com/photos/fetal-position?sort=mostpopular&excludenudity=true&mediatype=photography&phrase=fetal%20position

  14. flora

    Guess none of these “scientists” have read Stephen Jay Gould’s “The Mismeasure of Man.”
    Their algorithms are just a tarted up (with computers!) repetition of the old idea that certain out-groups of people with certain characteristics are inferior to and dangerous to the in-crowd. And they can prove it! Instead of measuring cranial diameter and capacity, or the distance between the eyes, or some other bunk, they now measure various factoids in data reports. Oddly enough, or not, I’m pretty sure the result is the same as those older techniques for proving inferiority, danger, and exclusion. The in-group looks for data that confirms the in-group’s status and the out-group’s exclusion; this time using not biologic determinism, but Big Data determinism. All in a seemingly unbiased analysis. The more things change, the more they stay the same.

    1. Clive

      That is an important point that gets overlooked. Often, these “advances” are just based on previously debunked old tropes and cliches dressed up in a modern, trendy technology garb. New wine in old bottles is the phrase that springs to mind.

      This may be the reason we don’t catch onto them quicker. As scams go, they’re quite clever.

  15. LongWTC7

    The seventeenth century philosopher Descartes, regarded as the founder of modern philosophy, gave expression to this primary error with his famous dictum (which he saw as primary truth): “I think, therefore I am.” This was the answer he found to the question “Is there anything I can know with absolute certainty?” He realized that the fact that he was always thinking was beyond doubt, and so he equated thinking with Being, that is to say, identity – I am – with thinking. Instead of the ultimate truth, he had found the root of the ego, but he didn’t know that. (from A New Earth, Eckhart Tolle)

    — Now the police don’t know how to think, so they use “big data” to create an illusion of thinking for them.

    How do you make people start to think, and more, to be? A side note: David Rostcheck and Dan Rather, among others, were the first to think, and to ‘be’ with the event, and they later un-think, and un-be :-) Their threat scores thus should come down, in the era of anonymous anthrax mailers.

Comments are closed.