A Florida Controversy over Intelligence-Led Policing or is it Childhood Social Credit Scoring?


One Florida county has created its own version of Chinese Social Credit Scoring—a rating system for American children.  Florida government agencies collect data on children such as abuse histories, intelligence, grades, GPA, health information, and attendance records.  A local government agency then uses that data in a scoring model to identify, target, and reportedly harass its young citizens.

The Pasco County, Florida Sheriff’s Office intelligence-led policing (ILP) manual explains how it identifies youth at risk of falling into a “life of crime.”  A model is described where children are scored for having been victims of emotional or physical abuse—a score that increases the chance they will be identified and targeted by deputies; twice victimized.  This sounds eerily like the dystopian Chinese Social Credit Scoring (SCS) system.

Who is in This Law Enforcement System?

Children who are not “at risk” appear to have their identity fed into a law-enforcement system.  The ILP manual states:

“…we take the active rosters for each school in the county and match each student with data from the schoolboard’s early warning system (EWS), our records management system (RMS), and DCF’s Florida Safe Families Network (FSFN). Students who are on-track across all categories are removed from the analysis” [emphasis added]

If a student is not at risk, is their data still fed into a law-enforcement database and only later removed from the analysis? Does the “on track” child’s identity remain in a law-enforcement system? Does this mean every child in that public school system has a profile with law enforcement?
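The manual’s own wording suggests the order of operations: every student on a roster is matched against the databases first, and “on track” students are removed only afterward. A minimal sketch makes the concern concrete—every name, field, and flag below is a hypothetical illustration, not the Sheriff’s Office’s actual code or schemas:

```python
# Hypothetical sketch of the roster-matching step the ILP manual describes.
# All dataset names, fields, and values are invented for illustration.

school_roster = [
    {"student_id": 1, "name": "Student A"},
    {"student_id": 2, "name": "Student B"},
]

# Hypothetical per-student match results from the school board's EWS,
# the Sheriff's RMS, and DCF's FSFN.
flags = {
    1: {"ews": False, "rms": False, "fsfn": False},   # on track everywhere
    2: {"ews": True,  "rms": False, "fsfn": False},   # flagged in EWS
}

def at_risk(student):
    """A student stays in the analysis if ANY source flags them."""
    return any(flags[student["student_id"]].values())

# Note the sequence: the ENTIRE roster is matched against the databases
# before anyone is removed; "on track" students are dropped from the
# *analysis*, which says nothing about whether their matched records persist.
analysis_pool = [s for s in school_roster if at_risk(s)]
print([s["name"] for s in analysis_pool])  # ['Student B']
```

The point of the sketch is the sequence, not the details: removal “from the analysis” happens after every child’s identity has already been run through law-enforcement systems.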

Who is Labeled “at risk”?

According to the Tampa Bay Times (TBT), a list of at-risk children is created from “the rosters for most middle and high schools in the county with records so sensitive, they’re protected by state and federal law.” The Tampa Bay Times explains that neither parents nor children are notified when a child is added to the list.

According to the manual, an “at risk” mark comes easily in the Educational Risk Category: three days absent in one quarter, or a single “D”.  It appears that a straight-A student who traveled for a three-day family vacation would also be marked “at risk”.
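The trigger is simple enough to express in a few lines. This is a sketch of the rule as the manual states it (the function name and grade representation are my own, and whether an “F” also counts is an assumption, since the manual’s quoted threshold is “one ‘D’”):

```python
def educational_risk_flag(days_absent_quarter, grades):
    """Sketch of the manual's Educational Risk trigger: three days absent
    in one quarter, or one "D", flags the student as "at risk".
    Counting "F" as well is an assumption, not quoted from the manual."""
    has_low_grade = any(g in ("D", "F") for g in grades)
    return days_absent_quarter >= 3 or has_low_grade

# A straight-A student who missed three days for a family vacation:
print(educational_risk_flag(3, ["A", "A", "A", "A"]))  # True -> "at risk"
```

Note that the rule has no notion of context: an excused absence, a family trip, and chronic truancy all count the same three days.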

Florida Politics described a Pasco County Sheriff’s Office Facebook post as a “lengthy rebuttal”.  The Sheriff’s Office wrote, in part:

“Contrary to the [TBT] report, ILP is not a futuristic, predictive model where people are arrested for crimes they have not yet committed. Instead, the system is based SOLELY on an individual’s criminal history.” [emphasis added]

It seems the Pasco Sheriff’s Office mischaracterized what the Tampa Bay Times reported. Not a single instance could be found where the Tampa Bay Times said “people are arrested for crimes they have not yet committed.”  Harassment, interrogation, and targeting are not the same as people being arrested.

The ILP model used to identify at-risk youth doesn’t have to be “futuristic” to be wrong.  The ILP program reminds many of George Orwell’s novel 1984, written in 1948 and published in 1949.  What seemed “futuristic” then is now easily within our technological ability—there is nothing futuristic about the program, but it is dystopian.

The Pasco Sheriff’s Office doubles down in the next quote from Facebook, bringing attention to another point: the claim that “SOLELY” criminal history is used in the model. 

“Let us, again, be profusely clear that this model is based SOLELY on an individual’s criminal history. It is nameless, faceless, ageless, genderless and removes ALL identifying factors of an individual, EXCEPT for their criminal history.” [emphasis added]

The Pasco Sheriff’s Office Intelligence-Led Policing Manual (Rev. 01/2018, pp. 70–74) specifically states that the educational risk factors “indicative of a juvenile at-risk for developing into a chronic recidivist offender” include grades, GPA, credits, office discipline referrals, and days absent.  “Recidivist” is another way of saying habitual criminal.

Why are grades, GPA, and days absent noted as risk factors for labeling a child as a “juvenile at-risk for developing into a chronic recidivist offender” if the model (and ILP system) is based “SOLELY on an individual’s criminal history”? When did GPA become part of one’s criminal history?

The Pasco Sheriff’s Office clarified that the model removes “ALL identifying factors” and that “It is nameless, faceless, ageless, genderless.”  This might be relevant for creating an unbiased model for political purposes, but it means little with respect to ethics or privacy.

Privacy of Sensitive Data

How is an individual identified if they are “nameless”? Are model results stored without identifying factors to protect the privacy of children and families? How are the data and their integrity protected in a “spreadsheet”?

Are “identifying factors” different from personally identifiable information (PII)?  PII would likely include name, date of birth, Social Security number, phone numbers, relatives, ZIP code, and much more.  De-identifying datasets is so difficult that technical papers and standards are written on procedures to achieve this goal (see documentation at HHS.gov, NIST.gov, Ed.gov, and the Future of Privacy Forum).

Does the Pasco County Sheriff’s Office understand the technology and how easily data can be re-identified even when names, Social Security numbers, birth dates, etc. are excluded?  Maybe they were just referring to how output is presented in a politically correct manner, and not to the security of the stored datasets.
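Re-identification is usually just a join on the “harmless” fields that survive de-identification—often called quasi-identifiers. The sketch below uses invented records; the mechanism, not the data, is the point: when a “nameless” dataset and a public roster share fields like birth date, gender, and ZIP code, the names come right back.

```python
# Sketch of quasi-identifier re-identification. All records are invented.

deidentified_scores = [
    {"birth_date": "2005-03-14", "gender": "F", "zip": "34653", "risk_score": 87},
]

public_roster = [
    {"name": "Jane Doe", "birth_date": "2005-03-14", "gender": "F", "zip": "34653"},
    {"name": "John Roe", "birth_date": "2006-07-01", "gender": "M", "zip": "34655"},
]

QUASI_IDS = ("birth_date", "gender", "zip")

def reidentify(scored, roster):
    """Link 'anonymous' records back to names via shared quasi-identifiers."""
    matches = []
    for record in scored:
        key = tuple(record[q] for q in QUASI_IDS)
        for person in roster:
            if tuple(person[q] for q in QUASI_IDS) == key:
                matches.append((person["name"], record["risk_score"]))
    return matches

print(reidentify(deidentified_scores, public_roster))
# [('Jane Doe', 87)]
```

This is why the standards bodies cited above treat de-identification as an engineering discipline: removing the name column removes almost nothing if the remaining combination of fields is unique to one child.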

In December, the Student Privacy Compass wrote “the Sheriff’s Office’s current data practices violate not only its contract with the school board but also the privacy protections required by the federal education privacy law, FERPA.” They also explain:

“Law enforcement and SROs should not be able to self-assert that their conduct is aligned with federal and state student privacy laws. They must receive robust student privacy training from qualified professionals…”

But will the State of Florida listen? This is not the first time they have heard this.  In September 2019, I wrote several U.S. legislators (with Governor DeSantis and Senators Rubio and Scott on copy) advising the same:

“School districts that share student data with third parties, for any purpose, should be required to employ a full time credentialed CyberSecurity and Student Privacy Expert. Each state should have a Medical and Education CyberSecurity and Privacy Commissioner. School districts must be required to train all employees in FERPA, COPPA, and PPRA. Each state must have a complaint and resolution process for medical and student privacy violations…”

Where Does a Childhood Rating System Lead?

Those who are skeptical should consider what has already happened: the College Board was sued last year for selling student data, produced an “adversity” scoring model based on student data, and has been found sharing student data yet again. In Pasco County, FL, students are being scored and targeted, and reportedly feel harassed by deputies who claim they are using “objective data”.  How much can we trust what government does with our data when the State of Florida was caught selling driver’s license data?  FortifyFL, a mobile application required and funded by Florida law, collects data about students in true communist fashion—encouraging students and the community to report on each other.  Data collected in FortifyFL is relayed to the “appropriate public safety agencies and school officials”.

A Florida safety law, passed by the legislature in 2018, requires data be aggregated from at least the following sources: “(a) Social Media; (b) Department of Children and Families; (c) Department of Law Enforcement; (d) Department of Juvenile Justice; and (e) Local law enforcement.” The Florida law also requires “data analytics resources”.  Did the law in any way limit the kind of “analytics” or programs that could be used to derive conclusions about an individual?  Is artificial intelligence (AI) allowed to help flag children?  No limitations were placed on what kind of analytics products are procured under the legislation.

A tenet of our justice system is that we are innocent until proven guilty.  Yet children and adults are now susceptible to having their social standing, mental health, and lives destroyed by analytically generated warnings that could be just as damaging as jail time.  Innocent until proven guilty should also extend to a dehumanizing social credit score.

What is the tendency to over-rely on and abuse scoring systems? The people of China know the answer to that question.  China’s citizens can be blacklisted based on a social credit score—potentially losing the ability to fly, ride a train, or gain “…entry into universities because of their parents’ bad social credit scores.”  According to The Washington Times, China’s “Communist Youth League” has devised an app to score users, but “the app is also being used by China’s leaders to spy on the political leanings and other activities of the users. Those who post anything critical of the party will lose points.” The Washington Times also explains:

“In the past, the party relied on a system called “dongan” or personal file — millions of dossiers on citizens filled with personal information ranging from comments made in high school to remarks made to coworkers.”

“The heart of the SCS is the more than 200 million video cameras… all networked to vast stores of personal data sifted by increasingly advanced data mining software.”

A recent Guardian article reported that China’s predictive policing led to the internment of Muslim minorities for “transgressions that include simply being young, or speaking to a sibling living abroad”.  China’s massive database, Integrated Joint Operations Platform (IJOP), “includes information ranging from people’s physical characteristics to the colour of their car and their personal preference of using the front or back door to enter their house…”

Pasco County’s intelligence-led policing of children reeks of hidden dossiers containing created data (e.g., scoring) that may not be true or accurate.  How many other Florida counties are using similar techniques?

Using predictive tools to label an individual child “at-risk for developing into a chronic recidivist offender” based on statistical modeling or predictive scoring is downright sickening and should alarm all citizens and parents.  The policies, laws, and agreements with school districts that have allowed and enabled this kind of unethical activity must be reversed.  This kind of activity is not American; it is communism.
