In 2018 Florida passed legislation requiring the rollout of a reporting app called FortifyFL, developed by AppArmor, a Canadian company. The app encourages communist-style snitching on your neighbor, in this case students on students. In communist Cuba and the former East Germany (DDR), reporting your neighbor for activities like “participating in opposition movements” was encouraged and used against citizens.
There is no way to know how data submitted about your children through FortifyFL or other Education Technology (EdTech) apps at school will be put to use by Big Tech or authorities, now or in the future. The data entered into student reporting apps can be very personal and private. This app can record what someone else claims you said. “Your” words are stored in an online system and disseminated without you ever knowing.
The possibilities for misuse of collected data are abundant, as evidenced in this report about children being harassed in Pasco County, FL. The Sheriff’s Office claimed “its program was designed to reduce bias in policing by using objective data.” Was there a statistically significant reduction in bias? How was bias defined? Does harassment qualify as bias?
A 2015 Spiegel International article discusses the un-neighborly reporting that occurred in East Germany:
“No matter where one shared information, the state would put it to use. The East German reporting system kept track of the country’s citizens from kindergarten, throughout their working lives and even into retirement…Files were even kept on schoolchildren: ‘Wears Western clothes,’ ‘exhibits affinity for punk music,’ ‘demonstrates pacifist attitudes,’” and
“…they were totally normal citizens of East Germany who betrayed others: neighbors reporting on neighbors, schoolchildren informing on classmates, university students passing along information on other students, managers spying on employees…”
The East German reporting system sounds eerily like a combination of FortifyFL and Florida’s integrated database, which collects and stores student (and family) data from multiple sources. Many EdTech products, like Canvas, aim to store student data from kindergarten through post-graduate education. And if that isn’t enough, companies track their employees with monitoring and surveillance software.
Where does this end—in an Orwellian state, a Stasi-like state?
The Spiegel International article explains that many East Germans who felt their lives had been “de-railed” later learned that documents from factories and universities revealed that “making an ill-considered comment at the student union” or even simply lacking certain viewpoints could lead to one’s removal from university.
This kind of reporting on a neighbor’s activity can be subjective, and officials encouraging its use might mislead the public into believing they are contributing to the good of society. Reporting unverified suspicions into systems that can permanently document your behaviors, actions, beliefs, or hasty statements can get out of hand, as evidenced by the multitude of examples (e.g. revenge reports) in the Spiegel article. Like the data compiled in the East German system, FortifyFL likely does not produce positive data about Florida’s schoolchildren, and the data compiled is not even necessarily legitimate.
Those who naively assume data amassed by EdTech and government is safe underestimate risks like hacking or transmission abroad to foreign adversaries. The security of collected data, and where it all gets transferred, remains a mystery.
Do not trust EdTech companies any more than you would trust a pickpocket with your purse. I will take a missing wallet over the propagation of private data any day: I can cancel credit cards and get a new driver’s license, but I cannot make very personal information private again once it has been made public or shared with “trusted partners.” A policy might state that the data is stored in a secured location in the U.S., but who believes that, given the trustworthiness of tech companies and the lies they have told about our privacy?
Hacking and ransomware events creep through our data like kudzu, yet it seems we are not always told about those events, and I put the blame squarely on Florida law (another related topic).
Here is my common-sense interpretation of the frenzied collection and use of data: politicians and government agencies are hungry to rely on data, statistics, and modeling because doing so offers a refuge from responsibility. A politician can say, “The data told me to do it!” It becomes an out for every bad decision made. The data was wrong, the model was wrong, hackers infiltrated the database: there are myriad possible excuses for legislators and local politicians to point to in order to avoid being responsible for making their own common-sense decisions. Pointing to inanimate objects (computer models and data) is a fitting escape when no one wants to be held responsible for bad outcomes.
Florida Republicans supported the deployment of FortifyFL. Are Florida Republicans aligning with communist reporting methods? Florida legislators need to take a deeper look at the privacy and ethical problems with the public safety act they enacted. The privacy-invasive aspects of the law are not in the best interests of Florida citizens. Encouraging citizens to spy on each other, especially in such an overt way, is an anti-American activity.
Aside from the ethical implications of student-on-student reporting, and the schism and suspicion it can drive between fellow citizens, the app is reportedly problematic and has been misused by students (that’s a shocker). A St. Petersburg police commander identified the FortifyFL app as a driver of an increase in the Baker Acting of students in Pinellas County, FL. If you don’t know what Florida’s Baker Act is and how dangerous it can be for children, you should read about it in this bombshell article.
I objected to the use of FortifyFL the day after it was announced in 2018, especially given it was being pushed to students under 13. My questions directed to then Attorney General Pam Bondi about FortifyFL’s compliance with the Children’s Online Privacy Protection Act Rule (COPPA) went unaddressed on Twitter. I posted a copy of the FTC’s COPPA FAQ that states:
“…it is not sufficient to provide such notification and choice to the child user of a website or service. If the operator intends to collect geolocation information, the operator will be responsible for notifying parents and obtaining their consent prior to such collection.”
Weeks later, a news report by Elizabeth Djinis revealed that although FortifyFL was marketed as an anonymous reporting app during its rollout, it was not automatically anonymous upon installation: staying anonymous required the user to manually disable the app’s location services. Requiring a user (e.g. a child under 13) to disable location services is contrary to the COPPA FAQ quoted above.
According to the Djinis article, Florida Attorney General’s Office spokesperson Whitney Ray stated: “As an added precaution, this feature has been turned off and responding agencies will no longer receive this information.” The quote was not specific about which feature: the collection of location information, or the dissemination (after collection) of a child’s location information to responding agencies. Djinis did not clarify the spokesperson’s statement, yet the headline reads “Florida school security reporting app will no longer track location”—is the reporter’s headline correct?
COPPA’s FAQ states operators are required to get verifiable parental consent if location information is to be collected from anyone under the age of 13. In general, location information can be collected from an individual without being sent to responding agencies. It remains unclear to me whether geolocation functionality has been deactivated in the application, or whether verifiable parental consent (see linked item 4) is obtained before location data is collected. After downloading the app and toggling its location functions (without submitting a tip), it appears FortifyFL allows a user to turn location services on or off and submit a tip without any verifiable parental consent. Despite the non-specific comments quoted from the former Florida Attorney General’s office, FortifyFL’s permissions in Google Play currently include location services.
Florida Statute 943.082 states: “That if the reporting party chooses to disclose his or her identity, that information shall be shared with the appropriate law enforcement agency and school officials…” There is no explanation of how children under 13 will be handled in order to comply with the COPPA Rule, which is enforced by the FTC. In 2019, Google and YouTube faced a $170M penalty for violating COPPA. The FTC explains that operators who violate the Rule can be held liable for civil penalties of up to $43,280 per violation.
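To put that per-violation cap in perspective, here is a rough back-of-the-envelope sketch of potential penalty exposure. The $43,280 cap is the FTC figure cited above; the user count is purely hypothetical, chosen only to illustrate how quickly exposure scales:

```python
# Rough penalty-exposure arithmetic using the FTC's stated cap of
# $43,280 per COPPA violation. The number of affected under-13 users
# below is hypothetical, for illustration only.
PENALTY_CAP_PER_VIOLATION = 43_280  # USD, per the FTC

def max_exposure(num_violations: int) -> int:
    """Upper bound on civil penalties if every violation drew the cap."""
    return num_violations * PENALTY_CAP_PER_VIOLATION

# Example: if 10,000 under-13 users each counted as one violation:
print(f"${max_exposure(10_000):,}")  # $432,800,000
```

Even a modest number of affected children translates into hundreds of millions of dollars of theoretical liability, which is what made the Google/YouTube settlement plausible at its scale.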
The FTC’s COPPA Rule FAQ (B.6) holds federal agencies accountable to the rule: “…all websites and online services operated by the Federal Government and contractors operating on behalf of federal agencies must comply with the standards set forth in COPPA.”
But are the State of Florida and/or AppArmor violating COPPA? The FTC explains in a Business Blog post from 2015 that “While we encourage all types of entities to respect children’s privacy, the FTC’s enforcement authority doesn’t extend to information collection by state governments or most nonprofits.” So who is protecting children’s privacy without the loopholes—if not Florida, if not laws like COPPA and FERPA, if not non-profits, then who? Schools get exemptions from FERPA and COPPA, state and local governments apparently get exemptions from COPPA, and all of these government agencies can contract to share data with third-party small, medium, and big tech companies.
Regardless of what laws are passed, it seems existing laws are blatantly violated or contain loopholes and exemptions. What hope do children have for any modicum of privacy?
There is more bad news on FortifyFL—an alarming Twitter post indicates there was prior knowledge that FortifyFL lacked anonymity, contrary to what is required in the Florida law.
Who else but citizens are checking that government follows its own rules?
While you are not forced to download the product, you cannot prevent another person from entering data (true or not) about you into the app. This is the schism being massaged into society. This could be a student reporting a friend’s suicidal thoughts that were shared in confidence. That data could be released in a poorly redacted state (containing still-discernible identifying information like a full name) and then used by an award-winning reporter in an internet article. This actually happened; ultimately, the sensitive personal information shared by a child in confidence was made public. That data could also apparently be transferred to another country where U.S. laws and protections do not apply.
Who knows what happens to personal information about your child? This personal information should not be propagated to servers in the U.S. or around the world. These children—our citizens, brothers, and sisters—deserve more decency from their fellow man than this impersonal approach, which attempts to program a response out of a disconnected government.
The disregard for student privacy and the problems that exist in securing student data point to the ignorance surrounding these technologies among those who have held and currently hold positions of power in our state.
Florida politicians need to get their hands out of their pockets and protect our children from invasive, prying technologies.
We the people want to protect our kids. Where are our legislators? If the Department of Defense agency DISA, Microsoft, NASA, and others can be hacked, how secure is student data? Do you think student data is secured better than data held by these companies and agencies? Is it not vital to the future of our country and our national security to protect private information about our young citizens? These young citizens will soon be adults and by then it will be too late: the data will already be there to use against them, even if the data is wrong.
It seems Pandora’s box of data privacy is already open—we are so far down the road there might be no return. Is Hope still at the bottom of Pandora’s box, or is it gone? Money is involved: entire industries (e.g. EdTech) have developed on the basis of collecting data. Going forward, what are we doing to protect our children from how the data could be used against them?
Here’s a thought: quit legislating the collection and aggregation of so much personal data about our citizens!