Imagine Everything
Surveillance is done in fear. Due diligence is done in Love.
June 24, 2019

EdWeek published an article on June 12 presenting five things administrators need to know about digital surveillance. The article makes some good points about privacy but, as with much journalism today, takes a polar stance rather than proposing a balanced approach to freedom and security. While I credit Benjamin Herold, the journalist, for taking the time to request and review hundreds of pages of alerts and notifications generated by competing surveillance systems, the article's conclusion either omits or misrepresents certain facts.

1 - School boards are not motivated by fear.

We have connected with a significant number of Canadian school boards to discuss student safety. Do you know how many were motivated by fear? Zero. These are intelligent, thoughtful superintendents, associate superintendents, student services directors, and technology directors who share a similar altruistic motivation: protecting vulnerable kids. They know many are slipping through the cracks, and they're looking for progressive methods to reach them -- that's it.

2 - There is reasonable cause for due diligence.

In 2016, Kids Help Phone (Canada) surveyed 1,319 teens aged 13 to 19 across a diverse demographic, exploring common issues faced by teens such as suicide, body and self-image, relationship issues, and bullying. The conclusion was that 20% of teenagers seriously consider suicide as a way out of the problems they are experiencing.

According to trending data collected from 2006 to 2016 by Statistics Canada, each province is likely to have the following number of youth suicides in 2019: BC - 26, Alberta - 33, Saskatchewan - 15, Manitoba - 9, Ontario - 71, Quebec - 98; there was not enough data for the Atlantic provinces to speculate. Statistics Canada also notes that suicide is the second leading cause of death among children aged 10 to 19. Nor is this just a Canadian problem: the World Health Organization reports nearly identical numbers worldwide.

The research we've done in collaboration with school boards validates the above information. In an anonymized study we conducted with 23 school boards encompassing 134,000 students over a five-month period, we discovered 941 human-verified instances of students searching for things like quick and painless ways to commit suicide, the easiest way to cut your wrists, and drugs to mix to commit suicide.

We also discovered 1,297 instances of child sexual abuse images being accessed online by students on school devices. The bulk of this deeply perverse material was illustrated or rendered using 3D applications -- in Canada, this content is illegal. Unfortunately, it's generally legal across the United States, which increases production, distribution, and accessibility by students.

There were 315 instances of what we call "school violence." This ranges from kids exploring how to make pipe bombs or 3D-print guns to requests to join terrorist factions such as ISIS.

3 - Understand privacy science.

You can't have a reasonable conversation about student privacy and safety without first understanding the science behind how these technologies work or, equally important, how the data is handled when high-risk incidents are detected.

Any time potentially risky data is classified by a system it’s going to fall into one of four categories:

  • True positive. The system believes something is dangerous, and it turns out to be dangerous. In this case there is an invasion of privacy, but for a just cause, as the student is at risk.
  • True negative. The system believes something is safe, and it turns out to be safe. In this case there is no invasion of privacy, as the data is never flagged or reviewed.
  • False positive. The system believes something is dangerous, but it's actually safe. In this case there is an invasion of privacy, but without just cause; the system made a mistake.
  • False negative. The system believes something is safe, but it's actually dangerous. In this case there is no invasion of privacy, but a legitimately high-risk incident went undetected.

In a perfect world with no privacy violation, a system would report only the true positive content and would have zero false positives. It would ensure all the true negative (good, safe) content remained private and was never reviewed by school staff. But we don't live in a perfect world, and technology makes mistakes.
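The four categories above are the standard confusion-matrix cells from classification, where "positive" means the system flagged the content as dangerous. A minimal sketch (the function name and labels are illustrative, not our actual detection logic):

```python
def categorize(flagged_dangerous: bool, actually_dangerous: bool) -> str:
    """Map a system decision plus ground truth onto a confusion-matrix cell.

    'Positive' here means the system flagged the content as dangerous.
    """
    if flagged_dangerous and actually_dangerous:
        return "true positive"    # privacy invaded, but for just cause
    if not flagged_dangerous and not actually_dangerous:
        return "true negative"    # safe content stays private, never reviewed
    if flagged_dangerous and not actually_dangerous:
        return "false positive"   # privacy invaded without just cause
    return "false negative"       # a high-risk incident went undetected
```

A system's false-positive rate is what determines how much innocent content ends up in front of school staff, which is why it is the number school boards should scrutinize first.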

In Benjamin's article he describes a ludicrous number of false positives with one of the systems. In this case, school boards are reviewing innocent content and invading a student's privacy; they're also creating a tremendous amount of work for school board staff. Why?

Benjamin described how the companies in his review attempted to provide "sentiment analysis" via an "emotionally intelligent" app. When you eliminate all the marketing jargon, what that really translates into is more mistakes by less predictable systems, and more invasion of student privacy.

I don't believe for a minute that a reasonable person would condone students looking at child sexual abuse images online (both for the student's mental health and because of how it victimizes the children in those photos). I also don't believe a reasonable person would think it is acceptable for children to be exploring the fastest, easiest way to commit suicide without somebody being notified. School boards need to be asking the right questions: What is a reasonable level of false positive reporting? What type of data is being accessed and reviewed? (e.g. reviewing web traffic logs is a lot different than reading people's emails) Is that data reviewed in an anonymized way until it has been verified as a true positive?
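One way to operationalize that last question is to keep flagged records pseudonymized until a human reviewer confirms the content is genuinely dangerous. A hypothetical sketch, assuming email-keyed records and a hash-based pseudonym (the class and field names are illustrative, not any vendor's actual schema):

```python
import hashlib


def pseudonym(email: str) -> str:
    """Replace an email with a stable, non-identifying pseudonym."""
    return hashlib.sha256(email.encode()).hexdigest()[:12]


class FlaggedRecord:
    """A flagged item that hides the student's identity pre-verification."""

    def __init__(self, email: str, traffic_summary: str):
        self._email = email
        self.traffic_summary = traffic_summary
        self.verified = False  # set True only after a human confirms real risk

    def reviewer_view(self) -> dict:
        """Identity is revealed only once the flag has been verified."""
        who = self._email if self.verified else pseudonym(self._email)
        return {"student": who, "traffic": self.traffic_summary}
```

Under this design, a false positive can be reviewed and dismissed without the reviewer ever learning which student generated it.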

4 - We’re talking about the privacy of kids, not adults.

It’s important to draw a distinction between the anti-surveillance privacy rights of adults using their own home equipment versus children who are privileged enough to use publicly funded (tax dollars) school equipment.

Children need guidance and mentorship. As a person who spent seventeen years as a child I can personally attest to the need for the fortitude, discipline, and wisdom that only comes with vast life experience. Giving children too much space to get into trouble can have life-altering consequences.

As a society, we need to be very careful to balance the freedom we choose to give children with the need for a close hand capable of intervening when they steer towards imminent danger. I am a huge fan of learning through failure, but when it comes to the consequences of accessing sexually perverse online material (both the legal repercussions and mental wellness) there needs to be a clear backstop.

What does Imagine Everything do to protect the privacy rights of students?

  1. Our company's board of directors is composed entirely of senior leaders who work for school boards. The board has complete autonomy over product feature development, product pricing, company policies and regulations, executive-level hiring, and budget. You read that correctly: school boards are pricing their own technology and steering our privacy mandate.
  2. Our mission is to rescue vulnerable or high-risk students. We are not profit driven, as reflected in our transparent pricing and budgeting process; there are no ulterior motives. Any serving board member has full access to the company's books.
  3. We do not collect personally identifiable information (PII). This is a major distinction between ourselves and our competitors -- it's intentional, and rooted in privacy. We analyse web traffic and correlate the data back to an email address; outside of the province of New Brunswick, in accordance with all provincial privacy acts, neither email addresses nor web traffic is considered personally identifiable. We do not scan email content, page content, or post content.

    Most of our competitors "directory integrate," which gives them far more student data including first name, last name, full email send and receive history, full document creation history, access to chat logs, etc. Furthermore, their implementation will often scan web page and post data. This is a far more intrusive and unnecessary tactic.
  4. Limiting investigators' access to private data. Outside of a super-admin account operated by one or two very high-level school officials with FOIP privacy training, investigative accounts operated by a principal or counsellor have zero access to data until an investigation has been launched. Even then, they are given a very narrow view of the high-risk traffic until the investigation has been verifiably marked as a legitimate concern. If the incident does happen to be a false positive, the investigation is closed and all access to that data is immediately revoked.
  5. We're a Canadian company following both Canadian and international privacy regulations. All school board data is stored in Canada, and we are very clear that data ownership is maintained by the school boards: we just help process and visualize it. We are also fully GDPR compliant; while that type of regulation has not yet made its way to Canada, it affords students and parents rights over the data as well. We don't do that because we have to; we do it because it's the right way to handle information on behalf of students and parents.
  6. Data is retained only for a very brief period. We only store non-risk data for a 30 day window before it is completely eradicated by our system. The only time data lives past the retention period is when it enters an active investigation into a high risk incident.
  7. System security. Our multi-tenancy design physically separates each school board's data, we have an enterprise-grade intrusion detection system, we employ IP-based security that restricts data access to Canada, we support and strongly encourage two-factor authentication for administrators and investigators, and all data is properly encrypted in transit and at rest.
  8. Data access logging holds administrators and investigators accountable. Our system stores a log of transactions any time data is accessed. This helps ensure that even the most privileged administrators are accountable when accessing data. This log cannot be altered in any way by even the highest ranking school officials using the system.
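Points 6 and 8 above can be sketched together: a retention pass that purges non-risk data past the 30-day window unless it belongs to an active investigation, plus an append-only access log. This is a hypothetical illustration under assumed record fields (`id`, `collected_at`), not our production code:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # non-risk data lives at most 30 days


def purge(records, now, active_investigation_ids):
    """Keep a record only if it is within the retention window or part of
    an active high-risk investigation; everything else is eradicated."""
    return [
        r for r in records
        if now - r["collected_at"] <= RETENTION
        or r["id"] in active_investigation_ids
    ]


access_log = []  # append-only: entries are never altered or deleted


def log_access(user, record_id, now):
    """Record every data access, including by the most privileged admins."""
    access_log.append({"user": user, "record": record_id, "at": now})
```

The key design choice is that the investigation exemption is the only path past the retention window, and every read, regardless of the reader's rank, leaves a log entry.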

Polarity sells. It creates sensational headlines and attracts readers, but it misinforms them by only sharing half of the story. Using surveillance scare tactics is no different than using school violence scare tactics -- we need to be discerning enough to look at the data and make educated, calculated decisions that account for both safety and privacy.

I trust that all agents of our company and the school boards we've partnered with will continue to operate with the highest integrity, in a data-informed manner, in the best interest of students, using love, not fear, as a motivator. If we felt in any way that a school board was abusing our technology to violate the fundamental rights and freedoms of a student, we would immediately sever all connections with that school board.

We will remain in constant pursuit of a balance between safety and privacy, and I believe we have created a unique, publicly transparent operating structure to remain diligent in that pursuit.

Contributed by Brad Leitch, CEO of Imagine Everything.


Imagine Everything is a new education technology start-up guided by the vision and energy of school boards and fueled by the innovation and efficiency of private industry software engineers. We’re a company without clients, a community of creators who believe in a hybrid private + public partnership. Together, we are building affordable school technology that creates a safer online space for students, parents and teachers.

View All Recent News