Mental Health Monitoring Averts Crises And Saves Lives

The Aspen School District made several changes to its technology policies this school year, banning cellphones and personal laptops on campus during school hours.

In lieu of personal devices, administrators provided students with Chromebooks in August programmed with Blocksi software, which tracks students’ online activity, blocks certain websites, and alerts schools to concerning web behavior.

According to Kim Zimmer, who serves as co-director of learning and innovation at the Aspen School District alongside Jason Pfeifer, trends are already surfacing.

Reporter Halle Zander spoke with Zimmer on Oct. 1 to discuss the details of the Aspen School District’s new online monitoring system. A transcription of their conversation is below, edited for clarity.

Halle Zander: What was the inspiration for bringing Chromebooks into Aspen High School and removing personal devices to begin with?

Kim Zimmer: When I came to the school district, I noticed that there was a bring-your-own-device program at the high school, and I knew that with a bring-your-own-device program, one of the things you lose out on is management of the device. There was no other way to really get inside of the search patterns, the tendencies during the school day, what’s distracting our students during the school day — any sort of safety, security, things going on on a student device that we were unable to track. So now we have systems in place that make that type of thing much easier. It’s just creating a much safer school environment.

Zander: So you’re using a software called Blocksi to track a lot of this activity. Tell me a little bit about the company and what you’re monitoring for.

Zimmer: So Blocksi allows us to just … it’s a basic web filter, right? So we can set different terms by which it filters content for our students. So there are categories. One category could be violence. Other categories could be drugs, advertising, spam, phishing, malicious websites. So we get to use Blocksi and really customize our safety and security in our district.

We can actually drill down as micro as the user. So if there is a particular student that is very distracted by YouTube — we’ve had this happen where a parent has actually requested, “please turn off YouTube for my kid. I don’t want them on YouTube from the hours of 6 p.m. to 6 a.m.” We can really drill down to that level.

Teachers in the classroom, they can actually go into what’s called a Blocksi session. They can capture all the student screens, and we really encourage them to do that for assessments. They can send out messages. So it really gives the teacher a lot of control over their classroom if they choose to use it.

Zander: OK, gotcha. And you’ve said that there’s a certain list of things that, you know, pop up on your system that you actually get from the FBI. Can you speak to that a little bit?

Zimmer: So, yeah, the FBI does have a list of about 2,000 terms. Now, the FBI list is mostly related to incidents of terrorism, security breaches, things like that. The FBI list is more situational. And yes, they want us to incorporate some of those things. Basically, we’re looking for search patterns. Is a student obsessively searching for September 11th? Is a student obsessively searching for a particular terrorist? Those are the terms that the FBI looks for regularly.

So we think we’re monitoring kids on the school level. The FBI is monitoring all internet activity for everyone in our country. So that is how a lot of school shooter information is found after the fact. Right? They look at the web history. They look at the social media of the kid and then they realize, “My gosh, there were signs, there were signals.” So we’re trying to sort of prevent those things from happening.

Zander: Okay. And you’ve had this system in place for the elementary and the middle school for a little while. This is the first year it’s been implemented at the high school. So you only have about a month and a half of data. But what trends are you starting to see with that limited information?

Zimmer: Shopping, gaming and, in the most extreme cases, some inappropriate web searching: adult content, weapons. But most of it’s very innocent. That’s sort of what we’re looking for.

Zander: Do you have a system or a process set up for when a behavior becomes worrisome enough that it requires intervention with a kid?

Zimmer: Absolutely. So if something gets flagged, the flag comes to Jason (Pfeifer) and myself, as well as whatever building principal is associated with that student. Flagged words are typically around violence, drugs. Mental health is something I haven’t mentioned yet, and I really want to emphasize that the mental health component of this is almost more important than security, in my opinion. It’s like, “How are our kids thinking? How are they feeling?” And a lot of that is expressed with a web search.

Zander: Can you give me an example?

Zimmer: “Signs of depression.” “My friend is depressed. What do I do?” “If I’m depressed, what do I do?” These systems are set up to really read what’s going on in a student’s world and then allow us to react to it if we choose to.

Zander: So things like that, looking up signs of depression, does that prompt an intervention?

Zimmer: It actually does.

Zander: Okay.

Zimmer: Okay. And I don’t want anyone to avoid using the internet for that purpose because they might think that we can see it. It really has to be a pattern for us to get an alert, and it has to be serious enough to warrant a follow-up. And typically, that follow-up will include just finding that student during the school day. It’ll typically be a principal or an assistant principal just kind of having a conversation. Maybe a counselor is brought into the mix, if it warrants that. Maybe an SRO (school resource officer) is brought into the mix, if it warrants that.

Zander: So one of the bigger concerns I’ve been reading about with software like Blocksi is the security of student data. When there’s a breach at a company like this or if some of these companies are selling that data, how do you make sure that all of this data that you’re collecting about these kids remains secure?

Zimmer: Now, companies are actually required to provide us with their terms of what happens to student data. So before we buy any software subscription, we ensure that we are getting that information from the company and then making an educated decision based on where our data goes.

Zander: Is there proof that monitoring programs like this can actually curb violent behavior or distracted behavior?

Zimmer: I know in our school district it has helped prevent these types of behaviors.

Zander: Already?

Zimmer: Already. We have already been able to have conversations with students and prevent something before it became a bigger thing. And that is mostly related to mental health, not school violence. But we have had — not necessarily at the high school, but at the middle school — a student who, even if they were joking, typed something on their friend’s computer that alluded to school violence, and we had to do a full threat assessment. It’s important to know that our school resource officers are trained on threat assessment, and they know exactly what to do with an internet kind of issue. They have been on top of it when a threat assessment is required.

Zander: Anything else you want to add just before we wrap up?

Zimmer: I just really hope that our students are learning to be responsible and understanding that throughout their lives they’re going to be provided a device. If they’re in a career or at a job, that device is probably going to be monitored, and they’re going to have to adhere to the acceptable use policy of whatever company they work for. So we’re basically teaching life skills, helping our kids really learn and grow as users of the internet and technology.

Zander: Kim, thank you so much for being here.

Zimmer: Thank you, Halle. I really appreciate it.