With the recent changes to Google's security features for users under 18, I have begun reviewing the level of access granted to each application in the Google admin console. In doing so, I hope to lay out a set of criteria that an app or service must meet in order to be granted "trusted" status. My aim is that setting such a standard will simplify the new approval process for educators requesting access to a service for use in their classrooms.
My question for you all is: do you have a set of privacy requirements that a service must meet in order to be granted access to student GSuite data? If so, what has been your experience with verifying that a service meets these requirements?
Michael, this is a great question. We do not yet have a process in place for verifying "trusted" apps. At this time we are going through each request individually, but we don't yet have clear guidance or policy to determine what should and should not be trusted. In addition, our teachers often find these apps during a class, and when students are unable to use their school emails to log in, they are asked to use their personal email addresses. Despite our efforts to discourage this, the trend continues. I am interested to see what other folks are doing as far as verifying apps and discouraging the use of personal emails to log in to these apps.
Thank you for your reply! That's something we're hoping to avoid as well. I'll make sure to set an explicit expectation with educators about this - students should ideally not be using their personal emails for anything.
A good first step is to go from the app to its own policies. The privacy policy and terms of service are usually pretty explicit, and they will trump your own policies (e.g., "our product is not intended for people under the age of ___"). Most things are going to be set up in accordance with FERPA and COPPA.
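For anyone who wants to track these reviews in a repeatable way, here is a minimal sketch of turning that kind of policy check into a simple pass/fail checklist. Every check name and description below is a hypothetical example for illustration, not an official FERPA/COPPA rubric or any particular district's standard:

```python
# Hypothetical app-vetting checklist. The check keys and wording are
# illustrative assumptions, not an official compliance rubric.

REQUIRED_CHECKS = {
    "coppa_compliant": "Privacy policy states compliance with COPPA",
    "ferpa_compliant": "Privacy policy states compliance with FERPA",
    "min_age_ok": "Terms of service age minimum covers our students",
    "no_shared_ip": "Service does not claim shared IP rights to student work",
}

def evaluate_app(name: str, facts: dict) -> dict:
    """Return pass/fail per check plus an overall 'trusted' verdict.

    `facts` maps each check key to True/False, as determined by a human
    actually reading the app's privacy policy and terms of service.
    Any check not answered is treated as a failure.
    """
    results = {key: bool(facts.get(key, False)) for key in REQUIRED_CHECKS}
    return {
        "app": name,
        "checks": results,
        "trusted": all(results.values()),
        "failed": [REQUIRED_CHECKS[k] for k, ok in results.items() if not ok],
    }

# Example review of a made-up app whose ToS age floor we can't meet:
report = evaluate_app("ExampleDrawingApp", {
    "coppa_compliant": True,
    "ferpa_compliant": True,
    "min_age_ok": False,
    "no_shared_ip": True,
})
print(report["trusted"])  # False
print(report["failed"])   # the one unmet requirement, for the decision log
```

The human judgment still happens while reading the policies; the script just records the verdict consistently so different reviewers apply the same bar.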
One district I worked for was very concerned about intellectual property rights; they didn't want students using anything that claimed shared IP rights to student-created work. This can be quite limiting, though, as nowadays many (especially free) creative sites share IP rights with creators.
That's good advice - I'll definitely take a look at the privacy policies that the applications lay out themselves. It is also good to know that these companies are aware of FERPA and COPPA laws. Those IP restrictions do sound quite challenging! I'll have to talk to other folks at my school to see if that's something we're concerned about as well.
We evaluate apps on:
Thank you so much! This is very helpful as a starting point, and I'll definitely include all three of those criteria on our evaluation document.
It's not perfect by any stretch and many sites are missing, but I always check the Common Sense Privacy Evaluations and Criteria. It's been helpful to give me a high level overview of what to look for.
Thanks for sharing those resources - even if their database isn't comprehensive, it definitely gives a sense of what to look out for. I'm sure it'll help rule out some sketchier tools as well.
In case you missed it, we just had a very popular session addressing the changes Google is requiring. You can access the archive and resources for that session here.
I hope that helps!
Thank you so much! This is very helpful.
I know this isn't a direct answer to your question, but Learn21 has been running a student privacy awareness campaign this month. It's called Scary Apps and it's just a fun way to draw attention to apps with privacy policies that should give educators a fright. EdWeek did an article about it last week. If you're interested you can see our daily posts on LinkedIn (Learn21) and X (@Learn21Team). I've attached the Week 3 Recap Infographic. Thanks,
If anyone is willing to jump on a call and do a mutual sharing to see how others have configured their environment, I'd be down. Feel free to reach out (firstname.lastname@example.org).
© Association of Technology Leaders in Independent Schools, 4 Weems Lane #257, Winchester, VA 22601