Discussions

  • 1.  Using the Common Sense Privacy Rating in Your School's Vetting Process

    Posted 02-17-2024 01:12 PM
    Edited by Peter Frank 02-21-2024 09:49 AM

    As the conversations swirl around application, service, and vendor vetting, and around AI data concerns (see @Nick Marchese's recent post - AI tools that use data prompts and user data to train models), I am curious about the criteria used in the evaluation.

    This has been an ongoing question for us at MKA and one we continue to work on and explore - thanks, @Erica Budd. As we look at how we will tackle this question, we have a few things to consider in our current process. The last of these is the one I want to ask this community for their thoughts on.

    1. Has the application, service, or vendor been vetted against our existing vetting form? (included in the ATLIS 360 Companion Manual)
    2. Are there other applications, services, or vendors we already use that do something similar, if not the same thing?
    3. If using "Sign in with Google," is the app "Verified" by Google, and what services does it have access to? (more info on "verified third-party apps")
    4. How do "Common Sense Privacy Ratings" score the application, service, or vendor? 

    The Basic Question: How do you use the Common Sense Privacy Ratings in your vetting process, and what categories are most important to your process? Stop reading, skip the rest, and share your thoughts in the comments.

    The Detailed Question: There are numerous levels and details within the Common Sense Privacy Ratings. How do you leverage them with faculty and staff within your vetting process? How do you balance the base-level rating and score(s) against the individual category ratings? See the details below and share your thoughts in the comments.

    Common Sense provides free ratings and evaluations for many commonly used tools. The ratings are relatively simple: tools are scored Pass, Warning, or Fail. A Pass rating requires all of the following:

    • No selling data
    • No third-party marketing
    • No targeted advertisements
    • No third-party tracking
    • No tracking across apps
    • No profiling for commercial purposes

    Warning means that "the product is unclear or says they engage in one or more worse privacy practices that are prohibited in the Pass rating...", and Fail means that "the product does not have a privacy policy, or the privacy policy is not sufficient, because it does not disclose any details about the worse privacy practices prohibited in the Pass rating...".

    This is the simplest rating they provide, and faculty or staff members can use it for a quick look at whether a tool is likely to pass a full review. However, many of us will need greater depth before approving a tool.
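
    In rough terms, here is how I read that simple rubric (a minimal sketch in Python, not Common Sense's actual algorithm; the criterion names are my own shorthand):

        # Sketch of how I read the simple rubric -- not Common Sense's actual algorithm.
        # Each Pass criterion is True (met), False (violated), or None (unclear / not disclosed).
        from typing import Optional

        PASS_CRITERIA = [
            "no_selling_data", "no_third_party_marketing", "no_targeted_ads",
            "no_third_party_tracking", "no_tracking_across_apps", "no_commercial_profiling",
        ]

        def simple_rating(has_policy: bool, criteria: dict[str, Optional[bool]]) -> str:
            answers = [criteria.get(c) for c in PASS_CRITERIA]
            if not has_policy or all(a is None for a in answers):
                return "Fail"      # no policy, or the policy discloses nothing about these practices
            if all(a is True for a in answers):
                return "Pass"      # all six practices explicitly ruled out
            return "Warning"       # unclear on, or engages in, one or more of the practices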

    Common Sense offers three layers of evaluation for greater depth. The Quick and Basic evaluations can be used by faculty and staff in their initial look to provide additional context to the question of use. The Full evaluation is, I believe, the best fit for the work Tech and Ed Tech team members need to do in their process.

    1. Quick Evaluations - Based on 6 rating questions - Includes Pass, Warning, or Fail ratings. No Overall Score or Evaluation Concerns
    2. Basic Evaluations - Based on 30 rating questions - Includes Pass, Warning, or Fail ratings. Includes Overall Score and Evaluation Concerns
    3. Full Evaluations - Based on 155 rating questions - Includes Pass, Warning, or Fail ratings. Includes Overall Score and Evaluation Concerns

    The Basic and Full evaluations look at "Evaluation Concerns," which receive a score ranging from Best to Poor (Best (81-100) | Good (61-80) | Average (41-60) | Fair (21-40) | Poor (0-20)) and are evaluated within the categories listed below - and this is where the fun begins.

    Within each of these categories is a subset of measures used for evaluation. Each category is scored based on the subset of measures defined by Common Sense. For example, Kahoot is rated as Pass with an overall score of 82% (Best); however, the scores within the ten areas outlined in the Evaluation Concerns may tell a different story, with only three of the ten category measures reaching the Best band that the overall 82% score falls into:

    • Data Collection - 70% - GOOD
    • Data Sharing - 70% - GOOD
    • Data Security - 45% - AVERAGE
    • Data Rights - 95% - BEST
    • Individual Control - 55% - AVERAGE
    • Data Sold - 40% - FAIR
    • Data Safety - 60% - AVERAGE
    • Ads & Tracking - 85% - BEST
    • Parental Consent - 75% - GOOD
    • School Purpose - 88% - BEST

    Common Sense provides further details about their ratings, which I am not outlining here, but as asked in my "Detailed Question": how do you use these ratings in your vetting process, and which do you value more than others? Does a score of 40% for Data Sold and 45% for Data Security negate the overall score of 82%?
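
    To make the tension concrete, here is a minimal sketch (Python; the 50% floor is my own placeholder, not a Common Sense threshold) of flagging weak categories even when the overall rating looks strong:

        # Illustrative only: map a 0-100 score to Common Sense's bands and
        # flag any category that falls below a school-chosen floor.
        BANDS = [(81, "Best"), (61, "Good"), (41, "Average"), (21, "Fair"), (0, "Poor")]

        def band(score: int) -> str:
            return next(label for floor, label in BANDS if score >= floor)

        def flag_low_categories(categories: dict[str, int], floor: int = 50) -> dict[str, str]:
            """Return categories scoring below the floor, with their band labels."""
            return {name: f"{score}% ({band(score)})"
                    for name, score in categories.items() if score < floor}

        kahoot = {"Data Collection": 70, "Data Sharing": 70, "Data Security": 45,
                  "Data Rights": 95, "Individual Control": 55, "Data Sold": 40,
                  "Data Safety": 60, "Ads & Tracking": 85, "Parental Consent": 75,
                  "School Purpose": 88}

        print(band(82))                      # "Best" -- the overall rating
        print(flag_low_categories(kahoot))   # {'Data Security': '45% (Average)', 'Data Sold': '40% (Fair)'}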

    This is not easy...

    As we try to empower faculty and staff to use tools that improve teaching and learning and build efficiencies in school operations, how can we more effectively vet these applications, services, and vendors to ensure that their privacy and data use policies and practices do not put our schools, and those we serve, at risk both now and in the future?

    [NOTE: This has been cross-posted on my blog - www.williamstites.net]


    #General
    #TeachingandLearning
    #CybersafetyandDataSecurity

    ------------------------------
    William Stites
    Montclair Kimberley Academy
    Montclair NJ
    ------------------------------



  • 2.  RE: Using the Common Sense Privacy Rating in Your School's Vetting Process

    Posted 02-19-2024 01:42 PM

    Hi William,

    You bring up good points, and ones I dealt with during my 25 years as a K12 CTO. We recognized that having a process (including a privacy team) to set policy and to educate staff and families was essential. Having benchmarks to use when vetting apps (as you outlined in your questions) is just as essential.

    However, we learned that not everyone in our K12 school community understood the need for privacy policies (and, for that matter, cybersecurity), so we adopted the National Data Privacy Agreement (NDPA) from the Student Data Privacy Consortium (SDPC). The NDPA provides the framework for all the ingredients you mentioned and requires vendors to sign it. In fact, if a vendor did not sign, we did not do business with them.

    At Learn21, a nonprofit, one of our areas of focus is thought leadership regarding student data privacy. We are the Ohio Alliance for the SDPC. We promote that schools use a framework and implement the NDPA. You may have seen our October "Scary App of the Day" campaign, where we called out companies with scary privacy policies. In January we ran a follow-up social media campaign, "New Year, New SDP," focused on privacy best practices. All of it is aimed at raising awareness.

    All to say, Common Sense tools are helpful, but we see the larger idea: an ecosystem of staff education and a culture of privacy. We've seen success when schools engage in professional learning, data privacy audits, and managing data privacy agreements. Whether with our support or using proven strategies, it needs to be part of the culture. And honestly, sometimes it's hard because you cannot always be a prophet in your own land. We've also seen that Common Sense ratings can get outdated due to the rapidly changing space.

    I'm happy to connect if you want to discuss further. This is an essential topic in education.

    -Bill
    wfritz@learn21.org



    ------------------------------
    Bill Fritz
    CEO
    Learn21
    Cincinnati OH
    5134022121
    ------------------------------



  • 3.  RE: Using the Common Sense Privacy Rating in Your School's Vetting Process

    Posted 02-20-2024 04:25 PM

    Bill,

    I'm following this! We have been having conversations about this for a while, and I've been gathering information from various sources to use in the vetting process. Earlier this year I had to approve the use of an app after being asked to vet it...only to find that the app itself does not have a good score with Common Sense. That didn't matter, though, because another teacher had been using it for years, so we just needed to let the other teachers in the department use it as well. In the end, I'm glad I was asked and made aware of it, even if I had to allow another teacher to use the application.

    We have a unique situation in independent schools where teacher autonomy sometimes rules the day. That is not always in the best interests of our students and families, even with privacy and security concerns set aside. There are many factors to consider, and I appreciate you placing the list of categories above. I think that, as an institution, we need to assign a weight to each category that indicates its importance to our school. I believe that would help us objectively evaluate applications and measure each of them with the same formula. Have you done anything like that?
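
    Something like this minimal sketch is what I have in mind (Python; the weights are hypothetical placeholders, not values we actually use):

        # Hypothetical weights reflecting what matters most to our school; every
        # app is then scored with the same formula.
        WEIGHTS = {
            "Data Sold": 3, "Data Security": 3, "Data Sharing": 2, "Ads & Tracking": 2,
            "Data Collection": 1, "Data Rights": 1, "Individual Control": 1,
            "Data Safety": 1, "Parental Consent": 1, "School Purpose": 1,
        }

        def weighted_score(categories: dict[str, int]) -> float:
            """Weighted average (0-100) of the Common Sense category scores."""
            used = {name: w for name, w in WEIGHTS.items() if name in categories}
            return sum(w * categories[name] for name, w in used.items()) / sum(used.values())

        # Plugging in the Kahoot numbers William listed gives 63.0 rather than 82,
        # because Data Sold and Data Security count triple under these weights.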

    I'm interested in reading more replies from other schools!

    Best,

    Susan



    ------------------------------
    Susan Fuhs, Director of IT Services
    Norfolk Academy
    Norfolk VA
    swfuhs@norfolkacademy.org
    ------------------------------



  • 4.  RE: Using the Common Sense Privacy Rating in Your School's Vetting Process

    Posted 02-21-2024 12:18 PM

    I'd like to throw out another question about this vetting process that keeps crossing my mind. Are you looking at any different or additional criteria when vetting generative AI tools?



    ------------------------------
    Erica Budd
    Montclair Kimberley Academy
    Montclair NJ
    ------------------------------



  • 5.  RE: Using the Common Sense Privacy Rating in Your School's Vetting Process

    Posted 02-23-2024 06:15 PM

    Hi Susan, 

    I hope you don't mind if I jump in. I am the Chief Academic Officer at Learn21 and work with Bill. Before Learn21, I was the Director at Davidson Academy Online, an independent school. We still vetted our apps and required teachers to submit a request form to our tech team before they used any apps. There were several apps that we declined because of how harmful they would have been to our students' privacy. In these cases, our team worked with the teacher to find a safe app that met their educational goals.

    Edtech vendors are excellent at marketing to teachers and not sharing all the details about their apps. As a former public school tech director, I know how dangerous some of these apps can be. It's important that we educate teachers on what they are signing their students up for. I know that as independent schools we don't have to follow many of the federal privacy guidelines, but who wants to tell a parent that we don't care as much about student privacy as a public school has to?

    I'm happy to connect further and share ideas or some of the work on SDP we've done. Our Scary Apps campaign in October was a huge success.



    ------------------------------
    Stacy Hawthorne, Ed.D., CETL
    Chief Academic Officer
    Learn21
    ------------------------------



  • 6.  RE: Using the Common Sense Privacy Rating in Your School's Vetting Process

    Posted 03-07-2024 01:38 PM

    Hi, William.

    Thank you for your post. Like so many, I have grappled with student data privacy and ed tech use at our school for many years. It became a personal crusade when I was still an early-adopting educator; a parent voiced her concerns to me about the ed tech I was using in my instruction when very few people were having these conversations (and Common Sense was in its infancy). When I transitioned out of the classroom to our small tech team, I was tasked with vetting apps for our school. The adventure continues, of course, but I learned a lot via trial and error along the way.

    I use Common Sense as my primary resource for vetting, but I also refer to the Student Privacy Pledge, iKeepSafe, and the Privacy Shield Framework when an app has not been evaluated by Common Sense. When all else fails, I check to see how many schools report using an app in the Student Data Privacy Consortium.

    Although I personally find the ten Common Sense category subsets informative, I first look to the Statutes Rating within the report, which lists scores for the following legislation:

    • CalOPPA
    • COPPA
    • FERPA
    • SOPIPA
    • GDPR
    • CPRA

    If any of these scores fall below 50, that's when I take a closer look at the subset scores.
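
    In rough code terms, my rule of thumb looks something like this (a sketch only; the scores shown are made up for illustration):

        # Rough sketch of the rule of thumb: any statute score under 50 means
        # it's time to dig into the ten category subset scores.
        STATUTE_FLOOR = 50

        def needs_closer_look(statute_scores: dict[str, int]) -> list[str]:
            """Return the statutes scoring below the floor."""
            return [name for name, score in statute_scores.items() if score < STATUTE_FLOOR]

        # Illustration with made-up numbers:
        # needs_closer_look({"CalOPPA": 70, "COPPA": 45, "FERPA": 80, "SOPIPA": 65,
        #                    "GDPR": 55, "CPRA": 60})  ->  ["COPPA"]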

    It's rare that I have to refuse a teacher's request to use a new app, but on those occasions when I do, I meet with them and show them the concerning subset scores. Almost always, they understand and are appreciative of the information. One example from the recent past is IXL. Long story short, I had to revoke its use based on its scores (and horrible student and parent reviews on Common Sense Media) and help the team find alternatives; they were crestfallen but understood once I shared all the information with them. After three years of dutifully trying the alternatives, one of the team members asked if we could once again subscribe to IXL, which prompted me to check its scores again. It turns out IXL made some adjustments to their privacy policy and practices, which raised their scores enough for me to feel comfortable adding them back into our ed tech arsenal.

    I hope that helps answer how we use the scores in our process.

    Cheers,

    Nissa



    ------------------------------
    Nissa Hales
    Laguna Blanca School
    Santa Barbara CA
    ------------------------------