I'd like to throw out another question about this vetting process that keeps crossing my mind. Are you looking at any different or additional criteria when vetting generative AI tools?
Original Message:
Sent: 02-20-2024 04:25 PM
From: Susan Fuhs
Subject: Using the Common Sense Privacy Rating in Your School's Vetting Process
Bill,
I'm following this! We have been having conversations about this for a while, and I've been gathering information from various sources to use in the vetting process. Earlier this year I was asked to vet an app and ended up approving its use...only to find that the app itself did not have a good score with Common Sense. That didn't matter, though, because another teacher had been using it for years, so we just needed to let the other teachers in the department use it as well. In the end, I'm glad I was asked and made aware of this, even if I still had to allow the application.
We have a unique situation in independent schools where teacher autonomy sometimes rules the day. That is not always in the best interests of our students and families, even with privacy and security concerns set aside. There are many factors to consider, and I appreciate you sharing the list of categories above. I think as an institution, we need to assign a weight to each category that indicates its importance to our school. I believe that would help us evaluate applications objectively, measuring each one against the same formula. Have you done anything like that?
I'm interested in reading more replies from other schools!
Best,
Susan
------------------------------
Susan Fuhs, Director of IT Services
Norfolk Academy
Norfolk VA
swfuhs@norfolkacademy.org
Original Message:
Sent: 02-17-2024 01:11 PM
From: William Stites
Subject: Using the Common Sense Privacy Rating in Your School's Vetting Process
As the conversations swirl around application, service, and vendor vetting and AI data concerns (see @Nick Marchese's recent post - AI tools that use data prompts and user data to train models), I am curious about the criteria used in the evaluation.
This has been an ongoing question for us at MKA and one we continue to work on and explore - thanks, @Erica Budd. As we look at how we will tackle this question, we have a few things to consider in our current process, and it is the last of these that I wanted to ask this community about.
- Has the application, service, or vendor been vetted against our existing vetting form? (included in the ATLIS 360 Companion Manual)
- Are there other similar applications, services, or vendors we use that do something similar, if not the same thing?
- If using "Sign in with Google," is the app "Verified" by Google, and what services does it have access to? (more info on "verified third-party apps")
- How do "Common Sense Privacy Ratings" score the application, service, or vendor?
The Basic Question: How do you use the Common Sense Privacy Ratings in your vetting process, and what categories are most important to your process? Stop reading, skip the rest, and share your thoughts in the comments.
The Detailed Question: There are numerous levels of detail within the Common Sense Privacy Ratings. How do you leverage them with faculty and staff within your vetting process? How do you balance the base-level rating and score(s) against the individual category ratings? See the details below and share your thoughts in the comments.
Common Sense provides free ratings and evaluations for many commonly used tools. The ratings themselves are relatively simple, scoring tools as Pass, Warning, or Fail. A Pass requires all of the following:
- No selling data
- No third-party marketing
- No targeted advertisements
- No third-party tracking
- No tracking across apps
- No profiling for commercial purposes
A warning means that "the product is unclear or says they engage in one or more worse privacy practices that are prohibited in the Pass rating..." and fail means that "the product does not have a privacy policy, or the privacy policy is not sufficient, because it does not disclose any details about the worse privacy practices prohibited in the Pass rating...".
This is the simplest rating they provide, and faculty or staff members can use it to get a quick sense of whether a tool is likely to pass a full review. However, many of us will need greater depth before approving a tool.
Common Sense offers three tiers of evaluation for greater depth. The Quick and Basic evaluations can be used by faculty and staff in their initial look to provide additional context on the question of use. The Full evaluation is, I believe, the best fit for the work Tech and Ed Tech team members need to do in their process.
- Quick Evaluations - Based on 6 rating questions - Includes Pass, Warning, or Fail ratings. No Overall Score or Evaluation Concerns
- Basic Evaluations - Based on 30 rating questions - Includes Pass, Warning, or Fail ratings. Includes Overall Score and Evaluation Concerns
- Full Evaluations - Based on 155 rating questions - Includes Pass, Warning, or Fail ratings. Includes Overall Score and Evaluation Concerns
The Basic and Full evaluations also look at "Evaluation Concerns," each of which receives a score ranging from Best to Poor (Best (81-100) | Good (61-80) | Average (41-60) | Fair (21-40) | Poor (0-20)). These are evaluated within the following categories, and this is where the fun begins.
Within each of these categories is a subset of measures used for evaluation, and each category is scored against that subset as defined by Common Sense. For example, Kahoot is rated as Pass with an overall score of 82% (Best); however, when you look at the scores within the ten areas outlined in the Evaluation Concerns, the numbers may tell a different story: only three of the ten category measures reach the Best level implied by the overall 82% score.
- Data Collection - 70% - GOOD
- Data Sharing - 70% - GOOD
- Data Security - 45% - AVERAGE
- Data Rights - 95% - BEST
- Individual Control - 55% - AVERAGE
- Data Sold - 40% - FAIR
- Data Safety - 60% - AVERAGE
- Ads & Tracking - 85% - BEST
- Parental Consent - 75% - GOOD
- School Purpose - 88% - BEST
Common Sense provides details about their rating, which I am not outlining here, but as asked in my "Detailed Question," how do you use these ratings in your vetting process, and which do you value more than others? Does a score of 40% for data sold and 45% for data security negate the overall score of 82%?
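One way to make that judgment call explicit is to combine the category scores with school-assigned weights, plus a hard floor that no single category may fall below. Here is a minimal sketch in Python using Kahoot's category scores from above; the weights and thresholds are entirely hypothetical, not anything Common Sense (or any school) publishes:

```python
# Kahoot's ten category scores from the Full evaluation above.
KAHOOT_SCORES = {
    "Data Collection": 70,
    "Data Sharing": 70,
    "Data Security": 45,
    "Data Rights": 95,
    "Individual Control": 55,
    "Data Sold": 40,
    "Data Safety": 60,
    "Ads & Tracking": 85,
    "Parental Consent": 75,
    "School Purpose": 88,
}

# Hypothetical weights reflecting one school's priorities; they sum to 1.0.
WEIGHTS = {
    "Data Collection": 0.10,
    "Data Sharing": 0.10,
    "Data Security": 0.15,
    "Data Rights": 0.05,
    "Individual Control": 0.05,
    "Data Sold": 0.20,
    "Data Safety": 0.10,
    "Ads & Tracking": 0.10,
    "Parental Consent": 0.10,
    "School Purpose": 0.05,
}

def weighted_score(scores: dict[str, float],
                   weights: dict[str, float]) -> float:
    """Weighted average of the category scores (0-100)."""
    return sum(scores[c] * weights[c] for c in scores)

def approve(scores: dict[str, float], weights: dict[str, float],
            floor: float = 50, minimum: float = 70) -> bool:
    """Approve only if no category falls below the floor AND the
    weighted score meets the minimum. Thresholds are illustrative."""
    return (min(scores.values()) >= floor
            and weighted_score(scores, weights) >= minimum)
```

With these made-up weights, Kahoot's weighted score works out to roughly 63 (Good rather than Best), and the 50-point floor flags it outright on Data Sold (40%) and Data Security (45%). The particular numbers don't matter; the point is that an agreed-upon formula makes the tradeoffs visible instead of leaving them to each reviewer's gut.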
This is not easy...
As we try to empower faculty and staff to use tools in their work to improve teaching and learning, along with building efficiencies in school operations, how can we more effectively vet these applications, services, and vendors to ensure that their privacy and data use policies and practices do not put our schools, and those we serve, at risk both now and in the future?
[NOTE: This has been cross-posted on my blog - www.williamstites.net]
#General
#TeachingandLearning
#CybersafetyandDataSecurity
------------------------------
William Stites
Montclair Kimberley Academy
Montclair NJ
------------------------------