Balancing Security and Usability: How Data Can Help
In my earlier post, I shared my experience with my former school district’s attempt to introduce a new digital platform. The platform met data privacy laws by keeping student information in Canada, but it encountered resistance from teachers, students, and parents. A better platform was available, but since it stored data in the U.S., we weren’t allowed to use it. In the end, the district abandoned the platform altogether.
This raises an important question: how can schools and organizations make better technology decisions? Instead of focusing only on legal compliance, they should also look at data that shows how well a tool actually works.
How Data Can Help
Data-driven decision-making means using data to guide choices instead of relying only on rules or assumptions (Marsh et al., 2006). If the district had collected and analyzed data, such as teacher feedback, student engagement, and how often the platform was used, it might have seen the problems early and made adjustments before scrapping the whole project. Marsh et al. point out that leaders don’t need to be data experts, but they should understand how to use data to make informed decisions. That means looking beyond compliance checkboxes to things like ease of use, learning outcomes, and whether teachers and students actually benefit from the tool.
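To make this concrete, here is a rough sketch of what such a pilot analysis might look like. Everything here is hypothetical: the ratings, login counts, and warning thresholds are invented for illustration, not taken from the district’s actual data.

```python
# Hypothetical pilot data for an ed-tech tool: teacher satisfaction
# ratings (1-5 scale) and weekly logins per student. All values invented.
from statistics import mean

teacher_ratings = [2, 3, 2, 4, 2, 3]
weekly_logins = [0, 1, 0, 2, 1, 0, 3, 1]

# Two simple indicators: average satisfaction, and the share of
# students who logged in at least once in the past week.
avg_rating = mean(teacher_ratings)
active_share = sum(1 for n in weekly_logins if n >= 1) / len(weekly_logins)

# Early-warning flags a district could act on before abandoning a tool.
# The thresholds (3.5 and 60%) are illustrative, not standards.
flags = []
if avg_rating < 3.5:
    flags.append("low teacher satisfaction")
if active_share < 0.6:
    flags.append("low student usage")

print(f"Average teacher rating: {avg_rating:.2f}")
print(f"Share of students active weekly: {active_share:.0%}")
print("Flags:", flags or "none")
```

Even this minimal kind of monitoring would have surfaced the satisfaction problem months before the platform was abandoned, leaving time to adjust rather than cancel.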
Ethical Concerns
While data can improve decisions, it must be used responsibly. Prinsloo and Slade (2014) warn that data can sometimes be used unfairly, such as tracking student performance in a way that disadvantages certain groups. The Open University (n.d.) stresses that organizations need clear policies on learning analytics, ensuring transparency and giving students and teachers a say in how their data is used.
A Better Way Forward
Instead of choosing tools based only on legal requirements, schools should consider both security and usability. They can do this by:
• Collecting data on how well a tool works before making decisions.
• Including teachers and students in the selection process.
• Using learning analytics to track whether a tool is helping, not just whether it meets privacy laws.
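One hedged way to picture the balanced approach above is a simple weighted scorecard, where privacy compliance is one weighted criterion among several rather than a lone pass/fail gate. The tool names, criteria, ratings, and weights below are all hypothetical:

```python
# Hypothetical weighted scorecard: compliance matters most (weight 0.4)
# but usability and learning impact (0.3 each) also count.
weights = {"privacy_compliance": 0.4, "ease_of_use": 0.3, "learning_impact": 0.3}

# Invented 1-5 ratings for two imaginary candidate tools.
candidates = {
    "Tool A (data in Canada)": {"privacy_compliance": 5, "ease_of_use": 2, "learning_impact": 2},
    "Tool B (data in U.S.)": {"privacy_compliance": 3, "ease_of_use": 5, "learning_impact": 4},
}

def score(ratings):
    # Weighted sum of the 1-5 ratings across all criteria.
    return sum(weights[c] * ratings[c] for c in weights)

for name, ratings in candidates.items():
    print(f"{name}: {score(ratings):.1f}")
```

The point is not the specific numbers but the structure: a scorecard forces the trade-off into the open, where a compliance-only checklist hides it.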
By taking a balanced approach, we can avoid situations where compliance gets in the way of good learning experiences.
References
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research (Occasional Paper). RAND Corporation.
Prinsloo, P., & Slade, S. (2014). Educational triage in open distance learning: Walking a moral tightrope. International Review of Research in Open and Distributed Learning, 15(4), 306–331.
The Open University. (n.d.). Data ethics policy. https://help.open.ac.uk/documents/policies/ethical-use-of-student-data
I like your take on balancing compliance and practicality. The “checklist compliance” card is often played reactively, without welcoming any evaluation or discussion. Data residency is often treated as a binary, one-size-fits-all requirement, regardless of the data’s sensitivity or the actual risks involved. Not all data carries equal risk, so finding some room for consideration and negotiation could benefit learners and faculty alike. Maybe a tiered approach could work, where non-sensitive data is assessed on its own privacy risk rather than under a blanket policy.
Data residency requirements make sense but can present challenges. Given our current political climate, domestic data residency feels particularly prudent at the moment. Solution options narrow once we include tariffs, embargoes, and boycotts in our decision-making. I can forgo buying cauliflower, but a school district might have a harder time forgoing a tech solution that lacks a sufficient domestic substitute.
Your idea of including teachers and students in the process could even extend to involving IT/legal teams to challenge assumptions around data-residency policies.
A binary trade-off between security and usability does risk stifling good learning experiences. You make a good case for data-driven decision-making.