IT Leader Insights: Balancing AI Safety and Student Autonomy

In Kigumi’s line of work, we get a lot of questions about the role of IT departments when it comes to digital wellbeing. So I’m extremely happy to be interviewing seasoned IT Head and AI governance professional, Samuel Quek, about his experience in leading and building IT teams within international schools in Asia.


About Samuel

Samuel Quek is a seasoned IT and Cybersecurity leader with over 20 years of experience in governance, digital transformation, and system integration across international and public sectors. He spent eight years at Taipei American School, where he served as Associate Director of IT, and then Director of Technology. Samuel now works as a cybersecurity and AI governance consultant, advising organizations on risk management, global privacy compliance, technology adoption, and long-term digital strategy. He holds certifications including CISSP (Certified Information Systems Security Professional), CIPM (Certified Information Privacy Manager), TOGAF Enterprise Architecture Practitioner, and is currently pursuing his AIGP (AI Governance Professional) certification.

Mila Devenport, Founder, Kigumi Group (Mila): Let’s start with a bit about how you got here as an IT leader in schools. Can you share a bit about your career path and what you love most about working in international schools?

I actually started my career in government technology in Singapore, working on large-scale systems like the national online authentication service SingPass, the Civil Defence Force’s emergency response system, and the biometric passport programme. That gave me a very solid grounding in secure, reliable systems for critical services.

Over time I realised that what I enjoyed most was when technology directly improved people’s everyday lives, and that is what drew me into education. I moved into an international school role and helped modernise systems so they stayed secure while reducing friction for students, employees, and parents. In a classroom you cannot afford to lose ten or fifteen minutes waiting for a password reset or for a system to come back online; that is a big part of a forty-five minute lesson. So reliability and usability - keeping it simple - really matter.

What I love about international schools is the sense of community and purpose. You work with students, staff, and parents from many cultures, and technology becomes a way to keep them safe online, make teachers’ work easier, and open up new learning opportunities for young people. I really enjoy that combination of complex technology with very human, long-term relationships in a school community.

Mila: Lots of schools right now are struggling with the balance between safeguarding kids and building student agency in digital skills. On one hand, if schools overcontrol the students’ digital environments, it can lead to graduating students who haven’t been exposed to cybersafety basics; on the other hand, schools clearly need to keep kids safe and foster positive digital cultures. That leads us to the question today: in an AI-enabled world, how do you build a school culture and approach that balances safety and privacy with fostering digital autonomy and critical thinking - with clear accountability across schools, families, students, and vendors?

The onset of generative AI, combined with real concerns about data privacy, deepfakes, persuasive design, and algorithmic surveillance, has raised the emotional temperature for the entire community. We've seen in recent news the impact when EdTech companies are breached and student data is compromised - but schools also need to do their part. Parents, teachers, and administrators all feel that something important is at stake, even when they cannot fully name it - and this can create a significant amount of anxiety.

In an anxious system, our reflex is reactivity. We swing between extremes: block everything or allow everything. We implement security and AI tools as emergency responses to headlines, vendor promises, or parent pressure. These moves may reduce short-term discomfort, but they rarely solve the long-term issues.

As a technology leader, my role is to help the school move from a culture of reactivity to a culture of resilience, built on a foundation of clear principles rather than emergency reactions.

Mila: Moving from a passive to a proactive stance is also an important principle of ours when we do digital wellbeing for schools. From an IT perspective, what does it look like to move away from a more passive approach?

Samuel’s Tip #1: Focus on Building Digital Judgment

Our most essential goal is to ensure a student graduates with the ability to think independently, a concept we can call "digital differentiation". This is the capacity to stay part of the crowd without letting the crowd do all your thinking for you - this can be considered a form of self-determination.

Social media and Generative AI reward groupthink, emotional contagion, and passive consumption - we know that social media creates echo chambers, and GenAI is intentionally sycophantic. If we create a perfect walled garden where every risk is pre-filtered, we unintentionally keep students reliant on our authority. They don't learn to develop the internal structures to distinguish their own values and judgments from the algorithms' suggestions when they step out of school into the unmonitored world.

So we need to look at guardrails, not as cages, but as scaffolding so that students can experience meaningful autonomy while internalizing safe, ethical behaviors. We can do this by using a developmental model:

  • Early Years: The scaffolding is tight, with firm and mostly invisible technical controls.

  • Middle and High School: Gradually lower some technical controls and raise the psychological and ethical expectations. Structured 'friction' that requires judgment can be introduced, such as:

    • Explicitly asking if an output is biased, true, or fair.

    • Requiring the disclosure of when and how AI is used.

This shifts the locus of control from the external (firewall) to the internal (conscience and thinking). The goal is to ensure that when students encounter risk, they are not encountering it for the first time, unprepared and without adult support.

Samuel’s Tip #2: Making the Playground Safe

Scaffolding only works if the underlying environment is responsibly designed. Schools need a governance approach that treats data, AI, and vendor solutions with the same seriousness given to physical safety.

We can utilize industry frameworks to give us a shared language for risk management, such as:

  • Legal and Risk Categorization using principles such as those in the EU AI Act.

  • Implementation and Assurance using the NIST AI Risk Management Framework to understand the context, identify risks, conceptualize benefits, and design safeguards.

  • Privacy and Security using standards such as ISO 27001 and 27701 to guide data practices: knowing what sensitive data is stored, where it flows, and who is accountable for it.

Mila: We find a large area of complexity for school leaders (who are not IT professionals) is understanding the role of the IT department in digital wellbeing and AI literacy. Can you share thoughts on who is responsible for what in this scenario for international schools?

There need to be clear, shared responsibilities across the entire school community, so that expectations are explicit for each role. In general, the main players include:

  • School Leadership: Consults with the community and establishes the institutional values, sets the risk tolerance/appetite, and defines ethical guidelines. They must also categorize and approve any High-Risk AI applications, for legal/governance purposes.

  • IT and Data Teams: Responsible for the technical implementation of controls, thorough vendor evaluations, and ongoing system monitoring.

  • Teachers: Must design educational activities that encourage students to actively use, analyze, and critically evaluate digital tools.

  • Parents and Caregivers: Act as essential partners by supporting the school's boundaries, modeling responsible use, and engaging in conversations about digital citizenship at home.

  • Students: Are responsible for using AI honestly, proactively declaring its use, and learning from any misuse or mistakes.

Mila: In our digital wellbeing work we find the parent-school partnership is often not yet strong enough to support the pressing cybersafety, social media, and wellbeing concerns facing children. Do you have tips for parental engagement for international schools?

A holistic approach is essential for schools to effectively navigate the prevalent systemic anxiety surrounding modern education and technology. Instead of operating in isolation, schools must actively cultivate a genuine partnership model with parents and the wider community. 

This collaborative framework is critical, as it communicates that the challenges and responsibilities are a shared effort, ensuring that parents feel supported and are not isolated in the complex journey of their child's development. This shared understanding helps to de-escalate anxiety by sharing the perceived burden.

Building and maintaining this trust requires a staunch commitment to transparency. Schools must clearly articulate and communicate the protocols surrounding digital access and usage - specifically, what technologies, platforms, or activities are blocked and allowed, and, most importantly, why these specific decisions have been made. Whether it's a policy on social media, the use of personal devices, or the filtering of web content, providing clear, reasoned justifications goes a long way toward establishing and cementing parental trust. Vague or opaque policies fuel speculation and anxiety; clarity fosters confidence.

Furthermore, a key component of this partnership is the proactive education of parents. This is not merely a courtesy but a strategic necessity. By hosting workshops, providing resources, and holding informational sessions, schools can effectively reinforce the concepts and digital literacy skills being taught to their children. These educational efforts should focus on contemporary digital citizenship principles, such as:

  • Evaluating Bias and Credibility: Teaching parents and children alike how to critically assess the source and motivation behind online information, helping them to discern fact from misinformation and understand algorithmic influence.

  • Maintaining Privacy and Security: Instilling best practices for protecting personal data, understanding privacy settings, and recognizing the risks of oversharing online.

  • Fostering Active Participation: Moving beyond passive consumption, this education empowers both generations to become active participants in shaping and managing their digital lives, making informed, deliberate choices rather than simply reacting to technology.

When parents are equipped with the same knowledge and language as the school, the home environment becomes an extension of the classroom, creating a consistent support system that helps children thrive in the digital age. This integrated strategy of partnership, transparency, and education transforms systemic anxiety into a shared commitment to digital well-being and responsible technological engagement.

Mila: If you could encapsulate your advice on AI and cybersafety for kids into one nutshell for school leaders, what would it be?

Balance. The core philosophy of any effective digital education centers on a delicate balance between implementing robust IT governance structures and cultivating internal judgment and personal agency in students. This dual approach is critical for strengthening a child's locus of control, moving them from a mindset of external compliance to one of internal responsibility.

When IT governance is applied intelligently - focusing on security, ethical use, and infrastructure reliability rather than punitive control - it provides a safe and reliable framework. Within this structured environment, students are empowered to adopt a growth-focused mindset. They learn not just to use technology, but to master it responsibly, viewing digital challenges as opportunities for learning and development.

This mastery leads to the development of digital differentiation. This is a cognitive skill that enables students to critically evaluate the vast, often contradictory, sea of digital information and influence. In a world saturated with external pressures, advertising, and curated narratives, students who possess this ability are equipped to navigate the digital landscape without becoming passive recipients of influence.

Crucially, digital differentiation ensures they maintain the ability to think for themselves. It is the safeguard against intellectual conformity. By teaching students to scrutinize sources, recognize bias, and synthesize information from diverse perspectives, educational institutions are preparing them to be discerning, autonomous thinkers, not merely efficient digital consumers. This approach ensures technology serves as a tool for intellectual empowerment and personal growth, rather than a vector for external control.
