As a teacher, I’m always curious about new ways to help students learn, excel, and overcome their individual challenges. So when a friend told me at dinner about an educational centre that uses artificial intelligence as the teacher, I was intrigued.

She explained how AI analyzes each child, identifies learning gaps, and creates a customized program that helps them master material at their own pace. The results, she said, were astonishing — students advancing through maths and reading faster than ever before. It sounded like every educator’s dream.

A faster, more precise way to identify gaps in learning? Wonderful. It can take teachers weeks to discover what a student doesn’t understand, and not all of us are equally good at it. So if AI could do that instantly — why not use it?

But then I wondered: What happens next, after the analysis?

If AI designs the program, provides practice questions, and measures mastery, what role is left for a human being? Who checks the accuracy of those questions, the assumptions behind them, or the worldview they promote?

“Don’t worry,” I can almost hear people say. “It’s just maths.”

Except it never is just maths.

Recently, a teacher in a faith-based institution — required to follow a particular maths program — spotted something off. The program automatically generated an opening question for class: “Jim went to the store. At the store, they bought four chocolate bars.”

Thinking quickly, she crossed out “they” and replaced it with “he.” When a student asked why, she replied, “Because Jim is a boy. Singular, not plural.”

It was a small thing — but it mattered.

Now imagine the same question, unchecked, presented by AI. Who catches the bias, the subtle indoctrination? Who ensures that what appears to be a maths exercise doesn’t quietly reshape language or meaning?

We are so dazzled by AI’s promise of academic acceleration that we forget the cost of turning over moral and intellectual formation to a machine. To produce faster learners, are we willing to let tiny distortions slip through unchallenged?

We say, “Don’t talk to strangers.” Yet we invite all sorts of strangers — through video games, social media, and now artificial teachers — into our children’s lives. For what purpose? Academic excellence? Or the abdication of our role to “train up a child in the way he should go”?

That dinner conversation stayed with me. I went to bed thinking little more of it — until I woke suddenly at 4 a.m. with words flashing vividly in my mind:

NO, AI will not teach my children.

NO, AI will not parent my children.

It felt almost like the handwriting on the wall in the Book of Daniel — a divine warning. I got up, compelled to write this down.

AI is a remarkable tool. But that’s all it is — a tool. It can assist teachers, but it cannot replace them. It cannot read a child’s face, sense discouragement, or discern whether a student skipped breakfast or fought with a friend before class.

Teaching is not just about information transfer; it is about formation. It is human.

The teacher had seen an issue and responded in real time — with moral judgment, common sense, and compassion.
No algorithm can replicate that.

Yet in our apathy, we seem eager for AI to become our Mary Poppins: to swoop in, tidy up our messy classrooms, and make learning effortless. But that fantasy belongs to another world. The hard truth is that good teaching is hard work — and always will be.

Assessing students, building character, nurturing intellect: these are slow, human tasks. They cannot be automated. The pursuit of speed and efficiency may produce technically proficient students, but not wise ones.

Even secular researchers are starting to voice concern. A recent article from the Alberta Teachers’ Association warned that excessive reliance on AI could lead to “moral passivity” — teachers and students alike surrendering decision-making to machines. Over time, this could cause “cognitive atrophy” — a weakening of our ability to think deeply, remember, and reason independently.

If AI writes the essay, solves the problem, or designs the lesson plan, what happens to the human mind that used to do those things? What happens to the soul that once found meaning in them?

One private school proudly advertises that students using AI tutors can “learn twice as fast.” But perhaps the better question is: What are we losing twice as fast?

We risk trading understanding for efficiency, relationship for convenience, and wisdom for data.

Teaching has always been about more than producing clever students. It’s about shaping citizens — whole, thinking, moral beings. The long, patient work of mentoring children through failure, discovery, and perseverance cannot be outsourced.

So no — AI will not teach my children. It will not replace teachers. It will not shape souls.

Let AI serve where it belongs: as a tool, subordinate to the human mind and heart. The slow, often messy work of human education — the kind that forms character as well as intellect — remains worth every moment.

Because while AI may help us learn faster, only humanity can teach us why to learn at all.

Rebekah Curniski is a teacher at the Nicene Classical Academy in Calgary.