A new report from the Justice Centre for Constitutional Freedoms warns that growing calls to nationalize or heavily regulate artificial intelligence in Canada could open the door to expanded government surveillance, weakened privacy protections, and limits on free expression.

The report, The Danger of Government-Controlled Artificial Intelligence, authored by retired Western Standard Opinion Editor Nigel Hannaford, argues that proposals to bring AI systems under state control — revived in part following the February 2026 Tumbler Ridge mass shooting — risk giving governments access to Canadians’ private digital interactions.

The analysis suggests that while public safety is a legitimate policy goal, it should not be used to justify sweeping changes that would allow state oversight of personal AI use, including conversations, research queries, and creative work conducted through AI platforms.

The Justice Centre says one of the central risks is the possibility of government surveillance of private AI interactions, particularly if platforms are owned or tightly controlled by the state.

It also warns that Canadians could begin to self-censor if they believe their AI usage is being monitored by authorities, particularly when exploring sensitive, controversial, or politically charged topics.

The report further raises concerns about political influence over AI systems, suggesting that government involvement in their design or operation could shape what information users receive or suppress certain viewpoints.

It also points to federal legislation, including Bill C-22, known as the Lawful Access Act, arguing it could expand police access to subscriber data and metadata from digital service providers, including AI companies.

According to the report, mandatory retention of metadata for up to one year could allow authorities to build detailed behavioural profiles of Canadians, raising additional privacy concerns.

The authors also question whether stricter regulation or nationalization of AI would have prevented the Tumbler Ridge attack, arguing that broader systemic failures in existing institutions were likely more relevant than access to the technology itself.

The report makes a series of recommendations, including rejecting any move toward government-owned AI systems, opposing Bill C-22’s metadata retention provisions, and maintaining strict warrant requirements for access to private data. It also calls for narrowly tailored safeguards rather than broad surveillance frameworks, and for stronger protections for freedom of thought and inquiry.

“Public safety must be pursued without sacrificing the fundamental freedoms that underpin a free and democratic society,” Hannaford wrote.

He added that Canadians must be able to use emerging technologies without fear of government monitoring or interference, emphasizing the importance of Charter-protected rights in the digital age.

The report concludes by urging policymakers to reject government-controlled AI models and to resist legislative efforts that would expand state access to private communications.