The Department of Canadian Heritage is requesting a larger budget to expand its capacity for monitoring and addressing internet “disinformation.” According to the department, it needs millions of dollars to watch people online who share the wrong “political beliefs.”

According to Blacklock’s Reporter, a department report said the scale and scope of disinformation was “expanding along with potential for associated harms.” The proposal suggested allocating additional funding to the Digital Citizen Initiative, a program that currently costs $7.1 million per year and supports researchers working to identify and combat fake news.

“The Initiative addresses real and continuing needs of Canadians with respect to online disinformation and related harms,” said the report, Evaluation of the Digital Citizen Initiative. “The demand for funding appears to outweigh available resources. The scope and scale of the issue are rapidly expanding along with the potential for harm.”

“Disinformation impacts Canadians’ health and safety, civic discourse and engagement, political beliefs, perceptions of democratic institutions, confidence in political systems and trust in media,” said the report. “It may also amplify mistrust amongst communities, discrimination, stigma and marginalization and social divisions.” The report gave no examples. “The Initiative filled a need by funding research to help understand disinformation,” it said.

Research conducted by the Privy Council found that fewer than half of Canadians have a strong level of trust in federal institutions to tell the truth. According to a 2022 report, Misinformation And Disinformation, most people put more faith in family, friends and social media than in government agencies. The report said a minority of Canadians surveyed, 42 percent, were “institution trusting” and had “high trust in institutional and authoritative sources of information” such as federal departments.
“On average, institution-trusting respondents are significantly older, more educated and have higher income,” wrote researchers.

Then-Heritage Minister Steven Guilbeault launched the Digital Citizen Initiative in 2019. “We have seen too many examples of public officials retreating from public service due to the hateful online content targeted towards themselves or even their families,” Guilbeault said in a 2021 podcast. “I have seen firsthand alongside other Canadians the damaging effects harmful content has on our families, our values and our institutions.”

“Could we envision having blocking orders? Maybe,” said Guilbeault. “It would likely be a last resort, a nuclear bomb in a toolbox of mechanisms for a regulator. It’s pretty extreme, but theoretically, it is a tool that is out there and could potentially be used. But really, no decisions have been made on that. This is something you would see as part of the regulations, most likely.”

In a 2021 Discussion Guide, the Heritage department recommended that Parliament consider appointing a Digital Safety Commissioner for “advice on content moderation.” The commissioner would provide guidance on content moderation and have the authority to issue compliance orders to platforms such as Facebook, Twitter and YouTube, with fines of up to $25 million for non-compliance. “This consultation is an important step.”

The bill has not yet been introduced.