About SIGNAL

Our Goal

To counter the growing threats of technology-facilitated gender-based violence, online radicalization, polarization, and digital inequity while advancing critical understanding of these phenomena. Through feminist, queer, anti-racist, and equity-driven approaches, we develop tools, practices, and policies that foster safer, more equitable digital environments and build socially just and technologically resilient futures.

Our Mission Statement

SIGNAL brings together an interdisciplinary network of researchers, practitioners, and community partners committed to examining how digital platforms enable hate, violence, and social polarization while developing strategies to counter these harms. Recognizing the scope and urgency of this challenge, we approach these critical issues through intersectional feminist methodologies, community-engaged research, and activist praxis. Our work is grounded in inquiry and centred on the lived experiences and expertise of those most affected by technology-facilitated violence. We combine scholarly analysis with actionable interventions to address both the complexity of digital harms and the imperative to build safer, more equitable digital futures.

Our Approach: The Three Pillars of SIGNAL

SIGNAL's work is guided by three interconnected principles embedded in our name: (1) Intersectional Gender-Justice, (2) Networked Action, and (3) Liberation.

Intersectional Gender-Justice

Intersectional gender-justice recognizes that systems of oppression operate simultaneously and interdependently in digital spaces, requiring analytical approaches that account for this complexity. To understand digital harms, we systematically document and analyze technology-facilitated gender-based violence, radicalization, and social polarization across digital platforms. Through comprehensive data collection and network analysis, we create open-access databases that map toxic media ecosystems, revealing how racism, misogyny, colonialism, ableism, and queerphobia intersect and compound online. We also develop innovative analytic tools for studying how emergent AI technologies perpetuate harm. Our research examines deepfakes, chatbots, image generators, and algorithmic recommendation systems through an intersectional lens, centring the experiences of women, 2SLGBTQIA+ communities, racialized groups, and other marginalized populations disproportionately targeted by these technologies.

Networked Action

Networked action recognizes that resistance requires connection, linking past struggles to present movements and academic knowledge to community organizing. To preserve resistance practices, we curate a comprehensive digital archive of feminist counter-practices and resistance strategies, housed at the Canadian Women's Archive. This living archive documents both historical and contemporary feminist media tactics, creating accessible blueprints for collective action and ensuring that crucial knowledge is not lost to platform censorship or digital erasure. We then translate research into actionable resources for educators, policymakers, community organizers, and the broader public: policy briefs, pedagogical toolkits, public workshops, and multimedia tools. Our work bridges academic inquiry and real-world intervention.

Liberation

To reach collective liberation, we must actively build communities of care. We foster coalitional communities through regular network events, workshops, and an annual summer institute. Recognizing the emotional toll of researching digital violence, we create spaces of collective care, mentorship, and support that sustain the scholars, activists, and practitioners doing this vital work. Across all of our objectives, we work toward liberation by building pathways to greater social cohesion, democratic participation, and digital equity. Our vision extends beyond harm reduction to imagining and creating more just, sustainable, and liberatory digital futures for all.