The overall goal of this project is to build the reader's understanding of how surveillance technology and AI are used, their consequences, and what we, the public, can do to protect our rights and push back against biased algorithms. Specifically, we dive into the use of facial recognition (FR) algorithms in policing and how they disproportionately impact Black and brown communities.
See our interactive learning module here: "Surveillance technology is here. What does this mean for you?"
Our target audience is adults outside the tech industry. When creating this module, we had in mind our friends and family who do not work in tech and aren't as familiar with these concepts as we are. We also wanted to reach those interested in learning more about racial justice issues related to policing and surveillance.
After completing the module, we hope readers come away with:

- An understanding of the basics of surveillance technology and how it's used
- A basic understanding of facial recognition technology and its effectiveness
- The knowledge that this technology is already out there and affecting people’s lives, especially Black and brown people
- The sense that these stories of surveillance technology are not far away; they are happening close to us and in our communities
- An understanding of where legal regulations exist for this technology
- Concrete actions that readers can personally take to push back against this technology
This learning module was created by graduate students at Northwestern University. If you have any questions or feedback, please feel free to reach out to us: