Rethinking Design: Prioritizing Safety in Technology

Antiracist economist Kim Crayton powerfully states, “intention without strategy is chaos.” While the desire to create technology that prioritizes safety is commendable, mere goodwill isn’t sufficient to drive meaningful change. What’s essential here is the development of a concrete strategy that addresses the inherent biases and omissions that often lead to the mistreatment of marginalized and vulnerable populations. This article aims to guide you through implementing a structured plan for embedding safety principles in your tech design. We’ll explore strategies to articulate to stakeholders the critical need for prioritizing safety, while also tackling the common critique that merely increasing diversity within teams will resolve these ethical concerns. Spoiler alert: diversity is necessary but insufficient by itself to eliminate the risks associated with unethical tech practices.

Constructing a Framework for Inclusive Safety

When designing with safety in mind, your objectives should be threefold: first, pinpoint how your product could potentially facilitate abuse; second, devise methods to thwart those abusive tendencies; and third, ensure support systems are in place for vulnerable users, empowering them to regain authority and autonomy.

To achieve these goals, I've developed a methodology known as the Process for Inclusive Safety, born from years of experience in creating safety-focused designs. This framework will be invaluable whether you're launching a new product or enhancing an existing one. The methodology encompasses five key phases:

- Conducting thorough research
- Creating user archetypes
- Identifying possible abuse scenarios
- Formulating design solutions
- Testing for safety

This process isn't rigid; view it as a flexible toolkit that you can tailor to fit your own design context. You may not find it necessary to adopt every step rigidly. Instead, adapt the elements that resonate with your project's specific needs. Importantly, as you implement this process, I encourage you to share feedback or suggestions to enhance its utility. This isn't a static guide; it's a dynamic resource meant to evolve alongside technological advancements.

If your work is focused on developing products for particularly vulnerable populations, such as apps for survivors of domestic violence or addiction, make sure to check out Chapter 7. That section dives into specialized considerations for these sensitive cases, which differ from the strategies used for more general products.

Step 1: Conduct Thorough Research

Kick off your design project with a comprehensive evaluation of how your technology might be misused, alongside an understanding of the experiences of both survivors and perpetrators. Your research should delve into potential harm, addressing issues like security flaws, algorithmic bias, and user harassment. Broadly, your initial research should consider existing literature on similar products and the associated ethical implications. For instance, if you're designing a smart thermostat, take time to study how prior models have been exploited in abusive situations. If your product incorporates AI, scrutinize how biases have surfaced in other AI implementations. Resources like Google Scholar can guide you in locating relevant academic work documenting these concerns.

Next, aim to engage directly with individuals who have experienced that particular form of harm. Speak with advocates and professionals in relevant fields to deepen your understanding without causing further trauma to survivors. If you discover issues related to domestic violence, for instance, interviewing individuals at shelters or hotline services can provide insight into survivor needs and experiences.

It's crucial to recognize the ethical sensitivities in this research phase, particularly with survivors. They deserve compensation for sharing their insights; consider offering payment or making a donation to organizations that support victims as part of your initial outreach.

Specific Research: Understanding the Perpetrator

While it may not be feasible to interview self-identified abusers, your general research should focus on how bad actors leverage technology to inflict harm. Investigate their tactics—how they conceal their actions, rationalize abuse, and exploit your product's design. This understanding is critical for building protective measures.

Step 2: Develop Archetypes

With research in hand, create archetypes for both the abuser and the survivor. These representations are driven by insights gleaned from your research rather than direct personal narratives. Think of archetypes as broader syntheses of potential user experiences, much like what we do when designing for accessibility.

The abuser archetype serves as an illustration of individuals who weaponize technology. For instance, envision a character whose motivation is to surveil or manipulate their partner through digital means. Conversely, the survivor archetype depicts someone navigating the challenges of abuse. Consider their perspectives: Are they aware of the abuse, or are they oblivious and require intervention to clarify their situation? You might find it beneficial to create several survivor archetypes, capturing variations in their experiences and levels of awareness. This practice allows you to design solutions that directly respond to the distinct needs of different survivors, whether they seek proof of abuse or strategies to escape a harmful situation.

Reflecting on user goals is crucial; abusers want to perpetuate harm, while survivors are desperately seeking safety and control. As you brainstorm solutions, keep these objectives at the forefront of your design process.
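As a purely illustrative sketch, you might capture archetypes as lightweight structured records so the whole team works from the same synthesis. The fields and example values below are my own assumptions, not something the process prescribes:

```python
from dataclasses import dataclass, field


@dataclass
class Archetype:
    """A research-driven synthesis of a class of users, not a real person."""
    name: str
    role: str                  # "abuser" or "survivor"
    goal: str                  # what this archetype is trying to achieve
    tactics_or_needs: list[str] = field(default_factory=list)


# Illustrative archetypes echoing the goals described above.
abuser = Archetype(
    name="Controlling partner",
    role="abuser",
    goal="surveil or manipulate a partner through digital means",
    tactics_or_needs=["shared-account access", "location tracking"],
)

survivor = Archetype(
    name="Survivor seeking proof",
    role="survivor",
    goal="document the abuse and regain control",
    tactics_or_needs=["a log of account changes", "a clear path to lock out the abuser"],
)
```

Because survivors vary in awareness and circumstance, you would typically create several survivor records, one per distinct set of needs.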

Step 3: Identify Abuse Scenarios

After establishing archetypes, the next step is to brainstorm possible abuse scenarios, ideally those that have not emerged from prior research. This process requires creativity and possibly some unorthodox thinking. Set aside dedicated time for your team to break down potential misconduct and envision unprecedented risks. Consider employing a brainstorming technique inspired by the show Black Mirror, which highlights dystopian technological consequences. Use this as a springboard to explore the most extreme and inventive abuse cases associated with your product. And while exploring these darker aspects, don’t shy away from having fun—creativity is a big part of safety design. You should grant yourselves enough time during these sessions—four hours, at a minimum—to thoroughly consider how your product might be manipulated. Accept that it won’t be possible to foresee every potential misuse, but recognize when you’ve put in a strong effort and commit to prioritizing safety post-launch. Feedback from users can unearth overlooked vulnerabilities that can be addressed in future iterations.
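One low-tech way to seed such a session, offered here only as a sketch, is to mechanically cross your product's features with each abuser archetype's goal and treat every pairing as a discussion prompt. The feature and goal lists below are invented examples, not drawn from any real product:

```python
from itertools import product

# Invented example inputs; substitute your own features and archetype goals.
features = ["location sharing", "shared account", "settings history"]
abuser_goals = ["surveil a partner", "lock a partner out of the home"]

# Every feature/goal pairing becomes one brainstorm prompt.
prompts = [
    f"How could someone trying to {goal} misuse {feature}?"
    for feature, goal in product(features, abuser_goals)
]
```

A grid like this won't surface novel scenarios on its own, but it guarantees no feature/motivation pairing goes undiscussed.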

Step 4: Design Solutions

Armed with a detailed understanding of abuse scenarios and user archetypes, you're prepared to develop design strategies that counter the identified risks. This stage is also when you align your safety objectives with existing design discussions, seeking ways to mitigate potential harm while bolstering survivor support initiatives. Questions to guide your design thinking include:

- Can the design eliminate identified risks entirely?
- How can you inform users about abuse occurring through your product?
- What steps can you offer survivors to help them solicit assistance while maintaining privacy?

Some products may allow you to proactively address abuse; for example, a health app could integrate features for users to report assault, triggering resources. However, caution is crucial: avoid putting users in jeopardy by inadvertently revealing their actions to an abuser.
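To make that last caution concrete, here is a minimal sketch, assuming a hypothetical shared activity log, of one way to keep safety-sensitive screens out of any history that another person on the same account could browse. The screen names are invented for illustration:

```python
# Screens whose visits must never appear in shared history;
# these names are hypothetical, not from any real product.
SAFETY_SENSITIVE_SCREENS = {"report_assault", "support_resources"}

activity_log: list[str] = []  # history visible to every account member


def visit_screen(screen: str) -> None:
    """Record a screen visit, deliberately skipping safety-sensitive
    screens so a survivor's search for help leaves no trace."""
    if screen not in SAFETY_SENSITIVE_SCREENS:
        activity_log.append(screen)


visit_screen("dashboard")
visit_screen("support_resources")  # intentionally left out of the log
```

The same care would apply to push notifications, confirmation emails to shared addresses, and any "recently viewed" feature.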

Step 5: Test for Safety

The penultimate phase requires rigorous testing of prototypes from both the abuser's and the survivor's perspectives. This process should help unveil vulnerability gaps and affirm the effectiveness of your safety designs. Prioritize usability testing alongside safety evaluation, though be wary of over-reliance on your own insights at this stage. Instead, enlist external testers unaware of your design nuances to offer a genuine assessment of the user experience. Consider testing from both perspectives: how might an abuser manipulate the product effectively, and how can a survivor navigate and overcome these threats? Addressing safety from both ends helps you develop a resilient design.

Creating a culture of awareness and concern for user safety within tech design is no small feat, but with careful planning and strategic implementation, you can make significant strides toward ethically responsible tech.

The Critical Importance of Compassionate Design

When developing products, it's essential to consider not just the user's intent but also their vulnerabilities. Take the example of a fitness app with GPS features; an abuser might exploit this tool to track an ex-partner, employing whatever means necessary to breach privacy settings. If you find that someone can ascertain another person's location despite security measures being in place, it's clear your app is facilitating stalking. The next immediate step is to return to the design phase and prioritize solutions that safeguard user privacy. This cycle of testing and refining your design is not just recommended; it's imperative if you're serious about ensuring your product isn't weaponized.

Empowering Survivors through Thoughtful Testing

Moving to survivor testing, it's crucial to identify how you can empower individuals who have faced harassment or stalking. Interestingly, addressing the needs of vulnerable users often intersects with thwarting the goals of the abuser archetype. The survivor, for instance, wants to remain safe from stalking, and designing for that safety can also obstruct an abuser's efforts. In some instances, such as with a smart thermostat, survivors need the capability to investigate alterations in their environment that they haven't initiated. Testing should include scrutinizing whether users can access a log of changes made to the thermostat settings and whether the process for regaining control over their device is straightforward. If these pathways aren't clear, there's still work to be done.

Enhancing Inclusivity with Stress Testing

Then there's the concept of stress testing, which takes into account how various pressures can influence user behavior. As highlighted by Eric Meyer and Sara Wachter-Boettcher in *Design for Real Life*, many personas in user experience design assume ideal circumstances. However, real-life users often grapple with challenges like anxiety or overwhelming situations. That's why embedding stress-case scenarios into your product testing is essential. This approach can unveil shortcomings in your design that might otherwise go unnoticed. By engaging with real emotional experiences and potential crises users face, you not only make your product more inclusive but also build a compassionate framework that acknowledges the complexities of human experience.

Looking Ahead: A Triad of Safety, Empowerment, and Consideration

Reflecting on these principles, from addressing abuser tactics to empowering survivors to including stress scenarios, sets a foundation for a more humane technological landscape. If you're working in tech, recognizing that your creations significantly impact human lives should guide your design thinking. Ultimately, the value of your product lies not in its features alone but in its ability to foster safety, autonomy, and consideration for those who use it.
The tools you build have the potential to uplift or harm, and being mindful of this responsibility will only serve to enrich both your work and the lives of your users.