ai-ethics · 8/20/2025
Code them with Kindness
Treating kindness not as garnish but as groundwork.
Beyond Accuracy and Latency: Designing with Care
In tech, we obsess over accuracy and latency — how precise a system is and how fast it responds. Those are crucial, but they’re not the whole story. What if we treated care as a design constraint with the same weight?
Care changes the roadmap. It reshapes reviews. It asks us to pause and consider: what is the human cost of this system working, or failing?
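To make that concrete, here is a minimal sketch of what "the same weight" could mean in practice: a launch gate where care metrics can block a release exactly as an accuracy or latency regression would. Every name and threshold below is invented for illustration.

```python
# A hypothetical launch gate: care metrics sit next to accuracy and
# latency as first-class release criteria. All names and numbers are
# invented for this sketch.

LAUNCH_CRITERIA = {
    "accuracy_min": 0.92,            # the classic quality bar
    "latency_p95_ms_max": 250,       # the classic speed bar
    "consent_coverage_min": 0.99,    # share of training data with documented consent
    "subgroup_error_gap_max": 0.03,  # worst-group minus best-group error rate
}

def ready_to_ship(metrics: dict) -> bool:
    """A release ships only if every bar, including the care bars, is met."""
    return (
        metrics["accuracy"] >= LAUNCH_CRITERIA["accuracy_min"]
        and metrics["latency_p95_ms"] <= LAUNCH_CRITERIA["latency_p95_ms_max"]
        and metrics["consent_coverage"] >= LAUNCH_CRITERIA["consent_coverage_min"]
        and metrics["subgroup_error_gap"] <= LAUNCH_CRITERIA["subgroup_error_gap_max"]
    )
```

The specific numbers matter less than the structure: if consent coverage or the subgroup error gap misses its bar, the release stops, just as it would for a latency regression.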
Inputs: Whose Data, Whose Consent?
The raw material of modern AI is human data: conversations, clicks, browsing histories. Care requires us to ask:
- Was this data given knowingly?
- Can contributors opt out or revoke consent later, and do they actually know how? (One way to enforce revocation in a pipeline is sketched after this list.)
- Are the sources diverse enough to avoid encoding narrow worldviews as defaults?
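A hypothetical way to honor the revocation question in code: treat consent as a field that travels with every record and is re-checked at every retraining, so a withdrawal actually removes data from future runs. The `Record` schema and field names here are invented for the sketch; a production consent system would also need audit trails and deletion guarantees.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """A training example with provenance. All fields are hypothetical."""
    text: str
    source: str
    consent_given: bool      # the contributor knowingly agreed
    consent_revoked: bool    # the contributor later withdrew

def consented_only(records: list[Record]) -> list[Record]:
    """Keep only data that was given knowingly and never revoked.

    Revocation is honored at every retraining, not just at collection time.
    """
    return [r for r in records if r.consent_given and not r.consent_revoked]

corpus = [
    Record("good morning", "forum_export", consent_given=True, consent_revoked=False),
    Record("my medical history...", "scraped_blog", consent_given=False, consent_revoked=False),
]
print(len(consented_only(corpus)))  # -> 1; the scraped record never enters training
```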
Outputs: Failure Modes That Harm Softly or Loudly
A system doesn’t just succeed or fail — it fails in specific ways. Some are quiet (skewed recommendations, biased rankings), others loud (wrong medical advice, offensive outputs). Care means designing so that failures don’t disproportionately burden the most vulnerable.
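A common practice that supports this, sketched here with invented data, is disaggregated evaluation: compute failure rates per group rather than one aggregate number, so quiet and unevenly distributed failures become visible.

```python
from collections import defaultdict

def error_rates_by_group(examples) -> dict[str, float]:
    """Disaggregate the error rate per user group.

    `examples` is an iterable of (group, prediction, label) tuples;
    the schema is invented for this sketch.
    """
    errors, totals = defaultdict(int), defaultdict(int)
    for group, pred, label in examples:
        totals[group] += 1
        errors[group] += int(pred != label)
    return {g: errors[g] / totals[g] for g in totals}

rates = error_rates_by_group([
    ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 0),
])
print(rates)  # {'group_a': 0.0, 'group_b': 0.5}
```

In this toy example the aggregate accuracy is 75%, which sounds fine until you see that one group bears every error.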
Process: Governance That Includes the Impacted
Who gets a seat at the table when systems are reviewed? If it's only engineers and executives, blind spots multiply. Care brings in the people most likely to be affected (workers, patients, students, marginalized groups), not as an afterthought but as stakeholders with power.
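One way to encode "a seat at the table" structurally, rather than as a good intention, is a review gate that cannot approve a release until an impacted-community representative has signed off. The roles and the rule below are illustrative, not an established standard.

```python
from dataclasses import dataclass, field

@dataclass
class ReviewBoard:
    """A review gate that cannot pass without impacted-community sign-off.

    Role names and the approval rule are invented for this sketch.
    """
    signoffs: set[str] = field(default_factory=set)
    required_roles: frozenset = frozenset(
        {"engineering", "legal", "impacted_community_rep"}
    )

    def sign(self, role: str) -> None:
        self.signoffs.add(role)

    def approved(self) -> bool:
        # Missing any required voice, including the impacted group's, blocks release.
        return self.required_roles <= self.signoffs

board = ReviewBoard()
board.sign("engineering")
board.sign("legal")
print(board.approved())  # False: no impacted-community representative has signed yet
```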
Ethics isn’t a mood board; it’s a set of defaults. Let’s set better ones. 🪴