As housing gets more competitive, landlords and governments are outsourcing critical decisions to automated systems. But these tools often replicate old biases, just faster and at scale.
In the US, SafeRent’s AI tenant-screening tool consistently gave lower scores to Black and Hispanic renters, and to applicants using housing vouchers, a legal form of income assistance. This is what we call #AlgorithmicDiscrimination.
Report here: https://algorithmwatch.org/en/report-algorithmic-discrimination/
