Automating housing (in)justice: The promise and limits of ‘fair’ rent tech

By Tegan Cohen

Postdoctoral Research Fellow, Law School, Queensland University of Technology

Automated tools for rental practices promise convenience, objectivity and fairness. But can they avoid reproducing the deep structural inequity underpinning housing markets? In Australia and elsewhere, policymakers need to address the underlying political, social, and material conditions in which these seemingly ‘objective’ automated tools intervene.

The AI + Society Initiative is excited to continue amplifying the voices of emerging scholars from around the world through our bilingual blog series Global Emerging Voices in AI and Society, stemming from the Global AI & Regulation Emerging Scholars Workshops and the Shaping AI for Just Futures conference. This series showcases the innovative research and insightful discoveries presented during these events and beyond, to facilitate broad knowledge sharing and to engage a global network in meaningful interdisciplinary dialogue.

The following blog is authored by Tegan Cohen, who was selected by an international committee to present her paper “Automating housing (in)justice: The promise and limits of ‘fair’ rent tech” as part of the 2nd Global AI & Regulation Emerging Scholars Workshop.

Automating housing (in)justice: The promise and limits of ‘fair’ rent tech

Amid an acute rental crisis in Australia, concerns about the use of automated tools for rental practices such as tenant screening have grown. Rent tech firms often market these products with the promise of greater convenience, objectivity, and fairness. In a context plagued by deep structural inequity, the interventions of these technologies can never be merely technical or neutral. But can they work against injustice?

Fairness and facades in private rent tech

There are good reasons to be dubious about techno-solutionism in private rental contexts. Shifts to ‘automated landlordism’ seem to be reinforcing rather than reducing power imbalances. Around the world, examples of automated and algorithmic systems deployed for dataveillance, price gouging, stimulating bidding wars, and the mass eviction of tenants have been piling up for some time. Systems which supposedly predict a tenant’s ‘temper control’, dishonesty, or ‘hoarding’ tendencies, and which mine social media data to infer personality profiles, add new and problematic dimensions to what it means to be a ‘good tenant’.

At the same time, a lack of clear standards for translating ethical principles such as ‘fairness’ into practice has allowed ‘AI ethics’ to serve as an empty vessel for disparate values and objectives, or a Trojan horse for self-interested agendas. The problem of ‘ethics washing’ has sharpened calls to inject substance and rigour into ethical commitments.

In light of all this, my research explores the possibilities and limits of applying human rights standards to rent tech. Specifically, I ask: what standards does the human right to adequate housing imply for the use of AI and automated systems in the private rental sector? How can these normative standards be applied and enforced? And what contextual values, socio-political structures, and incentives work against their adoption?

Beyond ‘de-biasing’ rental practices

Critical scholarship on AI ethics has drawn out the limits of efforts to achieve fairness by ‘de-biasing’ technical systems. To understand and address unfairness, we must look at the power dynamics of the social context in which technical systems intervene.

In Australian private rental markets, renters have few options and very little power. At present, renters contend with record low vacancy rates and record high rents. A tiny fraction of available rental properties are affordable for low-income earners. In a market saturated with applicants, landlords and their agents have ample discretion to choose between prospective tenants. Discrimination is rife, pushing many to endure greater costs, instability, and lower housing standards to secure leases. Short leases and few restrictions on price increases contribute to unstable tenure. Low-income earners compete against high-income earners for scarce ‘low-price’ stock. Well-founded fears of retaliatory conduct by landlords dissuade tenants from asserting their legal rights.

Under these conditions, ostensibly ‘neutral’ or ‘de-biased’ models will still contribute to unfairness, rationalising and scaling unjust processes. Automated and algorithmic interventions into rental practices can only be ‘fair’ if they work toward reform of power disparities and priorities in Australian rental markets.

Legal scaffolding

Commercial rent tech firms have strong incentives to adopt ‘out-of-the-box’ fairness standards which uphold dominant objectives and power structures. Currently, Australian legal frameworks do little to motivate industry actors to look at the underlying sources of unfairness.

Tenancy laws in a few States deem certain practices unfair, such as soliciting rent bidding and imposing blanket bans on keeping pets. Anti-discrimination laws prohibit unequal treatment of protected groups. The Privacy Act imposes some constraints on collecting and using personal data (though exemptions, including for small businesses, provide escape hatches).

However, the broad discretion of landlords to allocate rental properties, and their power relative to applicants, often provide cover for deviations from the modest standards set by the law. Digital intermediaries, like rental screening applications, could play a transformative role by discouraging non-compliance through design and documenting unfair practices at individual and statistical levels.

Still, many forms and patterns of unfairness in the private rental context fall outside the narrow concepts codified in existing legal frameworks. Prohibitions on indirect discrimination are intended to counter systemic bias, but do not capture every kind of bias relevant to the private rental context. Other forms will evolve too quickly for the ‘bright line’ rulemaking which characterises tenancy regulation. Tinkering with these legal regimes might induce incremental improvement but is unlikely to incentivize transformative approaches in the rent tech industry.

Centering the social function of housing

We need thicker conceptions of fairness that centre the social function of housing and pay attention to the historical and structural conditions that produced inequity. We need not start from scratch: the right to adequate housing, as recognised and developed under international human rights law, implies standards and duties that can and should guide the development and use of AI in the housing context.

Reform will not be easy. The notion of land and houses as a prime vehicle for wealth accumulation is deeply ingrained in the nation’s economic, political, and cultural fabric. After claiming ownership of Aboriginal lands on the basis of a legal fiction, British colonisers proceeded to treat the stolen land as a key vehicle for speculation and wealth creation. By the mid-20th century, home ownership was firmly inscribed in settler-colonial constructions of ideal citizenship, while renting came to be framed as transient and irresponsible. Over the last 25 years, a unique combination of fiscal and tax policies, and an extended period of low interest rates, turbo-charged speculative investment in the housing market by individuals affectionately valorised as ‘mum and dad investors’. Renters are ‘generators of risk, reward and value’ for landlords, who are free to conduct due diligence and audits of tenants in order to guard against ‘investment risk’.

Reframing renters as rightsholders, and houses as homes, within the socio-political systems which shape landlord/tenant relations is a complicated undertaking. The right to adequate housing provides the objectives and principles that can guide such a shift. My research looks at how the right can be practically implemented in domestic legal frameworks to incentivise the development and deployment of rental technologies that work against, not for, structural inequity. Developing mechanisms and avenues for renters to collectively contest the operation and outputs of automated and algorithmic systems will be crucial to ensuring accountability and countering power imbalances.

Finally, while technology can play a role, addressing the underlying political, social, and material conditions which foster housing injustice requires more than automated intervention. If we are to imagine a transformative role for rent tech, however, we need to look further than quantitative metrics and easy solutions.

Key resources to learn more

Cohen, Tegan & Suzor, Nicolas P (2024). Contesting the Public Interest in AI Governance, Internet Policy Review. doi: 10.14763/2024.3.1794

Yeung, Karen, Howes, Andrew, & Pogrebna, Ganna (2019). AI Governance by Human Rights-Centered Design, Deliberation and Oversight: An End to Ethics Washing, in Dubber, M., Pasquale, F., & Das, S. (eds), The Oxford Handbook of Ethics of AI, Oxford University Press.

Phan, Thao, Goldenfein, Jake, Kuch, Declan, & Mann, Monique (2022). Economies of Virtue: The Circulation of ‘Ethics’ in AI, Institute of Network Cultures.

Hutchens, Gareth & Daly, Nadia (April 10, 2024). Australian renters face all-time high rents and record low vacancy rates after prices jump in March quarter, ABC News.

Anglicare Australia (2024). 2024: Rental Affordability Snapshot.

Maalsen, Sophia, Wolifson, Pete, Rogers, Dallas, Nelson, Jacqueline & Buckle, Caitlin (September 2021). Final Report No. 363: Understanding discrimination effects in private rental housing, AHURI.

Lee, Michelle Seng Ah, Floridi, Luciano, & Singh, Jatinder (2021). Formalising trade-offs beyond algorithmic fairness: lessons from ethical philosophy and welfare economics, AI and Ethics, Springer.

Power, Emma R. & Gillon, Charles (2019). Performing the ‘good tenant’, Housing Studies, 37:3, 459.

Hulse, Kath, Reynolds, Margaret, & Martin, Chris (2020). The Everyman archetype: discursive reframing of private landlords in the financialization of rental housing, Housing Studies, 35:6, 981.

Wood, A.J. (2016). Why Australia won’t recognise Indigenous customary law, The Conversation.

About the author

Dr. Tegan Cohen is a Postdoctoral Research Fellow at the Queensland University of Technology Law School. She is particularly interested in the benefits of, and possibilities for, collective rights and action to shape more just and equitable futures with AI. She is an Affiliate of the ARC Centre of Excellence for Automated Decision-Making and Society and a member of QUT’s Digital Media Research Centre. A descendant of Wiradjuri people, she currently lives in Meanjin (Brisbane) on Turrbal and Yugara land.


This content is provided by the AI + Society Initiative to help amplify the conversation and research around the ethical, legal and societal implications of artificial intelligence. Opinions and errors are those of the author(s), and not of the AI + Society Initiative, the Centre for Law, Technology and Society, or the University of Ottawa.