Snug is a real estate app that connects tenants with landlords using an algorithmic Snug Match Score, which aims to deliver high-quality matches between the parties. However, concerns have been raised about whether the app's algorithm facilitates breaches of Victorian and Queensland rental laws, under which it is illegal to solicit offers above the advertised rental price.
In addition, former CHOICE campaigner @LindaPrzhedetsky (now a researcher in rental technologies and an associate professor at the Human Technology Institute of the University of Technology, Sydney) highlights the potential for apps that use personal attributes to enable serious discrimination, which is also prohibited under the law.
From the above Guardian article:
“There is no transparency about how the Match Score is calculated,” Przhedetsky said. “We don’t know if their Match Score is calculated fairly. We don’t know how their algorithm is using renters’ data.”
Przhedetsky goes on to point out:
Apps had been used in other countries to find and scan renters’ social media accounts for “red flags” … tenancy screening technology overseas had also shown the potential for discrimination on the basis of “proxy data” – information that seems neutral but correlates strongly with a protected attribute, such as living in a suburb that has a high proportion of people with a particular racial background.
Here’s what Snug had to say:
Snug declined to disclose what data points went into the Match Score, saying only it was “based on far less personal information than would be collected in an old style application”. It was “just one indicator within a complete rental application that property managers may consider”.
“Snug does not capture social activity data, nor does it generate ‘proxy data’,” Butterworth said.
“Snug has no foreseeable plans to implement further permissible data attributes contained conceptually in the patent application.”
There’s a lot more detail in the article if you’re interested; it’s located here:
So, three big discussion points in this case study:
The potential for price manipulation
The potential for discrimination, along with general issues around tech transparency and housing
The general issues of data privacy and security, and the control we have over our own data and how it is used
We’re focused on the potential risks and concerns here for obvious reasons. However, are there also potential benefits to this type of technology? For example, does it present an opportunity to improve protections around the issues we are discussing by making markets more accessible and transparent? Are there other improvements for consumers, existing or potential, such as better access to housing markets or fewer limitations than traditional rental arrangements (for example, limited or inconvenient inspection times)?
What are your thoughts?