Snug rental app: Data privacy and fairness case study

Snug is a real estate app that connects tenants with landlords using an algorithmic Snug Match Score, which aims to deliver high-quality matches between the parties. However, concerns have been raised about whether the app’s algorithm facilitates breaches of Victorian and Queensland rental laws, under which it is illegal to solicit offers above the advertised rental price.

In addition, former CHOICE campaigner @LindaPrzhedetsky (now a researcher in rental technologies and an associate professor at the Human Technology Institute of the University of Technology, Sydney) highlights the potential for apps that use personal attributes to enable serious discrimination on grounds that are protected under the law.

From the above Guardian article:

“There is no transparency about how the Match Score is calculated,” Przhedetsky said. “We don’t know if their Match Score is calculated fairly. We don’t know how their algorithm is using renters’ data.”

Przhedetsky goes on to point out:

Apps had been used in other countries to find and scan renters’ social media accounts for “red flags” … tenancy screening technology overseas had also shown the potential for discrimination on the basis of “proxy data” – information that seems neutral but correlates strongly with a protected attribute, such as living in a suburb that has a high proportion of people with a particular racial background.
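To make the “proxy data” point concrete, here is a minimal, entirely hypothetical sketch (the data and postcodes are invented and have no relation to Snug’s actual algorithm): a scorer that sees only an applicant’s postcode can still produce outcomes that differ by protected group, because postcode and group membership are correlated.

```python
# Illustrative only: invented data showing how a "neutral" feature
# (postcode) can act as a proxy for a protected attribute.

# Hypothetical applicants: (postcode, in_protected_group, approved_by_score)
# The scorer never sees group membership -- only the postcode.
applicants = [
    ("3000", False, True), ("3000", False, True), ("3000", True, True),
    ("3999", True, False), ("3999", True, False), ("3999", False, False),
]

def approval_rate(in_group):
    """Approval rate for applicants inside/outside the protected group."""
    subset = [a for a in applicants if a[1] == in_group]
    return sum(a[2] for a in subset) / len(subset)

# Even though the score only used postcode, outcomes differ by group,
# because in this (invented) data postcode correlates with group membership.
print(approval_rate(False))  # non-protected group: 2/3 approved
print(approval_rate(True))   # protected group: 1/3 approved
```

The point of the sketch is that removing the protected attribute from the inputs does not remove the discrimination if a correlated proxy remains.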

Here’s what Snug had to say:

Snug declined to disclose what data points went into the Match Score, saying only it was “based on far less personal information than would be collected in an old style application”. It was “just one indicator within a complete rental application that property managers may consider”.

“Snug does not capture social activity data, nor does it generate ‘proxy data’,” Butterworth said.

“Snug has no foreseeable plans to implement further permissible data attributes contained conceptually in the patent application.”

There’s a lot more detail in the article if you’re interested; it’s located here:

So, three big discussion points in this case study:

  • The potential for price manipulation

  • The potential for discrimination, along with general issues around tech transparency and housing

  • And also the general issues of data privacy and security, and the control we have over our own data and how it is used

We’re focused on the potential risks and concerns here for obvious reasons. However, are there also potential benefits to this type of technology? For example, does it present an opportunity to improve protection around the issues we are discussing by making markets more accessible and transparent? Are there other improvements for consumers, either existing or potential, such as better access to housing markets or fewer limitations than traditional rental arrangements (such as limited or inconvenient inspection times)?

What are your thoughts?


I see from the Guardian article that:

A patent application filed by Snug in 2018, which is still active, suggested that the company’s intention was to collect information from users that included friend lists, social media networks and ratings on third-party platforms such as Airbnb and Uber, and to develop a kind of rental credit system.

So if you don’t use those other systems, you might have trouble getting a decent ‘ranking’? And if your friends are considered ‘not quite up to scratch’, does your rating go down?

It also states that:

Attributes listed for renters include their claims history, details of income and employment, and “3rd party ratings from alternate sources (e.g. Facebook, Airbnb, eBay, Uber, Linkedin)”.

The fact that you have made a claim against a landlord in the past should bear little, if any, relationship to your likelihood of making a claim in the future, and I suspect there are laws in some jurisdictions against using such information. There are some pretty bad landlords out there; don’t assume that every claim is indicative of a troublesome tenant.
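To illustrate the worry, here is a hypothetical weighted score built from the kinds of attributes the patent lists. The weights, scale, and attribute names are all invented for this sketch; nothing here reflects how Snug’s Match Score actually works.

```python
# Hypothetical sketch only: invented weights and attribute names,
# not Snug's actual algorithm. Each attribute is normalised to [0, 1],
# where higher is "better" (e.g. claims_history = 1.0 means no claims).
WEIGHTS = {
    "income_stability": 0.4,
    "claims_history": 0.3,       # penalises past claims, as discussed above
    "third_party_ratings": 0.3,  # e.g. Airbnb/Uber ratings, if any exist
}

def match_score(attrs):
    """Weighted sum of normalised attributes, scaled to 0-100.

    Missing attributes default to 0 -- which is exactly the concern:
    a renter with no presence on those platforms scores as if they
    had the worst possible rating there.
    """
    total = sum(WEIGHTS[k] * attrs.get(k, 0.0) for k in WEIGHTS)
    return round(100 * total)

# Identical renters, except one has no Airbnb/Uber/eBay footprint.
with_ratings = match_score({"income_stability": 0.9, "claims_history": 1.0,
                            "third_party_ratings": 0.8})
without_ratings = match_score({"income_stability": 0.9, "claims_history": 1.0})
print(with_ratings, without_ratings)  # 90 vs 66
```

Under this (invented) weighting, simply not using those third-party platforms knocks 24 points off the score, which is the structural problem regardless of the exact numbers chosen.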

I do not find the company’s Privacy Policy particularly enlightening or comforting. It feels like boilerplate, trying to give users the warm and fuzzies but with enough loopholes that the company can do all sorts of things with the personal information it collects. The Terms of Service are similarly unhelpful to the end user, but that’s fine because 98% of end users will never read either of these policies.

One final, pedantic, annoyance about the company’s website. This is an Australian company that presumably hopes to break into the global market but does not appear to have done so yet. Why, then, does the URL of the Help page (which refers specifically to “tenancy information for Australian states and territories”) identify itself as English (US), via the /en-us path segment?


Having a quick look, this seems to be a repository of renter information, used in much the same way that credit reference agencies are used when lending money.

But what information could be there? What could determine your score?

Late with the rent a few times? Having a pet (perfectly allowable in Vic)? Pestering the landlord/agent to get essential things fixed?

We already have a serious rental crisis in Australia without more crooks jumping in to make money out of the situation. And I do say crooks, because if the score goes up when more money is offered, then that is clearly soliciting higher offers in a roundabout way. And that is illegal in Vic.