When a landlord ran an automated background screening report on a prospective tenant in 2018, the results were surprising to say the least. The check returned a string of drug-related charges, including jumping bail and selling meth, plus a DWI and disorderly conduct. The real injustice was that none of the charges related to the applicant. Growing demand in the rental market since 2008 has also produced a boom in the demand for personal data, and the renter screening industry is now valued at around $1 billion. Approximately 90% of landlords use these convenient reports, yet convenience can come at the cost of compromise. Cheap automated background screening does not guarantee accuracy or integrity. The background check can, at times, be indiscriminate at best: the search, by default, is designed to scrape together negative information.
A subject with a common name can trigger numerous matches simply because of a popular surname, with records returned from states and counties that have nothing to do with the applicant. The prospective renter in this case got the apartment, but only after convincing the landlord that she had no connection to the five other women in the report. The automated background screening report was flawed, and it fell to the applicant to set the record straight. The number of U.S. citizens suffering hardship as a result of incorrect reporting is a growing cause for concern. Firstly, the clear fault lies with the company churning out the data. Secondly, the failure of any human to analyse or question the returned data is disappointing in itself, although if a landlord pays for the search, how compelled do they really feel to question the results? Thirdly, federal lawsuits against screening companies are at record levels. In short, record numbers of Americans are being denied appropriate housing.
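To see why a common name alone is such a weak identifier, the hypothetical sketch below contrasts name-only matching with matching that also requires date of birth and jurisdiction. Every record, name and field in it is invented for illustration; it is not taken from any screening company's actual system or matching rules.

```python
# Hypothetical illustration of why name-only matching produces false positives.
# All records and names below are invented.

from dataclasses import dataclass

@dataclass
class CourtRecord:
    name: str
    date_of_birth: str   # "YYYY-MM-DD"
    state: str
    charge: str

RECORDS = [
    CourtRecord("Jane Smith", "1990-04-12", "TX", "drug possession"),
    CourtRecord("Jane Smith", "1961-07-03", "OH", "DWI"),
    CourtRecord("Jane Smith", "1985-11-30", "FL", "disorderly conduct"),
]

def match_by_name_only(applicant_name: str) -> list[CourtRecord]:
    """Naive matching: anyone sharing the full name counts as a 'hit'."""
    return [r for r in RECORDS if r.name.lower() == applicant_name.lower()]

def match_with_identifiers(applicant_name: str, dob: str, state: str) -> list[CourtRecord]:
    """Requiring date of birth and jurisdiction to agree removes most false hits."""
    return [
        r for r in RECORDS
        if r.name.lower() == applicant_name.lower()
        and r.date_of_birth == dob
        and r.state == state
    ]

# An applicant named Jane Smith, born in 1992 in Colorado, has no record at all,
# yet name-only matching attributes three strangers' charges to her.
print(len(match_by_name_only("Jane Smith")))                          # 3 false positives
print(len(match_with_identifiers("Jane Smith", "1992-02-01", "CO")))  # 0
```

The point of the contrast is simply that a report built from the first function looks damning, while one built from the second is empty; a human reviewer asking for corroborating identifiers catches the difference in seconds.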
Automated Background Screening needs a human touch
Renter screening reports are generated in seconds, but they are only as reliable as the available data. Tenants typically have no choice but to pay for the report. Court records and interviews confirm that these reports seldom have a human qualify their content. The screening companies RealPage and On-Site have been the subject of complaints. In one case, the errors forced an applicant into a motel room for a year, a room the father had to share with his young daughter, after he was wrongly reported as a heroin dealer and a sex offender. Another applicant was left homeless because of inaccurate reporting by On-Site. A further automated background screening report flagged an applicant as a sex offender: TransUnion Rental Screening Solutions had in fact reported on someone with the same name but 30 years older. Rental demand is strong and competition for housing correspondingly fierce, and many renters never learn that the report itself was the reason for rejection.
Some automated background screening companies sell up to 20,000 reports a month. The Consumer Data Industry Association denies any systemic problem, preferring instead to point the finger at consumer lawyers. Companies say fewer than 1% of renters dispute their reports, but even an error rate of 1% across roughly 43 million renters adds up to a great deal of hardship, as the quick calculation below shows. There is also no way of knowing how many renters ever realise that incorrect reporting was the problem. What are my options? Complain to the Consumer Financial Protection Bureau or the Federal Trade Commission, or litigate.
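As a rough illustration of that scale, the back-of-envelope calculation below applies the industry's own 1% figure to the 43 million renters cited above; the inputs are the article's, not new data.

```python
# Back-of-envelope estimate: a 1% error/dispute rate applied to the
# 43 million renters mentioned above.
renters = 43_000_000
error_rate = 0.01

affected = int(renters * error_rate)
print(f"{affected:,} renters")  # 430,000 renters
```

Even under these conservative assumptions, hundreds of thousands of people could be affected by reports they may never have seen.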