The dream of homeownership

 

Let’s suppose you are a scientist with a Ph.D. in biology who works full-time as a university professor. Let’s suppose you are the child of immigrants who settled in a working-class neighborhood, a slum, or a ghetto. You studied, worked, and saved as much as possible throughout your life to get ahead. To keep saving, you continue to live in the humble neighborhood where you were born. Now you have a good amount of money set aside and decide to pursue the dream of owning your own home somewhere else. To do this, you go to your bank of choice and apply for a loan to buy or build your new home.

 

Your credit score is good, and you have never had a late payment; this should be just another formality. However, the final word on whether you are granted the loan no longer rests with the person serving you at the bank; instead, they enter and validate various data about you in their computer, which then tells them whether or not they can grant you the loan.

 

To your surprise, and to that of the person serving you, the bank, through its software, rejects your loan application. You leave pensive, without really understanding why that decision was made, since you have already done your part for society.

 

The problem could be that the algorithm behind the bank’s software has learned from historical data that most people who live in that neighborhood and apply for a loan tend not to repay it, or not on time and in full without litigation, often because they lack a stable, formal job. The algorithm knows too much, implicitly generating its own bias, and because of the weight it assigns to each variable, it raises a wall between you and the dream of owning your own home.
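To make that mechanism concrete, here is a minimal sketch of how it can happen, written in Python with scikit-learn. Nothing in it comes from any real bank: the features, the numbers, and the data are all invented for illustration. A model is trained on historical records in which applicants from one neighborhood rarely repaid, and it can end up rejecting an applicant with an otherwise spotless profile simply because of where they live.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Invented features: a normalized credit score, normalized years in a
# formal job, and a flag for living in the stigmatized neighborhood.
credit_score = rng.uniform(0, 1, n)
formal_job_years = rng.uniform(0, 1, n)
in_neighborhood = rng.integers(0, 2, n)

# Invented historical outcomes: in the past, applicants from that
# neighborhood rarely repaid (they lacked stable formal jobs), so the
# recorded labels are heavily skewed against the neighborhood itself.
p_repaid = 0.1 + 0.1 * credit_score + 0.1 * formal_job_years + 0.65 * (1 - in_neighborhood)
repaid = rng.uniform(0, 1, n) < p_repaid

X = np.column_stack([credit_score, formal_job_years, in_neighborhood])
model = LogisticRegression().fit(X, repaid)

# The learned weights show how much the neighborhood flag dominates.
print("weights [credit, job, neighborhood]:", model.coef_[0])

# A spotless applicant -- top credit score, long formal employment --
# who happens to live in that neighborhood.
applicant = np.array([[1.0, 1.0, 1.0]])
print("approved:", bool(model.predict(applicant)[0]),
      "| estimated probability of repayment:", model.predict_proba(applicant)[0, 1])

In this toy version the weights can at least be printed and inspected; the black boxes discussed below do not offer even that much visibility.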

 

You could go back to the bank and demand an explanation. The person serving you will re-enter all your data into the system, and the result will be the same. You ask for a real explanation, but unfortunately, no one can give one. You only hear excuses.

 

The problem is not you. The problem could lie in the data that fed the algorithm in the first place. The problem could be allowing a single variable, such as your current address, to derail the entire operation. The problem would then have been created by the people who developed the software. The problem becomes the algorithms that act as black boxes: the ones whose inner workings neither the public nor, often, even their owners can see.

 
