When Banking Was Personal
Walk into any American bank in 1955, and something remarkable would happen: the manager would look up from his desk and call you by name. Not because he'd glanced at your account on a screen, but because he'd probably approved your father's business loan, knew where you worked, and had a pretty good idea whether you'd honor your commitments.
This wasn't small-town charm — it was how the entire American financial system operated. From Detroit to Dallas, getting credit meant convincing a human being that you were worth the risk. Your credit score? It didn't exist. Your financial fate rested on something far more fragile and powerful: your reputation.
The Handshake Economy
Jim Patterson needed $8,000 to buy his first house in suburban Cleveland in 1962. He walked into First National Bank on a Tuesday morning, sat down with loan officer Robert Mills, and spent an hour explaining his plans. Mills knew Jim's supervisor at the steel plant, had seen him coaching Little League on weekends, and remembered his father paying off a car loan two years early.
By Friday, Jim had his mortgage.
No credit report was pulled. No algorithm calculated his debt-to-income ratio. Mills made the decision based on what bankers then called the "Three C's": character (would Jim pay?), capacity (could he pay?), and capital (what could the bank recover if he didn't?).
This system wasn't perfect, but it was profoundly human. Bankers exercised judgment. They considered circumstances. A temporary setback didn't doom you forever, and a steady character could overcome modest income.
When Numbers Took Over
Everything changed after 1989, when the Fair Isaac Corporation introduced its general-purpose FICO score and began persuading America's largest lenders to adopt it. Suddenly, your entire financial life could be reduced to a number between 300 and 850. The promise was fairness: objective criteria that couldn't be bent by prejudice or favoritism.
The reality was more complicated.
By the mid-1990s, after Fannie Mae and Freddie Mac endorsed credit scores for mortgage underwriting in 1995, loan officers had become order-takers. Computer screens flashed green for "approved" or red for "denied" before the customer finished explaining their situation. The mortgage broker who had once spent an hour understanding your circumstances now spent three minutes entering data.
What We Gained
The new system delivered remarkable speed and consistency. Mortgage applications that once took weeks could now be processed in hours. Credit became portable — your score followed you from Maine to California, eliminating the need to rebuild relationships with every move.
More importantly, algorithmic lending reduced discrimination. The old boys' network that had systematically excluded women, minorities, and outsiders couldn't survive purely numerical assessment. A 750 credit score opened doors regardless of your last name or which side of town you lived on.
Credit also became accessible to millions who'd never qualified under the old system. Students, young professionals, and people without established community ties could now borrow based on their financial behavior rather than their social connections.
What We Lost
But something essential disappeared in the translation from judgment to algorithm. The system that once considered your character now measures only your transactions. Late payments caused by a medical emergency carry the same weight as those born of chronic irresponsibility. A divorce, job loss, or family crisis can crater your score regardless of the circumstances.
Worse, the new system created perverse incentives. Paying off loans early — once seen as admirable — can actually hurt your credit score. Closing old accounts damages your "credit history length." The algorithm rewards gaming the system over genuine financial responsibility.
Most fundamentally, we lost the ability to tell our story. The banker who once listened to your explanation now simply reads numbers on a screen. Your score determines not just whether you get credit, but where you can live, what job you can get, and sometimes even whom you can marry.
The Human Cost of Efficiency
Today's credit system processes millions of applications with ruthless efficiency, but it struggles with anything that doesn't fit the algorithm's assumptions. Self-employed entrepreneurs, recent immigrants, and people rebuilding after setbacks often find themselves trapped by scores that don't reflect their true financial capacity.
The irony is profound: we created a more "fair" system that's often less just, a more efficient process that's frequently less effective at predicting who will actually repay their loans.
Looking Back
The old system wasn't perfect. Personal relationships sometimes masked prejudice, and geographic mobility was harder when your credit couldn't travel with you. But there was something powerful about a financial system that treated people as individuals rather than data points.
In 1955, your banker knew your story. In 2024, your bank knows your score. We gained speed, consistency, and scale. We lost flexibility, understanding, and the simple human dignity of being known.
The question isn't whether we can return to the handshake economy — we can't and probably shouldn't. But as we continue automating human judgment out of our financial system, it's worth remembering what we're trading away: the irreplaceable value of being seen as more than the sum of our transactions.