America increasingly has a genuine love/hate relationship with credit cards.

On the upside, credit cards save consumers repeated trips to the bank or to an ATM to draw cash out for goods and services. One swipe of a card at a retailer is all it takes – after that, you go home with the purchase, all easy-peasy.

On the downside, credit cards aren’t free: they charge interest on balances cardholders carry, and that interest – often as high as 25% or even 30% – can really add up. In fact, many a consumer has filed for Chapter 7 bankruptcy under a mountain of credit card debt.
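To see how quickly that interest adds up, consider a minimal back-of-the-envelope sketch in Python. It assumes monthly compounding at a 25% APR and a hypothetical 2% minimum payment with a $25 floor; real issuers typically compute interest on the average daily balance, so the numbers are illustrative only.

```python
# Illustrative only: a simplified model of how carried credit card debt grows.
# Assumptions: 25% APR, simple monthly compounding, 2% minimum payment with a
# $25 floor. Real issuers use average-daily-balance math, so exact figures vary.

APR = 0.25                # assumed annual percentage rate
MONTHLY_RATE = APR / 12   # roughly 2.08% per month
MIN_PAYMENT_PCT = 0.02    # hypothetical minimum payment rate

balance = 5_000.00        # assumed starting balance
total_interest = 0.0

for month in range(12):
    interest = balance * MONTHLY_RATE
    payment = max(25.0, balance * MIN_PAYMENT_PCT)
    balance += interest - payment
    total_interest += interest

print(f"Balance after 12 months: ${balance:,.2f}")
print(f"Interest charged in year 1: ${total_interest:,.2f}")
```

Under these assumptions, the 2% minimum payment is smaller than each month’s interest charge, so the balance ends the year higher than it started – the “really add up” problem in miniature.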

That’s not stopping U.S. consumers from flashing their plastic on a regular basis.

According to CreditCards.com, there were 432.8 million credit cards in circulation in the first quarter of 2019 – that’s up from 416.5 million in the last quarter of 2018.

And, despite concerns over competition from cash, debit cards, and emerging digital payments like PayPal (PYPL) and Apple Pay, credit card usage remains robust. According to the Federal Reserve, 2018 saw 41 billion credit card transactions, with a dollar value standing at $3.8 trillion. That’s up from $3.32 trillion in 2017.

A Brief History of Credit Cards

Just like you can’t discuss great gorilla movies without mentioning King Kong, you can’t discuss the history of credit cards without naming Frank McNamara – the not-so-widely-known unofficial founder of the credit card.

McNamara, the founder of Diners Club, the first card provider, created the credit card out of frustration after dining out in 1949. Here’s how Diners Club describes the creation of the credit card – thanks to McNamara and his “forgotten wallet” story.

“In 1949, businessman Frank McNamara forgot his wallet while dining out at a New York City restaurant,” the Diners Club story goes. “It was an embarrassment he resolved never to face again. Luckily, his wife rescued him and paid the tab.”

“In February 1950, McNamara returned to Major's Cabin Grill with his partner Ralph Schneider,” Diners Club notes. “When the bill arrived, McNamara paid with a small cardboard card, known today as a Diners Club Card. This event was hailed as the ‘First Supper,’ paving the way for the world's first multipurpose charge card.”

Diners Club attracted 10,000 members in New York City alone that first year, and 20,000 new users altogether, all of whom could count on a growing number of participating restaurants and hotels right out of the gate.

The credit card was soon enshrined in Hollywood gold, as the first plastic credit cards “bewitched” Audrey Hepburn in the film classic “Breakfast at Tiffany’s.”

Helping matters along was a partnership between McNamara, Schneider, and Alfred Bloomingdale, who had rolled out a payment card of his own in Los Angeles at the same time.

The three agreed to pool their resources with Diners Club, with Bloomingdale serving as vice president. Their card, all three agreed, had to be repaid in full each month – it wouldn’t be until the 1970s that credit cards like Visa (V) and Mastercard (MA) allowed customers to carry a balance from month to month, letting interest charges roll over and grow as long as the card debt remained unpaid.

As a result, Diners Club is widely viewed as the first credit card.

It was also the first card to charge a transaction fee (7% of the amount signed on the card) and an annual fee ($3).

Predating the Diners Club Card

The McNamara story is a good one, but it wasn’t the first time capitalism saw parties leveraging credit to conduct business.

Back in 1700 B.C., local farmers in the Middle East, in what is now Iraq, would ask for extended credit until they could bring in their harvest of fruits and vegetables. That credit model was built on the Code of Hammurabi, a Babylonian-era legal code that set the table for creditors to provide funding to borrowers and have it repaid with interest.

Since credit cards – early ones and today’s alike – operate under the revolving credit model (a credit line a borrower can draw on over and over again until a credit limit is hit, with repayments restoring the available credit, giving borrowers more flexibility and lenders more opportunities to profit on interest charges), a physical token was needed to identify both the creditor and the borrower. As commerce and barter transaction models progressed in the last half of the 19th century, uniquely etched coins and metals would serve as the first “credit card” in known history.
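The mechanics of that model are simple enough to capture in a few lines of code. Below is a toy Python sketch – the class name, limit, and amounts are all illustrative assumptions, not any issuer’s actual system – showing how repayment “revolves” credit back into availability.

```python
# Toy model of a revolving credit line: the borrower draws repeatedly up to
# a limit, and repayments restore the available credit ("revolving").
# All names and numbers here are illustrative assumptions.

class RevolvingCreditLine:
    def __init__(self, limit: float):
        self.limit = limit    # maximum the borrower may owe at once
        self.balance = 0.0    # amount currently owed

    @property
    def available(self) -> float:
        return self.limit - self.balance

    def charge(self, amount: float) -> bool:
        """Approve a charge only if it fits under the credit limit."""
        if amount <= 0 or amount > self.available:
            return False      # declined
        self.balance += amount
        return True

    def repay(self, amount: float) -> None:
        """Repaying frees up credit to be drawn again."""
        self.balance = max(0.0, self.balance - amount)

card = RevolvingCreditLine(limit=1_000.00)
card.charge(600.00)   # $400 of credit left
card.repay(500.00)    # balance drops to $100; $900 available again
print(f"Balance: ${card.balance:.2f}, available: ${card.available:.2f}")
```

The defining property – and the reason a durable, reusable identifier was needed – is that the same line is drawn down and restored indefinitely, unlike the single-transaction tokens described next.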

At that point, oil companies and select merchants began using crude, early forms of credit cards in a single-transaction model.

Merchants could “stamp” the coins with a special imprint and attach them to the customer’s receipt, signifying the creditor and borrower involved in a specific transaction. By the 1930s, special metal plates were used to make crude cards carrying the same information, but they could only be used for transactions with a single merchant.

It would take Diners Club to introduce a credit card that could be used time and time again with myriad merchants and retailers. But Diners Club soon faced serious competition: American Express (AXP) introduced its card in 1958 and rolled out the first plastic credit card a year later.

By 1963, American Express would claim one million active cards and over 85,000 participating retailers and merchants.

Soon, other financial services providers, especially large banks, began entering the credit card market.

The Bank of America (BAC) credit card (named BankAmericard), which would morph into the Visa card, was available nationwide by 1966.

The same year, a consortium of western U.S. banks formed the Interbank Card Association and soon introduced the first MasterCard. The move solidified the idea of big banks banding together, via card associations, to flood the market with bank-sponsored credit cards – a business model that would dominate for decades to come.

Technology Changes the Card Picture

The late 1960s saw the introduction of the magnetic stripe verification feature on credit cards. The magnetic stripe, introduced by IBM (IBM), held sway as the primary credit card verification feature for roughly four decades.

That scenario changed in the early 2000s, when credit card providers added radio frequency identification (RFID) technology to their cards. This breakthrough gave card users touchless verification: a merchant’s RFID reader picked up the signal from the card’s microchip, speeding up transactions and giving card users an extra layer of security.

Fast forward to 2020, and credit cards now carry an extensive array of new features like biometric identification, which allows for facial, fingerprint, and even iris recognition. Also rising in prominence is the EMV computer chip card, which gives users even more data security on their physical cards.

Mobile technologies like Apple Pay and Google Wallet have also become more common, allowing card users to “tap” their phones on a merchant’s card reader and complete transactions faster than ever.

No doubt, technology will continue to advance the use of credit cards going forward, with features like DNA scanning and artificial intelligence already in the pipeline.

Regulatory Dynamics

With credit card usage becoming more prevalent, and banks and card providers getting more aggressive about milking their cash cow, Uncle Sam has had to step in and place some much-needed consumer guardrails in the credit card market.

That process largely began in the early 1970s, when Congress passed the Fair Credit Reporting Act of 1970, which limited the sharing of cardholder data with other parties, who would often use that information to hawk more goods and services to cardholders.

The same year, the Unsolicited Credit Card Act of 1970 stopped credit card providers from shipping credit cards to consumers who never asked for them.

In 1974, Congress was back for another turn at the plate, this time passing the Fair Credit Billing Act of 1974, which amended the Truth in Lending Act to rein in aggressive billing practices and, for the first time, gave consumers regulatory tools to dispute and correct card invoicing mistakes.

The same year, the Equal Credit Opportunity Act became law, stopping card companies (and other creditors and lenders) from discriminating against consumers on the basis of race, sex, nationality, religion, or marital status.

Fast forward to the 21st century, and the Credit Card Accountability, Responsibility and Disclosure Act of 2009. The so-called CARD Act set further limits on what card providers could do with their cards. The CARD Act basically beefed up language in the Truth in Lending Act, curbing “deceptive and abusive” practices by the credit card industry.

Soon after, the Dodd-Frank Act of 2010 created the Consumer Financial Protection Bureau, giving consumers another federally backed tool for getting a better grip on credit card contracts and policies. Among other changes, card terms, interest rates, and fees were now mandated to be more open and transparent, with more tools available for card consumers looking to dispute card charges.

History of Credit Cards – a Timeline

  • 1885 – Paper loyalty cards are used.
  • 1928 – Metal plate-based credit cards are in vogue.
  • 1950 – The Diners Club card – the first modern-era credit card – is introduced.
  • 1959 – The first plastic credit card is rolled out.
  • 1969 – Credit cards with magnetic stripes are introduced.
  • 1987 – The first travel rewards cards are introduced and immediately become popular.
  • 2002 – So-called “mini credit cards,” which fit on a key fob, are introduced but not embraced by the public.
  • 2011 and 2014 – Google Wallet and Apple Pay, respectively, are made available to mobile phone users.
  • 2015 – EMV chip cards are introduced in the U.S.