
10 – CHARLES F. FROST

Do you know him?

You’ve seen his name a gazillion times. You just can’t place it.

Here’s a hint: I can almost guarantee you have thrown out junk mail with his name on it this year.

Charles F. Frost, or C.F. Frost, was an ad executive at Ogilvy & Mather in the 1950s. One of the firm’s clients was American Express, which was getting ready to roll out a new product. But it didn’t want to use a random made-up name on it, for fear of lawsuits.

So the firm put Frost’s name on the new product: the credit card. His name has appeared on just about every sample American Express card since the card’s introduction in 1958.

Whoa, young’uns! I know what you’re thinking. 1958?

Here’s something that might seem shocking: credit cards were virtually non-existent in 1954.

The only credit card was Diners Club, formed in 1950 after a businessman forgot his wallet while dining with clients and had to call his wife to bring it from home. But its uses in 1954 were still limited to restaurants, mostly in New York.

American Express, then a financial services company specializing in freight shipments and foreign exchange, had been looking to get into the credit business since just after World War II. Once Diners Club was established, it provided the impetus to go big.

The first American Express card was, well, a card. Made of paper. You know that assortment of plastic in your wallet or purse? Not until 1959 did AmEx become the first company to issue embossed plastic cards. It was more of a charge card than a credit card – you were expected to pay the balance in full when the monthly bill came.

American Express charged what was considered a steep fee for its card, $6 a year. But it carried a certain cachet to flash the card at a restaurant or hotel. It was a big deal in our house when my Dad got his AmEx card in 1968.

The nation’s banks were slower to get to the market. In the ’60s, they formed two alliances – BankAmericard, named for alliance leader Bank of America, and Interbank, whose card became known as Master Charge. Those names evolved into Visa and Mastercard.

People snapped up the cards. Banks joined both alliances and soon backed the cards honored by individual retailers and service providers. And they made fortunes on the interest they charged when they allowed people to pay only a minimum portion of their balance each month.

That’s not an unmitigated good. Many Americans forget that the card doesn’t make anything free – it just means you’ll pay for it eventually. They rack up thousands of dollars in debt that becomes very difficult to repay.
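To get a feel for the arithmetic behind those fortunes, here’s a minimal sketch of a revolving balance under minimum payments – the 20% APR, the 2%-plus-interest minimum and the $25 floor are hypothetical but typical numbers, not the terms of any real card:

```python
# Sketch of paying only the minimum on a credit card balance.
# All terms are hypothetical: 20% APR, a minimum payment of 2% of the
# balance plus that month's interest, with a $25 floor.
balance = 5_000.00          # starting debt, in dollars
monthly_rate = 0.20 / 12    # 20% APR, applied monthly
total_interest = 0.0
months = 0

while balance > 0:
    interest = balance * monthly_rate
    payment = max(0.02 * balance + interest, 25.00)
    payment = min(payment, balance + interest)   # final payment just clears the debt
    balance += interest - payment
    total_interest += interest
    months += 1

# Prints a payoff time of over a decade, with thousands of dollars in interest.
print(f"Paid off in {months} months (about {months / 12:.0f} years)")
print(f"Total interest: ${total_interest:,.0f} on the original $5,000")
```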

In the 1970s, banks began issuing debit cards, a different kind of plastic that deducted funds directly from checking accounts. And debit cards went hand in hand with another thing that seems as if it has been around forever: the automated teller machine, or ATM.

The idea that you can get cash from a machine at 3:22 a.m. on Sunday is another thing that might have been beyond the wildest dreams of people in 1954.

Most of us have at least two go-to credit cards to handle expenses. In fact, there are more and more places where they don’t allow cash transactions – if you want to get a hot dog at a Met game, you need a credit or debit card.

Don’t leave home without it.


11 – CROWNING ACHIEVEMENT

By 1954, Americans could again buy a wide range of vehicles, now that the wartime restrictions on building cars were gone and the plants had been retooled for civilian production.

Wide range, of course, of cars built in the United States.

The fact is that the Big Three automakers – General Motors, Ford and Chrysler – accounted for about three-quarters of all the cars produced in the whole world in the early 1950s. It was a dominance that did not seem likely to end.

It did.

The first car I remember my Dad driving was a beige Dodge. Most of his cars, which were given to him by his employer, Firestone, were Chevrolets. He once drove a Studebaker Lark, which he wasn’t crazy about – Studebaker stopped making cars in the mid-’60s. He also had a Chevy Corvair for a few weeks – a car with a back seat that even 7-year-old me couldn’t fit into.

The Corvair is what some people see as the start of the decline of American automotive dominance. It was an attempt to build a smaller car for people who wanted one – and it was not particularly well thought out.

It was, in fact, “Unsafe at Any Speed,” the title of the book written by attorney and journalist Ralph Nader, who researched lawsuits filed against GM about the Corvair. 

American cars, once thought to be the paragon of the industry, were starting to be seen as poorly made, with constant recalls and safety issues. And as gas prices rose in the 1970s, they were also seen as not economical, often getting less than 10 miles per gallon. “Lemon” and “gas guzzler” entered the vocabulary.

The exact opposite was happening on the other side of the world.

A Japanese company, Toyota, had struggled to find its place in the vehicle industry. It needed a bailout from the Bank of Japan in 1949, but only after management and labor agreed to cost cutting and productivity improvements.

They succeeded. In 1955, Toyota introduced the Crown, its first passenger car developed entirely in-house. Three years later, it tried the U.S. market – and failed.

But it kept coming. Toyota and the other Japanese companies, Honda and Nissan (which sold its cars here as Datsun for years), found their niche in economical smaller cars. And they focused on quality while the Americans foundered.

By the 1980s, it was an intense battle. The Big Three saw it as a point of pride that Americans should buy cars made in the U.S.A. But more and more Americans just didn’t want them – the Japanese cars were safer and easier on gas.

For a while, the Honda Accord and Ford Taurus battled for supremacy. Then came the Toyota model named for the company’s original car: Camry derives from kanmuri, a Japanese word for crown.

Meanwhile, across the Sea of Japan, South Korea began to have ideas about building cars. A group of businessmen built the country’s first one in 1955, two years after the truce that halted the Korean War. Hyundai built its first car twenty years later.

The American auto industry almost died for good in the Great Recession. Presidents George W. Bush and Barack Obama intervened to bail out the industry, and American carmakers have recovered a little of the ground lost to the companies from Japan and South Korea and to the luxury vehicle makers of Europe.

But it’s somewhat political. On the East and West coasts, non-American cars dominate the roads. In the middle of the country, where the nation’s remaining auto plants are located, GM and Ford hold sway.

My Dad was disappointed when my wife and I started buying Toyotas in the 1990s. Firestone was closely aligned with Ford – but my first two cars were Fords, and they were not particularly reliable.

He finally sort of came around with my Highlander, an SUV that was a lot easier to drive and maintain than the clunkers GM was still cranking out.

Dad didn’t live to see me switch to Hyundai in 2018. 

And that’s what led me to this thought: If you could go back to 1954, how would you explain to anyone – maybe even anyone in Korea – that there would be millions of Hyundais and Kias on American highways 70 years later?

상상할 수 없는! (Unimaginable!)


12 – WHY WOULD I PAY TO WATCH TV?

TV was free in 1954. 

Well, free after you bought a set, which was an expensive proposition. A 21-inch black-and-white inside a large wooden cabinet sold for about $250. That already seems like a lot, but we’re also talking 1954 dollars. According to the Bureau of Labor Statistics, that set would cost nearly $2,900 today.

The first color TV set for the public went on sale in 1954. It cost about $1,000. Then. That’s the equivalent of about $11,000 in 2024. That’s a lot of loot, considering only NBC aired prime-time programs in color, and all news and sports were in black and white.
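If you want to check that sort of conversion yourself, it’s just a ratio of Consumer Price Index values. Here’s a minimal sketch – the index numbers below are approximate annual averages; the BLS publishes the official series:

```python
# Convert 1954 prices into 2024 dollars using the CPI ratio.
# Index values are approximate annual averages of the CPI-U;
# the Bureau of Labor Statistics publishes the official figures.
CPI_1954 = 26.9
CPI_2024 = 313.7

def in_2024_dollars(price_1954: float) -> float:
    """Scale a 1954 price by the ratio of the two index values."""
    return price_1954 * CPI_2024 / CPI_1954

print(f"$250 black-and-white set -> ${in_2024_dollars(250):,.0f}")   # roughly $2,900
print(f"$1,000 color set -> ${in_2024_dollars(1_000):,.0f}")         # roughly $11,700
```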

All that expense is why barely more than half of the country had sets in 1954. Add to that the cost of mounting an antenna on your roof – you needed one to get a signal until manufacturers started building smaller sets with their own “rabbit ears.”

After that, TV was free.

But, as you probably surmised, TV prices dropped. While just about everything else you can think of has soared in price, TV set prices, adjusted for inflation, have fallen more than 90%.

TV became a bargain. By the 1970s, most people had color sets. And everything was broadcast in, as NBC used to put it, “living color.”

The problem was that a sharp TV picture – something we take for granted now – was hard to come by. Antennas sort of worked, except in big cities and rural areas. People would sometimes stand next to the set, convinced that their bodies somehow improved the picture.

The solution was to deliver TV service a different way – through cables attached to a system that got its signal from another new invention, satellites. 

Not only would the new cable systems show regular local and national programming, they would open up lots of space for new and different channels. That included channels that could show the more intense films being released in theaters without cuts.

There was a catch: You had to pay for cable TV.

We pay routinely now – for cable or satellite or fiber-optic service. But older people, among them my parents, found the idea offensive. They held off as long as they could.

It took until the 1980s for my parents to give in. But give in they did – in their later years, they loved watching shows on HBO (as we discussed in No. 19 last week). And, of course, they couldn’t get the YES Network, the channel of their beloved New York Yankees, fast enough.

What they found was that the picture quality was infinitely better. And it kept getting better, all the way to high definition – a picture so clear that professional sports now use it to determine whether a game official called a play correctly.

But if you had told my Dad in 1954 that he would one day spend more than $1,000 a year to get a TV picture, he would have said, as he often did when something seemed ridiculous, “Get outta here!”


13 – DALLAS

Most people who were old enough to take notice on November 22, 1963, know exactly where they were and what they were doing when they found out President John F. Kennedy had been assassinated in Dallas.

But I think the details are more shocking – and would have been to people in 1954 – than the idea that an American president could be murdered. That idea was hardly unimaginable: it had been nearly a century since Abraham Lincoln was assassinated, and two other presidents had been slain since – James Garfield in 1881 and William McKinley in 1901.

That’s not to mention the attempts. In my parents’ lifetimes, a gunman missed then President-elect Franklin D. Roosevelt in Miami, killing Chicago Mayor Anton Cermak instead (there’s some question as to whether Cermak, not FDR, was the target). And as recently as November 1950, a White House policeman died foiling an attempt to kill President Harry Truman.

But President Kennedy’s death hit Americans harder. It’s not that he was universally loved – he was facing a tough re-election fight against arch-conservative Barry Goldwater, who happened to be a friend of his.

It’s partly because he was a relatively young man – JFK was 46. It’s partly because he was on the upsurge in popularity, having stood up to the Soviets in Cuba a year before and negotiated a nuclear weapons treaty. 

And it’s partly because Americans watched this crime unfold on television in real time.

They saw Walter Cronkite tearfully delivering the bulletin of the president’s death. They watched the casket come off Air Force One in Washington that evening. They were glued to their TV sets for the president’s lying in state at the Capitol and the chilly funeral procession to Arlington National Cemetery. They watched Jackie Kennedy bear up and 3-year-old JFK Jr. salute as the caisson passed.

And Americans – including my Dad and me, waiting for my Mom to dress for a dinner out to celebrate my parents’ 12th anniversary – watched live as another gunman emerged from a crowd at Dallas police headquarters and shot suspected assassin Lee Harvey Oswald.

The impressions of that weekend before Thanksgiving were indelible. And no matter how much you can imagine some cataclysmic event, it’s still shocking to think it actually happened.

The Kennedy assassination was an event that scarred Americans. There are some on the left who believe the problems of the next 25 years were a result of the absence of JFK and his coterie of the best and the brightest – including his brother Robert, killed in 1968.

My mother was devastated. I remember seeing her crying at the door of our first-floor apartment in Flushing as I ran home, in tears myself, from school. She was one who believed in the Camelot aura. 

Much as I said about Vietnam, the country was never the same after the assassination. A presidential slaying was imaginable in 1954. But that doesn’t make it any less shocking.


14 – PASTA CON PESTO

My father’s father was from the seaport city of Savona in the Italian region of Liguria. It’s in the northwest corner of the country, bordering France.

I don’t have a lot of memories of him. He died in 1963, when I was 9.

But his legacy in our family – preserved by my mother, his daughter-in-law – was a pasta dish we thought belonged to us.

In this dish, the sauce is green. It’s made from basil, pounded with a mortar and pestle, along with olive oil, Parmigiano-Reggiano (my brother, who knows better, says my Mom used Pecorino Romano) and pine nuts. The sauce is mixed with a pasta – I think we tended toward linguine – and boiled potatoes (because our people never thought much about carbs). Some recipes include green beans; I don’t remember us doing that.

The sauce is called pesto, from the Italian word for pounding or grinding.

My mother, who had never had pesto as a kid, enjoyed my grandfather’s dish so much that she learned to make it herself, continuing the family tradition. She often left the potatoes out – and my father complained that it wasn’t real pasta con pesto (that’s what we called it; most people know the sauce as pesto alla genovese) without them.

It was a dish we loved as kids. And we shared it.

My mother made it for her father, from whom I get my extremely picky eater genes. It was actually pretty funny to watch him try it – the most distinguished-looking person I’ve ever known, picking at it like a kid. But once he started eating it, he loved it.

So did my friends in high school when they came by for lunch. And so did my Mom’s friends.

But it was something only we ever had. It was never on the menu at any Italian restaurant where we dined, because most of the traditional dishes there were southern Italian.

Fast forward to 1984.

It’s dinner time at my job in an office above Grand Central Terminal. I go down to Zabar’s in the terminal. And I notice that one of the choices is a pasta salad with chicken and pesto.

I was floored. I was determined to try it. It wasn’t bad. It wasn’t my mother’s, but it wasn’t bad.

Throughout the ’80s, it amazed me to see pesto as a dish and as a flavor spread far and wide.

What’s my point?

In 1954, when I was born, the idea that Italian-Americans would blend into the mainstream of this country was not so certain. We were often depicted as crude, the subject of slurs and jokes, and stereotyped as mobsters and other criminals.

That applies to other ethnicities and other races. I’m sure many of you have felt it in your lives at some point.

But the diversity of this country is its superpower. And it often manifests itself in the food we eat – we adopt dishes brought here by others and adapt them to what we know.

And the result is spectacular. On my last trip to Chicago, I had chicken tikka masala tacos. They were amazing. Two dishes from two very different places combining in the middle of America.

That’s what happened to pesto. We Ligurians blended it into the mix and it joined the vast American menu.

My parents and grandparents would probably be a little surprised at how widely accepted something we thought was only ours has become. 

What would be more surprising is one of the people who carries on the pesto legacy in my family.

It’s my grandfather’s granddaughter-in-law. She makes absolutely fabulous pesto.

She was born in Hong Kong.

Happy St. Joseph’s Day!


15 – THE LOSS

In the military comedy “Stripes,” Bill Murray’s character tries to generate a laugh with a line aimed at inspiring his fellow screw-up recruits.

“But we’re American soldiers! We’ve been kicking ass for 200 years! We’re ten and one!”

A 1954 audience seeing that scene would be bewildered by the “one.” They’d be more bewildered to learn that the place where it happened, Vietnam, was very likely in the newspaper they’d read that day.

On the day I was born, Vietnamese nationalists were laying siege to the French garrison at Dien Bien Phu in French Indochina. A month later, the French surrendered to the Viet Minh forces loyal to Ho Chi Minh, who had been fighting for independence since the end of World War II.

In July, the peace conference to end the conflict divided the nation at the 17th parallel, much as Korea had been divided at the 38th parallel the year before.

But Ho and his Communist allies were hardly ready to give up on the dream of controlling all of the country. A group that would eventually be called the National Liberation Front, or Viet Cong, began agitating against the South Vietnamese government, with the support of Ho and the North Vietnamese.

The United States, with France out of the picture and concerned about another Communist state in Asia, began supplying aid to the South Vietnamese under President Dwight Eisenhower. That aid became “advisers” under John F. Kennedy and troops under Kennedy and Lyndon Johnson.

The conventional wisdom was that American military might and superior tactics would overpower the Viet Cong and the North Vietnamese – even if they had support from the Soviet Union and China.

We were assured that victory was close at hand. How could guys in black pajamas outfight the best-trained military in the world?

The answer was: They had a bigger reason to fight than we did.

They were fighting for their homes. We just didn’t want what we thought would be another Communist domino to fall, even if it was more than 8,000 miles away from the West Coast.

That was hard for Americans to grasp until the beginning of 1968. After hearing another rosy assessment as the year began, the nation was shocked when the Communists launched an attack timed to the start of the Lunar New Year, known as the Tet Offensive. The Viet Cong penetrated the grounds of the U.S. Embassy in Saigon before being repelled.

There had been an antiwar movement shortly after the U.S. escalation began – and it grew with the TV pictures of the Tet Offensive flashing in U.S. homes. 

And yet, the U.S. elected Richard Nixon as president in 1968. He was not inclined to immediately end the war as so many protesters demanded, claiming he wanted “peace with honor.” That led to even more conflict at home – at a level Americans hadn’t seen since the Civil War.

Eventually, Nixon and his national security adviser, Henry Kissinger, negotiated a settlement for North and South Vietnam to co-exist, and U.S. troops withdrew in 1973. Two years later, and six years after Ho’s death, the North overran the South and unified the nation.

Vietnam paid a high price for Ho’s vision. Between 1.5 million and 2 million Vietnamese died. The countryside was ravaged, some of it with chemicals dropped by U.S. bombers. And there was more conflict to come – opponents fled the country by boat, and the nations bordering Vietnam came into conflict with this emerging power.

But the U.S. was badly damaged. About 60,000 Americans died, about 50,000 in combat. More than 300,000 were wounded. Our prestige around the world was bruised badly.

And then there were the internal problems. Those who fought in Vietnam were not accorded the kind of heroes’ welcome American soldiers had come to expect. Some of those opposed to the war saw them as criminals. Some of those in favor saw them as drug-addled losers.

The divisions that occurred as a result of the Vietnam War never completely healed. They are perhaps at the core of the anger that drives so many to support Trump and the Republican agenda of grievance.

This would have been a lot for someone peering into the future in 1954 to grasp. And there’s one other thing that might have stunned them.

It’s the sweatshirt I’m wearing as I write this. It was made for an American company in Vietnam. The two countries enjoy strong trading ties and have for most of this century.

Country Joe asked the right question. What were we fighting for? 


16 – ZAPPED FOOD

You don’t hear the term “TV dinner” much anymore.

It was still a fairly new term in 1954. And it was a very specific thing – it came frozen in an aluminum tray with compartments.

The main maker of these meals was Swanson, an independent food company that was a year away from being acquired by Campbell Soup. The first TV dinners were turkey – there was a compartment with a couple of slices of turkey, gravy and stuffing, a smaller compartment for the worst mashed potatoes you can imagine, and a space for dessert, usually some cranberry muffin kind of thing.

Eventually, Swanson sold varieties such as fried chicken, Salisbury steak, and macaroni and cheese.

It usually took about 45 minutes to an hour to heat the dinners in a conventional oven. 

Fast forward to 2024: 45 MINUTES! Are you kidding?

Between the Swanson TV dinner and what’s in your freezer right now was a device that was actually around in 1954.

The discovery that you can heat food rapidly with a microwave beam came in the 1940s. And there were microwaves in use in 1954 – if you could afford the $3,000 they cost, the equivalent of nearly $35,000 today.

When the price became a bit more reasonable, around $400, in the late 1960s, companies began selling them to a mass audience. Sharp, Amana and Litton were the first popular makers of the devices, which most people discovered in the break rooms of their workplace. 

Microwaves never really caught on as a way to cook a regular dinner. What they were for was heating things fast. You could boil water in two minutes, pop corn in one, melt butter and reheat leftovers.

And the frozen TV dinner became a frozen meal. 

Swanson was actually slow to pick up on the idea that maybe its frozen dinners weren’t suitable for the microwave age. For one thing, you can’t put an aluminum tray in a microwave.

What was once one case in a supermarket is now a whole row. Meat, fish, vegetarian dishes, pizza, ethnic foods. There are few foods that can’t be heated or reheated in a microwave. And dinner is ready in 5 minutes or less, especially on a night when you come home late from work and have no energy to mix any two ingredients together to create a meal.

But microwaves aren’t the only thing that would come as a surprise to our parents.

Food processors, a more advanced form of blender, came into popular use in the 1980s after Cuisinart developed its machine. Bread machines baked loaves in people’s homes in a couple of hours. Air fryers are making french fries safe to eat again.

And the ultimate in convenience food started out as a new use for an old device – the toaster. In 1964, Kellogg’s introduced Pop-Tarts, described as a toaster pastry. It became one of the most popular convenience foods in American history – and the subject of a forthcoming movie by Jerry Seinfeld.

Bon Zappetit!


17 – NOT SO WORRIED ABOUT GLACIERS RIGHT NOW

In 1954, when people worried about climate change, it was about the possibility that glaciers might return.

Sure, it might take thousands of years. But the idea that the glaciers would come back and swallow the planet was considered more likely than what actually happened.

The world would undergo periods of heating up and then periods of cooling. In 1975, Newsweek famously ran a story about our cooling planet.

Yeah, some scientists mentioned the idea that maybe sending so much carbon dioxide into the atmosphere might do something to heat up the planet.

The idea that things were changing for the warmer didn’t become the operative scientific consensus until around 1990, when terms like “global warming” and the “greenhouse effect” entered everyday conversation.

It might have been hard for older people to grasp the idea that things were getting warmer. But if you’ve been around for 70 years, you’ve noticed a few things.

Winters aren’t as cold as they used to be – except when the polar vortex weakens and frigid air comes sweeping out of Canada with ridiculously low temperatures.

It’s also a lot drier. That’s a problem for vegetation – trees especially. And that leads to wildfires that create scary skies, like the bright orange ones over the United States last summer.

And storms are more violent. Sandy, the 2012 superstorm that devastated much of the New York area, was seen as a harbinger of the kind of disaster the world has coming if nothing is done to curtail the changes in climate.

But while the rest of the world watched ice caps melt and sea levels rise and said, hey, I think these scientists have a point, the idea of climate change met with opposition in this country.

Because so much of the U.S. relies on fossil fuel production – think of the places dependent on oil and coal – the resistance partly comes from people worried that they won’t be able to make a living in a world powered by wind and sun. 

They’ve been goaded by cynical figures who, instead of leading, decided to challenge the science. Climate change was said to be a hoax – one idiot senator from Oklahoma brought a snowball to the floor to “prove” that climate change isn’t real.

One idiot president also used the word hoax – and who were his idiot followers going to believe – him or their lying eyes?

Trump pulled the United States out of the world’s most comprehensive agreement to combat climate change, the Paris Agreement signed by 195 nations in 2015. The agreement’s goal is to hold the increase in the world’s mean temperature to less than 3.6 degrees Fahrenheit above pre-industrial levels – and preferably less than 2.7 degrees. To achieve that, the world needs to cut emissions roughly in half before the end of this decade.
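Those Fahrenheit figures, by the way, are just the familiar Celsius targets – 2 degrees, and preferably 1.5 – converted. Since they’re temperature differences rather than thermometer readings, the conversion uses only the 9/5 scale factor, with no +32 offset:

$$\Delta T_{\mathrm{F}} = \frac{9}{5}\,\Delta T_{\mathrm{C}}: \qquad 2.0\,^{\circ}\mathrm{C} \times \frac{9}{5} = 3.6\,^{\circ}\mathrm{F}, \qquad 1.5\,^{\circ}\mathrm{C} \times \frac{9}{5} = 2.7\,^{\circ}\mathrm{F}$$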

Urgency about climate change has led to several changes in our lives. More reliance on non-fossil fuels, such as solar and wind. Electric and hybrid vehicles, after being shunned as an idea for decades, are becoming a major part of the car industry’s output. The idea that products or plane trips can be “carbon neutral” is taking hold with the public.

Will we succeed in stopping the planet from becoming too warm to live on? That’s not the kind of question people in 1954 would have expected to hear.


18 – THIS HERE THING

If this were 1954, I wouldn’t be able to do what I’m doing right now.

Type these words on a TV screen.

I certainly would have been able to type these words on a piece of paper. There were typewriters in 1954, although the first IBM Selectric was about seven years in the future.

Typing whatever I wrote would certainly be more legible than my handwriting, which is not much better in 2024 than it was in 1954.

But if I had been using a typewriter, I wouldn’t have been able to cleanly change the word “distance” to “future” two paragraphs up without either xxxxxxxxing it out or crumpling the piece of paper into a ball and throwing it in the garbage.

(I also wouldn’t have been able to use Liquid Paper, the popular correction fluid that was invented two years after I was born by Bette Nesmith Graham, the mother of future Monkees guitarist Michael Nesmith.)

Computers were not unknown in 1954. They just weren’t anywhere near people’s homes. They were in labs and research centers, and took up hundreds of square feet of space.

The first home computers – or personal computers – were put together by tech hobbyists from kits. They followed innovations in reducing the size of semiconductors and advanced circuitry.

Eventually, whole computers were sold to people who had no interest in soldering wires or touching a screwdriver. Commodore, Tandy and Apple sold what we would now consider clunky machines, with floppy disks that held only as much data as 10 seconds of any song on your current iTunes.
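That 10-second comparison holds up as rough arithmetic, if you assume a disk of that era held about 160 kilobytes (capacities varied by machine) and a song encoded at 128 kilobits per second – representative figures, not measurements:

$$128{,}000\ \tfrac{\text{bits}}{\text{s}} \times 10\ \text{s} \div 8\ \tfrac{\text{bits}}{\text{byte}} = 160{,}000\ \text{bytes} \approx 160\ \text{KB}$$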

The real breakthrough to a mass audience came when IBM launched the PC in 1981, with an operating system built by a then-small company called Microsoft. Eventually, other manufacturers adopted Microsoft’s MS-DOS, and then Windows, as their system, with Intel chips providing the power; IBM itself exited the PC business in 2005, selling it to the Chinese company Lenovo.

Most of the early machines were desktops – a separate monitor and computer case, with cables connecting them. Apple pioneered the all-in-one desktop, with the drives and CPU built into the monitor housing.

I’m not writing this on a desktop. I’m writing it on a laptop, which I can take anywhere as long as the battery lasts. It not only holds everything I’ve written for the past six years, it holds my photos, some of my music, spreadsheets for baseball stats, a Web browser to get information and watch Brandon Nimmo’s 2022 catch over the center field wall at Citi Field for the 2,015th time, and OOTP 24, the best computer baseball game ever.

Once I finish writing this, I’ll save the file and publish it on my blog in a few days. 

There are people nostalgic for typewriters, for ledgers, for photo albums, for record players, for VCRs, for newspapers, for cookbooks, for tape recorders, for board games, for alarm clocks, for date books, for phone directories.

I’m not. I’ve got it all in less than two square feet on my desk. People in 1954 might have imagined all this, but the reality still would have amazed them.

Now, if only I could stop using this thing and get to sleep.


19 – BADA BING

There was nothing on television like “The Sopranos” in 1954.

In fact, there was nothing in the movies like “The Sopranos.”

In further fact, if you were to go back in time with a recording of “The Sopranos” (and some 1954-compatible way of playing it), there’s a chance you’d be arrested in much of the United States.

It was risqué to even imply sex and violence, and communities had laws about what constituted obscenity. On TV, married couples slept in separate beds. There was never any blood coming from a gunshot wound in a Western or crime film.

And the most scandalous utterance in film, even into the early 1960s, was Rhett Butler’s parting line in “Gone With the Wind”: “Frankly, my dear, I don’t give a damn.”

Contrast that with “The Sopranos.” 

Tony Soprano, a New Jersey mob boss, has his office in a strip club called the Bada Bing. He gets his way through beating and killing people. His vocabulary is chockablock with scatological and carnal synonyms. 

Between Clark Gable and James Gandolfini was the Hays Code, Hollywood’s self-censorship effort to keep the government from getting involved. Up until the late 1960s, it was a strict guide to how the makers of movies – and, later, TV shows – could portray the elements of real life that make some people uncomfortable.

And there were people who were uncomfortable even with what was being put out. In 1964, because movie theaters were pretty safe places to let kids go by themselves, my parents had no problem allowing me to see “Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb.” It’s a classic (and about the only Kubrick film I actually enjoy, but that’s not important here).

When I told some of the kids in my neighborhood about it, they were scandalized. The local Catholic newspaper had condemned the movie – most likely because George C. Scott’s character starts the movie in bed with a woman who answers his phone.

Efforts to crack the Hays Code finally succeeded in the late 1960s. 

Three years after seeing “Dr. Strangelove,” my 8-year-old brother and I went to the Town Theatre in Glen Cove to see what we thought – because of the advertising – was a comedy. 

When we came out of seeing “Bonnie and Clyde,” both my brother and I were afraid to tell our parents what we had seen, which had culminated with the piercing of the title characters’ bodies with what seemed like hundreds of bullets. (NOTE: Maybe there should have been a spoiler alert there, but that also is something people in 1954 wouldn’t know about.)

My family apparently wasn’t the only one that felt a little misled. And yet, films with more explicit violence and sex were extremely popular amid the turmoil of the era.

So the motion picture industry developed a rating system to replace the Hays Code. It ranged from G, for movies that had nothing anyone could reasonably object to, to X, for movies with very graphic sex and violence that theaters would not allow anyone under 17 to see. The X rating, which became synonymous with pornography and gore, eventually gave way to NC-17.

In 1954, movies were an important component of local television. They filled afternoon slots, weekend slots, late slots.

But because TV is so easily accessible to families, many of the movies that broke through the various taboos were doomed either to be unairable or to be edited to distraction.

That’s where cable television came in. Certain premium channels – HBO, Showtime and Starz among them – showed the movies uncut. Then they started developing shows of their own, such as “The Sopranos,” many of them just as compelling as any movie. 

My parents, after a little hesitance because of its depiction of Italian-Americans, came to embrace “The Sopranos.” Although my Dad was always dismissive of anything with “bad language,” both my parents always looked forward to good entertainment. 

That’s the bottom – whether you see it or not – line.
