15 – THE LOSS

In the military comedy “Stripes,” Bill Murray’s character tries to generate a laugh with a line aimed at inspiring his fellow screw-up recruits.

“But we’re American soldiers! We’ve been kicking ass for 200 years! We’re ten and one!”

A 1954 audience seeing that scene would be bewildered by the “one.” They’d be more bewildered to learn that the place where it happened, Vietnam, was very likely in the newspaper they had read that day.

On the day I was born, Vietnamese nationalists were laying siege to the French garrison at Dien Bien Phu in French Indochina. A month later, the French surrendered to the Viet Minh forces loyal to Ho Chi Minh, who had been fighting for independence since the end of World War II.

In July, the peace conference at Geneva that ended the conflict divided the nation at the 17th parallel, much as Korea had been divided at the 38th parallel the year before.

But Ho and his Communist allies were hardly ready to give up on the dream of controlling all of the country. A group that would eventually be called the National Liberation Front, or Viet Cong, began agitating against the South Vietnamese government, with the support of Ho and the North Vietnamese.

The United States, with France out of the picture and concerned about another Communist state in Asia, began supplying aid to the South Vietnamese under President Dwight Eisenhower. That aid became “advisers” under John F. Kennedy and troops under Kennedy and Lyndon Johnson.

The conventional wisdom was that American military might and superior tactics would overpower the Viet Cong and the North Vietnamese – even if they had support from the Soviet Union and China.

We were assured that victory was close at hand. How could guys in black pajamas outfight the best-trained military in the world?

The answer was: They had a bigger reason to fight than we did.

They were fighting for their homes. We just didn’t want what we thought would be another Communist domino to fall, even if it was more than 8,000 miles away from the West Coast.

That was hard for Americans to grasp until the beginning of 1968. After hearing another rosy assessment as the year began, the nation was shocked when the Communists launched an attack timed to the start of the Lunar New Year, known as the Tet Offensive. The Viet Cong came close to capturing the U.S. Embassy in Saigon before being repelled.

There had been an antiwar movement shortly after the U.S. escalation began – and it grew with the TV pictures of the Tet Offensive flashing in U.S. homes. 

And yet, the U.S. elected Richard Nixon as president in 1968. He was not inclined to immediately end the war as so many protesters demanded, claiming he wanted “peace with honor.” That led to even more conflict at home – at a level Americans hadn’t seen since the Civil War.

Eventually, Nixon and his national security adviser, Henry Kissinger, negotiated a settlement for North and South Vietnam to co-exist, and U.S. troops withdrew in 1973. Two years later, and six years after Ho’s death, the North overran the South and unified the nation.

Vietnam paid a high price for Ho’s vision. Between 1.5 million and 2 million Vietnamese died. The countryside was ravaged, some of it with chemicals dropped by U.S. bombers. And there was more conflict to come – opponents fled the country by boat, and the nations bordering Vietnam came into conflict with this emerging power.

But the U.S. was badly damaged, too. About 60,000 Americans died, about 50,000 of them in combat. More than 300,000 were wounded. Our prestige around the world was badly bruised.

And then there were the internal problems. Those who fought in Vietnam were not accorded the kind of heroes’ welcome American soldiers had come to expect. Some of those opposed to the war saw them as criminals. Some of those in favor saw them as drug-addled losers.

The divisions that occurred as a result of the Vietnam War never completely healed. They are perhaps at the core of the anger that drives so many to support Trump and the Republican agenda of grievance.

This would have been a lot for someone peering into the future in 1954 to grasp. And there’s one other thing that might have stunned them.

It’s the sweatshirt I’m wearing as I write this. It was made for an American company in Vietnam. The two countries enjoy strong trading ties and have for most of this century.

Country Joe asked the right question. What were we fighting for? 

16 – ZAPPED FOOD

You don’t hear the term “TV dinner” much anymore.

It was still a fairly new term in 1954. And it was a very specific thing – it came frozen in an aluminum tray with compartments.

The main maker of these meals was Swanson, an independent food company that was a year away from acquisition by Campbell Soup. The first TV dinners were turkey – there was a compartment with a couple of slices of frozen turkey, gravy and stuffing, a smaller compartment for the worst mashed potatoes you can imagine, and a space for dessert, usually some cranberry muffin kind of thing.

Eventually, Swanson sold varieties such as fried chicken, Salisbury steak, and macaroni and cheese.

It usually took about 45 minutes to an hour to heat the dinners in a conventional oven. 

Fast forward to 2024: 45 MINUTES! Are you kidding?

Between the Swanson TV dinner and what’s in your freezer right now was a device that was actually around in 1954.

The discovery that you can heat food rapidly with a microwave beam came in the 1940s. And there were microwaves in use in 1954 – if you could afford the $3,000 they cost, the equivalent of nearly $35,000 today.

When the price became a bit more reasonable, around $400, in the late 1960s, companies began selling them to a mass audience. Sharp, Amana and Litton were the first popular makers of the devices, which most people discovered in the break rooms of their workplace. 

Microwaves never really caught on as a way to cook a regular dinner. What they were for was heating things fast. You could boil water in two minutes, pop corn in one, melt butter and reheat leftovers.

And the frozen TV dinner became a frozen meal. 

Swanson was actually slow to pick up on the idea that maybe its frozen dinners weren’t suitable for the microwave age. For one thing, you can’t put an aluminum tray in a microwave.

What was once one case in a supermarket is now a whole row. Meat, fish, vegetarian dishes, pizza, ethnic foods. There are few foods that can’t be heated or reheated in a microwave. And dinner is ready in 5 minutes or less, especially on a night when you come home late from work and have no energy to mix any two ingredients together to create a meal.

But microwaves aren’t the only thing that would come as a surprise to our parents.

Food processors, a more advanced form of blender, came into popular use in the 1980s after Cuisinart developed its machine. Bread machines baked loaves in people’s homes in a couple of hours. Air fryers are making french fries safe to eat again.

And the ultimate in convenience food started out as a new use for an old device – the toaster. In 1964, Kellogg’s introduced Pop-Tarts, described as a toaster pastry. They became one of the most popular convenience foods in American history – and the subject of a forthcoming movie by Jerry Seinfeld.

Bon Zappetit!

17 – NOT SO WORRIED ABOUT GLACIERS RIGHT NOW

In 1954, when people worried about climate change, it was about the possibility that glaciers might return.

Sure, it might take thousands of years. But the idea that glaciers would come back and swallow the planet was considered more likely than what actually happened.

The world would undergo periods of heating up and then periods of cooling. As late as 1975, Newsweek published a famous article about our cooling planet.

Yeah, some scientists mentioned the idea that maybe sending so much carbon dioxide into the atmosphere might do something to heat up the planet.

The idea that things were changing for the warmer didn’t become the operative scientific belief until the late 1980s and 1990s, when terms like “greenhouse effect” and “global warming” became accepted.

It might have been hard for older people to grasp the idea that things were getting warmer. But if you’ve been around for 70 years, you’ve noticed a few things.

Winters aren’t as cold as they used to be – except when there’s a gap in the atmosphere and a polar vortex comes sweeping out of Canada with ridiculously low temperatures.

It’s also a lot drier. That’s a problem for vegetation – trees especially. And that leads to wildfires that create scary skies, like the bright orange ones over much of the United States last summer.

And storms are more violent. Sandy, the 2012 superstorm that destroyed a lot of the New York area, was seen as a harbinger of the kind of disaster the world has coming if nothing is done to curtail the changes in climate.

But while the rest of the world watched ice caps melt and sea levels rise and said, hey, I think these scientists have a point, the idea of climate change met with opposition in this country.

Because so much of the U.S. relies on fossil fuel production – think of the places dependent on oil and coal – the resistance partly comes from people worried that they won’t be able to make a living in a world powered by wind and sun. 

They’ve been goaded by cynical figures who, instead of leading, decided to challenge the science. Climate change was said to be a hoax – one idiot senator from Oklahoma brought a snowball to the floor to “prove” that climate change isn’t real.

One idiot president also used the word hoax – and who were his idiot followers going to believe – him or their lying eyes?

Trump pulled the United States out of the world’s most comprehensive agreement to combat climate change, the Paris Agreement signed by 195 nations in 2015. The agreement’s goal is to limit the increase in the world’s mean temperature to less than 3.6 degrees Fahrenheit – and preferably less than 2.7 degrees Fahrenheit. To achieve the goal, the world needs to cut emissions by 50% before the end of this decade.

Urgency about climate change has led to several changes in our lives. More reliance on non-fossil fuels, such as solar and wind. Electric and hybrid vehicles, after being shunned as an idea for decades, are claiming a growing share of the car industry’s output. The idea that products or plane trips can be “carbon neutral” is being instilled in the public.

Will we succeed in stopping the planet from becoming too warm to live on? That’s not the kind of question people in 1954 would have expected to hear.

18 – THIS HERE THING

If this were 1954, I wouldn’t be able to do what I’m doing right now.

Type these words on a TV screen.

I certainly would have been able to type these words on a piece of paper. There were typewriters in 1954, although the first IBM Selectric was about seven years in the future.

Typing whatever I wrote would certainly be more legible than my handwriting, which is not much better in 2024 than it was in 1954.

But if I used a typewriter, I wouldn’t have been able to cleanly change the word “distance” to “future” two paragraphs up without either xxxxxxxxing it out or crumpling the piece of paper I was using into a ball and throwing it in the garbage. 

(I also wouldn’t have been able to use Liquid Paper, the popular correction fluid that was invented two years after I was born by Bette Nesmith Graham, the mother of future Monkees guitarist Michael Nesmith.)

Computers were not unknown in 1954. They just weren’t anywhere near people’s homes. They were in labs and research centers, and took up hundreds of square feet of space.

The first home computers – or personal computers – were put together by tech hobbyists from kits. They followed innovations in reducing the size of semiconductors and advanced circuitry.

Eventually, the whole computer was sold to people who had no interest in soldering wires and touching a screwdriver. Commodore, Tandy and Apple sold what we would consider clunky machines with floppy disks that held only as much data as 10 seconds of any song on your current iTunes.

IBM’s launch of the PC in 1981 began the real breakthrough to the mass audience, with its operating system built by a small company called Microsoft. Eventually, other companies adopted Microsoft’s MS-DOS and then Windows as their system, with Intel chips providing the power; IBM exited the PC business in 2005, selling it to the Chinese company Lenovo.

Most of the early machines were desktops, with separate monitors and computer cases, and cables attaching them. Apple pioneered all-in-one desktops, with the hard drive and CPU built into the monitor.

I’m not writing this on a desktop. I’m writing it on a laptop, which I can take anywhere as long as the battery lasts. It not only holds everything I’ve written for the past six years, it holds my photos, some of my music, spreadsheets for baseball stats, a Web browser to get information and watch Brandon Nimmo’s 2022 catch over the center field wall at Citi Field for the 2,015th time, and OOTP 24, the best computer baseball game ever.

Once I finish writing this, I’ll save the file and publish it on my blog in a few days. 

There are people nostalgic for typewriters, for ledgers, for photo albums, for record players, for VCRs, for newspapers, for cookbooks, for tape recorders, for board games, for alarm clocks, for date books, for phone directories.

I’m not. I’ve got it all in less than two square feet on my desk. People in 1954 might have imagined all this, but the reality still would have amazed them.

Now, if only I could stop using this thing and get to sleep.

19 – BADA BING

There was nothing on television like “The Sopranos” in 1954.

In fact, there was nothing in the movies like “The Sopranos.”

In further fact, if you were to go back in time with a recording of “The Sopranos” (and some 1954-compatible way of playing it), there’s a chance you’d be arrested in much of the United States.

It was risque to even imply sex and violence, and communities had laws about what constituted obscenity. On TV, married couples had separate beds. There was never any blood coming from a gunshot wound in a Western or crime film. 

And the most scandalous utterance in film, even into the early 1960s, was Rhett Butler’s final statement in “Gone With the Wind”: “Frankly, my dear, I don’t give a damn.”

Contrast that with “The Sopranos.” 

Tony Soprano, a New Jersey mob boss, has his office in a strip club called the Bada Bing. He gets his way through beating and killing people. His vocabulary is chockablock with scatological and carnal synonyms. 

Between Clark Gable and James Gandolfini was the Hays Code. It was Hollywood’s self-censorship effort to avoid government getting involved. Up until the late 1960s, it was a strict guide to how a movie and, later, TV show maker could portray elements of real life that make some people uncomfortable.

And there were people who were uncomfortable even with what was being put out. In 1964, because movies were pretty safe places to let kids go by themselves, my parents had no problem allowing me to see “Dr. Strangelove, or How I Learned to Stop Worrying and Love the Bomb.” It’s a classic (and about the only Kubrick film I actually enjoy, but that’s not important here).

When I told some of the kids in my neighborhood about it, they were scandalized. The local Catholic newspaper had condemned the movie – most likely because George C. Scott’s character starts the movie in bed with a woman who answers his phone.

Efforts to crack the Hays Code finally succeeded in the late 1960s. 

Three years after seeing “Dr. Strangelove,” my 8-year-old brother and I went to the Town Theatre in Glen Cove to see what we thought – because of the advertising – was a comedy. 

When we came out of seeing “Bonnie and Clyde,” both my brother and I were afraid to tell our parents what we had seen, which had culminated with the piercing of the title characters’ bodies with what seemed like hundreds of bullets. (NOTE: Maybe there should have been a spoiler alert there, but that also is something people in 1954 wouldn’t know about.)

My family apparently wasn’t the only one that felt a little misled. And yet, films with more explicit violence and sex were extremely popular amid the turmoil of the era.

So the motion picture industry developed a rating system to replace the Hays Code. It ranged from G, for movies that had nothing anyone could reasonably object to, to X, for movies with sex and violence so graphic that theaters would not admit anyone under 17. The X rating, which became synonymous with pornography and bloody films, eventually evolved into NC-17.

In 1954, movies were an important component of local television. They filled afternoon slots, weekend slots, late slots.

But because TV is so easily accessible to families, many of the movies that broke through the various taboos were doomed to be either not airable or edited to distraction. 

That’s where cable television came in. Certain premium channels – HBO, Showtime and Starz among them – showed the movies uncut. Then they started developing shows of their own, such as “The Sopranos,” many of them just as compelling as any movie. 

My parents, after a little hesitance because of its depiction of Italian-Americans, came to embrace “The Sopranos.” Although my Dad was always dismissive of anything with “bad language,” both my parents always looked forward to good entertainment. 

That’s the bottom – whether you see it or not – line.

20 – INFORMATION SUPERHIGHWAY

There’s something about the Internet that feels like it’s always been there. And if you’re 35 or younger, it has.

Maybe it’s because we’ve gotten so used to the idea of pulling information seemingly out of the air. Who won the Northwestern basketball game? Who won the Northwestern basketball game on this day in 1974? Who was Grover Cleveland’s first vice president?

There was no Internet in 1954. There were newspapers and magazines. There were encyclopedias to look up facts. You could send a letter to a government agency when you had a question and wait the necessary period of time by your mailbox for the response.

But beginning in the 1960s, engineers began creating networks of computers in order to share unused capacity. Companies began to see these networks as ways to share information.

There were efforts to commercialize the information. Cable TV networks ran scrolled news 24/7 that was formatted to fit a screen. In Britain, teletext used bandwidth to provide what was, at the time, a robust assortment of real-time information.

The information evolution continued. My role in the mid-1980s came at The New York Times, which started – and then foolishly ended – a videotex service that came in through a modem in a dedicated device used for banking. Large all-purpose commercial services were launched, such as CompuServe, Prodigy and America Online, which flooded the country with CD-ROMs to get you started on their service.

With the blossoming of the World Wide Web in the mid-1990s, the information provider middleman was eliminated. You could call up the information yourself if you knew the uniform resource locator, or URL – a line of text that usually started with “http://” – and typed it into the top of a browser.

There were those who thought the Internet was a fad. Hardly. It’s how you’re seeing this. It’s how you’ve seen the vast majority of what you see in the 21st century.

It also took a toll. Many cities no longer have daily newspapers. Online shopping has decimated retailing. When was the last time you saw a World Book Encyclopedia?

It also changed the way we communicate with people. We send e-mail and texts instead of writing letters. Through social media, we keep in touch with people with whom we went to high school in a way that was impossible when we went to high school in the first place.

Like so many other things about 2024, not everything about the Internet is an unmitigated benefit. Fake news and hoaxes and conspiracy theories and racist crap have been shared for centuries, but now it often looks and seems very real. People are more isolated, even with 1,000 “friends” on Facebook.

The Internet is arguably one of the biggest changes of the last 70 years. What becomes of it in 2094 is hard to imagine – and maybe at this point, it’s silly to try.

21 – THE PLAGUE

My parents would have been stunned by the COVID-19 pandemic that began in this country in 2020.

My grandparents wouldn’t have been.

They saw it before. In 1918, the world suffered through an influenza pandemic that killed tens of millions of people – the exact number isn’t known. One third of the people in the world came down with it. The worst of it lasted until 1920.

Over the years, there have been other outbreaks. But people thought epidemiology had advanced enough to minimize the damage. Outbreaks of avian flu and Ebola in this century were kept under control by international intervention and swift vaccination.

We weren’t as lucky in 2020. 

Maybe lucky is the wrong word. Because one problem we had in 2020 was that knuckleheads ran the world’s most powerful nations.

In China, where the pandemic began in the city of Wuhan, Xi Jinping’s totalitarian rule couldn’t stop the virus. And because of his suspicion of outsiders, the world knew little about what was going on in the country of origin.

But the real cetriolo who botched COVID was, no surprise here, Donald Trump.

Trump thought ignoring the spreading virus would make it go away. Otherwise, he feared that his efforts to turn the economy around – vital in an election year – would go for naught.

So there was no sense of urgency in his administration about the virus. And because he wanted to will it out of existence – and not signal weakness to his lunkhead supporters – the United States found itself overwhelmed by cases by the end of March 2020.

The nation was forced to shut down in a way none of us ever imagined. The streets of major cities were deserted. Schools closed. People worked from home – or lost their jobs entirely. Major events were canceled.

Trump held these inane briefings, at one of which he speculated about the possibility of injecting bleach into people to fight the virus. He refused to wear a mask – and his followers did the same, saying that it was an infringement on their freedom and not a proven method of helping to prevent the spread.

So we had refrigerator trucks brought in as makeshift morgues, and the sadness of health care workers putting in long hours to save lives – and then losing their own from exposure to the disease.

It was a horrible time that none of us will soon forget. Nearly 1.2 million Americans – and nearly 8 million people around the world – died from COVID-19. The outbreak is ongoing, but people seem to have decided to live their lives – in large part thanks to the COVID vaccine that so many Americans and others around the world have taken.

My parents didn’t live to see COVID. My Mom died in November 2019, a few months before the world shut down. It would have been bewildering to her. 

Come to think of it, it’s still a little bewildering to us.

22 – DO YOU KNOW THE WAY TO SAN JOSE?

The answer to Burt Bacharach and Hal David’s musical question, at least from New York, is to cross the George Washington Bridge, stay straight on Interstate 80 for about 2,900 miles, and then head south on I-680 to downtown San Jose.

That answer was not nearly as simple in 1954.

There was a federal highway system, created in the 1920s, that often followed the path of settler and Native American trails. But you slowed down in every town with a post office and general store until you got where you were going.

In 1954, President Dwight Eisenhower – who, as a young soldier, took part in the military’s landmark 1919 cross-country drive that lasted two months – was in the middle of developing a system of highways with at least two lanes in each direction. In most instances, the roads would have limited access – entrance and exit ramps; you couldn’t just get on from your driveway.

It took almost 40 years to complete the system. But when it was done, there was a waffle-like grid of fast roads across the nation. Some of the roads – I-10, I-80 and I-90 – completely cross the country east to west; others – I-5, I-35 and I-95 – cross it north to south.

One of the system’s original purposes was national defense. Highways would allow for faster troop movement in the event of an emergency.

But it made America more mobile. It contributed to a boom in driving because it was a lot easier to get in a car for an intercity trip than to put a family on a train or bus.

It was not an unmitigated positive. Many of the routes uprooted established communities, often those with large Black populations, turning healthy neighborhoods into areas of blight. It contributed to a car and truck culture that swallowed gasoline and churned out emissions.

And while some of us are good at finding our way somewhere (not that I’m bragging), many of us haven’t got a clue as to where we’re going.

When I was young, I loved road maps. They were given out for free at gas stations – I still have several of them from the ’50s, ’60s and ’70s. I would study them for hours and plot the day when I would go across the country by car – something my family did after I graduated from high school in summer 1972.

But road maps weren’t the most efficient way to figure out how to get somewhere. Especially during the period when the Interstate Highway System was being built. Often the maps would show broken red lines or even broken plain lines to indicate where highways were being built or planned. 

That’s nice, but you can’t take non-existent roads to San Jose or anywhere else.

In the 1970s, the Defense Department – those guys again – came up with an idea that helped solve the getting lost problem. Using satellites, the developers were able to track vehicles in motion – in space, in the air and on the ground.

The Global Positioning System was strictly a military property until 1983, when a Korean Air Lines passenger jet was shot down by the Soviet Union for straying into its territory. After that, President Ronald Reagan allowed limited use of GPS for civilian purposes, mostly airplane navigation. In 2000, President Bill Clinton directed the use of the system for whatever civilians wanted to do with it.

What civilians wanted to do with it was not get lost in a car. 

Instead of unfolding a map and trying to spot where you were going, you could now punch the address into your phone or, later, into the information screen on your dashboard and get turn-by-turn directions to Grandma’s house or a restaurant in Piscataway. The GPS also told you if there was traffic ahead and whether you wanted to use an alternative route to avoid it.

It doesn’t always work well – who among us hasn’t found ourselves on a dead-end street that the GPS thought went through to the next road?

But if you want to know the way to San Jose, you can find out fast, gas or charge your car, and get on whatever interstate you and Dionne Warwick need to use to get there.

23 – “I JUST WANT TO SAY ONE WORD TO YOU… JUST ONE WORD… PLASTICS”

If Benjamin Braddock, the anti-hero of “The Graduate,” were to have a glimpse of 2024 from his parents’ 1968 pool party, he might have listened to his father’s friend after all.

The world is far more plastic than might have been imagined in 1954 or 1968. Plastic existed back then – there have been various forms of it since the 19th century. But as it became cheaper to make, it began to replace more expensive materials such as glass and metal.

Glass, in particular, got phased out of a lot of products. As I mentioned in No. 25, many bottles and jars were converted to plastic, particularly for soft drinks. But mayonnaise, salad dressing and a whole range of other products were made lighter by the use of plastic containers. Plastic replaced glass in some windows and picture frames.

One form of plastic, polystyrene foam, is better known by its trademark name, Styrofoam. It’s used for cheap coolers and was the insulation of choice for online shippers before the sealed plastic air bubbles came into use.

Another big change came in the ubiquity of plastic shopping bags. A Swedish engineer developed the first ones in the 1960s – within 30 years, they were replacing paper bags for groceries, trash and anything else that needed carrying.

By the 21st century, a trip to the supermarket included as many as a dozen plastic bags, all containing plastic bottles and jars, paid for by ‘plastic,’ which became a slang term for credit cards.

Unfortunately, the plastic bags and bottles went from convenience to nuisance to crisis.

The problem is that plastic doesn’t biodegrade. It photodegrades, meaning it can take 1,000 years for the light and heat to wear down the bag into nothingness. And that disintegration can be toxic, meaning it’s not particularly suitable for landfills.

There is also the mess they make. On a winter train ride into New York, you can’t go very far without seeing stray plastic bags flapping among the bare branches. And there are folks who make a living recovering bottles and other detritus from the sea, in part because fish and aquatic mammals die when they ingest the plastics.

So plastic, while still used in so many more permanent items, is in its outcast days as a packaging material. Plastic bottles, like glass bottles of old, come with a deposit charge of 5-10 cents that you get back when you return them to the store or a collection site.

In many places, you either can’t get a plastic bag in a supermarket or you pay an extra 5 or 10 cents for it (for some reason, that doesn’t count the bags that hold produce or meat). Instead, the store wants you to bring your own bag and reuse it.

That bag is usually made out of plastic.

24 – ALL SMILES

When I was born, my parents shopped for groceries at the A&P around the corner from their apartment in Flushing.

Down Northern Boulevard was a Sears catalog store, at which they could order what they couldn’t get otherwise, particularly tools.

On Main Street was F.W. Woolworth’s, where they went for things they wanted to get cheaply.

All three of those retailers were dominant institutions, with origins in the 19th century. None of them exist in anywhere near that form in 2024.

Instead, what is now the world’s largest retailer was doing business in 1954 as a single Woolworth’s-type five-and-dime in Bentonville, Arkansas.

It wasn’t until 1962 that Sam and Bud Walton opened the first Wal-Mart in Rogers, Arkansas. Not only did Wal-Mart overtake every other retailer – food or otherwise – in the world, it is now the largest company of any kind by revenue, taking in $611.3 billion in its most recent fiscal year. That’s $8 billion more than the Saudi oil company, Aramco.

Wal-Mart, with its smiley face logo, achieved success in part by staying away from places like New York. It built massive stores that carried everything anyone would need, on cheap real estate away from center cities. The volume of those stores – and the low wages it paid the people who worked in them – helped drive down prices.

The biggest retail challenger to Wal-Mart is online – and I’ve taken the liberty of asking “someone” else to write the next part.

Alexa, tell me the history of Amazon.

“Amazon was founded in Jeff Bezos’ garage in 1995. While the company originally only sold books, it’s become one of the world’s largest online retailers.” 

That’s an understatement.

Amazon is so ubiquitous that I’d wager you saw one of its gray trucks with the blue smile swirl at some point today. And its power is not limited to retailing.

Go on, Alexa.

“A couple of Amazon’s best (ED: Alexa’s word, not mine) innovations include the 2002 launch of Amazon Web Services and the 2007 launch of the Kindle. More recently, the 2010 creation of Amazon Studios offers Prime members award-winning original series and films.”

Not only has Amazon bigfooted other retailers, it has created products that it can sell and then pay itself for. That includes…

“In 2014, the company launched its first smart speaker, the Amazon Echo, featuring its Alexa assistant.”

The dominance of Wal-Mart and Amazon (you can throw in Target for the discerning) has given consumers experiences they couldn’t imagine in the checkout line at A&P. A new novel can be at your door a day after you order it. Groceries show up whenever you need them.

This is not, however, an unmitigated blessing. In fact, there are those of you who bemoan all of this.

And there’s an argument to be made. Amazon and Wal-Mart bear some responsibility for the glut of empty store space in this country. They have limited the idea of shopping in a town or village to boutique stores in touristy places. And their demands on their workforces have made being in their employ seem to be only for the desperate.

They seem like monoliths – the futuristic Buy n Large of the Pixar film “WALL-E” or Engulf & Devour of Mel Brooks’ 1976 comedy “Silent Movie.”

But 70 years ago, Sears and A&P and Woolworth seemed unbreakable, too. Now Sears has a few stores in California, Woolworth’s has morphed into Foot Locker and the only thing left of A&P is the Eight O’Clock Coffee brand that some company bought.

Hard to imagine for Wal-Mart and Amazon. But obviously not impossible.
