
Journalism, it’s often said, is the first draft of history. That draft sometimes is found under a big headline on the front page. Other times it’s less conspicuous, perhaps on Page 6. Almanacs are full of lists of global and national historic events. But our “This Day in History” feature invites you not just to peruse a list, but to take a trip back in time to see how a significant event originally was reported in the Chicago Tribune.

Check back each day for what’s new … and old.

US makes it illegal for employers to fire women for becoming pregnant — Oct. 31, 1978

CLICK HERE to read the full story from Wed, Nov 1, 1978

“Bill signed to prevent job bias in pregnancy”

President Jimmy Carter signed the Pregnancy Discrimination Act on Oct. 31, 1978, extending job protections to pregnant women.

When people get nostalgic for “the good old days,” they often forget things like employers feeling they could fire mothers-to-be for becoming mothers-to-be.

But just two years earlier, the U.S. Supreme Court ruled in General Electric v. Gilbert that pregnancy discrimination was not a form of sex discrimination.

The new law, an amendment to Title VII of the Civil Rights Act of 1964, was a direct response, extending the protections the court had declined to recognize.

“Carter said he is convinced job discrimination based on pregnancy, childbirth and related medical conditions ‘constitutes discrimination based on sex,’ and said passage of the bill shows Congress shares ‘my unalterable opposition to such discrimination,’” the report in the Tribune said.

“It does not bestow favorable treatment on America’s 42 million working women,” the President said.

Rather, it was intended to make sure employees who are about to become mothers aren’t punished for it in hiring, firing, pay, promotions and other considerations.

“Women affected by pregnancy, childbirth, or related medical conditions shall be treated the same for all employment-related purposes, including receipt of benefits under fringe benefit programs, as other persons not so affected but similar in their ability or inability to work,” the law says.

Critics note the law is imperfect.

For one, it applies only to businesses with 15 or more employees. But it has been bolstered over the years and provides a basis from which to fight for greater protections.

But the idea the law is needed at all spoke – and speaks – volumes.

The first draft of Chicago history at your fingertips at newspapers.com.

President Bush throws first pitch at World Series, miles from Ground Zero — Oct. 30, 2001

CLICK HERE to read the full story from Wed, Oct 31, 2001

“Flag-waving fans cheer Bush at Series”

Exactly seven weeks after the terror attacks of 9/11, President George W. Bush stood alone on the pitcher’s mound at New York’s Yankee Stadium before Game 3 of the World Series on Oct. 30, 2001.

Wearing a bulletproof vest under a blue New York Fire Department jacket, Bush delivered a ceremonial first pitch – a near-perfect strike to backup Yankees catcher Todd Greene – that provided one more image for a period deeply ingrained in the memories of Americans who lived through it.

The Chicago Tribune’s Teddy Greenstein noted the ceremonial gesture came “amid intense security and equally intense emotion … after team members and fans stood for a moment of silence to remember the victims” of the attacks on the World Trade Center Twin Towers, the Pentagon and aboard the jet that passengers crashed in Pennsylvania to prevent terrorists from using it as a weapon, too.

Adding to the drama of the moment, just one day earlier, the U.S. government announced it was evaluating a “credible” new terrorist threat.

After President Bush threw his pitch, Bob Sheppard, the Yankees’ long-time public address announcer, said, “Thank you, Mr. President.”

The stadium organist played “Deep in the Heart of Texas,” an acknowledgement of Bush’s Lone Star State roots and one-time ownership of baseball’s Texas Rangers.

Then the crowd of 55,820 ticketholders spontaneously broke into a chant of “USA, USA.”

“Very nice throw,” Arizona Diamondbacks manager Bob Brenly said, when Bush walked to one of the baselines to pose for a picture with him and Yankees manager Joe Torre. “Good stuff, Mr. President.”

Torre’s Yankees would win the game, 2-1. But the Diamondbacks took the Series in seven games.

Bush was the first president to throw out the first pitch at a World Series game since Dwight Eisenhower in Game 1 of the 1956 World Series at Brooklyn’s Ebbets Field.

But it’s a safe bet Ike’s pitch didn’t require the security necessary in 2001. While the full extent of precautions at the state, local and national level was not known, the city of New York reportedly had 1,200 uniformed police officers assigned to the game, played only 11 miles or so from Ground Zero.

That’s not what most people remember, though.

What they recall is the president alone in the center of a packed stadium of flag-waving fans, throwing a strike.

And as he left the field, Greenstein reported, cameras showed a banner unfurled in the stands.

It said: “USA fears nobody. Play ball.”

Wall Street’s ‘Black Tuesday’ — Oct. 29, 1929

CLICK HERE to see the full front page from Wed, Oct 30, 1929

“Stock slump ends in rally”

Stock prices had begun to slide on Wall Street the month before. But the market truly dropped off a cliff on Black Thursday, then four days later on Black Monday and finally – on Oct. 29, 1929 – Black Tuesday.

The infamous Wall Street crash of 1929 is associated with the start of the Great Depression that afflicted America and other industrialized nations. But there were many factors here and abroad, not the least of which were farm overproduction and the spasms and collapses of the banking system.

To the extent the U.S. stock market is an indicator of any sort of the economy as a whole, however, it’s notable the Dow Jones Industrial Average fell roughly 11% on Black Thursday, then 12.8% on Black Monday and 11.7% on Black Tuesday.

Volume was so heavy that tickers could not keep up, even as brokers scrambled to sell, and billions of dollars in value evaporated.

“An incredible stock market tumbled toward chaos today despite heroic measures adopted by the nation’s greatest bankers,” the Tribune’s Tom Pettey reported of Black Tuesday. “At the end there was an erratic rally.

“It was a day of tremendous activity, with climaxes punctuating every hour. Only the rally of the late hour saved the market from utter collapse.

“It was then the bankers and investment buyers dipped into a grab bag filled with hornets and sought furiously to rally prices. They succeeded after a fashion, but no one as yet knows the full temper of this history making market. Another day should tell the story.”

The next day didn’t tell the story. Not the whole one, anyway.

The influx of capital from the Rockefeller family and other financial titans seeking to prop up the teetering market at the end of Black Tuesday did in fact seed a rally at the very end of the trading day, and there was a 12.3% bounce on Wednesday.

But the Dow Jones Industrial Average did not return to its closing high-water mark of 381.17 on Sept. 3, 1929, until – get ready for this – Nov. 23, 1954, more than 25 years later.

Whatever recovery there was proved short-lived. Another slide began in April 1930 that didn’t bottom out until July 8, 1932, at 41.22.

That’s a loss of 89% in value in the index over a little more than 34 months.

Part of the problem was the amount of speculation in the market, thanks to people and institutions – including banks – borrowing heavily to invest in the naive belief share prices would always go up.

With President Franklin Roosevelt’s election in 1932 came the New Deal, which included regulations and laws to prevent or at least slow the kind of panic sell-offs seen in the crash. It also helped alleviate some of the worst hardships of the Depression.

But it was actually the United States’ involvement in World War II that pulled this country and its economy to the other side of this trying time.

Statue of Liberty unveiled — Oct. 28, 1886

CLICK HERE to read the full story from Fri, Oct 29, 1886

“Our lady of freedom”

President Grover Cleveland and other dignitaries were joined by a crowd estimated perhaps 1 million strong for a parade and other ceremonies to dedicate the Statue of Liberty in New York on Oct. 28, 1886.

The giant copper-clad figure, symbolically welcoming those seeking to enjoy America’s freedoms and opportunities, was a gift from the people of France and designed by French sculptor Frederic Auguste Bartholdi.

Its original metal framework (later replaced during one of the statue’s restoration efforts) was built by Gustave Eiffel, who is better known for his contribution to the skyline of Paris although Lady Liberty is just as iconic.

“A gloomy morning did not dampen the enthusiasm of the crowds which were early in the streets to witness the ceremonies commemorative of the unveiling of Bartholdi’s Statue of Liberty Enlightening the World,” the Chicago Tribune reported.

There is some dispute over when the idea for the statue took hold, but it was more than a decade and a half in the making.

Lady Liberty’s torch was a tourist attraction in and of itself, first in Philadelphia and then New York, years before the full statue was completed.

Its head was showcased at the 1878 World’s Fair in Paris, where the full statue would be constructed before being shipped in parts to America and reassembled.

For the modern colossus, the United States provided the site, then called Bedloe Island and now known as Liberty Island.

A multitude of Americans young and old, rich and of modest means, contributed funds to build a proper platform for it, with Joseph Pulitzer’s New York World newspaper leading the campaign.

“That the grandeur of this memorial has not been exaggerated may be judged from a glance at the actual dimensions of the figure,” the Tribune reported. “The total height of the statue and pedestal, from low-water mark to the top of the torch … is 305 feet, 11 inches.

“The forearm is 16½ feet in circumference. The nail of the finger is 12 inches in length. The head is 15 feet in height and 40 persons can be accommodated within its interior.”

(Visitors could climb up the statue’s right arm and out onto the torch as well until 1916.)

President Cleveland, who years earlier vetoed a bill to help fund the platform when he was governor of New York, ironically got to enjoy a leading role in its dedication.

He said the statue’s “light shall pierce the darkness of ignorance and man’s oppression until Liberty enlightens the world.”

That sentiment would be echoed by another President at the statue’s 1986 centennial celebration.

“We are the keepers of the flame of liberty,” Ronald Reagan said. “We hold it high for the world to see.”

Chicago Tribune editorial ridicules idea of war with Japan, calls attack on Hawaii ‘a military impossibility.’ — Oct. 27, 1941

CLICK HERE to read the full story from Mon, Oct 27, 1941

“Mr. Knox spies a war”

A Chicago Tribune editorial on Oct. 27, 1941, dismissed as “alarmist” a warning from the U.S. Secretary of the Navy that “a collision” with Japan seemed “inevitable” and could “occur on 24 hours’ notice.”

The institutional voice of Col. Robert McCormick’s newspaper declared without reservation, “Even our base at Hawaii is beyond the effective striking power of her fleet,” calling such an attack “a military impossibility.”

Eschewing a letter to the editor, Japan submitted a rebuttal 41 days later – on Dec. 7.

Until the surprise attack on Pearl Harbor on the Hawaiian island of Oahu made entry into World War II inevitable, the Tribune editorial page, which consistently opposed President Franklin Roosevelt, preached the gospel of isolationism both in the Pacific and across the Atlantic vs. the Nazis.

Much is made of the Chicago Tribune jumping the gun on election results in 1948, publishing the erroneous banner headline “Dewey defeats Truman” on its Nov. 3 front page.

That newsroom nightmare was preserved for posterity in a photograph of reelected President Harry Truman holding up a copy gleefully. It serves as a warning to this day of the danger of getting ahead of facts in reporting, not just at the Tribune but media outlets everywhere.

The Tribune editorial page’s rejection of a possible Japanese attack is less celebrated, but hardly an aberration.

Even after the war, McCormick would express resentment that Roosevelt’s war efforts advanced the causes of Soviet communism and British imperialism.

That’s the prism through which to view the editorial pooh-poohing remarks Navy Secretary Frank Knox made to ordnance manufacturers, urging them to continue building up a stockpile of arms just in case.

A one-time Chicago Daily News publisher and part-owner who had been the Republican vice presidential nominee alongside Alf Landon in 1936, Knox was hardly a saint.

The man, who three days before the bombing of Pearl Harbor was telling people the Navy would not be caught napping, called for internment of Japanese Americans as early as 1933 and deserves a good deal of blame for the implementation of camps during the war.

But in calling for America’s arms makers to be ready for possible conflict, he was spot on, and the Tribune had no use for it.

“Mr. Knox wants the country to believe that we may be at war with Japan at any moment. War for what?” the editorial said, arguing there was no way Japan could threaten U.S. vital interests.

Dismissing Hawaii as a possible target, the editorial also shrugged off the idea that a Japanese attack on the Philippines would represent something worth fighting over.

Japan’s conflict with China, the editorial said, should be worrisome to Great Britain, not the United States, as the United States was far less reliant on foreign trade that could be imperiled.

“Mr. Knox says a collision is inevitable,” the editorial said. “It is only inevitable if Mr. Knox and Mr. Roosevelt are intent upon embroiling us in a war in which we have no business, from which we are assured before we enter it will not have the slightest hope of profit, and which we know will cost us many thousands of American lives.”

Many lives were in fact lost, but those Americans did not die in vain.

White Sox win World Series — Oct. 26, 2005

CLICK HERE to see the full coverage from Thu, Oct 27, 2005

“Believe it!”

At 11:01 p.m. on Oct. 26, 2005, the White Sox gave Chicago its first World Series championship since 1917 – the first in most fans’ lifetimes.

White Sox shortstop Juan Uribe fielded Orlando Palmeiro’s grounder and threw to Paul Konerko for the final out of Game 4 at Houston’s Minute Maid Park.

The 1-0 victory over the Astros completed the Sox sweep of the National League champs, ending an 88-year baseball title drought, dating back to the South Siders’ World Series defeat of the New York Giants in six games during World War I.

“A lot of people have waited a long time for this moment, and I’m happy that we were able to give it to them,” said Ozzie Guillen, a former Sox star who guided them to the American League pennant and a World Series title in only his second season as manager. “I didn’t come here for the glamor. I didn’t come here for the money. I came here to win.”

Sox starter Freddy Garcia went seven innings in the decisive game, and relief pitchers Cliff Politte, Neal Cotts and Bobby Jenks preserved the shutout.

Jermaine Dye, whose two-out single drove in Willie Harris from third off the Astros’ Brad Lidge in the eighth inning for the lone run of the game, was named Series most valuable player.

While the White Sox had disposed of the Astros in the minimum four games, it wasn’t exactly a romp.

Game 1 and Game 4 each were decided by a single run.

The middle two games – including the marathon five-hour, 41-minute, 14-inning Game 3 that ran until 1:20 a.m. on the 26th – each were decided by only two runs.

But what’s remembered 15 years later is that the 2005 White Sox were in first place from opening day to season’s end, and lost just once in the postseason on their way to the title.

And another thing White Sox fans remember quite well: They ended their World Series drought well before the Cubs.

Former Interior Secretary convicted in Teapot Dome scandal — Oct. 25, 1929

CLICK HERE to read the full story from Sat, Oct 26, 1929

“Doheny is next to face court; Fall convicted”

Bribed in exchange for awarding no-bid commercial drilling rights to naval oil reserves while he was President Warren Harding’s Interior Secretary, Albert Fall was convicted on Oct. 25, 1929, for his pivotal role in the Teapot Dome scandal.

The conviction earned Fall an historic distinction as the first person ever found guilty of a crime committed while serving as a U.S. cabinet member.

But, while the Teapot Dome scandal has been eclipsed among corrupt episodes at the highest levels of the federal government and faded in memory, Fall’s legacy lives on in a remark he made during the congressional inquiry into his venal actions.

Explaining the concept of oil field drainage, Fall said, “Sir, if you have a milkshake and I have a milkshake and my straw reaches across the room, I’ll end up drinking your milkshake.”

The line was adapted memorably by filmmaker Paul Thomas Anderson in 2007’s “There Will Be Blood,” uttered by the ruthless oilman Daniel Day-Lewis played en route to an Oscar.

Fall illegally awarded cut-rate leases to oil rights without bidding at Teapot Dome in Wyoming as well as the Navy’s Elk Hills and Buena Vista reserves in California.

Edward Doheny of Pan American Petroleum and Transport, who won the California leases, gave Fall an interest-free $100,000 “loan” so Fall could buy land for his New Mexico ranch.

Harry Sinclair of Mammoth Oil, who scored the Wyoming lease, gifted Fall with livestock for the ranch and transferred roughly $300,000 in bonds and cash to Fall’s son-in-law.

That should give an idea of just how much more their oil companies thought they would make off the deals. The Supreme Court voided the ill-gotten deals in 1927, however.

When the verdict against Fall was announced in Washington, D.C., it apparently was quite a scene.

“A veritable bedlam broke loose in the courtroom this morning when the jury of four women and eight men, after nearly 24 hours’ deliberation, reported their finding that the $100,000 passed by Doheny to Fall in a ‘little black bag’ was an outright bribe, rather than the ‘innocent loan’ which both asserted it was,” the Tribune reported.

“While the aged and broken ex-secretary sat dazed under the blow that branded him a felon, members of his family sobbed aloud. Mark Thompson, Fall’s personal attorney and lifelong friend, gasped and collapsed on the courtroom floor, and Doheny, purple and trembling with rage, cursed the court and was led out, shaking his fist and shouting as he went.”

The irony is that while Fall was convicted of taking the bribes and wound up serving a year behind bars, neither Sinclair nor Doheny was convicted of bribing him.

Sinclair wound up with a fine and a six-month sentence for contempt of court.

Doheny walked and his company – to add insult to injury, and perhaps support his claim that the Fall “loan” was legit – foreclosed on Fall’s home.

The 40-hour work week becomes official — Oct. 24, 1940

CLICK HERE to read the full story from Wed, Oct 23, 1940

“2 million feel effect of new 40 hour week”

Today it’s taken for granted that a standard work week is 40 hours, but the legal requirement was set by the Fair Labor Standards Act of 1938 and took effect on Oct. 24, 1940.

“Nearly 2 million employees will be affected by the new 40-hour week,” the Chicago Tribune’s John Fisher wrote in the paper’s Oct. 23 edition.

“Maximum hours of work in industries subject to the law will be reduced from 42 to 40, and all time worked above the 40-hour limit must be compensated for at the rate of time-and-a-half the usual hourly rate. A year (earlier) the maximum hour week was reduced from 44 to 42 hours in accordance with the law enacted in 1938.”

Of the 975,900 or so Illinois workers covered by the law, it was expected 208,768 would be affected by the new standard.

President Franklin Roosevelt’s administration insisted businesses adhere to the new restrictions despite the urgency of its defense program.

It was not lost on anyone that this surely would increase overhead costs for employers, either through now-requisite overtime pay or having to add workers, which would help reduce unemployment.

Added to that, the minimum wage, which had been 30 cents an hour, was to be bumped up to 40 cents. That would be $7.44 in 2020 money.

The actual federal minimum wage in 2020: $7.25.

Steve Jobs introduces Apple’s iPod — Oct. 23, 2001

CLICK HERE to read the full story from Mon, Oct 29, 2001

“Elegant gizmo lines up all your tunes like peas in an iPod”

Apple’s Steve Jobs stood on stage in his customary black turtleneck on Oct. 23, 2001, but it was a low-key presentation by today’s standards. The product Jobs unveiled: the iPod.

Consumers did not camp out en masse in anticipation of its arrival in stores, but the iPod helped revolutionize the way mainstream audiences consumed music.

It also proved a major advance toward the introduction of the world-changing iPhone six years later. People did line up to buy that.

“Joining the lineup of iMac and iBook is iPod, a $400 pocket-size hard drive that carries up to 1,000 songs in a metallic case about the size of a pack of cigarettes,” Chicago Tribune tech guru James Coates wrote the following week.

The new device, Coates said, was an “elegant and expensive offering for the overserved I-want-my-MP3s-set.”

Jobs had explained Apple was taking on digital music because everyone loves music and no company – neither small upstarts nor electronics giants such as Sony – had broken through with consumers.

“We think not only can we find the recipe, but we think the Apple brand is going to be fantastic because people trust the Apple brand to get their great digital electronics from,” Jobs said.

Introduced a few months after Apple launched its iTunes media player, the iPod was an MP3 player that could deliver CD-quality sound and fit in a pocket.

Its first iteration was 2.4 inches wide, 4 inches tall and 0.78 inches thick. It weighed 6½ ounces, lighter than many mobile phones in 2001.

Back then the idea that one could walk around with a 5-gigabyte hard drive was an eye-opener. But Jobs touted an ultra-thin hard drive and a battery efficient enough to keep the songs playing for 10 hours.

Apple’s FireWire connection made it possible not just to recharge the device to 80% in an hour, Jobs said; it also could facilitate the upload of an entire CD in 10 seconds.

The iPod would evolve in time. The next version ditched its physically rotating scroll wheel for a touch-based one and was Windows-friendly.

Video eventually was added to the mix, as were some of the apps found on the iPhone.

The groundbreaking smartphone eclipsed the iPod in a few years, having subsumed its best capabilities, but the latest iPod Touch runs $199 and has 256 gigabytes of storage.

With music more likely today to be streamed, the iPod Touch’s appeal is primarily as a gaming device with the ability to message and video chat via WiFi.

But back in 2001, the original iPod was cutting edge.

“The gadget fits into the new Apple line sweeter than an overripe Granny Smith rolled in sugar,” Coates said.

Kennedy tells nation of Cuban Missile Crisis, announces blockade — Oct. 22, 1962

CLICK HERE to read the full story from Tue, Oct 23, 1962

“Quarantine of Cuba on! Kennedy takes 7 steps”

President John F. Kennedy announced on Oct. 22, 1962, that he had ordered a U.S. naval “quarantine” of Cuba in response to the Soviet Union’s effort to install offensive nuclear weapons on the island nation 90 miles from Florida.

The Cuban Missile Crisis had the world on high alert with the Cold War in real danger of overheating.

The White House and the Kremlin were engaging in brinksmanship at the highest level. At stake was … everything.

“In a dramatic and perhaps fateful radio-television address to the nation the President said he was (acting) to counter what he has now determined is a Russian buildup in Cuba of offensive missile sites,” Chicago Tribune Washington correspondent Laurence Burd reported.

Kennedy “said the sites will be capable of launching nuclear strikes against major cities in this country and Latin America. In addition, he said, Red jet bombers capable of dropping nuclear weapons are now being uncrated and assembled in Cuba and air bases being prepared for them.”

Soviet leader Nikita Khrushchev two days later, on Oct. 24, would call the quarantine an “act of aggression” and said Soviet Cuba-bound ships would proceed. U.S. forces turned back some ships but allowed through those without offensive weapons.

Surveillance flights meanwhile showed the Soviet missile sites were close to becoming operational.

Facing a stalemate, Kennedy told advisors on Oct. 26 he believed only a U.S. attack on Cuba would get the missiles out. But ABC News reporter John Scali informed the White House he had been approached by a Soviet agent who seemed to want officials to know the Soviets would remove their missiles if the United States agreed to not invade.

White House personnel were still trying to figure out if Scali’s back-channel info was legitimate when Khrushchev, in the middle of the Moscow night, sent Washington an unexpectedly emotional letter with a proposal echoing what Scali had passed along.

“If there is no intention to doom the world to the catastrophe of thermonuclear war, then let us not only relax the forces pulling on the ends of the rope, let us take measures to untie the knot,” Khrushchev wrote.

The apparent ray of hope soon dimmed, however. On the 27th, not only was a U.S. U-2 spy plane shot down over Cuba, but Khrushchev sent another letter saying there would be no deal without the United States also removing its Jupiter missiles from Turkey.

President Kennedy gambled by responding to Khrushchev’s first letter, as if he never saw the second. Attorney General Robert Kennedy, meanwhile, was sent to tell Soviet Ambassador Anatoly Dobrynin that there were plans to remove the Jupiter missiles from Turkey, but the move couldn’t be publicly linked to the Cuba situation.

Khrushchev issued a public statement on Oct. 28 that the Soviet missiles in Cuba would be dismantled and removed, ending the crisis.

Yet the naval quarantine remained in place until Nov. 20, when the Soviets also agreed to pull their IL-28 bombers out of Cuba.

The Jupiter missiles were taken from Turkey the following April.

The near-disaster spurred at least two future developments.

It got the White House and the Kremlin to set up a direct hotline so the two sides could communicate immediately when warranted. It also prodded the two superpowers toward nuclear test-ban treaty talks.

Chicago’s Saul Bellow wins Nobel Prize for Literature — Oct. 21, 1976

CLICK HERE to read the full story from Fri, Oct 22, 1976

“Saul Bellow: ‘Child in me is delighted'”

Canadian-born Chicagoan Saul Bellow on Oct. 21, 1976, became the first U.S. winner of the Nobel Prize for Literature since John Steinbeck 14 years earlier.

“The child in me is delighted,” Bellow said. “The adult is skeptical.”

Bellow also was that year’s second University of Chicago professor to win a Nobel Prize. Milton Friedman, whose office was in the same building as Bellow’s, won in economics.

A 7 a.m. call from a Reuters reporter alerted Bellow to his victory. The phone didn’t stop ringing after that, according to Chicago Tribune reporter Timothy McNulty.

It already was scheduled to be moving day for the author of “The Adventures of Augie March,” “Herzog” and “Humboldt’s Gift,” who was leaving his South Side apartment for a place on the North Side, so things were bound to be chaotic in any case.

“There he stood, in his green turtleneck shirt and black leather jacket, telling the movers to ‘take that green chair over there next,’ while a Swedish television reporter asked how he felt about immortality,” McNulty observed.

“‘I wish I could believe that the Nobel Prize brings immortality,’ Bellow said. ‘Don’t leave that green chair, fellows. No, immortality is spiritual.’”

Bellow was born in 1915, a child of Jewish Russian immigrants, who left Quebec for Chicago’s West Side when he was 9. He didn’t become a naturalized U.S. citizen until the 1940s, when he discovered his family had come into this country illegally.

By the time of his 2005 death at 89 in Massachusetts, the one-time Trotskyite had polarized many with his views, but in some ways he always had.

Bellow attended the University of Chicago but transferred to Northwestern because it was less expensive. His plans to study literature there were scuttled by what he perceived to be anti-Semitism in the English department. So, he earned his honors degree in anthropology and sociology and, like the city where he lived most of his life, they would inform his work.

“It’s a tremendous tribute that these American Jews who both celebrate the same liberty from imposed restraint should win it in the same year,” author and U of C professor Richard Stern told McNulty of Friedman and Bellow. “The social structure which Friedman envisages sees the Bellovian vitality, energy, and construction as its finest product.”

Bellow shrugged off some of the labels pinned on him.

“I’m not a Jewish writer, I’m not a Chicago writer, I’m a modern writer,” he said at a news conference. “But I don’t mind giving a boost to the old town.”

As for his plans for the $160,000 that came with the Swedish honor (which translates to roughly $720,000 in 2020), Bellow said, “At this rate, my heirs will get the money in a day or so.”

Nixon’s ‘Saturday Night Massacre’ — Oct. 20, 1973

CLICK HERE to read the full story from Sun, Oct 21, 1973

“Nixon fires Cox and Ruckelshaus; Richardson quits!”

Trying in vain to thwart investigation of the Watergate break-in and the subsequent cover-up, President Richard Nixon triggered resistance from the Justice Department on Oct. 20, 1973, in what became known as the “Saturday Night Massacre.”

It was nothing short of an assault on the institutional integrity of the Department of Justice, and it exacerbated an ongoing constitutional crisis rather than quelling it.

“President Nixon fired Special Watergate Prosecutor Archibald Cox tonight,” Chicago Tribune Washington correspondent Glen Elsasser wrote beneath a huge headline. “Atty. Gen. Elliot Richardson resigned in protest, and Deputy Atty. Gen. William Ruckelshaus was then fired after he refused Nixon’s order to dismiss Cox.

“The shocking developments came in rapid fire after Cox pointedly refused to obey Nixon’s order to stop seeking the disputed White House tape recordings of conversations between the President and Watergate principals.

“Nixon’s action left Cox’s sweeping Watergate probe in shambles and threw it back into a stunned Justice Department under the direction of United States Solicitor General Robert H. Bork, whom the President designated to succeed Richardson temporarily.”

Cox issued a statement that night: “Whether ours shall continue to be a government of laws and not men is now for Congress and ultimately the American people” to decide.

The government of laws won.

Rather than cripple the Watergate investigation, Nixon’s tactic further empowered it by greatly eroding his public and political support just 10 days after Vice President Spiro Agnew resigned to plead no contest to charges he failed to report bribes on his tax returns.

The Tribune reported its switchboard was deluged with calls, all but one critical of Nixon.

Impeachment proceedings would begin 10 days later. Leon Jaworski was named Cox’s successor. A federal judge would rule Nixon’s actions were an abuse of power.

The incriminating tapes eventually became public and precipitated Nixon’s resignation nearly 10 months later.

A legacy of the “Massacre” is the Ethics in Government Act of 1978, intended to foster transparency and accountability.

Bork later would say he believed Nixon was acting within his rights, but that he considered resigning so as not to be seen as doing the president’s bidding. He came to be seen that way anyway, however.

In his posthumous memoirs, Bork said Nixon dangled a spot on the U.S. Supreme Court but resigned in August 1974 before there was a chance to appoint him. Nixon’s successor, President Gerald Ford, would nominate John Paul Stevens to replace William O. Douglas in 1975.

President Ronald Reagan put Bork’s name forward for the high court in 1987 to fill Lewis Powell’s seat, but the nomination was rejected by the Senate, 58-42, buoyed by six Republican no votes. The job went to Anthony Kennedy, whom the Senate approved 97-0.

John DeLorean arrested on cocaine charges — Oct. 19, 1982

CLICK HERE to read the full story from Wed, Oct 20, 1982

“DeLorean held in cocaine deal”

John Z. DeLorean was arrested on Oct. 19, 1982, in Los Angeles, where the FBI said the 57-year-old executive was picking up 220 pounds of cocaine in a $24 million deal he hoped would save his eponymous sports car company, which was insolvent and deeply mired in debt.

His arrest and that of two others following a videotaped sting came as the British government said it was shuttering DeLorean’s factory in Northern Ireland.

DeLorean’s attorneys would argue successfully at trial he was the victim of entrapment, and he was found not guilty.

But by the time of his acquittal in 1984, his DeLorean Motor Company had gone under and his reputation was trashed.

The company’s legacy is one model, a distinctive but slow-selling stainless-steel two-seat car with gull-wing doors dubbed the DMC-12. Only about 9,000 were produced.

It already would evoke a measure of nostalgia by 1985, when it was used as a time-machine prop in the first of the “Back to the Future” films.

DeLorean once had been a star at General Motors and made quite a name for himself, mixing with celebrities, romancing actresses and models, and acquiring stakes in the NFL’s Chargers and Major League Baseball’s Yankees.

The youngest division head in GM history, DeLorean had overseen development of successful models such as Pontiac’s GTO and Firebird before quitting in 1973 to start his own company after butting heads one too many times with GM’s culture.

He would later collaborate with J. Patrick Wright on a critical book, “On a Clear Day You Can See General Motors,” published in 1979.

DeLorean’s company was backed by investors including talk show host Johnny Carson and enjoyed support from dealers. But it still took close to eight years for DeLorean to get his car to market, and the economy was not exactly welcoming.

The Lotus-designed two-seater with a rear-mounted Renault V-6 engine originally was projected to cost $15,000 at a time when the average vehicle ran around $10,000.

But the DMC-12 wound up with a $25,000 price tag (which would be close to $75,000 today). And the car generated tepid response from critics and consumers.

DeLorean in later years wound up starting a watch company, and he was said to be playing with the idea of a return to the car business in the years before his death in 2005 at age 80.

His legal troubles had taken their toll, however, and financial stress had required him to sell his estate in Bedminster, N.J. Today, it is part of a golf club owned by President Donald Trump.

First long-distance Chicago-New York telephone call — Oct. 18, 1892

CLICK HERE to read the full story from Wed, Oct 19, 1892

“Talks to New York”

Many these days prefer to text rather than speak, but the ability to call rather than send a telegram or letter was an innovation to be celebrated on Oct. 18, 1892, when the first long-distance telephone call was made between Chicago and New York.

Chicago Mayor Hempstead Washburne, seated in the Quincy Street offices of AT&T, spoke to New York Mayor Hugh Grant – connected by 950 miles or so of copper wire strung along some 42,750 wooden poles – heralding the imminent launch of phone service between the two cities.

“Spoke to” is the correct description as the demonstration was not a complete success.

It seems that attaching 40 receivers to the line for the benefit of the crowd of 80 or so on hand in New York diminished the volume of each, making it near impossible for Mayor Grant to hear much on his phone.

There was no such problem in Chicago, where 60 people were gathered, and the event was hailed as a great triumph.

“Mayor Washburne at the Chicago end of the long-distance telephone may have heard distinctly what Mayor Grant tried to say to him at the New York end of the wire this afternoon, but Mayor Grant was unable to hear a single word from Mayor Washburne,” the Tribune reported.

They nevertheless seemed to have a conversation, the technological breakthrough tied to a week of Chicago celebrations dedicating its World’s Columbian Exposition fairgrounds, which would not open to the public until the following spring.

Washburne: “The city of Chicago greets the city of New York.”

Grant: “The city of New York returns the compliment and wishes you all success in the grand celebration you are to have this week.”

Washburne: “The city of Chicago returns the greeting and congratulates the nation on an American invention which shall supplement the telegraph and enable the people of the continent to communicate orally where they now resort to the mail and the telegraph.”

Music was played over the phone line. There was a recitation of Tennyson’s “Charge of the Light Brigade” and a whispering of “Mary Had a Little Lamb.”

But a highlight may well have been telephone inventor Alexander Graham Bell in New York speaking with his former assistant, William Hubbard, in Chicago. The National Portrait Gallery’s collection has a photograph of Bell on the call with an audience watching.

AT&T estimated it would be two to three weeks before the service would be available to the public.

“The tariff for five minutes talk with New York will be $9,” the Tribune reported. “The company’s officials are sanguine that they will do excellent business.”

Al Capone convicted of tax evasion — Oct. 17, 1931

CLICK HERE to read the full story from Sun, Oct 18, 1931

“U.S. jury convicts Capone”

The feds finally nailed murderous Chicago mob boss and bootlegger Al Capone, who was found guilty of tax evasion on his illicit earnings on Oct. 17, 1931.

The verdict was returned to Federal Judge James H. Wilkerson at 11:10 p.m. after eight hours, 19 minutes of deliberation.

Judge Wilkerson, who earlier had thrown out a plea bargain that would have made a trial unnecessary, ultimately sentenced Capone to 11 years in prison, plus a sizable fine and liability for his back taxes plus interest.

Capone would go to prison in May 1932 at age 33, effectively ending his seven-year reign as a crime boss. He already was suffering signs of mental decline from venereal disease.

Paroled in November 1939, at least in part because of his health, Capone died in early 1947, just eight days after his 48th birthday.

Of the 23 counts in two indictments Capone faced, the jury found him guilty of only five.

“The counts on which Capone was found not guilty cover the charges of tax evasion for the years 1924, 1928 and 1929,” the Tribune’s Philip Kinsley reported. “Capone was in prison in Philadelphia during 1929 and the jury considered that his negotiations with his Washington lawyer, Lawrence P. Mattingly, were an effort to reach a compromise on his income taxes for 1928 and 1929. In 1924 the government’s evidence showed an expenditure of only $4,500 by Capone, for an automobile.”

At the time of his conviction, Capone was appealing a six-month sentence for contempt of court and was under indictment for 5,000 violations of the Volstead Act, which enforced Prohibition’s ban on the manufacture, distribution and sale of alcoholic beverages. (Prohibition itself would not end until late 1933.)

Capone had shown up in court in a new suit that morning, “barbered to the last inch of shining comfort,” the Tribune said, adding he had begun the day’s proceedings “cheerful and smiling, but he was obviously worried.”

Summoned from his headquarters at the Lexington Hotel to hear the late-night verdict, he traveled by limousine to the courthouse.

The scene there played out far differently than as depicted in, say, 1987’s “The Untouchables,” which starred Robert De Niro as the mobster known as Scarface.

When a not guilty verdict was read first, Kinsley noted, “Capone’s face broadened into a smile for a fraction of a second.”

But as it was followed by guilty verdicts, “Capone dropped his eyes for a moment. He forced a sickly half grin and then spoke to” his lawyers “as they rose to advance to the bar. Almost immediately the gang chief resumed his previous attitude. He smiled again, and later, in the corridors, joked and chatted with friends.”

Asked by a reporter how he was feeling, Capone again said, “I’m feeling fine,” but indicated his attorneys would do all his talking from then on.

John Carlos and Tommie Smith give Black Power salute at Olympics — Oct. 16, 1968

CLICK HERE to see the full page from Thu, Oct 17, 1968

“Smith and Seagren win gold medals”

Having finished first and third in the 200-meter dash at the Mexico City Olympics on Oct. 16, 1968, U.S. sprinters Tommie Smith and John Carlos stood on the medal stand with heads bowed, gloved fists raised in the fashion of a Black Power salute, wearing black socks but no shoes as “The Star-Spangled Banner” played.

The Tribune’s George Strickler described it on deadline as “a somewhat discordant note … in the gala day of achievement,” which badly undersold the moment.

It was in fact a political statement that has reverberated ever since.

International Olympic Committee President Avery Brundage, a Chicagoan who had no objection to Nazi salutes at the 1936 Berlin Games, was deeply offended and sought to punish Smith and Carlos for bringing domestic politics to his global stage.

When the U.S. Olympic committee balked despite its own less-than-tepid support for the two, the IOC threatened to ban the entire U.S. track contingent amid fear various protest movements could take hold across many nations.

So, an apology was issued, and Smith and Carlos were sent home.

Predictably, their protest didn’t play better among many back in the United States, where, then as now, some insist sports should never intersect with politics.

Chicago American columnist Brent Musburger wrote for the majority view, saying, “Smith and Carlos looked like a couple of black-skinned storm troopers” on the medal stand and suggesting it was poor form to embarrass those “picking up the tab for their room and board” at the Games.

“Protesting and working constructively against racism in the United States is one thing,” Musburger said, “but airing one’s dirty clothing before the entire world during a fun and games tournament was no more than a juvenile gesture by a couple of athletes who should have known better.”

Yet on ABC, which televised the Olympics back in the United States, even as Howard Cosell acknowledged “the preponderant weight of American public opinion will support the committee,” he expressed strong support for the athletes without reservation.

Cosell called the U.S.O.C. largely “a group of pompous, arrogant, medieval-minded men who regard the Games as a private social preserve for their tiny clique.”

The outspoken sportscaster rejected the framing of Olympics participation as “a privilege, not as a right earned by competition” and the notion that “the Games are sports, not politics, something separate and apart from the realities of life,” so this was a perfectly acceptable forum for a statement in his view.

The Black athlete “says his life in America is filled with injustice, that he wants equality everywhere, not just within the arena,” Cosell said. “He says that he will not be used once every four years on behalf of a group that ignores what happens to him every day of all of the years. He says he … will use his prominence earned within the arena to better his plight outside of it. … He’s aware of backlash but says he’s had it for 400 years.”

There remains considerable resistance to demonstrations in sports generally, and racial statements from athletes specifically. But Smith and Carlos have enjoyed a measure of greater mainstream recognition in recent years.

Smith and Carlos received ESPN’s Arthur Ashe Courage Award in 2008, and President Barack Obama referred to them as “legendary” when they visited the White House with 2016’s Team USA Olympians and Paralympians.

“We’re proud of them,” Obama said. “Their powerful silent protest in the 1968 games was controversial, but it woke folks up and created greater opportunity for those that followed.”

Mata Hari executed for espionage — Oct. 15, 1917

CLICK HERE to read the full story from Tue, Oct 16, 1917

“French execute Dutch dancer as ‘tank’ spy”

Dutch exotic dancer and courtesan Margaretha MacLeod was executed by firing squad for espionage in Paris on Oct. 15, 1917. She is remembered by her stage name, taken from a Malay phrase for “sun” or “eye of the dawn,” Mata Hari.

Said the Tribune: “Mme. Mata-Hari, long known in Europe as a woman of great attractiveness and with a romantic history, was, according to unofficial press dispatches, accused of conveying to the Germans the secret of the construction of the entente ‘tanks,’ this resulting in the enemy rushing work on a special gas to combat their operations.”

That was the official line, but it probably wasn’t true.

There’s evidence MacLeod, nee Zelle, was merely a convenient scapegoat to distract from mounting casualties on the western front.

She had escaped an abusive marriage and had to make a life for herself on her own.

Her love life, both before and during the war, included both German and French military officers.

As someone with neutral citizenship, she was allowed to travel freely during the war.

These factors would contribute to the undoing of a woman the Tribune described as an “adventuress.”

While MacLeod confessed to accepting money to spy on behalf of Germany during World War I, there are serious doubts she actually passed along any information of value.

The French also sought to use her as a spy, but they too came up with little in the way of useful insights via her pillow talk.

Despite falling short of her billing as one of the greatest spies of all-time, Mata Hari made an attractive target, literally and figuratively.

The mythmaking extended to her death (and beyond, including 1931’s “Mata Hari,” Greta Garbo’s biggest movie hit).

Although it is true MacLeod refused a blindfold before the firing squad, she did not, as legend had it, blow her executioners a kiss.

Pilot Chuck Yeager breaks sound barrier — Oct. 14, 1947

CLICK HERE to read the full story from Mon, Dec 22, 1947

“Reveal Rocket Plane Exceeds Speed of Sound”

Air Force Capt. Chuck Yeager, then 24, became the first human to fly faster than the speed of sound, zipping across the sky above California’s Mojave Desert in a rocket-powered aircraft on Oct. 14, 1947.

Yeager eventually received recognition for breaking the sound barrier. And his reputation and role in aeronautics history would be further burnished by “The Right Stuff,” Tom Wolfe’s 1979 history of the early days of the U.S. space program that was turned into a 1983 film and a 2020 Disney+ TV series.

Yeager’s record-breaking Bell XS-1, nicknamed “Glamorous Glennis” after his wife, is now on display at the Smithsonian Institution’s National Air and Space Museum.

But the test pilot and his Cold War breakthrough at Mach 1 initially were shrouded in military secrecy.

Folks back at Muroc Army Air Field (known today as Edwards Air Force Base) heard indications over the radio – “There’s somethin’ wrong with this ol’ machometer,” Yeager said with a faint chuckle, “it’s gone kinda screwy on me” – but that was it.

News of the Air Force rocket plane’s ability to exceed 700 mph didn’t make it into the Chicago Tribune or elsewhere until late December via an Aviation Week report, referencing a series of successful test flights in November. There was no mention of Yeager or his fellow pilots by name.

“Significantly, the magazine said, the plane encountered no turbulence at the supersonic speeds,” reported the Tribune of Dec. 22, 1947. “It has long been a theory of aviation scientists that as planes approached or reached the speed of sound, they would meet a ‘barrier’ of sound waves which would buffet the planes and make them difficult to control.”

By mid-June of the next year, however, the Air Force would honor Yeager with its Mackay Trophy for his accomplishment and assign him to field some questions at a news conference.

Yeager, the Tribune said on June 16, 1948, “laughingly declared his head would be ‘lopped off’ if he revealed the secrets of the flight.”

“If you had a gold mine, you wouldn’t tell where it is,” Yeager said in his West Virginia drawl. “A lot of people are trying to get hold of the information and we don’t want to help them.”

Yeager described how the XS-1 was carried aloft by a B-29 plane. At 7,000 feet, he climbed down into the rocket plane. The B-29 climbed to 25,000 to 35,000 feet. Systems were checked, then the XS-1 was dropped from the bomb bay doors and Yeager ignited the rockets.

“You have four selector switches – one for each rocket,” Yeager said. “You can switch them on or off as you choose. Each rocket has 1,500 pounds thrust. The fuel lasts 2½ minutes with four rockets, five minutes with two and 10 minutes with one.

“After the fuel is exhausted, the plane slows down and you look for a place to land. Your speed goes down to 300 to 400 miles an hour and you put down your (landing) gear at 300 and you make your landing pattern at 240 mph. Landing speed is 160 mph. The plane has brakes but I didn’t have to use them. The landing run is 2 to 2½ miles.”

What wasn’t publicly known until later is Yeager had broken ribs two days before his triumphant flight while out horseback riding.

Concerned he might lose his flight assignment, he had the injury treated secretly and kept it to himself as he flew into the history books.

First U.S. commercial cellphone call — Oct. 13, 1983

CLICK HERE to read the full story from Mon, Oct 10, 1983

“Cellular phones calling here”

From a Chrysler convertible at Chicago’s Soldier Field, Ameritech Mobile Communications President Bob Barnett placed the first U.S. commercial cellphone call on Oct. 13, 1983.

The call was to the grandson of telephone inventor Alexander Graham Bell in Berlin, West Germany, and it was part of a media event to herald the launch of Ameritech’s new mobile phone service.

“It’s more than a convenience; it will improve productivity for most people,” Martin Cooper, president of Chicago’s Cellular Business Systems, said of cellphone technology in the Tribune a few days earlier.

“Why do you go to the office? So you can be reached. Especially with the portable unit, instead of talking from the office, a stockbroker or a lawyer or a contractor can be reachable where the action is.”

Although there were radiophones going all the way back to the 1940s, Cooper is widely considered the inventor of the cellphone.

While Motorola’s general manager of communications systems in 1973, he made a demonstration call from a New York sidewalk to AT&T’s Joel Engel – another first.

But it took time, money and FCC approval before the cellphone concept was ready for the U.S. marketplace.

“Cellular technology improves mobile telecommunications by dividing a region, say Chicago, into small geographic areas or cells,” the Tribune explained. “Through computer switching, calls are ‘handed off’ from one cell to another as the car moves.”

When hand-held devices were first introduced, the cellphone of the day was Motorola’s DynaTAC 8000X.

It was nicknamed “the brick” for both its size and heft, weighing in at 28 ounces, more than four times heavier than a modern iPhone 11. Its retail price was even harder to carry: $3,995.

Less-portable in-car devices were cheaper.

“The automobile phone unit runs $2,500 to $2,800, with installation charges bringing the total up to $3,000,” the Tribune reported. “Phone charges for Chicago include a $50-a-month line-service charge and usage rates of 40 cents a minute during peak calling hours and 24 cents a minute during off-peak hours. Average monthly bills are expected to range from $150 to $200.

“In time, increased competition among phone manufacturers is expected to bring their costs down to about $1,500, though some say it will be closer to $800 or $900. Rates probably won’t drop substantially, however, because operators expect they will need the revenue to continue to expand the system.”

The big question in 1983 was how big an industry mobile phones would be, with the Tribune fielding estimates for where it would be in another 10 years that ranged from $1.1 billion to $10 billion.

The correct answer in 1993 proved to be roughly $10.9 billion in total service revenue in the United States, not counting sales of device or infrastructure equipment.

By 2019, that U.S. number would grow to $187 billion in service revenue, with almost 270 million smartphones in use and $77.5 billion in device sales.

“We aren’t looking for cellular to generate instantaneous and exorbitant profits,” Ed Staiano, a Motorola vice president and general manager, said. “But it’s going to be a major telecommunications business that is moving forward.”

Still, one industry analyst with modest expectations wondered aloud, “Why pay that kind of money and do in my car what I can do on a stationary telephone for far less?”

Chief Justice Roger Taney dies — Oct. 12, 1864

CLICK HERE to read the entire story from Fri, Oct 14, 1864

“The News”

When U.S. Supreme Court Chief Justice Roger B. Taney died at age 87 on Oct. 12, 1864, Election Day was only 27 days away.

President Abraham Lincoln, running for a second term against Democrat George McClellan, chose not to nominate a successor until December despite calls from some advisors to do so immediately.

This may have had less to do with letting voters have a say in the process than with Lincoln’s attempt to get those interested in succeeding Taney to rally behind his re-election in hopes of advancing their own bids.

One of those people was Salmon P. Chase, a former presidential challenger who had been part of Lincoln’s “team of rivals” cabinet as Treasury secretary.

Chase, a one-time U.S. Senator and Ohio Governor, threatened to resign his post more than once in the belief Lincoln needed his support and that of his political followers. Lincoln, however, accepted his departure in June 1864.

The potential court appointment made Chase an enthusiastic supporter of Lincoln’s re-election effort.

Lincoln won and Chase got the nod.

Critically, while Lincoln chafed at Chase, they agreed when it came to opposing slavery and supporting the war effort.

Taney, appointed the nation’s fifth Chief Justice in 1836 by President Andrew Jackson, was another matter.

Though ailing and sympathetic to the seceding Southern states, he refused to resign from the high court when the Civil War erupted and clashed with Lincoln throughout Lincoln’s first term.

Taney already had earned a dubious place in U.S. history by writing the infamous majority opinion in the 1857 Dred Scott case, ruling that Black Americans could not be citizens, regardless of whether they were free or not.

“The wall Slavery had long maintained against the advance of Liberty are breached and fallen, and the chief custodian sits lifeless by the main buttress with his well-rusted keys fast in his hands,” the Tribune said upon Taney’s death.

“To drop metaphor, for nearly 30 years the late Chief Justice, whose other relations to public and private life have been eminently useful and honorable, has never betrayed one trust imposed upon him, to guard well the institution of man-selling. He summed up the labors of a lifetime in the Dred Scott decision.”

Half a year after Taney’s passing, the Civil War would end, Lincoln would be dead and the nation would forge ahead in reckoning with its original sin.

‘Saturday Night Live’ debuts — Oct. 11, 1975

CLICK HERE to read the entire brief from Mon, Oct 13, 1975

“New fall TV season worst in decades”

Produced by and for the first generation to grow up watching television, the NBC program known today as “Saturday Night Live” made its debut from Studio 8H in New York’s 30 Rockefeller Center on Oct. 11, 1975.

The Lorne Michaels-produced show was just called “Saturday Night” at the time.

Ironically, ABC had introduced its own prime-time variety show, “Saturday Night Live with Howard Cosell,” three weeks earlier from the Ed Sullivan Theater a few blocks away.

Cosell’s first show included an appearance by comedian Billy Crystal, and each week featured a comedy troupe known as the Prime Time Players that included Bill Murray, Brian Doyle-Murray and Christopher Guest.

All four would wind up eventually in the cast of NBC’s “SNL,” where the original cast – John Belushi, Dan Aykroyd, Jane Curtin, Chevy Chase, Gilda Radner, Garrett Morris and Laraine Newman – would become collectively known as The Not Ready for Prime-Time Players. (Cosell would guest host the NBC show in April 1985.)

Gary Deeb, the Chicago Tribune’s famously unsparing TV critic, had little use for ABC’s Cosell debacle, calling it “a ratings bust and creative embarrassment.”

But in a mini-review at the bottom of a column headlined “New TV season worst in decade,” Deeb lavished praise upon the NBC replacement for “Best of Carson” reruns of “The Tonight Show.”

“‘Saturday Night,’ NBC’s spanking new late-night variety show, premiered in superb fashion Saturday with a hip 90 minutes highlighted by some devastating spoofs of TV commercials, a George Carlin monologue on religion, and a funny ‘Believe It or Not’ takeoff by Albert Brooks called ‘The Impossible Truth,'” Deeb wrote.

“The show is live from New York – and not to be confused with Howard Cosell’s dismal program earlier in the evening. ‘Saturday Night’ sports the experimental aura of ‘Laugh-In’ and the off-the-wall hilarity of the early Sid Caesar programs. It’s deliberately aimed at a young-adult audience and runs at 10:30 p.m. Saturdays, except for once-monthly preemptions by ‘Weekend,’ the NBC newsmagazine.”

The format would evolve from that first program. But features from that debut – including a cold open, monologue, newscast and commercial parodies, musical performances and sketches – would serve it well.

Now in its 46th season, despite consistent complaints nearly from the start that it’s not as good as it once was, it has become a program showcasing people who grew up watching “Saturday Night Live.”

Illinois Secretary of State Paul Powell dies — Oct. 10, 1970

CLICK HERE to read the full story from Thu, Dec 31, 1970

“Powell death setback for Dems”

Upon the death of Illinois Secretary of State Paul Powell, who suffered a fatal heart attack on Oct. 10, 1970, Chicago Mayor Richard J. Daley said his fellow Democrat “will be recognized as a major figure in our state.”

Daley couldn’t have been more right, but he might not have known exactly why – and neither would the public at first.

A half-century later, Powell and his shoe box remain a part of Illinois political lore thanks to the discovery of more than $750,000 in cash stashed in the St. Nicholas Hotel suite in Springfield where he lived while working in the capital, and another $50,000 or so in his office.

The executor of Powell’s estate, John Rendelman, chancellor of Southern Illinois University at Edwardsville, came across the money in the days after the downstate politician’s death but didn’t share details publicly until year’s end.

“Rendelman said that the money found in Powell’s apartment was in a shoe box, two leather brief cases and three steel strong boxes which were hidden behind old whisky cases and mixed among the clothing of his closet,” the Chicago Tribune reported.

“I almost fainted when I got into the clothes closet in Powell’s rooms at the St. Nick and found the money,” Rendelman said. “It was in all denominations, but mostly in $100 bills. Also, there were some $1,000 bills.”

There’s an amusing story about Rendelman sticking the money in shirt boxes and taking it down to his car, then having to return to the suite, only to find the vehicle had been towed from a no-parking zone. But that’s tangential.

The main point: Combined with the $50,000 or so Powell had in savings accounts, stock in race tracks and banks, and whatnot, his estate would within a couple of years be worth in excess of $3 million (more than $18 million adjusted for 2020).

Not bad considering Powell’s annual salary never exceeded $30,000.

Early details about Powell’s death didn’t quite jibe, either.

Powell, 68, died in a hotel in Rochester, Minn., where he was undergoing tests. But the aide who said he found the body was in fact in Springfield hauling two suitcases of papers and other items from Powell’s office.

The widower Powell had in fact been with his personal secretary and inamorata, and that may be the least scandalous part of his story.

What’s indisputable is Powell was a supreme Springfield power broker. As Secretary of State, he controlled 3,800 patronage jobs. He had been Speaker of the Illinois House from 1949-51 and 1959-63.

In death, he was eulogized in glowing terms by Republicans and Democrats alike.

But the real tributes paid Powell were green and accepted with open palms while he was very much alive.

Contractors to whom he awarded no-bid state deals wound up in prison.

And an old quote attributed to Powell lives on with the memory of his corruption and the shoe box, two leather brief cases and three steel strong boxes hidden in his closet.

“There’s only one thing worse than a defeated politician,” Powell once said, “and that’s a broke politician.”

Obama wins Nobel Peace Prize — Oct. 9, 2009

CLICK HERE to read the full story from Sat, Oct 10, 2009

“Obama and the prize”

Only nine months into his first term, with goals such as winding down the war he inherited in Iraq incomplete, President Barack Obama won the Nobel Peace Prize on Oct. 9, 2009.

“To be honest, I do not feel I deserve it,” Obama said, casting the coveted honor not so much as a reward for what had been accomplished but rather “as a call to action.”

Said Obama during a Rose Garden appearance, “This award must be shared with everyone who strives for justice and dignity.”

Obama became just the third sitting U.S. President to win the Peace Prize, although former President Jimmy Carter, who left office in 1981, was honored in 2002.

Theodore Roosevelt won the prize in 1906 for his role in negotiating peace in the Russo-Japanese War with the Treaty of Portsmouth, although he never actually went to the Portsmouth Naval Shipyard in Kittery, Maine. Roosevelt also was credited with using arbitration to settle a dispute with Mexico.

Woodrow Wilson won the Nobel Peace Prize in 1919 as the architect of the League of Nations, which was established to prevent a recurrence of the just-completed world war. The United States wound up not joining the League, however, and the organization failed to forestall World War II.

The Norwegian Nobel Committee said it was honoring Obama “for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples,” lauding him for creating “a new climate in international politics.”

According to the committee, the United States under Obama was “playing a more constructive role in meeting the great climatic challenges the world is confronting (and) democracy and human rights are to be strengthened.”

With the honor came a gold medallion and a $1.4 million prize, which Obama donated to charity.

Greg Miller, Janet Hook and Mark Silva wrote from the Tribune’s Washington bureau that the medal “does not come with a ribbon, but the award could end up being a weight around Obama’s neck.”

Geir Lundestad, who stepped down as director of the Nobel Institute after 25 years in 2014, said in an interview to promote his 2015 memoir that the honor had been meant to strengthen Obama.

But Lundestad, who sat in on committee deliberations without a vote, acknowledged criticism of the choice ensured “the committee didn’t achieve what it had hoped for” and noted Obama himself rarely mentioned the prize.

“I think the Nobel Committee couldn’t vote in our election in 2008, so they decided to vote this year,” said John Bolton, who had been U.S. ambassador to the United Nations under Obama’s predecessor, President George W. Bush, and later was part of President Donald Trump’s administration for a time. “It’s high-minded Europeans talking down to hayseed Americans, saying this is the way you ought to be.”

Thorbjorn Jagland, the committee’s chairman, defended the choice, arguing no one had done more than Obama for world peace over the previous year.

For the Obamas, it was just one more thing to be celebrated. The President said he was reminded by daughter Malia it also was the birthday of the family dog, Bo.

“It’s good to have kids to keep things in perspective,” President Obama said.

The Great Chicago Fire — Oct. 8, 1871

CLICK HERE to see the full page from Wed, Oct 11, 1871

“Fire!”

Some believed Chicago’s obituary was to be written after the great fire that began Oct. 8, 1871.

Instead, from the ashes, today’s modern city was born of steel and brick and the iron will of its citizens, undaunted by the 3.3 square miles of hellscape left to smolder when the flames finally died out on Oct. 10.

Nearly 300 people were killed, more than 90,000 left homeless and roughly 17,450 buildings were destroyed, including the Chicago Tribune’s supposedly fireproof headquarters.

Yet, when the Tribune could at last resume publishing on Oct. 11 with a two-page edition, it urged the surviving 300,000 Chicagoans forward.

“In the midst of a calamity without parallel in the world’s history, looking upon the ashes of 30 years’ accumulations, the people of this once beautiful city have resolved that CHICAGO SHALL RISE AGAIN,” it said.

There had been a drought throughout the Midwest that year. Chicago, with its wooden buildings and lax regulation, never really stood a chance.

Just a day earlier, four square blocks along Canal Street had burned.

On the same day the Great Chicago Fire ignited, another blaze began 250 miles to the north that dwarfed it in scope and fatalities, if not notoriety.

That Wisconsin inferno swept over approximately 1,875 square miles, damaged 17 towns and claimed roughly 1,200 lives, including 800 or so in Peshtigo, which got the worst of it.

It’s not certain what started Chicago’s fire, which began behind the home of Patrick and Catherine O’Leary on DeKoven Street, near 12th and Halsted Streets. The legend of Mrs. O’Leary and a cow is, it seems, a myth.

“During Sunday night, Monday, and Tuesday, this city has been swept by a conflagration which has no parallel in the annals of history, for the quantity of property destroyed, and the utter and almost irremediable ruin which it wrought,” the Tribune’s lead story said.

But even as the report spoke of the destruction of “public improvements that it has taken years of patient labor to build up, and which has set back for years the progress of the city, diminished her population, and crushed her resources,” it voiced defiant hope.

“Chicago will not succumb,” the newspaper said. “Late as it is in the season, general as the ruin is, the spirit of her citizens has not given way, and before the smoke has cleared away, and the ruins are cold, they are beginning to plan for the future.

“Though so many have been deprived of homes and sustenance, aid in money and provisions is flowing in from all quarters, and much of the present distress will be alleviated before another day has gone by.”

In fact, within a week, some 6,000 temporary buildings had sprung up.

Tribune Editor Joseph Medill, elected Chicago’s mayor the following month, pushed to outlaw wooden buildings in the business district and championed other fire safety measures.

Among the structures that survived the inferno, famously, was the old Water Tower on North Michigan Avenue.

Another left intact was the O’Learys’ home, though it no longer stands. On its site today sits the Chicago Fire Academy.

Fox News Channel launches — Oct. 7, 1996

CLICK HERE to read the full story from Mon, Oct 7, 1996

“Cable’s lament: 57 channels … and it’s just not enough”

Available in just 17 million of the 63 million U.S. homes with cable at the time, Fox News Channel made its debut on Oct. 7, 1996, advertising its approach as fair and balanced.

Twenty-four years later, Fox News Channel’s coverage of President Donald Trump’s Sept. 29 debate with Joseph Biden drew an estimated audience of more than 17 million viewers, tops among all media outlets.

Media mogul Rupert Murdoch had enlisted Roger Ailes, a long-time Republican political adviser who had been running CNBC, to launch the new network to counter what they saw as the liberal bias of traditional news outlets.

Sean Hannity, Neil Cavuto, Bill O’Reilly, Catherine Crier, Alan Colmes and Mike Schneider were part of the original night-time lineup.

Murdoch said he wanted his new channel, based in what only months earlier had been a Sam Goody music store, to clearly label analysis and opinion, so it would not be confused with news coverage.

Yet, as Fox News carved out and catered to its own audience segment, those lines grew more difficult to discern while its influence – particularly with regular guest and devoted viewer Trump in the White House – is difficult to overestimate.

It is one of cable’s great success stories amid a splintered media universe, with opinion programs from Hannity and Tucker Carlson its top draws.

Initially, Fox News parent News Corp. had to pay some cable systems to carry the channel, rather than the reverse. But with cable systems carrying far fewer channels then than they do today, even that often wasn’t enough to do the trick.

The nation’s second-largest cable operator, Time Warner, which shared a corporate parent with CNN, opted to pick up MSNBC, which had launched in June, instead of Fox News Channel.

News Corp. said it was “preparing a retaliatory” strike against Time Warner, the Tribune’s Tim Jones reported.

“News Corp.’s announcement was preceded by remarks from Ted Turner, soon to be Time Warner’s vice chairman, that compared News Corp. Chairman Rupert Murdoch to Adolf Hitler,” Jones wrote, noting CNN founder Turner later apologized.

Michael Jordan retires for the first time — Oct. 6, 1993

CLICK HERE to read the full story from Thu, Oct 7, 1993

“So long, Michael; it’s been great”

The news that he was walking away from the NBA had broken the night before, in the middle of the White Sox playoff opener vs. the visiting Blue Jays. The Bulls’ Michael Jordan made it official at a packed news conference at the team’s Deerfield training center on Oct. 6, 1993.

It was, it turned out, the first of three retirements as an NBA player for Jordan, a global basketball and marketing phenomenon who is now principal owner of the Charlotte Hornets.

Jordan would walk away again in 1998 and, after a stint with the Washington Wizards, for good in 2003.

“I have reached the pinnacle of my career,” said Jordan, who had won three of his five Most Valuable Player awards and three of his eventual six NBA titles with the Bulls.

“I just feel I don’t have anything else to prove. It’s going to be tough, but I’m very happy about my decision and I’m very happy to make that choice.”

Conspiracy theorists long have tried to tie Jordan’s NBA sabbatical playing minor-league baseball in the White Sox organization from 1993-94 to his penchant for gambling, but the more plausible party line, from the league leadership on down, is that Jordan was burned out and reeling from his father’s murder that summer.

Bulls Chairman Jerry Reinsdorf, who also controlled the White Sox, made it easy for Jordan.

Reinsdorf continued to pay MJ his basketball salary through his return to the basketball team late in the ’94-’95 season on the premise Jordan had always been underpaid while enriching team investors.

“He’s living the American dream,” Reinsdorf said at the news conference. “The American dream is to reach a point in your life where you don’t have to do anything you don’t want to do and (can do) everything that you do want to do.”

That some would be sad to see Jordan go was an understatement.

“I know that kids are going to be disappointed, but I hope they learn that Michael Jordan was once a basketball player but now he’s a human being, a man,” Jordan said. “He has a family and other things he has to achieve.”

Asked how the three-time world champion Bulls would fare without their superstar leader, future executive John Paxson, who at eight years had been Jordan’s teammate in Chicago longer than anyone at that point, gently corrected a reporter.

“Without Michael, it’s not a world championship team,” Paxson said. “Michael has defined our team.”

Jordan, meanwhile, left the door open for his eventual return.

“The word ‘retired’ means you can do anything you want from this day on,” he said. “So, if I desire to come back and play again, maybe that’s what I want to do.”

Truman delivers first televised presidential address — Oct. 5, 1947

CLICK HERE to read the full story from Mon, Oct 6, 1947

“Truman for ‘no meat’ days”

Few remember what Harry Truman told Americans on Oct. 5, 1947. Part of what makes it notable all these years later is it was the first time a U.S. President gave an address on TV from the White House.

The speech was a call for the nation to reduce its consumption of meat, eggs, poultry and grains to help feed hungry, war-scarred Europe, where President Truman said the situation was “grim and forbidding as winter approaches.”

The goal was to provide a much-needed lifeline for starving nations. The Marshall Plan, which would fuel western Europe’s economic recovery, was still six months away.

“Despite the vigorous efforts of the European people, their crops have suffered so badly from droughts, floods and cold that the tragedy of hunger is a stark reality,” Truman said.

“The nations of western Europe will soon be scraping the bottom of the food barrel. They cannot get through the coming winter and spring without help – generous help – from the United States and from other countries which have food to spare.”

Vastly more Americans tuned in to Truman’s address on radio than on TV, as video was a nascent medium and few people had sets. But a first is a first.

Only a week earlier, the first televised World Series had begun, ultimately alternating over seven games between NBC, CBS and DuMont with Joe DiMaggio’s Yankees downing Jackie Robinson’s Dodgers.

Henceforth, every Truman address would be televised, establishing a template now taken for granted.

Truman became the first presidential candidate to run a paid political ad in 1948. His 1949 inauguration was the first televised. And addressing a 1951 conference in San Francisco, Truman was the star of the nation’s first live coast-to-coast TV broadcast.

The first live presidential press conference wasn’t until 1961, five days after John F. Kennedy’s inauguration.

Kennedy’s successor, President Lyndon Johnson, delivered the first live televised night-time State of the Union address on Jan. 4, 1965.

It should be pointed out Truman was not the first U.S. President to appear on television.

Predecessor Franklin Roosevelt went before a TV camera for experimental station W2XBS (now WNBC) at the 1939 New York World’s Fair. But the only receivers picking up the signal were at the fair in Queens and Radio City in Manhattan.

Soviets launch Sputnik — Oct. 4, 1957

CLICK HERE to read the full story from Sat, Oct 5, 1957

“Reds fire ‘moon’ into sky”

The Space Age began on Oct. 4, 1957, with the Soviet Union launching a man-made “moon” into low Earth orbit. The satellite’s name was Sputnik.

A metal ball with four antennas, it inspired awe. It also evoked Cold War fears that the Soviets had leapfrogged the United States technologically and would militarize space.

Sputnik’s development had spun out of Soviet efforts to design and build intercontinental missiles.

A global wake-up call, its launch helped jump-start the U.S. space program, which eventually overtook the Soviets and landed an American on the moon within a dozen years.

The Soviet satellite also spurred a push to better educate U.S. schoolchildren in math and physics in hopes they someday would contribute to the country’s space efforts.

It’s difficult to imagine modern life without satellites, given their role in areas such as communications, navigation and security, not to mention the technological advances that came from the space program.

“The artificial moon is about 23 inches in diameter,” a story on the Tribune’s front page said. “It was launched by a carrier rocket which was believed to have given it the necessary spin to fly around the world in 1 hour and 35 minutes.

“The satellite was describing an elliptical trajectory around the earth at an estimated height of 560 miles, the Russian news agency Tass said. Despite its height, it can be seen with binoculars in the rising and setting sun, the agency reported.”

In truth, what was visible was a stage of the rocket that launched Sputnik into orbit. The satellite itself was too small to be discerned so easily.

“The ‘moon’ is equipped with a radio transmitter,” the story said. “Radio listening posts in the United States and England said they had picked up signals believed to be from the satellite.”

A group working for RCA was believed to be the first in the United States to have detected the beep, beep, beep of its signal.

RCA-owned NBC interrupted TV and radio programming to play a recording.

Sputnik’s signal fell silent 22 days later when its battery died. The satellite remained in orbit until burning up on re-entry in early January 1958.

A second Soviet satellite would be launched on Nov. 3, 1957. It carried a mutt named Laika, which sadly died of overheating on its fourth trip around the Earth.

The United States, which actually had begun a satellite program in 1954, didn’t successfully put one into orbit until Explorer 1 on Jan. 31, 1958.

The first draft of Chicago history at your fingertips at newspapers.com.

O.J. Simpson found not guilty — Oct. 3, 1995

CLICK HERE to read the full story from Wed, Oct 4, 1995

“Simpson freed, nation torn”

Maybe some of the shock has worn off in the 25 years since the O.J. Simpson murder verdicts were announced in Judge Lance Ito’s Los Angeles courtroom on Oct. 3, 1995.

At the time it was inconceivable to a good number of people that a jury could acquit the former football star, corporate pitchman, actor and sportscaster of the 1994 murders of his ex-wife Nicole Brown Simpson and Ronald Goldman.

Many also found it stunning in 1995 that two segments of the general population could see the same trial on TV day after day for eight months and come away with completely different perceptions of Simpson.

“Tens of millions of Americans dropped what they were doing and gathered around TV sets in their homes, workplaces and other public sites as a jury of 10 women and two men delivered their sealed verdict,” the Tribune’s Vincent J. Schodolski and Howard Witt reported.

“The long-awaited decision took on the character of a national event, one of the most-watched TV broadcasts ever, rivaling the aftermath of the 1963 Kennedy assassination, the first man walking on the moon in 1969 and the Persian Gulf War in 1991.

“But if watching the Simpson spectacle brought Americans together, the jury’s verdict coming after stunningly swift deliberations … fueled intense debate across a country long captivated by the marathon morality play.”

Simpson would be found liable in a civil suit pressed by the families of the victims. He would go to prison for a later crime involving an attempt to steal back memorabilia.

But on this day he was released from custody, deemed not guilty by the legal system if not a majority in the court of public opinion.

Simpson’s multi-million dollar legal defense team did the job it was paid to do, shining a harsh light on racism within the Los Angeles Police Department and casting enough doubt to overshadow a mountain of evidence pointing to Simpson’s guilt.

It did not help the prosecution that, when it asked Simpson to put on a glove recovered from the crime scene, he was unable to do so.

As defense attorney Johnnie Cochran famously told jurors, “If it doesn’t fit, you must acquit.”

Even before the televised trial began on Jan. 24, the case had become a national obsession, a true-crime tragedy and patchwork quilt of hot-button issues and magnets for attention.

The jury reached its decisions after fewer than four hours of deliberation. But the announcement of those decisions was held until the next morning to ensure authorities could keep the peace in case it sparked unrest.

As events played out, it only sparked unease.

There was elation and despair in disparate scenes across the country, tears of joy and sadness, exposing a national divide.

At the Southern California grave of Simpson’s murdered ex-wife, abused women converged to commiserate and support each other. For them, the celebrity former athlete’s acquittal was salt in a never-quite-healed wound.

For some Black Americans who viewed the justice system as too often tilted against them, it was a role reversal to be celebrated.

President Wilson suffers debilitating stroke — Oct. 2, 1919

CLICK HERE to read the full story from Fri, Oct 3, 1919

“President is ‘very sick'”

President Woodrow Wilson suffered a debilitating stroke on Oct. 2, 1919, from which he never fully recovered.

The remainder of his presidency was shadowed by efforts to hide the stroke’s effects and preserve his presidential power.

“The indications are that the president will be incapacitated for attention to public business for some time,” the Tribune reported. “This is the more unfortunate because there is a multitude of matters of the utmost importance requiring the action of the executive.”

Although it wasn’t widely known, Wilson’s power was largely being wielded by close aides and his wife, Edith, whom he had married less than four years before being stricken.

But the constitutional crisis, or at least the potential for it, was obvious from the start.

Wilson wasn’t dead and yet he (and his associates) remained unwilling to have him step down. Vice President Thomas Marshall refused to step up and assume the reins of power.

The 25th Amendment to the U.S. Constitution, which detailed a more specific means to transfer power if a president dies or is disabled, would not be ratified until 1967, 48 years later.

“Whether the inability of the president to discharge the duties of his office shall be so prolonged as to require Vice President Marshall to take the helm remains to be seen,” the Tribune reported. “The constitution provides that during the inability of the president to perform his official functions the vice president shall act, but there is no provision for determining when such inability exists. Presumably Mr. Marshall would not assume the duties of the executive office unless Mr. Wilson should request him to act.

“This question arose during the period between the shooting of President Garfield and his death, but it was never settled.”

Even before the major stroke that left him paralyzed on his left side, compromised the sight in his right eye and seemed to affect his thinking and emotions, Wilson had been ailing through much of September.

Earlier in 1919, he caught the flu during the pandemic that had begun the year before. While traveling out West in September in a bid to rally support for the League of Nations, he took ill; some experts believe he suffered what now would be considered a mini-stroke.

The Oct. 2 episode left Wilson a far less effective leader, unable to combat a Republican-led Senate resistant to joining the League of Nations or to control some of his own cabinet members’ worst impulses.

Despite remaining incapacitated, Wilson sought the Democratic nomination for a third term in 1920. The party deemed him too frail and opted for Gov. James M. Cox of Ohio, who lost to Sen. Warren G. Harding of Ohio. Wilson would die less than three years after leaving office, on Feb. 3, 1924.

Babe Ruth called shot in World Series vs. Cubs — Oct. 1, 1932

CLICK HERE to read the entire story from Sun, Oct 2, 1932

“Home runs by Ruth, Gehrig beat Cubs, 7-5”

On Oct. 1, 1932, Babe Ruth pounded a tie-breaking home run to center in the fifth inning of the third game of the World Series against the Cubs at Wrigley Field that lives on in baseball lore as his “called shot.”

There is no definitive record of whether Ruth actually called the homer.

The Tribune had multiple reporters there. Westbrook Pegler, the future conservative columnist, suggested on the front page of the sports section that the Babe did just that.

Others were not so clear, although Edward Burns wrote on the Tribune’s front page: “That tie-breaking second homer of the Babe’s probably will go down as one of the classics of baseball razzing.”

What is indisputable is it was Ruth’s second home run of the game. The Cubs and Ruth had been yelling at each other, and Ruth made hand gestures as he dug in against Cubs pitcher Charlie Root.

Pegler described a running dispute between Ruth and Guy Bush, a pitcher in the Cubs’ dugout who had lost Game 1 and would start Game 4, in which the Yankees completed their series sweep.

“The Babe laughed derisively and gestured at him ‘Wait mugg: I’m going to hit one out of the yard,’” Pegler wrote. “Root threw a strike past him and he held up a finger to Bush, whose ears flapped excitedly as he renewed his insults. Another strike passed him and Bush crawled almost out of the hole to extend his remarks.

“The Babe held two fingers this time. Root wasted two balls and the Babe put up two fingers on his other hand. Then, with a warning gesture of his hand to Bush, he sent him the signal for the customers to see.

“‘Now,’ it said, ‘this is the one. Look!’ And that one went riding in the longest home run ever hit in the park.”

Part of the confusion over the “called shot” is whether Ruth gestured at Bush, at Root or to the outfield. Pegler had reported that at one point, while Ruth was playing right field, a fan hit him with a lemon.

Burns said Ruth paused his jawing only to make contact.

“Ruth resumed his oratory the minute he threw down his bat,” Burns wrote. “He bellowed every foot of the way around the bases, accompanying derisive roarings with wild and eloquent gesticulations. George Herman Ruth always enjoys a homer under any circumstances, but it is doubtful if he ever rocked one that gave him the satisfaction that accompanied that second one yesterday.”

Among those in attendance was New York Gov. Franklin D. Roosevelt, then running for President and accompanied by Chicago Mayor Anton Cermak (who would be fatally wounded in a failed assassination attempt on Roosevelt a few months later in Miami).

Before the game, Roosevelt spoke with the two managers, and the Tribune’s Harvey Woodruff reported him waving toward right field while talking with the Yankees’ Joe McCarthy.

Roosevelt, Woodruff wrote, “no doubt was telling Joe where Ruth and Gehrig should place their drives. Ruth did it twice and Gehrig did it twice.”

Archive — This Day in History, September