The footprints are still there, the striped tread of Neil Armstrong’s boots, caked into dust. There’s no atmosphere on the moon, no wind and no water. Footprints don’t blow away and they don’t wash away and there’s no one up there to trample them. Superfast micrometeorites, miniature particles traveling at 33,000 miles per hour, are bombarding the surface of the moon all the time, but they’re so infinitesimal that they erode things only at the more or less unobservable rate of 0.04 inches every million years. So unless those footprints are hit by a meteor and blasted into a crater, they’ll last for tens of millions of years.
This summer marks half a century since Armstrong first walked on the moon, though cosmologically, that was a mere snap of the fingers ago. “Man on the moon!” cried Walter Cronkite on CBS television news, gasping, while the world watched, rapt. Kids away at summer camp were marched from their tents deep in the woods to mess halls to plop down in front of a little screen, while camp counselors tinkered with rabbit-ear antennas. “That’s one small step for man,” Armstrong said, immortally, as he stepped off the ladder of the Lunar Module on July 20, 1969, “one giant leap for mankind.” And then Armstrong pressed his gray-and-white boot into the dust, and left that first trace.
But what really lasts from that moment? What was the mission for? And what did it leave behind, here on Earth? Fifty years later, floods made more frequent by the changing of the climate have begun to wash away the Kennedy Space Center in Florida, from which Apollo 11 was launched (NASA has been shipping in sand to try to shore up devastated dunes), and hurricanes worsened by the rising of the seas threaten the site of Apollo 11’s mission control, the Johnson Space Center in Texas. Houston, we have a problem.
Much of the beauty, the wonder and the earthshaking awe of the expedition to the moon is best remembered in photographs taken by American astronauts on custom-made Swedish cameras called Hasselblads, first used on Mercury-Atlas 8, an Earth-orbiting mission, in 1962. As the photographer and curator Deborah Ireland explains in HASSELBLAD AND THE MOON LANDING (Ammonite, $14.95), the three astronauts on Apollo 11, Neil Armstrong, Buzz Aldrin and Michael Collins, had to share two Hasselblads. “Oh, golly, let me have that camera back,” Aldrin said as they neared the moon. The iconic photograph of the single footprint shows the mark of Aldrin’s boot, not Armstrong’s, and it was Aldrin who took the picture of Armstrong right after he planted the American flag, near the base they named Tranquility. Collins remained on the Command/Service Module, orbiting. “What are you doing, Mike? What are you taking pictures of?” Armstrong asked Collins on the journey back to Earth, as they stared back at the face of the moon. “Oh, I don’t know,” Collins answered. “Wasting film, I guess.” It was not wasted. The photographs remain as astounding as ever.
Still, before July of 1969, a lot of critics thought the whole program was a waste. At no point before Apollo 11 actually landed on the moon did a majority of the American public support the mission, as the retired NASA historian Roger Launius reports in APOLLO’S LEGACY: Perspectives on the Moon Landing (Smithsonian, $27.95). It cost $25.4 billion ($180 billion in today’s money), and, in the 1960s, was the line item on which the United States government spent the most money, aside from Vietnam. “No bucks, no Buck Rogers,” NASA officials got in the habit of telling Congress. Notwithstanding the undisputed ingenuity of its scientists and engineers and the undaunted courage of its astronauts, critics still called it a “moondoggle.”
But after the mind-blowing, Tang-selling, moon-boot trendsetting triumph of the landing, both the general indifference and the specific skepticism were forgotten. They’re forgotten in some of these new books, too, most of which take the form of one kind of boosterism or another.
“Man has always gone where he has been able to go,” Collins told a joint session of Congress, in September 1969, and those are the lines with which James Donovan chooses to end SHOOT FOR THE MOON: The Space Race and the Extraordinary Voyage of Apollo 11 (Little, Brown, $30). Donovan’s earlier books are swaggering accounts of the Alamo, “The Blood of Heroes,” and of Custer’s last stand, “A Terrible Glory,” in which men conquer, even in defeat, and that’s the view he takes of Apollo, too, a vantage that allows no room for, say, women. “If you think going to the moon is hard, try staying at home,” said Barbara Cernan, the wife of Gene Cernan, a member of the crew of Apollo 10 and commander of Apollo 17. You can read about her in Lily Koppel’s 2013 book “The Astronauts’ Wives Club,” but you won’t find her in Donovan’s account. Nor will you find a word about the record-breaking pilot Jerrie Cobb, who in 1961 became the first of 13 women to qualify to become an astronaut. NASA refused to allow them to fly, as Martha Ackmann explained in her 2003 book “The Mercury 13.” Cobb fought back. “We seek, only, a place in our nation’s space future without discrimination,” Cobb said before a House investigation, in 1962. In 1998, when she was 67, she said, “I would give my life to fly in space, I really would.” She died this spring, at the age of 88. Her footprints are not on the moon.
The moon landing is a matter of public memory, which is another way of saying that it’s contested history. In 1971, Collins became the director of the Smithsonian’s National Air and Space Museum, overseeing the opening of its landmark building on the National Mall in 1976, and he provides the introduction to APOLLO TO THE MOON: A History in 50 Objects (National Geographic, $35), by the curator Teasel Muir-Harmony. Included is an artifact borrowed from the Smithsonian Museum of African-American History, a tin can plastered with a photograph of the Reverends Martin Luther King Jr. and Ralph Abernathy, King’s successor as the head of the Southern Christian Leadership Conference. The S.C.L.C. used that sort of can to collect donations at rallies, like the one Abernathy led at the Kennedy Space Center on July 15, 1969, the day before the launch of Apollo 11. Abernathy carried a sign that read: “$12 a day to feed an astronaut. We could feed a starving child for $8.” Muir-Harmony quotes Abernathy as saying, “On the eve of man’s noblest venture, I am profoundly moved by the nation’s achievements in space,” but weirdly leaves out the meaningful part of that speech, which you can see Abernathy deliver in the opening scenes of an ambitious and affecting three-part PBS/American Experience documentary, “Chasing the Moon,” scheduled to be released in July, along with an accompanying book, CHASING THE MOON: The People, the Politics, and the Promise That Launched America Into the Space Age (Ballantine, $32), by the film’s director, Robert Stone, and one of its producers, Alan Andres. “We may go on from this day to Mars and to Jupiter and even to the heavens beyond,” Abernathy said, “but as long as racism, poverty and hunger and war prevail on the Earth we as a civilized nation have failed.” By this measure, the last 50 years is a history of defeat heaped upon defeat.
In AMERICAN MOONSHOT: John F. Kennedy and the Great Space Race (Harper/HarperCollins, $35), the best new study of the American mission to space, rich in research and revelation, the historian Douglas Brinkley carefully considers this and other attacks launched by civil rights activists, like the National Urban League’s Whitney Young. “It will cost $35 billion to put two men on the moon,” Young complained. “It would take $10 billion to lift every poor person in this country above the official poverty standard this year. Something is wrong somewhere.” But Brinkley concludes that, as a purely economic matter, the mission was worth it, given the gains that extended to matters of public health. He writes, “The technology that America reaped from the federal investment in space hardware (satellite reconnaissance, biomedical equipment, lightweight materials, water-purification systems, improved computing systems and a global search-and-rescue system) has earned its worth multiple times over.”
In ONE GIANT LEAP: The Impossible Mission That Flew Us to the Moon (Simon & Schuster, $29.99) Charles Fishman suggests that criticisms of the program were forgotten because in the summer of 1969, almost overnight, Apollo came to stand for the very opposite of Vietnam: one the nation at its best, the other the nation at its worst. Fishman isn’t especially interested in this point; instead, most of his book is a long argument that the mission was worth it, for reasons many readers will wonder at. “The race to the moon didn’t usher in the Space Age,” he insists, “it ushered in the Digital Age.” He points, specifically, to the development of integrated circuits and real-time computing. But there’s something else, something bigger, that Fishman wants the shot at the moon to get credit for: “In 1961, when the race to the moon kicked off, there was no sense in popular culture of ‘technology’ as a force in the everyday lives of consumers as we think of it now.” His argument goes like this: Apollo didn’t bring us to Mars, at least not yet, but, hey, it brought you Alexa. A counterargument goes something like this: My country went to the moon and all I got was this lousy surveillance state.
The race for the moon began as a race to the White House. On Oct. 4, 1957, the Soviet Union launched into orbit the first satellite, Sputnik. The American public began to panic, and Democrats decided to put that panic to political use. “People will soon imagine some Russian sitting in Sputnik with a pair of binoculars and reading their mail over their shoulders,” the Democratic strategist George Reedy wrote to Lyndon Baines Johnson on Oct. 17. “The issue is one which, if properly handled, would blast the Republicans out of the water, unify the Democratic Party and elect you as president.” Even before Sputnik, the Massachusetts senator John F. Kennedy had been attacking President Eisenhower, accusing him of failing to devote adequate funds to the missile program and of allowing the United States to fall behind the Soviet Union in the arms race, creating what Kennedy dubbed a “missile gap.” In November 1957, Johnson, as Senate majority leader, opened Senate hearings into why the United States was lagging and warned Americans, “Soon, the Russians will be dropping bombs on us from space like kids dropping rocks onto cars from freeway overpasses.”
The environmentalist and writer Rachel Carson observed the forging of this “space-age universe” with dismay. Men had been fantasizing about the “conquest of space” since before H. G. Wells, as Carson well knew. “In the pre-Sputnik days, it was easy to dismiss so much as science-fiction fantasies,” Carson wrote to the woman she loved, Dorothy Freeman, in February 1958. “Now the most far-fetched schemes seem entirely possible of achievement. And man seems actually likely to take into his hands — ill prepared as he is psychologically — many of the functions of ‘God.’” Johnson’s hearings in part persuaded Carson to write a book that she for a long time called “Man Against the Earth,” but that eventually appeared as “Silent Spring.”
That same year, 1958, in “The Human Condition,” Hannah Arendt described the launching of Sputnik as an event in human history “second in importance to no other, not even to the splitting of the atom.” Like Carson, Arendt didn’t celebrate this development, described in both the American and Soviet press as marking the first “step toward escape from men’s imprisonment on Earth.” An escape? “Nobody in the history of mankind has ever conceived of the Earth as a prison for men’s bodies or shown such eagerness to go literally from here to the moon,” Arendt wrote, regretting the dawn of an age in which the Earth had come to be understood as a prison and space yet another place to be conquered.
The year Carson set out to write “Silent Spring” and Arendt published “The Human Condition,” Eisenhower established NASA, taking the important precaution of establishing it as a civilian agency. In a farewell address delivered on Jan. 17, 1961, three days before Kennedy’s inauguration, Eisenhower deplored the arms race and indicted what he dubbed the “military-industrial complex.” On April 12, 1961, while Kennedy was still settling into the Oval Office, the Soviets sent a man into space, Yuri Gagarin. Five days later, Kennedy faced the first crisis of his presidency: the botched invasion of the Bay of Pigs, itself a failure of intelligence and of technology. A reporter asked him, at a news conference, “Mr. President, don’t you think we should try to get to the moon before the Russians, if we can?” On May 5, Alan Shepard became the first American to fly in space, on a mission known as Freedom 7, a flight that, as Brinkley points out, came just one day after the first Freedom Riders left Washington on a Greyhound bus, headed for Louisiana, to challenge Jim Crow. On May 25, in a message to Congress, Kennedy edged toward a resolve: “This nation should commit itself to achieving the goal, before the decade is out, of landing a man on the moon and returning him safely to Earth.”
Kennedy had campaigned on behalf of a New Frontier, and he meant to deliver. “Why, some say, the moon?” he asked in a stirring speech delivered at Rice University, in Houston, on Sept. 12, 1962. “We set sail on this new sea because there is new knowledge to be gained, and new rights to be won, and they must be won and used for the progress of all people.”
But if the program was launched in a partisan contest, it was also, of course, a front in the Cold War. In his 1985 book, “…The Heavens and the Earth: A Political History of the Space Age,” the University of Pennsylvania historian Walter McDougall argued that the turn from Eisenhower to Kennedy, in the aftermath of Sputnik, altered the very nature of the Cold War. “Where it had previously been a military and political struggle in which the United States need only lend aid and comfort to its allies in the front lines,” McDougall wrote, “the Cold War now became total, a competition for the loyalty and trust of all peoples fought out in all arenas of social achievement, in which science textbooks and racial harmony were as much tools of foreign policy as missiles and spies.”
For McDougall, a conservative, the race to the moon, led by liberals, was a step on “The Road to Serfdom.” “To train X thousands of engineers, to reach the moon by 19xx, to place X numbers of missiles in silos regardless of Soviet deployment, to plan for economic growth of X percent without unemployment or inflation, these were not the assignments of a free society but the dictates of a command economy.” People drawn to that argument have often been drawn, too, to the study of Wernher von Braun, the former Nazi and ex-SS officer who led the American rocket program. During World War II, von Braun had overseen the production of the German V-2 rocket (the “V” was for “Vergeltung,” or “vengeance”) at a facility built as part of the Dora-Mittelbau concentration camp, where the rockets were assembled by prisoners. Becoming an American citizen does not seem to have diminished von Braun’s zeal for unhindered technological development. “We felt no moral scruples about the possible future abuse of our brainchild,” he told The New Yorker in 1951. “Someone else would have done the job if I hadn’t.” (His amorality is the subject of a song recorded in 1965 by Tom Lehrer: “Don’t say that he’s hypocritical / Say rather that he’s apolitical / ‘Once the rockets are up, who cares where they come down? / That’s not my department,’ says Wernher von Braun.”)
The seemingly unintended consequences of developing technologies that would take men to the moon were not top of mind in the Kennedy administration, mainly because a lot of those consequences were intended: Rockets can carry weapons, too, and everything learned on the moon mission had military applications, even if NASA was a civilian agency. If not worried about the legacy of conquest and the future of war, the Kennedy and Johnson administrations were concerned, very concerned, with the civil rights movement. Edward R. Murrow, who had left CBS to take a post with the Kennedy administration, urged the president to include a black astronaut on the moon mission: “I see no reason why our efforts in outer space should reflect with such fidelity the discrimination that exists on this minor planet.” Edward Dwight was subsequently recruited and became the first black Air Force pilot to be trained at the Aerospace Research Pilot School at Edwards Air Force Base. But, as is recounted in “Chasing the Moon,” he was all but forced out by his commander, Chuck Yeager, who instructed the other trainees not to speak to him. Meanwhile, as Brinkley demonstrates, the White House used the space program to try to aid economic development in the Jim Crow South, especially after Johnson became president. “The White House was working hard to change the Old South,” Brinkley writes, “partly by using NASA to bring high-tech jobs and futuristic thinking to backward regions slow to abandon violent and self-defeating prejudice.”
To the extent that the space program was a liberal, big-government project, it did not survive the conservative turn in American politics. “Many critical problems here on this planet make high-priority demands on our attention and our resources,” Richard Nixon said in 1970 when, as president, he rejected NASA’s recommendation to build a station on the moon to be used as a base for the exploration of Mars. To the extent that the space program was just another battle in the Cold War, it survives only in the Buck Rogers-era imagination of Donald Trump, with his proposed Space Force. And to the extent that the space program involved a repudiation of humanity itself, the legacy of Apollo is Alexa, and it haunts us all.
One small step for man, one giant leap for mankind. The lasting legacy of the voyage to the moon lies in the wonder of discovery, the joy of knowledge, not the gee-whizzery of machinery but the wisdom of beauty and the power of humility. A single photograph, the photograph of Earth taken from space by William Anders, on Apollo 8, in 1968, served as an icon for the entire environmental movement. People who’ve seen the Earth from space, not in a photograph but in real life, pretty much all report the same thing. “You spend even a little time contemplating the Earth from orbit and the most deeply ingrained nationalisms begin to erode,” Carl Sagan once wrote of the phenomenon. “They seem the squabbles of mites on a plum.” This experience, this feeling of transcendence, is so universal, among the tiny handful of people who have ever felt it, that scientists have a term for it. It’s called the Overview Effect. You get a sense of the whole. Rivers look like blood. “The Earth is like a vibrant living thing,” the Chinese astronaut Yang Liwei thought, when he saw it. It took Alan Shepard by surprise. “If somebody’d said before the flight, ‘Are you going to get carried away looking at the Earth from the moon?’ I would have said, ‘No, no way.’ But yet when I first looked back at the Earth, standing on the moon, I cried.” The Soviet cosmonaut Yuri Artyukhin put it this way: “It isn’t important in which sea or lake you observe a slick of pollution or in the forests of which country a fire breaks out, or on which continent a hurricane arises. You are standing guard over the whole of our Earth.”
That’s beautiful. But here’s the hitch. It’s been 50 years. The waters are rising. The Earth needs guarding, and not only by people who’ve seen it from space. Saving the planet requires not racing to the moon again, or to Mars, but to the White House and up the steps of the Capitol, putting one foot in front of the other.
Jill Lepore is a professor of history at Harvard, a staff writer at The New Yorker and the author of many books including, recently, “These Truths” and “This America.”