Two weeks before Apollo 11 left Earth for the moon, its crew was still training in Texas. Neil Armstrong and Buzz Aldrin were strapped into a simulator rehearsing and re-rehearsing the only section of their mission that had not been flown before – the final descent to the lunar surface.
Armstrong did it all by the book, but Aldrin was different. “He would be practising simulations of before and during the landing, but he cared more about the landing,” says Margaret Hamilton, who has a close personal interest in the story – as close as anyone alive.
The landing is what Aldrin wanted to practise because his life depended on it. He’d discovered that if he kept a particular radar switch open during the descent, the simulation moved a little faster through its steps towards touch-down. “But then,” Hamilton says, “what happened is, if he forgot to flip the switch back, alarms would come on.”
On one simulation when Aldrin forgot about the switch and the alarms came on, the flight controller aborted the landing.
Later that month the mission launched for real. Four days in, a quarter of a million miles from Earth and minutes before landing on the moon, Aldrin did it again. He flipped the radar switch open and forgot about it, and survived – because Hamilton, a self-taught computer programmer who once taught French to support her husband while he studied for a law degree, had taken care of it.
Out in the vacuum of space the alarms came on but no abort was called. The Eagle landed. The men returned to Earth. An epic contest with Soviet Russia was won, and the narrative of victory settled comfortably on the shoulders of the astronauts.
Hamilton wouldn’t dream of re-writing that narrative, but half a century on it needs adjusting. As people relive the almighty chutzpah of the moon landings, they might pause to consider that no Apollo astronaut would have got more than a few seconds beyond the launchpad without Hamilton’s work. Every moment of every mission depended on a stack of computer code as high as she was tall, and it was her code.
There are plenty of others – men and women – who had a role in it, but Hamilton’s was unique. Hers was the guiding intelligence. She was one of the first coders hired by Nasa for Apollo, and by 1969, at the age of 33, she was leading a team 400-strong.
Not that conventional leadership was her style. About two years before the Apollo 11 launch a call came through to Mission Control in Houston at four in the morning. It was from the Draper instrumentation lab in Cambridge, Massachusetts, where the software people worked (and a few of them still do).
“We’re calling them to say we’re ready to deliver this release,” Hamilton remembers. “We’re telling them we made this deadline. We’ve all had wine. We’re feeling good, and we told them what we wanted to name it because we were in that kind of a mood, and we wanted to name it Gorilla, and they wouldn’t let us.”
Hamilton is 83 now and a twist of nervous energy. Uber or Lyft, I ask, thinking a binary question might go over well with a coder. “Neither. I walk everywhere.” And to relax? “I never leave work.”
She didn’t always get her way with the Nasa higher-ups, but she did most of the time. More than anyone she internalised the idea that when lives were at stake there could be no errors in the software. More than that, it had to be a safety net. She embedded in it thousands of lines of code for “error detection and recovery in real time”. Meaning: if the astronauts screwed up, the systems had to save them. Which they did.
“She was our goddess of software,” says Hugh Blair-Smith, who devised the languages most of the Apollo code was written in. She was also, as President Obama noted a few years ago, a working mum in the 1960s.
By 1967, Hamilton was amicably divorced with a young daughter, Lauren. The story goes that when she had to work at nights and weekends, Lauren would come too.
One evening Hamilton booked the hardware simulator. “Lauren wanted to play astronaut, so she went over to it and started playing, thinking she was doing what I did.”
The system crashed. Lauren had inadvertently selected the Apollo launch code, programme P01. “And then she started playing around again and was in mid-course to the Moon when she hit some other keys which selected the pre-launch programme, P00. That wiped out all the navigation data in mid-course.”
The data had been wiped because the on-board computer, a cubic foot of hardware with less memory than a Fitbit, couldn’t run both programmes at once.
That got Hamilton thinking. “I went back to the higher-ups at Nasa and MIT [the Massachusetts Institute of Technology], and I said what if the astronauts do the same thing? ‘Never will happen,’ they said. So I said, what if it does? ‘No it won’t,’ they said. ‘They’re too well trained’.
“Well, wouldn’t you know. The very next flight, Apollo 8, [Commander Jim] Lovell made ‘the Lauren error’.” He selected the wrong programme half-way to the moon, scrubbed his nav data and handed Hamilton an arduous nine-hour assignment working out how to re-upload it. After that Nasa evinced a new respect for her error detection and recovery code, and she had carte blanche to make it standard across all Apollo software.
It may not be coincidence that talking to Hamilton can feel like interfacing with someone much cleverer than you who’s put herself in error detection and recovery mode. At first the conversation skips around as if between hyperlinked paragraphs, and I worry that the many highlights of her career will blur into one if I fail to keep up. Then she slows down and we find a rhythm.
The lunar module’s limited computing power meant not only that it could only run one programme at a time, but also that the programmes had to be ranked by priority and able to restart in milliseconds. If the computer crashed on final descent, the module carrying Armstrong and Aldrin would crash too.
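The scheme Hamilton describes – programmes ranked by priority, with the ability to restart in milliseconds and shed less important work – can be loosely sketched in modern terms. This is only an illustrative sketch in Python, not the Apollo Guidance Computer’s actual executive (which was hand-written in assembly for a machine with a few kilobytes of memory); all the names here are invented.

```python
import heapq

class Executive:
    """Toy priority executive: lower number = higher priority."""

    def __init__(self, capacity):
        self.capacity = capacity      # the most jobs we can keep queued
        self.queue = []               # heap of (priority, name)

    def schedule(self, priority, name):
        heapq.heappush(self.queue, (priority, name))
        # Overload: shed the lowest-priority work rather than crash.
        while len(self.queue) > self.capacity:
            self.queue.remove(max(self.queue))
            heapq.heapify(self.queue)

    def restart(self):
        # On a restart, keep only essential (priority 0) jobs,
        # rather than blindly repeating everything that was queued.
        self.queue = [(p, n) for (p, n) in self.queue if p == 0]
        heapq.heapify(self.queue)

    def run_next(self):
        # Always run the highest-priority job first.
        return heapq.heappop(self.queue)[1] if self.queue else None
```

The point of the sketch is the shape of the idea, not the detail: when demand exceeds capacity, non-essential tasks are dropped and the essential ones keep running – which is what the Eagle’s computer did during the landing alarms.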
Hamilton was determined to avoid that at all costs, and to find a way of warning the astronauts if trouble was brewing in the data. She draws a parallel with modern airliners and black boxes: “What if the pilot knew more about it before the plane crashed? What if he knew about things that weren’t making sense rather than waiting until some box is saying, ‘Oh, yeah, this is what happened.’ That’s a little late, right?
“So that was my main worry,” Hamilton continues. “The astronauts should know if an emergency’s going to happen. I wanted to bring up the problem with the hardware guys and the mission guys but the first thing they’re going to say is, ‘well how are you going to solve it?’ So I had to solve it first, otherwise I wouldn’t be able to do what I wanted to do. I held a meeting with the hardware guys first and I told them the problem and how I would solve it. These were higher-ups. For somebody my age and my position at the time, to hold a meeting with them you had to be right.”
Her solution to the early warning challenge became famous. She called it the priority display. If necessary, priority displays would simply replace normal ones on the lunar module’s screens, giving the crew new options based on new priorities.
But of course it wasn’t that simple. To be a solution, a priority display would have to be able to cut in at any time, and the normal display – on a device known as the DSKY (display and keyboard, pronounced diskey) – wasn’t even on all the time.
In fact, the hardware guys told Hamilton, the DSKY was off more than it was on. “So I said to them why isn’t it on all the time? Why turn it off? And they said it’s never been done before. So I said, well why is that? ‘Well we don’t know. Maybe the power won’t last for the whole mission.’ So they said let us think about it for a couple of days. We’ll come back and let you know. And they came back and they said we’re going to keep it on all the time so you can interface to the astronauts. So the software can.”
To make this new interface worthwhile, every possible scenario had to be gamed out ahead of time, and “safe spaces” had to be identified in that giant software stack – spaces from which the computer could restart without repeating old mistakes or making new ones.
Most Nasa engineers thought this was all conceived in an excess of caution. Their hardware was proven and their astronauts too smart to make mistakes. The story of how wrong this assessment turned out to be bears retelling, briefly, as a fable of hubris and humility.
By July 20, 1969, Hamilton’s genius as a software engineer was well established – she is credited with coining the term “software engineering” and founding the discipline. She had demanded the same respect paid by Nasa to hardware engineers, and on the whole she received it.
Sexism came in the form of banter that would now be actionable but that she accepted as part of the culture. (“It was humour, but sick humour; Mad Men stuff.”) It came from Nasa men and military men, but not from fellow programmers. One of them, Dan Lickly, was by now her second husband. And her software was nine tenths of the way to the moon.
It contained, Hamilton admits, dozens of then-secret messages including four lines from Henry VI and an ignition file titled ‘Burn Baby, Burn’. She didn’t write them all but she did write some of them. These and hundreds of flight programmes had been hand-woven by seamstresses with copper wire into core rope memory modules. Within the modules were thousands of tiny magnetised donut-shaped rings. A wire going through a ring was a one. A wire going round one was a zero.
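The core-rope scheme amounts to reading off one bit per ring: the sense wire either passes through a core or around it. As a purely illustrative sketch (the real rope modules packed the bits into sixteen-bit words and were woven, and read, entirely in hardware), the encoding looks like this:

```python
# Each core-rope "word" is a weave pattern: for every ring, the wire
# either passes THROUGH it (read as 1) or AROUND it (read as 0).
def read_word(weave):
    """Turn a weave pattern like 'through around through' into an integer."""
    bits = ["1" if step == "through" else "0" for step in weave.split()]
    return int("".join(bits), 2)

read_word("through around through")  # binary 101 -> 5
```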
From this laborious binary notation came the calculations of speed, acceleration and trajectory that carried Armstrong and Aldrin to the moon and back, even though, soon after their lunar module separated from its mother ship, things started to go wrong.
As the lunar module, the Eagle, started its descent, Hamilton was in Draper’s SCAMA room (for Switching, Conference and Monitoring Arrangement), with an open line to a colleague at Mission Control. Six minutes into the descent, with the Eagle at 30,000 feet above the moon, an alarm sounded. It was the 1202 programme alarm. Neil Armstrong, with an edge in his voice, had to ask what it meant.
Hamilton knew. At least, she knew it was one of a set of “not supposed to happen” alarms. It meant the computer was jettisoning non-essential tasks like a balloonist throwing ballast from her basket.
She knew, but only two people at Mission Control did. They were Steve Bales, the flight controller who had aborted Aldrin’s simulation because of a similar alarm days earlier, and his subordinate, Jack Garman. They were 26 and 25 years old respectively.
For 30 seconds it was up to Bales whether or not to abort the landing. Neither Hamilton nor her colleagues in the SCAMA could help much. “In Cambridge we looked at each other,” Don Eyles, a colleague of Hamilton’s, wrote later. From the moment of Armstrong’s request for information “events moved very quickly, too fast for us to have any input”.
Five-and-a-half minutes before landing, Garman said quietly that the 1202 could be ignored if it didn’t light up too often. He told Bales, and Bales told his superior that as far as he was concerned Eagle was go for landing.
At Draper, Eyles wrote, “we were barely breathing”.
Looking back across the decades, Hamilton says: “I’d like to lighten it up a little bit but it was sheer panic. I didn’t know whether to be worried about the astronauts or the software.”
Four more programme alarms lit up the Eagle over the next three minutes. The root causes were not identified until later. There was Aldrin’s open radar switch, bothering the computer because of a technical issue with its power supply. There was a discrepancy between altitude measurements by the Eagle’s landing radar and its internal guidance system, and there were a few feet per second of excess speed from a breath of oxygen left in the airlock between the Eagle and its mothership as they undocked. All three glitches were overloading the computer with requests for non-essential calculations.
In the end, as the world held its breath, all five alarms could be safely over-ridden, and this was only possible because of the swift reboot and prioritisation talents buried deep in the Eagle’s brain by the unassuming programmer who called herself “one of the guys” but was actually their boss. The mission could survive the panic because Hamilton had panicked years before.
There was one other failsafe that Hamilton likes to remember. Her “priority display” innovation had created a knock-on risk that astronaut and computer would slip out of synch just when it mattered most. As the alarms went off and priority displays replaced normal ones, the actual switchover to new programmes behind the screens was happening “a step slower” than it would today.
Hamilton had thought long and hard about this. It meant that if Aldrin, say, hit a button on the priority display too quickly, he might still get a “normal” response. Her solution: when you see a priority display, first count to five.
“It became part of their checklist,” she says, with what Kubrick might have called a hint of pride. “Apollo 11 was the first time it ever came up.”
And did Aldrin count to five?
She says Eagle’s safe landing seconds later was “maybe the most exciting moment of my life”. There was “an interesting feeling of what the software was doing” and at the same time a realisation of “the things we’d gone through to get there, the whole background. Everything came rushing through my mind.”
Her mantra is: “What if?”
“What if we’re just dreaming we’re here?” she’d ask her father, a philosopher and poet, on long drives through Michigan’s upper peninsula, where she grew up in the early 1950s.
What if she hadn’t discovered a rare talent for abstract mathematics while an undergraduate in Indiana? And jumped at the chance to work for Edward Lorenz on chaos theory at MIT, and then at Draper when Nasa published job ads saying they needed computer people to help put men on the moon?
What if she hadn’t taken charge of the Apollo source code and built into it so many safety features that others said weren’t needed? There is every chance the programme would have failed.
Hamilton tells a comical tale of being hired, aged 15, to give tours of a Michigan copper mine where “native” copper lay around to be picked up in chunks, with no need for processing. On her first tour she had two customers, but soon there were thousands, so she hired more guides and opened a gift shop. “So now I was running this organisation at 15, and when they asked me [at Nasa] what I’d done I told them all about it.”
She was offered two jobs on the spot, and suggests it was because of the copper mine story. I remind her that by the time she applied she could think in multiple dimensions and operate in at least five software languages. “Yes,” she says. That too.
Leaving Nasa as the Apollo programme wound down, Hamilton started a software business in her own name. It has an unflashy website with an early 1990s look about it, but click on the client list and there’s hardly a blue chip aerospace, defence or tech firm not included. The US army, navy and air force are on it as well. It turns out there’s a market for software that doesn’t fail.
Three years ago Barack Obama presented Hamilton with the Presidential Medal of Freedom. Hanks, De Niro, Redford, Springsteen and Diana Ross got them on the same day. So did Bill and Melinda Gates, but it was the goddess of software, radiating substance even though she could barely see over the podium, who somehow stole the show. “Our astronauts didn’t have much time,” Obama said. “But thankfully, they had Margaret Hamilton”.
Since then American science has taken a beating from an administration in Washington that doesn’t seem to understand the first thing about the scientific method. Hamilton doesn’t like it. “I’m not happy with people who are against saving the universe,” she says. Speaking of which, she reckons we’ll get to Mars eventually, although there’s probably still a lot of back-of-the-envelope worrying to do.
“I’m sure they haven’t done the what ifs yet.”
Margaret Hamilton photographed by Brad DeCecco for Tortoise. Archive photographs courtesy Draper. All other photographs Getty Images