She couldn't have arrived at a more exciting time. Everyone knew that there was more to predicting the weather than 'red sky at night, shepherd's delight' and 'mackerel skies and mare's tails make big ships have small sails'. They also knew that weather was chaotic, in the sense of 'exquisitely sensitive to initial conditions', and hard to predict further than a few hours ahead. In Ireland, of course, predicting the weather is easy: it's either raining or about to rain, and these states oscillate with a fair degree of regularity. The MIT boffins realised that they would need computers to do the calculations, because the equations were so complex that, without computers, you'd only get the answer after the weather front had passed through. It was the same story with cracking the Enigma code in WWII: no point in getting the news after the enemy strike had happened.
There was a sense in the early days of computing that the sexy stuff was in the hardware, and the boys corralled that off for themselves. As we saw when remembering the career of Grace Hopper, the software was seen as, well, soft and sort of peripheral to soldering and replacing the blown vacuum-tubes. So there were openings for young women on the software side, and Margaret landed one of those. There was no job description called software engineer, because that role was still being crafted by the ingenuity and ways-of-working of pioneers like Margaret herself.
When Margaret moved from weather work to war work, trying to predict when and whence the Russians were going to launch the first missile strike of Armageddon, she was assigned a newbie-hazing programming project. Far from being intimidated by the gnarly code she was tasked to make work, she untangled it and made it do her bidding. This may remind you of Aoife McLysaght solving an intractable coding issue a generation or two later. Having once started working for the military-industrial complex, it was a short step across MIT to the Draper Lab, which was writing code for the Apollo space project. By this time Margaret had a four-year-old daughter in tow; anti-social hours were the norm and, when necessary, the child came to work. One time the girl pressed a button which said Don't Press This Button and erased part of the code for a flight-simulation program. Margaret used this to push for more redundancy and safety nets in the code, to make it more robust and, well, safer. NASA was trying to fit a massive task into the limited but bulky hardware that was the best money could buy. Or at least the most expensive money could buy, which isn't always the same thing. NASA insisted there was no room for a lot of back-ups and fail-safes against every absurd eventuality that the software people might dream up. Whatever about female pre-schoolers, no astronaut was stupid enough to press the Don't Press This Button button. Except that on the Apollo 8 mission one of the astronauts did indeed press the button and unravelled part of the program on which his return to Earth depended. They had to rebuild the code by wireless.
After this the NASA management paid a little more attention to the unmanly cries for safety first, or at least safety second, or maybe just a bit of safety if-and-when it didn't affect the ambitious time-line of the Apollo project. But in a way, the NASA brass was right: it was an inherently risky business that required a lot of contingent events to fall out as predicted. If you agonised about every unlikely accident, the rockets would never leave the ground. Neil Armstrong, trained as an engineer, reckoned he had a 50:50 chance of getting to the Moon and back.
So when the big day, 20th July 1969, arrived and Eagle approached the surface of the Moon with Armstrong and Aldrin aboard, a bunch of lights flashed and buzzers went off to report numbered error messages. Neither man knew what the errors meant. Mission Control had seconds to decide whether to abort the landing. But Margaret Hamilton had written the code, and she knew that it was a prioritisation overload: the on-board computer was getting itself all tied up trying to execute more tasks than it had memory to accommodate, and some of the mission-critical tasks were being put on hold while trivial tasks were being addressed. By killing the currently inessential tasks, enough memory was recovered to get the bus down to a safe landing in the Sea of Tranquility. Margaret Hamilton deserves a large part of the credit for including priority codes for each of the tasks in the program.
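For the terminally curious, here is a minimal sketch of the idea, in Python rather than the Apollo computer's own assembly, with invented task names and a made-up capacity; it is only an illustration of priority-driven task shedding in the spirit described above, not the actual flight software. Every task carries a priority, and when the machine is overloaded the least critical jobs get dropped so the mission-critical ones keep running.

```python
import heapq

# Illustrative sketch only: priority-based task shedding under overload.
# Lower priority number = more critical. CAPACITY is invented for the demo.
CAPACITY = 3

class Scheduler:
    def __init__(self, capacity=CAPACITY):
        self.capacity = capacity
        self.queue = []  # min-heap of (priority, task name)

    def submit(self, priority, name):
        heapq.heappush(self.queue, (priority, name))
        # Overload: shed the least critical tasks until we fit again.
        while len(self.queue) > self.capacity:
            dropped = max(self.queue)       # largest priority number = least critical
            self.queue.remove(dropped)
            heapq.heapify(self.queue)
            print(f"shedding low-priority task: {dropped[1]}")

    def run(self):
        while self.queue:
            priority, name = heapq.heappop(self.queue)
            print(f"running ({priority}) {name}")

sched = Scheduler()
sched.submit(1, "descent guidance")        # mission-critical
sched.submit(2, "attitude control")
sched.submit(5, "rendezvous radar data")   # the trivial chatter causing the overload
sched.submit(1, "landing radar updates")
sched.run()
```

Run it and the low-priority radar chatter is the job that gets thrown overboard, while the landing-related tasks carry on; that is the shape of the decision the real code made in those few seconds.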
Like Dennis 'dmr' Ritchie, Hamilton was there at the beginning of the digital age, and her Ways of Seeing, her obsessions and worldview, cast a long shadow across how geeks should behave. In November this year Barack Obama gave a Presidential Medal of Freedom to Margaret Hamilton, and one also, posthumously, to Grace Hopper. Tributes on SciShow with some details of how Hamilton's interventions and planning mattered. Or in writing: a brief bio of Hamilton, NASA & Apollo on Wired. For example, her experiences with curious children, foolish astronauts and the design of complex programs that might encompass thousands of lines of code led her to develop USL, a Universal Systems Language. By looking at the structure of the software and formally mapping the dependencies of its several parts, programmers were forced to act defensively. They were effectively forbidden from making glaring errors; errors which Hamilton had calculated to be the most likely to occur and/or the most damaging. She must have studied Utilitarianism in her Philosophy classes at college.
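As a loose illustration of that 'forbidden from making glaring errors' idea (this is an invented toy in the same defensive spirit, not USL itself and not Hamilton's notation), imagine wiring modules together only through declared interfaces, so a mismatched connection is rejected before anything ever runs:

```python
# Toy sketch: interface errors caught at wiring time, not in flight.
class Module:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = set(inputs)    # named signals this module consumes
        self.outputs = set(outputs)  # named signals this module produces

def connect(producer, consumer):
    """Allow a connection only if the producer supplies everything the consumer needs."""
    missing = consumer.inputs - producer.outputs
    if missing:
        raise ValueError(f"{consumer.name} is missing inputs from "
                         f"{producer.name}: {sorted(missing)}")
    return (producer.name, consumer.name)

radar = Module("landing_radar", inputs=[], outputs=["altitude", "velocity"])
guidance = Module("descent_guidance",
                  inputs=["altitude", "velocity", "fuel"],
                  outputs=["throttle"])

# This complains immediately, because 'fuel' isn't provided by the radar:
# the glaring error is caught on the ground, long before the code flies.
try:
    connect(radar, guidance)
except ValueError as err:
    print(err)
```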
Other Women of Science
Would that be the "twelve oh-two alarm" code that appears in "From The Earth To The Moon" and "Go!" by Public Service Broadcasting?
That's the boy, according to my text sources, and also a 1201 error. Does that imply that there were 1199+ other error codes which Hamilton's team had popped in to cover the bases? I guess not necessarily: 1202 might be the 2nd error of type 12. That's why they pay astronauts and test-pilots the big bucks: they remain cool under pressure. Me, when the stress rises I make a little [as in wee] puddle and pass out in it.