SLGer’s Research Narrative

Summary: This is an annotated narrative about the research career of a female computer scientist. Building on Sputnik-spawned educational opportunities, the narrative winds through academic and research institutions with intermittent entrepreneurial stints into continuing retiree activism. Her research spans software quality assurance, software engineering curricula, technology assessment, computer security, web search, and universal design. Rather than a personal autobiography, this narrative threads her rationale for choosing research problems amid international players and evolving fields of computing. Links to terminology, publications, and historical events were accessed July 2016.

Career Startup, Thanks to Sputnik (1961-1967).

The 1957 Sputnik crisis precipitated unprecedented educational opportunities in science-related fields through President Eisenhower’s 1958 National Defense Education Act. Benefiting from newly energized science teachers in the small town of Utica, Ohio, young Susan L. Gerhart met an IBM 650 at a 1961 summer institute for high school students.

The wide open computing field took hold early in the Mathematics program at Ohio Wesleyan University in Delaware, Ohio, where she administered, and wrote a compiler for, an IBM 1620. During summers at OWU, she tutored “new math” for high school teachers and later received a graduate fellowship for study in Communication Sciences at the University of Michigan, all funded by the National Science Foundation. When computer science grew into a recognized field of study, she entered the PhD program at Carnegie Institute of Technology.

Her experience as a first-generation college graduate and technology opportunist is described in a Computer Educator Oral History. Her YouTube video “Sputnik Launched My Career” questioned how the US gained from lagging in the early space era and whether we would otherwise now have the Internet.

Plucked from the corn fields of central Ohio into professional computer science, she launched a quarter-century research journey that skimmed across applications of computing that trace back to “The Sputnik Moment”.

Early Career Research Theme: Does the program work right? Prove it!! (1968-1981)

The Challenge of “Correctness”.

Turing’s ideas in the 1950s showed how programs written for his evolving machine designs could assure acceptable answers for the problems they purport to solve. Executing and assessing results from test runs has always been practicable, yet ad hoc, expensive, and prone to missing errors.

Could there be scientific principles to extrapolate reliably from information other than test runs? Yes! Mathematical induction offered one route, exemplified by

“the sum 1 + 2 + 3 + … + n = n(n+1)/2”,

a canonical example taught in discrete mathematics.
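As a reminder of the pattern this canonical example teaches, the induction runs in two steps:

```latex
\text{Base case } (n=1):\quad \sum_{k=1}^{1} k \;=\; 1 \;=\; \frac{1(1+1)}{2}.
\qquad
\text{Inductive step: assuming } \sum_{k=1}^{n} k = \frac{n(n+1)}{2},
\quad
\sum_{k=1}^{n+1} k \;=\; \frac{n(n+1)}{2} + (n+1) \;=\; \frac{(n+1)(n+2)}{2}.
```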

Mathematical reasoning about programs grew from “loop invariants” (Robert Floyd) and “axiomatic semantics” (C.A.R. Hoare), where assertions acted as comments supporting induction proofs. If only the proof process could be automated, programs might even be derived during proof construction. Bring on automated reasoning!

But, during or after program construction, proving correctness was hard! Very hard! Too hard?

Research Rule 1: Reduce proof effort through Language Choices (~1970).

Gerhart’s PhD thesis “Verification of APL Programs” drew on the then-popular array-based programming language APL. Static analysis assured conformance of array operators while the rich mathematical theory of arrays supported element-by-element proof arguments.

Her dissertation influences were thesis supervisor Donald Loveland’s mathematical logic background plus programming maverick Alan Perlis’ passion for teaching APL. Former adviser Perlis also germinated fundamental questions about testing, e.g. could a single test datum determine a program’s correctness?

Her thesis approach demonstrated a 26-operator “one liner” Hamming encoder. Automatic conformance proofs eliminated certain errors common in other languages, focusing tests on problem semantics. APL, albeit supported by IBM, never gained wide acceptance. Beyond its limitation to array-oriented applications, APL had a practical typography requirement met only by special APL Selectric type balls.

Ironically, her thesis is most remembered for its bumper-sticker quotation, Perlis’ aphorism: “One person’s constant is another person’s variable”.

The historical trail from this research can be found in repositories: the Microsoft Academic researcher data bank, the DBLP research timeline, and Google Scholar.

Research Rule 2: Prove simply, transform as needed (~1973).

Continuing higher-level reasoning, she began exploring correctness-preserving transformations that applied optimizations to easily proved and understood schemas. Demonstrations of the transformation approach include backtracking algorithms (with Larry Yelowitz) and, later, list copying algorithms (with Stanley Lee and Willem de Roever).

Knuth’s landmark “The Art of Computer Programming” enabled correctness theory to be expressed in “Knowledge About Programs: A Model and Case Study”. The later technique of “software design patterns” subsumed the goals of this work.

Two summer jobs then broadened Gerhart’s horizons beyond academic computer science.

Research switch: could testing ever assure correctness? (~1974)

A summer job at compiler company SofTech returned to the broader sense of correctness, asking whether testing could be strengthened to provide more systematic assurance or integrated with proving. Drawing upon the terms “reliability” and “validity” from psychological testing, a paper (with John Goodenough), “Toward a Theory of Test Data Selection”, proposed an idealistic “Fundamental Theorem” to evaluate testing methods. This twisted paradigm of selecting unbiased and equivalent test sets received an IEEE Transactions on Software Engineering 1976 Best Paper award.
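Roughly, in the paper’s vocabulary (paraphrased here from memory, not quoted): a criterion is reliable if every test set it selects delivers the same verdict, and valid if an incorrect program is exposed by at least one selected test set. The Fundamental Theorem then reads approximately:

```latex
% C = test selection criterion, T = test set, OK(T) = program succeeds on T
% RELIABLE(C): all test sets satisfying C yield the same success/failure verdict
% VALID(C):    if the program is incorrect, some test set satisfying C fails
\text{RELIABLE}(C) \;\wedge\; \text{VALID}(C) \;\wedge\; C(T) \;\wedge\; OK(T)
\;\Longrightarrow\; \text{the program is correct.}
```

The idealism lies in establishing reliability and validity for realistic programs, which is what made the theorem a yardstick rather than a recipe.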

Paradigms collided! Were testing and proving complementary or competitive? Could either world view assure economical production of quality software?

Research Rule 3: Don’t Trust Processes or Publications (~1975).

Programming methodology became a core area for Gerhart’s research and teaching at Duke University. Another summer stint at ICASE, NASA Langley Research Center, led to the discovery that programming methods were not a panacea. That seminal paper on test data selection had revealed numerous errors in a proof-driven program offered by an eminent European researcher. The Naur text formatter program became a benchmark for specification, testing, and proving methods well into the 21st century.

The contentious issues of structured programming and the credibility of correctness proving led her to co-author, with Larry Yelowitz, “Observations of Fallibility in Modern Programming Methodologies”, dubbed “Goofs in Proofs”. Published specifications and proving techniques were observed to miss crucial properties, failures that also implicated peer reviewing.

The questioning and rationalization of errors in published papers contributed to the subsequent “sociology of proving” pioneered by Edinburgh researcher Donald MacKenzie. Reasonable doubts about the effectiveness of proofs persisted, from the obvious limit of controlling a proven program’s execution environment, characterized as assumptions to be discharged by testing, inspection, static analysis, or other proofs. Cultures clashed; even a gadfly philosopher later chimed in on his way to becoming a modern conspiracy theorist.

Gerhart had distinguished herself as a bi-tribal researcher across proving and testing with a heretical spirit and an uncanny knack for seeing flaws in both artifacts and processes.

Research experimentation: “Build it and they will….” (1977-1981).

In 1977 Gerhart joined the Program Verification project funded by ARPA at the USC Information Sciences Institute. Goals shifted from verification to readable specifications that sometimes served as executable models as well as requirements for implementations and oracles for testing and analysis.

The project produced an Interlisp-based verification system named AFFIRM (after Triple Crown-winning race horse Affirmed). AFFIRM’s user interface enabled productive proving and subsequent understanding of results achieved by the mechanical theorem prover and its human proof operators.

Benchmark models of security kernels, communication protocols, and a military message system were demonstrated.

The AFFIRM project also worked with the Arpanet development group under the late Jon Postel to systematize the expression and certification of the controls of the not-yet-public Internet, reported in “Specification and Verification of Communication Protocols in AFFIRM” (with Carl Sunshine, David Thompson, Rod Erickson, and Daniel Schwabe). The project also analyzed “Finding a Design Error in a Distributed Program” (with Stanley Lee and Roddy Erickson) and later showed how AFFIRM worked on the benchmark Kemmerer simple library system.

With publication creds in both testing and proving, Gerhart prognosticated about the verification, specification, and transformation fields in “Program Verification in the 1980s: Problems and Opportunities” and “Program Validation” in the Newcastle Seminars on Reliable Software.

Further developed by originator David Musser at General Electric, AFFIRM’s source and reference artifacts are deposited in the Computing History Software Preservation repository. Doctor Gerhart co-authored several reference manuals, proof examples, and specification libraries. Project members included: David Musser, John Guttag, David Wile, Susan Gerhart, Raymond Bates, David Thompson, Roddy Erickson, Lisa Moses (secretary), and students (Daniel Schwabe, Debra Baker, David Taylor, Stanley Lee, and Jeannette Wing).

Although AFFIRM and other so-called ‘program verifiers’ reached usability, the required mathematical attitudes clashed with most industrial practice in the USA, progressing further in Europe and Canada, and retaining support primarily from national security organizations.

Mid-Career Twists and Turns

Getting in on the ground floor of software engineering education (1982-1985).

The maturing software engineering industry’s need for a systematic educational framework motivated core memory pioneer Doctor An Wang to establish a Master of Software Engineering program. Doctor Gerhart joined the Wang Institute of Graduate Studies, where she coordinated curriculum development and courses in programming methodology and software engineering, which complemented management, computer architecture, and formal methods. This curriculum later evolved into the Carnegie Mellon Software Engineering Institute program.

Two of Doctor Gerhart’s MSE students went on to receive doctorates in computational logic at Edinburgh University: Randy Pollack, senior researcher on European projects, and Glenn Bruns, senior researcher at Bell Labs, now at Cal State Monterey Bay.

The 1980s computing industry entered the mini-computer era of changing costs, emerging networks, and expanding IT applications. These factors led to the demise of the Wang Institute, which had flourished at the height of the “Massachusetts Miracle”.

Doctor Gerhart summarized her experience in “Skills versus Knowledge in Software Engineering: A Wang Institute Retrospective”, a companion article to faculty member Mark Ardis’ evolutionary curriculum analysis.

Industry turmoil, here come the Japanese.

The 1980s Japanese Fifth Generation project injected further international competitive turmoil as it advanced an alternative parallel processing paradigm. Its novel logic programming model and chosen language, Prolog, attracted world-wide attention. Although Japan’s own economic collapse and the inherent difficulty of elaborating a technological revolution killed the project, the logic programming paradigm lived on.

While at Wang Institute, Doctor Gerhart explored Prolog as a software engineering tool. Her interests had shifted from logic in automated reasoning, to programming in logic, to applying logic programming to software engineering problems, notably test data generation.

Industrialization the Consortium Way (1986-1991).

“Upstream” Design Research and Experimentation.

Austin-based MCC (Microelectronics and Computer Technology Corporation) was a research consortium of US companies and government agencies staffed by stakeholder assignees and hired researchers in semiconductor, artificial intelligence, human-computer interface, database, architecture, software technology, and other specialties. These technical areas complemented an International Liaison Office scanning European and Asian technology developments.

The Software Technology Program at MCC, led by operating system pioneer Laszlo (Les) Belady, offered Design Methodology exploration to reduce costs and accelerate capture of system requirements. Software engineering problems were approached in the context of “upstream” (pre-implementation) design methodology.

Doctor Gerhart executed a series of experiments with research tools and methods approaching usability and dissemination. A blend of questions and experiments on theory, methods, tools, and maturity guided technology assessments. MCC STP hosted an early US installation of the Israeli-developed STATEMATE product based on the robust Statecharts methodology, demonstrated in “STATEMATE and Cruise Control: A Case Study” (with Sharon Smith).

A survey of hypertext by Jeffrey Conklin followed a similar assessment model, well before the WWW bastardized the “hypertext” concept.

This experimentation shared ideas emerging from generalized design theory, notably “Wicked Problems” as formulated by Berkeley design theorist Horst Rittel. Rittel’s design theory IBIS (Issue-Based Information Systems) supported rationale capture during product development as well as analysis of social media conversations. Taking the work one level higher resulted in Gerhart’s formulation of “Requirements for environments for analysts”.

The MCC Formal Methods Transition Study (1990-1991).

She then returned to the (now called) “Formal Methods” field with increased emphasis on “technology transfer”, sometimes called “technology diffusion”, an objective shared with the Carnegie Mellon Software Engineering Institute (SEI) and the Software Productivity Consortium.

Called a “Transition Study”, the year-long project was supported by thirteen companies and government agencies: Motorola, Rockwell, NCR/Bell Labs, Kodak, Hughes, Nortel, Bellcore, Sematech, NSA, NASA JSC, MITRE,…

This industrial research model sought to demonstrate and assist evaluation and transfer of maturing methods and tools affecting world-wide software industry competitiveness. MCC’s International Liaison Office representative Ted Ralston injected perspectives on leading European formal methods companies. Deliverables included technical reports, workshop presentations, experimental demonstrations, and videotapes from project members: Susan Gerhart, Ted Ralston, Kevin Greene, David Russinoff, Damir Jamsek, and Mark Bouler.

A summary of the MCC Formal Methods Transition Study later appeared in an IFIP technology transfer context, and a pragmatic prospectus in Ed Yourdon’s “American Programmer” newsletter.

Toward The Ultimate Goal: Safe Software Systems (1992-1993).

Leaving MCC after the Software Technology Program collapsed, Gerhart joined a consulting team that produced a book-length report, “An International Study of Applications of Formal Methods”, which led to a suite of papers informing educators, researchers, and practitioners about the progress of and needs for the field. The report’s empirical assessment model rated success of use and penetration of acceptance, probed through extensive interviews.

The applications were socially important, notably TCAS (aircraft collision avoidance), the Paris Metro signaling system, a Canadian reactor, NIST smart card security system,….

Widely disseminated in 1994/1995, report chapters gathered citations as lessons in research methodology. Co-authors were Dan Craigen and Ted Ralston with combined funding from NIST (National Institute of Standards and Technology), and NRL (Naval Research Laboratory) in the US and AECB (Atomic Energy Control Board) in Canada.

These studies built on the MCC Formal Methods Transition Study as well as two international workshops, FM 89 Formal Methods for Trustworthy Systems and the FM 91 Workshop on Formal Methods, and the ORA Canada formal methods bibliography and previous assessments.

Gerhart had abstracted technical ideas into rigorous assessments of implementations of those ideas, then further into a methodology for transferring those ideas through consortia research, and, finally, into empirical assessments of the effectiveness of such transfers.

As often happens, behind the scenes, discussions continue for years until an opportunity arises for funding to mobilize communities of researchers. Closing in on the Year 2000, research leaders knew that large scale software systems were not only expensive but also dangerous, as reported since 1986 in Risks to the Public from Computing. Conversations about the need for “High Confidence Systems”, “Dependable Software”, “Certifiably Safe Systems”, and related terms motivated staff at NIST, MITRE, NRL, NASA, DARPA, NSF, and others to keep talking until the leader and funding coalesced under Cyber-Physical Systems at the National Science Foundation, led by Helen Gill.

Web search before Googlearchy (1994-1999).

After periods of management at the National Science Foundation and a NASA Houston technology transfer institute, Doctor Gerhart moved into the field of Internet search with a tool, Twurl, to coordinate searches among the (then) many existing search engines. Billed as “context-driven web search and browsing”, its discrete mathematics and linguistic query extensions empowered search experts to manipulate searches as objects.

This tool later supported 2004 experiments asking “Do Search Engines Suppress Controversy?”. Comparing then-competing search engines suggested performing critical searches with more than one engine. Moreover, searches often miss controversial topics or push them below the usual threshold of user interest. This suggested a model of the web distinguishing an “Organizational” layer governed by business links from an “Analytical” layer of pages with fewer links but more detail. Using synonyms of words that suggest controversy, such as “debate” or “evidence”, together with more detailed terminology drives searches through the promotional layer into critique and data pages. Try the “Controversy Discovery Engine” interface to experiment with more analytic searching.

While search tools such as Twurl promised ways for professionals to manage search data, casual searchers came to terms with bias in search, overlooking their ad-driven Faustian bargain with surveillance capitalism. Googlearchy won!

Wrapping up and restarting.

Trying to give back to Academia (2000-2005).

In the last stage of her career, Doctor Gerhart taught undergraduate software engineering, databases, discrete mathematics, and object-oriented programming at Embry-Riddle Aeronautical University in Prescott, Arizona. With an NSF security education grant, she directed a student-oriented demonstration of the buffer overflow programming mistake that underlies many security attacks. Co-authors were educational technology specialist Doctor Jan Hogle and then student, now faculty at the University of New Mexico, Jedidiah Crandall. Bundled with software development and programming language advice, this Java applet was used for a decade in classrooms and laboratories worldwide.

The Journey to Vision Accomplishment (2005-).

Born severely myopic and prone to retinal degeneration, Gerhart began a journey in 2005 to overcome legal blindness and print disability. Her adaptations to reading, writing, walking, and communication are depicted in her blog “As Your World Changes: adjusting to vision loss through technology, with class”. She benefits from accessible books at Bookshare, with over 400,000 books ready for reading by synthetic speech. The globally distributed free screen reader NVDA (NonVisual Desktop Access) enables her writing, browsing, grousing, and other Windows interfacing.

She delivers short tutorials, gesturing on an iPod Touch and iPhone via the Apple VoiceOver reader and myriad navigation, reading, and communication apps learned from the AppleVis accessibility portal, Eyes On Success interviews, and many more podcasts generously shared among the blind and visually impaired community.

Gerhart interacts with a network of macular degenerates and, as @slger123 on Twitter, tweets within the flow of #accessibility and #a11y.

Gerhart shares experience through workshops on “Using Things That Talk”, advocates web accessibility within computing organizations, and maintains a public service for visually impaired citizens of Prescott AZ.

Diversity and inclusiveness (1986-).

A charter member of the Systers community for women in computing, she remains active after 25 years, now commenting on accessibility, disability, and history of computing.

Gerhart supported the local TRIO Upward Bound program for first-generation pre-college students with courses in web design and security. She contributed to the Center for Minorities and People with Disabilities in IT (CMD-IT) as a member of the advisory board. Proudly speaking, her namesake “Susan E.” is a rock star software engineer in Portland, Oregon.

Learning widely with peers (2008-).

As a Governing Council member and student at the Osher Lifelong Learning Institute (OLLI) at local Yavapai College, she expands her interests in science, philosophy, and literature. She has offered peer learning groups in social media, Internet history, and the Singularity, exhibited on the website YC OLLI Asks. Her Bookshare source of reading enables participation in the 2nd Tuesday book club.

She wrapped up five years serving on the OLLI Governing Council with a panel, looking back over 50 years, on “The Sputnik Moment” (YouTube), helping this retiree organization to “contribute to a rapidly changing, multicultural, multi-generational society”.

On to “Novel Writing” (2013-).

In her remaining time, having been hooked during an OLLI “novel writing” course, she is attempting her ultimate creative synthesis in “A Chip On Her Shoulder”, a novel about social media situations tracing through many of the events of this research narrative.