Science and Technology. The American Economy: A Historical Encyclopedia

One of the central questions of economic policy explores
how the government can foster economic growth. Although
expanding a nation's borders or resource base promises to create growth, politicians cannot easily control the circumstances or consequences of such expansion. As a result,
economic thinkers have often focused on policies that the
state can control more easily—including those related to the
progress of science and technology. Economists and policymakers have frequently posited a causal relationship between new technologies and economic growth, often looking to Britain’s industrialization as an example. In this account, the scientific revolution of Isaac Newton and Robert Boyle in the late seventeenth century led to the Industrial Revolution of the late eighteenth century, which in turn created
Britain’s unparalleled economic supremacy in the nineteenth century. Although causality in this chain of events remains hotly disputed by many historians, the correlation between the growth of science and technology, on the one
hand, and the national economy, on the other, cannot be
simply dismissed—economic and technological developments accompany one another. Consequently, this Baconian
equation—whereby science yields technology, which yields
economic growth—has always been and continues to be an
unwritten assumption of economic policy. Attacks on this
formula in the post–World War II period have not dissuaded
economic policy makers from building programs to encourage the development of science and technology in the name
of national growth.
The ways in which scientific and technological development has been fostered have differed over time. The differences often hinge on how significant a role the federal government plays in the business life of the nation.
Consequently, for most of America’s history, the government’s involvement in science and technology has waxed and
waned, increasing in periods of crisis (such as wartime) and
decreasing in periods of less urgent need. Furthermore, the
kinds of activities the government has undertaken in the
name of science remain quite diverse, from intellectual property law to military investment, from education to the direct
funding of research.
Science and Technology in the Eighteenth
and Nineteenth Centuries
Economic policy as it applies to science and technology extends back to the nation’s very beginnings and the ratification
of the U.S. Constitution. The Constitution mentions science
once in Article 1, Section 8, and states that Congress has the
power “to promote the Progress of Science and useful Arts, by
securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.” To
do this, George Washington signed the country’s first patent
act into effect on July 31, 1790. According to this act, the federal government could grant patents for “any useful art, manufacture, engine, machine, or device, or any improvement
thereon not before known or used.” A three-member patent
board, which included Thomas Jefferson (the first patent examiner and secretary of state), Secretary of War Henry Knox,
and Attorney General Edmund Randolph, granted the
patents. Each applicant had to supply a specification, a drawing, or preferably a model and pay a nominal fee. The board
decided the duration of each patent, up to a maximum of 14
years. In addition, it established penalties for patent infringements.
The system created by the Patent Law of 1790 relied heavily on the secretary of state’s oversight and involvement. Since
this responsibility conflicted with the numerous other duties
of the position, Jefferson strove to change the system. Furthermore, with only 55 patents approved between 1790 and
1793, patent applicants expressed dissatisfaction with the delays in issuing patents caused by the competing demands on
the patent examiner’s time. Consequently, Congress passed a
new patent act in 1793. This legislation left the administration of patents in the secretary of state’s office but created a
registration system under which patents were granted pro
forma upon the completion of the required paperwork, ensuring the granting of virtually every patent. By the 1830s
complaints against this system had mounted, and the 1836
Patent Act further revised the system. That measure codified
standards for the approval of patents—definitions that persist to the present day. Essentially, it required that inventions
had to be novel, useful, and nonobvious to a practitioner skilled in the relevant area. In many ways, the 1836 system
combined the scrutiny of the 1790 system with the bureaucracy of the 1793 act.
The government’s first involvement in creating an institution for the study of science came with Jefferson’s election to
the presidency in 1800. Jefferson advocated a national institution for the sciences, which would generate graduates who
could put their knowledge to work for the common good. Although the new president initially opposed the creation of a
national military academy, his opposition waned when he realized that with the right personnel, his institution for science
and a national military academy could operate as one and the
same entity. Further, he recognized that a military academy
would attract far less political and regional opposition than a
national university.
Congress established the U.S. Military Academy at the garrison at West Point, New York, in March 1802. The academy
operated as a continuation of a military school that had been
in existence at West Point since 1796, albeit without the endorsement of the federal government. Jonathan Williams, a man of the Enlightenment who had assisted his great-uncle Benjamin Franklin in numerous experiments, headed the school as its first superintendent. Williams had also traveled extensively in France and knew French scientific and military
institutions, including the Ecole Polytechnique—the model
for the U.S. Military Academy. Jefferson clearly chose
Williams to head West Point on the basis of his scientific reputation.
Williams’s tenure at West Point was rocky, however, because of his efforts to create a scientific, as well as a military,
institution. He established the U.S. Military Philosophical Society, a scientific society open to civilians that he hoped
would become a leading organization for the production and
dissemination of new scientific knowledge. The society’s
membership rolls included Jefferson, James Madison, James
Monroe, John Quincy Adams, John Marshall, Robert Fulton,
and Eli Whitney, among others. Although the society never
achieved the distinction in science that its illustrious membership promised, West Point nonetheless did become the
leading educational institution for science and especially engineering in the antebellum period.
At least until West Point graduates proved their mettle in
the Mexican-American War (1846–1848), the academy had
to survive periods of wavering federal support, as well as outright hostility from some members of Congress. Many Americans underestimated the scientific importance of West Point,
but it did produce the nation’s most important promoter of
science in the nineteenth century—Alexander Dallas Bache
(class of 1825)—and an overwhelming majority of the nation’s university-trained engineers, many of whom went on
to hold important positions in the construction of the nation’s infrastructure. West Point produced more scientists
and engineers than the civilian colleges themselves.
In the early nineteenth century, technology assumed an
increasingly central role in the nation, particularly in terms of
industry and commerce. Engineers constructed roads,
canals, and harbors to facilitate domestic and international
trade, and inventors developed, patented, and sold the machines that underpinned the industrial and agricultural development of the nation. The government maintained a
laissez-faire policy relative to this technological development.
But with technological progress inevitably came technological difficulties, eventually forcing the government to step in to adjudicate disputes and prevent disasters. The first technology to require
government intervention for public safety involved the steam
engine. The invention and diffusion of the steamboat in the
early nineteenth century played an important role in opening
up the commerce of the nation to the western territory,
which was reachable through the country’s extensive rivers.
The steamboat made river transportation fast and inexpensive. But by the 1830s, the United States experienced a rash of
steam-boiler explosions, victims of which numbered in the
hundreds. Yet no government agency with the authority to
investigate the accidents existed.
As long as the public used high-pressure steam engines,
accidents continued. Scientists in the private sector realized
that this problem had become a subject ripe for empirical
study. Alexander Dallas Bache, then a professor at the University of Pennsylvania, organized an investigation into the
causes of boiler explosions at the Franklin Institute, a privately endowed organization in Philadelphia. After a six-year
study, the institute found that most explosions occurred because of negligence; in other words, they were preventable.
The Franklin Institute’s report called on the government to
develop some sort of regulatory legislation and advocated inspections, licensing, and penalties for noncompliance. However, all attempts to get legislation passed in Congress failed,
with arguments over the constitutionality, efficacy, and expense of such regulation stalling passage. Then, in July 1838,
a bill was passed that provided for licensing, certification, and
the appointment of regional inspectors without financial ties
to manufacturers. Furthermore, it established liability for owners and operators in the case of accidents. However, since the
act did not specify the inspection criteria, inspectors enforced
the laws haphazardly across the nation. No one liked the law
as passed, and it failed to prevent accidents, as evidenced by
the 70 explosions that occurred between 1841 and 1848.
In 1852 Congress returned to the issue of regulation. Sen.
John Davis of Massachusetts worked with engineers to construct more effective legislation. The bill Davis introduced
proved quite similar to the recommendations of the Franklin
Institute’s 1836 report. Davis’s 1852 bill met resistance from a
small but vocal group of members of Congress who opposed
any kind of interference in commerce. To them, regulation
threatened private property rights. Nonetheless, the bill
passed, and a new role for government had begun. In some
ways, the 1852 bill became a model for the regulation of technology, setting manufacturing standards, operating standards, a system of annual inspections, and licensing procedures for engineers. Congress authorized stiff penalties for
noncompliance, especially for fraudulent and falsified documentation. Inspection boards investigated accidents. This
legislation established a precedent that has justified further
regulatory oversight of new technologies to the present day.
Despite the willingness of Congress to consider a more active role in the nation’s technology, support for the development of new science and technology remained a foreign notion. For example, congressional reluctance to involve the
government in the pursuit of science, regardless of the economic costs, delayed the creation of the Smithsonian Institution. In his will, James Smithson, a wealthy British bachelor,
bequeathed his entire estate to the United States for the purpose of developing “an establishment for the increase and diffusion of knowledge among men.” He did not specify any
other stipulation in the bequest, so Congress debated
whether to accept the gift and what to establish using the half
million dollars. Finally, in August 1846, Congress passed a bill
for this project, providing a secretary, a board of regents, and
a building that would include space for laboratories, libraries,
museums, lecture rooms, and an art gallery. Clearly, a wide
range of activities were planned for this endeavor, but Congress had not decided exactly what role the government
might take in the sponsorship of science, even without the
expenditure of tax dollars. Bache served as the sole scientist
on the board, and under his direction, the Smithsonian
moved toward becoming an institution of scientific research.
He ensured this trajectory when he appointed his friend
Joseph Henry, a professor of natural philosophy at Princeton, as the institution’s first secretary. Under Henry’s management, despite
constant struggles about funding and direction, the Smithsonian became a precedent-setting private foundation that
supported scientific research as its primary goal, rather than
as a by-product of other priorities.
The American Civil War presented the federal government
with new and unprecedented military and technological
problems, from ironclad ships to steam engines to submarines. It also presented an opportunity for those pushing
for a greater federal role in the direction and funding of science and engineering. War changed the climate in Congress,
making legislators much more receptive to the idea of encouraging research, though, ironically, the cost of fighting the
war meant that funds for scientific research almost disappeared. During the Civil War, the federal government approved several institutions that would exert a lasting influence on science into the twentieth century, including the
Department of Agriculture, the National Academy of Sciences, and the Morrill system of land-grant colleges. Congress created the Department of Agriculture from the agricultural division of the Patent Office, which had long collected agricultural statistics and distributed seeds. Although headed by a chemist, the department’s scientific mission would be subordinated to the demands of American farmers until well into the twentieth century.
The Morrill Act also bridged the divide between science
and farming. Vermont Republican Justin Morrill had become
convinced that the nation was failing to provide useful knowledge to its farmers and workers. He imagined that new educational institutions would improve America’s productivity by
making practical scientific and technical education accessible
to all. After years of fighting between northerners and southerners, he drafted a bill in 1862 that offered 30,000 acres of
federal land to each state for each senator and representative
to create “at least one college where the leading object shall be
. . . to teach such branches of learning as are related to agriculture and the mechanic arts.” States could either designate
existing universities to fulfill this function, as in the case of
Wisconsin, or found new institutions, such as the University
of California. After the war, southern states divided the appropriation between separate agricultural and mechanical
colleges for whites and blacks. The colleges created by the act
became sites for the pursuit of new knowledge in engineering
and agriculture. For agriculture, the 1887 Hatch Act furthered
this mission by allocating funds for agricultural experiment
stations operated in conjunction with the land-grant schools.
Of the developments in the Civil War era, the National
Academy of Sciences possessed the most direct mission in
terms of supporting and directing scientific research. Bache
had been arguing since the 1840s for an American equivalent
to the French Academie des Sciences—to support research
through government subsidy, centrally organize and coordinate research in the nation’s interest, and advise the government on scientific and engineering issues. By 1862, with Congress seemingly interested in authorizing greater government
activity in science and with the pressing need for expert advice about military technologies, Bache decided the time was
right to pursue his notion of the academy. To do so, he secured the support of Massachusetts senator Henry Wilson.
On March 3, 1863, Wilson presented the act to incorporate
the National Academy of Sciences, which required approval from Congress but no appropriation, and it passed.
The National Academy of Sciences Act named the 50
charter members of the academy who would remain members for life, a move that elicited some ire from the American
scientific community, particularly since the new members
represented Bache’s interests in the physical sciences more
often than they represented the numerically larger community of natural historians. Even among those elected to the
academy, considerable discontent existed. But following the
lead of Joseph Henry, who accepted his nomination as a
member despite his dislike of the autocratic setup of the
academy and its prescribed membership, all nominees for
membership eventually accepted their appointments.
Controversy over the origins of the National Academy of
Sciences soon gave way to a more devastating congressional
apathy. Despite the fact that Congress had chartered the academy, it failed to consult with it for scientific and technical advice. In fact, only seven requests were made to the academy
during the war, and the Treasury and Navy Departments resisted paying the expenses of the committees that had formed
within the academy to study specific issues.
Economic policy regarding science and technology in
nineteenth-century America continued to be characterized
more by belief than action. American politicians and scientists alike commonly believed that technological progress
would lead to a more prosperous nation. However, politicians remained wary of claiming that the federal government should assume responsibility for pursuing scientific
and technological research. Scientists, for their part, wanted
to avoid offering the government any real control over scientific endeavors. So, while admitting that science and technology had a central role to play in the economic life of the
nation, neither scientists nor politicians were willing to coordinate a partnership between science, technology, and
the government.
Science and Technology in the Twentieth Century
The 1901 founding of the industrial research laboratory at
General Electric (GE) set a new tone for science and technology at the beginning of a new century. GE’s lab was neither
the first nor the only industrial research laboratory in the
United States, and the industrial research laboratory was not
a uniquely American development. Still, GE’s reputation, the
size of its research arm, and its high visibility helped establish
a growing tradition of commercial research facilities. Industrial research labs represented a new alliance between science,
technology, and industry in America. The corporate pursuit
of research and development (R&D) helped set the new tone:
Science no longer functioned as an esoteric activity pursued
only at universities and by private scientific societies—rather,
it became germane to the economic life of General Electric
and therefore the nation. This development promised to produce a national attitude more conducive to government interest in scientific and industrial research. In addition, intellectuals such as Charles Sanders Peirce and John Dewey
trusted technology to improve the life of the nation, both socially and economically. Those politicians who resisted
greater government involvement in science often did so not
because they doubted science’s economic promise but rather
because they came from a laissez-faire ideological position—
believing that the government should not interfere in the
market and that supporting research was interference. According to this view, it was fitting that GE and other large corporations set up large industrial research facilities, precisely because their work served a business mission. Although national interests required successful companies, they argued,
the government should not directly aid those companies.
But national defense proved another matter altogether,
and in that sphere, the notion of governmental involvement
met no resistance. As a result, scientists advocating more government support often heightened their efforts during
wartime. This dynamic occurred during both world wars. Although the federal government did create new agencies related to scientific and technological research—such as the
National Bureau of Standards (in 1901), the government’s first physical science laboratory, created in peacetime—
Congress established nearly all the agencies with scientific
missions under the cloud of war.
In 1915 the promise of the airplane as a military tool
helped create support for an agency devoted to aeronautical research. Attaching the measure to a naval appropriations
bill, Congress created the National Advisory Committee for
Aeronautics, or NACA, “to direct the scientific study of the
problems of flight.” Although only $5,000 was appropriated
for research, the move to direct federal support for research
in any field constituted a notable change from congressional
attitudes in the past. NACA’s board consisted of 12 members
appointed by the president, though notably no one from the
aircraft industry received an appointment until 1939. NACA
originally operated as a committee to provide technological
advice and as such reported directly to the president, but it
gradually evolved into more of a research agency. In 1917
NACA set up its primary research facility, Langley Field in
Hampton Roads, Virginia; others would follow. NACA became a model agency that was largely devoted to research
into civilian flight, and the military branches took control of
their own research. In addition to its work in aeronautical research, NACA helped gain passage for bills such as the Kelly
Air Mail Act of 1925, which authorized the use of private
companies for airmail delivery, acting essentially as a government subsidy of the nascent commercial air travel industry.
The Air Commerce Act of 1926 created the Aeronautics Branch within the Department of Commerce, which provided regulatory oversight of the whole air industry, in ways
not entirely dissimilar to earlier steam-boiler regulation. In
1958 NACA became the National Aeronautics and Space Administration, or NASA.
The success of NACA as a site for limited government-sponsored research notwithstanding, prominent scientists
had greater visions for the marriage between federal support
and scientific research. Early in the twentieth century, George
Ellery Hale, founder and director of the Mount Wilson Observatory in Pasadena, California, and one of the founders of
the California Institute of Technology, saw the war in Europe
as an opportunity to promote American science. He presented a plan to the National Academy of Sciences in April
1916: If the United States went to war with Germany, the academy would offer its services and resources to
the president. This plan received a unanimous endorsement
by the membership of the academy, and the academy
planned to send a delegation to Woodrow Wilson. A group of
five eminent scientists met with the president and stressed
the importance of science to the nation’s defense. Wilson
agreed to involve the academy in the creation of an arsenal of
science. Back at the academy, the National Research Council,
or NRC, was formed to promote cooperation between research institutions and leading scientists and engineers in
universities, industry, government, and the military.
Hale’s plan was highly centralized, investing a great deal of
power in the NRC. Consequently, it generated some resistance, though it also made the secrecy needed for wartime
more manageable. Hale intended to work directly with the
president instead of through any intermediary institutions.
For this reason, he also sought the approval of Wilson’s 1916
Republican opponent, Charles Evans Hughes; Hale wanted to
ensure the NRC’s position regardless of who won the 1916
election. However, like Bache before him, he sought the cooperation but not the oversight of the government. His National Research Council would contract to perform and coordinate research for the government, but it would not
operate as a government agency. As a result, the NRC continued to be funded by private gifts, just as the National Academy of Sciences had been. Given the short duration of the war after
the United States entered into it, little time remained to test
these arrangements.
In March 1918 Hale worked to make the National Research Council and its connections to government permanent. He wanted to do this through an executive order, so that the NRC could remain a private organization without governmental control. Wilson agreed and signed an order in May 1918 to make the NRC a permanent executive scientific advisory council. Hale reorganized the NRC for peacetime in 1919 and shifted its focus to pure rather than industrial research. As the United States retreated into its isolationist position, Congress cut funding, and the NRC’s connections with government, especially the military, suffered.
The NRC reinforced the role of American universities as
the frontline institutions in scientific research. In the face of
extremely limited governmental support, the council also
worked closely with the growing number of philanthropic
patrons of science, such as the Carnegie Institution and the
Rockefeller and Guggenheim Foundations.
During the Great Depression of the 1930s, even the small
amount of funding that had supported limited scientific research dried up. Debates about whether technology, by increasing productivity, had increased unemployment changed
the public’s impression of technology. The Progressive Era’s
unparalleled faith in technological progress vanished, replaced
by a suspicion that technology had contributed to the dire circumstances of the period. However, policy changed with
Franklin Delano Roosevelt’s New Deal. The Works Progress
Administration (WPA), committed to finding jobs for skilled
people, ended up supporting some scientific research and
many engineering projects. By 1938 most federal spending on
science (including technology and agriculture) had been restored to pre-Depression levels. And as the United States grew
closer to war, the WPA moved into defense projects, with increasing scientific and technological components.
War again provided a significant catalyst for government
interest in scientific research, and like Hale and Bache in previous wars, one individual played a prominent role in creating a new vision of scientific and technological cooperation
with the government. In 1939 Vannevar Bush went to Washington from his position as a dean at the Massachusetts Institute of Technology (MIT) to head the Carnegie Institution,
one of the philanthropies primarily involved in funding scientific research. Bush received an appointment as the chair of
NACA. An electrical engineer, he possessed a centralized, hierarchical vision of science. Concerned about Germany’s aggression in Europe, he supported military modernization and
preparedness.
Bush took the lead in organizing science for war. He approached Harry Hopkins, Roosevelt’s closest adviser, in May
1940 with a plan to mobilize and coordinate researchers
under nongovernmental experts like himself. Hopkins saw
Bush as the man who could harness America’s considerable
technical resources in the national interest. By June Roosevelt
had created the National Defense Research Committee
(NDRC), and the president began to delegate science and
technology policy to Bush. In hindsight, it is clear that, with
the NDRC, the mobilization of scientific and technological
resources began a full year and a half before the United States
entered World War II. When the government needed science
to advance the war effort, science was ready.
Whereas earlier attempts by scientists to contribute to war
efforts had been more promise than action, science played a
much more important role in World War II. In May 1941
Bush headed the Office of Scientific Research and Development (OSRD), a newly created agency put in charge of the NDRC.
This office, though providing scientific and technological
R&D for the military, remained under civilian control. Bush
sought out scientists, engineers, and technicians; offered over
9,000 draft deferments; and placed people where their skills
and experiences would be most useful. The OSRD also contracted research to additional private, university, and government institutions, and Bush could move projects between institutions. The OSRD oversaw most of the important
technological developments of the war, from radar to the
proximity fuse—with the notable exception of the Manhattan
Project, which began as an OSRD project but, for reasons of
budget and secrecy, was transferred to the Army Corps of Engineers, where it essentially functioned autonomously. In addition to Bush’s OSRD, the military branches themselves
spawned new R&D capabilities during the war. These agencies
often quarreled with the OSRD over personnel and projects.
Still, Congress rarely limited funding in the war years, and the
federal R&D budget (including agriculture) grew from $74.1
million in 1940 to $1.59 billion in 1945. The government
spent over $2 billion on research during World War II—not
including the Manhattan Project—divided roughly equally
between the army, the navy, the army air corps, and the OSRD.
The size of the federal government swelled during the war,
and although it did contract afterward, it did not shrink all
the way back to prewar levels. Vannevar Bush wanted to ensure a continued partnership between his researchers and the
government. However, he hoped to make certain that scientists, not politicians or bureaucrats, made the key decisions
about what research to pursue. Like Hale, he envisioned government support without government supervision. However,
the Keynesian vision of the state’s role in the economy came
into conflict with Bush’s vision. If Bush argued that scientific
research played a central role in economic and technological
development, which he did, then it would be hard for him to
convince the government to leave the direction of that research to a small, elite committee. As argued by Senator Harley Kilgore of West Virginia, Bush’s opponent in the debates about
the structure of the National Science Foundation, something
with such a strong influence on the nation’s economic future
belonged in democratic hands. For five years, from 1945 to
1950, Bush and Kilgore engaged in a high-profile debate over
the government’s role in the sponsorship of science. They
agreed that the government should aid R&D spending, but
they disagreed about just how much direction and oversight
the federal government should provide. Kilgore advocated a
central agency to direct and fund research in the interest of
economic growth. Bush wanted an agency controlled by scientists, with basic science as their priority.
Meanwhile, others in Congress remained less supportive
of funding science and instead sought policies to create an
economic environment in which market forces would encourage companies to invest in R&D. They contended that
private R&D should be supported by university research,
which could be funded to a lesser extent by the government.
In its 1947 report, the president’s Scientific Research Board
called for the nation to spend 1 percent of its national income on R&D. By the 1950s this level of funding had become a
standard expectation.
In May 1950, a month and a half before the beginning of the
Korean War, President Harry Truman signed the National Science Foundation (NSF) Act. The act fixed the structure of the
NSF, which would be supervised by a board appointed by the
president that would share power with a director. Alan Waterman, the chief scientist at the Office of Naval Research, became the first director of the NSF. Hardly the dictator Bush
feared, Waterman worked cooperatively and deferentially with
scientists in the academy. Through the NSF, the federal government sponsored research, but scientists at nongovernmental institutions, principally universities, would perform the
work. In addition, the NSF supported the kind of basic research that Bush had promoted in his report
Science, the Endless Frontier. During the five-year fight for the NSF, other new
and existing institutions and agencies, such as the National Institutes of Health and the Atomic Energy Commission, had
taken over many of the functions that Kilgore had imagined
for the NSF. However, the orientation to so-called pure science
left the NSF vulnerable to questions about its utility—Congress often wanted more concrete commitments about the
benefits of funding basic science. The NSF faced extinction in
1952 and fought for its existence in its first several years.
The NSF’s worries ended in 1957 with the Soviet Union’s launch of its Sputnik satellite. To many Americans, Sputnik became a technological symbol of Moscow’s growing and aggressive power. The United States had been developing a similar satellite since 1955, under the navy’s Project Vanguard. In fact, the country successfully launched Explorer 1 in January 1958, less than four months after Sputnik. But the impact of seeing the Soviets arrive first in space cannot be overstated. The government’s science policy in response to Sputnik encompassed several dimensions, all of which justified considerable increases in funding. In the wake of Sputnik, the federal government created new agencies, increased the funding and visibility of old agencies, and constructed initiatives for scientific and technological education. In 1958 Congress created NASA, which, as mentioned, was a transformation of NACA. NASA constituted the most visible government response to Sputnik. In addition, the National Defense Education Act created a student loan program; provided financial assistance for
instruction in science, mathematics, and foreign languages;
and gave fellowships for graduate training in science and engineering. By 1960, spurred by the cold war, the federal government had clearly taken responsibility for funding scientific research.
Between 1958 and 1968, federal funding of science remained high. Private investment in R&D grew more regularly
and steadily than the more volatile federal expenditure. Still,
the federal share of national R&D investment hovered
around 63 percent from 1960 to 1985. NASA expenses accounted for a considerable proportion of federal expenditures and peaked in 1968, in the wake of John F. Kennedy’s
pledge to send a man to the moon in the decade of the 1960s.
The effort to achieve that goal, called Project Apollo, cost
$25.4 billion and ultimately succeeded with the 1968 orbit of
the moon and the 1969 lunar landing of
Apollo 11.
However, just as spending on science reached unprecedented levels, Bush’s vision of pure science in the national interest came under fire. In 1965 the Department of Defense
sponsored its own study of the efficacy of scientific research,
called Project Hindsight. The report, issued in 1969, examined the development of 20 weapon systems and overwhelmingly credited targeted, applied research, not Bush’s pure research, for their development. Although there was some
criticism of Project Hindsight—including a refutation by the
NSF—the study changed the policy climate, casting a much
more favorable light on targeted research.
The Vietnam conflict also affected R&D spending. Although public opposition to the war and to the military more
generally cast a shadow over defense research, military procurement channeled money to the defense industry and its
R&D. Some new military technologies had been developed
under federal contracts, but others emerged more independently. Procurement acted as another way for the government
to direct R&D. For example, in 1962, the federal government
purchased the entire output of integrated circuits in their initial year of production. Many of these technologies also
worked their way into public, nonmilitary applications, from
television to the computer to the microwave. In the twentieth
century, the aircraft industry oversaw particularly successful
transfers of technology from military to civilian applications.
The end of the cold war in 1989 caused considerable confusion in terms of science and technology policy. The cold
war had given policymakers a clear national security imperative for R&D funding, and the generally strong postwar
economy ensured access to the necessary funds. Even after the
stagnant economy of the 1970s, President Ronald Reagan’s
emphasis on national defense nearly returned defense-related
R&D to its 1960s levels. By 1986 defense expenditures peaked
at 69 percent of the federal R&D budget. Across the public and private sectors combined, two-thirds of the $120 billion
spent on R&D funded defense work. Still, by 1992, defense
spending as a proportion of total R&D had only shrunk to 60
percent. Even President Bill Clinton, who claimed to favor
R&D with more direct technological consequences, sought only to bring civilian R&D to parity with military R&D by 1998. Yet the reduction in defense research brought consequences. As the national security basis of the federal investment in science eroded, so did congressional interest in supporting large scientific research projects. The 1993 collapse of support for the $8 billion Superconducting Super Collider
would become the most visible casualty.
In the postwar period, as expenses grew, so did Congress’s
interest in adjudicating scientific and technological budget
allocations. By the 1960s, concerns arose that Congress
members lacked the expertise to make these technical decisions and that they needed better access to expert advice,
much like the president had had from organizations such as
the National Academy of Sciences since the Civil War. Emilio
Daddario, a Connecticut Democrat and chair of the congressional Subcommittee on Science, Research, and Development, called for a study of Congress’s access to technical
advice and information. He found that although a system of
scientific and technical advisers for the executive branch existed, the legislative branch lacked such accommodation.
Daddario began to push for an advisory agency for Congress. However, his interests remained more than organizational—he hoped Congress could take a greater role in managing technology, especially moderating its negative
environmental consequences. For Daddario, the promising
tool was technology assessment, and the agency he sought
for Congress would take a leading role in such efforts. He introduced his legislation in 1970 and immediately encountered resistance, with most of the opposition aimed directly
at the regulatory dimension of technology assessment. Although Daddario was no longer in the House when it was
formed, Congress created the Office of Technology Assessment (OTA) in 1972 as a supplement to the General Accounting Office. When OTA began operating in 1974, Daddario became its first director. In practice, the OTA served as an
advisory body for the Congress, and Daddario’s hopes for
true technology assessment failed to materialize. Following
the 1994 Republican takeover of Congress, congressional action eliminated the OTA.
During the 20-year existence of the OTA, the president’s
system of science advisers also underwent several changes. In
1976 the Office of Science and Technology Policy was created
to “serve as a source of scientific and technological analysis
and judgment for the President with respect to major policies, plans, and programs of the Federal Government.”
George H. W. Bush’s President’s Council of Advisors on Science and Technology (PCAST) provided further support to
the chief executive. This committee coordinated access to experts in the private sector and academic community, particularly on matters of technological development, setting scientific research priorities, and reforming math and science
education. President Bill Clinton created another group, the
National Science and Technology Council (NSTC), in 1993 to
coordinate federal R&D. This council, which was to report to
the vice-president, followed clear goals for federal investments in science and technology. George W. Bush reformed
PCAST in 2001 when he created the President’s Council of
Advisors on Science and Technology (PCAST 2001), in large
part to advise and aid in decisions about stem-cell research.
Each of these groups operated with specific issues in mind,
and the subtle differences and hierarchies between these advisory bodies created room to discuss controversial subjects.
Unlike the OTA, whose ineffectiveness at investigating controversial problems stemmed from its dependent political
position, the presidential committees operated independently and, generally, in limited time frames.
The history of science and technology policy in the United
States is necessarily multifaceted because so many factors affect the development of scientific and technological knowledge. The most direct influence in this field remains federal
funding, but that funding often comes with federal control,
which scientific and technological practitioners have often
resisted. In addition, national security continues to be the
most common rationale for federal research support, and
that orientation clearly affects the nature of the science and
technology produced. Federal support in the education of
both highly trained technical personnel and the public also
plays an important role in a nation’s ability to produce science and engineering advances. In the twentieth century, the
role of private corporations in the pursuit of scientific knowledge grew increasingly important, and government policies,
such as taxation, had the capacity to affect the methods and
levels of private research support. In the case of existing technologies, federal and state regulation clearly influences both regional and national economies. Lastly, matters related to intellectual property law remain critical factors in technological development, as recent issues in technology transfer and pharmaceutical patenting have shown.
—Ann Johnson
