
In my undergraduate class on the history of education in America, I ask my students to use the term “higher education” to describe both high schools and colleges in the 19th century. The past is not the present in miniature, I tell them. In those days, high schools and colleges often resembled each other.

For one thing, only a small fraction of American youth went to high school. Incoming students had passed an entrance exam. The courses were rigorous, electives were rare, and whether they planned to find a job after graduation or go to college, most students worked hard. The teachers were sometimes called professors, and several high schools even granted bachelor’s degrees (Labaree, 1988).

To illustrate the value of the high school diploma, I tell my students that graduates could enter medical and law schools without taking a single college course. Moreover, high school graduates — usually the male graduates — easily secured white-collar office jobs. And in many districts, 18-year-old women could teach elementary school. That gets the attention of my undergraduate teacher education majors!

Not every 19th-century high school was so demanding, to be sure. In many rural areas, the so-called high school was a room or two attached to the elementary school. But the better schools set the standard: A good high school was selective and serious, requiring as much time — four years — as college. And like college, it was a special place, bestowing on graduates a ticket to the middle class. No wonder fewer than one-fourth of the graduates bothered to go to college.

Even so, colleges proliferated throughout the 19th century, soaring from 22 in 1800 to more than 800 by 1880 — at that point, there were 16 colleges for every 1 million Americans, a ratio never again reached. In that year, only four universities in England granted degrees; the state of Ohio, with 20 million fewer people, had 37 colleges (Labaree, 2017).

But some colleges were high schools in disguise. “All sorts of institutions assume the name of college,” the president of Pennsylvania State University complained in 1892 (College Association of the Middle States and Maryland, 1893). One-fourth of the nation’s small colleges were not truly colleges, according to the president of the University of Chicago (Boyer, 2015). Many impoverished colleges needed every tuition dollar, so they overlooked or scaled back their entrance requirements. On average, colleges attracted only 25 to 30 freshmen, and it was possible to get in without a high school diploma.

Many colleges tried to hold the line by putting “sub-freshmen” in preparatory departments. In fact, by 1870, the number of sub-freshmen in American colleges matched the collegiate enrollments. Those youngsters had their own courses, but they shared the faculty and they roamed the same campus as the collegians. While there was a distinction between secondary and college coursework, the boundary line could be hard to spot whenever students ranging in age from 13 to 25 were taught in the same buildings by the same teachers (VanOverbeke, 2008).

And there’s another important reason why the definition of college was often blurry in the 19th century: New kinds of institutions offered a mixture of basic and advanced work to prepare students for particular careers. Most agricultural colleges, for example, expected their incoming students to have no more than an elementary school education (Geiger, 2015). Normal schools, created to train teachers, offered programs ranging in length from one to four years, although few students stayed more than two years. The least prepared students studied the same academic material as high school freshmen or sophomores. Students with stronger backgrounds could pursue a wide variety of academic courses — some comparable to high school, others to college — in addition to the pedagogical courses everyone took (Ogren, 2005).

Also emerging were commercial schools — which, after the Civil War, grew as fast as the normal schools — offering penmanship, bookkeeping, math, and other practical courses to prepare clerks and office workers in an era when those were still desirable jobs for men. Were these high schools or colleges? On one hand, many called themselves colleges, and the median student age was 19. On the other hand, studies rarely lasted even two years, and elementary school graduation sufficed for admission (Weiss, 1981; Angulo, 2016).


By the late 19th century, then, American education was not a streamlined system with clear demarcations from one level to another. Different institutions emerged at different times rather than arising together as parts of a coherent plan. There were no federal mandates to steer them, and state oversight was modest — in 1890, the typical state department of education had just two staff members (Tyack & Hansot, 1982). And in the unfettered marketplace, vendors competed more than they cooperated.

What a mess

What a mess, my students sometimes say. (Little do they know that the full story is even more bewildering. In some parts of the country, military institutes and female seminaries were also described as higher education.) To their relief, I explain that yet another of that era’s innovations, research universities, helped simplify the array.

Universities had good reasons to work closely with high schools. If high schools did a better job teaching the basics, then professors could pare down the introductory survey courses (the lion’s share of the 19th-century college curriculum) and concentrate on seminars and research. Furthermore, the late 19th century saw the rapid rise of many new academic fields (as well as the redefinitions of old ones, including teacher education and commerce), and decisions had to be made as to which subjects belonged at which level. There were also purely pragmatic considerations: Cooperation boosted enrollments across the board. If any graduate of a high school approved by the state university could enroll as a freshman, then there was a new incentive to finish high school. By accepting all graduates from credible high schools, universities (and the many colleges that also used admission by certificate) bolstered both their enrollments and their status.

My students are surprised when I point out that from the 1880s through World War I, admission by certificate became far more common than the old custom of admission by written examinations. How remarkable that teams of university professors would visit dozens of high schools, decide which ones were satisfactory, and then welcome any and all graduates to their freshman class. What a leap of faith to assume that everyone from a decent high school deserved a place in the University of Michigan (which pioneered the system) and many other first-rate campuses (Wechsler, 2001).


I then describe other efforts to clarify the boundaries between secondary and higher education.

Spurning the option of admission by certificate, several prestigious colleges and universities created the College Board, which gave examinations meant to ensure that high school students met particular requirements in various academic fields. By spelling out those requirements in detail, the College Board shaped the corresponding high school courses. For instance, the eight-part syllabus for physics stipulated 51 different experiments.

Furthermore, prominent university leaders and other educators convened blue-ribbon groups (most notably the Committee of Ten in 1893) to sketch what a true high school should and should not teach. And when the business mogul/philanthropist Andrew Carnegie endowed pensions for college faculty, it became important to decide the true meaning of “college” as well — the foundation needed to decide which faculty would be eligible for a pension. To the foundation officers, a genuine college had several attributes, including a four-year time span and admission restricted to graduates of four-year secondary schools. They even specified how often a course should meet in a real high school and college — hence the familiar Carnegie unit and the emphasis on time served and credits accumulated as the basis for promotion and graduation.

Many other players joined the quest for order, I tell my students, including state governments, individual educators, and organizations such as the National Education Association. It is a complicated story, but the outcome is clear: By the 1920s, it was much easier to differentiate secondary education and higher education.

The advantages of messiness

Because my students like to see history as a record of progress, they tend to assume that the alignment of secondary and higher education was entirely positive, another sign that American education got better and better over time. So I like to push back by discussing several advantages of the old ways and drawbacks of the new.

It may be tempting to write off the earlier period as helter-skelter confusion, but there were significant benefits to the messiness of education in the 19th century. For instance, when small towns raised money and contributed land to start a school, they gave their local economy a shot in the arm and coaxed outsiders to move there. There were also great opportunities, especially in cities, for individuals eager to start their own schools without local sponsorship. For example, the initial boom in the study of shorthand relied on private instructors, for-profit institutes, and correspondence courses since high schools rarely offered the subject in the 1880s and 1890s (Hampel, 2017). Likewise, it was possible for an individual teacher to take the plunge and create a private academy, as Abraham Flexner — who later transformed medical education — did in Louisville, quitting his public high school job to coach five adolescents. He was so successful that the growth of the Flexner School prompted the president of Harvard to ask Flexner what instructional methods he used (Bonner, 2002).

With many choices in the lightly regulated educational marketplace, unwary students could be shortchanged or swindled, but they also had much flexibility to shape their own path. At a time when formal credentials were less important than they are today, many students attended this or that school for a few years. There was no disgrace in not graduating. For instance, an elementary school teacher could persevere in high school and then teach, but she could also attend a normal school for one or two years, or she could just learn enough on her own to pass the school board’s examination. The route taken by Lewis Terman — before becoming one of the country’s most influential psychologists — was not atypical: He attended normal school for a year, taught, returned for a second year, taught again, returned for a third year, worked for three years as a school principal, and then entered Indiana University as a junior (Murchison, 1932). In many fields, there were far more stop-start-stop-start educational trajectories than we see today.

Furthermore, in the years before the high school diploma became a college admissions requirement, savvy students could prepare on their own to pass a college’s entrance exam. All that mattered was success on written tests. Grades, class rank, athletic feats, and extracurricular activities were irrelevant. And because nearly every college indicated which books to read and often published the previous year’s exams, students knew exactly what they needed to study, either on their own or with the help of a tutor. As a result, the age range on most 19th-century campuses was wide. Bright 15- and 16-year-old freshmen would sit next to classmates in their early to mid-20s who had worked for years. And if the ambitious excelled on the entrance exams, they could start as sophomores; in fact, the future architect Louis Sullivan entered MIT as a junior (Twombly, 1986).

The disadvantages of tidiness

My students also learn that there were disadvantages to making the boundaries less blurry. I point out three, in particular.

First, the prestige of high school declined. The word “secondary” began to convey not just a part of a sequence but a lesser status. By the mid-20th century, high school teachers were rarely called professor, and their salaries lagged behind what college faculty earned. The growing reliance on SAT and ACT tests also downplayed the importance of what students achieved in high school — what mattered was their apparent potential for future success. Rising enrollments meant that high school graduation was no longer the badge of merit for a small fraction of teenagers (in contrast, the status of college graduation eroded less severely when enrollments soared after the mid-20th century).

Second, getting an education took longer. Neither high schools nor colleges showed much interest in reducing their traditional four-year time spans. A few people tried — in the 1930s and 1940s, for example, the president of the University of Chicago wanted students to start college at 16 — but alternatives rarely took hold. Nor did anything come of periodic suggestions to eliminate college entirely (as the Germans had done) by strengthening high schools and steering the graduates directly to professional training. Instead, Americans came to believe that more time in school and college was not only academically useful but also psychologically crucial. Close friendships, athletic glory, extracurricular success, and opportunities to date attractive young men and women rivaled the formal curriculum. Staying enrolled offered chances to develop, mature, and exercise leadership skills that would be useful after graduation (Clark, 2010). “Activities like the Crimson [student newspaper] give experience more valuable than the classroom,” was the opinion one Harvard provost heard on his campus, where “football builds character,” “the class loyalty of Harvard men is a cherished thing,” “the houses are communities in which values are developed,” and therefore “you must not hurry the educational process — it takes time and exposure to let ideas seep in” (Keller & Keller, 2001).

Third, the new boundaries could be illusory. By the 1920s, the messy array of 19th-century schools and colleges had been cleaned up, and the system looked relatively logical, rational, and streamlined. But the overlap between school and college was still substantial. In the 1930s, for instance, many college students knew less than high school students. In one study, 22% of high school seniors surpassed the test score of the average college sophomore, and 10% did better than the average college senior. Within each grade, the variations were vast. College graduates wore identical gowns for commencement ceremonies, but they were hardly alike in their academic achievement — some resembled 8th graders and a few surpassed their professors (Learned & Wood, 1938).


The decision to group students of similar ages and to promote them on the basis of credits earned meant that cohorts of young people moved ahead in the curriculum, side by side, even though they had wildly different knowledge and skills. And the mid-century rise of community colleges didn’t make it any easier to sort out who belonged at what level. Were the new community colleges just “high school with ashtrays,” as some skeptics said, or were they bona fide colleges? With their multiple purposes and diverse clientele, community college boundaries lacked sharp definition.

19th-century lessons for 21st-century challenges?

To be aware of this history is to understand the full significance of the ways in which Americans are currently rethinking the boundaries that separate high school and college. Early colleges, dual enrollment, 2 + 2 partnerships with community colleges, the expansion of Advanced Placement, the growing popularity of gap years . . . The borders have become more and more porous over the past few decades, and high school students have more and more choices about how and when to cross them.

In the 19th century, choice was the name of the game, and the marketplace provided many options and alternatives for education beyond the basics. As I tell my students, there were benefits to the messiness of that era, but there were disadvantages, too. The challenge today, as we reintroduce some of that disarray, is to provide what 19th-century choice often lacked: reasonable regulation to prevent fraud, along with concerted efforts to make sure that young people are well-informed about the choices they make. The blurring of boundaries should be an opportunity, not a trap.

 

References

Angulo, A.J. (2016). Diploma mills: How for-profit colleges stiffed students, taxpayers, and the American dream. Baltimore, MD: Johns Hopkins University Press.

Bonner, T.N. (2002). Iconoclast: Abraham Flexner and a life in learning. Baltimore, MD: Johns Hopkins University Press.

Boyer, J.W. (2015). The University of Chicago: A history. Chicago, IL: University of Chicago Press.

Clark, D.A. (2010). Creating the college man: American mass magazines and middle-class manhood. Madison, WI: University of Wisconsin Press.

College Association of the Middle States and Maryland. (1893). Proceedings of the 4th annual convention. New York, NY: Holt.

Geiger, R.L. (2015). The history of American higher education: Learning and culture from the founding to World War II. Princeton, NJ: Princeton University Press.

Hampel, R.L. (2017). Fast and curious: A history of shortcuts in American education. Lanham, MD: Rowman & Littlefield.

Keller, M. & Keller, P. (2001). Making Harvard modern: The rise of America’s university. New York, NY: Oxford University Press.

Labaree, D.F. (2017). A perfect mess: The unlikely ascendancy of American higher education. Chicago, IL: University of Chicago Press.

Labaree, D.F. (1988). The making of an American high school. New Haven, CT: Yale University Press.

Learned, W.S. & Wood, B.D. (1938). The student and his knowledge. New York, NY: Carnegie Foundation for the Advancement of Teaching.

Murchison, C. (ed.). (1932). A history of psychology in autobiography, Vol. 2. Worcester, MA: Clark University Press.

Ogren, C. (2005). The American state normal school. New York, NY: Palgrave Macmillan.

Twombly, R. (1986). Louis Sullivan: His life and work. Chicago, IL: University of Chicago Press.

Tyack, D. & Hansot, E. (1982). Managers of virtue: Public school leadership in America, 1820-1980. New York, NY: Basic Books.

VanOverbeke, M.A. (2008). The standardization of American schooling: Linking secondary and higher education, 1870-1920. New York, NY: Palgrave Macmillan.

Wechsler, H.S. (2001). Eastern Standard Time: High school-college collaboration and admission to college, 1880-1930. In M.C. Johanek (ed.), A faithful mirror: Reflections on the College Board and education in America. New York, NY: College Entrance Examination Board.

Weiss, J. (1981). Educating for clerical work: The 19th century private commercial school. Journal of Social History, 14(3), 407-423.

 

 

Citation: Hampel, R.L. (2017). Blurring the boundary between high school and college: The long view. Phi Delta Kappan, 99(3), 8-12.

 

ABOUT THE AUTHOR


Robert L. Hampel

ROBERT L. HAMPEL is a professor of education at the University of Delaware, Newark, Del., and the author of Fast and Curious: A History of Shortcuts in American Education.