
In 1840, the Baltimore College of Dental Surgery was founded as the world’s first dental school, establishing the Doctor of Dental Surgery degree. Twenty-six years later, Harvard University Dental School, the first dental school with a university affiliation, began to offer the Dentariae Medicinae Doctorae degree, thus establishing the current paradigm of dental education and the practice of dentistry as an autonomous domain, completely detached from medical education and the practice of medicine.1 Over time, these domains evolved into two distinct industries, artificially severing oral from systemic health in a manner that does not reflect biological reality. The separation of dentistry and medicine has had important repercussions that continue to affect public health and healthcare delivery both in the United States and globally.

Notably, dental public health is a relatively new addition to the field of public health. Prominent milestones in United States history illustrate some of the outcomes attributable to the autonomous evolution of dentistry and medicine. While medical insurance can be traced back to the early 1930s, when the New York State Insurance Commissioner officially established health insurance to cover medical expenses, dental care continued under a fee-for-service model until as late as 1954, when California introduced the first dental insurance plans. Dental insurance that partially covered the cost of services first gained traction in the 1970s, when the United Automobile Workers Union began incorporating employer-based dental plans as a component of its collective bargaining agreements.2 However, large proportions of the population continued to lack dental insurance, with low-income populations who could not afford out-of-pocket costs disproportionately affected.

A series of congressional acts, including (a) establishment of the first community health center through the Economic Opportunity Act in 1965, (b) the 1977 Rural Health Initiative, and (c) establishment of Federally Qualified Health Centers under Medicare and Medicaid in 1989 and 1990, respectively, provided some assistance to low-income populations in accessing affordable medical care.3 However, none of these initiatives supported comprehensive dental-care services. It was not until 2000, when Surgeon General David Satcher published his report “Oral Health in America,” that public awareness was raised of what Dr. Satcher coined “the silent epidemic,” referring to the largely insidious and pervasive national oral health crisis that had become established.4 Dr. Satcher provided stark statistics on the high prevalence of dental caries in pediatric populations and poor dental health in the elderly, and raised awareness of the large access disparities affecting substantial subpopulations of Americans. Disparities in access to dental-care services were most pronounced in dental professional shortage areas and were further exacerbated by low Medicaid reimbursement rates, which were causing dentists to reduce or exclude services to Medicaid patients and children enrolled in the Children’s Health Insurance Program (CHIP).
