What is Dentistry?

Dentistry is the diagnosis, treatment, and prevention of conditions, disorders, and diseases of the teeth, gums, mouth, and jaw. Widely regarded as essential to complete oral health, dental care can also affect the health of your entire body.