Dentistry
['dentɪstrɪ] or ['dɛntɪstri]
Definition
(noun.) the branch of medicine dealing with the anatomy, development, and diseases of the teeth.
From WordNet
Definition
(n.) The art or profession of a dentist; dental surgery.