Definition (Source: Wikipedia)
Health care, or healthcare, is the prevention, treatment, and management of illness and the preservation of mental and physical well-being through the services offered by the medical, nursing, and allied health professions.