List of nursing schools in the United States
This is a list of nursing schools in the United States of America, sorted by state. A nursing school is an institution that trains people to become nurses, medical professionals who care for individuals, families, or communities in order to attain or maintain health and quality of life.