Skin Cancer

Skin cancer is cancer that forms in the tissues of the skin and is the most common form of cancer in the United States. There are several types of skin cancer, classified by the type of skin cell in which the cancer begins; the most common are basal cell carcinoma, squamous cell carcinoma, and melanoma. Over the past 30 years, more people have had skin cancer than all other cancers combined. Regular skin self-exams are recommended to help detect new or changing growths early.