Reform
By healing soldiers on the battlefield, taking over plantations and farms, carrying messages, and even fighting alongside men, women began to see themselves as equals to men. This was the first time in the history of the United States that women played a direct role in a war effort, which permanently changed the way women were viewed in society.
An emblem of the Red Cross, a society formed after the war as a result of changed views about women and their roles as nurses
- It was necessary for women to step up and take more involved roles during the Civil War
- They became accustomed to this independence and these responsibilities
- The experience and respect women nurses gained during the Civil War encouraged them to open hospitals
- Women such as Clara Barton established the American Red Cross to help people all over the United States
- Women on the home front learned that they were just as capable as men
- Some women soldiers, such as Jennie Hodgers, even chose to retain their male identities after the war
- Women retained their taste of independence after the Civil War; having experienced these new privileges, they could not fully return to the way their lives were before the war
"The pendulum never swings back to where it was beforehand." (Exclusive history fair interview with Foner)
- New roles of women reformed society's views of women
- Women began to be viewed as more helpful and just as capable as men