Recently in my women’s studies class we were assigned a paper on how something affects women. For my project I’ve decided to research how pregnancy affects the female body. It’s fascinating how pregnancy is just accepted as the norm for a lot of women to go through, sometimes multiple times, in their lives, yet we never talk about what it does to the body. Sure, we all know women get “fat” and often feel sick. Their feet swell and then they have a painful birth. But what I want to focus on is the lasting impact of pregnancy. What does being pregnant do to a woman’s bones and organs? How are skin and fat affected once the baby is out? You always hear about “post baby” bodies, but we never talk about why that’s something to strive for in the first place.
I’ve barely started this project, but I’m already thinking critically about these things. Pregnancy is something I’m not sure I’ll ever go through, and I have to admit that looking at these facts is making me seriously reconsider ever being pregnant! Major kudos to the women of the world who are braver than I am.