I recently made a small change in my life. I recommitted myself to caring for my planet, my world, by choosing not to support animal agriculture. Over six years ago, in an environmental studies class in college, I learned about the destruction caused by animal agriculture, and I committed to becoming a vegetarian. I know this is a touchy subject. People are very attached to their habits and to their food. If this is going to offend you, stop reading and come back to it if/when you're ready to hear it. My aim is not to change anyone's mind or persuade anyone to start eating a plant-based diet; I just think there's a lot of misinformation out there, mostly perpetuated by the meat and dairy industries. So I only aim to bring a bit of light and awareness to it.
There are so many different reasons that people choose to eat a plant-based diet. Whether it's compassion for animals, care for the environment, or concern for your health, you inevitably reap the benefits of all three. Your health benefits dramatically, the planet benefits, and of course the animals benefit.
I grew up, like most people, eating plenty of meat and dairy and believing that it was healthy. Necessary, even. My grandmother is from Cuba and my grandfather was from Spain. Both cultures definitely eat meat and dairy. My dad's side of the family is American and lives mostly in the South. They also eat lots of meat and dairy, and they also believe it's healthy. It definitely took some time for my family to get used to me being a vegetarian, and it's taking time for them to get used to me being vegan. Food is a big part of people getting together, but it's not the important part. The important part is the connection you have, the conversations you have. Food is secondary. And to me, it wasn't worth risking my planet or my health for.
I honestly think most people who eat meat don't have all of the information. I don't think people would continue to eat meat in such high quantities once they understood the consequences. But with so much bad information out there, how are we supposed to uncover the truth? How do we undo a lifetime of marketing from the meat and dairy industries?
I encourage you to watch the documentary Cowspiracy. It's on Netflix, and it's one of the best documentaries I've seen. If you're interested in learning about the health benefits of a plant-based diet, I encourage you to watch Forks Over Knives, also on Netflix. These two films correct some of the misconceptions about plant-based eating and explain the risks and repercussions of eating animal products. Once you know the facts, you can make your own educated decision.
Leave me a comment if you've watched either of these and let me know how it impacted you!