Watched a documentary on Netflix tonight called What The Health. It's an alarming look at how eating animal products can induce disease in our bodies. I've often wondered whether I could shift my eating habits to be more animal-friendly. After seeing this film, I feel more compelled than ever to switch to a plant-based diet.
Thursday, July 20, 2017