It’s hard to browse any website marketed to women without seeing the words “self-care” sprinkled across headlines and advertising, or crammed among social media hashtags. These two words often appear in ads and articles featuring images of toned, slim (usually white) women mid-yoga-pose, or a perfectly staged cup of tea next to an expensive-looking candle. Today’s self-care industry seems to have fostered a near cult-like belief that the act of engaging in it will relieve us of any physical, emotional, or spiritual pain. But what happens when self-care isn’t a miracle cure-all but is, in fact, damaging to our health? What if, as I recently experienced, trying to practice self-care makes us feel worse than before?