Why do we still feel the need to tan? Even though we know that a tan is a sign of skin damage (and I don't think anyone wants a melanoma), there is still an urge to be tanned. The other day I was speaking to a client who feels that she can't go out in summer with her white legs. Another feels she has to spray tan herself every week to look healthy, yet a tan is not a sign of health.
Nobody says that Nicole Kidman or Cate Blanchett needs to get a tan, so why do the rest of us feel that white is so unattractive? This is not true in many Asian countries, where white is what most people want to be, and where people spend time and money on bleaching their skin and using whitening lotions.
Some say that a tan makes cellulite less obvious, and that we look slimmer when tanned (darker skin recedes more than lighter skin), but is that really worth potential death or disfigurement?
Do you still like to tan, whether fake or real? If so, why?