Quote:
Originally Posted by kandwo
That depends on what you mean by "the West". Nudity is quite accepted in the northern parts - Scandinavia + Finland. And in Germany. To someone who's grown up in a culture with a relaxed attitude to nudity, the American taboos surrounding nudity seem almost neurotic.
Of course there are regional differences, but they are relatively minor. Generally speaking, you don't see women with bare breasts walking down a main street in any Western city; you don't usually see completely naked adults in programs and movies meant for children; and so on. (Or do you see those things on a daily basis? If so, that's news to me.)