This year there were some major movies set in Japan (Kill Bill, Lost in Translation, The Last Samurai), and I don't know... I'm getting kind of sick of seeing Japan all the time in movies. Ever since I was young I remember watching Hollywood movies with lots of Japanese influence, like people wearing kimono in Japanese-style settings (e.g. Demolition Man with Sylvester Stallone and Sandra Bullock, which was kind of bad). I was young when I lived in Korea, so I remember people being ticked off whenever the subject of Japan came up, because of the bad history between Korea and Japan... I've been watching The Last Samurai trailer and keep seeing words like "honor, integrity, blah blah blah," but there really should be movies about the lack of human decency Japan showed toward other, weaker nations.
Most people have NO idea how cruel the Japanese were. During the Japanese occupation of Korea, there was an earthquake in Japan (the 1923 Kanto earthquake), and Japanese people blamed Koreans FOR THE EARTHQUAKE, so they slaughtered thousands of Korean people... that was really sad.
I don't blame the people of Japan right now (even though maybe I should, because Japan has NEVER issued an apology to China or Korea), but portraying them so nicely and beautifully in the past kind of makes me angry. Probably Sony has a lot of power, I guess...
People should be braver about taking on the darker events that have occurred around the world and have never been exposed much.
__________________
In this life, it's not what you hope for, it's not what you deserve -- it's what you take. -from Magnolia-