Whatever happened to Hollywood having a positive outlook? It looks like every major star has a movie coming out that shows the Earth in ruins.

In Will Smith's new movie "After Earth," which comes out this June, all of humanity has abandoned Earth because everything here has evolved to kill us. Tom Cruise has "Oblivion," in theaters April 19th, where the humans have to leave Earth after a war with aliens. Even Brad Pitt is profiting from the end of the world as we know it with "World War Z," in which he has to stop the zombie apocalypse in June.

So let's recap: Smith leaves Earth because EVERYTHING here hates and wants to eat people, Cruise has to leave after a war with aliens, and Pitt is trying to end a war with zombies. Whatever happened to happy endings and bright outlooks? I bet Matt Damon would do a positive movie set in the future! WRONG. Even Agent Bourne is offering a scary, bleak look at the future in his new movie "Elysium."

See yet another terrible prediction for our future when "Elysium" hits the big screen this August.