Seriously, why do white people feel it is their destiny to own everything, even if it's not theirs? How can you own land, water, and natural elements? Whether it's killing the innocent or lying to them, they feel that no matter what, if they don't own it, then it's no good. It's like they think they're God's gift to the world or something. I mean, damn, fools, let a man breathe. And I dare you guys to give me an example where white people haven't tried to own everything.
Get off your high horse.