The title is a bit grandiose, but I'm serious about the question. Can a person really own land in the United States? As in, the government ceases to have any claim over that land because it is privately owned? The reason I ask is that I thought such a thing could be potentially bad. In good times, no one would sell, say, New York City. But in the worst of times, a person with deep pockets could pick up a surprising number of things one might consider immoral to own, at the least. Could I buy up parts of New York and other cities? Buy up parts of the state and the nation? Wouldn't this undermine politics and rule by the majority, since politics would no longer have any claim over private property? And through such a system, would it not be possible for another nation to buy up parts of America?
If you buy land in America, or in pretty much any country, you only gain the standard rights of a landowner over that land. It remains US territory, and you aren't suddenly allowed to do things against US law while on it. Foreign businesses can, however, put a country into a state of economic dependency. This currently happens a lot, often driven by the USA, especially in Latin America and elsewhere; it's called imperialism.
Even if one does not technically own land, it is still possible to indirectly own the equity in the land, and receive the proceeds from that ownership, through investment in a bank. The thing is, interest and mortgage payments serve a similar economic function to land rent. When, for example, a wealthy man in China deposits more money into American banks, the banks have more money to lend, and buyers have more money available to bid up the price of land. The interest on those buyers' mortgage payments goes to the bank, and eventually to the Chinese investor in the form of interest on his deposits.