Isn't it fascinating? The long-time American institution 'Wall Street' is in serious jeopardy of simply disintegrating. And it is happening right before our eyes! This week, the American government took significant ownership stakes in all of the country's largest banks -- $250 billion all told. Questions on my mind are:
Will the banks return to being independent private organizations?
Or have the bad banks permanently damaged their brands? Is 'Wall Street' itself a damaged brand? Will these banks ever be trusted again?
Or will the government, funded by the American public, have to continue being the creditor (and debtor) of last resort?
And if it is the government, will it be effective at directing money to where it creates private enterprise that solves social problems through economic growth?