America or United States

machadinho

Senior Member
Portuguese - Brazil
Hi. I taught political philosophy at an American university this fall, and noticed that my students, mostly Americans, refer to the United States of America as "America," "the United States," or "this country."

I had the impression that they usually said "America" when talking about the country positively, but "the United States" (or even "this country") when they talked about it in more critical terms.

Do these phrases usually have such connotations when Americans talk about their country, or is it just my impression?

Thanks :)

  Parla

    Member Emeritus
    English - US
    No, they don't have any such connotations, Machadinho. The terms are interchangeable. Perhaps what you observed was a regional usage, if the school draws its US students chiefly from the immediately surrounding area.
     

    machadinho

    Senior Member
    Portuguese - Brazil
    Thank you all. So it was just a mistaken impression on my part.

    (natkretep, I did indeed search for the answer first but no topic discussed this particular question.)
     