I am interested in why Americans have such an enduring love affair with Paris and, to a lesser extent, the rest of France.

What is the historical and/or cultural basis for this?

For those of you who have been to France: Did it live up to your image and expectations of the place?

EDIT: I thought this was obvious, but evidently not: I am asking about the American perspective and opinion here. Yes, other nationalities visit France too, but this is AskAnAmerican, and I am after YOUR thoughts on the subject.
