INTRODUCTION: The Origins of Modern Western Society

The notion of “the West” has ancient origins. Greek civilization grew up in the shadow of earlier, more established civilizations to its south and east, especially Egypt and Mesopotamia. The Greeks defined themselves in relation to these more advanced cultures, which they lumped together as “the East.” They passed this conceptualization on to the Romans, who in turn transmitted it to the peoples of western and northern Europe. When Europeans began establishing overseas colonies in the late fifteenth century, they believed they were carrying Western culture with them, even though many of its elements, such as Christianity, had originated in what Europeans by then regarded as the East. Throughout its long history, the meaning of “the West” has shifted, but in every era it has signified more than a geographical location.