Word
west
Meanings
- To or toward the west. (adverb)
- The direction toward the point of the horizon where the sun sets at the equinoxes, on the left-hand side of a person facing north, or the part of the horizon lying in this direction. (noun)
- The western part of the world or of a specified country, region, or town. (noun)
Synonyms
to the west
westward
westwards
westwardly
the Occident
western
westerly
occidental