What does "westerns" mean?
1. Definition (n.) movies about cowboys, Indians, and the Wild West; movies that show what life was like in the American West in the 1800s
Example: My man loves the old westerns where the good guy on the white horse always beats the bad guy.