While I agree that American culture has become the most influential in Western civilization since the latter half of the 20th century (ironically it used to be British culture, hah), and that it has too much of an impact on everyone else, I disagree that its becoming less influential would result...