The USA seems to be in a constant fight with countries in the Middle East — mainly, we are told, because those countries are Muslim nations that treat women badly, oppress their civilians, and so forth. Right? The single most extreme Muslim country is Saudi Arabia. Women there are not even allowed to drive! On top of that, the country is known to support the terrorist organisation Al Qaida.
So why does nobody seem to wonder why the United States of America and other Western nations sell billions of dollars' worth of weapons to the most extreme Muslim country in the world?