After fighting wars with Britain, Germany, and Japan in the past, what is the United States' relationship with those
nations now?
The relations are very strained due to the past wars.
The United States is on friendly terms with all these countries.
The United States conquered the countries and now controls them as colonies.
The United States and Britain work well together because they share a common language, but Germany and Japan remain unfriendly
toward the United States.