Whether because of the current political situation or for other reasons, what drew you to the idea of living in another country? Do you think whatever benefits it offers are really worth it, or is the grass just greener on the other side of the fence?
If things go south in the US, it's unclear how safe anywhere else in the Americas would be, given its hegemony over the region.