Longtermism is the idea that current lives are worth less than potential or expected future lives. It is one of the main pillars of the Effective Altruism “movement”, which aims to raise as much money as possible, as quickly as possible, without regard for the present-day repercussions of those activities, in order to fund “long-term” projects (with “long-term” meaning thousands of years into the future). These projects, they say, are necessary to advance humanity.

Longtermists are absolutely frightened by “AI”, or more precisely, “AGI” (Artificial General Intelligence).