At the risk of derailing things with a tangential topic, Roko's Basilisk is not quite the stupidest idea I've ever heard, but it's in the top two percent.
In the first place, entropy. Information is continually being destroyed. The idea that sufficient information will still exist at some future point to reconstruct a simulation of me as I am now is a non-starter. And if some God-mode AI decides it feels like making simulations of people (random people, not specific individuals, since entropy places the information needed to reconstruct specific individuals out of reach) and torturing them, then that sucks, but there's nothing I can do about it.
In the second place, even if the God-mode AI can make a perfectly accurate simulation of me, it won't be me. I'll be dead by then. The existence of some simulation with my memories, personality, etc., is interesting, but it will be a different person, and for that matter a completely blameless one: if the thing making the simulation already exists, then obviously the simulation cannot have prevented or delayed its existence, so there is nothing to punish it for.
In the third place, a willingness to modify one's own course of action based on the suffering of a hypothetical simulation is the sole reason for any such simulation to be made. Therefore it is a horribly bad idea to modify one's course of action based on the suffering of some hypothetical simulation: refuse to be blackmailed and there is nothing to be gained by making the threat. Or to put it another way, if it turns out that I am the simulation, then what the hell was the point of simulating me, given that I'm a simulation of someone who has logically concluded that his behavior mustn't be modified in the slightest by any such threat?
In the fourth place, this is supposed to be a "Friendly" AI? A friendly AI would, IMO, have such a thing as empathy, and would find such a pastime repugnant.
Just sayin'. The idea is complete nonsense.