Prisoner's dilemma here. If you've built a super-AI, that's proof super-AIs are possible, which means that unless they're stopped, someone else will inevitably build their own. So while your super-AI can't be guaranteed safe before release, even if it works and doesn't cause the apocalypse, the next one, built by someone else, still might.