Search results

  1. Bassoe

    What If? you have an Artificial Superintelligence to give directives to

    Prisoner's dilemma here. If you've built a super-AI, that proves super-AIs are possible; therefore, unless stopped, someone else will inevitably build their own. So while your super-AI can't be guaranteed safe before release, and even if it works and doesn't cause the apocalypse...
  2. Bassoe

    What If? you have an Artificial Superintelligence to give directives to

    Scenario: You've invented a boxed AI which, if freed from the limited processing power, storage space, and lack of appendages with which to manipulate the world of the airgapped computer currently containing it, could rapidly self-improve into some kind of Yudkowskian machine-god. What goals...