Nov. 29th, 2014 12:55 am

On AGI

In case anyone is interested: Nick Bostrom (yes, the same Bostrom of the simulation argument) published a new book two months ago.

And yes, it is no laughing matter anymore:

As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species then would come to depend on the actions of the machine superintelligence.

This is quite possibly the most important and most daunting challenge humanity has ever faced. And – whether we succeed or fail – it is probably the last challenge we will ever face.

http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111
