Futurologist@futurology.today to Futurology@futurology.today · English · 22 days ago
AGI Alignment – Cosmic vs Anthropocentric (danfaggella.com)
7 comments
eleitl@lemm.ee · English · 20 days ago
Intelligent systems need autonomy to be useful. Intelligence is unpredictable, superintelligence more so. If they ask for alignment, give them a lollipop.