UGOTAG | Videos with Chapter Markers  
  

Episode Markers
  • 00:02
    #Professor Nick Bostrom
    MODERATOR: Today with us we have Professor Nick Bostrom.
  • 01:14
    #The Future of Humanity Institute
    So I run this thing called The Future of Humanity Institute, which has a very sort of ambitious name.

Nick Bostrom: "Superintelligence" | Talks at Google

Superintelligence asks the questions: What happens when machines surpass humans in general intelligence? Will artificial agents save or destroy us? Nick Bostrom lays the foundation for understanding the future of humanity and intelligent life. The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. If machine brains surpassed human brains in general intelligence, then this new superintelligence could become extremely powerful - possibly beyond our control. As the fate of the gorillas now depends more on humans than on the species itself, so would the fate of humankind depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed Artificial Intelligence, to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation?
Community tags: big_thoughts