Microsoft and OpenAI announced a partnership last year to develop new artificial intelligence technologies, and Microsoft just revealed the first product of this deal: a massively powerful supercomputer. The system is one of the top five most powerful computers in the world, and it’s exclusively for training AI models. The companies hope this supercomputer will be able to create more human-like intelligences. We just hope those intelligences will not exterminate humanity.
Microsoft didn’t say exactly where its new Azure-hosted supercomputer ranks on the TOP500 list, just that it’s in the top five. Based on the last list update in November 2019, that means Microsoft’s system is capable of at least 38,745 teraflops (trillions of floating-point operations per second), the peak speed of the fifth-ranked University of Texas Frontera supercomputer. It could reach as high as 100,000 teraflops without moving up the list, though, since there’s a big gap between numbers four and five.
While we don’t have measurements of its raw computing power, Microsoft was happy to talk about all the hardware inside its new supercomputer. There are 285,000 CPU cores, which sounds like a lot. However, that’s fewer than any of the supercomputers currently in the top five. Microsoft’s AI computer also sports 10,000 GPUs and 400 gigabits per second of network bandwidth for each GPU server.
You might know OpenAI from its work on GPT-2, the fake news bot the company initially deemed too dangerous to release. OpenAI built GPT-2 with a technique called self-supervised learning, and it will be able to do much more of that with the new supercomputer. In self-supervised learning (sometimes called unsupervised learning), a model derives its own training signal from large amounts of unlabeled data, and humans can then make adjustments to the resulting model. This approach has the potential to create much more nuanced and effective AI, but it takes an enormous amount of processing power.
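The core idea of self-supervised learning, stripped of neural networks, can be shown in a few lines: the "label" for each word is simply the next word in the unlabeled text itself, so no human annotation is required. This is a toy sketch, not how GPT-2 actually works:

```python
from collections import Counter, defaultdict

# Toy self-supervised "language model": the training signal (the next
# word) comes straight from the unlabeled text -- no human labels needed.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in training, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "cat", the most frequent follower here
```

GPT-2 replaces the bigram table with a transformer network holding billions of parameters, but the training signal is generated the same self-supervised way.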
We can only guess at what OpenAI will be able to develop with one of the world’s fastest supercomputers at its disposal. Microsoft and OpenAI believe that a powerful computer using techniques like self-supervised and reinforcement learning can learn to do anything a human can do — it’s just a matter of time and scale. In a human brain, there are trillions of synapses carrying the electrical impulses that create conscious thought. In AI, the rough equivalent is a parameter. The largest current model, Microsoft’s Turing-NLG, has about 17 billion parameters, and the companies think parameter counts will reach into the trillions very soon.
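To get a feel for where those billions of parameters come from, a common back-of-the-envelope rule puts a GPT-style transformer at roughly 12 × layers × hidden-size² weights (ignoring embeddings). As a sanity check, plugging in GPT-2 XL’s published shape of 48 layers and a hidden size of 1,600 lands close to its widely cited 1.5-billion-parameter figure:

```python
# Rough parameter estimate for a GPT-style transformer: about
# 12 * n_layers * d_model**2 weights in the attention and feed-forward
# blocks (a common approximation that ignores embedding tables).
def approx_params(n_layers, d_model):
    return 12 * n_layers * d_model ** 2

# GPT-2 XL's published configuration: 48 layers, hidden size 1600.
print(f"{approx_params(48, 1600):,}")  # 1,474,560,000 -- close to 1.5B
```

Scaling that formula into the trillions means far deeper and wider networks, which is exactly the kind of training workload the new supercomputer is built for.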
Now read:
- OpenAI Releases Fake News Bot It Previously Deemed Too Dangerous
- Nvidia Announces Company-Wide Raises, Promises No Layoffs During Pandemic
- Sony, Microsoft Partner to Deploy AI Analytics in New Image Sensors