Turing Award honors a different kind of AI network with ‘Nobel Prize of computing’

1:14pm, 28th March, 2019
Facebook’s Yann LeCun, Mila’s Yoshua Bengio and Google’s Geoffrey Hinton share the 2018 Turing Award. (ACM Photos)

The three recipients of the Association for Computing Machinery’s 2018 Turing Award, known as the “Nobel Prize of computing,” are sharing the $1 million award for their pioneering work with artificial neural networks — but that’s not all they share. Throughout their careers, the researchers’ paths and spheres of influence in the field of artificial intelligence have crossed repeatedly.

Yann LeCun, vice president and chief AI scientist at Facebook, conducted postdoctoral research under the supervision of Geoffrey Hinton, who is now a vice president and engineering fellow at Google. LeCun also worked at Bell Labs in the early 1990s with Yoshua Bengio, who is now a professor at the University of Montreal, scientific director of Quebec’s Mila AI institute, and an adviser for Microsoft’s AI initiative. All three also participate in a research program sponsored by CIFAR, previously known as the Canadian Institute for Advanced Research.

In announcing the award, ACM credited the trio with rekindling the AI community’s interest in deep neural networks — thus laying the groundwork for today’s rapid advances in machine learning.

“Artificial intelligence is now one of the fastest-growing areas in all of science, and one of the most-talked-about topics in society,” said ACM President Cherri Pancake, a professor emeritus of computer science at Oregon State University. “The growth of and interest in AI is due, in no small part, to the recent advances in deep learning for which Bengio, Hinton and LeCun laid the foundation.”

And you don’t need to work in a lab to feel their impact. “Anyone who has a smartphone in their pocket can tangibly experience advances in natural language processing and computer vision that were not possible just 10 years ago,” Pancake said.

The current approach to machine learning, championed by Hinton starting in the early 1980s, shies away from telling a computer explicitly how to solve a given task, such as object classification. Instead, the software uses an algorithm to analyze the patterns in a data set, and then applies that algorithm to classify new data. Through repeated rounds of learning, the algorithm becomes increasingly accurate.

Hinton, LeCun and Bengio focused on developing neural networks to facilitate that learning. Such networks are composed of relatively simple software elements that are interconnected in ways inspired by the connections between neurons in the human brain.
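The award announcement stays at the level of ideas, but the learning loop described above, in which software analyzes patterns in labeled examples and grows more accurate over repeated rounds, can be sketched in a few lines. The following is a minimal illustration in Python with NumPy, not the laureates' actual code or any production system; the XOR-style toy data, layer sizes, learning rate and round count are all arbitrary choices made for the example.

```python
# A minimal sketch (not the laureates' systems): a tiny two-layer neural network
# of simple interconnected units that learns a toy classification task by
# repeatedly adjusting its weights to reduce its prediction error.
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: the label is 1 when the two inputs disagree (the XOR pattern),
# a task a single linear unit cannot solve but a small layered network can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights for two layers of "neuron"-like units with sigmoid activations.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate, chosen arbitrarily for this toy problem
for step in range(5001):
    # Forward pass: compute the network's current predictions.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: nudge every weight in the direction that reduces the error.
    grad_p = (p - y) * p * (1 - p)
    grad_W2 = h.T @ grad_p
    grad_b2 = grad_p.sum(axis=0, keepdims=True)
    grad_h = (grad_p @ W2.T) * h * (1 - h)
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0, keepdims=True)

    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

    if step % 1000 == 0:
        accuracy = ((p > 0.5) == y).mean()
        print(f"round {step:4d}  squared error {np.mean((p - y) ** 2):.4f}  accuracy {accuracy:.2f}")
```

Over enough rounds, the printed accuracy climbs toward 1.0, which is the "repeated rounds of learning" idea in miniature; nothing here reflects the scale or architecture of the deep networks the laureates pioneered.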
Microsoft’s quantum computing network takes one giant leap at Startup Summit

7:34pm, 28th February, 2019
Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires. (Microsoft Photo)

REDMOND, Wash. — Quantum computing may still be in its infancy — but the Microsoft Quantum Network is all grown up, fostered by in-house developers, research affiliates and future stars of the startup world. The network officially kicked off this week during a Startup Summit that laid out the company’s vision for quantum computing and introduced network partners to Microsoft’s tools of the quantum trade.

Quantum computing stands in contrast to the classical computer technologies that have held sway for more than a half-century. Classical computing is based on the ones and zeroes of bit-based processing, while quantum computing takes advantage of the weird effects of quantum physics. Quantum bits, or qubits, needn’t represent a one or a zero, but can represent multiple states during computation. (A simple simulation sketch of that idea appears at the end of this story.) The quantum approach should be able to solve computational problems that can’t easily be solved using classical computers, such as modeling molecular interactions or optimizing large-scale systems.

That could open the way to world-changing applications, said Todd Holmdahl, corporate vice president of Microsoft’s Azure Hardware Systems Group. “We’re looking at problems like climate change,” Holmdahl said. “We’re looking at solving big food production problems. We think we have opportunities to solve problems around materials science, personal health care, machine learning. All of these things are possible and obtainable with a quantum computer. We have been talking around here that we’re at the advent of the quantum economy.”

Todd Holmdahl, Microsoft corporate vice president for the Azure Hardware Systems Group, speaks during a Startup Summit kicking off the Microsoft Quantum Network. (Microsoft Photo)

Representatives from 16 startups were invited to this week’s Startup Summit, which features talks from Holmdahl and other leaders of Microsoft’s quantum team as well as demos and workshops focusing on Microsoft’s programming tools. (The closest startup to Seattle is based in Vancouver, B.C.)

Over the past year and a half, Microsoft has rolled out a programming language called Q# (“Q-sharp”) as part of its Quantum Development Kit, and has worked with researchers at Pacific Northwest National Laboratory and academic institutions around the world to lay the technical groundwork for the field. A big part of that groundwork is the development of a topological qubit, based on an architecture that builds error-correcting mechanisms right into the cryogenically cooled, nanowire-based hardware. Cutting down on the error-producing noise in quantum systems will be key to producing a workable computer. “We believe that our qubit equals about 1,000 of our competition’s qubits,” Holmdahl said.

There’s lots of competition in the quantum computing field nowadays: Other tech heavyweights are working on similar technologies for a universal quantum computer, while Canada’s D-Wave Systems is taking advantage of a more limited type of computing technology known as quantum annealing. This week, D-Wave announced a next-generation platform that it said would reduce quantum noise and more than double the qubit count of its existing platform, from 2,000 linked qubits to 5,000.

But the power of quantum computing shouldn’t be measured merely by counting qubits. The efficiency of computation and the ability to reduce errors can make a big difference, said Microsoft principal researcher Matthias Troyer. For example, a standard approach to simulating the molecular mechanism behind nitrogen fixation for crops could require 30,000 years of processing time, he said.
But if the task is structured to enable parallel processing and enhanced error correction, the required runtime can be shrunk to less than two days. “Quantum software engineering is really as important as the hardware engineering,” Troyer said.

Julie Love, director of Microsoft Quantum Business Development, talks about the promise of quantum computing at a Startup Summit on the Microsoft campus. (GeekWire Photo / Alan Boyle)

Julie Love, director of Microsoft Quantum Business Development, said that Microsoft will start out offering quantum computing through its Azure cloud-based services. Not all computational problems are amenable to the quantum approach: It’s much more likely that an application will switch between classical and quantum processing — and therefore, between classical tools such as the C# programming language and quantum tools such as Q#.

“When you work in chemistry and materials, all of these problems, you hit this ‘known to be unsolvable’ problem,” Love said. “Quantum provides the possibility of a breakthrough.”

Love shies away from giving a firm timetable for the emergence of specific applications — but last year, Holmdahl predicted that commercial quantum computers would exist within five years. (Check back in 2023 to see how the prediction panned out.)

The first applications could well focus on simulating molecular chemistry, with the aim of prototyping better pharmaceuticals, more efficient fertilizers, better batteries, more environmentally friendly chemicals for the oil and gas industry, and a new class of high-temperature superconductors. It might even be possible to address the climate change challenge by custom-designing materials that pull excess carbon dioxide out of the air.

Love said quantum computers would also be well-suited for addressing optimization problems, like figuring out how to make traffic flow better through Seattle’s urban core, and for reducing the training time required for AI modeling. “That list is going to continue to evolve,” she said.

Whenever the subject of quantum computing comes up, cryptography has to be mentioned as well. It’s theoretically possible for a quantum computer to break the codes that currently protect all sorts of secure transactions, ranging from email encryption to banking protocols. Love said those code-breaking applications are farther out than other likely applications, due to the huge amount of computing resources that would be required even for a quantum computer. Nevertheless, it’s not too early to be concerned. “We have a pretty significant research thrust in what’s called post-quantum crypto,” she said.

Next-generation data security is one of the hot topics addressed in the National Quantum Initiative Act, which was approved by Congress and the White House last December. Love said Microsoft’s post-quantum cryptography proposals have already gone through an initial round of vetting by the National Institute of Standards and Technology. “We’ve been working at this in a really open way,” she said.

Like every technology, quantum computing is sure to have a dark side as well as a bright side. But it’s reassuring to know that developers are thinking ahead about both sides.
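As noted above, a qubit needn’t settle on a one or a zero until it is measured. That idea can be illustrated numerically without any quantum hardware. The sketch below uses plain Python and NumPy rather than Microsoft’s Q# toolchain; the choice of a Hadamard gate and the sample count are illustrative assumptions, not details drawn from the article.

```python
# A minimal sketch: simulate a single qubit pushed into an equal superposition
# and then measured many times. Plain NumPy only; no quantum SDK involved.
import numpy as np

rng = np.random.default_rng(42)

# A qubit's state is a pair of amplitudes; the |0> state is represented as [1, 0].
zero_state = np.array([1.0, 0.0])

# The Hadamard gate turns |0> into an equal mix of |0> and |1>.
hadamard = np.array([[1.0,  1.0],
                     [1.0, -1.0]]) / np.sqrt(2.0)

state = hadamard @ zero_state          # amplitudes are now [0.707..., 0.707...]
probabilities = np.abs(state) ** 2     # Born rule: |amplitude|^2 gives the odds

# Measurement forces a definite outcome; repeated runs return 0 roughly half the time.
samples = rng.choice([0, 1], size=1000, p=probabilities)
print("amplitudes:", state)
print("probabilities of measuring 0 and 1:", probabilities)
print("fraction of 1s over 1,000 simulated measurements:", samples.mean())
```

This classical simulation tracks both amplitudes explicitly, which is exactly what becomes infeasible at scale: the amplitude array doubles with every added qubit, which is why problems such as molecular simulation are pitched as quantum-native.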