
NTT Co-authored Papers at NeurIPS to Advance Machine Learning Efficiency and Performance

December 3, 2020 8:07 AM EST

Ground-breaking Topics Include Neural Network Pruning, Meta Learning and Alternative Bayesian Model

PALO ALTO, Calif.--(BUSINESS WIRE)-- NTT Research, Inc., a division of NTT (TYO: 9432), NTT Communication Science Laboratories and NTT Software Innovation Center today announced that three papers co-authored by scientists from several of their divisions were selected (including one Spotlight paper) for this year’s NeurIPS 2020, the 34th annual conference of the Neural Information Processing Systems Foundation. A non-profit corporation that fosters the exchange of research on neural information processing systems in their biological, technological, mathematical and theoretical aspects, the NeurIPS Foundation will hold this year’s all-virtual conference on December 6-12. Its selection committee accepted 16 percent of the more than 12,000 abstract submissions it received, including the following three, which touch upon deep neural networks, theory and algorithms, deep learning and Bayesian modeling:

  • “Pruning Neural Networks Without any Data by Iteratively Conserving Synaptic Flow,” Hidenori Tanaka (NTT Research Physics & Informatics (PHI) Lab; Department of Applied Physics, Stanford), Daniel Kunin (co-first author, Institute for Computational & Mathematical Engineering, Stanford), Daniel L. Yamins (Departments of Psychology and Computer Science, Stanford), Surya Ganguli (Department of Applied Physics, Stanford). This paper, which builds upon another paper delivered at last year’s event, addresses a problem with existing algorithms for network pruning, the compression of neural networks by removing parameters (connections) between artificial neurons. Interest in pruning has grown due to its potential to save time, memory and energy during a neural network’s training and test phases. One problem with this technique, however, is layer collapse, which occurs when an entire layer is pruned, making the network untrainable. The answer proposed here is a new algorithm, Iterative Synaptic Flow Pruning (SynFlow), whose successive iterations allow pruning to go farther and reach higher levels of compression without layer collapse. “We have characterized and provably solved a key failure mode of existing neural network pruning algorithms and have taken a step toward achieving more efficient deep learning models,” said Hidenori Tanaka, a senior scientist in the NTT Research PHI Lab. A new work that builds on the findings of the pruning paper using theoretical tools from physics, titled “Neural Mechanics: Symmetry and Conservation Laws in Learning Dynamics,” will be presented at the NeurIPS 2020 Machine Learning and the Physical Sciences workshop.
  • “Meta-learning from Tasks with Heterogeneous Attribute Spaces,” Tomoharu Iwata (NTT Communication Science Laboratories) and Atsutoshi Kumagai (NTT Software Innovation Center). Standard machine-learning methods train networks on a specific task, so each new task requires another large amount of training data. One way around that time and cost is “few-shot learning,” a framework for learning from only a few examples. Existing few-shot models, however, assume that training and target tasks share the same attribute space, and methods that do handle heterogeneous attribute spaces have to date assumed only two tasks and required target data sets for training. This paper introduces a new few-shot learning method for tasks with heterogeneous attribute spaces. The model infers latent representations of each attribute and each response from a few labeled instances; responses of unlabeled instances are then predicted from the inferred representations using a prediction network. “Based on the knowledge learned from a wide variety of training datasets, the proposed method can quickly adapt to and learn new tasks even when their attribute spaces are different from the training datasets,” said Iwata, a distinguished researcher at NTT Communication Science Laboratories.
  • “Baxter Permutation Process,” Masahiro Nakano, Akisato Kimura, Takeshi Yamada, Naonori Ueda (NTT Communication Science Laboratories). A Bayesian nonparametric (BNP) model for Baxter permutations (BPs), termed the BP process (BPP), is proposed and applied to relational data analysis. “With our algorithm, we are able to find any combination of clusters with any sizes without any prior information, such as the number of clusters,” said Masahiro Nakano, a researcher at NTT Communication Science Laboratories. “Our achievement opens up new possibilities for applying Bayesian nonparametric machine learning, expected to be one of the next-gen machine learning technologies, to multi-dimensional — or even infinite-dimensional — data.” This paper was selected as a Spotlight paper and will be presented during an online session on December 10.
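The data-free pruning idea behind the first paper can be sketched in a few lines. The NumPy snippet below is an illustrative, simplified rendering of the SynFlow approach, not the authors' implementation: each weight is scored by its contribution to the total "synaptic flow" R = 1ᵀ|W_L|⋯|W_1|1 through the network of absolute-valued weight matrices, and the lowest-scoring weights are masked out gradually over many iterations along an exponential schedule rather than all at once. The function names and the schedule are assumptions made for the sketch.

```python
import numpy as np

def synflow_scores(weights):
    """Score each weight w by |w * dR/dw|, where R = 1^T |W_L| ... |W_1| 1
    is the total "synaptic flow" through the all-positive network."""
    A = [np.abs(W) for W in weights]      # W[l] has shape (n_out, n_in)
    f = [np.ones(A[0].shape[1])]          # f[l] = |W_l| ... |W_1| 1 (forward)
    for a in A:
        f.append(a @ f[-1])
    b = [np.ones(A[-1].shape[0])]         # b[l] = |W_{l+1}|^T ... |W_L|^T 1 (backward)
    for a in reversed(A[1:]):
        b.insert(0, a.T @ b[0])
    # dR/d|W_l|[i, j] = b[l][i] * f[l][j], so the score is elementwise:
    return [A[l] * np.outer(b[l], f[l]) for l in range(len(A))]

def synflow_prune(weights, compression, iters=100):
    """Iteratively mask the lowest-scoring weights, shrinking the kept set
    along an exponential schedule instead of pruning in one shot."""
    masks = [np.ones_like(W) for W in weights]
    total = sum(W.size for W in weights)
    for t in range(1, iters + 1):
        keep = int(total * compression ** (-t / iters))   # weights to keep
        scores = synflow_scores([W * m for W, m in zip(weights, masks)])
        flat = np.concatenate([s.ravel() for s in scores])
        thresh = np.sort(flat)[::-1][keep - 1]            # keep-th largest score
        masks = [(s >= thresh).astype(float) for s in scores]
    return masks
```

Because the per-layer score totals are conserved (each layer's scores sum to R), a layer with few surviving weights concentrates high scores on them, which is the intuition for why gradual pruning of this kind avoids wiping out an entire layer.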

“There is no better place to explore the overlap between machine learning and computational neuroscience than the annual NeurIPS event,” said Yoshihisa Yamamoto, PHI Lab Director. “We are excited to see the latest paper by Dr. Tanaka and his Stanford colleagues, as well as those by our colleagues at the NTT Software Innovation Center and NTT Communication Science Laboratories and expect the fields of neural networking and machine learning will benefit from the efficiencies and expanded capabilities that they are proposing.”

This year’s seven-day virtual NeurIPS event includes an expo, conference sessions, tutorials and workshops. The authors of these papers will participate in the event through poster and short recorded presentations. A follow-up to the “Pruning Neural Networks” paper, as noted above, will be presented at one of the event’s workshops. As an indication of the vitality of this sub-field of neuroscience, the event organizers noted a 40 percent year-over-year increase in the number of submitted abstracts, similar to the growth from 2018 to 2019. Papers in the areas of algorithms, deep learning and applications comprised 66 percent of the papers that were reviewed. Among this year’s keynote speakers are Christopher Bishop, director of the Microsoft Research Lab in Cambridge, England; Shafi Goldwasser, Director of the Simons Institute for the Theory of Computing; and Marloes Maathuis, Professor of Statistics at ETH (the Swiss Federal Institute of Technology) in Zurich.

About NTT Research

NTT Research opened its Palo Alto offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuroscience and photonics; 2) cryptographic and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.

NTT and the NTT logo are registered trademarks or trademarks of NIPPON TELEGRAPH AND TELEPHONE CORPORATION and/or its affiliates. All other referenced product names are trademarks of their respective owners. © 2020 NIPPON TELEGRAPH AND TELEPHONE CORPORATION

NTT Research:
Chris Shaw
Vice President, Global Marketing
NTT Research
+1-312-888-5412
[email protected]

Media:
Stephen Russell
Wireside Communications®
For NTT Research
+1-804-362-7484
[email protected]

Source: NTT Research, Inc.


