Analytics for Neural Networks


About Analytics4NN

This project tackles the data challenge of enabling efficient neural network architecture search on next-generation supercomputers by:
Defining methods that increase throughput in neural network architecture search by enabling rapid and flexible early termination of training across fitness measurements, datasets, and problems
Designing the building blocks that transform existing neural network architecture search implementations from tightly coupled, monolithic software tools that embed both search and prediction into flexible, modular, and reusable workflows in which search and prediction are decoupled
Generating a searchable and reusable neural network data commons that shares the full provenance of diverse neural networks across the generation, training, and validation stages
Developing new curriculum material, online courses, and online training material that target neural network analysis and efficient architecture search, and that are designed to broaden participation in the HPC and AI communities
The project's methods, workflows, and neural network data commons support users in studying a large and diverse suite of neural networks and in connecting those neural networks with the scientific knowledge embedded in real datasets. Users across scientific domains can deploy our products to study the fitness curves of neural networks, design early termination criteria (sketched below), reconstruct neural network models, and explain and reproduce AI results.
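The early-termination and decoupled-prediction ideas above can be illustrated with a short, hypothetical Python sketch. It is not the project's PEng4NN implementation: the parametric curve, the thresholds, and names such as predict_final_accuracy and should_stop are illustrative assumptions. The sketch fits a saturating curve to the partial validation-accuracy history reported by a training job and stops training once the predicted plateau stabilizes.

```python
# Hypothetical sketch of a decoupled fitness-prediction engine for early
# termination in neural architecture search (illustrative only; not the
# project's PEng4NN implementation).
import numpy as np
from scipy.optimize import curve_fit


def saturating_curve(epoch, a, b, c):
    # Assumed parametric form: validation accuracy rises toward an asymptote a.
    return a - b * np.exp(-c * epoch)


def predict_final_accuracy(accuracy_history):
    # Fit the curve to the observed partial history and return the
    # predicted converged accuracy (the asymptote a).
    epochs = np.arange(1, len(accuracy_history) + 1, dtype=float)
    (a, _b, _c), _ = curve_fit(
        saturating_curve, epochs, np.asarray(accuracy_history, dtype=float),
        p0=[accuracy_history[-1], 0.5, 0.1], maxfev=5000)
    return a


def should_stop(accuracy_history, window=3, tolerance=0.005):
    # Terminate a candidate's training early once successive predictions of
    # its final accuracy agree within `tolerance` over the last `window`
    # fits (window and tolerance are illustrative assumptions).
    if len(accuracy_history) < window + 3:
        return False
    estimates = [predict_final_accuracy(accuracy_history[:k])
                 for k in range(len(accuracy_history) - window + 1,
                                len(accuracy_history) + 1)]
    return max(estimates) - min(estimates) < tolerance
```

Because such an engine only consumes the accuracy history a training job reports, it can sit behind any search strategy, which is the kind of decoupling the second goal above refers to; the predicted final accuracy can then be returned to the search as the candidate's fitness.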

Selected Publications

Ariel Keller Rorabaugh, Silvina Caino-Lores, Travis Johnston, and Michela Taufer. High Frequency Accuracy and Loss Data of Random Neural Networks Trained on Image Datasets. Data in Brief, Elsevier, 40:107780, 2022. [link]
Ariel Keller Rorabaugh, Silvina Caino-Lores, Travis Johnston, and Michela Taufer. Building High-throughput Neural Architecture Search Workflows via a Decoupled Fitness Prediction Engine. IEEE Transactions on Parallel and Distributed Systems (TPDS), 33(11):2913–2926, 2022. [link]
Paula Olaya, Silvina Caino-Lores, Vanessa Lama, Ria Patel, Ariel Rorabaugh, Osamu Miyashita, Florence Tama, and Michela Taufer. Identifying Structural Properties of Proteins from X-ray Free Electron Laser Diffraction Patterns. In Proceedings of the 18th IEEE International Conference on e-Science (eScience), pages 1–10, Salt Lake City, Utah, USA, October 2022. IEEE Computer Society. [link]
Ria Patel, Ariel Keller Rorabaugh, Paula Olaya, Silvina Caino-Lores, Georgia Channing, Catherine Schuman, Osamu Miyashita, Florence Tama, and Michela Taufer. A Methodology to Generate Efficient Neural Networks for Classification of Scientific Datasets. In Proceedings of the 18th IEEE International Conference on e-Science (eScience), pages 1–2, Salt Lake City, Utah, USA, October 2022. IEEE Computer Society. (Short paper). [link]
Ariel Keller Rorabaugh, Silvina Caino-Lores, Michael R. Wyatt II, Travis Johnston, and Michela Taufer. PEng4NN: An Accurate Performance Estimation Engine for Efficient Automated Neural Network Architecture Search. CoRR, abs/2101.04185, 2021. [link]

Ria Patel

Undergraduate Research Assistant at University of Tennessee, Knoxville