Machine Learning Applications in Electronic Design Automation

Hardcover (1st ed. 2022)

This book serves as a single-source reference to key machine learning (ML) applications and methods in digital and analog design and verification. Experts from academia and industry cover a wide range of the latest research on ML applications in electronic design automation (EDA), including analysis and optimization of digital and analog designs, functional verification, FPGA and system-level design, design for manufacturing (DFM), and design space exploration. The authors also cover key ML methods: classical ML; deep learning models such as convolutional neural networks (CNNs), graph neural networks (GNNs), and generative adversarial networks (GANs); and optimization methods such as reinforcement learning (RL) and Bayesian optimization (BO). All of these topics are valuable to chip designers, EDA developers, and researchers working in digital and analog design and verification.

Product Details

ISBN-13: 9783031130731
Publisher: Springer International Publishing
Publication date: 12/23/2022
Edition description: 1st ed. 2022
Pages: 583
Product dimensions: 6.10(w) x 9.25(h) x (d)

About the Author

Haoxing Ren (Mark) was born in Nanchang, China, in 1976. He received BS degrees in Electrical Engineering and in Finance, and an MS degree in Electrical Engineering, from Shanghai Jiao Tong University, China, in 1996 and 1999, respectively; an MS in Computer Engineering from Rensselaer Polytechnic Institute in 2000; and a PhD in Computer Engineering from the University of Texas at Austin in 2006. From 2000 to 2015, he worked at IBM Microelectronics and, after 2006, at the Thomas J. Watson Research Center, developing physical design and logic synthesis tools and methodologies for IBM microprocessor and ASIC designs. He received several IBM technical achievement awards, including the IBM Corporate Award for his work on improving microprocessor design productivity. After his 15-year tenure at IBM, he had a brief stint as a technical executive at a chip design start-up developing server-class CPUs based on IBM OpenPOWER technology. In 2016, Mark joined NVIDIA Research, where he currently leads the Design Automation research group, whose mission is to improve the quality and productivity of chip design through machine learning and GPU-accelerated tools. He has published many papers in the field of design automation, including several book chapters on logic synthesis and physical design. He received best paper awards from the International Symposium on Physical Design (ISPD) in 2013, the Design Automation Conference (DAC) in 2019, and IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems in 2021.

Jiang Hu received the B.S. degree in optical engineering from Zhejiang University, China, in 1990, the M.S. degree in physics in 1997, and the Ph.D. degree in electrical engineering from the University of Minnesota in 2001. He worked at IBM Microelectronics from January 2001 to June 2002.

In 2002 he joined the electrical engineering faculty at Texas A&M University. His research interests include design automation of VLSI circuits and systems, computer architecture, hardware security, and machine learning applications. His honors include best paper awards at the ACM/IEEE Design Automation Conference in 2001, the IEEE/ACM International Conference on Computer-Aided Design in 2011, the IEEE International Conference on Vehicular Electronics and Safety in 2018, and the IEEE/ACM International Symposium on Microarchitecture in 2021, as well as an IBM Invention Achievement Award in 2003. He has served as a technical program committee member for DAC, ICCAD, ISPD, ISQED, ICCD, DATE, ISCAS, ASP-DAC, and ISLPED, and was the general chair of the 2012 ACM International Symposium on Physical Design. He has served as an associate editor for IEEE Transactions on CAD and ACM Transactions on Design Automation of Electronic Systems. He received the Humboldt Research Fellowship in 2012 and was named an IEEE Fellow in 2016.

Table of Contents

1. Introduction
2. Analysis of Digital Design: Routability Optimization for Industrial Designs at Sub-14nm Process Nodes Using Machine Learning
3. RouteNet: Routability Prediction for Mixed-size Designs Using Convolutional Neural Network
4. High Performance Graph Convolutional Networks with Applications in Testability Analysis
5. MAVIREC: ML-Aided Vectored IR-Drop Estimation and Classification
6. GRANNITE: Graph Neural Network Inference for Transferable Power Estimation
7. Machine Learning-Enabled High-Frequency Low-Power Digital Design Implementation at Advanced Process Nodes
8. Optimization of Digital Design: Chip Placement with Deep Reinforcement Learning
9. DREAMPlace: Deep Learning Toolkit-Enabled GPU Acceleration for Modern VLSI Placement
10. TreeNet: Deep Point Cloud Embedding for Routing Tree Construction
11. Asynchronous Reinforcement Learning Framework for Net Order Exploration in Detailed Routing
12. Standard Cell Routing with Reinforcement Learning and Genetic Algorithm in Advanced Technology Nodes
13. PrefixRL: Optimization of Parallel Prefix Circuits using Deep Reinforcement Learning
14. GAN-CTS: A Generative Adversarial Framework for Clock Tree Prediction and Optimization
15. Analysis and Optimization of Analog Design: Machine Learning Techniques in Analog Layout Automation
16. Layout Symmetry Annotation for Analog Circuits with Graph Neural Networks
17. ParaGraph: Layout Parasitics and Device Parameter Prediction Using Graph Neural Networks
18. GCN-RL Circuit Designer: Transferable Transistor Sizing with Graph Neural Networks and Reinforcement Learning
19. Parasitic-Aware Analog Circuit Sizing with Graph Neural Networks and Bayesian Optimization
20. Logic and Physical Verification: Deep Predictive Coverage Collection / Dynamically Optimized Test Generation Using Machine Learning
21. Novelty-Driven Verification: Using Machine Learning to Identify Novel Stimuli and Close Coverage
22. Using Machine Learning Clustering to Find Large Coverage Holes
23. GAN-OPC: Mask Optimization with Lithography-Guided Generative Adversarial Nets
24. Layout Hotspot Detection with Feature Tensor Generation and Deep Biased Learning