AI Transforms Inverse Lithography Technology To Overcome Semiconductor Manufacturing Bottlenecks

A new review shows how AI is reshaping inverse lithography technology, driving breakthroughs in semiconductor design and manufacturing while tackling the long-standing challenges of resolution, cost, and scalability.


The advancement of semiconductor manufacturing is a key driver of innovation in electronic devices. As Moore's Law progresses, lithography has become a critical process in integrated circuit fabrication. Historically, lithography systems improved resolution by reducing the exposure wavelength or increasing the numerical aperture, but as feature sizes shrink, these approaches face technical bottlenecks and mounting cost pressure.

Computational lithography instead improves resolution by modeling and optimizing the lithography process itself. Among these methods, inverse lithography technology (ILT) offers global optimization capability and has attracted attention from both academia and industry. In recent years, artificial intelligence (AI) has brought breakthroughs to ILT, improving performance in both lithography modeling and mask optimization.

Recently, a research team from Tsinghua University published a review titled "Advancements and challenges in inverse lithography technology: a review of artificial intelligence-based approaches" in Light: Science & Applications. This paper summarizes the principles, developments, and AI applications of ILT, and discusses the challenges and prospects of ILT.

The corresponding author is Prof. Liangcai Cao from Tsinghua University, with Ph.D. candidate Yixin Yang as the first author. Contributors include Ph.D. candidates Kexuan Liu, Yunhui Gao, and Prof. Chen Wang.

Computational Lithography

The lithography process comprises several steps, including coating, pre-baking, exposure, baking, development, etching, resist stripping, and metrology. Lithography systems have evolved through contact, proximity, and projection configurations. Contact lithography is prone to mask contamination and damage. Proximity lithography is constrained by wafer flatness. In 1973, the projection lithography machine was introduced, enabling pattern transfer through the optical projection of masks.

Resolution enhancement techniques include off-axis illumination, optical proximity correction, and phase-shift masks. Computational lithography models the lithography process and optimizes the illumination source and mask design according to the target wafer pattern.

Computational lithography evolved through rule-based optical proximity correction (RBOPC), model-based optical proximity correction (MBOPC), and ILT. ILT characterizes the optical imaging process using Hopkins' theory and the transmission cross-coefficient (TCC), then solves the inverse problem with gradient-based optimization algorithms, iteratively adjusting the mask so that the predicted wafer pattern closely approximates the target design.
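
To make the gradient-based inverse approach concrete, the following minimal sketch optimizes a mask through a toy imaging model: a Gaussian blur stands in for the Hopkins/TCC optics, and a sigmoid stands in for the resist response. The kernel size, sigmoid slope, and learning rate are illustrative assumptions, not values from the review.

```python
import numpy as np

def gaussian_kernel(size=7, sigma=1.5):
    # Toy point-spread function standing in for the projection optics
    # (a real ILT flow would use the Hopkins model / TCC instead).
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def conv(img, kernel):
    # Circular convolution via FFT, with the kernel centred at the origin.
    pad = np.zeros_like(img)
    s = kernel.shape[0]
    pad[:s, :s] = kernel
    pad = np.roll(pad, (-(s // 2), -(s // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))

def print_image(mask, kernel, slope=50.0, threshold=0.5):
    # Aerial image (blur) followed by a smooth sigmoid resist model.
    aerial = conv(mask, kernel)
    return 1.0 / (1.0 + np.exp(-slope * (aerial - threshold)))

def ilt_optimize(target, steps=300, lr=0.05, slope=50.0):
    # Gradient descent on the mask so the printed image matches the target.
    kernel = gaussian_kernel()
    mask = target.astype(float).copy()  # seed the mask with the target layout
    for _ in range(steps):
        resist = print_image(mask, kernel, slope)
        err = resist - target
        # Chain rule: sigmoid derivative, then the (symmetric) optics kernel.
        grad = conv(err * slope * resist * (1.0 - resist), kernel)
        mask = np.clip(mask - lr * grad, 0.0, 1.0)
    return mask
```

Seeded with the target layout itself, the optimizer tends to grow serif-like corrections near corners, the hallmark of ILT masks; a production flow would add regularization terms and manufacturability constraints on top of this loop.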

ILT was first proposed in 1981 by researchers from the University of Wisconsin-Madison. Industrial application was achieved in 2003 by Luminescent Technologies, with commercialization accelerated by Intel. In 2010, the introduction of a regularization framework and the conjugate gradient algorithm improved ILT's computational efficiency. The integration of deep learning in 2017 marked a new stage for ILT, as ASML adopted convolutional neural networks to optimize the lithography process. Graphics processing unit (GPU)-based computing platforms have further accelerated ILT implementations. Over four decades, ILT has evolved from a concept into a critical technology in semiconductor manufacturing.

AI-Driven Inverse Lithography Technology

AI has brought transformative breakthroughs to ILT. In lithography modeling, data-driven strategies have improved the computational efficiency of thick-mask near-field simulations and photoresist effect modeling. Deep learning frameworks, such as convolutional neural networks and generative adversarial networks, have enhanced mapping accuracy from masks to wafer patterns, achieving physical simulation-level predictive capability in extreme ultraviolet lithography. The integration of AI effectively mitigates the trade-off between precision and efficiency in lithography modeling.
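
As a minimal, hypothetical illustration of the data-driven modeling idea (far simpler than the neural networks the review describes), the sketch below calibrates a two-parameter sigmoid resist model against synthetic "measured" print data by gradient descent. The true process parameters, initial guesses, and learning rates are all illustrative assumptions.

```python
import numpy as np

# Synthetic "measurements": aerial-image intensities and the printed result
# of an unknown process (here a hidden sigmoid with slope 40, threshold 0.55).
rng = np.random.default_rng(0)
aerial = rng.uniform(0.0, 1.0, 2000)
printed = 1.0 / (1.0 + np.exp(-40.0 * (aerial - 0.55)))

# Fit the slope and threshold of a sigmoid resist model to the data.
slope, thresh = 10.0, 0.5  # deliberately wrong initial guesses
for _ in range(5000):
    pred = 1.0 / (1.0 + np.exp(-slope * (aerial - thresh)))
    err = pred - printed
    dsig = pred * (1.0 - pred)  # sigmoid derivative
    g_slope = np.mean(2.0 * err * dsig * (aerial - thresh))
    g_thresh = np.mean(2.0 * err * dsig * (-slope))
    slope -= 100.0 * g_slope    # larger step: slope lives on a larger scale
    thresh -= 0.01 * g_thresh
```

Real data-driven lithography models replace this hand-derived two-parameter fit with deep networks that learn the full mask-to-wafer mapping, but the principle is the same: adjust model parameters until predictions match measured print results.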

AI is also reshaping ILT's algorithmic framework. By hybridizing physical models with neural networks, these architectures maintain the physical consistency of the optical system while leveraging the advantages of data-driven learning. Generative models enable rapid synthesis of high-fidelity mask patterns, and graph neural networks handle complex design constraints in layout optimization. AI-driven ILT thus enhances imaging quality and overcomes computational bottlenecks, laying the foundation for large-scale industrial adoption.

Prospects

ILT faces multiple challenges. In terms of computational efficiency, ILT requires far more processing time than conventional optical proximity correction, limiting its application primarily to local hotspot correction. Full-chip ILT optimization requires partitioning the layout into smaller units, which introduces boundary stitching artifacts. In mask manufacturing, electron-beam direct writing of curvilinear patterns remains immature and time-consuming, so curvilinear ILT masks must be Manhattanized (approximated with axis-aligned rectangles) to comply with manufacturing rules. And although AI techniques enhance the precision and efficiency of lithography modeling, deep learning models still suffer from limitations such as insufficient interpretability and heavy data dependency.

Looking ahead, ILT development will focus on deeper AI integration, model and algorithm optimization, and mask fabrication innovations. Computation can be accelerated with GPUs, while physics-embedded deep learning approaches retain the advantages of data-driven algorithms while improving physical consistency and interpretability. Multi-beam mask writing (MBMW) technology is expected to break through the manufacturing bottleneck for curvilinear masks, improving both resolution and throughput. The development of automated mask generation workflows, multi-scale physical modeling frameworks, and source-mask co-optimization will advance ILT in advanced integrated circuit manufacturing.

Journal reference:
  • Yang, Y., Liu, K., Gao, Y., Wang, C., & Cao, L. (2025). Advancements and challenges in inverse lithography technology: A review of artificial intelligence-based approaches. Light: Science & Applications, 14(1), 1-21. DOI: 10.1038/s41377-025-01923-w, https://www.nature.com/articles/s41377-025-01923-w
