Terrence Zavecz | Lehigh University
Papers by Terrence Zavecz
Proceedings of SPIE, Jul 1, 1991
ABSTRACT A new figure of merit, the critical dimension capability factor, or CDC, is described. The CDC incorporates measurements taken over a range of linewidths and over a range of process variations which simulate normal and extreme process operating conditions. Under these conditions the CDC uniquely quantifies the capability of the measurement instrument on a given substrate and for a given set of parameter settings. CDC is calculated by performing a linear regression between measurements generated by the instrument under test (IUT) and a set of reference values (internally generated standard values). The mean square error (MSE) between the regression line and the observed values is then partitioned into components which estimate the contribution to the MSE from various sources based on a rigorous statistical analysis. The final CDC value is defined as the linewidth-to-uncertainty ratio and is a function of the uncertainty introduced in the characterization procedure as well as the uncertainty introduced when the IUT makes a measurement in practice. Since the CDC is a function of the overall uncertainty in the measurements of the IUT relative to the reference values, it can legitimately be compared from one instrument to another and used to evaluate alternative measurement methods and technologies.
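The CDC calculation outlined above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it pools the MSE into a single uncertainty term rather than partitioning it by source as the paper describes, and all names and numbers are made up.

```python
import numpy as np

def cdc(reference_nm, measured_nm):
    """Regress IUT readings against reference linewidths and return
    (slope, intercept, CDC), with CDC a linewidth-to-uncertainty ratio."""
    ref = np.asarray(reference_nm, dtype=float)
    meas = np.asarray(measured_nm, dtype=float)
    slope, intercept = np.polyfit(ref, meas, 1)   # ordinary least squares
    residuals = meas - (slope * ref + intercept)
    mse = np.mean(residuals ** 2)                 # pooled MSE (the paper partitions this)
    return slope, intercept, np.mean(ref) / np.sqrt(mse)

# Illustrative data: a nearly linear instrument with sub-nanometer noise.
ref = np.array([350.0, 400.0, 450.0, 500.0, 550.0])
meas = ref * 1.02 + 5.0 + np.array([0.5, -0.4, 0.3, -0.2, -0.2])
slope, intercept, score = cdc(ref, meas)          # high score = capable instrument
```

A noisy instrument inflates the residual MSE and drives the ratio down, which is what makes a figure of this form comparable from one instrument to another.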
Proceedings of SPIE, Jun 7, 1996
ABSTRACT In this study we evaluate the focus budget of an i-line stepper, examining the sources of focus erosion for a 0.40 micrometer process. The analysis first examines the best focus of the system as predicted by the several common tools currently used in the industry. Using the overlay focus monitor, we then determine the value of lens aberrations such as astigmatism and field curvature. The effects of lens heating are also examined for the lens using these common techniques. A model describing the focus aberrations is then developed and applied to the data. This model uses data derived from the focus monitor to determine lens errors such as coma, astigmatism and field curvature. Using this model, data gathered over various numerical aperture and partial coherence values are evaluated to determine the variation of lens aberrations over their range and the depth of focus. Finally, data consisting of critical dimension information gathered using a commercial, automated SEM is used to validate the predictions of the focus model.
Proceedings of SPIE, Jun 1, 1992
ABSTRACT The considerations which drive an expert system for assisting in measurement system characterization are described. The expert system employs several novel techniques for evaluating the integrity of a characterization analysis by determining the degree to which critical assumptions are satisfied and flagging weak points in the data collection or analysis procedure. The properties of good characterization sampling plans are derived. Methods for formulating reliable characterization studies are described. The paper focuses on short-term studies intended for equipment comparisons and calibrations; however, with minor alterations the approach can be expanded to include longer-term stability studies.
Proceedings of SPIE, Jul 7, 1997
ABSTRACT Exposure tool optimization in process development today extends beyond the classic concepts of exposure and focus setting. The lithographer must understand and tune the system for critical feature performance using variables such as Numerical Aperture (NA), Partial Coherence (PC), and critical-level tool matching. In a previous study, the authors demonstrated that the phase-shift focal plane monitor (PSFM) accurately measures focal plane variations when appropriate calibrations are employed. That paper also described the development of a model for classic aberrations such as astigmatism, field curvature and coma. The model considered geometrical (Seidel) aberrations with radial symmetry across the image plane as being the primary contributor to CD variation across the stepper image plane. The publication correlated image plane focal results to an approximation of the stepper's Critical Dimension (CD) behavior for a matrix of NA and PC settings. In this study, we continue the analysis of the focus budget and CD uniformity using two generations of optical steppers in a 0.35 micrometer process. The analysis first addresses questions involving the use of the PSFM, including the variation of calibration across the exposure field and the advantages of using field-center or full-field calibrations. We describe a method of easily measuring the uniformity of NA and PC across the exposure field. These new tools are then applied as an aid in lens image field characterization and tool-to-tool matching. The information gathered is then applied to measure image aberrations and predict CD variation across the image under various conditions of focus. The predictions are validated by a comparison against CD uniformity as measured by a commercial Scanning Electron Microscope.
The present work confirmed previous results and recent assumptions that the Zernike diffraction theory of aberrations is most appropriate for current stepper lenses, with local image-plane focal variations across the entire field being the major contributor to field CD variations.
Proceedings of SPIE, Jun 2, 2000
An advanced control system providing modeling and predictive data simulation for pass-fail criteria of overlay production control has been used in 0.18 micrometer Design Rule production facilities for over a year. During this period overlay was measured on both product wafers and during periodic process qualification tests. The resulting raw data is modeled using exposure-tool-specific and layer-focused models. Modeled results, measured process statistics and tool signatures are combined in a real-time simulation to calculate the true overlay distribution over the entire wafer and lot. All results and raw data are automatically gathered and stored in a database for on-going analysis. In this manner, tool, product technology and process performance data are gathered for every overlay process-step. The data provides valuable insights into not only tool stability but also the process-step characteristic errors that contribute to the overlay spectrum of distortions. Data gathered in this manner is very stable and can be used to predict a feed-forward correction for all correctable coefficients. The technique must take into consideration algorithm-modeled coefficient variations resulting from: (1) Reticle pattern-to-alignment mark design errors. (2) Process film variations. (3) Tool-to-tool static matching. (4) Tool-to-tool dynamic matching errors which are match-residual, process or time induced. This extensive database has resulted in a method of conducting Predictive Process Control (PPC) for overlay lithography within an advanced semiconductor line. Using PPC the wafer production facility experiences: (1) Improved Yield: Lots are always exposed with optimum setup. Optimized setups reduce rework levels and therefore wafer handling. (2) Capacity Improvement: Elimination of rework tacitly improves capacity in the facility.
WIP is also simplified because lots do not have to wait for a dedicated exposure tool to become available. (3) Dynamic Matching™: Matching of multiple exposure tools is continuously monitored by the use of the feedback loop. Tool precision can be monitored as well as the setup systematic offsets. In this manner, the need to remove an exposure tool from production for match-maintenance can be predicted and scheduled. Residual matching errors can also be removed from the production cycle. The benefits of full production lot modeling and the contributors to production errors are presented. Process and tool interactions as well as control-factor coefficient stability indicate the level of control to be well beyond manual methods. Calculations show that these contributors are predictable, stable and are a necessary tool for competitive sub-0.2 micron production. An analysis of the overlay error sources within two facilities results in consistent facility process response and a well-defined error budget derivation. From this analysis, the control added to semiconductor overlay is shown capable of extending mix-and-match exposure tool operations in production down to 0.12 micrometer design rules.
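The feed-forward correction of "correctable coefficients" described above rests on fitting a linear grid model to measured overlay errors and negating the fitted terms for the next lot. A minimal sketch under assumed conditions follows: a six-parameter translation/scale/rotation model with illustrative names and units, whereas the production models in the paper also carry per-tool and per-layer signatures.

```python
import numpy as np

def fit_grid_model(x, y, dx, dy):
    """Least-squares fit of dx = Tx + Mx*x - theta*y and dy = Ty + My*y + theta2*x.
    x, y are site coordinates (mm); dx, dy are overlay errors (um)."""
    Ax = np.column_stack([np.ones_like(x), x, -y])   # columns: Tx, Mx, theta
    Ay = np.column_stack([np.ones_like(y), y, x])    # columns: Ty, My, theta2
    cx, *_ = np.linalg.lstsq(Ax, dx, rcond=None)
    cy, *_ = np.linalg.lstsq(Ay, dy, rcond=None)
    return cx, cy

# Synthetic wafer: 10 nm x-offset, 5 ppm scale, 2 urad rotation
# (0.010 um, 0.005 um/mm, 0.002 um/mm respectively).
rng = np.random.default_rng(0)
x = rng.uniform(-100.0, 100.0, 50)                   # site coordinates, mm
y = rng.uniform(-100.0, 100.0, 50)
dx = 0.010 + 0.005 * x - 0.002 * y
dy = -0.004 + 0.005 * y + 0.002 * x
cx, cy = fit_grid_model(x, y, dx, dy)
feed_forward_x = -cx                                 # negated modeled terms become the correction
```

The stability of these fitted coefficients from lot to lot is precisely what the signatures database exploits: a stable signature can be fed forward before exposure rather than corrected after measurement.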
Proceedings of SPIE, Mar 16, 2007
This paper was published in SPIE Vol 6520-139 and is made available as an electronic reprint or preprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes or modification of the content of the paper are prohibited.
Proceedings of SPIE, Aug 22, 2001
... Also, simple feed-forward techniques are not applicable for logic and ASIC production due to many different products, lithography chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are ...
Proceedings of SPIE, 1987
The physical concepts and yield advantages of pellicle protected masks have been reported in many papers. Pellicle disadvantages such as fragility and the hazard of particles trapped under the membrane have been greatly outweighed by the observed chip yield increase. Yet the pellicle provides no protection against static discharge, the dislodgement of previously electrostatically fixed particles, pellicle degradation and surface etch faults.
Optical Microlithography XVIII, May 12, 2005
It is commonly reported that a difference exists between directly measured reticle feature dimensions and those produced in the final lithographic image. Quantifying this mask error function (MEF) and the sources of the perturbation has been the topic of many papers over the past several years. Past studies have been content to evaluate these functions by statistical averaging, thereby neglecting the potential influence of process and exposure contributions. The material presented here represents the findings of an extensive study of reticle-process interactions. Phase I of the evaluation consisted of focus and dose exposures of the reticle and subsequent modeling of the full-profile response. This analysis provided extensive information on the optimum printed feature profiles while removing the contribution of across-field focus variations. The reticle was directly characterized using both a conventional SEM and a new Nanometrics OCD Scatterometer technique. The full-field modeled response surfaces of the directly measured feature characteristics are then used to calculate the across-field MEF and provide an improved estimate of the true response of the feature to exposure. Phase II of the analysis turns its attention to characterization of the full-wafer process response. Both the modeled and directly measured reticle surfaces were removed from scatterometry-measured full-wafer exposures. Normal process variations consisting of photoresist and ARC thickness volatility are next used to show the response of the printed feature. Finally a summary of the relative contribution of each process perturbation to the feature profile error budget is discussed.
Proceedings of SPIE, May 17, 2005
Most process window analysis applications are capable of deriving the functional focus-dose workspace available to any set of device specifications. Previous work in this area has concentrated on calculating the superpositioned optimum operating points of various combinations of feature orientations or feature types. These studies invariably result in an average performance calculation that is biased by the impact of the substrate, reticle and exposure tool contributed perturbations. Many SEMs and optical metrology tools now provide full-feature profile information for multiple points in the exposure field. The inclusion of field spatial information into the process window analysis results in a calculation of greater accuracy and process understanding because now the capabilities of each exposure tool can be individually modeled and optimized. Such an analysis provides the added benefit that after the exposure tool is characterized, its process perturbations can be removed from the analysis to provide greater understanding of the true process performance. Process window variables are shown to vary significantly across the exposure field of the scanner. Evaluating the depth-of-focus and optimum focus-dose at each point in the exposure field yields additional information on the imaging response of the reticle and scan-linearity of the exposure tool's reticle stage. The optimal focus response of the reticle is then removed from a full wafer exposure and the results are modeled to obtain a true process response and performance.
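As an illustration of the per-field-point evaluation described above, the sketch below fits CD versus focus at fixed dose to a parabola (a Bossung-curve cross-section), takes best focus at the vertex, and reports depth of focus as the focus range over which CD stays within a tolerance of target. This is a generic textbook construction with assumed numbers, not the paper's algorithm.

```python
import numpy as np

def best_focus_and_dof(focus_um, cd_nm, target_nm, tol_nm):
    """Quadratic fit of CD(focus); returns (best focus, depth of focus)."""
    a, b, c = np.polyfit(focus_um, cd_nm, 2)
    best = -b / (2.0 * a)                        # vertex of the parabola
    edges = []
    for lim in (target_nm - tol_nm, target_nm + tol_nm):
        for r in np.roots([a, b, c - lim]):      # focus values where CD crosses each limit
            if abs(r.imag) < 1e-9:
                edges.append(r.real)
    dof = max(edges) - min(edges) if len(edges) >= 2 else 0.0
    return best, dof

# Illustrative field point: CD = 350 - 100*(f - 0.05)^2 nm, target 350 +/- 10 nm.
focus = np.linspace(-0.3, 0.4, 15)
cd = 350.0 - 100.0 * (focus - 0.05) ** 2
best, dof = best_focus_and_dof(focus, cd, target_nm=350.0, tol_nm=10.0)
```

Repeating a fit of this kind at every measured field point yields the across-field maps of best focus and depth of focus from which reticle and stage contributions can be separated.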
Proceedings of SPIE, Jul 19, 1989
ABSTRACT Step and repeat camera optical systems today exhibit greater resolution, numerical aperture and field sizes than in the past. It has become necessary to control lens distortion and matching errors to less than one-tenth micron across the entire usable field. The quality of the optics has risen to the point that fifth-order distortion modeling has become a necessary component of the stepper matching procedure. Methods of data gathering and analysis vary significantly in the industry. In this paper we investigate four methods of overlay measurement including electrical, optical coherence probe, automated optical and optical vernier techniques. Precision-to-tolerance ratios and throughput for the measurement methods are discussed. Three methods of data analysis are compared including KLASS II for KLA 2020* data, EM1** for electrical data and SASO*** for both. A novel multi-substrate calibration technique is presented. In any matching situation the required sample size for accurate estimation of the lens components is important. Our study reviews the results of analyses of variances due to daily repeatability, wafer films, the number of measurement sites on each wafer and stepped field. Two methods of artifact generation were investigated, that of stage-referenced matching and matching to a 'golden standard'. Finally, the expansion of the model to include seventh-order distortions and the significance of this for the now emerging high numerical aperture, large-field g-line lens designs is discussed.
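Fifth-order distortion modeling of the kind mentioned above is commonly expressed as an odd-power radial polynomial fitted by least squares, and extending to seventh order just appends an r^7 column. The sketch below is a generic illustration with made-up coefficients, not the paper's matching procedure.

```python
import numpy as np

def fit_radial_distortion(r_mm, dr_um, order=5):
    """Fit dr = k1*r + k3*r^3 + k5*r^5 (+ ...) and return {power: coefficient}."""
    powers = list(range(1, order + 1, 2))        # odd powers only: 1, 3, 5, ...
    A = np.column_stack([r_mm ** p for p in powers])
    coeffs, *_ = np.linalg.lstsq(A, dr_um, rcond=None)
    return dict(zip(powers, coeffs))

# Synthetic field: linear scale term plus third- and fifth-order distortion.
r = np.linspace(0.5, 10.0, 40)                   # field radius, mm
dr = 0.02 * r + 1e-4 * r ** 3 - 2e-7 * r ** 5    # radial placement error, um
k = fit_radial_distortion(r, dr, order=5)
```

In a matching context, the difference between two lenses' fitted coefficient sets quantifies the residual distortion one tool contributes when printing over the other's layers.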
Proceedings of SPIE, Jul 23, 1985
The concept of VLSI chip fabrication in a step-and-repeat lithography based facility varies significantly from that of the conventional 1:1 aligner based process. Whereas a 1:1 mask contains over 100 possible chip sites covering a five inch wafer, a 5:1 stepper reticle may contain as few as one chip site per exposure, with two to four being more typical. This small reticle array is then stepped across the wafer to fill all possible chip positions on the wafer. If a printable defect exists on the reticle then the final die yield will directly experience a loss of anywhere from 25% (four chips per reticle) to 100% (one chip per reticle) of the potential chips in all lots exposed with that reticle.
A Q-switched Nd:YAG laser has been used to study the variation of reflectivity of copper and tantalum across single, 60 nanosecond long pulses with peak intensities ranging from 10^6 to 10^9 W/sq.cm. Measurements on the single-crystal specimens were taken at several ambient temperatures and pressures, on both clean and oxidized surfaces. Reflectivity values for copper and tantalum with low intensity ... temperatures during irradiation. These calculations, in conjunction with SEM observations of the crystal surfaces, have shown that in many cases melting did not occur even though the temperature of the irradiated area rose above the sample's melting point. The liquid metal to liquid dielectric transition model of A. M.
SPIE Proceedings, 1992
The SPES Radioactive Ion Beam (RIB) facility at INFN-LNL is in the construction phase. It is based on the ISOL method with a UCx direct target able to sustain a power of 10 kW. The primary proton beam is delivered by a high-current cyclotron accelerator, with energy 35-70 MeV and a beam current of 0.2-0.5 mA. Neutron-rich radioactive ions will be produced by proton-induced uranium fission in the UCx target at an expected fission rate on the order of 10^13 fissions per second. The exotic isotopes will be re-accelerated by the ALPI superconducting LINAC at energies of 10A MeV and higher, for masses in the region A=130 amu, at an expected rate on the secondary target of 10^7-10^9 pps. The SPES project has the aim to provide high-intensity and high-quality beams of neutron-rich nuclei as well as to develop an interdisciplinary research center based on the cyclotron proton beam. The European road map for nuclear physics with ISOL facilities goes through the development of second-generation ISOL facilities pointing to EURISOL, the future European ISOL facility.
Over the last decade, lithography has driven down the majority of problems associated with accurate pattern alignment. Gone are the days where illuminator aberrations were blindly transported to the image in attempts to create uniformity of exposure. Wild swings in a weather front's barometric pressure are now a mild perturbation in the production cycle and no longer a major obstacle driving exposure tool magnification to the limits of device control. The objectives dictated by the SIA Roadmap to reach first 0.5 µm and then 0.25 µm design rules have resulted in formidable technological devices. Wafer transport in the exposure tool now includes sophisticated, interferometrically phase-controlled stages. Pattern placements have been guided by alignment systems employing multiple orders of the Fourier transform spectrum and artificial intelligence in the form of sophisticated algorithms for analysis, resulting in targeting capabilities that extend to and below the rang...
Metrology, Inspection, and Process Control for Microlithography XV, 2001
ABSTRACT Feed-forward, as a method to control the lithography process for Critical Dimensions and Overlay, is well known in the semiconductor industry. However, the control provided by simple averaging feed-forward methodologies is not sufficient to support the complexity of a sub-0.18 micrometer lithography process. Also, simple feed-forward techniques are not applicable for logic and ASIC production due to many different products, lithography chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are generally called APC, Advanced Process Control applications. Today, there are as many APC methods as the number of engineers involved. To meet the stringent requirements of 0.18 micrometer production, we selected a method that is described in SPIE 3998-48 (March 2000) by Terrence Zavecz and Rene Blanquies from Yield Dynamics Inc. This method is called PPC, Predictive Process Control, and employs a methodology of collecting measurement results and the modeled bias attributes of exposure tools, reticles and the incoming process in a signatures database. With PPC, before each lot exposure, the signatures of the lithography tool, the reticle and the incoming process are used to predict the setup of the lot process and the expected lot results. Benefits derived from such an implementation are very clear; there is no limitation on the number of products or lithography-chemistry combinations and the technique avoids the short memory of conventional APC techniques. ... and what's next? (Rob Morton, Philips assignee to International Sematech). The next part of the paper will try to answer this question.
Observing that CMP and metal deposition significantly influence CDs and overlay results, and even contact etch can have a significant influence on Metal 5 overlay, we developed a more general PPC for lithography. Starting with the existing lithography PPC applications database, the authors extended the access of the analysis to include the external variables involved in CMP, deposition, etc. We then applied yield analysis methods to identify the significant lithography-external process variables from the history of lots, subsequently adding the identified process variables to the signatures database and to the PPC calculations. With these improvements, the authors anticipate a 50% improvement of the process window. This improvement results in a significant reduction of rework and improved yield depending on process demands and equipment configuration. A statistical theory that explains the PPC is then presented. This theory can be used to simulate a general PPC application. In conclusion, the PPC concept is not limited to lithography or semiconductors. In fact it is applicable for any production process that is signature biased (chemical industry, car industry, ...). Requirements for the PPC are large data collection, a controllable process that is not too expensive to tune for every lot, and the ability to employ feedback calculations. PPC is a major change in the process management approach and therefore will first be employed where the need is high and the return on investment is very fast. The best industry to start with is semiconductors and the most likely process area to start with is lithography.
This paper was published in SPIE Vol 6520-64 and is made available as an electronic reprint or preprint with permission of SPIE.
This paper was published in SPIE BACUS Vol 6730-46 and is made available as an electronic reprint or preprint with permission of SPIE.
Handbook of Critical Dimension Metrology and Process Control: A Critical Review
Equipment for pattern generation in the semiconductor industry has long been a critical part of the manufacturing process. As the industry matured and finer features were required on larger exposure fields, the complexity of these machines grew. The requirements of calibration soon taxed the ability of statistics to understand the complexities of and control required by the tools. In response, the industry adopted an analytical method of data analysis. Mathematical models describing the operation, design and behavior of these tools were developed. These models are called machine models, and herein is contained a critical review of the technology to date. The development of machine models is shown from a historical standpoint. First the concepts of metrology and a clear explanation of the analytical method are presented, followed by the development of modeling and how it was influenced by the automation of metrology in the mid-1980s. The later sections describe models and their application to machine characterization and matching. These are followed by recent concepts in model building and evaluation. In doing so, it is shown how the concept of machine models is critical to the continued advancement of lithography and equipment development. A final section presents some previously unpublished work on the determination of machine precision. The work is supported by an appendix of derivations for the basic model elements.
Proceedings of SPIE, Jul 1, 1991
ABSTRACT A new figure of merit, the critical dimension capability factor, of CDC, is described. T... more ABSTRACT A new figure of merit, the critical dimension capability factor, of CDC, is described. The CDC incorporates measurements taken over a range of linewidths and over a range of process variations which simulate normal and extreme process operating conditions. Under these conditions CDC uniquely quantifies the capability of the measurement instrument on a given substrate and for a given set of parameter settings. CDC is calculated by performing a linear regression between measurements generated by the instrument under test (IUT) and a set of reference values (internally generated standard values). The mean square error (MSE) between the regression line and the observed values is then partitioned into components which estimate the contribution to the MSE from various sources based on a rigorous statistical analysis. The final CDC value is defined as the linewidth to uncertainty ratio and is a function of uncertainty introduced in the characterization procedure as well as the uncertainty introduced when the IUT makes a measurement in practice. Since the CDC is a function of the overall uncertainty in the measurements of the IUT relative to the reference values, it can legitimately be compared from one instrument to another and used to evaluate alternative measurement methods and technologies.
Proceedings of SPIE, Jun 7, 1996
ABSTRACT In this study we evaluate the focus budget of an i-line stepper, examining the sources o... more ABSTRACT In this study we evaluate the focus budget of an i-line stepper, examining the sources of focus erosion for a 0.40 micrometers process. The analysis first examines the best focus of the system as predicted by the several common tools currently used in the industry. Using the overlay focus- monitor, we then determine the value of lens aberrations such as astigmatism and field curvature. The results of lens heating examined for the lens using these common techniques. A model describing the focus aberrations is then developed and applied to the data. This model uses data derived from the focus monitor to determine lens errors such as coma, astigmatism and field curvature. Using this model, data gathered over various numerical aperture and partial coherence values are evaluated to determine the variation of lens aberrations over their range and the depth of focus. Finally, data consisting of critical dimension information gathered using an commercial, automated SEM is used to validate the predictions of the focus model.
Proceedings of SPIE, Jun 1, 1992
ABSTRACT The considerations which drive an expert system for assisting in measurement system char... more ABSTRACT The considerations which drive an expert system for assisting in measurement system characterization are described. The expert system employs several novel techniques for evaluating the integrity of a characterization analysis by determining the degree to which critical assumptions are satisfied and flagging weak points in the data collection or analysis procedure. The properties of good characterization sampling plans are derived. Methods for formulating reliable characterization studies are described. The paper focuses on short term studies intended for equipment comparisons and calibrations; however, with minor alterations it can be expanded to include longer term stability studies.
Proceedings of SPIE, Jul 7, 1997
ABSTRACT Exposure tool optimization in process development today extends beyond the classic conce... more ABSTRACT Exposure tool optimization in process development today extends beyond the classic concepts of exposure and focus setting. The lithographer must understand and tune the system for critical feature performance using variables such as Numerical Aperture (NA), Partial Coherence (PC), and critical level tool matching. In a previous study, the authors demonstrated that the phase-shift focal plane monitor (PSFM) accurately measures focal plane variations when appropriate calibrations are employed. That paper also described the development of a model for classic aberrations such as Astigmatism, Field Curvature and Coma. The model considered geometrical aberrations (Seidel) with radial symmetry across image plane as being the primary contributor to CD variation across stepper image plane. The publication correlated image plane focal results to an approximation of the stepper's Critical Dimension (CD) behavior for a matrix of NA and PC settings. In this study, we continue the analysis of the focus budget and CD uniformity using two generations of optical steppers in a 0.35 micrometers process. The analysis first addresses questions involving the use of the PSFM including the variation of calibration across the exposure field and the advantages of using field center or full field calibrations. We describe a method of easily measuring the uniformity of NA and PC across the exposure field. These new tools are then applied as an aid in lens image field characterization and tool-to-tool matching. The information gathered is then applied to measure image aberrations and predict CD variation across the image under various conditions of focus. The predictions are validated by a comparison against CD uniformity as measured by a commercial Scanning Electron Microscope. 
Present work confirmed previous work and recent assumptions that Zernike diffraction theory of aberration is most appropriate for current stepper lenses with local image plane focal variations across entire field being the major contributor to field CD variations. Bibtex entry for this abstract Preferred format for this abstract (see Preferences) Find Similar Abstracts: Use: Authors Title Abstract Text Return: Query Results Return items starting with number Query Form Database: Astronomy Physics arXiv e-prints
Proceedings of SPIE, Jun 2, 2000
An advanced control system providing modeling and predictive data simulation for pass-fail criteria of overlay production control has been used in 0.18 micrometer Design Rule production facilities for over a year. During this period overlay was measured on both product wafers and during periodic process qualification tests. The resulting raw data is modeled using exposure-tool-specific and layer-focused models. Modeled results, measured process statistics and tool signatures are combined in a real-time simulation to calculate the true overlay distribution over the entire wafer and lot. All results and raw data are automatically gathered and stored in a database for on-going analysis. In this manner, tool, product technology and process performance data are gathered for every overlay process-step. The data provides valuable insights into not only tool stability but also the process-step characteristic errors that contribute to the overlay spectrum of distortions. Data gathered in this manner is very stable and can be used to predict a feed-forward correction for all correctable coefficients. The technique must take into consideration algorithm-modeled coefficient variations resulting from: (1) Reticle pattern-to-alignment mark design errors. (2) Process film variations. (3) Tool-to-tool static matching. (4) Tool-to-tool dynamic matching errors which are match-residual, process or time induced. This extensive database has resulted in a method of conducting Predictive Process Control (PPC) for overlay lithography within an advanced semiconductor line. Using PPC the wafer production facility experiences: (1) Improved Yield: Lots are always exposed with optimum setup. Optimized setups reduce rework levels and therefore wafer handling. (2) Capacity Improvement: Elimination of rework tacitly improves capacity in the facility.
WIP is also simplified because lots do not have to wait for a dedicated exposure tool to become available. (3) Dynamic Matching™: Matching of multiple exposure tools is continuously monitored by the use of the feedback loop. Tool precision can be monitored as well as the setup systematic offsets. In this manner, the need to remove an exposure tool from production for match-maintenance can be predicted and scheduled. Residual matching errors can also be removed from the production cycle. The benefits of full production lot modeling and the contributors to production errors are presented. Process and tool interactions as well as control-factor coefficient stability indicate the level of control to be well beyond manual methods. Calculations show that these contributors are predictable, stable and a necessary tool for competitive sub-0.2 micron production. An analysis of the overlay error sources within two facilities yields a consistent facility process response and a well-defined error budget derivation. From this analysis, the control added to semiconductor overlay is shown capable of extending mix-and-match exposure tool operations in production down to 0.12 micrometer design rules.
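The feed-forward correction described above rests on modeling raw overlay data into a small set of correctable coefficients. A minimal sketch in Python, assuming a common six-parameter linear inter-field model (translation, magnification, rotation); the function name, coefficient names and model form are illustrative, not the commercial system's algorithm:

```python
import numpy as np

def fit_overlay_model(x, y, dx, dy):
    """Least-squares fit of a six-parameter linear overlay model.

    A common illustrative form (not the paper's exact algorithm):
      dx = Tx + Mx*x - Ry*y
      dy = Ty + My*y + Rx*x
    x, y are field positions; dx, dy the measured overlay errors.
    """
    x, y, dx, dy = (np.asarray(v, dtype=float) for v in (x, y, dx, dy))
    ones = np.ones_like(x)
    Ax = np.column_stack([ones, x, -y])   # columns -> Tx, Mx, Ry
    Ay = np.column_stack([ones, y, x])    # columns -> Ty, My, Rx
    (Tx, Mx, Ry), *_ = np.linalg.lstsq(Ax, dx, rcond=None)
    (Ty, My, Rx), *_ = np.linalg.lstsq(Ay, dy, rcond=None)
    # Residual vector magnitude quantifies the non-correctable part.
    res = np.hypot(dx - Ax @ np.array([Tx, Mx, Ry]),
                   dy - Ay @ np.array([Ty, My, Rx]))
    return dict(Tx=Tx, Ty=Ty, Mx=Mx, My=My, Rx=Rx, Ry=Ry,
                residual_3sigma=3.0 * res.std())
```

Signing off the modeled coefficients against stored tool and process signatures, rather than raw averages, is what distinguishes the PPC approach from simple averaging feedback.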
Proceedings of SPIE, Mar 16, 2007
This paper was published in SPIE Vol 6520-139 and is made available as an electronic reprint or preprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes or modification of the content of the paper are prohibited.
Proceedings of SPIE, Aug 22, 2001
... Also, simple feed-forward techniques are not applicable for logic and ASIC production due to many different products, lithography-chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are ...
Proceedings of SPIE, 1987
The physical concepts and yield advantages of pellicle-protected masks have been reported in many papers. Pellicle disadvantages such as fragility and the hazard of particles trapped under the membrane have been greatly outweighed by the observed chip yield increase. Yet the pellicle provides no protection against static discharge, the dislodgement of previously electrostatically fixed particles, pellicle degradation and surface etch faults.
Optical Microlithography XVIII, May 12, 2005
It is commonly reported that a difference exists between directly measured reticle feature dimensions and those produced in the final lithographic image. Quantifying this mask error function (MEF) and the sources of the perturbation has been the topic of many papers over the past several years. Past studies have been content to evaluate these functions by statistical averaging, thereby neglecting the potential influence of process and exposure contributions. The material presented here represents the findings of an extensive study of reticle-process interactions. Phase I of the evaluation consisted of focus and dose exposures of the reticle and subsequent modeling of the full-profile response. This analysis provided extensive information on the optimum printed feature profiles while removing the contribution of across-field focus variations. The reticle was directly characterized using both a conventional SEM and a new Nanometrics OCD Scatterometer technique. The full-field modeled response surface of the directly measured feature characteristics is then used to calculate the across-field MEF and provide an improved estimate of the true response of the feature to exposure. Phase II of the analysis turns its attention to characterization of the full-wafer process response. Both the modeled and directly measured reticle surfaces were removed from scatterometry-measured full-wafer exposures. Normal process variations consisting of photoresist and ARC thickness volatility are next used to show the response of the printed feature. Finally, a summary of the relative contribution of each process perturbation to the feature-profile error budget is discussed.
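At each field site, the MEF reduces to the local slope of printed CD against reticle CD scaled to wafer dimensions. A hedged single-site sketch in Python, using a plain linear regression as a stand-in for the paper's full response-surface modeling (the function name and the 4x magnification default are assumptions):

```python
import numpy as np

def mask_error_factor(mask_cd, wafer_cd, magnification=4.0):
    """Estimate the mask error factor at one site.

    MEF = d(wafer CD) / d(mask CD / M); an ideal linear imaging
    system gives MEF = 1. mask_cd is in reticle-scale units and is
    divided by the reduction ratio M before regression.
    """
    mask_at_wafer = np.asarray(mask_cd, dtype=float) / magnification
    slope, _intercept = np.polyfit(mask_at_wafer,
                                   np.asarray(wafer_cd, dtype=float), 1)
    return slope
```

Computing this slope per site from the measured across-field surfaces yields an MEF map rather than the single averaged number of earlier studies.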
Proceedings of SPIE, May 17, 2005
Most process window analysis applications are capable of deriving the functional focus-dose workspace available to any set of device specifications. Previous work in this area has concentrated on calculating the superpositioned optimum operating points of various combinations of feature orientations or feature types. These studies invariably result in an average performance calculation that is biased by the impact of the substrate, reticle and exposure-tool contributed perturbations. Many SEMs and optical metrology tools now provide full-feature profile information for multiple points in the exposure field. The inclusion of field spatial information into the process window analysis results in a calculation of greater accuracy and process understanding because now the capabilities of each exposure tool can be individually modeled and optimized. Such an analysis provides the added benefit that after the exposure tool is characterized, its process perturbations can be removed from the analysis to provide greater understanding of the true process performance. Process window variables are shown to vary significantly across the exposure field of the scanner. Evaluating the depth-of-focus and optimum focus-dose at each point in the exposure field yields additional information on the imaging response of the reticle and the scan linearity of the exposure tool's reticle stage. The optimal focus response of the reticle is then removed from a full wafer exposure and the results are modeled to obtain a true process response and performance.
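Evaluating depth-of-focus and optimum focus at each field point can be sketched as a per-site quadratic (Bossung-style) fit of CD versus focus; this toy model is an illustrative assumption, not the application's actual algorithm:

```python
import numpy as np

def focus_window(focus, cd, cd_tol):
    """Quadratic fit of CD vs. focus at one measurement site.

    Returns (best_focus, depth_of_focus), where DOF is the focus
    span over which CD stays within cd_tol of its extremum value.
    Run per field site to map focal variation across the field.
    """
    a, b, c = np.polyfit(focus, cd, 2)   # cd ≈ a*f^2 + b*f + c
    f_best = -b / (2.0 * a)              # vertex of the parabola
    # Near the vertex, |CD - CD(f_best)| = |a|*(f - f_best)^2, so
    # the tolerance band is exited at half-width sqrt(cd_tol/|a|).
    dof = 2.0 * np.sqrt(cd_tol / abs(a))
    return f_best, dof
```

Mapping `f_best` across the field exposes reticle-stage scan nonlinearity; the spread of `f_best` then directly erodes the usable common DOF.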
Proceedings of SPIE, Jul 19, 1989
ABSTRACT Step and repeat camera optical systems today exhibit greater resolution, numerical aperture and field sizes than in the past. It has become necessary to control lens distortion and matching errors to less than one-tenth micron across the entire usable field. The quality of the optics has risen to the point that fifth order distortion modeling has become a necessary component of the stepper matching procedure. Methods of data gathering and analysis vary significantly in the industry. In this paper we investigate four methods of overlay measurement including electrical, optical coherence probe, automated optical and optical vernier techniques. Precision to tolerance ratios and throughput for the measurement methods are discussed. Three methods of data analysis are compared including KLASS II for KLA 2020* data, EM1** for electrical data and SASO*** for both. A novel multi-substrate calibration technique is presented. In any matching situation the required sample size for accurate estimation of the lens components is important. Our study reviews the results of analyses of variances due to daily repeatability, wafer films, the number of measurement sites on each wafer and stepped field. Two methods of artifact generation were investigated, that of stage referenced matching and matching to a 'golden standard'. Finally, the expansion of the model to include seventh order distortions and the significance of this for the now emerging high numerical aperture, large field g-line lens designs is discussed.
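Fifth order distortion modeling of the kind described here can be sketched as a least-squares fit of odd radial terms. A minimal illustration, assuming a purely radial model (real stepper-matching models also carry translation, rotation, magnification and x/y-asymmetric terms), with a seventh-order extension obtained by simply appending an r^7 column:

```python
import numpy as np

def fit_radial_distortion(r, dr):
    """Least-squares fit of odd-order radial distortion,
    dr = k1*r + k3*r^3 + k5*r^5  (third- plus fifth-order model).

    r: radial field positions; dr: measured radial placement errors.
    Returns (k1, k3, k5).
    """
    r = np.asarray(r, dtype=float)
    A = np.column_stack([r, r**3, r**5])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(dr, dtype=float),
                                 rcond=None)
    return coeffs
```

Because the r^3 and r^5 columns are strongly correlated over a small field, the sample-size and site-layout questions the abstract raises directly govern how well k3 and k5 can be separated.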
Proceedings of SPIE, Jul 23, 1985
The concept of VLSI chip fabrication in a step-and-repeat lithography based facility varies significantly from that of the conventional 1:1 aligner based process. Whereas a 1:1 mask contains over 100 possible chip sites covering a five inch wafer, a 5:1 stepper reticle may contain as few as one chip site per exposure, with two to four being more typical. This small reticle array is then stepped across the wafer to fill all possible chip positions on the wafer. If a printable defect exists on the reticle then the final die yield will directly experience a loss of anywhere from 25% (four chips per reticle) to 100% (one chip per reticle) of the potential chips in all lots exposed with that reticle.
A Q-switched Nd:YAG laser has been used to study the variation of reflectivity of copper and tantalum across single, 60 nanosecond long pulses with peak intensities ranging from 10^6 to 10^9 W/sq.cm. Measurements on the single crystal specimens were taken at several ambient temperatures and pressures, on both clean and oxidized surfaces. Reflectivity values for copper and tantalum with low intensity ... temperatures during irradiation. These calculations in conjunction with SEM observations of the crystal surfaces have shown that in many cases melting did not occur even though the temperature of the irradiated area rose above the sample's melting point. The liquid metal to liquid dielectric transition model of A. M. ...
SPIE Proceedings, 1992
The SPES Radioactive Ion Beam (RIB) facility at INFN-LNL is in the construction phase. It is based on the ISOL method with a UCx Direct Target able to sustain a power of 10 kW. The primary proton beam is delivered by a high-current Cyclotron accelerator, with energy 35-70 MeV and a beam current of 0.2-0.5 mA. Neutron-rich radioactive ions will be produced by proton-induced Uranium fission in the UCx target at an expected fission rate in the order of 10^13 fissions per second. The exotic isotopes will be re-accelerated by the ALPI superconducting LINAC at energies of 10A MeV and higher, for masses in the region A=130 amu, at an expected rate on the secondary target of 10^7-10^9 pps. The SPES project has the aim to provide high-intensity and high-quality beams of neutron-rich nuclei as well as to develop an interdisciplinary research center based on the cyclotron proton beam. The European road map for nuclear physics with ISOL facilities goes through the development of second-generation ISOL facilities pointing to EURISOL, the future European ISOL facility.
Over the last decade, lithography has driven down the majority of problems associated with accurate pattern alignment. Gone are the days where illuminator aberrations were blindly transported to the image in attempts to create uniformity of exposure.1 Wild swings in a weather front's barometric pressure are now a mild perturbation in the production cycle and no longer a major obstacle driving exposure tool magnification to the limits of device control. The objectives dictated by the SIA Roadmap to reach first 0.5 µm and then 0.25 µm design rules have resulted in formidable technological devices. Wafer transport in the exposure tool now includes sophisticated, interferometrically phase-controlled stages. Pattern placements have been guided by alignment systems employing multiple orders of the Fourier transform spectrum and artificial intelligence in the form of sophisticated algorithms for analysis, resulting in targeting capabilities that extend to and below the rang...
Metrology, Inspection, and Process Control for Microlithography XV, 2001
ABSTRACT Feed-forward, as a method to control the lithography process for Critical Dimensions and Overlay, is well known in the semiconductor industry. However, the control provided by simple averaging feed-forward methodologies is not sufficient to support the complexity of a sub-0.18 micrometer lithography process. Also, simple feed-forward techniques are not applicable for logic and ASIC production due to many different products, lithography-chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are generally called APC, Advanced Process Control applications. Today, there are as many APC methods as the number of engineers involved. To meet the stringent requirements of 0.18 micrometer production, we selected a method that is described in SPIE 3998-48 (March 2000) by Terrence Zavecz and Rene Blanquies from Yield Dynamics Inc. This method is called PPC, Predictive Process Control, and employs a methodology of collecting measurement results and the modeled bias attributes of expose tools, reticles and the incoming process in a signatures database. With PPC, before each lot exposure, the signatures of the lithography tool, the reticle and the incoming process are used to predict the setup of the lot process and the expected lot results. Benefits derived from such an implementation are very clear; there is no limitation on the number of products or lithography-chemistry combinations, and the technique avoids the short memory of conventional APC techniques. ... and what's next? (Rob Morton, Philips assignee to International Sematech). The next part of the paper will try to answer this question.
Observing that CMP and metal deposition significantly influence CDs and overlay results, and that even Contact Etch can have a significant influence on Metal 5 overlay, we developed a more general PPC for lithography. Starting with the existing lithography PPC applications database, the authors extended the access of the analysis to include the external variables involved in CMP, deposition etc. We then applied yield analysis methods to identify the significant lithography-external process variables from the history of lots, subsequently adding the identified process variables to the signatures database and to the PPC calculations. With these improvements, the authors anticipate a 50% improvement of the process window. This improvement results in a significant reduction of rework and improved yield depending on process demands and equipment configuration. A statistical theory that explains the PPC is then presented. This theory can be used to simulate a general PPC application. In conclusion, the PPC concept is not limited to lithography or semiconductors. In fact it is applicable to any production process that is signature biased (chemical industry, car industry, ...). Requirements for the PPC are large-scale data collection, a controllable process that is not too expensive to tune for every lot, and the ability to employ feedback calculations. PPC is a major change in the process management approach and therefore will first be employed where the need is high and the return on investment is very fast. The best industry to start with is semiconductors, and the most likely process area to start with is lithography.
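The signatures-database bookkeeping behind PPC can be sketched as additive per-key averages: each modeled coefficient is decomposed into tool, reticle and incoming-process contributions, and a lot's setup is predicted as their sum. A deliberately simplified Python sketch (the class name, key scheme and additive decomposition are illustrative assumptions, not the Yield Dynamics implementation):

```python
from collections import defaultdict

class SignatureDB:
    """Toy signatures database for one modeled coefficient
    (e.g. wafer rotation). Each observation is attributed to a
    (kind, id) key; a signature is the running mean for that key."""

    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def record(self, key, value):
        """Fold one attributed observation into the database."""
        self._sums[key] += value
        self._counts[key] += 1

    def signature(self, key):
        """Mean contribution for a key; 0.0 if never observed."""
        n = self._counts[key]
        return self._sums[key] / n if n else 0.0

    def predict(self, tool, reticle, process):
        """Predicted setup value for a lot: sum of its signatures."""
        return (self.signature(("tool", tool))
                + self.signature(("reticle", reticle))
                + self.signature(("process", process)))
```

Because the prediction is composed per key rather than averaged per product, adding a new product or chemistry combination needs no warm-up history beyond its own signatures, which is the stated advantage over short-memory averaging APC.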
This paper was published in SPIE Vol 6520-64 and is made available as an electronic reprint or preprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes or modification of the content of the paper are prohibited.
This paper was published in SPIE BACUS Vol 6730-46 and is made available as an electronic reprint or preprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes or modification of the content of the paper are prohibited.
Handbook of Critical Dimension Metrology and Process Control: A Critical Review
Equipment for pattern generation in the semiconductor industry has long been a critical part of the manufacturing process. As the industry matured and finer features were required on larger exposure fields, the complexity of these machines grew. The requirements of calibration soon taxed the ability of statistics to understand the complexities of, and the control required by, the tools. In response, the industry adopted an analytical method of data analysis. Mathematical models describing the operation, design and behavior of these tools were developed. These models are called machine models, and herein is contained a critical review of the technology to date. The development of machine models is shown from a historical standpoint. First, the concepts of metrology and a clear explanation of the analytical method are presented. Then comes the development of modeling and how it was influenced by the automation of metrology in the mid-1980s. The later sections describe models and their application to machine characterization and matching. These are followed by recent concepts in model building and evaluation. In doing so, it is shown how the concept of machine models is critical to the continued advancement of lithography and equipment development. A final section presents some previously unpublished work on the determination of machine precision. The work is supported by an appendix of derivations for the basic model elements.
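On the closing topic of machine precision, one standard way to estimate it is to pool the variance of repeated model-coefficient estimates across measurement sessions; a generic variance-pooling sketch (not the appendix's derivation):

```python
import numpy as np

def pooled_precision(runs):
    """Pooled 1-sigma precision of a machine-model coefficient.

    runs: list of sequences, each holding the repeated estimates of
    one coefficient from a single measurement session. Variances are
    pooled about each session's own mean, so slow session-to-session
    drift does not inflate the short-term precision estimate.
    """
    runs = [np.asarray(r, dtype=float) for r in runs]
    dof = sum(len(r) - 1 for r in runs)          # pooled degrees of freedom
    ss = sum(((r - r.mean()) ** 2).sum() for r in runs)
    return np.sqrt(ss / dof)
```

Separating this within-session precision from the session means is what allows repeatability to be budgeted apart from long-term machine drift.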