Statistically enhanced analogue and mixed-signal design and test

Monte Carlo-Alternative Probabilistic Simulations for Analog Systems

2006

Probabilistic system simulation for analog circuits has traditionally been handled with Monte Carlo analysis. For a manufacturable design, fast and accurate simulations are necessary to address time-to-market, design-for-manufacturability, and yield concerns. In this paper, a fast and accurate probabilistic simulation alternative targeting analog systems is proposed. The proposed method shows high accuracy for performance estimation combined with a 100-fold reduction in run-time with respect to a 1000-sample Monte Carlo analysis.
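
To make the baseline concrete, the following is a minimal sketch of the kind of 1000-sample Monte Carlo performance analysis such methods are benchmarked against; the RC low-pass circuit, the 5 % component tolerances, and the +/-10 % specification are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a 1000-sample Monte Carlo baseline for an analog
# performance metric. The RC low-pass "circuit", the 5 % Gaussian
# component tolerances, and the +/-10 % spec are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=1)
N = 1000                       # sample count of the reference Monte Carlo run

R_nom, C_nom = 10e3, 1e-9      # nominal 10 kOhm and 1 nF
sigma = 0.05                   # assumed 5 % (1-sigma) manufacturing tolerance

# Draw toleranced component values and evaluate the performance metric
# (-3 dB cutoff frequency) once per sample.
R = R_nom * (1 + sigma * rng.standard_normal(N))
C = C_nom * (1 + sigma * rng.standard_normal(N))
f_c = 1.0 / (2 * np.pi * R * C)

f_nom = 1.0 / (2 * np.pi * R_nom * C_nom)
in_spec = np.abs(f_c - f_nom) < 0.10 * f_nom

print(f"mean cutoff {f_c.mean()/1e3:.2f} kHz, std {f_c.std()/1e3:.2f} kHz")
print(f"estimated yield within +/-10 % of nominal: {in_spec.mean():.1%}")
```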

Practical statistical simulation for efficient circuit design

In wireless handset design, specifically power amplifiers (PAs), there is constant pressure to improve time-to-market while maintaining high yields. To meet these demands, designers need to evaluate current design practices and identify areas for improvement. Presently, some PA designers spend a great deal of time bench-tuning to optimize circuits. Because this is very time-consuming, the main focus is obtaining the best "nominal" performance, and process variation is generally an afterthought. Frequently, new circuit topologies are implemented and minimal sample sizes are evaluated (often on a single wafer), leading to "one-wafer wonder" results.

DC Statistical Circuit Analysis for Bipolar IC's Using Parameter Correlations-An Experimental Example

IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 1984

Statistical analysis simulates circuit performance variations caused by device tolerances and other production factors. The procedure of statistical circuit simulation is illustrated with a simple circuit example. Measurements are made on a sample of this circuit, and the measured results are compared with simulation results from worst-case and statistical analyses, both without and with model parameter correlations for the devices used in the circuit. The necessity of properly including the model parameter correlations is evident from this comparison.
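
The core point, that device model parameters must be sampled jointly rather than independently, can be sketched with a correlated draw via a Cholesky factor. The parameter names (IS, BF), the tolerances, and the 0.8 correlation below are hypothetical placeholders, not the paper's measured values.

```python
# Sketch of drawing correlated device model parameters instead of
# independent ones. Nominals, tolerances, and correlation are assumed.
import numpy as np

rng = np.random.default_rng(seed=2)
N = 2000

# Assumed bipolar model parameters: saturation current IS and current
# gain BF, with relative (1-sigma) tolerances and a strong positive
# correlation, since both track the same base-width/doping variations.
nominal = np.array([1e-15, 100.0])     # [IS, BF]
rel_sig = np.array([0.10, 0.08])
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])

# Covariance of the relative deviations and its Cholesky factor.
cov = np.outer(rel_sig, rel_sig) * corr
L = np.linalg.cholesky(cov)

# z ~ N(0, I)  ->  z @ L.T ~ N(0, cov): correlated relative deviations.
z = rng.standard_normal((N, 2))
params = nominal * (1.0 + z @ L.T)

print("sample correlation IS vs BF:",
      np.corrcoef(params[:, 0], params[:, 1])[0, 1].round(3))
```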

From statistical model checking to statistical model inference: Characterizing the effect of process variations in analog circuits

2013 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), 2013

This paper studies the effect of parameter variation on the behavior of analog circuits at the transistor (netlist) level. It is well known that variation in key circuit parameters can often adversely impact the correctness and performance of analog circuits during fabrication. An important problem lies in characterizing a safe subset of the parameter space for which the circuit can be guaranteed to satisfy the design specification. Due to the sheer size and complexity of analog circuits, a formal approach to the problem remains out of reach, especially at the transistor level. Therefore, we present a statistical model inference approach that exploits recent advances in statistical verification techniques. Our approach uses extensive circuit simulations to infer polynomials that approximate the behavior of a circuit. A procedure inspired by statistical model checking is then introduced to produce "statistically sound" models that extend the polynomial approximation. The resulting model can be viewed as a statistically guaranteed over-approximation of the circuit behavior. The proposed technique is demonstrated with two case studies in which it identifies subsets of parameters that satisfy the design specifications.
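
A minimal sketch of the two-stage idea described above: fit a polynomial surrogate to simulated responses, then validate it statistically on fresh samples in the spirit of statistical model checking. The one-parameter quadratic "circuit" stands in for a transistor-level simulation; the tolerance eps and sample counts are assumptions.

```python
# Sketch of statistical model inference: (1) fit a polynomial surrogate
# to simulated circuit responses, (2) check it statistically on fresh
# samples. The toy response function is a stand-in for a SPICE run.
import numpy as np

rng = np.random.default_rng(seed=3)

def circuit_sim(p):
    # Placeholder for a transistor-level simulation: performance as a
    # function of one varying parameter p, with small simulation noise.
    return 2.0 - 0.5 * p + 0.3 * p**2 + 0.01 * rng.standard_normal(p.shape)

# Stage 1: infer a degree-2 polynomial model from training simulations.
p_train = rng.uniform(-1, 1, 200)
coeffs = np.polyfit(p_train, circuit_sim(p_train), deg=2)
model = np.poly1d(coeffs)

# Stage 2: statistical-model-checking-style validation. Draw fresh
# samples and count how often the surrogate (within a margin eps)
# matches the simulated response; a high empirical rate over many
# samples gives a statistical, not formal, guarantee.
eps = 0.05
p_test = rng.uniform(-1, 1, 1000)
covered = np.abs(circuit_sim(p_test) - model(p_test)) <= eps
print(f"surrogate within eps on {covered.mean():.1%} of {p_test.size} fresh samples")
```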

Hierarchical statistical characterization of mixed-signal circuits using behavioral modeling

1997

A methodology for hierarchical statistical circuit characterization which does not rely upon circuit-level Monte Carlo simulation is presented. The methodology uses principal component analysis, response surface methodology, and statistics to directly calculate the statistical distributions of higher-level parameters from the distributions of lower-level parameters. We have used the methodology to characterize a folded-cascode operational amplifier and a phase-locked loop.
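
The flow can be sketched as follows: compress correlated low-level parameter variation with PCA, fit a response surface from the dominant principal component to a higher-level parameter, then propagate distributions by evaluating the cheap surface instead of re-running circuit simulation. The two-parameter covariance and the quadratic "gain error" response below are invented for illustration.

```python
# Sketch of hierarchical statistical characterization: PCA on low-level
# parameters, a response surface to a higher-level parameter, and cheap
# distribution propagation. All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=4)

# Correlated low-level (device) parameter deviations, e.g. Vth and beta.
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])
X = rng.multivariate_normal(np.zeros(2), cov, size=500)

# PCA via eigendecomposition of the sample covariance (ascending order,
# so index 1 is the dominant component).
w, V = np.linalg.eigh(np.cov(X.T))
scores = X @ V

# Response surface: a higher-level parameter (say, amplifier gain error)
# as a quadratic in the dominant component; here a toy ground truth.
gain_err = 0.8 * scores[:, 1] + 0.2 * scores[:, 1]**2
A = np.column_stack([np.ones(len(X)), scores[:, 1], scores[:, 1]**2])
beta, *_ = np.linalg.lstsq(A, gain_err, rcond=None)

# Propagate: evaluate the fitted surface on many new principal-component
# samples instead of re-running circuit-level Monte Carlo.
pc_new = rng.standard_normal(100_000) * np.sqrt(w[1])
pred = beta[0] + beta[1] * pc_new + beta[2] * pc_new**2
print(f"predicted gain-error mean={pred.mean():.3f}, std={pred.std():.3f}")
```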

A comparison of deterministic and statistical sampling techniques for quality analysis of integrated circuits

Quality and Reliability Engineering International, 1993

There has been a great amount of publicity about Taguchi methods, which employ deterministic sampling techniques for robust design. Also given wide exposition in the literature is tolerance design, which achieves similar objectives but employs random sampling techniques. The question arises as to which approach, random or deterministic, is more suitable for robust design of integrated circuits. Robust design is a two-step process, and quality analysis, the first step, involves the estimation of 'quality factors', which measure the effect of noise on the quality of system performance. This paper concentrates on the quality analysis of integrated circuits. A comparison is made between the deterministic sampling technique based on Taguchi's orthogonal arrays and the random sampling technique based on the Monte Carlo method, the objective being to determine which of the two gives more reliable (i.e. more consistent) estimates of quality factors. The results indicated that the Monte Carlo method gave estimates of quality which were at least 40 per cent more consistent than orthogonal arrays. The accuracy of quality prediction by Taguchi's orthogonal arrays is strongly affected by the choice of parameter quantization levels, a disadvantage, since there is a very large (theoretically infinite) number of choices of quantization levels for each parameter of an integrated circuit. The cost of the Monte Carlo method is independent of the dimensionality (number of designable parameters), being governed only by the confidence levels required for quality factors, whereas the size of orthogonal array required for a given problem is partly dependent on the number of circuit parameters. Two integrated circuits, a 7-parameter CMOS voltage reference and a 20-parameter bipolar operational amplifier, were employed in the investigation. Quality factors of interest included performance variability, acceptability (relative to customer specifications) and deviation from target. Keywords: quality analysis; quality design; tolerance analysis; tolerance design; integrated circuit design; Taguchi methods; robust design.
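
The quantization-level sensitivity criticized above is easy to demonstrate, as in this sketch that estimates one quality factor (output variability) of a toy three-parameter response with an L4 orthogonal array at two different level choices and with plain Monte Carlo. The response function and level choices are illustrative assumptions, not the circuits studied in the paper.

```python
# Sketch contrasting the two sampling schemes compared in the paper:
# a Taguchi L4 orthogonal array versus plain Monte Carlo, estimating a
# quality factor (output variability) of a toy 3-parameter response.
import numpy as np

rng = np.random.default_rng(seed=5)

def perf(x1, x2, x3):
    # Toy nonlinear performance; stands in for a circuit response.
    return x1 + 0.5 * x2**2 + 0.3 * x1 * x3

sigma = 1.0  # common parameter standard deviation

# L4 orthogonal array for three two-level factors (levels coded -1/+1).
L4 = np.array([[-1, -1, -1],
               [-1, +1, +1],
               [+1, -1, +1],
               [+1, +1, -1]])

# The orthogonal-array estimate changes with the quantization level q,
# which is the disadvantage noted in the abstract.
for q in (1.0, 1.5):
    y = perf(*(q * sigma * L4).T)
    print(f"OA estimate of std (levels at +/-{q} sigma): {y.std(ddof=1):.3f}")

# Monte Carlo estimate: cost depends on sample count, not dimensionality.
x = sigma * rng.standard_normal((10_000, 3))
print(f"MC estimate of std: {perf(*x.T).std(ddof=1):.3f}")
```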

Advanced Statistical Methodologies for Tolerance Analysis in Analog Circuit Design

Advances in Analog Circuits, 2011

The influence of process variations is becoming extremely critical for nanometre technology nodes (90 nm and below), due to geometric tolerances and manufacturing non-idealities such as edge or surface roughness and fluctuation in the number of doping atoms. Most worrying of all is the statistical variability introduced by the discreteness of charge and the granularity of matter as transistors approach molecular and atomic dimensions. The main sources of statistical variability are the random distribution of discrete dopants and charged defects, the line-edge roughness of the photoresist, and the granularity of the materials. As a result, production yields and circuit figures of merit (such as performance, power, and reliability) have become extremely sensitive to uncontrollable statistical process variations (PV). The main sources of variation are environmental factors, which arise transiently during the operation of a circuit (e.g. power-supply or temperature variations), and physical factors due to the manufacturing process, which result in a permanent (or aging-related) variation of the device structure and interconnections. The latter translate into random, possibly spatially correlated, drifts of the design parameters. Although these effects have been considered in the past, their increasing impact constitutes a completely new challenge. While process engineers have traditionally coped with die-to-die fluctuations, today's within-die variations are more subtle, since they imply that different areas of the same die exhibit different values of the various parameters. With further shrinking of process technology, on-chip variation worsens at each technology node, with a direct impact on design flows, which conventionally rely on deterministic models. At the front end, parameter variability has a significant impact on both the power dissipation and the performance of a circuit, with a consequent yield decrease and remarkable cost implications. Indeed, to maintain production efficiency, control costs and cycle time must rise, a drawback that increases dramatically with process complexity. To counter it, the following two joint tasks become essential:

Statistical runtime verification of analog and mixed signal designs

2009 3rd International Conference on Signals, Circuits and Systems (SCS), 2009

A statistical approach for design and testing of analog circuitry in low-cost SoCs

2010 53rd IEEE International Midwest Symposium on Circuits and Systems, 2010

A novel design-for-testability approach is proposed, which is derived from the aggressive probabilistic targets set forth for the yield and quality to be achieved in the mass production of high-volume low-cost transceiver SoCs, thus requiring solutions that are fundamentally different from the traditional approaches. Statistical analysis is presented as the basis for the proposed approach, and specific guidelines are defined and demonstrated through examples. The proposed approach, based on built-in self-test (BIST) of RF/mixed-signal functions in the transceiver SoC, relies on digital processing resources that are typically available within the SoC at no additional cost and may aid in its testing and calibration. The important roles of characterization and built-in self-calibration and compensation in this context are also defined.

A Tool for Analog/RF BIST Evaluation Using Statistical Models of Circuit Parameters

ACM Transactions on Design Automation of Electronic Systems, 2015

Testing analog integrated circuits is expensive in terms of both test equipment and time. To reduce the cost, Design-for-Test (DFT) techniques such as Built-In Self-Test (BIST) have been developed. For a given Circuit Under Test (CUT), the choice of a suitable technique should be made at the design stage as a result of the analysis of test metrics such as test escapes and yield loss. However, it is very hard to carry out this estimation for analog/RF circuits by using fault simulation techniques. Instead, the estimation of parametric test metrics is made possible by Monte Carlo circuit-level simulations and the construction of statistical models. These models represent the output parameter space of the CUT in which the test metrics are defined. In addition, models of the input parameter space may be required to accelerate the simulations and obtain higher confidence in the DFT choices. In this work, we describe a methodological flow for the selection of the most adequate statistical models.
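
The test metrics named above can be sketched from joint Monte Carlo samples of a specification performance S and a correlated BIST measurement T; under one common definition, test escape is the fraction of shipped devices that violate the specification, and yield loss is the fraction of good devices rejected by the test. The bivariate-normal model, the 0.9 correlation, and the limits below are assumptions for illustration.

```python
# Sketch of estimating parametric test metrics (test escape, yield loss)
# from joint Monte Carlo samples of spec performance S and BIST output T.
# The bivariate-normal model, correlation, and limits are assumptions.
import numpy as np

rng = np.random.default_rng(seed=6)
N = 200_000

# Jointly simulate S and a correlated test measurement T; in a real
# flow both would come from the same circuit-level Monte Carlo run.
cov = [[1.0, 0.9],
       [0.9, 1.0]]
S, T = rng.multivariate_normal([0.0, 0.0], cov, size=N).T

spec_ok = S > -1.0          # device meets the datasheet specification
test_ok = T > -1.0          # device passes the BIST limit

yield_frac  = spec_ok.mean()
test_escape = np.mean(~spec_ok & test_ok) / test_ok.mean()   # bad but shipped
yield_loss  = np.mean(spec_ok & ~test_ok) / spec_ok.mean()   # good but rejected

print(f"yield={yield_frac:.3%}, "
      f"test escape={test_escape:.4%}, yield loss={yield_loss:.3%}")
```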