Using Quality Estimate of a New Product to Analyze Efficiency of Testing

Jukka Antila, Nokia Networks, Finland
Markku Moilanen, University of Oulu, Finland

1. Abstract

The production test strategy for a new product has to be planned in the early design phase, before any design is finalized. This is necessary so that all relevant, test-technology-dependent testability aspects can be considered in the design of the product. The expected test investments and labour costs should also be known with adequate accuracy. This paper presents how production process quality data can be used to estimate the number of various defects in a coming new product, and describes a calculation procedure for analyzing test strategy alternatives using these estimated defects. As an output of the calculations we get detailed information on test efficiency, yield and escaped defects in each test operation.

2. Introduction

The focus of this paper is all post-reflow board-level test and inspection (referred to hereafter simply as "testing"). A decision on the test strategy has to be made early in order to calculate manufacturing costs for the planned product. The selected test methods also affect the design of the new product, which is another reason for deciding early. The analysis methods presented here are based on estimating the number of various defects in the coming new product and simulating the detection of these defects with different test technologies. This approach differs from analyzing component by component whether each part is tested or not: instead of concentrating on the testing of individual components, we treat test efficiency as a statistical entity. The problem, of course, is knowing what defects a new product will have when it has not been manufactured, or even designed, yet. To make a decent estimate of the defects, the quality data of the same (or a similar) manufacturing process is used. Figure 1 illustrates the basic method of analyzing the efficiency of a test process.
After reflow soldering, the board going to the first tester in the process contains some defects (d_in). The defect coverage (C_d1) of the test set T1 defines how many defects are found and removed from the board (d_1); the remainder escape to the next operation (d_out1 = d_in2). Minimizing the number of defects escaping the whole test process (d_out) is one of the key figures in our analysis.

[Figure 1: Test Flow. Incoming defects d_in enter test operation T1 (coverage C_d1, detecting d_1); its escapes d_out1 = d_in2 feed test operation T2 (coverage C_d2, detecting d_2); the remaining defects leave the process as d_out.]

All data in this paper concerning products, defect spectra and defect coverage are not from any real product or production line. They are imaginary examples, used here only to demonstrate the calculation procedures.

3. Estimating the amount of defects

3.1 Main principle

Several models for defect categorization are available, among them PPVS [4], PCOLA/SOQ [5] and MPS [6]. These are not suitable as such for the methods presented in this paper. Instead, this method is based on the categorization used in the actual production line [7]. The same categorization will be used throughout the efficiency analysis procedure. Naturally, the defect groups have to be chosen so that the characteristics of the test technologies become visible. IPC (Association Connecting Electronics Industries) has standardized a defect categorization using three main groups [3]:

1. Termination defects
2. Placement defects
3. Component defects

These three categories are widely used as an industry standard, and they are also used in this presentation.

To see the differences in the defect coverage of various test methods, we also need more detailed defect subcategories. IPC has presented one sub-categorization [2]; when that is used in the production line, it is also good for this analysis. Mainly because of their long manufacturing history, companies in practice use many slightly different ways to classify defects. For presentation purposes we use here a simple sub-categorization (Table 1). The key thing is to use the same categorization also in the efficiency analysis (Chapter 4).

Table 1: Example of defect categorization

1. Termination   1.1 Bridge, 1.2 Insufficient, 1.3 Open, 1.4 Excess, 1.5 Residue, 1.6 Grainy, 1.7 Other
2. Placement     2.1 Missing component, 2.2 Wrong component, 2.3 Misaligned component, 2.4 Tombstone, 2.5 Inverted component, 2.6 Other
3. Component     3.1 Electrically dead, 3.2 Tolerance defect, 3.3 Other

In the following chapters we present two methods for making defect estimates for a new product. The choice between them depends on how much we know about the coming product and how detailed the available quality data of the production line is. The models presented in the following chapters are:

Model 1: Complexity-based estimate
Model 2: Package-specific estimate

3.2 Complexity-based estimate

Even though we do not know the BOM (Bill Of Materials) of the product, a design engineer can estimate the complexity of the board, based on experience, targets and the selections that are already available. The complexity of the new product, from a production testing point of view, can be characterized by:

1. The number of components
2. The number of terminations

In this method we first use production DPMO figures to estimate the total number of defects in the three main categories. After that, the known defect spectrum is used to divide the defects into subcategories. DPMO stands for Defects Per Million Opportunities and is standardized by IPC [2].
We can calculate the estimated number of defects in the product using the formula:

DPU_x = (DPMO_x / 1 000 000) × O_x    (1)

where
DPU = Defects Per Unit
DPMO = Defects Per Million Opportunities
O = number of opportunities per board

The subscript x refers to the defect categories: T = Termination, P = Placement, C = Component.

Example 1 below shows the estimates for an imaginary product; the same product example is used later in this presentation.

Example 1

Production line characteristics:
Termination DPMO 30
Placement DPMO 40
Component DPMO 20

New product characteristics (opportunities):
Components 2 000
Terminations 12 000

Based on the figures above, we obtain the following estimates of defects per unit (DPU):
Termination 0.360
Placement 0.080
Component 0.040

According to this example we can expect a total of 0.48 defects in every manufactured board (or 48 defects in 100 boards). In practice this means that 38 % of the boards are faulty (Y = e^-DPU). To get a defect estimate for each subgroup we use the defect spectrum of the production line. The spectrum has to be presented per defect category, meaning e.g. the share of open terminations of all termination defects (%). Table 2 presents how the DPU figures are converted to defect estimates for each defect subcategory. The defect spectrum used can be seen in the column "Spectrum".
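As a minimal sketch, Formula 1 and the yield estimate of Example 1 can be reproduced in a few lines of Python; the figures are the imaginary ones from the example, and the dictionary names are ours, not the paper's:

```python
import math

# Imaginary production line DPMO figures and opportunity counts from Example 1.
dpmo = {"termination": 30, "placement": 40, "component": 20}
opportunities = {"termination": 12_000, "placement": 2_000, "component": 2_000}

# Formula 1: DPU_x = (DPMO_x / 1_000_000) * O_x
dpu = {cat: dpmo[cat] / 1_000_000 * opportunities[cat] for cat in dpmo}

total_dpu = sum(dpu.values())       # 0.48 defects per board
yield_est = math.exp(-total_dpu)    # Poisson yield Y = e^(-DPU)
faulty_share = 1 - yield_est        # ~38 % of boards contain at least one defect
```

The placement and component categories share the same opportunity count (one opportunity per component), which is why `opportunities` lists 2 000 for both.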

Table 2: Defect estimate

                     DPMO   DPU     Spectrum   DPU per category
1 TERMINATION          30   0.360   100.0 %    0.360
  1.1 Bridge                         29.0 %    0.104
  1.2 Insufficient                   22.1 %    0.080
  1.3 Open                           45.6 %    0.164
  1.4 Excess                          0.3 %    0.001
  1.5 Residue                         0.3 %    0.001
  1.6 Grainy                          2.8 %    0.010
  1.7 Other                           0.0 %    0.000
2 PLACEMENT            40   0.080   100.0 %    0.080
  2.1 Missing                        49.4 %    0.040
  2.2 Wrong                           2.5 %    0.002
  2.3 Misaligned                     27.8 %    0.022
  2.4 Tombstone                      18.5 %    0.015
  2.5 Inverted                        1.2 %    0.001
  2.6 Other                           0.6 %    0.000
3 COMPONENT            20   0.040   100.0 %    0.040
  3.1 Dead                           74.6 %    0.030
  3.2 Tolerance                      25.4 %    0.010
  3.3 Other                           0.0 %    0.000

3.3 Package-specific estimate

From a production quality point of view, component packages differ: some are more difficult to solder properly than others, and some have more assembly problems due to the shape of the component. Production methods and machinery, as well as the competence of people, create differences between production lines. Quality forecasting can be done simply by using one figure for all the various defects of one component package [8], but that level is too general when we want to use the estimate to see differences in test coverage. One component-package-based defect estimation method has been presented by NEMI [1]. In that method the estimates are calculated from a structural DPMO per joint, a structural DPMO per component and an electrical DPMO per component for every component present on the board. The package-specific defect estimate in this paper is instead based on the whole defect spectrum of each component: we use the actual production defect data with its full, detailed categorization. We use a defect-category-specific DPMO for every component package (e.g. 0603, QFP240) to calculate the estimated number of defects. When we know how many components of the coming product use a given package, we can estimate the defects in the components of that package type.
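The split from a category-level DPU into the subcategory estimates of Table 2 is a straightforward multiplication by the spectrum shares. A sketch, using the imaginary termination spectrum from Table 2:

```python
# Category-level DPU (from Formula 1) and the termination defect spectrum,
# i.e. each subcategory's share within the category, as in Table 2.
termination_dpu = 0.360
spectrum = {
    "bridge": 0.290, "insufficient": 0.221, "open": 0.456,
    "excess": 0.003, "residue": 0.003, "grainy": 0.028, "other": 0.000,
}

# Subcategory estimate = category DPU * spectrum share.
sub_dpu = {defect: termination_dpu * share for defect, share in spectrum.items()}
# e.g. the "open" estimate is 0.360 * 45.6 %, about 0.164 as in Table 2
```

Since the spectrum shares of a category sum to 100 %, the subcategory estimates sum back to the category DPU (up to rounding of the published percentages).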
The estimated number of a certain defect type in a certain component package can be calculated as:

DPU_xy = n × (DPMO_xy / 1 000 000) × O_x    (2)

where
n = number of packages per board
DPU = Defects Per Unit
DPMO = Defects Per Million Opportunities
O_x = opportunities per component

The subscript x refers to the defect categories (T = Termination, P = Placement, C = Component) and the subscript y to the defect subcategories such as Bridge, Insufficient, etc.

The estimated defects of all the various component packages in the product are summed category by category. The calculation procedures presented here are not suitable for low-volume production, because the reliability of the defect data depends heavily on the volume of each component type. There are cases where a new product uses a component package that has not been used earlier, so no statistics are available. Experience with nearly similar components, and industry data on the new package, has to be used to define default DPMO figures for new packages. Defect estimates could also be produced for each individual component item, i.e. separate calculations for each resistor in an 0603 package with a different value, power or accuracy. For the defect group "Component defects", and especially for active components, this can give even better estimates than the package-specific view. On the other hand, the volume of each component is smaller in this approach, which in turn can decrease the accuracy of the estimate.

4. Test efficiency

4.1 Defect coverage

Producing a defect coverage estimate for the technologies used in process testing (also called structural testing) is a fairly uncomplicated task. For easy and reusable modelling, it is important to keep the generic defect coverage of the test method separate from the product-specific testability. In the calculations, these two factors are combined through simple multiplication:

C_D = C_M × C_T    (3)

where
C_D = Defect coverage
C_M = Coverage of the method
C_T = Coverage of the product (testability)

Thus, the generic defect coverage may be, for example, 80 %. However, if the product's testability (e.g.
access) is not good and the testability coverage is only 50 %, the final defect coverage is no more than

40 %. Testability coverage is discussed in more detail in Chapter 4.2.

Defect coverage figures can be obtained from the literature and from manufacturers of test equipment [9, 10]. Although these data provide a good starting point and offer a reference for comparison, it is important to have your own coverage estimates based on the systems and processes you actually use. Technologies develop and processes change at such a high rate that forming your own opinion on defect coverage is essential. A practical study or data analysis in your own production environment is an invaluable source of information on actual defect coverage (C_D) [9]. To create default defect coverages for the technologies in your production lines, you need external opinions, internal data and discussion. After that you can produce defect coverage estimates such as those presented in Table 3.

Table 3: Example of defect coverage estimates

                   HVI    ICT    AOI    AXI    BSCAN
1.1 Bridge         80 %   90 %   70 %   90 %   100 %
1.2 Insufficient   60 %    0 %   90 %   80 %     0 %
1.3 Open           50 %   80 %   80 %   90 %   100 %
1.4 Excess         60 %    0 %   90 %   80 %     0 %
1.5 Residue        80 %    0 %    0 %   20 %     0 %
1.6 Grainy         60 %    0 %   50 %   70 %     0 %
1.7 Other           0 %    0 %    0 %    0 %     0 %
2.1 Missing        80 %   80 %   90 %   80 %   100 %
2.2 Wrong          50 %   80 %    0 %   10 %   100 %
2.3 Misaligned     70 %    0 %   90 %   90 %     0 %
2.4 Tombstone      50 %   80 %   90 %   90 %    20 %
2.5 Inverted       30 %   80 %   30 %   10 %   100 %
2.6 Other           0 %    0 %    0 %    0 %     0 %
3.1 Dead            0 %   30 %    0 %    0 %   100 %
3.2 Tolerance       0 %   30 %    0 %    0 %    20 %
3.3 Other           0 %    0 %    0 %    0 %     0 %

4.2 Testability

Testability coverage refers in this paper to the success of a particular product's testability design (access). Commercial tools are available that offer methods for analyzing testability coverage, but they are not very useful in the early design phase, when no circuit diagram or layout files exist. At that stage you have to estimate testability coverage based on experience with existing products and the plans for the new product.
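Formula 3 is simple enough to capture directly; a minimal sketch, using the 80 % / 50 % example from above (the function name is ours):

```python
def final_coverage(method_coverage: float, testability_coverage: float) -> float:
    """Formula 3: final defect coverage C_D = C_M * C_T."""
    return method_coverage * testability_coverage

# An 80 % generic method coverage combined with only 50 % testability
# coverage (poor access) gives a final defect coverage of 40 %.
c_d = final_coverage(0.80, 0.50)
```

Keeping C_M and C_T separate, as the paper recommends, means the same method coverage table (Table 3) can be reused across products, with only C_T re-estimated per design.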
The main factors used in the testability coverage of the various test technologies are:

1. Automatic Optical Inspection (AOI) and Human Visual Inspection (HVI):
   components: non-visible components / all components;
   terminations: non-visible solder joints / all joints
2. Automatic X-ray Inspection (AXI): shadowed joints / all joints
3. Boundary Scan Testing (IEEE 1149.1): BSCAN components / all components
4. In-Circuit Testing (ICT): nodes with test pin access / all nodes

4.3 Test process efficiency

Test efficiency can be presented using defect coverage: how many of all the defects in the unit are detected. It is important to remember that for a decent estimate of test efficiency, we have to calculate the defects and the defect coverage for each defect group. The final test efficiency (E) can be calculated after summing the defects of the defect groups together:

E (%) = (Σ D_xy / Σ DPU_xy) × 100 %    (4)

The subscript x refers to the defect categories (T = Termination, P = Placement, C = Component) and the subscript y to the defect subcategories such as Bridge, Insufficient, etc. DPU is here the defect estimate, and D, the number of detected defects, is obtained by a simple calculation for each defect subcategory:

D = DPU × C_D    (5)

where C_D is the final defect coverage, including both the coverage of the method (C_M) and the testability coverage of the product (C_T) (Formula 3). An example of a complete test efficiency analysis of one test operation is presented in Table 4.

Table 4: Complete example of an efficiency analysis for one test operation (AOI)

                     DPU     AOI     incl. testability   D (detected)   E (escaped)
1 TERMINATION        0.360                               0.225          0.135
  1.1 Bridge         0.104   70 %    56.0 %              0.058          0.046
  1.2 Insufficient   0.080   90 %    72.0 %              0.057          0.022
  1.3 Open           0.164   80 %    64.0 %              0.105          0.059
  1.4 Excess         0.001   90 %    72.0 %              0.001          0.000
  1.5 Residue        0.001    0 %     0.0 %              0.000          0.001
  1.6 Grainy         0.010   50 %    40.0 %              0.004          0.006
  1.7 Other          0.000    0 %     0.0 %              0.000          0.000
2 PLACEMENT          0.080                               0.062          0.018
  2.1 Missing        0.040   90 %    81.0 %              0.032          0.008
  2.2 Wrong          0.002    0 %     0.0 %              0.000          0.002
  2.3 Misaligned     0.022   90 %    81.0 %              0.018          0.004
  2.4 Tombstone      0.015   90 %    81.0 %              0.012          0.003
  2.5 Inverted       0.001   30 %    27.0 %              0.000          0.001
  2.6 Other          0.000    0 %     0.0 %              0.000          0.000
3 COMPONENT          0.040                               0.000          0.040
  3.1 Dead           0.030    0 %     0.0 %              0.000          0.030
  3.2 Tolerance      0.010    0 %     0.0 %              0.000          0.010
  3.3 Other          0.000    0 %     0.0 %              0.000          0.000
TOTAL                0.480           59.9 %              0.288          0.192

In most cases, you need to analyze the efficiency of the board test process with more than one

technology. In such cases the output (escapes) of the previous operation is used as the input of the next one (Figure 1). As an example, Table 5 shows a complete efficiency analysis with two successive test operations. The complete efficiency E of the case presented in Table 5 can be calculated using Formula 4:

E = (0.288 + 0.100) / 0.480 × 100 % = 80.8 %

Table 5: Complete analysis of a process using two test methods

                             --------- 1 AOI ---------   --------- 2 ICT ---------
Group                DPU     C_D      D (det.)  E (esc.)  C_D      D (det.)  E (esc.)
1 TERMINATION        0.360            0.225     0.135              0.080     0.05479
  1.1 Bridge         0.104   56.0 %   0.058     0.046     81.0 %   0.037     0.00873
  1.2 Insufficient   0.080   72.0 %   0.057     0.022      0.0 %   0.000     0.02228
  1.3 Open           0.164   64.0 %   0.105     0.059     72.0 %   0.043     0.01654
  1.4 Excess         0.001   72.0 %   0.001     0.000      0.0 %   0.000     0.00028
  1.5 Residue        0.001    0.0 %   0.000     0.001      0.0 %   0.000     0.00099
  1.6 Grainy         0.010   40.0 %   0.004     0.006      0.0 %   0.000     0.00597
  1.7 Other          0.000    0.0 %   0.000     0.000      0.0 %   0.000     0.00000
2 PLACEMENT          0.080            0.062     0.018              0.009     0.00836
  2.1 Missing        0.040   81.0 %   0.032     0.008     72.0 %   0.005     0.00210
  2.2 Wrong          0.002    0.0 %   0.000     0.002     72.0 %   0.001     0.00055
  2.3 Misaligned     0.022   81.0 %   0.018     0.004      0.0 %   0.000     0.00422
  2.4 Tombstone      0.015   81.0 %   0.012     0.003     72.0 %   0.002     0.00079
  2.5 Inverted       0.001   27.0 %   0.000     0.001     72.0 %   0.001     0.00020
  2.6 Other          0.000    0.0 %   0.000     0.000      0.0 %   0.000     0.00049
3 COMPONENT          0.040            0.000     0.040              0.011     0.02920
  3.1 Dead           0.030    0.0 %   0.000     0.030     27.0 %   0.008     0.02177
  3.2 Tolerance      0.010    0.0 %   0.000     0.010     27.0 %   0.003     0.00743
  3.3 Other          0.000    0.0 %   0.000     0.000      0.0 %   0.000     0.00000
TOTAL                0.480            0.288     0.192              0.100     0.09235
YIELD                        75.0 %                       90.5 %
Product-specific coverage for terminations:  AOI 80 %, ICT 90 %
Product-specific coverage for components:    AOI 90 %, ICT 90 %

4.4 Functional Testing (FT)

Estimating the coverage of functional testing (or performance testing) is more complicated and requires rather elaborate calculations.
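The chaining of test operations described above (escapes of one operation feed the next, Formulas 4 and 5) can be sketched in a few lines. The figures below are a reduced two-subcategory version of the illustrative values in Tables 4 and 5, not the full tables:

```python
def analyze(dpu_in, coverage):
    """Formula 5 per subcategory: detected D = DPU * C_D; the rest escape."""
    detected = {d: dpu_in[d] * coverage.get(d, 0.0) for d in dpu_in}
    escaped = {d: dpu_in[d] - detected[d] for d in dpu_in}
    return detected, escaped

dpu = {"open": 0.164, "dead": 0.030}   # incoming defect estimates
aoi = {"open": 0.64, "dead": 0.00}     # final coverage C_D incl. testability
ict = {"open": 0.72, "dead": 0.27}

det1, esc1 = analyze(dpu, aoi)         # first operation (AOI)
det2, esc2 = analyze(esc1, ict)        # ICT sees only the AOI escapes

# Formula 4: overall efficiency = all detected defects / all incoming defects
efficiency = (sum(det1.values()) + sum(det2.values())) / sum(dpu.values())
```

With the full 16-subcategory spectrum of Table 5 the same loop reproduces the 80.8 % overall efficiency; the reduced example above lands close to that figure.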
If you can estimate the coverage of each defect class in your functional testing, you should use that information in your calculations. However, even when complete coverage information is not available, we know something useful about functional testing: unfortunately, the defect coverage C_D of certain groups is almost always 0 %. With the categorization presented in Table 1, these groups are:

1.2 Insufficient (0 %)
1.4 Excess (0 %)
1.5 Residue (0 %)
1.6 Grainy (0 %)
2.3 Misaligned (0 %)

Using the FT defect coverage figures shown above (0 % coverage), and setting the coverage level of all other groups at 100 %, allows us to derive a special defect coverage model for functional testing (Table 6). This information can then be used for a best-case analysis, which gives us two useful figures:

1. The lowest possible yield in FT
2. The lowest possible E (escapes) after FT

Table 6: Defect coverage for functional testing to be used in best-case analysis

                   FT
1.1 Bridge         100 %
1.2 Insufficient     0 %
1.3 Open           100 %
1.4 Excess           0 %
1.5 Residue          0 %
1.6 Grainy           0 %
1.7 Other          100 %
2.1 Missing        100 %
2.2 Wrong          100 %
2.3 Misaligned       0 %
2.4 Tombstone      100 %
2.5 Inverted       100 %
2.6 Other          100 %
3.1 Dead           100 %
3.2 Tolerance      100 %
3.3 Other          100 %

The lowest possible yield and escapes are useful information in terms of economy, quality and capacity.
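The best-case FT analysis can be sketched as follows. The DPU figures are made-up escapes entering FT, the blind groups are those listed above, and the yield formula applies the same Y = e^-DPU relation used earlier in the paper to the detectable defects only (an assumption on our part; the paper does not spell out this step):

```python
import math

# Subcategories that functional testing practically cannot detect (Table 6).
FT_BLIND = {"insufficient", "excess", "residue", "grainy", "misaligned"}

def ft_best_case(dpu_in):
    """Lowest possible FT yield and escapes, assuming 100 % coverage
    for every detectable group and 0 % for the blind ones."""
    detectable = sum(v for d, v in dpu_in.items() if d not in FT_BLIND)
    escapes = sum(v for d, v in dpu_in.items() if d in FT_BLIND)
    lowest_yield = math.exp(-detectable)  # only boards free of detectable defects pass
    return lowest_yield, escapes

# Made-up defect estimates entering FT after the structural test process.
y_min, e_min = ft_best_case({"open": 0.017, "insufficient": 0.022, "grainy": 0.006})
```

A real FT yield noticeably above `y_min` is the warning sign discussed in Chapter 5: the actual coverage may be lower than the best-case model, with escapes correspondingly higher.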

5. Using the model

Before using the methods presented here widely, it is necessary to do some tuning with practical examples. A complete analysis of products already in production verifies the model's performance. The results are used to adjust the estimated factors and to ensure that the input data is accurate enough. This way you also become familiar with the reliability of the analysis in your own manufacturing environment.

Selecting the best test process for a new product necessitates a comprehensive analysis. This involves calculating the technical feasibility of the available technologies with the presented model. The outputs of the technical analysis are used to produce a capacity estimate and an economic analysis. Pertinent questions that must be answered include how many test systems, fixtures and test applications are needed, how much personnel is required, and what the effects of testing on warranty costs are.

The estimates of detected defects, escaped defects and yields are needed again when production starts. All mismatches between estimated and actual results must be investigated and explained. Special attention should be paid to functional testing: if the yield figures are much better than in the best-case analysis (Table 6), it is possible that the FT coverage is worse than expected and the level of escapes is high. Differences between the real process and the estimates can indicate problems in design, materials, test systems or the manufacturing process. Early identification of these problems is essential when introducing a new product.

6. Conclusions

The methods presented here are focused on analyzing the efficiency of the test process in the early development phases, when the design is not yet ready. This enables powerful adoption of concurrent engineering practices in the project. As the development project proceeds, the calculations can be updated with more detailed data on the product's structure and testability coverage. Efficiency is not the only result of the analysis.
The amount of escapes from the test process represents the outgoing quality, and is therefore a key measure describing the complete manufacturing and test process. The produced quality estimates can be used to indicate potential yield problems in the product. Efficient testing is not the only way to improve quality; component selections can also be re-evaluated, and improvement needs in the manufacturing process can be identified.

The presented methods can easily be taken into use with any test technology if a reasonable estimate of defect coverage can be produced. Thus, the approach has been successfully used for AXI (Automatic X-ray Inspection), AOI (Automatic Optical Inspection), ICT (In-Circuit Testing) and Boundary Scan testing (IEEE 1149.1). The usability of human visual inspection (HVI) can also be justified with this method, and it has a certain value in estimating the need for Functional Testing (FT).

The downside of the presented model is that it is not very usable for partial testing, for example a process where AXI is used to inspect only BGA terminations while ignoring all other joints. Although such cases can be calculated with the methods presented in this paper, the procedures are more complicated because the product has to be divided into virtual sub-modules. Consequently, these special cases fall outside the focus of this paper.

7. References

[1] NEMI Manufacturing Test Strategy Cost Model User's Guide, http://www.nemi.org, 2003
[2] Std IPC-9261, In-Process DPMO and Estimated Yield for PWAs, www.ipc.org, 2002
[3] Std IPC-7912, Calculation of DPMO and Manufacturing Indices for Printed Board Assemblies, www.ipc.org, 2000
[4] TestWay, www.aster-ingenierie.com
[5] K. Hird, K. P. Parker, B. Follis, Test Coverage: What Does It Mean when a Board Test Passes?, Proceedings of International Test Conference, 2002
[6] W. Rijckaert, F. de Jong, Board Test Coverage, The value of prediction and how to compare numbers, Proceedings of International Test Conference, 2003, pp. 190-199
[7] J. Antila, M. Moilanen, Test Efficiency Analysis with Basic Quality Data, European Test Symposium, 2005
[8] E. Juuso, T. Jokinen, J. Ylikunnari, K. Leiviskä, Quality Forecasting Tool for Electronics Manufacturing, 2000, ISBN 951-42-5599-2
[9] S. Oresjo, What to consider when selecting the optimal test strategy, www.agilent.com
[10] C. Robinson, A. Verma, Optimizing Test Strategies During PCB Design For Boards With Limited ICT Access, Telecom Hardware Solution Conference & Exhibition, 2002