
Cover Story

From Data to Decisions

Chris A. Mack, Lithography Consultant
John Robinson, KLA-Tencor Corporation

The value of metrology data is explored conceptually by describing the systematic progression from the data to a decision made in the fab. The Knowledge Hierarchy, a conceptual framework for understanding the increasing value of data as it becomes information, then knowledge, then a decision, is introduced. Carefully spelling out every step in the decision making process allows for an understanding of where the weak links in the chain are located, and which improvements will have the greatest impact on overall decision quality. This framework then allows one to properly assess the relationship between data quality and decision quality, and work towards systematically improving decision quality.

Introduction

How does one quantify the value of a metrology tool? Obviously this is a commonly asked question for both producers and purchasers of metrology equipment. While there are many answers and approaches used for specific applications of metrology data, there are some common themes that apply to all metrology value statements. Thinking about these commonalities over the last several years, we have developed a framework for understanding the value of metrology that we call “from data to decisions.”

A carpenter friend of ours is fond of saying, “Nobody wants a 1/4" drill. They want a 1/4" hole.” The drill is the most effective tool for getting the hole they really want, but the value comes from the hole. Likewise, nobody wants a metrology tool. What they really want is:

• A process that’s in control and is predictable
• Lower rework rates
• Better bin sort (device performance)
• Faster ramp to high yield
• Sustained higher yields
• Quick detection and elimination of yield crashes or potential yield crashes

Metrology tools are just an effective means to achieve these primary goals of profitable semiconductor manufacturing. Obviously, if one is to express, and hopefully quantify, the value that a metrology tool adds to a fab, then one must clearly link the immediate use of the tool (collecting data of some sort) to the final goals of that use (improved fab profitability).




The Knowledge Hierarchy

Before tackling the problem of how to understand the value of metrology in a wafer fab, a few preliminary concepts and terms should be clarified. What is the difference between data and information? Between information and knowledge? How are these concepts related? Data, information, knowledge, and finally acting on that knowledge to make a decision, form a chain of increasing value we call the Knowledge Hierarchy.

[Figure: The Knowledge Hierarchy: DATA → INFORMATION → KNOWLEDGE → DECISION]

To understand this hierarchy let’s define and give examples for each step. Data are, of course, the raw numbers or images provided by the measurement tool (from the Latin, “the thing given”): a collection of numbers with units and with known uncertainty (that is, known precision and accuracy). Information, on the other hand, is data in context. Information includes sufficient details of what was measured, where, and when, so that it can be easily discerned from other similar collections of data. It is organized and accessible. Information may also include the filtering out of extraneous bits of the data or distilling the numbers down as much as possible (reporting a mean and standard deviation, for example). Knowledge is an interpretation of the information based on an understanding (that is, a model) of cause and effect. Whereas information answers the question of “what”, knowledge answers the question “why”. Finally, decision means acting on the knowledge obtained. In our fab context, this means acting with the intention of improving the fab’s bottom line (improving yield, bin sort, etc.).


The table below provides a CD metrology example that clarifies the distinctions between data, information, knowledge, and decisions. Suppose a product lot uses a standard sampling plan where some number of CD measurements are made. The resulting collection of numbers, with their associated uncertainties (either measured or assumed), is the data. As you might imagine, a collection of numbers out of context is next to useless. Thus, when the context is added (the targets were nominally 90 nm isolated lines in resist arranged along the scanner slit) and statistically interpreted (determining that the variation was statistically different from most other lots), the data becomes information. Already, the transformation of the data to information can be immensely useful. But information alone is not enough. What is causing the systematic variation in CDs across the slit? In Table 1, the targets have been specifically designed for sensitivity to focus errors. Adding other information (a separate measurement to monitor dose errors) and a previously calibrated model of how these targets vary with focus and exposure enables us to assign a cause to the variation seen in the data: there was a -80 nm focus tilt across the slit. Further, the model also allows us to estimate how much a process change might reduce the CD variation along the slit. This is knowledge. And it is powerful knowledge, because it allows us to understand what actions will cause what benefits. It allows us to make a decision.

From this example we can see that data adds value to the fab only when it moves up the Knowledge Hierarchy and enables a decision to be made. The value to the fab is in the decision – or, more properly, in making the correct decision. Thus, the best way to judge the value of the data is to judge the value of the decision that the data enables. But we’re getting ahead of ourselves. First we must understand how to systematically move up the Knowledge Hierarchy from data to decisions.

| Concept | Definition | Example (from CD metrology) |
|---------|------------|-----------------------------|
| DATA | A collection of numbers, calibrated and with known repeatability | The measured CDs are 96.2 ± 0.9 nm, 94.4 ± 0.8 nm, etc. |
| INFORMATION | The right data, at the right time, in the right context, organized for access | Isolated 90 nm lines vary systematically by 6.5 nm across the slit for this lot |
| KNOWLEDGE | Interpretation of information to assign cause and effect | A -80 nm focal tilt adjustment would reduce the systematic CD variation across the slit from 6.5 nm to 2.1 nm |
| DECISION | Acting on the knowledge with an expectation for benefit | Let’s make the focus tilt adjustment before the next lot is run because we believe it will have a positive, noticeable impact on yield |

Table 1: Knowing how to reduce CD variation.
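To make the progression in Table 1 concrete, here is a minimal sketch of the data-to-knowledge steps in code. The slit positions, CD values, and focus sensitivity below are invented for illustration (they are not the article’s data), and a real fab would use a fully calibrated focus-exposure model rather than this single-sensitivity shortcut.

```python
import numpy as np

# Hypothetical CD measurements (nm) at field positions across the scanner
# slit (mm). Invented numbers, chosen only to mimic the Table 1 scenario.
slit_x = np.array([-12.0, -8.0, -4.0, 0.0, 4.0, 8.0, 12.0])
cd = np.array([96.2, 95.1, 94.4, 93.0, 91.9, 90.8, 89.7])

# DATA -> INFORMATION: put the numbers in context and distill them.
print(f"CDs vary systematically by {cd.max() - cd.min():.1f} nm across the slit")

# INFORMATION -> KNOWLEDGE: interpret with a calibrated cause-and-effect model.
# Assume these focus-monitor targets have a calibrated sensitivity of
# 0.08 nm of CD change per nm of defocus (an assumed value, for illustration).
cd_per_nm_focus = 0.08
slope, intercept = np.polyfit(slit_x, cd, 1)   # nm of CD per mm of slit
focus_tilt = slope / cd_per_nm_focus * (slit_x.max() - slit_x.min())
print(f"Implied focus tilt across the slit: {focus_tilt:.0f} nm")

# KNOWLEDGE -> DECISION: estimate the benefit of correcting the tilt.
residual = cd - (slope * slit_x + intercept)
print(f"Correcting the tilt would leave ~{np.ptp(residual):.1f} nm of variation")
```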


Moving up the Knowledge Hierarchy

The details of the Knowledge Hierarchy are very use-case dependent, with the most generic elements near the bottom and the most use-case specific elements at the decision stage. Data is characterized by accuracy and precision specifications that, in many cases, apply to a wide range of applications. Once the context is added to make information, however, we already have one or more decisions in mind. Targets to be measured are optimized for maximum sensitivity to some variables and minimum sensitivity to others, in order to be most useful in moving up the Knowledge Hierarchy. Knowledge is extremely use-case dependent, working towards a specific decision. Thus, it seems that a systematic approach for defining a Knowledge Hierarchy must be top down: start with the question that you want to answer, the decision that you want to make.

Let’s consider an example. One of the most common and important use cases for parametric metrology data is lot dispositioning at photolithography: measure representative wafers from a lot after lithography but before etch to see if the lot should be reworked. Thus, the driving question (decision) is, “Should this lot be reworked?” Important and related questions are, “If this lot is reworked, what should be done differently?” and “Can we feed any information forward to the etch step that will improve the final results?” Let’s pick the first question, and see what is involved in making the decision. To do this, we must methodically list every step in the sequence of steps that go into that decision. Below is an attempt at a fairly exhaustive listing of activities that go on, from start to finish, when making the lot go/no-go decision for the case of overlay (a minimal sketch of the core disposition logic appears after the list).


I. Preparation

A. Define the metrology tool to be used
   • Required precision, accuracy, and throughput
B. Design the measurement target
C. Define the within-field sampling plan
D. Put measurement targets in the chip design
   • Scribe kerf, interdie streets, within die
E. Define the full sampling plan
   • Fields per wafer, wafers per lot, lot frequency
F. Create measurement recipe
G. Create analysis recipe
   • May include reticle data, lens distortion map
H. Create overlay spec for lot pass/fail
   • Spec is intended to reflect device yield/performance
   • Spec may depend on, or influence, the sample plan, tool specs, and analysis approach
I. Define the process (action plan) that applies the spec in production

II. Measurement

A. Print wafers
B. Transport wafers to overlay tool and load
C. Select wafers to measure
   • May be manual or from host or recipe
D. Make measurements
E. Perform analysis (usually automatically)
F. Upload measurement and analysis results to host or third-party system

III. Analysis Method

A. Method 1: compare raw data statistics to spec
B. Method 2: compare modeled results (model coefficients, modeled max error, overlay-limited yield) to specs
C. Method 3: SPC-like analysis (check for out-of-control condition)
D. Apply some combination of the above methods
E. Assess the quality of the data

IV. Decision Regimes

A. Obvious pass – send the lot on
B. Obvious fail – rework the lot
C. Gray-area options
   • Consider the gray area as failure – rework the lot
   • Shrink the gray area
     – Make more measurements (repeat on the same points, increase the sample), possibly on a different tool
     – Change the measurement algorithm for greater precision
   • Apply human judgment (last resort)

V. Decision Post-Mortem

A. For reworked lots, how have things improved?
   • Measure reworked lots
   • Compare new measurements to old
   • Did corrections work as expected?
B. For reworked lots, what is the root cause of the problem?
   • What process changes would reduce the rework rate?
   • Are the processes and tools in control?
C. For passed lots, are things OK downstream?
   • Correlation of overlay results to yield
   • Is the expected failure rate obtained?
D. Can the overall dispositioning process be improved?
   • Relate results to fab metrics (yield, cycle time, throughput, CoO)
   • Time to results
   • Measurement costs
   • Cost of a bad decision
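The core of steps III and IV can be sketched in a few lines of code. The function name, the spec, and the gray-band width below are invented for illustration (in a real fab they come from step I.H), and only Method 1 (raw statistic versus spec) is shown.

```python
import numpy as np

# Hypothetical thresholds (nm); real specs come from step I.H above.
OVERLAY_SPEC = 15.0   # pass/fail spec on the maximum overlay error
GRAY_BAND = 2.0       # half-width of the gray area around the spec

def disposition(max_overlay_error: float) -> str:
    """Method 1 from section III (raw statistic vs. spec), followed by the
    decision regimes of section IV."""
    if max_overlay_error < OVERLAY_SPEC - GRAY_BAND:
        return "pass: send the lot on"
    if max_overlay_error > OVERLAY_SPEC + GRAY_BAND:
        return "fail: rework the lot"
    # Gray area: shrink it with more data before resorting to human judgment.
    return "gray: re-measure (more points or another tool), then re-disposition"

# Usage with invented per-field overlay errors (nm):
errors = np.array([8.3, 11.9, 14.6, 9.7])
print(disposition(errors.max()))   # falls in the gray area for these numbers
```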





Generally, there are five basic steps in the making of a standard production decision: preparation, measurement, analysis, decision, and post-mortem. Preparation (or planning) is often underestimated in terms of its importance to the overall quality of the decision to be made. In particular, the sampling plan and the design of the measurement target will have a profound impact on the quality of the decision, often far in excess of the measurement uncertainty itself. Proper preparation allows data to be seamlessly turned into information. Of course, the measurement itself is important – no amount of planning or analysis can make up for uncertainty or inaccuracy in the raw data. But a focus on measurement tool specifications out of the context of the decision to be made has little value.

Analysis of the data (turning information into knowledge) assigns a probable cause to what is happening on the wafer. In overlay, it assigns correctables: if the lot had been run with these different settings, this is how much better things would have turned out. The decision is made by relating the knowledge of what could be done better with what is effectively a cost analysis: is it worth it to make a process change (e.g., to rework the lot)? The final step, after the decision has been made and implemented, is the post-mortem. Have we learned anything from this experience that we can use to do things better next time? As the example in the next section illustrates, there is a direct correlation between the basic way in which fab decisions are made and the process of moving up the Knowledge Hierarchy.


[Figure: The five-step decision process mapped onto the Knowledge Hierarchy (DATA → INFORMATION → KNOWLEDGE → DECISION). PREPARATION: system to turn data into information. MEASUREMENT: create data. ANALYSIS: turn information into knowledge. DECISION: turn knowledge into action. POST-MORTEM: improve the data-to-decision process.]
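The “correctables” produced by the analysis step can be made concrete with a small sketch. A commonly used linear model for inter-field overlay expresses the error at each field position as translation, scale (magnification), and rotation terms; fitting it by least squares yields coefficients whose sign-reversed values become the correctables. The measurement values below are invented for illustration, and real overlay analysis uses richer models than this one.

```python
import numpy as np

# Invented overlay errors (dx, dy in nm) measured at field centers (x, y in mm).
x = np.array([-40.0, 0.0, 40.0, -40.0, 0.0, 40.0])
y = np.array([30.0, 30.0, 30.0, -30.0, -30.0, -30.0])
dx = np.array([3.1, 5.0, 6.8, 1.2, 2.9, 5.1])
dy = np.array([-2.0, 0.1, 2.2, -1.8, 0.0, 1.9])

# Linear inter-field model: dx = Tx + Mx*x - R*y,  dy = Ty + My*y + R*x
# (translation T, scale M, rotation R). Solve both axes jointly.
n = len(x)
A = np.zeros((2 * n, 5))                 # columns: Tx, Ty, Mx, My, R
A[:n, 0] = 1.0; A[:n, 2] = x;  A[:n, 4] = -y
A[n:, 1] = 1.0; A[n:, 3] = y;  A[n:, 4] = x
b = np.concatenate([dx, dy])

coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ coeffs
print("Tx, Ty, Mx, My, R =", np.round(coeffs, 4))
print(f"max residual after applying correctables: {np.abs(residual).max():.2f} nm")
```

The fit residual answers the key question directly: if the lot were rerun (or reworked) with the sign-reversed coefficients applied, this is how much overlay error would remain.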

Assessing the value of data

Given a thorough understanding of how data moves up the Knowledge Hierarchy to become a decision, we now have a framework for how to assess the value of metrology. Data is valuable only insofar as it affects the quality of a decision. (Note that deciding to make no changes to the process is still an important decision to make, one that should be made actively, not passively.) If the process of making a decision is systematized (as is often the case in our increasingly automated fabs), it will be possible to make a quantitative correlation between data quality and decision quality.

For example, for the case of a go/no-go lot dispositioning decision we break decision errors into alpha and beta errors. An alpha error (also called a type I error) is when we decide to rework a good lot. A beta (or type II) error is when a bad lot is not reworked. Each error has its own unique costs in the fab. The cost of making a bad decision is the cost of an alpha error multiplied by the probability that an alpha error will occur, plus the cost of a beta error times its probability of occurrence. The lowest overall cost occurs when the ratio of the probabilities of alpha errors to beta errors is equal to the ratio of the costs of beta errors to alpha errors. It is always easy to decrease one type of error at the expense of the other, so good metrology and analysis will strive to both lower and balance the two types of errors.

Given a systematic analysis and decision-making method, one can relate the quality of the data and the actual problem occurrence probability to the probability of making alpha and beta errors. In that way, real dollar values can be assigned to measurements and to measurement specifications. When determining correctables, data uncertainty can be related directly to uncertainty in the correctables (which then can be related to the value of the reworked lots, as well as the changing problem occurrence probability). Again, it is possible to use the systematic decision-making process to quantify the impact of different measurement specifications, target quality, sampling plans, and data analysis models on the quality of the correctable decisions being made.

Quantifying the relationship between data quality and decision quality provides a tool for assessing the decision process as well. Given a certain data quality, what sampling, target design, and analysis method provides the needed decision quality? Often metrologists focus on improving measurement precision when decision quality can more easily be improved with better preparation or analysis.
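Written out, the cost statement above reads as follows, with C the per-lot cost and P the probability of each error type; the balance condition is simply the point where the two expected-cost terms are equal:

```latex
E[\text{cost}] = C_\alpha P_\alpha + C_\beta P_\beta ,
\qquad \text{balanced when} \quad
\frac{P_\alpha}{P_\beta} = \frac{C_\beta}{C_\alpha}
\iff C_\alpha P_\alpha = C_\beta P_\beta .
```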



Conclusions

Metrology is valuable, which is why we make measurements. The value, however, can only be quantified by first describing the systematic progression from the data to a decision, then assessing the relationship between data quality and decision quality. The Knowledge Hierarchy is a conceptual framework for understanding the increasing value of data as it becomes information, then knowledge, then a decision. It maps directly to the actual steps used in a fab when metrology data drives decisions. Carefully spelling out every step in the decision making process allows for an understanding of where the weak links in the chain are located, and which improvements will have the greatest impact on overall decision quality. Metrology data only “adds value” to the wafers when it moves up the hierarchy to become a valuable decision.


Author’s note

Anybody who works for a metrology company has no doubt heard this refrain from a customer at some point or another: unlike process tools, metrology tools do not “add value” to the wafer. I’ve never quite understood what this comment means. Certainly it can’t mean that metrology tools aren’t valuable; otherwise, why would people buy them? I think it means that the way in which metrology “adds value” to the wafer is more indirect, and thus harder to express in simple, short sentences. During my tenure at KLA-Tencor, I thought about this problem, and with my colleague John Robinson, developed a framework for understanding the value of metrology that we call “from data to decisions.”

– Chris Mack

Biographies

Chris Mack was Vice President of Lithography Technology for KLA-Tencor from 2000 to 2005. He currently writes and consults in Austin, Texas.

John Robinson, Director of Marketing and Applications Development at KLA-Tencor, received his Ph.D. in physics from the University of Texas at Austin in 1995. Based in Austin, Texas, Robinson joined KLA-Tencor in 1997, and has been responsible for overlay, optical CD, and CD-SEM analysis products.


