
  • Sampling Approaches to Metrology in Semiconductor Manufacturing

    Tyrone Vincent (1), Broc Stirton (2), and Kameshwar Poolla (3)

    (1) Colorado School of Mines, Golden CO
    (2) GLOBALFOUNDRIES, Austin TX
    (3) University of California, Berkeley CA

  • Approach

    Outline

    1 Approach

    2 Modeling and Identification

    3 Experimental Results

    4 Conclusion


  • Approach

    Context

    the progression of Advanced Process Control (APC) places increased demand on metrology:

    lot-level variation → wafer-level variation → within-wafer variation


  • Approach

    Goal

    decrease the number of measurements needed to determine key features on wafers


  • Approach

    Approach

    develop a correlation model of metrology measurements for a particular process

    conduct an a priori analysis to determine the optimally informative sites that should be measured, by minimizing an expected prediction error at the unmeasured sites

    use a subset of measurement sites together with the correlation model to predict measurements at sites which are not measured


  • Approach

    Prediction Process

    Identification Set: m sites measured

    Prediction Set: q sites measured, m − q sites predicted


  • Modeling and Identification

    Outline

    1 Approach

    2 Modeling and Identification

    3 Experimental Results

    4 Conclusion


  • Modeling and Identification

    Sources of Variation

    L_gate(f, d, k) = L_0 + L_a(k) + L_b(f, d, k) + L_c(f, k) + L_d(d, k) + L_e(d) + L_f(f, d, k)

    [Figure: gate length L_gate vs. position, with the terms attributed to "random" variation, layout-dependent variation, field-level systematic variation and bias, and wafer-level systematic variation and bias.]

  • Modeling and Identification

    Metrology Model 1

    [Figure: a wafer sequence, with each wafer exhibiting a spatial variation pattern across the wafer map.]

    y_k(p) = ȳ(p) + Σ_i C_i(p) x_{i,k} + n_k(p)

    parameters:

    y_k(p) - measurement at position p on wafer k
    C_i(p) - variation basis function
    x_{i,k} - scaling factor for variation function i
    n_k(p) - measurement noise
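    To make the model concrete, here is a minimal numerical sketch of y_k(p) = ȳ(p) + Σ_i C_i(p) x_{i,k} + n_k(p); the site positions, basis shapes, model order, and noise levels are assumptions chosen purely for illustration, not values identified from fab data.

```python
# Minimal sketch of the spatial metrology model
#   y_k(p) = ybar(p) + sum_i C_i(p) x_{i,k} + n_k(p)
# Site positions, basis shapes, dimensions, and noise levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

m, n_states, n_wafers = 13, 3, 200            # sites per wafer, basis functions, wafers
pos = rng.uniform(-1.0, 1.0, size=(m, 2))     # normalized (x, y) site positions

ybar = 0.5 * pos[:, 0] ** 2                   # assumed mean wafer signature ybar(p)
C = np.column_stack([                         # assumed variation basis functions C_i(p)
    np.ones(m),                               # wafer-level offset
    pos[:, 0],                                # tilt in x
    pos[:, 1],                                # tilt in y
])

X = rng.normal(scale=0.2, size=(n_wafers, n_states))   # scaling factors x_{i,k}
N = rng.normal(scale=0.05, size=(n_wafers, m))         # measurement noise n_k(p)

Y = ybar + X @ C.T + N                        # Y[k, j] = y_k(p_j)
print(Y.shape)                                # (200, 13)
```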


  • Modeling and Identification

    Identification Process 1

    vectorized model

    y_k = [y_k(p_1)  y_k(p_2)  · · ·  y_k(p_m)]^T
    x_k = [x_{1,k}  x_{2,k}  · · ·  x_{n,k}]^T
    n_k = [n_k(p_1)  n_k(p_2)  · · ·  n_k(p_m)]^T

    y_k = ȳ + C x_k + n_k

    Case 1: x_k is an i.i.d. random sequence

    identification process: C and the covariance of x_k are identified using Principal Component Analysis (PCA)
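    A minimal sketch of this Case 1 identification, assuming the identification-set measurements are stacked in a wafers-by-sites array Y (for example, the synthetic Y above); the model order n is an assumption.

```python
# Identify ybar, C, and the covariance of x_k by PCA on the identification
# set Y (wafers x sites). The model order n is an assumption.
import numpy as np

def identify_pca(Y, n):
    ybar = Y.mean(axis=0)                     # mean wafer signature
    Yc = Y - ybar                             # centered measurements
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    C = Vt[:n].T                              # m x n basis with orthonormal columns
    scores = Yc @ C                           # per-wafer scaling factors x_k
    Sigma_x = np.cov(scores, rowvar=False)    # covariance of x_k
    sigma_n2 = (Yc - scores @ C.T).var()      # crude scalar noise variance estimate
    return ybar, C, Sigma_x, sigma_n2

# usage: ybar, C, Sigma_x, sigma_n2 = identify_pca(Y, n=3)
```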


  • Modeling and Identification

    Metrology Model 2

    [Figure: a wafer sequence with a spatial variation pattern on each wafer, alongside a plot of measurement vs. wafer number showing the time variation across the sequence.]

    y_k = ȳ + Σ_i C_i(p) x_{i,k} + n_k

    [x_{1,k+1}, ..., x_{n,k+1}]^T = A [x_{1,k}, ..., x_{n,k}]^T + w_k


  • Modeling and Identification

    Identification Process 2

    Case 2: x_k is time-correlated

    x_{k+1} = A x_k + w_k
    y_k = ȳ + C x_k + n_k

    parameters A, C, and the covariance of w_k are identified using Canonical Correlation Analysis (CCA)
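    The CCA-based (subspace) identification named above is more involved than fits on a slide; as a simplified, assumption-laden stand-in, the sketch below fits the state dynamics x_{k+1} = A x_k + w_k by ordinary least squares on the per-wafer PCA scores from the previous sketch.

```python
# Simplified stand-in for the CCA-based identification: fit x_{k+1} = A x_k + w_k
# by least squares on the per-wafer PCA scores (scores[k] ~ x_k).
import numpy as np

def identify_dynamics(scores):
    X0, X1 = scores[:-1], scores[1:]               # pairs (x_k, x_{k+1})
    B, *_ = np.linalg.lstsq(X0, X1, rcond=None)    # solves X0 @ B ~= X1
    A = B.T                                        # so that x_{k+1} ~= A @ x_k
    W = X1 - X0 @ B                                # one-step residuals w_k
    Sigma_w = np.cov(W, rowvar=False)              # covariance of w_k
    return A, Sigma_w
```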


  • Modeling and Identification

    Prediction

    measured data (sites in the set M):

    y_k^M = ȳ^M + C^M x_k + n_k^M

    unmeasured values (sites in the set U):

    z_k^U = ȳ^U + C^U x_k

    the common variable x_k can be estimated from y_k^M and used to predict z_k^U
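    A minimal sketch of this prediction step, reusing ybar, C, Sigma_x, and sigma_n2 from the identification sketch above; the measured-site index list idx_M and the scalar noise variance are illustrative assumptions. x_k is estimated from the measured sites by a minimum mean-square-error (generalized least-squares) formula and then mapped through C^U.

```python
# Predict the unmeasured sites from a measured subset, given the identified
# model (ybar, C, Sigma_x, sigma_n2 from the PCA sketch; idx_M is assumed).
import numpy as np

def predict_unmeasured(y_M, idx_M, ybar, C, Sigma_x, sigma_n2):
    m = C.shape[0]
    idx_U = np.setdiff1d(np.arange(m), idx_M)      # unmeasured site indices
    CM, CU = C[idx_M], C[idx_U]
    # MMSE estimate of x_k from y_k^M = ybar^M + C^M x_k + n_k^M
    S = CM @ Sigma_x @ CM.T + sigma_n2 * np.eye(len(idx_M))
    xhat = Sigma_x @ CM.T @ np.linalg.solve(S, y_M - ybar[idx_M])
    zhat_U = ybar[idx_U] + CU @ xhat               # predicted unmeasured values
    return zhat_U, idx_U
```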



  • Modeling and Identification

    Performance Prediction

    models include uncertainty in process variation and measurement error

    performance prediction possible

    S_U := tr Cov[ẑ_k^U − z_k^U]
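    A sketch of how S_U can be evaluated from the identified model alone, before any new wafer is measured, under the same MMSE-estimator and scalar-noise assumptions as the prediction sketch above.

```python
# Expected prediction error S_U = tr Cov[zhat_U - z_U] for a candidate set of
# measured sites idx_M, computed from the identified model only.
import numpy as np

def expected_error(idx_M, C, Sigma_x, sigma_n2):
    m = C.shape[0]
    idx_U = np.setdiff1d(np.arange(m), idx_M)
    CM, CU = C[idx_M], C[idx_U]
    S = CM @ Sigma_x @ CM.T + sigma_n2 * np.eye(len(idx_M))
    # posterior covariance of x_k after measuring the sites in idx_M
    Sigma_post = Sigma_x - Sigma_x @ CM.T @ np.linalg.solve(S, CM @ Sigma_x)
    return float(np.trace(CU @ Sigma_post @ CU.T))
```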


  • Modeling and Identification

    Measurement Sequencing

    greedy algorithm for minimizing prediction error

    1. Set M = ∅, U = {1, . . . , m}.
    2. For each element i ∈ U, calculate S_{U−{i}}.
    3. Select the element j for which tr S_{U−{j}} is minimum.
    4. Remove j from U and add it to M.
    5. If |M| < q, go to step 2; otherwise quit.
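    A minimal sketch of this greedy selection, reusing the expected_error sketch from the previous slide; q is the per-wafer measurement budget.

```python
# Greedy measurement-site selection following the steps above: repeatedly move
# the site that most reduces the expected error from U to M, until |M| = q.
import numpy as np

def greedy_select(q, C, Sigma_x, sigma_n2):
    m = C.shape[0]
    M, U = [], list(range(m))                      # step 1: M empty, U = all sites
    while len(M) < q:                              # step 5: loop until |M| = q
        # steps 2-3: score each candidate and keep the one with smallest S_U
        best = min(U, key=lambda j: expected_error(M + [j], C, Sigma_x, sigma_n2))
        U.remove(best)                             # step 4: move it from U to M
        M.append(best)
    return M
```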


  • Modeling and Identification

    Implementation

    model requires data - how do we get it?

    re-modeling may be needed - how do we check?


  • Modeling and Identification

    Implementation Process 1

    [Timeline diagram, shown as a progressive build: an initial data set with all features measured is used to identify the model; afterwards only a subset of features is measured while the remaining features are predicted, with lower-rate measurements retained to monitor performance; when the model expires, a data set to renew the model is collected, and the cycle repeats.]

  • Modeling and Identification

    Implementation Process 2

    [Timeline diagram: a variant of the cycle above, again showing an initial data set, measured and predicted features, model expiry, and a data set to renew the model.]

  • Experimental Results

    Outline

    1 Approach

    2 Modeling and Identification

    3 Experimental Results

    4 Conclusion


  • Experimental Results

    Will this work?

    performance is process specific! best case:

    process variation is “low order”
    process variation is “stationary”

    validation requires real process data


  • Experimental Results

    Real Process Data Used for Verification

    poly-gate critical dimension (CD) process data from GLOBALFOUNDRIES Fab1

    1789 wafers with a common litho-etch equipment sequence

    Same feature measured on 18 die.


  • Experimental Results

    Average wafer

    [Figure: map of the average wafer over X and Y position, with measurement sites labeled [1]–[13].]

  • Experimental Results

    PCA variance contribution

    [Left: PCA variance contribution (in (2σ)² units) vs. state index. Right: average prediction error at the unmeasured sites vs. number of measurements per wafer, for model dimensions n = 1, 2, 3, 7, and 13.]

  • Experimental Results

    First four basis elements

    [Figure: wafer maps of the first four basis elements (PCA C matrix, columns 1–4) over X and Y position.]

  • Experimental Results

    Data Splits for Validation

    Full Data Set for Litho i/Etch j

    Split at wafer N_s

    Identification Set: N wafers, m sites measured

    Prediction Set: M wafers, q sites measured, m − q sites predicted
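    A minimal sketch of this validation split, assuming the full data set is a wafers-by-sites array Y and reusing the identification, site-selection, and prediction sketches above; the split point Ns, model order n, and budget q are assumptions.

```python
# Split at wafer Ns: identify the model on the first Ns wafers, select q sites,
# then score the average squared prediction error at the unmeasured sites on
# the remaining wafers.
import numpy as np

def validate(Y, Ns, n, q):
    Y_id, Y_pred = Y[:Ns], Y[Ns:]                 # identification / prediction sets
    ybar, C, Sigma_x, sigma_n2 = identify_pca(Y_id, n)
    idx_M = greedy_select(q, C, Sigma_x, sigma_n2)
    errs = []
    for y in Y_pred:
        zhat_U, idx_U = predict_unmeasured(y[idx_M], idx_M, ybar, C, Sigma_x, sigma_n2)
        errs.append(np.mean((zhat_U - y[idx_U]) ** 2))   # unmeasured sites only
    return float(np.mean(errs))
```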


  • Experimental Results

    Prediction Error vs. Model Window

    [Figure: average prediction error at the unmeasured sites vs. number of measurements per wafer, for identification-set sizes N = 50, 200, and 899.]

    Prediction error with M = 899

    Required size of model set ∼ 200 wafers


  • Experimental Results

    Prediction Error vs. Prediction Window

    [Figure: average prediction error at the unmeasured sites vs. number of measurements per wafer, for prediction-set sizes M = 50 and M = 1598.]

    Prediction Error with N = 200.

    Performance degrades gracefully with prediction window


  • Experimental Results

    Time correlation

    [Figure: average prediction error at the unmeasured sites vs. number of measurements per wafer, comparing the PCA and CCA models for M = 50 and M = 400.]

    No improvement with the more complex model, except for a small number of measurements per wafer


  • Conclusions

    Outline

    1 Approach

    2 Modeling and Identification

    3 Experimental Results

    4 Conclusion


  • Conclusions

    Conclusions

    Intelligent strategies can reduce the expense of metrology without compromising effectiveness

    Reduced sampling is realized through models which predict data at unmeasured sites

    The real challenge: SPC and fault detection

    Why do we use metrology? To tell us if something goes wrong [also closed-loop process control]

    What are optimal sampling strategies that don’t compromise SPC or fault detection?

    Metrics: time to detect a process drift or failure, false alarm rates, etc.

    Supported in part by NSF grant ECS 01-34132, the Berkeley IMPACT program, and DOE programs DE-ZDO-2-30628-07 and DE-FG36-08GO88100.

