TECHNET Archives

February 1997

TechNet@IPC.ORG

Date: Mon, 10 Feb 1997 12:46:32 -0500 (EST)
Dear TechNet,

My response last week to Mr. Gordon Davy's reliability-related question was
rather acidic, and I don't believe that Gordon deserved the firestorm I
generated.  Such a response is not my usual nature.  Having had a better
weekend, and sunny weather today, let me take an alternate approach to the
questions posed.  I apologize for the length, but there are some things that
need to be said, prompted by similar questions in past months.

The issue is the use of hard reliability data in the formation of
industry-standard specifications and guidelines vs. levels set by committee
consensus or technical opinion.

I think every IPC technical staff member and every committee chair would
agree that specification levels and pass/fail requirements should be based
on hard, objective data, published for the world to see.  As Susan Mansilla
has pointed out, if you have been in the IPC long enough, you will have been
bitten by levels set on the <smarmy smile> "trust me" approach.  There are
three issues at work that need to be examined:

(1)  Is the data available?  (keyword being available)
(2)  How applicable is the data?
(3)  What other factors come into play on spec/standard development?

In the IPC-SM-840 (solder mask) document, which started my firestorm, we had
the choice of staying with the conventional IPC Class 1, 2, 3 designations
or going to Class H and Class T.  Yes, we had specific data from two
round-robin studies, enough to make decisions on, but they were not
specifically designed as a Class 1, 2, 3 vs. Class H, T experiment.  They
did, however, give the technical experts sufficient information to make an
informed choice.  The technical experts (not marketing people) also
indicated that in reality, there is very little difference in masks between
Class 2 and Class 3, or Class H and Class T.  The only real difference is
the testing required for each class and the pass/fail requirements.  So it
did not make sense to stay with a convention that was not applicable.
Common sense should prevail over convention (in my opinion).  By no means
was this an easy decision, or a universally accepted one.  Such is the case
for many specifications:  Level X is set by a 7-6 vote in committee.

Much of the specification information was based on test data.  The SM-840
committee includes 6-8 mask vendors who volunteer their time to support the
document.  When these vendors qualify a solder mask, test data is available.
It is not always shared, but it contributes to the technical experts'
understanding of the materials issues involved.  These same experts (e.g.
Larry Fisher) are the ones who deal with their customers when the customers
are having a problem processing Mask X.  They often do testing to get to the
root of the problem.  Data is available - to them.  This gets to the issue
of the availability of the data.

Take the following scenario.  Corporation XYZ has a problem with Mask X and
calls in Mr. Fisher.  Testing is performed, data is generated.  Mr. Fisher,
being an honorable individual, holds all of that data as highly confidential
since he is certain that Corporation XYZ does not want information on its
failures circulated to the industry in general.  Mr. Fisher has the
knowledge, but cannot share the data.

I can relate to this situation.  I consult on a daily basis on problems with
manufacturing processes.  I see every conceivable way to screw up a
manufacturing process.  Much of the data that I see IS hard field failure
data, traced back to specific manufacturing parameters or materials.  How
happy would my customers be to find me circulating data from their 40% field
failure rates?  How long do you think it would be before their LEGAL
department had me drawn, quartered, diced and pureed?  Like the above
scenario, I have the information to make informed decisions, but can neither
publish nor share the hard data.  The best I can do is genericize the data,
as I do in my Circuits Assembly column.

I would not support footnoting technical consensus items with notes
indicating that the consensus was based on unpublished data or on a
technical opinion.  I would find such a statement to be of no value, since
it would occur in numerous places throughout any specification.  In addition,
many specification levels are set as a "best estimate at the time", hoping to
get inputs on relevant data during the spec drafting process or during the
time the spec is in force.

So, if we cannot publish or make known this kind of very valuable data, what
are the alternatives?  There are academic partnerships, ARPA research grants,
industry consortia, and technical papers.  Each source has its advantages
and disadvantages.  The first three often take significant outlays of either
time, personnel, or money.  The Low Residue Soldering Task Force yielded some
excellent data that went into many of the assembly and joining specs and
standards.  Care to hazard a guess at the tens of thousands of dollars spent
or the thousands of man hours put in?  Will your consortia publish the data
to non-consortia members?  How soon?  Ever applied for an ARPA grant?  Got a
few years to kill?  Ever dealt with an ivory-towered academic?

As far as technical papers go, do you have the time?  I have lots of
information about SIR.  Three to four day tests, little checks here and
there, mini-studies to get me a quick answer, but not enough to publish.  My
day is already packed, with no time to publish this other work.  Many of my
counterparts in industry are now multi-tasked after corporate reductions.
When you are doing two jobs already, it is difficult to tackle a third job
as publisher.  Dave Hillman is another example.  Dave knows far more about
solderability than I could ever hope (or desire) to know.  He gets it from
similar side studies.  But publishing the work often takes time he doesn't
have.

However, when Dave volunteers his time, just as other IPC participants do,
that kind of information works its way into specifications.  Does it make for
a sound spec?  Yes.  Is the data published?  No, and don't hold your breath.
None of the IPC committees have been so arrogant as to believe that the
drafts they pull together are the end of the line.  That is why all
specification drafts are circulated to wider and wider circles, looking for
those with dissenting opinions and, more importantly, hard dissenting data.
If there is something you see that you don't agree with, by all means,
suggest an alternative.  If you have data to support your claim, excellent.
We'd love to see it.  As an old IPC adage goes, "In God we trust; all others
bring data."  All task group chairmen welcome new blood and more helpers.

Often, the IPC committees are vexed over what levels to specify.  Take SIR
levels as an example.  Historically, we have used 85C/85% RH for many years
now.  Data by John Sohn indicates that perhaps 65/90 is better for low
solids fluxes.  I don't totally agree with John, but we have debated the
issues (amiably).  Is John's study enough evidence to make specification
changes?  How much data must be generated to change the direction of an
industry?  This summer, when contemplating the specifics of Appendix D in
J-STD-001B, we had some data to make decisions, but some considered that we
did not have enough "specific" data, or "enough" data, to make a broad-based
change.

The ideal solution would be to have a designed experiment, run by the
volunteer committee members, to specifically answer the questions before the
spec drafting team.  OK, will your company pony up the money to support
this?  Will your company provide the manpower or resources to do it?  Do
your people know how to do the testing in question repeatably and reliably?
Can you get it done in the next two months so we can have results by the
next IPC meeting?  What's in it for the company?  Who will analyze the data?
It should be a good statistician.  Got one you can loan?

Example:  John Sohn has had a designed experiment on the table since his last
published paper that would answer many questions we have in the SIR realm
regarding test environment.  AT&T went through massive shake-ups and became
Lucent.  Anyone who has been through such an experience knows that most
productive work goes on hold until things settle down.  Dr. Karen Tellefsen,
Alpha Metals, has John’s test substrates, but is buried by her own work and
cannot do the testing.  Same for me.  Should the testing go to some other
facility to accomplish the task?  It could.  Are there many whom I would
trust with industry-defining studies?  Not many.  Got the money it would
take to fund the test?

The whole point that I am trying to get at is that good, hard, reliable
test data is

(1) damn hard to find,
(2) often unavailable for a variety of reasons, and
(3) not immediately applicable to the point being made.

This is why I sometimes fly off the handle when people who are not involved
with the IPC volunteer process suggest that there is no hard data to back up
the specification levels.

I understand the "wondering" of people who would like to know how such
levels came to be, or why Specification X took direction Y.  The question is
asked innocently, and with the best of intentions.  We all realize this.
Often, the only answer lies in the collective memories of the task group
members, many of whom have now retired, so the information is lost.  I have
advocated generating a technical paper to go along with various
specs and standards documenting why Specification X took direction Y.  Jim
Maguire and I did such an effort with IPC-TR-467.  It documents why
J-STD-001B, Appendix D is the way it is.  The J-STD-001 Handbook committee is
taking all of the information removed from J-1A and expanding it, so the
collective intelligence is not lost to ensuing generations.  I can see more
of this occurring in the future.

Mr. Davy suggested that we could possibly relate reliability to the end-use
environment.  Good in theory, extremely difficult in practice.  I like the
present definitions: Class 1, if it goes out, it's a minor annoyance; Class
2, it's a hassle, but no one dies; Class 3, if it goes out, people die.  The
consideration of end-use environment should be in the design and
ruggedization of the assemblies - do I conformally coat, do I build a
hermetic box, do I use redundant circuits, etc.  I might call a portable
computer an office item, except when taking it from a -40C car in northern
Wisconsin into a heated office.  Can you say condensation?  For any proposed
class structure, I could find numerous exceptions.  We would be adding
classes to the IPC structure till the cows come home (I'm from Wisconsin;
they come home about 5:00 p.m.).  The present system deals with the
consequences of failure, and that should be the primary consideration.

In general, reliability itself is a very slippery topic, dependent on
sooooooo many factors.  And when we try to put such concepts into a
specification, readable by a broad and often less-educated audience, the
task becomes nigh on impossible.  Keep in mind that you cannot write to a
college Ph.D. level; you have to write for the average reader, a high school
education level at best.  I've tried, and I'm a good technical writer, and I
can't do it.  I've read the military specifications on defining reliability,
and frankly, I was baffled trying to get the specifics.  If you understand
them, more power to you.

Having rambled on far too long as it is, I will make a general invitation
to everyone on TechNet.  The IPC semi-annual meeting is in San Jose, March
8-13.  Take part in the volunteer process.  If something about a spec bugs
you, participate.  If you have data contrary to the spec levels, share it.
If you can't spend the travel dollars to attend, get on the mailing list.
We are doing more and more things electronically by e-mail and FTP sites.
ANYONE can participate, and you will be welcomed with open arms.

I welcome any rebuttals or questions.

Doug Pauls
Contaminated Studies Labs
[log in to unmask]


***************************************************************************
* TechNet mail list is provided as a service by IPC using SmartList v3.05 *
***************************************************************************
* To unsubscribe from this list at any time, send a message to:           *
* [log in to unmask] with <subject: unsubscribe> and no text.        *
***************************************************************************
* If you are having a problem with the IPC TechNet forum please contact   *
* Dmitriy Sklyar at 847-509-9700 ext. 311 or email at [log in to unmask]      *
***************************************************************************