Brian,
Unfortunately, I must disagree with part of your first paragraph. Certainly, in the three standards I work to, voltage is applied during the entire test (not counting the initial chamber stabilization phase). These three standards and the applicable sections are:

1) Nortel Networks "Corrosiveness of Soldering Fluxes"  Section 2.5.4 states "A bias voltage of 45-50 volts is applied for 4 days."

2) Telcordia's GR-78-CORE, section 13.1.3.2.5 "After the stabilization measurement, apply a dc bias of 45V to 50V to all parallel conductors during the entire conditioning period."

3) IPC-TM-650, section 5.3.5 "Connect the 45-50V DC voltage source to the specimen points to apply the bias voltage to all specimens."

It would seem to me that applying a voltage under stressful conditions (35 °C, 85% RH) could constitute an attempt to provoke a failure.

Brian, one may not like it that people call what I described above SIR, but since the majority of the electronics community does so, I think it is a fait accompli.

I have never heard of an instrument that applies the test voltage (as opposed to the bias voltage) to all test points at once.

For electromigration, the Nortel Networks spec, the Telcordia spec, and the soon-to-be-reissued IPC electromigration spec all call for a bias voltage of 10 V DC, which is lower, not higher, than that used in SIR. One other huge North American company that I know of claims that an even lower bias voltage (but still not zero) should be used.

Back in the pre-nineties, I think it was Bell Labs that did the work showing that if one lets the test coupons acclimatize for 24 hours at 35 °C/85% RH or for 96 hours at 85 °C/85% RH, it makes no difference how the chamber ramps (temperature ahead of humidity, or one or more dew point conditions on the way to the test condition). Another alternative, suggested by IPC-TM-650, is a controlled ramp: the chamber is set to low humidity, ramped to temperature, and then ramped to humidity (a rough sketch of the dew-point reasoning follows).
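
For anyone who wants to see why the ramp order matters, here is a back-of-the-envelope sketch in Python. It is not from any of the specs above: it uses the Magnus dew-point approximation (the a/b constants are one common choice), and the ramp profiles and the one-step coupon temperature lag are assumptions of mine purely for illustration.

    import math

    def dew_point_c(temp_c, rh_percent):
        # Magnus approximation for dew point over water;
        # a, b are one common set of empirical constants.
        a, b = 17.62, 243.12
        gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
        return b * gamma / (a - gamma)

    def condensation_risk(profile):
        # Crude model: the coupon's surface temperature lags one step
        # behind the chamber's dry-bulb temperature. Condensation is a
        # risk whenever the chamber dew point exceeds the coupon temperature.
        coupon = profile[0][0]
        for temp, rh in profile[1:]:
            if dew_point_c(temp, rh) > coupon:
                return True
            coupon = temp
        return False

    # Hypothetical ramp profiles to 35 C/85% RH: (chamber temp C, RH %).
    temp_first = [(25, 40), (30, 40), (35, 40), (35, 60), (35, 85)]
    together   = [(25, 40), (30, 60), (35, 85)]

    print("temperature first:", condensation_risk(temp_first))  # False
    print("ramped together:  ", condensation_risk(together))    # True

Under these made-up numbers, raising humidity while the coupon still lags in temperature pushes the chamber dew point above the coupon surface, which is exactly what the controlled ramp (temperature first, humidity last) avoids.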

regards,
Bev Christian
Nortel Networks