Question: Why is two to three seconds the criterion for determining a material's wetting level? Wouldn't permanent wetting be a better criterion?
Answer: Actually, ASTM Std. D2578 and ISO 8296 both specify 2 seconds as the timeframe for evaluation, but we usually suggest 2 to 3 seconds, as most testers seem more comfortable when a brief range is specified rather than a single instant in time. The timeframe is partly a historical artifact and partly a consequence of the basis of the test, which derives from the behavior of receding contact angles.
As the test fluid is applied to the surface, it is spread over a given area – usually about a square inch (roughly 6.5 square cm) – either in a line or as a block. The results of the test are based on how (and whether) the fluid film reticulates (shrinks) into individual beads, and this de-wetting process naturally takes a finite amount of time to reach a balance.
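The interpretation logic can be sketched in code. This is a minimal illustration, not part of either standard: the function name, the dyne levels, and the hold times are all hypothetical, and real testing involves visual judgment rather than measured durations. The idea is simply that the reported surface energy is the highest dyne level whose film remains continuous at the 2-second mark.

```python
# Hypothetical sketch of interpreting a dyne-test series against the
# 2-second criterion. All names and numbers here are illustrative.

def surface_energy_estimate(readings, hold_threshold_s=2.0):
    """Return the highest dyne level (mN/m) whose film stayed wetted
    for at least `hold_threshold_s` seconds, or None if none did.

    `readings` maps fluid dyne level -> seconds the applied film
    remained continuous before reticulating into beads.
    """
    passing = [dyne for dyne, t in readings.items() if t >= hold_threshold_s]
    return max(passing) if passing else None

# Example series on a hypothetical substrate: lower-tension fluids hold
# longer, and de-wetting becomes rapid once the fluid's dyne level
# exceeds the substrate's surface energy.
readings = {34: 10.0, 36: 5.0, 38: 2.5, 40: 1.0, 42: 0.3}
print(surface_energy_estimate(readings))  # → 38
```

In this sketch the substrate would be reported as roughly 38 dynes/cm, since the 40 dyne fluid de-wets before the 2-second mark.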
I believe the 2-second timeframe was originally established to balance the effect of evaporation (the lower-surface-tension component of the test fluids evaporates more readily) against the effect of de-wetting per se. In other words, if you wait longer, evaporation starts to have more of an effect, which induces greater de-wetting. The idea is to evaluate at the moment when surface forces dominate the interaction between the test fluid and the substrate. When the test was first developed some 60 years ago, the 2-second timeframe was established as a way to standardize interpretation and meet this goal. Judging solely from empirical evidence, it appears to be an effective specification, as no serious alternatives have been suggested.
Like most questions regarding surface energy testing, while the question may be simple, the answer often is not!