The convergence of new technologies and measurement needs has triggered an evolution in the design of test and measurement (T&M) platforms. For many years, commercial development of T&M systems was driven by the needs of the R&D community. As more and more high-tech products have entered consumer markets, the greatest demand for sophisticated T&M systems has shifted to production applications. In these environments, pressure to produce more at lower cost largely determines test system attributes and the speed at which designs evolve.
While production T&M may not demand sensitivity and accuracy quite as high as R&D applications do, production often has higher throughput requirements. Also, most production test engineers want application-specific solutions; an economical way of getting these is to use PC-based systems. Still, these systems must be capable of adapting quickly to new test requirements brought about by rapid product change. Further complicating matters is a growing demand for more open T&M system architectures and Web-enabled systems that allow data sharing throughout the enterprise.
These are the major demands propelling change in PC-based production test architectures. The two most common architectures are:
1. Systems using PC plug-in boards.
2. Standard hardware modules and instruments connected to a PC.
Some of their features will change more dramatically than others.
Exploiting new test platforms
Plug-in board systems
The biggest change will probably occur in these systems. The driving force appears to be a shift in data bus technology envisioned by Intel and others: to shrink PCs further and increase processing speed, with or without faster clock speeds, they foresee a slotless computing environment. Connection to the outside world has not been fully defined, but could be some form of internal bus extension, USB, IEEE-1394 (Firewire), Ethernet or another high-bandwidth bus.
With no slots, PC plug-in cards would disappear, replaced by external modules or card chassis. To adapt to the new bus configuration and take full advantage of higher bandwidth, the design of cards and modules may also have to change. However, the software giants are working on ways of minimising this disruption with such things as Virtual Interface Architecture and Jini technology, which complement Java and ActiveX controls. Such technology strongly appeals to test engineers who must minimise production outages caused by system redesigns.
For these reasons, a shift to slotless computers (when available) is likely to be slow in production facilities. Witness the fact that there is still a brisk market in ISA plug-in cards, even though PCI cards have been widely available for the past couple of years. The philosophy of "If it ain't broke, don't fix it!" can be very powerful when a production line hums along with minimal problems.
Still, a test system developer considering a plug-in card design should also think about the slow disappearance of PCs that accept these cards. Such a platform will probably be more difficult to implement as time goes on.
Standard hardware modules and instruments connected to a PC
These systems are also controlled by PCs and application-specific software. However, they use external bench-top instruments and related hardware modules to get higher sensitivity, accuracy and other features not possible within the limited real estate of a PC plug-in card. These standard building blocks provide an economical way to create a customised system that more closely fits the specific needs of a production line. Also, the DMMs and switching systems typically used in this type of platform can easily be re-used when test requirements change. There is no need to worry about PC slots disappearing, since these devices are already external.
The most obvious change you can expect in future instruments and external data acquisition modules is the addition of more types of data communication interface, such as USB, Firewire and Ethernet. Currently, the prevalent interfaces are IEEE-488 and RS-232, which have significant speed limitations. The change will be welcomed by test system developers, particularly those who have tried implementing test and measurement over Ethernet. Today, this is nothing at all like plug-and-play, often requiring what seems to be a cryptic, arcane command language and hassling with PC interrupts to avoid hardware conflicts.
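To give a flavour of the low-level, command-string programming the author alludes to, the sketch below talks to a hypothetical SCPI-compatible DMM over RS-232 using Python and the pyserial package. The port name, baud rate, line termination and measurement command are assumptions for illustration, not details taken from the article.

```python
# Minimal sketch: talking to a hypothetical SCPI-compatible DMM over RS-232.
# Assumes the pyserial package and an instrument listening on COM1 at 9600 baud.
import serial

# Open the serial port; a wrong baud rate, parity setting or cable wiring
# typically produces garbage or silence rather than a helpful error message.
dmm = serial.Serial("COM1", baudrate=9600, timeout=1)

try:
    # '*IDN?' is the standard SCPI identification query.
    dmm.write(b"*IDN?\r\n")
    print("Instrument:", dmm.readline().decode(errors="replace").strip())

    # Request a single DC voltage reading (exact command syntax varies by instrument).
    dmm.write(b":MEAS:VOLT:DC?\r\n")
    print("DC voltage:", dmm.readline().decode(errors="replace").strip())
finally:
    dmm.close()
```

Every instrument family has its own command set and line-termination quirks, which is precisely the kind of detail that true plug-and-play interfaces are meant to hide from the test engineer.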
With USB and Firewire, you can expect things to get a lot easier. True plug-and-play should become a reality for test instruments, much as it has for conventional PC peripherals such as printers. A Firewire interface on instruments and the PC, slotless or not, will be very attractive because of its high bandwidth and shared communication bus. A draft of the Firewire (IEEE-1394) industrial instrumentation data communication protocol is moving closer to acceptance, spurred on by the 1394 Trade Association Industrial Instrumentation Work Group. Acceptance could come as early as the third quarter of 1999.
Even more dramatic will be the appearance of Web-enabled instruments, probably based on embedded Windows CE platforms. These should start to appear within the next 12 months or so. The driving force behind these designs is the desire of different enterprise departments to exploit production test data to reduce costs and increase productivity. Sharing such data across secure intranet connections will become commonplace as TCP/IP protocols are embedded in test equipment with high bandwidth communication interfaces.
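As a rough sketch of what embedded TCP/IP makes possible, the fragment below queries a hypothetical LAN-connected instrument over a raw socket. The IP address and port are placeholders (5025 is a common convention for SCPI over raw TCP on many networked instruments) and are not drawn from the article.

```python
# Minimal sketch: querying a hypothetical Web-enabled instrument over TCP/IP.
# The address and port are placeholders; 5025 is a common SCPI raw-socket port.
import socket

INSTRUMENT_ADDR = ("192.168.1.50", 5025)  # hypothetical intranet address

with socket.create_connection(INSTRUMENT_ADDR, timeout=5) as sock:
    # Identify the instrument, then take one reading.
    sock.sendall(b"*IDN?\n")
    print("Instrument:", sock.recv(4096).decode().strip())

    sock.sendall(b":MEAS:VOLT:DC?\n")
    print("Reading:", sock.recv(4096).decode().strip())
```

Because the transport is ordinary TCP/IP, the same reading can just as easily be logged to a shared database or served to a browser anywhere on the corporate intranet.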
Consider what these instruments and the resulting test systems bring to a manufacturer faced with large-scale validation testing. For example, a large number of components and assemblies in a new automotive design must be validated before they are approved for production. To accelerate the process, large batches of sample devices are tested simultaneously in environmental chambers spread across different engineering and development departments. Networked test systems and databases greatly facilitate the selection of compatible components. Authorised personnel can perform detailed analysis on data from any chamber at any time, and take action if any device starts to exhibit reliability problems. There is no need to wait for a batch-processed report to be generated. Similarly, test limits can easily be modified via the network, if appropriate. All this helps shorten the product development cycle and time to market.
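A small sketch of the kind of shared-data analysis described above is shown below. It assumes a hypothetical central results database with a readings table (device ID, chamber, measured value); the schema, table name and test limit are invented for illustration, and SQLite merely stands in for whatever networked database the enterprise actually deploys.

```python
# Minimal sketch: flagging devices that drift outside a test limit in a shared
# results database. SQLite is used as a stand-in; the schema is hypothetical.
import sqlite3

UPPER_LIMIT_V = 5.25  # hypothetical test limit, adjustable without touching the chambers

conn = sqlite3.connect("validation_results.db")
rows = conn.execute(
    """
    SELECT device_id, chamber, MAX(measured_volts) AS worst_case
    FROM readings
    GROUP BY device_id, chamber
    HAVING MAX(measured_volts) > ?
    """,
    (UPPER_LIMIT_V,),
)

for device_id, chamber, worst_case in rows:
    print(f"Device {device_id} in chamber {chamber} exceeded limit: {worst_case:.3f} V")

conn.close()
```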
Spescom MeasureGraph, a subsidiary of Spescom Limited and a leading supplier of test and measurement equipment, is a local distributor of the Keithley range of instruments.
For more information contact Van Zyl Koegelenberg, Spescom MeasureGraph Test Systems, 011 266 1662, [email protected], www.spescom.com
About the author
David Patricy is vice president and general manager of the Instrument Business at Keithley Instruments, in Cleveland, Ohio. Since joining Keithley in 1974, he has held a succession of positions of increasing responsibility, including manufacturing manager, business area manager of Digital Multimeters, and director of manufacturing. He completed his MBA at the Weatherhead School of Management, Case Western Reserve University, and earned his undergraduate degree in Industrial Management at Cleveland State University.