Monday 18 December 2017

Engineers Turn to Automated Test Equipment to Save Time

http://www.readydaq.com/content/blog/engineers-turn-automated-test-equipment-save-time
With engineers rushing tests in order to hit tight product deadlines, the market for test equipment that automatically detects faults in semiconductors and other components is growing.
Setting aside time for testing has been a struggle for electrical engineers, and the shrinking size and increasing complexity of semiconductor circuits are not making life any easier. Nearly 15% of wireless engineers outsource final testing, and more than 45% outsource manufacturing, the stage at which most semiconductor testing takes place.
Almost 65% of survey respondents said that testing remains a time-consuming challenge. New chips designed for tiny connected sensors and autonomous cars also require rigorous testing to ensure reliability.
Tight deadlines for delivering new products are pushing engineers toward automated test equipment, also known as ATE, which quickly identifies defects in semiconductors, especially those used in smartphones, communication devices, and consumer electronics.
The global automated test equipment market is estimated to reach $4.36 billion in 2018, up from $3.54 billion in 2011, according to Transparency Market Research, a technology research firm.
Automated test equipment is used extensively in semiconductor manufacturing, where the integrated circuits on a silicon chip must be tested before the chip is prepared for packaging. It cuts down the time it takes to test increasingly complex chips, which incorporate higher speeds, greater performance, and higher pin counts. Automatic testing also helps locate flaws in system-on-chips, or SoCs, which often combine analog, mixed-signal, and wireless parts on the same silicon chip.


Saturday 16 December 2017

Semiconductor Testing


http://www.readydaq.com/content/blog/semiconductor-testing

Automated test equipment (ATE) is computer-controlled test and measurement equipment that allows testing with minimal human interaction. The tested device is referred to as the device under test (DUT). The advantages of this kind of testing include reduced testing time, repeatability, and cost efficiency at high volumes. The chief disadvantages are the upfront costs of programming and setup.
Automated test equipment can test printed circuit boards and interconnections and perform verification. It is commonly used in wireless communication and radar. Simple ATEs include volt-ohm meters that measure resistance and voltages in PCs; complex ATE systems have several mechanisms that automatically run high-level electronic diagnostics.
ATE is used to quickly confirm whether a DUT works and to find defects. When the first out-of-tolerance value is detected, the testing stops and the device fails.
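As a rough illustration of that stop-on-first-failure behaviour, here is a minimal Python sketch of a test-step runner. The step names, limits, and stubbed measurement hooks are hypothetical and not tied to any particular ATE vendor's API.

# Minimal sketch: each measurement is checked against its tolerance window,
# and the DUT fails as soon as one value falls outside its limits.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TestStep:
    name: str
    measure: Callable[[], float]   # hypothetical hook into an instrument read
    low: float                     # lower tolerance limit
    high: float                    # upper tolerance limit

def run_dut(steps: List[TestStep]) -> bool:
    for step in steps:
        value = step.measure()
        if not (step.low <= value <= step.high):
            print(f"FAIL: {step.name} = {value} outside [{step.low}, {step.high}]")
            return False           # first out-of-tolerance value stops the test
        print(f"PASS: {step.name} = {value}")
    return True

# Stubbed measurements stand in for real instrument reads.
steps = [
    TestStep("supply current (mA)", lambda: 12.3, 10.0, 15.0),
    TestStep("output voltage (V)", lambda: 3.28, 3.20, 3.40),
]
print("DUT PASS" if run_dut(steps) else "DUT FAIL")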

Semiconductor Testing

For ATEs that test semiconductors, the architecture consists of a master controller (a computer, such as an industrial PC) that synchronizes one or more source and capture instruments. The DUT is physically connected to the ATE by a machine called a handler or prober, and through a customized Interface Test Adapter (ITA), often via a mass interconnect, that adapts the ATE's resources to the DUT.
When testing packaged parts, a handler places the device on a customized interface board; silicon wafers are tested directly with high-precision probes.
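The division of labour between controller, instruments, and handler can be pictured with a small sketch. The class names and the stubbed instrument calls below are hypothetical, chosen only to mirror the roles described in this paragraph.

# Hypothetical classes mirroring the roles above: a master controller
# synchronizes a source instrument and a capture instrument, and a handler
# stands in for the physical connection to the DUT.
class SourceInstrument:
    def apply(self, stimulus_v: float) -> None:
        print(f"source: applying {stimulus_v} V to the DUT")

class CaptureInstrument:
    def read(self) -> float:
        return 3.3  # stubbed reading; a real instrument driver would go here

class Handler:
    def load_dut(self, dut_id: str) -> None:
        print(f"handler: placing {dut_id} on the interface board")

class MasterController:
    def __init__(self, source, capture, handler):
        self.source, self.capture, self.handler = source, capture, handler

    def run(self, dut_id: str, stimulus_v: float) -> float:
        self.handler.load_dut(dut_id)      # connect the DUT
        self.source.apply(stimulus_v)      # step 1: drive the DUT
        return self.capture.read()         # step 2: capture the response

controller = MasterController(SourceInstrument(), CaptureInstrument(), Handler())
print("measured:", controller.run("DUT-0001", 5.0), "V")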

Test Types

Logic Testing

Logic test systems are designed to test microprocessors, gate arrays, ASICs and other logic devices.
Linear or mixed-signal equipment tests components such as analog-to-digital converters (ADCs), digital-to-analog converters (DACs), comparators, track-and-hold amplifiers, and video products. These components incorporate features such as audio interfaces, signal processing functions, and high-speed transceivers.
Passive component ATEs test passive components including capacitors, resistors, inductors, etc. Typically, testing is done by the application of a test current.
Discrete ATEs test active components including transistors, diodes, MOSFETs, regulators, TRIACs, Zener diodes, SCRs, and JFETs.
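To make the test-current idea above concrete, here is a small hedged sketch of how a passive-component check might look: drive a known current, measure the voltage drop, and compare the computed resistance with the part's tolerance band. The function names and the example values are illustrative only.

# Sketch of a resistance check: apply a known test current, measure the
# voltage drop, and verify R = V / I against the nominal value and tolerance.
def measure_resistance(test_current_a: float, measured_voltage_v: float) -> float:
    return measured_voltage_v / test_current_a  # Ohm's law

def within_tolerance(measured_ohms: float, nominal_ohms: float, tol_pct: float) -> bool:
    return abs(measured_ohms - nominal_ohms) <= nominal_ohms * tol_pct / 100.0

# Example: a nominal 1 kΩ, 5 % resistor driven with 1 mA; 1.02 V measured.
r = measure_resistance(1e-3, 1.02)  # -> 1020 Ω
print(f"{r:.0f} Ω", "PASS" if within_tolerance(r, 1000.0, 5.0) else "FAIL")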

Printed Circuit Board Testing

Printed circuit board testers include manufacturing defect analyzers, in-circuit testers, and functional analyzers.
Manufacturing defect analyzers (MDAs) detect manufacturing defects, such as shorts and missing components, but can't test digital ICs because they test with the DUT powered down (cold). As a result, they assume the ICs are functional. MDAs are much less expensive than other test options and are also referred to as analog circuit testers.
In-circuit analyzers test components that are part of a board assembly; the components under test are "in a circuit" and the DUT is powered up (hot). In-circuit testers are very powerful but are limited by the high density of tracks and components in most current designs: the test pins must be placed very accurately to make good contact. They are also referred to as digital circuit testers or ICT.
A functional test simulates an operating environment and tests a board against its functional specification. Functional automatic test equipment (FATE) has fallen out of favor because the equipment has not kept up with the increasing speed of boards, which creates a lag between the board under test and the manufacturing process. There are several types of functional test equipment, and they may also be referred to as emulators.

Interconnection and Verification Testing

Test types for interconnection and verification include cable and harness testers and bare-board testers.
Cable and harness testers are used to detect opens (missing connections), shorts (unwanted connections), and miswires (wrong pins) on cable harnesses, distribution panels, wiring looms, flexible circuits, and membrane switch panels with commonly used connector configurations. Other tests performed by automated test equipment include resistance and hipot (high-potential) tests.
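A simplified way to picture harness verification, assuming the tester reports continuity as a set of pin-to-pin connections: comparing that report with the expected wiring list flags opens (missing connections) and extras (shorts or miswires). The connector and pin names below are made up for the example.

# Compare measured continuity against the expected wiring list.
def check_harness(expected: set, measured: set) -> dict:
    return {
        "opens": expected - measured,    # expected connections that are missing
        "extras": measured - expected,   # unexpected connections (shorts/miswires)
    }

expected = {frozenset({"J1-1", "J2-1"}), frozenset({"J1-2", "J2-2"})}
measured = {frozenset({"J1-1", "J2-1"}), frozenset({"J1-2", "J2-3"})}  # pin 2 miswired

report = check_harness(expected, measured)
print("Opens:", [sorted(c) for c in report["opens"]])
print("Extras:", [sorted(c) for c in report["extras"]])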
Bare-board automated test equipment is used to verify the completeness of a PCB's circuits before assembly and wave soldering.

Wednesday 6 December 2017

Exploiting LabVIEW Libraries


Have you ever viewed a LabVIEW VI Hierarchy and become frustrated with not being able to locate a VI you needed to open?
Do you have large applications composed of similar modules but fear to jump, with both feet, into the learning curve of LVOOP?
Did you ever try to duplicate a sub-VI at the start of a new set of functions and find yourself deep in a nest of cross-linked VIs, or save a VI only to realize that the most suitable name has already been used?
Then using LabVIEW Libraries may be useful to you.
Libraries are a feature available in a LabVIEW project, or they can be created stand-alone*. They have a number of features that let you specify shared properties and attributes of related VIs and custom controls.
In short, many of the features of LVOOP are available without the complications required for Dynamic Dispatching. The remainder of this document will serve as a tutorial that demonstrates how to create, define, and clone a library. Additional notes are included to illustrate how these features can be exploited to help you develop more robust applications that are easier to support than applications that do not use libraries.
*Libraries can be created stand-alone from the LabVIEW splash screen using the method:
File >>> New … >>> Other Files >>> Library
You can create a new library from the project by right-clicking the “My Computer” icon and selecting “New >>> Library”. Save it to a unique folder that will contain all of the files associated with the library.
Open the properties screen and then the icon editor to compose a common icon for the library and its members.
Take a little time to create the icon because it will be shared by all of the members of the library. Do not get carried away and fill up the entire icon; leave some white space so that the icons of the component VIs can be customized to illustrate their role in the functionality of the library.
Create virtual folders in the library to help organize the VIs contained in it. I usually use three folders but you can use more or less depending on your needs and preferences. I use one to hold the controls, and another pair for the public and private VIs. I do not use auto-populating folders for a number of reasons.
I can control which VIs are included and which are not. Occasionally temporary VIs are created to do some basic testing and they are never intended to be part of the library. If functionality changes and the temporary VI breaks due to the change, the library may cause a build to fail due to the broken VI.
I can easily move a VI from private to public without having to move the VI on disk and then update source code control to reflect the change.
I can keep the file paths shorter using the virtual folders while maintaining the structure of the project.
Additional virtual folders can be added if you want to further break down the organization of the VIs in the library. If you are developing a library that will be used by other developers or distributed as a tool, you may want to include a folder for the VIs that define the API your library offers. The API can also be divided into additional virtual folders to break the interface down into functional areas if you wish. Implement the logical grouping of sub-VIs as needed for your library.
Set the Access Scope of the private virtual folder to private. While the private folder and the access-scope setting are optional, taking advantage of them will help you and the users of your library identify which VIs are not intended for use outside of the library. Attempting to call a private-scoped VI from outside the library breaks the calling VI, making it very obvious that the VI is not intended for public use.
Developing applications using libraries differs little from developing without them; aside from one exception, there is no additional work to use them. The exception is illustrated in Figure 8, where the name of the VI is highlighted. While the VI named in the project is shown as “Init_AI.vi”, the actual name of the VI is “DAQ.lvlib:AI.lvlib:Init_AI.vi”. The difference is the result of what is called “Name Mangling”: the actual name of the VI is prefixed by the names of the libraries that own it. This is a powerful feature that goes a long way toward avoiding cross-linking and lets us easily clone a library to use as the starting point of a similar library.
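As a rough analogy (in Python, not LabVIEW, and with made-up names), qualified names are what keep two same-named items from colliding: each member's full name is prefixed by the libraries that own it.

# Toy stand-in for library name mangling: a library prefixes its members'
# names with its own qualified name, so short names can repeat safely.
class Library:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

    def qualified(self, member_name: str) -> str:
        prefix = self.parent.qualified(self.name) if self.parent else self.name
        return f"{prefix}:{member_name}"

daq = Library("DAQ.lvlib")
ai = Library("AI.lvlib", parent=daq)
ao = Library("AO.lvlib", parent=daq)

# Two VIs can share the short name "Init.vi" without colliding, because the
# fully qualified name includes the owning libraries.
print(ai.qualified("Init.vi"))  # DAQ.lvlib:AI.lvlib:Init.vi
print(ao.qualified("Init.vi"))  # DAQ.lvlib:AO.lvlib:Init.vi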
The Save As screen for the library will not only let us define the library name but also where in the project the library will be placed. This is handy for nested libraries but not critical; libraries can be moved around in the project, or between libraries, as needed using the project window. When a library is cloned using Save As, all of the VIs contained in the original library are duplicated and re-linked to the VIs in the new library. There is NO chance of cross-linking when cloning a library!
Libraries can help in all phases of an application from initial development to long-term support through to knowledge transfer. Remember, “Libraries” are your friend!

LabVIEW Improvements



LabVIEW passed its 30-year anniversary in 2016, and six months ago National Instruments launched a considerably updated version of LabVIEW: its next-generation LabVIEW NXG 1.0.
LabVIEW NXG is a totally reworked version of LabVIEW. Because it has been rebuilt from the ground up, it offers a considerably improved level of performance, and users see significant gains from the new code.
LabVIEW NXG offers some significant improvements over the previous implementation of LabVIEW:
  • Plug & Play: a lot of work has gone into enabling LabVIEW NXG to provide easy setup of hardware connections; it has true plug-and-play functionality.
  • IDE: The LabVIEW NXG environment has been totally overhauled, borrowing elements of popular commercial software to make the environment more intuitive.
  • Tutorials: To help newcomers get up to speed quickly, LabVIEW NXG has built-in walk-throughs and other integrated learning facilities. These have been shown to greatly reduce the time it takes for newcomers to program proficiently in LabVIEW. It is even possible to undertake a number of standard tasks without “hitting the code.”
National Instruments will run both the traditional LabVIEW (LabVIEW 2017, launched alongside the new next-generation LabVIEW NXG) and LabVIEW NXG; ultimately, once full compatibility has been established, the two will converge, enabling users to benefit from the new streamlined core.
Users of LabVIEW will be given access to both LabVIEW 2017 (and later versions) and LabVIEW NXG, so they can choose whichever version suits their application best.
National Instruments spokespeople stressed that the traditional development line of LabVIEW will continue to be maintained so that the large investment in software and applications that users have is not at risk. However, drivers and many other areas are already compatible with both lines.
“Thirty years ago, we released the original version of LabVIEW, designed to help engineers automate their measurement systems without having to learn the esoterica of traditional programming languages. LabVIEW was the ‘nonprogramming’ way to automate a measurement system,” said Jeff Kodosky, NI co-founder and business and technology fellow, known as the ‘Father of LabVIEW.’
“For a long time, we focused on making additional things possible with LabVIEW, rather than furthering the goal of helping engineers automate measurements quickly and easily. Now we are squarely addressing this with the introduction of LabVIEW NXG, which we designed from the ground up to embrace a streamlined workflow. Common applications can use a simple configuration-based approach, while more complex applications can use the full open-ended graphical programming capability of the LabVIEW language, G.”