Tuesday, February 3, 2009

Gigabit Ethernet: Coming to a Camera Near You

by George Chamberlain, Pleora Technologies

An AOI application based on GigE connectivity opens up a whole new range of configuration options.

For want of lower cost, more flexible alternatives, the automated optical inspection (AOI) industry has been designing its high-performance applications around short-reach, point-to-point image transfer protocols like Camera Link™ and low-voltage differential signaling (LVDS).


LYNX GigE Camera
Courtesy of Imperx

Perhaps that explains the buzz in the machine vision world over new application configurations that leverage the economy and agility of standard Gigabit Ethernet (GigE). In GigE, the industry finally has found an inexpensive, standard transport platform that goes long distances, supports all sorts of networking and processing options, and perhaps most importantly, delivers the high bandwidth needed to stream imaging data from cameras to host PCs in real time.

One measure of the momentum behind GigE is the Automated Imaging Association (AIA) initiative to define GigE Vision, an open standard that will enable seamless interoperability over GigE between cameras and application software from different vendors (see sidebar).

Pleora Technologies provides a range of stand-alone products that allow existing Camera Link, LVDS, or analog cameras to stream data in real time over GigE. In the past year or so, many camera companies, including DALSA, Imperx, JAI-Pulnix, Tattile, and Toshiba-TELI, have unveiled their first GigE camera offerings.

If you're shopping among these wares, it's worth taking a hard look at what's inside. Not all GigE interfaces are created equal. The techniques used to grab, packetize, queue, and transfer image data over the GigE link have a significant impact on the reliability and quality of the image stream.

Instead of a frame grabber, GigE applications use a standard network interface card/chip (NIC) to route image data to PC user memory. To improve NIC streaming efficiency, many GigE cameras come with a special driver. These drivers operate with varying degrees of effectiveness and merit careful inspection.

Another key consideration: If your application has to synchronize and trigger multiple system elements, ensure that these traditional frame-grabber functions are adequately met by internal camera circuitry.

The GigE Interface
The GigE interface sits behind the camera head inside the enclosure. Its main job is to acquire image data, convert it to IP packets, queue it for transfer, and send it over the GigE link. It also must deliver control signals from the GigE link and other inputs to the camera head and handle network functions such as boot-up and packet resend.
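As a rough sketch of that per-frame job, the Python fragment below splits an acquired image buffer into fixed-size payloads, prepends a small header carrying a frame (block) number, packet index, and payload length, and streams the packets over UDP. The header layout, port, and address are assumptions for illustration only and are not the GigE Vision wire format.

# Minimal sketch of a GigE-style packetizer (illustrative; not the GigE Vision format).
import socket
import struct

MTU_PAYLOAD = 1440                  # image bytes per packet, leaving room for headers
HEADER = struct.Struct(">IHH")      # frame (block) id, packet index, payload length

def send_frame(sock, dest, frame_bytes, block_id):
    """Split one acquired frame into packets and stream them over the GigE link."""
    for index, offset in enumerate(range(0, len(frame_bytes), MTU_PAYLOAD)):
        payload = frame_bytes[offset:offset + MTU_PAYLOAD]
        sock.sendto(HEADER.pack(block_id, index, len(payload)) + payload, dest)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = bytes(1600 * 1200)      # one 8-bit 1,600 x 1,200 frame, all zeros
    send_frame(sock, ("192.168.1.100", 5000), frame, block_id=1)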

Two classes of interfaces are offered in today's new GigE cameras: those made from purpose-built hardware and those derived from a software application on an embedded processor. Table 1 summarizes the relative benefits of each approach.


Table 1. Performance Trade-Offs Between Different Classes of GigE Camera Interfaces

In general, purpose-built hardware yields higher-performance, more reliable GigE interfaces that are capable of processing and transmitting image data at the full 1-Gb/s GigE line rate. They easily meet the real-time throughput requirements of most AOI applications, including those involving high-resolution cameras with fast frame rates.

Hardware interfaces operate with deterministic, clock-cycle accuracy suitable for even the most demanding AOI applications. This helps them deliver the low, consistent latency required to achieve superior system-level performance and improve the quality of AOI application results.

Hardware interfaces are compact, which minimizes camera size. And they are power efficient, drawing as little as 2.0 W regardless of how much data they pump out.

GigE interfaces derived from software can achieve deterministic operation at accuracies suitable for many AOI applications as long as they use a real-time operating system (RTOS) and a processor with decent clock speeds. However—and this is a key point—the power consumption of these interfaces is directly related to how much image data they process.

Analyses indicate that, even with a power-efficient processor, a software interface would draw up to 25 W to drive the throughput of a 1,600 × 1,200 sensor operating at 50 f/s. This is just not practical. Cameras for imaging applications operate at well below 10 W.
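The data rate behind that example is easy to check. Assuming an 8-bit monochrome sensor (an assumption; the article does not specify bit depth), a 1,600 × 1,200 image at 50 frames/s works out to roughly 96 MB/s, or about 0.77 Gb/s of payload before packet overhead, which is why the interface, hardware or software, must sustain most of the GigE line rate:

# Payload data rate for a 1,600 x 1,200 sensor at 50 frames/s (8 bits/pixel assumed).
width, height, bytes_per_pixel, fps = 1600, 1200, 1, 50

bytes_per_second = width * height * bytes_per_pixel * fps
bits_per_second = bytes_per_second * 8

print(f"{bytes_per_second / 1e6:.0f} MB/s")                 # 96 MB/s
print(f"{bits_per_second / 1e9:.2f} Gb/s before overhead")  # 0.77 Gb/s of a 1-Gb/s link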

Furthermore, the heat generated from this much power would degrade sensor efficiency. Power consumption imposes a ceiling on the data throughput a software interface can handle and explains why GigE cameras using this technology are packaged with sensors operating at lower data rates.

Performance at the PC
AOI systems with GigE cameras don't require a frame grabber. Instead, data is brought into the PC using a standard low-cost NIC. However, the drivers supplied by NIC manufacturers use the delay-prone Windows or Linux IP stack to transfer data into memory. They can handle high-bandwidth image data but not in real time.
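For reference, the receive path that a stock NIC driver provides looks like the sketch below: image packets arrive on an ordinary UDP socket and pass through the operating system's IP stack on every call, which is exactly the per-packet overhead and jitter that the higher-performance drivers discussed next are designed to avoid. The port number and packet size are assumptions carried over from the earlier packetizer sketch.

# Baseline receive path through the OS IP stack (standard UDP socket).
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5000))                                 # stream port assumed above
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 8 * 1024 * 1024)

while True:
    packet, camera_addr = sock.recvfrom(2048)                # each call crosses the IP stack
    # ...parse the header and copy the payload into the current frame buffer...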

Some camera companies ignore this issue because their cameras are aimed at applications where real-time operation is not critical. Others include a more efficient NIC driver with their camera.

The best of these drivers stream image data directly to user memory, in real time, at the kernel level of the system, bypassing the Windows or Linux IP stack. Because they don't engage the IP stack and use intelligent direct memory access (DMA) transfers, these drivers consume almost no CPU capacity on the PC.

This is important for demanding AOI applications that need the CPU to analyze image data as it's streaming into the computer and, depending on the analysis result, trigger a subsequent event in the system. If the CPU is busy managing data transfer, it's not available to the application, making real-time processing impossible.

More typically, drivers included with GigE cameras improve NIC streaming efficiency but still use elements of the IP stack for some processing. These drivers can't guarantee real-time operation and use more CPU time in the PC than some AOI applications can tolerate.

Higher-performance drivers often perform another critical function: helping to ensure the recovery of packets dropped by the network. Working in conjunction with the GigE camera interface, they implement a sub-millisecond packet resend protocol that guarantees data delivery, so no image data is lost.
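In outline, the receiver side of such a resend scheme looks like the sketch below: track which packet indices of the current frame have arrived and, when the frame closes, ask the camera to retransmit any gaps. The request message format here is invented for illustration; as noted next, the actual implementation and its response time are left to the camera vendor.

# Illustrative receiver-side resend logic (request format is assumed, not GigE Vision's).
import struct

RESEND_REQ = struct.Struct(">IHH")   # frame (block) id, first missing index, last missing index

def missing_ranges(received_indices, expected_count):
    """Return (first, last) packet-index ranges that never arrived for this frame."""
    gaps, start = [], None
    for i in range(expected_count):
        if i not in received_indices:
            if start is None:
                start = i
        elif start is not None:
            gaps.append((start, i - 1))
            start = None
    if start is not None:
        gaps.append((start, expected_count - 1))
    return gaps

def request_resends(sock, camera_addr, block_id, received_indices, expected_count):
    """Ask the camera to retransmit every gap detected in the frame."""
    for first, last in missing_ranges(received_indices, expected_count):
        sock.sendto(RESEND_REQ.pack(block_id, first, last), camera_addr)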

The GigE Vision standard will support packet resend but will not define the implementation method or a maximum response time; how quickly resend requests are generated and answered is left to the camera manufacturer. AOI applications that require real-time performance will need to validate that the packet resend capabilities of the chosen camera are sufficient.

Triggering and Synchronization
Many AOI applications need to synchronize, trigger, and coordinate the operation of cameras with encoders, light sensors, motion detectors, conveyor belts, and other system elements. In the past, this job has fallen to frame grabbers.

In systems with GigE cameras where no frame grabber is required, camera companies need to offer these functions inside the camera. Lower-end GigE cameras usually, but not always, include a couple of I/O ports that support basic functions such as external sync.

At the other end of the spectrum are richly featured GigE cameras with I/O capabilities that rival or exceed those of the most sophisticated frame grabbers. These cameras include a flexible signal matrix that allows signals from the I/O, GigE, and camera connectors to be interconnected in hundreds of ways. Any input can be mapped to any output, activities can be chained where outputs are fed back as inputs, and I/O signals can be combined in classic Boolean logic operations.

Using a look-up table, specific relationships can be built between different system elements. Triggers range from a simple external sync to intricate interoperations. Image capture, for example, can be initiated, interrupted, or aborted by triggers from a single encoder or by combinations of triggers from encoders, light sensors, and motion detectors.

These higher-end I/O systems also include control circuits such as delays, rescalers, and counters for fine-tuning system operation in numerous ways. Images can be tagged with timestamps and system actions triggered based on time intervals.

Signals also can be delayed by specified time intervals or have their frequency scaled up or down. Actionable interrupts can be generated from the PC host, and all system elements can be reset simultaneously.
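A toy software model makes the signal-matrix idea concrete: inputs such as encoder ticks and part-present sensors are combined with Boolean logic, optionally delayed, and routed to outputs such as the exposure trigger and a strobe. All of the signal names and routings below are invented for illustration and do not describe any particular camera's I/O block.

# Toy model of a camera I/O signal matrix: Boolean combinations of inputs,
# optional fixed delays, routed to named outputs. Names are illustrative only.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Route:
    combine: Callable[[Dict[str, bool]], bool]   # Boolean combination of input signals
    delay_us: int = 0                            # fixed delay before the output fires

routes = {
    # Trigger exposure when an encoder tick coincides with the part-present sensor.
    "exposure_trigger": Route(lambda s: s["encoder_tick"] and s["part_present"]),
    # Fire the strobe on the same condition, 50 microseconds later.
    "strobe_out": Route(lambda s: s["encoder_tick"] and s["part_present"], delay_us=50),
}

def evaluate(inputs: Dict[str, bool]):
    """Return, for each output, whether it fires and after what delay."""
    return {name: (route.combine(inputs), route.delay_us) for name, route in routes.items()}

print(evaluate({"encoder_tick": True, "part_present": True}))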

Triggering From an Application
In some vision systems, in-camera I/O activity must be complemented by commands from PC-based application software. If, after analyzing the image of a product, the software detects a fault, it may have to activate a push bar to remove the product from a conveyor belt.

In the past, frame grabbers handled this trigger. With GigE cameras, such triggers can be problematic because vision application software usually runs on Windows or Linux. Trigger requests can be stalled in OS queues, interrupting real-time signal flow.

Figure 1 shows how GigE cameras with fully featured I/O capabilities can bypass this limitation. Each image is timestamped by the in-camera I/O system, and the timestamp correlates to an encoder count.


Figure 1. Some GigE Cameras Can Support Real-Time Triggering From a PC Application

The encoder count gives the I/O system the precise location of the widget on the conveyor belt. Through system design, the push bar is located far enough away from the camera to accommodate the maximum possible variability from the OS scheduler and any other network equipment.

The PC-based AOI application sends the in-camera I/O system the timestamp of the exact moment the defective widget will pass the push bar, and the push bar is triggered accordingly. This allows applications to precisely time trigger events and synchronize system elements independent of latency variations in the OS and changes in conveyor speed.
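A simplified version of that calculation is sketched below: from the encoder count and timestamp stamped on the image, plus the current count and time, the application estimates belt speed and works out when the widget will reach the push bar; that future timestamp is what gets sent to the in-camera I/O system. Every parameter and value here is an assumption chosen to make the arithmetic concrete.

# Illustrative calculation of the future push-bar trigger timestamp.
# All distances, resolutions, and counts below are assumed example values.

COUNTS_PER_MM = 20.0             # encoder resolution (assumed)
CAMERA_TO_PUSHBAR_MM = 500.0     # fixed distance set by the system design (assumed)

def pushbar_trigger_time(image_time_us, image_count, now_us, now_count):
    """Predict the timestamp at which the imaged widget reaches the push bar."""
    counts_per_us = (now_count - image_count) / (now_us - image_time_us)   # belt speed
    counts_remaining = CAMERA_TO_PUSHBAR_MM * COUNTS_PER_MM - (now_count - image_count)
    return now_us + counts_remaining / counts_per_us

# Widget imaged at t=0 us, count 0; 100 ms later the encoder reads 2,000 counts.
print(pushbar_trigger_time(0, 0, 100_000, 2_000))   # 500000.0 us: fire the bar 0.5 s after imaging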

Matching Cameras to Applications
Regardless of which camera is used, the mere fact that an AOI application is based on GigE connectivity opens up a whole new range of configuration options. For the first time, applications can leverage the long-distance reach, networking flexibility, scalability, and low-cost equipment that have made Ethernet the reigning global LAN protocol.

If you need real-time end-to-end performance, high-resolution imaging, or sophisticated I/O capabilities, look for a GigE camera with an interface based on purpose-built hardware. If your AOI application is less demanding, then a product with a processor-based interface might suffice. In the end, as always, application needs and business priorities will dictate which GigE camera best fits the bill.

About the Author
George Chamberlain is president of Pleora Technologies and serves on the AIA GigE Vision Committee. He has been involved in the design of communications, networking, and imaging products since the late 1980s. Before co-founding Pleora in 2000, Mr. Chamberlain worked for companies including Semiconductor Insights, Mitel, and Bell-Northern Research. Pleora Technologies, 359 Terry Fox Dr., Suite 230, Kanata, Ontario, Canada K2K 2E7, 613-270-0625, e-mail: info@pleora.com
