Semiconductor optical inspection using high-speed machine vision

What you will learn:

  • Line-scan cameras versus area-scan cameras.
  • Key components of a high-speed machine vision system for inspection.
  • How to synchronize the wafer with the camera.

Semiconductor optical inspection presents complex challenges, including the small size of the target and the proximity of individual dies in wafer space. At the same time, the quality of wafer inspection results is critical due to the risks involved if requirements are not met. For these reasons, vision systems with exceptionally high speed and resolution capabilities can play an important role in semiconductor inspection.

CoaXPress high-speed area-scan cameras transmit data directly to a back-end machine for processing. The nature of semiconductor inspection requires robust image processors, such as graphics processing units (GPUs) and field-programmable gate arrays (FPGAs), as well as powerful, high-end central processing units (CPUs), depending on the configuration and the throughput required.

What makes up the system?

The machine vision system consists of three main components: a high-speed camera, a microscope, and an air-cushioned XY stage that isolates the wafer from Z-axis movement. The camera is integrated with a Nikon LV-M microscope (Fig. 1).

The camera and microscope were mounted on a HybrYX air-cushion stage to isolate them from any movement in the Z direction. The air-cushion stage helps avoid inconsistencies in the depth of field (DOF) across the die that could otherwise pull the target out of focus.

For this test, the sensor resolution was reduced to 1920 × 1100 pixels with an exposure time of 100 µs at 2500 fps to meet the inspection requirements. Additionally, the X-axis speed of the air-cushion stage was set to 300 mm/s, and the camera received an external trigger signal from the system controller via a general-purpose input/output (GPIO) cable.
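As a sanity check, the configuration above can be verified with a few lines of arithmetic: the frame period must leave room for the exposure, and the stage speed determines how far the wafer travels per frame and how much motion blur accumulates during exposure. This is an illustrative sketch using the values from the text; it is not part of any camera SDK.

```python
# Sanity-check the camera/stage timing from the test configuration.
# Values come from the article; the calculation itself is illustrative.

FPS = 2500               # frame rate, frames per second
EXPOSURE_US = 100        # exposure time, microseconds
STAGE_SPEED_MM_S = 300   # X-axis stage speed, mm/s

# Time between consecutive frame starts: 1e6 / 2500 = 400 us.
frame_interval_us = 1e6 / FPS

# The exposure must fit inside one frame period.
assert EXPOSURE_US < frame_interval_us

# Stage travel between frame starts (um): 300 mm/s over 400 us = 120 um.
travel_per_frame_um = STAGE_SPEED_MM_S * 1e3 * frame_interval_us * 1e-6

# Stage travel during one exposure, i.e. motion smear (um): ~30 um.
blur_um = STAGE_SPEED_MM_S * 1e3 * EXPOSURE_US * 1e-6
```

At these settings the wafer advances about 120 µm between frames and moves about 30 µm during each 100 µs exposure, which bounds the achievable sharpness along the scan axis.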

Camera and wafer synchronization

Synchronizing each die on the two-dimensional moving stage with the camera's exposure time is an essential part of the setup. Done correctly, the result is consistent image analysis over a well-defined field of view (FOV).

The high-speed camera allows the use of a GPIO cable to send external signals to the camera for triggering, synchronization, IRIG timing, and other functions. In this configuration, the camera was synchronized with the stage speed relative to a single die, so that the trigger signal aligned with the frame's exposure time. The exposure is thus aligned with the leading edge of a single die on the wafer. This precise synchronization of the trigger signal avoids the scenario where the camera receives a trigger in the middle of a die.

The trigger signal should be sent at the falling edge of the frame signal, when the sensor begins exposure; two consecutive falling edges define a single frame, and the time between a falling edge and the following rising edge is the exposure time. In the synchronization process, therefore, the trigger and sync signal is sent to the camera over the GPIO cable at the falling edge that starts an exposure.

The camera starts capturing at the leading edge of the die. The exposure stops when the camera sees the trailing edge of the die, before the start of the next die. The leading edge of the die is synchronized with the falling edge of the frame (the start of exposure), and the trailing edge of the die is synchronized with the rising edge of the frame (the end of exposure) (Fig. 2). The camera repeats this pattern die by die, scanning the entire wafer.
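The trigger scheduling described above can be sketched numerically: if the die pitch and stage speed are known, the trigger must fire each time a die's leading edge crosses the optical axis. The helper function below is hypothetical (not from any real controller API); the 16 mm die pitch and 300 mm/s speed are the example values used later in the article.

```python
# Sketch: derive trigger timestamps so each exposure starts at the
# leading edge of a die. The function name and interface are illustrative.

def trigger_times_s(num_dies, die_pitch_mm=16.0, stage_speed_mm_s=300.0):
    """Return the time (in seconds) at which each die's leading edge
    passes the optical axis, i.e. when the GPIO trigger should fire."""
    interval_s = die_pitch_mm / stage_speed_mm_s  # ~53.3 ms between dies
    return [i * interval_s for i in range(num_dies)]

times = trigger_times_s(3)  # triggers for the first three dies in a row
```

With these numbers the controller would issue a trigger roughly every 53 ms, matching the per-die scan time derived in the throughput estimate below.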

As shown in Fig. 3, the main conclusion from this mechanism is that every die on a wafer can be scanned accurately in a very short time by synchronizing the external signal with the camera, capturing each image with the intended FOV, perspective, and contrast.

A well-designed machine vision system improves product quality, reduces inspection time, and shortens takt time, the rate at which the process can be completed. For semiconductor inspection, the shorter the takt time, the faster inspection can be performed, which ultimately reduces overall processing time and production costs.

In our test system, the high-speed camera significantly accelerated the manufacturing and packaging process, increasing inspection throughput from 1–2 wafers per hour to 10–15. To break down the numbers:

  • v = 300 mm/s
  • d = 1600 pixels × 10 µm = 16 mm
  • t = d/v = 16/300 ≈ 53 ms/die
  • Dies per wafer = 1000
  • Total time per wafer = 1000 × 53 ms/die = 53 s ≈ 1 min

Assume 100% overhead for software image processing; the cycle time per wafer then goes from 1 minute to 2 minutes, so the camera and software together process about 30 wafers per hour. That is roughly 15 times more than ASML or similar systems can process per hour.

Replacing a line-scan camera with a high-speed area-scan camera reduces takt time and increases throughput by increasing the wafer area inspected at any given moment. Line-scan cameras have traditionally scanned line by line and stitched the results together in post-processing. A high-speed area-scan camera, by contrast, can scan up to 1,000 times more area per acquisition, and the image is immediately available for processing with no stitching step.

Conclusion

Machine vision systems provide superior speed when implemented in semiconductor inspection, increasing product quality and significantly reducing takt time and cost. Previously, line-scan cameras were preferred for achieving relatively high speeds by scanning larger areas and stitching lines together into a single image. High-speed inspection of a larger area of a wafer can now be achieved with a high-resolution area-scan camera, which scans much faster than a line-scan camera.
