Jul 21 2010
Semiconductor Research Corporation (SRC), the world's leading university-research consortium for semiconductors and related technologies, the National Science Foundation (NSF) and researchers from the University of Connecticut (UConn) and Duke University have found a new way to significantly improve the screening of small delay defects (SDDs) commonly found in semiconductors.
Using a much reduced pattern count in the chip testing process, this new methodology will efficiently detect SDDs and help improve the quality and reliability of future semiconductors. As semiconductor technologies migrate to smaller and smaller technology nodes, the complexity and density of designs and functionality increase significantly. SDDs – often a result of physical defects as well as on-chip noise from process variations, power supply noise and crosstalk – have become a major concern for high-quality testing.
SDDs are a type of timing defect that is difficult to target fully and efficiently with current transition-delay fault (TDF) ATPG methods. Commercially available timing-aware ATPGs for detecting SDDs require very large pattern sets and long CPU runtimes, and they assume that SDDs appear only as physical defects in the circuit. The methodology developed by the UConn and Duke researchers, by contrast, uses a reduced pattern set and provides higher-quality test patterns for screening against SDDs.
“This is a major breakthrough for chip testing,” said Mohammad Tehranipoor, associate professor of Electrical and Computer Engineering at the University of Connecticut, and Krishnendu Chakrabarty, professor of Electrical and Computer Engineering at Duke University. “Evaluating each test pattern according to its unique paths before applying the patterns to silicon allows the industry to select only high-quality patterns for testing. This will help to dramatically improve the quality of the test process and reduce delay test costs while staying within tester budgets.”
With support from SRC and NSF, the UConn and Duke researchers exploited the low CPU runtime and efficient SDD detection of N-detect pattern sets and developed a novel metric to evaluate each test pattern in the N-detect set. The new methodology is also capable of taking into account small delays induced by power supply noise and crosstalk, as well as process variations.
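The published work defines the actual evaluation metric; as a rough, hypothetical illustration of the overall flow only, the sketch below scores each pattern in an N-detect set by the timing slack of the long paths it exercises, discounts that slack by assumed power-supply-noise and process-variation margins, and keeps just the highest-scoring patterns up to a pattern-count budget. All names, weights, and the scoring formula here are illustrative assumptions, not the researchers' method.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical data model: each pattern records the timing slack (ns) of the
# long paths it sensitizes. A real flow would obtain these values from
# timing-aware fault simulation and static timing analysis.
@dataclass
class TestPattern:
    pattern_id: int
    sensitized_path_slacks: List[float]  # slack of each sensitized long path, in ns

def pattern_quality(p: TestPattern,
                    noise_margin: float = 0.05,
                    variation_margin: float = 0.03) -> float:
    """Illustrative quality score (an assumption, not the published metric):
    a pattern scores higher the more near-critical paths it exercises, i.e.
    the smaller the effective slack after subtracting assumed noise and
    process-variation margins."""
    score = 0.0
    for slack in p.sensitized_path_slacks:
        effective_slack = max(slack - noise_margin - variation_margin, 0.0)
        # Paths with little effective slack contribute most to SDD detection.
        score += 1.0 / (1.0 + effective_slack)
    return score

def select_patterns(n_detect_set: List[TestPattern], budget: int) -> List[TestPattern]:
    """Rank every pattern in the N-detect set by the metric and keep only the
    top `budget` patterns, yielding a reduced, high-quality set for silicon."""
    ranked = sorted(n_detect_set, key=pattern_quality, reverse=True)
    return ranked[:budget]

if __name__ == "__main__":
    # Toy example with made-up slack values.
    patterns = [
        TestPattern(0, [0.10, 0.12, 0.90]),
        TestPattern(1, [0.50, 0.80]),
        TestPattern(2, [0.08, 0.09, 0.11, 0.70]),
    ]
    for p in select_patterns(patterns, budget=2):
        print(p.pattern_id, round(pattern_quality(p), 3))
```

The key idea this sketch tries to convey is the selection step: rank patterns by a per-pattern quality measure, then truncate to a budget, which is how a reduced pattern count can be traded against test quality, test time and tester memory.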
The researchers plan to extend their work to other pattern types, such as stuck-at and bridging patterns.
“As technology nodes continue to shrink, the metric used for SDD pattern selection could provide the industry with a valuable opportunity to increase test quality while meeting test time and tester memory requirements,” said William Joyner, director of Computer-Aided Design and Test at SRC.
Efforts are currently under way to bring this technology to the commercial industry. The chip testing methodology is currently being evaluated on silicon with researchers at Advanced Micro Devices, Inc. (AMD), under the leadership of Dr. Mahmut Yilmaz, an AMD senior design engineer who studied at Duke under the supervision of Chakrabarty.
Source: http://www.src.org/