The University of Massachusetts Amherst

Rapid Neural Network-based Autofocus Control for High-precision Imaging Systems

 
END-USE APPLICATIONS

•       High-speed microscope image acquisition

•       Real-time in-line inspection of manufactured micron- and nano-scale products

•       Tracking of fast-moving micron- and nano-scale targets

•       Producing high-speed (≥40 Hz), high-precision (1 nm) microscopes/cameras without the need for internal closed-loop controllers

 
ADVANTAGES

•       Ultrafast high-precision autofocus

•       Reduced control complexity for high-precision imaging and vision systems

•       Lower hardware costs

•       Removes need for closed-loop controllers

 
TECHNOLOGY DESCRIPTION

As Industry 4.0 pushes the limits of micro- and nano-scale technologies, semiconductor, GPU, and robotics manufacturers are searching for ways to optimize their production lines while maintaining the highest level of quality. Visual inspection of these advanced micro- and nano-scale products demands remarkably high precision and control, yet the piezoelectric actuators used for metrology suffer from non-linearities that currently require slow, expensive internal closed-loop controllers to deliver sufficient precision to the imaging system.

A UMass Amherst research team has developed a new control method that reduces the cost and complexity of high-precision imaging systems while still delivering rapid acquisition of clear, crisp images. The method integrates the focus measurement and the troublesome non-linear effects into a single learning-based model: focus is evaluated from a short sequence of images by a deep learning-based control model that determines the optimal position for the lens. The technology leverages a Long Short-Term Memory (LSTM) network for its superior ability to draw inferences from learned time-sequence data. It also employs an optimized backpropagation algorithm for efficiency, along with a unique S-curve control input profile that minimizes motor and image jerk. The method supports both rapid and stable dynamic lens transitions for a wide variety of imaging applications, and compared with the leading autofocus technologies it demonstrates significantly shorter autofocus times.
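To make the two building blocks above concrete, the sketch below shows (a) a conventional contrast-based focus score (variance of the Laplacian) of the kind an autofocus loop evaluates per image, and (b) a smoothstep S-curve position profile with zero velocity and acceleration at both endpoints, which is the standard way to limit jerk during a lens move. This is an illustrative sketch only: the patented method replaces hand-crafted focus metrics with an LSTM trained on image sequences, and its exact S-curve parameterization is not described here. Both function names are hypothetical.

```python
import numpy as np

def focus_measure(image):
    """Variance-of-Laplacian focus score: higher means sharper.
    A standard contrast-based metric, shown for illustration;
    the UMass method instead learns focus from an image sequence
    with an LSTM model."""
    # 4-neighbor discrete Laplacian via array shifts (wrap-around edges).
    lap = (-4.0 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return lap.var()

def s_curve_profile(start, end, n_steps):
    """Smoothstep S-curve lens trajectory from start to end.
    Velocity and acceleration vanish at both endpoints, which
    bounds jerk during the dynamic lens transition."""
    t = np.linspace(0.0, 1.0, n_steps)
    s = t * t * (3.0 - 2.0 * t)  # classic smoothstep 3t^2 - 2t^3
    return start + (end - start) * s
```

Blurring an image lowers its Laplacian-variance score, so a controller can step the lens along the S-curve and keep the position that maximizes the score.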

ABOUT THE LEAD INVENTOR
Dr. Xian Du is an Assistant Professor in the Department of Mechanical and Industrial Engineering and the Institute for Applied Life Sciences at the University of Massachusetts Amherst. His current research focuses on the innovation of high-resolution, large-area, and high-speed machine vision and pattern recognition technologies for manufacturing and medical devices. His research interests include pattern recognition, intelligent imaging and vision, flexible electronics manufacturing, robotics, and medical device realization. Dr. Du received the NSF CAREER award in 2020. He is a member of the Optical Society of America (OSA).
AVAILABILITY:
Available for Licensing and/or Sponsored Research
DOCKET:
UMA 22-056
PATENT STATUS:
Patent Pending
Contact:
Ling Shen
Retired; formerly Senior Licensing Officer
University of Massachusetts
413-545-5276
lxshen@research.umass.edu
Inventor(s):
Xian Du
Peter DiMeo
Jingyang Yan
Keywords: