OmniBSI™ Technology Backgrounder. Embargoed News: June 22, 2009. OmniVision Technologies, Inc.




At the heart of any digital camera lies the image sensor. The image sensor is an integrated circuit, like any memory chip or microprocessor, except that it is designed to be very sensitive to visible light. Image sensor technology has undergone rapid improvement in recent years, and OmniVision's BackSide Illumination (OmniBSI) pixel technology offers a giant leap forward in the world of imaging.

Image Sensor Trends

Understanding the key trends in the image sensor business helps to illustrate the substantial value offered by OmniBSI technology. Specifically, the image sensor market is being driven toward lower cost devices, smaller pixels, higher resolution, thinner camera modules and better image quality.

As with all electronics, the primary trend in image sensors is lower cost and higher performance. Lower cost is achieved primarily by shrinking all features on the die. Aside from transistors, whose size reduction is driven mainly by the microprocessor and memory industries, the largest devices on an image sensor are the pixels themselves. It is innovation in pixel technology that drives higher performance.

The second trend in image sensors is better performance. "Make them smaller and better" has been the industry's mantra for many years, from salespeople to pixel designers. The first image sensors for mobile cameras used 5.6 µm pixels, shifting to 2.8 µm, 2.2 µm, 1.75 µm and smaller. Current leading-edge devices use 1.4 µm pixels, with 1.1 µm and 0.9 µm pixels in development. For comparison, the wavelength of visible light varies from 0.65 µm (deep red) to 0.40 µm (blue). Evidently, pixels are getting close to fundamental physical size limits. With the development of smaller pixels, engineers are asked to pack in as many pixels as possible, often sacrificing image quality.
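To see why shrinking pixels pressures image quality, note that the light a pixel can collect scales with its area, i.e. with the square of the pixel pitch. A minimal sketch of that geometry (the pitch values are those quoted above; the percentages are simple area ratios, not OmniVision data):

```python
# Relative light-gathering area of a square pixel versus the original 5.6 um pitch.
pitches_um = [5.6, 2.8, 2.2, 1.75, 1.4, 1.1, 0.9]

for pitch in pitches_um:
    rel_area = (pitch / pitches_um[0]) ** 2  # area shrinks with the square of pitch
    print(f"{pitch:4.2f} um pixel collects {rel_area * 100:5.1f}% "
          f"of the light of a 5.6 um pixel")
```

A 1.4 µm pixel, for instance, has only about 6 percent of the collecting area of an original 5.6 µm pixel, which is why every lost photon matters more as pixels shrink.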
The third trend in image sensors is increased resolution. While resolution, measured in megapixels, is easily the most defining specification of a digital camera, it doesn't tell the whole story. Consumers demand higher resolution coupled with other performance factors.

The fourth trend is reduction in camera module size. In the majority of consumer electronics, the camera is built as a small, self-contained module integrated into the final product. The camera module contains the image sensor, a lens, a tiny circuit board and a plastic housing to hold it all together. Like sensors, camera modules are under enormous pressure to shrink in size and cost while increasing in resolution. A primary focus in recent years has been to make modules as thin as possible, a trend emphasized by ultra-thin mobile phones. All the while, consumers expect mobile cameras to deliver image quality and features comparable to high-end digital still cameras. And why shouldn't they, with features like autofocus, digital still camera (DSC) resolutions, and image processing features like red-eye reduction, image stabilization and HD video recording?

Traditional CMOS Image Sensor Design

At its core, a solid-state image sensor is a two-dimensional array of light-sensitive pixels. Each pixel contains a single photodiode with supporting electronics. Photodiodes are the actual light-sensing structures, converting visible-light photons into electrons. The number of electrons collected by each pixel is measured by the in-pixel electronics, and the values are transmitted to the rest of the camera's electronics for further amplification, image processing, storage and display.

In traditional CMOS image sensors, the front of the integrated circuit is the top surface, which contains all photodiodes, transistors, metal wiring layers and other electronics. This front-side layer uses only the top 1 percent or so of the silicon wafer. The remaining 99 percent becomes the backside, serving only as mechanical support, a bonding surface and an electrical ground terminal. In image sensors, the photodiode array is fabricated along with the transistors on the front-side surface. The lens projects its image onto the front surface, where the light is collected by the photodiodes and digitized into an image.

Pixel Fundamentals

To understand the advantages of OmniBSI technology, some background on pixel performance and the limitations of current technology is useful. Electrical circuits can't sense light directly; light, as photons, must first be converted to electrons.
That's the job of the photodiode, which absorbs light, converts each photon into one electron, and stores those electrons the way a bucket stores raindrops. Like a heavy rain, the brighter the image, the more photons land on the sensor over a given period, and the fuller the photodiode buckets become. The conversion process itself is essentially perfect: every photon that strikes the photodiode is converted into an electron by the photodiode structure. The problem is that not every photon striking the pixel also strikes the photosensitive area, because the photodiode is just one component of the pixel and so is smaller than the pixel's physical size. The ratio of the number of photons actually converted to detected electrons to the number of photons hitting the sensor is known as quantum efficiency (QE). QE depends on the wavelength (i.e. the color) of the light. The closer QE is to 100 percent, the stronger the electrical signal, resulting in a more vivid image. QE is an important aspect of sensitivity, a key metric of camera performance. In the context of sensors, sensitivity measures the size of the output signal for a known number of input photons. Intuitively, more sensitivity is better.
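The bucket model above can be put in numbers: the collected signal is simply the incident photon count scaled by QE. A minimal sketch (the QE values are illustrative assumptions, not measured figures for any particular sensor):

```python
def collected_electrons(incident_photons: int, qe: float) -> int:
    """Electrons collected by a pixel: incident photons scaled by quantum efficiency."""
    return int(incident_photons * qe)

# Same scene and exposure, two hypothetical pixels with different QE.
photons = 10_000
print(collected_electrons(photons, qe=0.35))  # lower-QE pixel: 3500 electrons
print(collected_electrons(photons, qe=0.70))  # higher-QE pixel: 7000 electrons
```

Doubling QE doubles the signal for the same light, which is exactly why QE is treated as a headline pixel metric.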

The noise floor also has to be considered. The noise floor is the minimum detectable signal of the photodiode. Just as it's hard to see the change in water level caused by a single drop of rain, small changes in signal are hard to measure accurately. Noise is also present in film cameras, where noisy images are usually termed grainy. The noise floor is a critical performance parameter for image sensors, as it determines how small a signal (or how dim an image) can be before it's indistinguishable from random measurement error. Engineers use the ratio of the measured signal to the noise floor (the signal-to-noise ratio, or SNR) as one of the most critical sensor performance metrics. The SNR level relates well to overall image quality: an image with a low SNR will look grainy, while an image with a high SNR will look good.

This, however, considers the image sensor pixel array alone. In reality, the pixels are connected to a good deal of supporting circuitry by layers of metal wiring on top of the silicon surface. Recall that light shines downward onto the silicon surface and must pass through these metal routing layers first. Since metal is both opaque and reflective, it creates a huge problem in pixel design. The wires have to be kept out of the light path, or they will cast shadows or reflect light away from the photodiode, negatively impacting image quality; that is, lowering the sensor's QE and sensitivity as explained above. Any light striking the wires is not just directed away from the intended photodiode, but often reflected into another photodiode, a phenomenon called crosstalk (refer to the FSI pixel shown in Figure 1). Crosstalk creates blurry images by causing the light from a small object (destined for only one or two pixels) to be spread out into neighboring pixels. Since photons aren't being collected in the right place, the resulting electrical signal at the target pixels is smaller than it could be, reducing the SNR of the image.
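The SNR figure of merit described above is just the signal divided by the noise floor, often quoted in decibels. A minimal sketch (the electron counts are illustrative assumptions, not data for any real sensor):

```python
import math

def snr_db(signal_electrons: float, noise_floor_electrons: float) -> float:
    """Signal-to-noise ratio of a pixel signal, expressed in decibels."""
    return 20 * math.log10(signal_electrons / noise_floor_electrons)

print(snr_db(1000, 10))  # bright pixel: 40 dB, looks clean
print(snr_db(20, 10))    # dim pixel: ~6 dB, looks grainy
```

A dim scene that fills the buckets to only twice the noise floor sits at roughly 6 dB, which is why crosstalk losses hurt most in low light.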
This is especially important in low-light images, where there aren't many photons to begin with. Crosstalk also reduces color quality, since photons must pass through color filters on their way to the photodiode. Adjacent pixels carry different colored filters, so crosstalk distorts the sensor's color perception. Sharp, low-noise images and good color are clearly the desirable features in photographs, so eliminating crosstalk is an important design goal.

Figure 1: FSI and BSI crosstalk comparison

As Figure 1 illustrates, getting light to the pixel is quite difficult, and the trend toward shrinking pixels isn't helping. Although pixels are getting smaller, the metal layer stack-up isn't thinning as rapidly. The result is that pixels now resemble a tall glass rather than a bucket, commonly referred to as the "pixel straw." Since light cannot pass through the metal walls, it's very hard to get the light where you want it unless it's perfectly aimed. Pixel designers go to great lengths to achieve this, including adding tiny lenses (microlenses) to the top of each pixel to direct light toward the photodiode and away from the metal layers. More exotic techniques include reducing the number of metal layers in the pixel array and innovative circuit designs that minimize the number of metal wires required. But the problem remains: how do you get light past all of the metal layers while continuing to shrink pixel size?

Do tall pixels and crosstalk affect the camera design? Yes, and profoundly so, with the greatest impact falling on the lens design. The best way to get light through the pixel straw is to send it straight through, which means the lens has to aim all rays of light at each pixel at perfect right angles to the sensor surface. Doing so requires very tall lenses, which simply isn't possible in the small space allowed for mobile cameras. Practical mobile phone lenses produce images where the rays of light form relatively flat cones. The industry term chief ray angle (CRA) defines the cone angle that light passing through the center of the lens will follow. Mobile phone lenses typically have CRAs ranging from 20 to 30 degrees. Making the lens thinner (shorter) requires squishing the cone, which further increases the CRA. Twenty or thirty degrees might not seem large, but compared to most DSC lenses, where the cone angles are 5 degrees or less, it is significant.
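The link between module height and CRA can be sketched with simple trigonometry: for a corner pixel a half-diagonal h from the sensor center, a lens exit pupil at height z above the sensor implies tan(CRA) ≈ h / z, so z ≈ h / tan(CRA). A minimal sketch, assuming a 2.25 mm half-diagonal (roughly a small mobile sensor; the 5 and 25 degree angles are the DSC and mobile figures quoted above):

```python
import math

def pupil_height_mm(half_diagonal_mm: float, cra_deg: float) -> float:
    """Approximate exit-pupil height needed for a given corner chief ray angle."""
    return half_diagonal_mm / math.tan(math.radians(cra_deg))

h = 2.25  # assumed sensor half-diagonal in mm
print(pupil_height_mm(h, 5))   # DSC-like 5 degree CRA: ~25.7 mm of optical height
print(pupil_height_mm(h, 25))  # mobile 25 degree CRA: ~4.8 mm, thin enough for a phone
```

Under this toy model, relaxing the CRA from 5 to 25 degrees cuts the required optical height by more than 5x, which is why thin modules force steep chief ray angles.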
Sensor designers can compensate to some extent by adjusting the positions of the microlenses to better align with the actual paths the light follows to the photodiode. This matching isn't perfect, since light rays destined for a particular pixel can pass through different parts of the lens at different angles. The sensor also ends up tuned to a very specific lens design, limiting the selection of lenses a manufacturer can use, or causing problems if the lens properties change, as with autofocus or zoom lenses. Mismatches between the sensor and lens can cause color shifts across the image, darkened corners and other image artifacts. The ideal sensor would not be so selective, accepting light from a range of angles and lens designs without suffering crosstalk or reduced sensitivity.

Backside Illumination (OmniBSI) Technology

As described above, FSI pixels suffer many disadvantages: low QE, low sensitivity, high crosstalk and high sensitivity to CRA mismatch. So what if there were a way to get the light to the photodiode through the backside of the silicon? By avoiding the metal layers, one could avoid all the drawbacks associated with FSI pixels.

This is the essence of OmniBSI technology. As discussed, an image sensor's backside is the bottom surface of previously unused silicon. OmniVision has developed a technology for reducing the thickness of the silicon wafer without compromising the image sensor's strength or flatness.

Manufacturing CMOS Image Sensors

Image sensors are made in the same silicon fabrication process as other CMOS semiconductors, as reflected in Figure 2. Devices such as transistors and photodiodes are defined and created in a pure silicon crystal wafer. Three to six layers of metal wires (aluminum or copper) are deposited on top of the silicon to connect the silicon devices together. The image sensor process involves a few more steps to create the color filters and microlenses that enable the sensor to see color and improve image quality.

Figure 2: Manufacturing FSI pixels

Although the same initial process steps (fabricating transistors and depositing metal layers) are used for both FSI and BSI technologies, during manufacturing of OmniBSI sensors the process takes a radical turn, literally. Once the metal layers are deposited, the silicon wafer is flipped over and ground thinner than a strand of human hair. This crucial step is key to the performance of OmniBSI technology. Grinding a 200 mm to 300 mm diameter silicon wafer without causing it to shatter is extremely challenging. OmniVision and its manufacturing partner, TSMC, worked closely together to develop the technology that allows grinding the wafer to the levels required for high-volume production.

Figure 2: Manufacturing BSI pixels, showing the flip process step

Once the wafer is thinned, the color filters and microlenses are deposited on the backside. The wafer is then tested, diced, packaged, and is ready for integration into camera modules. Conceptually, the process is very simple; in practice, however, the limitations have kept previous BSI technologies out of all but the most costly, high-performance applications, such as astronomy and scientific cameras.

By making the silicon backside many times thinner than a strand of human hair, light can now pass easily through to the photodiodes. So easily, in fact, that the quantum efficiency is significantly higher than any FSI design could achieve (and remember, higher QE is a good thing). With no metal layers between the lens and photodiode, many more photons are collected. Between the increased efficiency and the lack of obstructing metal layers, OmniBSI technology provides a wide range of performance advantages.

Performance Advantages of OmniBSI Sensors

OmniBSI technology offers a multitude of advantages over conventional FSI designs, including higher sensitivity, smaller pixel size, reduced crosstalk and greater CRA tolerance. OmniBSI technology offers almost double the sensitivity of a similarly sized FSI pixel. Since there is less material through which photons must pass, fewer are absorbed or reflected, and the lack of obstructing metal means more photons reach their target photodiodes. Together these produce greater sensitivity than FSI pixel designs, and consumers enjoy brighter, less noisy, clearer photographs.

OmniBSI pixels can also be made smaller than FSI pixels. In FSI pixels, the photodiode and metal layers compete for the same space. With OmniBSI technology there is no competition: metal wiring can be placed right on top of the photodiode. This allows the pixel components to be packed together more tightly, making for smaller and lower cost image sensors.
Crosstalk is also greatly reduced in OmniBSI sensors. Again, without shiny metal layers redirecting photons, more photons end up where they're targeted. Images become sharper and color reproduction is truer. With truer color comes lower noise, since the process of correcting color introduces noise into the image.

Finally, without metal layers forming a pixel straw, OmniBSI pixels impose fewer restrictions on lens CRA matching. The following sections describe in more detail how wider CRA flexibility gives lens and module designers greater freedom.

Advantages of OmniBSI Sensors for Lens and Module Design

OmniBSI technology provides camera module and lens designers with three new options over FSI pixel designs:

1. Better support for larger CRAs, enabling thinner modules
2. Better support for larger CRAs, enabling lenses with larger apertures (faster lenses)
3. Support for variations in CRAs, enabling high-quality zoom lens designs

These advantages are illustrated in Figure 3 and discussed below.

Figure 3: The wider optical path of OmniBSI enables new camera designs

OmniBSI Sensors for Thinner Modules

Without metal layers shadowing the photodiode, OmniBSI pixels can accept light over a wider range of angles. The most immediate advantage in camera module design is that the lens can be placed closer to the image sensor, yielding a thinner camera module. By allowing a wider range of CRAs, OmniBSI technology lets lens designers push their designs further, enabling thinner lenses and modules without sacrificing image quality.

OmniBSI Sensors for Large Aperture Lenses

Good low-light sensitivity is a crucial requirement for mobile cameras, since they rarely have a powerful flash and are used in a variety of challenging lighting environments. One promising solution is to use lenses with larger apertures (also referred to as faster lenses, or smaller f-numbers), such as f/2.0 instead of the conventional f/2.8. The problem is that large-aperture lenses are also larger in diameter, forcing the camera module to be taller to maintain the same CRA profile. This directly contradicts the trend toward smaller camera modules. By allowing lenses with larger CRAs, OmniBSI technology enables lens designers to place large-aperture lenses closer to the sensor, making the module thinner than FSI pixels allow. Besides its inherently higher QE and sensitivity, OmniBSI technology can further improve low-light performance by enabling the effective use of larger aperture lenses.

OmniBSI Sensors for Zoom Lenses

Zoom lenses are extremely popular in nearly all consumer products, with the exception of mobile devices. One significant optical design problem is that the lens CRA changes with the zoom position. As previously discussed, FSI sensors have to be carefully matched to the lens CRA curve, and a close match is impossible if the curve changes. With OmniBSI technology, lens CRAs don't have to be as closely matched, giving lens designers the optical design freedom they need to make high-quality zoom lenses.

OmniBSI Sensor Compatibility with Module Design and Manufacturing

With its radical new manufacturing process, one might expect OmniBSI technology to create challenges in module design or manufacturing. In fact, OmniBSI sensors are completely compatible with all existing module design rules, handling procedures and manufacturing processes. OmniBSI sensors can be made as thick or thin as any FSI device, and can be supplied either for chip-on-board (COB) wire bonding or in a chip-scale package (CSP).
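The low-light benefit of a faster lens follows directly from the f-number: the light gathered scales as 1/N², so moving from f/2.8 to f/2.0 roughly doubles the light reaching the sensor. A minimal sketch of that arithmetic:

```python
def relative_light(f_slow: float, f_fast: float) -> float:
    """Light-gathering gain when moving from f-number f_slow to the faster f_fast."""
    return (f_slow / f_fast) ** 2

print(relative_light(2.8, 2.0))  # f/2.0 gathers ~1.96x the light of f/2.8
```

That near-2x gain stacks with the sensitivity gain of the BSI pixel itself, which is why enabling faster lenses matters for low-light shooting.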
OmniBSI technology thus provides module designers with many new degrees of freedom, without incurring any module design or manufacturing disadvantages.

Conclusion

With its revolutionary OmniBSI technology, OmniVision delivers some of the highest quality image sensors in the world. OmniBSI technology enables higher sensitivity, smaller pixels, better image quality, thinner camera modules, faster lenses and greater camera module design freedom than ever before, making it a genuine giant step forward for consumer imaging. Consumers will appreciate higher resolution images that are brighter, more colorful and less noisy than FSI pixels could ever produce.

Author: Michael Okincha, Senior Staff Technical Product Manager