AN580: INFRARED GESTURE SENSING




1. Introduction

Touchless user interfaces are an emerging trend in embedded electronics as product designers seek out innovative control methods and more intuitive ways for users to interact with electronics. Active infrared proximity motion sensing can solve this challenge. Silicon Labs' Si114x proximity and ambient light sensor products are ideally suited to touchless gesturing applications such as page turning on an e-reader, scrolling on a tablet PC, or GUI navigation. The Si114x features up to three LED drivers and can sense gestures within a 7 to 15 cm product interaction region, assuming a hand as the detectable object. This document discusses in detail how Silicon Labs implements motion sensing using infrared technology.

There are two primary methods used for gesture sensing: position-based and phase-based. Position-based gesture sensing finds gestures based on the calculated location of an object, while phase-based gesture sensing uses the timing of the changes in signal to determine the direction of an object's motion.

2. Hardware Considerations

This application note focuses on detecting gestures made by a user's hand. It is important to recognize that the concepts introduced here can be applied to targets other than the hand, as long as the hardware is designed appropriately. The end application and individual system constraints dictate the range requirements for IR gesture sensing. Since object reflectance is the main measurable component for touchless gesturing, a hand is presumed to be the detectable object for the examples in this document. Whereas a hand can achieve gesture sensing up to 15 cm away from the Si114x sensor, a finger, with dramatically lower reflectance, can achieve gesture sensing at a range of < 1 cm for thumb-scroll type applications.

The general guideline for designing a gesture sensing system with multiple LEDs is to make sure that there is no dead spot in the middle of the detectable area. When a target is placed above the system and is not detected, the target is in a reflectivity dead spot. To avoid dead spots, the LEDs must be placed such that the emitted infrared light can reflect off the target and onto the sensor throughout the desired detection range. Figure 1 shows systems designed to detect a hand or a finger. The most susceptible area for a dead spot is directly above the sensor, in between the two LEDs. The two LEDs are placed as close to the edge of the target as possible to provide feedback in the middle, while also keeping enough distance between the LEDs so that movement of the finger or hand to the left or right can be detected.

Figure 1. Ideal Hardware Formations: Hand versus Finger Detection

Other than the placement of the LEDs in relation to the size of the hand, the location and reflectance of the target in relation to the system are also very important. Notice that the device is placed under the palm of the hand and the middle of the finger. For the hand-detecting system, the fingers are not a good focal point because light can slip between the fingers, and the shape and curve of the outline of the fingers also make for unpredictable measurements. For the finger-detecting system, the tip of the finger is curved and reflects less light than the middle of the finger.

For more information about other hardware considerations for infrared proximity sensing in general, refer to AN498: Si114x Designer's Guide. For more information about LEDs, refer to AN521: IRLED Selection Guide for Si114x Proximity Applications.

2.1. Method 1: Position-based Gesture Sensing

The position-based motion sensing algorithm involves three primary steps. The first step is the conversion of the raw data inputs to usable distance data. The second step uses the distance data to estimate the position of the target object, and the third step checks the timing of the movement of the position data to see if any gestures occurred.

Step 1: Converting Counts to Distance

The infrared (IR) proximity sensor outputs a value for the amount of infrared light fed back to the device from the IR LEDs. These values increase as more light is reflected by an object or hand moving closer to the system. This example presumes that a hand is the defined target for detection. The system can estimate how far away a hand is based on characterization of the proximity sensing (PS) feedback for a hand at certain distances. For example, if a hand placed in front of the system about 10 cm away yields a PS measurement of 5000 counts, then subsequent PS measurements around 5000 counts also mean that a similarly reflective hand is approximately 10 cm away from the system. Taking multiple data points at varying distances from the system makes it possible to interpolate between these points and create a piece-wise equation for the conversion from raw PS counts to distance estimation. Note that for systems with multiple LEDs, each LED will have a different PS feedback for each hand distance, so each LED needs its own counts-to-distance equation, as illustrated in the sketch below.
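The counts-to-distance conversion can be implemented as a small calibration table with linear interpolation between the characterized points. The following C sketch illustrates the idea only: the table values, distances, and function name are hypothetical placeholders, not data or code from the Si114x reference design.

#include <stdint.h>

/* One calibration point from hand characterization: raw PS counts -> distance (mm). */
typedef struct {
    uint16_t counts;      /* raw proximity reading with the hand at this distance */
    uint16_t distance_mm; /* measured hand distance during characterization */
} cal_point_t;

/* Hypothetical characterization table for one LED, ordered by increasing distance
 * (counts fall as the hand moves farther away). Real values come from bench data. */
static const cal_point_t led1_cal[] = {
    { 20000,  30 },
    {  9000,  60 },
    {  5000, 100 },
    {  2500, 150 },
};
#define CAL_LEN (sizeof(led1_cal) / sizeof(led1_cal[0]))

/* Piece-wise linear interpolation from raw counts to estimated distance in mm.
 * Each LED needs its own table because its feedback-vs-distance curve differs. */
uint16_t counts_to_distance_mm(uint16_t counts, const cal_point_t *cal, uint8_t len)
{
    uint8_t i;

    if (counts >= cal[0].counts)          /* closer than the nearest calibration point */
        return cal[0].distance_mm;
    if (counts <= cal[len - 1].counts)    /* farther than the farthest calibration point */
        return cal[len - 1].distance_mm;

    /* Find the segment that brackets this reading and interpolate linearly. */
    for (i = 1; i < len; i++) {
        if (counts >= cal[i].counts) {
            uint32_t span_counts = cal[i - 1].counts - cal[i].counts;
            uint32_t span_dist   = cal[i].distance_mm - cal[i - 1].distance_mm;
            uint32_t offset      = cal[i - 1].counts - counts;
            return cal[i - 1].distance_mm + (uint16_t)((offset * span_dist) / span_counts);
        }
    }
    return cal[len - 1].distance_mm;      /* not reached because of the guards above */
}

For a two-LED system the routine is called once per LED, for example counts_to_distance_mm(raw_led1, led1_cal, CAL_LEN), with a separate table for each LED.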

In Figure 2, the characterization for a two-LED system is displayed. Each LED must be characterized with a target suspended over the midpoint between the LED and the sensing device. When Target 1 is suspended over the sensing device and IR LED 1, the measured feedback will correlate to a distance above the system. The same is true for Target 2 and IR LED 2.

Figure 2. Distance Approximation

Step 2: Estimating Position

The next step in the process of detecting gestures is to take the distance data from Step 1 and estimate the target's position. The position estimate comes from the formula for the intersection of two circles. An LED's light output is conical in nature, but for this approximation, the LED's output is considered to be spherical. With the LEDs on the same axis, the intersection of these two spheres can be treated using the equations for the intersection of two circles.

Consider the characterization performed in Figure 2 in the previous section. Figure 3 shows what the two distance estimates mean when a single target is suspended over the middle area of the system. They are estimates of the distance from the points P1 and P2 to the target above the system. If these estimates are thought of as the radii of two circles, the intersection of the circles is the point where the target is located.

Figure 3. Single Target Distance Approximation

Figure 4 is an expanded version of Figure 3. The measurements a and b have been added to label the location of the target along the axis between the two points P1 and P2. Also note that the two distance measurements have been renamed R1 and R2 to indicate that they are now being considered as radii.

Figure 4. Intersection of Two Circles

Implementation: Using d = a + b,

    a = (R1² - R2² + d²) / (2d)

The value of a is the location of the object along the axis between P1 and P2. A negative value is possible and means that the target is to the left of P1.

Step 3: Timing Gestures

With a positioning algorithm in place, keeping track of timing allows the system to search for and acknowledge gestures. For hand-swiping gestures, the entry and exit positions are the most important to consider. If the hand enters on the right side with a very high a value and exits on the left side with a very low a value, then a left swipe occurred, as long as the time between the entry and exit falls within a chosen window. If the position stays steadily in the middle area for a certain period of time, then this can be considered a pause gesture. The system must keep track of the timestamps for the entry, exit, and current positions of the target in the detectable area. With timing information and position information, most gestures can be recognized. Timing will need to be custom tuned to each application and each system designer's preference. Silicon Labs reference designs use a quarter of a second for the gesture time requirement. A combined sketch of Steps 2 and 3 is shown below.

Silicon Labs' IrSlider2EK is a great example and starting point for understanding the fundamentals of position-based gesturing.
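The formula above and the timing check of Step 3 can be combined into a small routine, as sketched below in C. The LED spacing, zone thresholds, 250 ms window (taken from the quarter-second figure quoted above), and helper names are assumptions chosen for illustration; they are not taken from the reference design firmware.

#include <stdint.h>

#define LED_SPACING_MM   40      /* assumed distance d between points P1 and P2 */
#define SWIPE_WINDOW_MS  250u    /* quarter-second gesture time requirement */
#define LEFT_ZONE_MM     10      /* a below this value: target near the left edge */
#define RIGHT_ZONE_MM    30      /* a above this value: target near the right edge */

typedef enum { GESTURE_NONE, GESTURE_SWIPE_LEFT, GESTURE_SWIPE_RIGHT } gesture_t;

/* Step 2: location of the target along the P1-P2 axis from the two radii.
 * a = (R1^2 - R2^2 + d^2) / (2d); a negative result means the target is left of P1. */
int32_t position_a_mm(int32_t r1, int32_t r2, int32_t d)
{
    return (r1 * r1 - r2 * r2 + d * d) / (2 * d);
}

/* Step 3: classify a swipe from the entry/exit positions and their timestamps. */
gesture_t classify_swipe(int32_t entry_a, uint32_t entry_ms,
                         int32_t exit_a,  uint32_t exit_ms)
{
    if ((exit_ms - entry_ms) > SWIPE_WINDOW_MS)
        return GESTURE_NONE;                  /* too slow to count as a swipe */

    if (entry_a > RIGHT_ZONE_MM && exit_a < LEFT_ZONE_MM)
        return GESTURE_SWIPE_LEFT;            /* entered on the right, exited on the left */
    if (entry_a < LEFT_ZONE_MM && exit_a > RIGHT_ZONE_MM)
        return GESTURE_SWIPE_RIGHT;           /* entered on the left, exited on the right */

    return GESTURE_NONE;
}

In a full application, the distance estimates from Step 1 would be passed in every measurement cycle as position_a_mm(R1, R2, LED_SPACING_MM), and a pause gesture could be added by checking that a stays inside the middle zone for a chosen hold time.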

2.2. Method 2: Phase-based Gesture Sensing

For the second method of gesture sensing, the location of the target is never calculated. This means that Steps 1 and 2 from Method 1 above are not used in this process. This method looks solely at the raw data from the proximity measurements and at the timing of the changes in feedback for each LED. The maximum feedback point for each LED occurs when a hand is directly above that LED. If a hand is swiped across two LEDs, the direction of the swipe can be determined by looking at which LED's feedback rose first. Figure 5 shows a three-LED system with a hand about to swipe over the system.

Figure 5. Swiping Left over a 3-LED System

When the hand in Figure 5 is swiped left across the three LEDs and the proximity sensing device, the hand first crosses over the rightmost LED, then the center LED, then the leftmost LED. The sensing algorithm recognizes the rise in feedback on the first channel and records the timestamp for this rise. It then detects the same rise on each of the other channels, each with a later timestamp than the one before it. The sensing algorithm can also detect each channel's return to the no-detect state and record a timestamp for that as well; the channels return to normal in the same order in which they rose. For up and down gestures in the system shown in Figure 5, the two horizontally aligned LEDs rise and fall simultaneously, with the third LED's response coming either before or after theirs depending on whether the gesture is up or down. If all three channels rise and fall at the same time, then a hand approached from directly above (Z-axis) and was retracted, indicating a select gesture.

In Figure 6, the four gestures (right, left, down, and up) are shown on graphs of ADC counts versus time. The green, pink, and yellow lines each represent the PS measurements from one of the three LEDs. Observe for the right swipe that one channel spikes first, then the next, and finally the last, in the order the hand crosses the LEDs from left to right. For the up and down swipes, the two horizontally aligned channels spike at the same time, since the hand crosses those two LEDs simultaneously when swiping up or down.

Figure 6. Gesture Data (Right Swipe, Left Swipe, Down Swipe, Up Swipe)
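The phase-based decision reduces to recording when each channel first rises above a detection threshold and then comparing the order of those timestamps. The C fragment below sketches this for a three-LED system; the threshold value, the channel indexing (0 = leftmost, 2 = rightmost), and the function names are assumptions for illustration rather than the actual Si114x firmware.

#include <stdint.h>
#include <stdbool.h>

#define NUM_LEDS        3u
#define DETECT_THRESH   1500u   /* counts above baseline that count as "detected" */

typedef enum { SWIPE_NONE, SWIPE_LEFT, SWIPE_RIGHT } swipe_t;

/* Per-channel rise bookkeeping: timestamp of the first sample above threshold. */
static uint32_t rise_ms[NUM_LEDS];
static bool     risen[NUM_LEDS];

/* Call once per measurement cycle with the latest baseline-corrected counts. */
void phase_update(const uint16_t counts[NUM_LEDS], uint32_t now_ms)
{
    for (uint8_t ch = 0; ch < NUM_LEDS; ch++) {
        if (!risen[ch] && counts[ch] > DETECT_THRESH) {
            risen[ch] = true;
            rise_ms[ch] = now_ms;       /* record when this channel first rose */
        }
    }
}

/* Call when all channels have fallen back below threshold (hand has left). */
swipe_t phase_classify(void)
{
    swipe_t result = SWIPE_NONE;

    if (risen[0] && risen[2]) {
        /* Channel 0 is assumed to be the leftmost LED, channel 2 the rightmost.
         * If the left channel rose first, the hand moved left to right. */
        result = (rise_ms[0] < rise_ms[2]) ? SWIPE_RIGHT : SWIPE_LEFT;
    }

    for (uint8_t ch = 0; ch < NUM_LEDS; ch++)   /* reset for the next gesture */
        risen[ch] = false;

    return result;
}

Up, down, and select gestures follow the same pattern by comparing the third channel's timestamp, and fall timestamps can be recorded the same way to cross-check each gesture on exit.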

3. Position-based Method Advantages and Drawbacks

The advantage of the position-based method over the phase-based method is that it offers information on the location of the target. This allows for ratiometric control of systems. For example, to scroll several pages through a book, you could suspend your hand over the right side of the detectable area instead of having to make several right-swipe gestures.

The main drawback of the position-based algorithm is the accuracy of the position calculations. The positioning algorithm assumes a spherical output from the LEDs, but in practice LED output is more conical than spherical. The algorithm also assumes uniform light intensity across the entire output of the LED, but the light intensity decays away from the normal. Another issue is that this algorithm does not account for the shape of the target. A uniquely shaped target will cause inconsistencies in the positioning output. For example, the system cannot tell the difference between the hand and the wrist, so any gestures involving movement that puts the wrist in the area of detection will be located less accurately. The result is that the positioning information provided by this algorithm is good enough for low-resolution systems that only need a 3x3 grid of detection, but the current positioning algorithm is not well suited for a pointing application. This algorithm's output is not an ideal touchscreen replacement.

4. Phase-based Method Advantages and Drawbacks

For applications not requiring position information, the phase-based method provides a very robust way of detecting gestures. Each gesture can be detected on either the entry into or the exit from the detectable area, and the entry and exit can be cross-checked against each other to provide much higher certainty for each gesture observed. The drawback of this method compared with the position-based method is that no positioning information is provided. This means that the number of gestures that can be implemented is more limited than with the position-based method. The phase-based method can only tell the direction of entry into and exit from the detectable area, so any movement in the middle of the detectable area is not detected.

5. Improved Gestures through Combination of the Two Methods

The two methods can be implemented alongside one another to help mask each other's deficiencies. The position-based algorithm can provide positional information for ratiometric control, and the phase-based algorithm can be used for detection of most gestures. The two algorithms working together can provide a strong gesture sensing solution, but the drawback is the code space required to implement two separate algorithms as well as the CPU cycles needed to process both algorithms at all times. A small example of the ratiometric mapping mentioned in Section 3 is sketched below.
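As an illustration of that ratiometric control, the position output a from the position-based algorithm can be mapped directly to a signed scroll rate while the phase-based algorithm continues to handle discrete swipes. The zone width and scaling in this C sketch are arbitrary placeholder values, not figures from the reference design.

#include <stdint.h>

/* Hypothetical ratiometric mapping: hold position controls scroll speed.
 * a_mm is the lateral position from the position-based algorithm and
 * center_mm is the middle of the detection area. Returns pages per second,
 * negative for leftward scrolling. */
int16_t scroll_rate_from_position(int32_t a_mm, int32_t center_mm)
{
    int32_t offset = a_mm - center_mm;   /* signed distance from the center zone */

    if (offset > -5 && offset < 5)       /* dead band in the middle: no scrolling */
        return 0;

    return (int16_t)(offset / 5);        /* farther from center -> faster scroll */
}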
