HANDWRITING ON A TOUCHSCREEN, WITH ANALYSIS


This white paper has been developed in co-operation between Atmel and OptoFidelity. Human handwriting is a great challenge for touchscreens. This white paper introduces the challenges of handwriting on touch screens and Atmel's and OptoFidelity's joint solution to them.

CONTENTS
Background
Stylus and touchpad products
The OptoFidelity testing philosophy
Handwriting problem statement
Atmel & OptoFidelity joint approach
New test solution
More information

BACKGROUND

Atmel (Nasdaq: ATML) is a worldwide leader in the design and manufacture of microcontrollers, capacitive touch solutions, advanced logic, mixed-signal, nonvolatile memory and radio frequency (RF) components. Leveraging one of the industry's broadest intellectual property (IP) technology portfolios, Atmel provides the electronics industry with complete system solutions.

OptoFidelity is a global leader in testing solutions for user interactions in mobile consumer electronics products. OptoFidelity works with most of the top 20 technology and consumer product companies in the industry. OptoFidelity's testing solutions and technology are also used in other industries, such as automotive, where touch technology is becoming a standard.

OptoFidelity and Atmel have partnered on the technology and test case development of touch sensors. This white paper focuses on the challenges of handwriting on touch screens. It was developed during a joint project in which Atmel focused on improving the user experience of its touch sensors. During the project, OptoFidelity developed a novel testing technology for Atmel, which enabled Atmel to make giant strides in improving the performance of its touch sensors.

STYLUS IS BECOMING A STANDARD IN TOUCHPAD PRODUCTS

As the touchscreen has become the standard user interface, the technologies behind it are becoming more and more sophisticated. In touchscreen products, the stylus is becoming a dominant feature now that Apple has introduced its iPad Pro with the Apple Pencil. Microsoft, Samsung and several others have already had stylus products on the market for a while.

But stylus interaction is a challenge as well. By itself, it adds a new layer of complexity to the product and the technology behind it. Then the human factor comes into play: human interaction with the stylus has proven to be a major challenge for the technology, and for testing. For the consumer, poor joint performance of the stylus and the touch product creates annoyance, and annoyance means a bad reputation.

The typical challenge of human interaction with a stylus comes from speed, or more accurately, from the huge variation in pen speed combined with poor performance of the touch screen. We describe these problems in more detail later in this document.

THE OPTOFIDELITY TESTING PHILOSOPHY

When robots are used for testing, the actual trick is not just to make them act delicately but to make them act like humans while remaining, of course, machines: tireless, durable, objective and unable to get bored. In addition to building test robots, the whole testing environment must be created, including the test framework, test scripts and test asset adaptation.

During the past 10 years OptoFidelity has developed a groundbreaking approach to testing touch-enabled devices: Human-like testing. Human-like testing is a method in which a real test case is taught to a test robot operated by the OptoFidelity Touch & Test (TnT) Suite software platform. Together, the test robotics and TnT are called the OptoFidelity Human Simulator. The idea is to simulate human behavior with a robot. The Human Simulator can repeat thousands of test runs with very sophisticated test cases.

Sensors, cameras and microphones are the senses, i.e. the ears and eyes, of the robot, and together with finger actuators or stylus holders they comprise the physical part of the test robot. The brain is the analyzing software, with a test suite designed for touch devices. The Human Simulator brain automatically analyzes the camera or sensor data to produce numerical values such as UI latency or UI framerate. The brain also enables automated navigation and state validation based on text or icons on the display. In addition, OptoFidelity provides the interface for advanced robot controls for testing purposes, test script tools and the base for the adaptation layer.

More information about the OptoFidelity testing approach can be found in another white paper, Robot Aided Test Automation to improve quality and customer satisfaction, published in 2015.
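As a rough illustration of how camera-based analysis can yield a numerical UI latency value, consider the sketch below. It is a minimal, hypothetical example (the function name, frame data and frame rate are made up for illustration, not taken from the TnT platform): the analyzer flags the first camera frame after the robot's touch in which the display content changed, and converts the frame gap to time.

```python
# A minimal sketch (hypothetical data) of camera-based UI latency analysis:
# find the first frame after the robot's touch where the display changed,
# then convert the frame gap to milliseconds using the camera frame rate.

def ui_latency_ms(changed_flags, touch_frame_idx, fps):
    """changed_flags: per-frame 'display content changed' booleans
    produced by image analysis; touch_frame_idx: frame of robot touch."""
    for i in range(touch_frame_idx, len(changed_flags)):
        if changed_flags[i]:
            return (i - touch_frame_idx) * 1000.0 / fps
    return None  # no visible response captured in the recording

# 1000 fps camera; touch happens at frame 100; the display first
# visibly reacts at frame 172.
flags = [False] * 172 + [True] * 28
print(ui_latency_ms(flags, 100, 1000))  # -> 72.0 (milliseconds)
```

The same frame-counting idea extends to UI framerate: counting distinct display updates per second of camera footage.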

PROBLEM STATEMENT IN HANDWRITING

Recent developments in touch technology have greatly benefited human-machine interaction, enabling devices to have user interfaces that are both intuitive and easy to learn. In some scenarios, however, touch interaction alone is not enough, such as when a great amount of data needs to be input by writing or drawing. Other input methods are needed to make these tasks easier. Among them, the active stylus is the product with the most potential for convenient typing and drawing: it can improve the user experience of touch devices by offering more precision, while being intuitive and portable and offering extendable functionality with pressure sensitivity and auxiliary buttons.

This introduces multiple challenges:

1. Human handwriting is very fast. The technology has to have a very fast report rate to be able to reproduce the shapes that we sketch or write.

2. Human handwriting involves tilting and rotating the pen to varying angles and controlling the applied force.

3. When using a stylus, the touch screen has to be able to reject the palm, as we usually rest our arm on top of the touch screen. The display has to be smart enough to distinguish whether the user is using the touch function or the stylus.

4. The first-touch latency is important, as a real pen leaves no time gap between writing and inking. According to Miller (1968), 100 milliseconds is the minimum just noticeable difference (JND) for any interaction related to the response time of a system. However, recent research provides evidence that humans can tell the difference even when it is as small as a few milliseconds. Current devices are usually around 60-80 milliseconds.
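To see why the report rate in challenge 1 matters, it helps to relate pen speed and report rate to the spacing between consecutive reported points. The sketch below uses hypothetical numbers (the 500 mm/s stroke speed and the listed report rates are illustrative assumptions, not figures from the source): at a given pen speed, a slower report rate means coarser sampling of the stroke and therefore less faithful reproduction of its shape.

```python
# Illustrative calculation (hypothetical numbers): how the sensor report
# rate limits the spatial detail captured from a fast handwriting stroke.

def sample_spacing_mm(pen_speed_mm_s, report_rate_hz):
    """Distance the pen travels between two consecutive sensor reports."""
    return pen_speed_mm_s / report_rate_hz

# Assume a fast handwriting stroke moving at 500 mm/s.
for rate in (60, 125, 250):
    spacing = sample_spacing_mm(500, rate)
    print(f"{rate:3d} Hz -> one report every {spacing:.2f} mm of pen travel")
```

At small letter heights of a few millimetres, a spacing of several millimetres between samples clearly cannot reproduce the stroke shape, which is why high report rates are essential for handwriting.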

ATMEL & OPTOFIDELITY JOINT APPROACH

To provide a good user experience in handwriting, Atmel partnered with OptoFidelity to develop a robotic test solution with the following features:

1. A high-level control software platform that allows test sequences to be added and modified easily
2. High precision in position and movement accuracy to reproduce handwriting
3. Capability to input and record every movement of the robot execution
4. An analysis tool designed to compare the handwriting execution with the results from the touch sensor
5. High-resolution data collection timing to align data reported by the touch sensor with the robot finger movement. Accurate timestamps assigned to all robot movements and all sensor reports allow the measurement of the pen-to-ink latency of the touch screen.
6. Integration with the Atmel maXTouch Studio tool to automate sensor fine-tuning
7. Easy switching between touch finger sizes; support for various styluses
8. Easy camera-based positioning of the device under test (DUT) and the use of the DUT's coordinate system instead of the robot's coordinate system in test sequences

OptoFidelity has been providing touch test automation for years and is a pioneer of test and measurement automation technology. Atmel is committed to touch technology as well as to active stylus technology with a wonderful user experience. We believe that through the partnership with OptoFidelity we are able to speed up our R&D process and deliver solutions with an excellent user experience.

Figure: an example touch screen latency and accuracy measurement result. The actual line drawn by the robot is presented as a blue line. Coordinates reported by the touch screen are presented as red circles. The red lines pair together the robot coordinates and panel coordinates collected at the same moment and show the system latency.

A NEW TEST SOLUTION FOR HANDWRITING

To compare what the robot finger or stylus actually draws with what the touch module reports, we need to collect the robot position data and the touch sensor data simultaneously. When the robot finger is moved at high speed over complex trajectories, the laws of inertia must be taken into account. In practice this often requires real-time adjustment of speed and position by the robot controller during the actual finger movement. This introduces a small and hard-to-predict deviation from the specified motion paths, which must be taken into account when measuring the position accuracy of a touch sensor.

To tackle this problem, we stop assuming anything about the robot movements and instead collect robot position data from the robot's position encoders while the robot is moving. Accurate timestamps are assigned to the position data. Sensor data is collected with the same real-time hardware, and the same clock is used to timestamp the collected sensor data.

OptoFidelity's Touch & Test platform provides an API for writing test cases in the device under test's (DUT) coordinate system. The position of the DUT in the robot's coordinate system is configured using a camera attached to the robot finger. After that we can use the DUT's coordinate system to perform gestures such as taps and swipes on the DUT surface, or we can send a list of points through which the robot will move at a given speed on the DUT surface.

The collected robot position data is converted into the DUT's coordinate system and then compared with the data reported by the touch sensor. When comparing points with the same timestamp, touch latency affects the calculated results. If we instead ignore timestamps and find the closest robot position for each reported coordinate, we get an absolute accuracy error without the latency component.
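The two comparison modes described above can be sketched as follows. This is a simplified, hypothetical illustration (the function names and the synthetic straight-line data are assumptions for demonstration, not the platform's actual implementation): pairing robot and sensor samples by timestamp folds latency into the position error, while matching each sensor report to the nearest robot position removes the latency component.

```python
# Hypothetical sketch of the two comparison modes:
# (1) pair samples by timestamp -> error includes the latency component;
# (2) match each sensor report to the nearest robot position -> absolute
#     accuracy error without latency. Samples are (timestamp_ms, (x, y)).
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def error_by_timestamp(robot, sensor):
    """Mean position error pairing samples with equal timestamps."""
    robot_by_t = {t: p for t, p in robot}
    errs = [dist(p, robot_by_t[t]) for t, p in sensor if t in robot_by_t]
    return sum(errs) / len(errs)

def error_nearest_point(robot, sensor):
    """Mean error against the closest robot position, ignoring time."""
    errs = [min(dist(p, rp) for _, rp in robot) for _, p in sensor]
    return sum(errs) / len(errs)

# Synthetic data: the robot moves along the x axis at 1 mm/ms; the sensor
# reports the exact same path, but 20 ms late (simulated latency).
robot = [(t, (float(t), 0.0)) for t in range(0, 200)]
sensor = [(t, (float(t - 20), 0.0)) for t in range(20, 200)]

print(error_by_timestamp(robot, sensor))   # -> 20.0, latency seen as error
print(error_nearest_point(robot, sensor))  # -> 0.0, latency removed
```

On real data the nearest-point error would be small but nonzero, reflecting the sensor's true position accuracy; the gap between the two numbers reflects the latency contribution.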

MORE INFORMATION AND TEST REPORTS

A detailed test report will be available for download from OptoFidelity's web site later in March 2016. Please also have a look at our videos on YouTube, www.youtube.com/optofidelity.

For discussions, the direct contacts are:

OptoFidelity
Wen Nivala, wen.nivala@optofidelity.com, +44 7958 722 226
Natalia Leinonen, natalia.leinonen@optofidelity.com, +358 40 5313 734