

NEURAL NETWORK STRUCTURE MODELING: AN APPLICATION TO FONT RECOGNITION

by

MING-CHIH YEH LEE, B.A., M.S.

A THESIS IN COMPUTER SCIENCE

Submitted to the Graduate Faculty of Texas Tech University in Partial Fulfillment of the Requirements for the Degree of

MASTER OF SCIENCE

Approved

Accepted

December, 1988

ACKNOWLEDGMENTS

To accomplish this task, I am deeply indebted to Dr. Oldham, the chairman of my committee. His expertise, encouragement, guidance, and patience have supported me throughout the process and will never be forgotten. My sincere gratitude is also expressed to Dr. Marcy and Dr. Gustafson for their support. Special thanks go to Mrs. Weiner for her help in the various phases of this research as well as in my academic years in the Computer Science Department. Finally, I would like to thank my parents and my husband, Yung-Hui, for their support throughout the study.

CONTENTS

ACKNOWLEDGMENTS
ABSTRACT
LIST OF TABLES
LIST OF FIGURES

CHAPTER
I. INTRODUCTION
    Scope
    Problem Definition
    Specific Tasks
II. BACKGROUND INFORMATION AND LITERATURE REVIEW
    Neural Network Model
        ANS Description
        Features of ANS
        Mathematical Model of ANS
        Applications of ANS
    Various Neural Network Models
        Hopfield Models
        Carpenter and Grossberg Models
        Single Layer Perceptron
        Anderson Model
        Hogg and Huberman Models
    Summary
III. METHOD AND APPROACH
    Problem Formulation
    Software Development
    Encoding Scheme Development
        Normalizing Fonts and Generating Character Matrices
        Selecting Character Properties and Generating Property Matrices
        Computing the Filter Matrix
        Extracting Properties and Constructing Character Vectors

    Application of Models Through Simulation
    Model Comparisons
IV. SIMULATION RESULTS
    Characteristics of Collected Characters
    Principal Component Analysis
    Behavior of Model H-H1
    Behavior of Model H-H2
    Model Comparisons
        Model H-H1 versus Model H-H2
        Hogg and Huberman versus Cash and Hatamian
        Hogg and Huberman versus Fujii and Morita
V. CONCLUSIONS AND RECOMMENDATIONS FOR FUTURE RESEARCH
    Conclusions
    Recommendations for Future Research
BIBLIOGRAPHY
APPENDICES
    A. MODEL H-H1
    B. MODEL H-H2

ABSTRACT

Two neural network models, Model H-H1 (Hogg and Huberman, 1984) and Model H-H2 (Hogg and Huberman, 1985), have been successfully applied to the font recognition problem and were used to recognize 26 English capital letters, each with six font representations. Recognition rate, memory space requirement, learning speed, and recognition speed were used to measure the models' performances. Model parameters such as memory array size, Smin_Smax, and Mmin_Mmax were varied to elucidate the models' behavior. As a result, both models achieved a 100% recognition rate when all six fonts were used as the training as well as the recognition set. When three out of six fonts were used for training, Model H-H1 achieved a maximum recognition rate of 87.82% and Model H-H2 achieved a maximum recognition rate of 89.1%. This shows that basins of attractor states existed for the letters in most of the various font presentations. Model H-H2 significantly outperformed Model H-H1 in terms of recognition rate, use of memory space, and learning speed when all six fonts were used as the training set. This was supported by the results of the paired t-test.

LIST OF TABLES

1. Assignment of Weighting Factor
2. Levels of Parameters
3. Principal Component Analysis of the Character Vectors
4. Performance Comparison of x9 and x4 Input Code
5. An Example Output (Model H-H1)
6. Model Performance (Model H-H1; Trained 3 Fonts)
7. Analysis of Variance of Performance Data of Model H-H1
8. Model Performance (Model H-H1; Trained 6 Fonts)
9. An Example Output (Model H-H2)
10. Model Performance (Model H-H2; Trained 3 Fonts)
11. Analysis of Variance of Performance Data of Model H-H2
12. Model Performance (Model H-H2; Trained 6 Fonts)
13. Performance Differences of Model H-H1 and Model H-H2 (Trained 3 Fonts)
14. Paired T Test of Performance of Model H-H1 and Model H-H2 (Trained 3 Fonts)
15. Performance Differences of Model H-H1 and Model H-H2 (Trained 6 Fonts)
16. Paired T Test of Performance of Model H-H1 and Model H-H2 (Trained 6 Fonts)

LIST OF FIGURES

1. The Architecture of a Typical ANS
2. The Non-linearity Functions
3. Model Classifications
4. The Hogg and Huberman Model
5. Basins of Attractor Change After the Coalescence Process
6. The 156 Target Alphabet
7. The Courier 'A' Font and Its Matrix Representation
8. The Selected Character Properties and Their Matrix Representations
9. The Search Matrix and the Windows
10. Location Assignment in the Character Matrix
11. An Example of the Decision Tree
12. Frequency Occurrence of the Extracted Properties
13. Input and Output Relationships of Model H-H1
14. Recognition Rate as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H1)
15. Memory Space as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H1)
16. Learning Speed as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H1)
17. Recognition Speed as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H1)
18. Input and Output Relationships of Model H-H2

19. Recognition Rate as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H2)
20. Memory Space as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H2)
21. Learning Speed as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H2)
22. Recognition Speed as a Function of Memory Matrix Size, Smin_Smax, and Mmin_Mmax (Model H-H2)

CHAPTER I
INTRODUCTION

Scope

Artificial Neural Systems (ANS) are a rapidly growing research area because of their promise to solve problems that have confounded computer science and artificial intelligence for over 30 years (Hecht-Nielsen, 1987). Primarily, ANS technology is being applied to the development of information processing systems which perform tasks similar to those that the human brain does. For example, ANS technology has enhanced the capabilities of high-performance pattern recognition processes. In the past, many pattern recognition problems have not been solved successfully. Although many traditional computer science approaches have been applied, methods for carrying them out affordably, with sufficient speed and with adequate robustness, have not been available (Hecht-Nielsen, 1987). Artificial neural net models are an alternative which employs massive parallelism to complete the job in a reasonable time. One of the major reasons that artificial neural nets outperform other approaches in solving the recognition problem is that the structures of the models are developed based on an understanding of the biological

nervous system (McClelland, Rumelhart, and Hinton, 1987). Other ANS properties include the following:

(1) The existence of attractor states: An attractor state is a state of a system that is at least quasi-stable (Oldham, 1986). It has been proven that attractor states exist in discrete systems such as computer models (Hogg and Huberman, 1985). This is an important feature, since the attractor states can be used to store information. The information is introduced into the attractor by training the system using examples. Moreover, computing with attractive fixed points can lead to reliable behavior (Hogg and Huberman, 1984). This feature makes the neural net more attractive for pattern recognition applications.

(2) A learning and recognition capability: Learning and recognition are the basic processes involved in pattern recognition. Some artificial neural networks, such as the Hogg and Huberman models, can be trained by repeatedly presenting example patterns from an environment. The elements of the memory matrix which represents the model are adjusted accordingly in discrete time steps until the system converges according to some criteria. The system is said to be trained, or to have converged, if the values of the memory matrix reach stable values. If the system is properly trained, it will then recognize a pattern with which it was trained.

(3) A fault-tolerant capability: Because of the existence of attractor states, the overall performance of the system tends to be insensitive to partial internal failures or input data errors. This is because the learned states are attractor states and the system is capable of evolving to the attractor states based on only partial or approximate information.

(4) High performance speed: The massive parallelism (parallel input channels and parallel output channels) enables the system to process with satisfactory speed. As mentioned previously, pattern recognition involves the learning and the recognition phases. In the learning phase, the time complexity depends on the number of iterations before the system converges; in the recognition phase, the time is independent of the code complexity.

(5) VLSI implementation: The structure of the network makes it easy to implement in VLSI technology.

In this research, the modeling of neural networks was studied and their capabilities in pattern recognition were explored. The study has made a substantial contribution to the understanding of neural networks through the application of the models to the area of pattern recognition.

Problem Definition

From among the various neural net models, two models that were developed by Hogg and Huberman in 1984 and 1985

(Model H-H1 and Model H-H2) were chosen as the focus of this study. They were examined, elucidated, and evaluated when applied to font recognition. Basically, Model H-H1 and Model H-H2 are fairly simple and easy to implement. Properties that make these models interesting are:

(1) The models are quite simple in presentation and can be easily implemented in software.
(2) Some non-determinism is injected via the memory update.
(3) All calculations can be done in parallel as a pipeline.
(4) Since the input and output contact the external world at the edges of the system, the models can be easily implemented in VLSI technology (Oldham, 1986).
(5) Although the topologies of these two models are the same, their output functions and learning rules are different.

The behavior of Model H-H1 has been investigated by several studies. These studies revealed that this model has a self-repairing capability (Hogg and Huberman, 1984) and a conditional learning capability (Hogg and Huberman, 1985). Model H-H2 can dynamically modify the basins of attraction to include or exclude a particular set of inputs by using the coalescence process or the dissociation process. The coalescence process is used to produce the

same outputs when the corresponding inputs are originally associated with different outputs; the dissociation process is used to differentiate the outputs when the inputs initially map into the same outputs. Being able to manipulate the basins of attractors of the model is a very desirable feature in recognition applications. The basins of attraction of Model H-H2 can be manipulated to couple the desired grouping of inputs into specific outputs. This feature is particularly useful in the area of font recognition (Hogg and Huberman, 1985). In summary, the study was conducted to better understand, elucidate, and evaluate the models developed by Hogg and Huberman. The goals were accomplished by applying the models to font recognition.

Specific Tasks

The study was accomplished by the following steps:

(1) Problem formulation: Each English letter has many font representations. Although each font differs from the others, their overall images should be treated the same. In the study, the models were used to recognize the English characters independent of their font representations.

(2) Software development: To simulate the behavior of these models, programs were developed in PL/I and were executed on a VAX 8650.

(3) Encoding scheme development: To obtain reliable recognition, an encoding system was developed and the attributes of the English characters were derived. Six fonts for each of the 26 English capital letters (156 characters in total) were encoded.

(4) Application of models through simulation: Half of the codes (the training set) were submitted to each model during the training phase. After each model was trained, the testing set was presented to it and the behavior of the model was recorded. The testing set consisted of all 156 characters. To explore the behavior of the models, their parameters were varied and different simulation data were generated.

(5) Model comparisons: The performances of these models were determined by comparing their recognition rates, memory space requirements, learning speeds, and recognition speeds. Also, the performances of the models were compared with the results of Cash and Hatamian's study (1987) and the results of Fujii and Morita's study (1971).

CHAPTER II
BACKGROUND INFORMATION AND LITERATURE REVIEW

In this literature review, the first section provides a review of the basic structures and functions of an Artificial Neural System (ANS). The second section introduces various neural net models. The Hogg and Huberman models, which are the focus of this study, are illustrated in the third section, which is followed by a summary.

Neural Network Model

ANS Description

An artificial neural network is a network that consists of a set of nodes, the interconnections among nodes, learning rules, and input/output data (Oldham, 1986). The nodes, also called neurons or processing elements (PEs), represent particular conceptual objects and perform relatively simple computational processes. A node's job is to receive inputs from its neighbors, to compute the output according to the computation rules, and to send the output value to its neighbors. Thus, each node has little information stored internally and works as a short-term working memory (Fahlman and Hinton, 1987).

Each node is connected to one or more other nodes via an interconnection weight. In general, a connection weight can be positive or negative. Positive weights are excitatory; negative weights are inhibitory. The weights are modified as a function of experience by the adaptive or learning rules, and the information is stored in the connectivity pattern. In other words, the connection weights determine the long-term storage of information (Fahlman and Hinton, 1987). Figure 1 illustrates the architecture of a typical ANS (Oldham, 1986). As previously mentioned, changing the knowledge stored in the network involves modifying the pattern of connectivity and the weights. Many, but not all, learning rules can be considered variations of the Hebbian learning rule developed in 1949 (McClelland, Rumelhart, and Hinton, 1986). The basic principle of the Hebbian rule is that, if a pair of neighboring nodes are both highly active, the weight between them should be strengthened. Overall inputs and outputs take place at the edges of the network. The inputs for a node are the outputs of its neighboring nodes. Similarly, the output from a node is input for other nodes.

Features of ANS

Artificial neural net models are parallel, distributed process models. They are inherently parallel since a large

Figure 1: The Architecture of a Typical ANS

number of the processing elements perform their computations simultaneously and independently. They are inherently distributed because each processing element is assigned a very tiny sub-task and stores little information. The distributed feature enables the artificial neural net to be fault tolerant in regard to both internal failures and input data errors. This feature is drawn from the cognitive research idea that a given neuron is involved in many processes and many neurons participate in many decisions or processes (Oldham, 1986). As a result, a given neuron may play an insignificant role in the entire storage process. Consequently, a degree of fault tolerance is achieved by the process. The parallel feature provides the models with computational efficiency because all neurons perform simple operations, and a large number of operations occur concurrently.

Mathematical Model of ANS

Symbolically, a neural net can be viewed as G(I, O, n, f(n), C), where I is the input, O is the output, f(n) is the function operating at the nth node, and C is a connectivity matrix. The function f can be either linear or non-linear. Linear models have limitations because they cannot mix or combine basic states to generate new states that correspond to learning or creativity (Oldham, 1986). They can only

accomplish superficial mixing of learning. Therefore, non-linear models were studied in this research. Non-linear models are difficult to analyze. Even the simplest non-linear mechanism, with very few nodes, is extremely complicated and intractable. Currently, the only method of analysis is computer simulation (Oldham, 1986). The most frequently used non-linearity functions are described mathematically below:

(1) hard limiter:
    f(x) = -1, if x < 0
    f(x) = +1, if x > 0

(2) threshold logic element:
    f(x) = 0, if x < 0
    f(x) = x, if 0 <= x < c
    f(x) = a, if x > c
    where c is the threshold and a is a constant

(3) sigmoidal non-linearity:
    f(x) = 1 + tanh(x)

Figure 2 presents these non-linearity functions graphically (Lippmann, 1987).

Applications of ANS

Neural net models have been applied to the following problem areas:

Figure 2: The Non-linearity Functions
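The three non-linearity functions above can be sketched directly in Python (a modern illustration only; the thesis's simulation programs were written in PL/I, and the function names here are ours):

```python
import math

def hard_limiter(x):
    # f(x) = -1 if x < 0, else +1
    return -1 if x < 0 else 1

def threshold_logic(x, c, a):
    # 0 below zero, linear up to the threshold c, constant a above it
    if x < 0:
        return 0
    if x < c:
        return x
    return a

def sigmoid(x):
    # f(x) = 1 + tanh(x); output ranges over the open interval (0, 2)
    return 1 + math.tanh(x)
```

All three squash an unbounded input into a bounded output; only the sigmoid does so smoothly, which is why it is the usual choice when gradients are needed.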

(1) Classifier: determining which one of M classes is the most representative of an unknown input pattern (Lippmann, 1987). Neural net models can identify which class best represents an input pattern. This is a classical decision theory problem.

(2) Pattern Associator: The goal of pattern association is to build up an association between patterns defined in one set and patterns defined in another set. After the connections between sets are established, the associated pattern will appear on the second set if a particular pattern reappears on the first set (Lippmann, 1987). An auto-associator is one in which the output pattern is the same as the input pattern. Thus, whenever a portion of the input pattern is presented, the remainder of the pattern is to be filled in or completed. The content-addressable memory (also called associative memory) that can recall the total stored information based on partial or noisy input is an example of an auto-associator. A hetero-associator is one in which a pattern in the first set is associated with a different pattern in the second set.

(3) Regularity Discovery (feature detector): The models learn to respond to "interesting" patterns in their inputs (McClelland, Rumelhart, and Hinton, 1987) and will divide the N input patterns into M classes.

Various Neural Network Models

An ANS is specified by the net topology, node characteristics, and training or learning rules (Lippmann, 1987). As mentioned previously, the pattern recognition process involves a training phase and a recognition phase. In the training phase, some models are trained with supervision while others are not. When a model is trained with supervision, it is provided the side information or label that specifies the correct class for the new input patterns. For nets trained without supervision, no information concerning the correct class is provided. Among the models that are trained with supervision, the Hopfield models were used as auto-associators while the Perceptron and Anderson models were used as hetero-associators. Among the models that are trained without supervision, the Carpenter and Grossberg model was used as an auto-associator and the Hogg and Huberman models were used as hetero-associators. Figure 3 presents the model classifications.

Hopfield Models

The various Hopfield models, which are highly connected with strong feedback, can be used as an associative memory or can be used to solve optimization problems. In the models that can be used as associative memories, the inputs and outputs are binary values which can take on +1 or -1.

Figure 3: Model Classifications

                      Supervised                    Non-supervised
Auto-associator       Hopfield Models               Carpenter/Grossberg Models
Hetero-associator     Perceptron, Anderson Model    Hogg/Huberman Models

The output of each node is fed back to all other nodes via weights. Since the model is trained with supervision, exemplars are provided. Following the initialization, the net iterates in discrete time steps. The pattern specified by the node outputs after convergence is the net output. It has been proven that this net converges to stable final states when the weights are symmetric (Hopfield, 1982). This model has two major disadvantages when used as a content-addressable memory. First, there is a limitation on the number of patterns that can be stored and accurately recalled. Second, an exemplar pattern is unstable if it shares many bits in common with another exemplar pattern. Also, Hopfield models can solve optimization problems such as the classical Traveling Salesman problem, analog-to-digital conversion problems, and linear programming problems. The stable states of a network containing N neurons are the local minima of the energy function E, and the circuit operates over the interior of a hypercube defined by the input patterns. The minima only occur at the corners of the hypercube; the stable states correspond to the corners of the hypercube that minimize E. These optimization problems can be solved by the following steps: (1) choosing the connectivities; (2) choosing the input bias circuits, which appropriately represent the function to be minimized; (3) providing an initial set of inputs that causes the system to converge to a stable state which represents

the minimum of the function; (4) interpreting the solution from the stable final state (Hopfield and Tank, 1985).

Carpenter and Grossberg Models

Carpenter and Grossberg introduced the adaptive resonance theory (ART), which forms clusters and is trained without supervision (Lippmann, 1987). ART embeds a competitive learning model into a system that can solve the stability-plasticity dilemma (Carpenter and Grossberg, 1988). Stability is essential for the system to remain unchanged in response to irrelevant events, whereas plasticity is essential for the system to learn in response to significant new events. The stability-plasticity dilemma says that the system must employ some mechanism to distinguish between signal inputs and noisy inputs so that it knows how to switch between its stable and its plastic modes. ART I deals with binary inputs, while ART II deals with analog inputs. These two models are rather complicated and involve many parameters and learning rules. Among these parameters, the vigilance parameter, which ranges between 0 and 1, determines how close a new input pattern must be to a stored exemplar to be considered similar. High vigilance forces the system to search for new categories in response to small differences between input and expectation.

Thus the system classifies input patterns into a large number of categories. On the other hand, low vigilance enables the system to tolerate large mismatches and thus groups input patterns into a small number of categories. It has been mathematically proven that an ART I architecture is capable of stably learning a recognition code in response to an arbitrary sequence of binary input patterns, until it utilizes its full memory capacity (Carpenter and Grossberg, 1987).

Single Layer Perceptron

The original perceptron was developed by Rosenblatt (Lippmann, 1987). Perceptrons are able to decide whether an input belongs to one of two classes (Class A or Class B) that are separated by a decision region created in the multidimensional space spanned by the input variables. The basic idea is to compute a weighted sum of the input elements, subtract a threshold, and pass the result through a hard-limiting non-linearity function such that the output is either +1 or -1. In the training phase, both inputs and desired outputs are provided. The connection weights are adapted only when the actual output differs from the desired output. The decision rule is to respond Class A if the output is +1 and Class B if the output is -1. The perceptron forms two decision regions separated by a hyperplane.
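The single-layer perceptron procedure described above can be sketched as follows (an illustrative Python sketch of the error-driven rule — adapt weights only when the hard-limited output disagrees with the desired label; the names and the learning rate are our assumptions, not part of the thesis):

```python
def hard_limiter(x):
    # hard-limiting non-linearity: output is +1 or -1
    return 1 if x >= 0 else -1

def train_perceptron(samples, labels, rate=0.1, epochs=100):
    """samples: list of input vectors; labels: +1 (Class A) or -1 (Class B)."""
    n = len(samples[0])
    w = [0.0] * n
    threshold = 0.0
    for _ in range(epochs):
        for x, desired in zip(samples, labels):
            out = hard_limiter(sum(wi * xi for wi, xi in zip(w, x)) - threshold)
            if out != desired:                      # adapt only on error
                for i in range(n):
                    w[i] += rate * desired * x[i]
                threshold -= rate * desired
    return w, threshold

def classify(w, threshold, x):
    # +1 means Class A, -1 means Class B
    return hard_limiter(sum(wi * xi for wi, xi in zip(w, x)) - threshold)
```

For linearly separable data the weight vector settles on a separating hyperplane; for non-separable data (e.g., XOR) it never converges, which motivates the multi-layer extension discussed next.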

Single-layer perceptrons can be expanded to multi-layer perceptrons by introducing hidden layers. A three-layer perceptron can form arbitrarily complex decision regions. Although it cannot be proven that this model will converge, it has been shown to be successful for many problems of interest.

Anderson Model

The Brain in a Box (Anderson model) was developed by Anderson in 1977. This model performs as a simple linear associator that is trained with supervision. In the model, any unit can be connected to any other unit. Assume that there are two groups of N neurons, X and Y, and every neuron in X projects to every neuron in Y. A synaptic strength, a_ij, connects the jth neuron in X with the ith neuron in Y. Let f1 be a vector that represents X and g1 be a vector that represents Y. An association can be made between f1 and g1 so that the presentation of f1 alone will give rise to g1. This association is developed by g1 = A1*f1, where A1 is the connectivity matrix. A1 can be computed by A1 = g1*f1^T. When there are k sets of neurons, namely (f1,g1), (f2,g2), ..., (fk,gk), there exists a single connectivity

matrix to associate the set of activity patterns. Each pair of neurons can generate a connectivity matrix, Ai. Then the overall connectivity matrix A can be computed by A = ΣAi. It has been mathematically proven that, if the input vectors are mutually orthogonal, the system can perform the association perfectly (Anderson, Silverstein, Ritz, and Jones, 1977).

Hogg and Huberman Models

The Hogg and Huberman models, which are flow-forward and synchronous networks, were developed in 1984 and 1985. They will be referred to as Model H-H1 and Model H-H2, respectively. The models are able to map many inputs to a few final states. The final states are called fixed-point attractors. The set of inputs that map into a given output defines the basin of the attractor for that output. The architectures of Model H-H1 and Model H-H2 are exactly the same, and each can be represented by a rectangular matrix (memory matrix) that consists of M rows and N columns of identical processors, each of which is locally connected to its neighbors. Each element has a value stored in it. Additionally, each processor has an adjustable internal state, or memory, which allows it to adapt to its

local environment. The overall input and output take place at the edges of the matrix, with the upper edge of the matrix (the first row) for input and the lower edge (the last row) for output. Each element receives two integer inputs from its neighbors along its diagonals in the preceding row and produces an integer output which in turn becomes an input to the nodes along the diagonals in the following row (see Figure 4). The hard limiter non-linearity function is used so that the output values are constrained to lie within a range, namely [Smin, Smax]. The memory values are limited to within the [Mmin, Mmax] range. Those output values that are equal to the extremes of these ranges are said to be saturated. Let ILij(k) and IRij(k) be the inputs to the element in the ith row and the jth column after the kth time step, and let Oij(k) be the output value for the ith-row, jth-column element. The connections between elements are defined by these relations:

    ILij(k) = O(i-1,j-1)(k),
    IRij(k) = O(i-1,j+1)(k),

where 1 <= i <= M and 1 <= j <= N. The boundaries of the matrix, for the top, bottom, and side edges, are specified respectively as

Figure 4: The Hogg and Huberman Model

    O(0,j)(k) = Sj(k),
    O(M,j)(k) = Rj(k),
    O(i,0)(k) = O(i,N)(k),
    O(i,N+1)(k) = O(i,1)(k),

where S(k) is the external input signal to the matrix at step k, and R(k) is the resulting output vector. The two models employ different output functions and learning rules, which are described next.

(1) Model H-H1:

(A) The output from each element for the (k+1)th step is

    Oij(k+1) = max{Smin, min(Smax, Mij(k)*[ILij(k) - IRij(k)])}.

This rule enhances the differences of the inputs by multiplying them by the memory value. The saturation process keeps the values within the specified interval.

(B) The memory values are updated by the following rule:

    if Oij(k) > O(i,j-1)(k) and Oij(k) > O(i,j+1)(k)
        then Mij(k) = max{Mmin, min(Mmax, Mij(k-1)+1)}
    else if Oij(k) < O(i,j-1)(k) and Oij(k) < O(i,j+1)(k)
        then Mij(k) = max{Mmin, min(Mmax, Mij(k-1)-1)}
    else Mij(k) = Mij(k-1).
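The Model H-H1 update rules above can be sketched in Python (illustrative only — the thesis's simulations were written in PL/I; we assume the periodic side boundaries given earlier apply to both the diagonal inputs and the lateral neighbors in the memory rule):

```python
def clamp(v, lo, hi):
    # saturation: keep v within [lo, hi]
    return max(lo, min(hi, v))

def hh1_step(prev_row, memory_row, smin, smax):
    """One row's outputs from the previous row's outputs.
    prev_row holds O(i-1, *); memory_row holds M(i, *)."""
    n = len(prev_row)
    out = []
    for j in range(n):
        il = prev_row[(j - 1) % n]   # left diagonal input, wraps at the sides
        ir = prev_row[(j + 1) % n]   # right diagonal input
        # O = max{Smin, min(Smax, M * (IL - IR))}
        out.append(clamp(memory_row[j] * (il - ir), smin, smax))
    return out

def hh1_update_memory(out_row, memory_row, mmin, mmax):
    """Increment memory where the output exceeds both lateral neighbors,
    decrement where it is below both, otherwise leave it unchanged."""
    n = len(out_row)
    new_mem = list(memory_row)
    for j in range(n):
        left, right = out_row[(j - 1) % n], out_row[(j + 1) % n]
        if out_row[j] > left and out_row[j] > right:
            new_mem[j] = clamp(memory_row[j] + 1, mmin, mmax)
        elif out_row[j] < left and out_row[j] < right:
            new_mem[j] = clamp(memory_row[j] - 1, mmin, mmax)
    return new_mem
```

Running `hh1_step` once per row propagates a pattern from the input edge to the output edge; repeating the step-and-update cycle over the training set is what drives the memory matrix toward stable values.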

(2) Model H-H2:

(A) The output function for the second model is

    Oij(k+1) = max{Smin, min(Smax, S(ILij(k), IRij(k)) * (|IRij(k)| + |ILij(k)|) + Mij(k))}

where, for even rows, if ILij(k) is zero, then S(ILij(k), IRij(k)) is the sign of IRij(k), otherwise the sign of ILij(k); for odd rows, the roles of ILij(k) and IRij(k) are reversed.

(B) The learning rules for the second model are:

(a) The coalescence rule (contracting rule) is capable of dynamically associating the basins of two or more attractors to produce the same attractor. Figure 5 pictorially demonstrates the basins of two attractors before and after the process, according to the rule listed below:

    if at least one of Oij(k-1) and Oij(k) is not saturated AND Oij(k)*Oij(k-1) < 0
        then change Mij by 1, with the sign of the change given by the sign of the output with the largest magnitude,
    else Mij is unchanged.

(b) The dissociation rule (expanding rule) is used to separate the inputs which initially map into

Figure 5: Basins of Attractor Change After the Coalescence Process

the same output. The expanding rule operates opposite to the contracting rule:

    if at least one of Oij(k-1) and Oij(k) is not saturated AND Oij(k)*Oij(k-1) > 0
        then change Mij by 1, with the sign of the change opposite that of either output,
    else Mij is unchanged.

In the training phase, a set of training patterns is submitted to the models, periodically, as a pipeline. Then the outputs and the memory values are updated according to the various rules. The model is said to be stabilized if the output values for the input patterns do not change. Once trained, the memory states will be fixed. The official outputs can be obtained by running the inputs through the model one more time. This has the effect of fixing the output patterns after the memory is locked, and it guards the output patterns against changes due to one or more memory elements changing in a later training phase. In the recognition phase, the recognition set is sent to the model. By comparing the outputs with the official outputs, the model is capable of determining whether the input is one of the trained patterns.

Self-organizing, self-repairing, and conditional learning capabilities exist in the Hogg and Huberman models. Self-organizing means that the model is capable of converging to fixed-point attractors. Self-repairing means that small fluctuations in either data or memory values won't cause the model to relax to another attractor. Conditional learning means that a set of input patterns can be learned faster if they are close to previously learned ones.

Summary

The capability of dynamically changing the basins of attractors opens a new research area. As Hogg and Huberman (1985) suggested, the coalescence and dissociation processes provide a flexible way to transform desired groupings of inputs into specific outputs. These processes are particularly useful in font recognition, since the sets of inputs generated by an encoding scheme for the same letters in different fonts can be identified as an equivalence class. Little is known about the dynamics of attractor states and their behavior under general circumstances. In fact, a whole new area of research, called "chaos theory," has been initiated in recent years (Schuster, 1988). Issues such as the performance stability of the models when they are subjected to parameter changes cannot be answered with ease, even in the simplest nontrivial cases. Thus, more research is needed to gain insight into and an understanding of the behavior of these models.
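The Model H-H2 output function and the coalescence/dissociation rules can be sketched as follows (an illustrative Python sketch of the rules stated in Chapter II, not the thesis's PL/I programs; the per-element calling convention is our simplification):

```python
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def sign(v):
    return (v > 0) - (v < 0)

def hh2_output(il, ir, mem, row_is_even, smin, smax):
    """Sign comes from IL unless it is zero (even rows); roles swap on odd rows."""
    primary, backup = (il, ir) if row_is_even else (ir, il)
    s = sign(primary) if primary != 0 else sign(backup)
    return clamp(s * (abs(il) + abs(ir)) + mem, smin, smax)

def coalesce(mem, out_prev, out_cur, smin, smax):
    """Contracting rule: if successive outputs disagree in sign and at least
    one is unsaturated, nudge memory toward the larger-magnitude output."""
    saturated = lambda o: o in (smin, smax)
    if (not saturated(out_prev) or not saturated(out_cur)) and out_cur * out_prev < 0:
        bigger = out_cur if abs(out_cur) >= abs(out_prev) else out_prev
        return mem + sign(bigger)
    return mem

def dissociate(mem, out_prev, out_cur, smin, smax):
    """Expanding rule: if successive outputs agree in sign (and one is
    unsaturated), move memory opposite to that sign."""
    saturated = lambda o: o in (smin, smax)
    if (not saturated(out_prev) or not saturated(out_cur)) and out_cur * out_prev > 0:
        return mem - sign(out_cur)
    return mem
```

Applying `coalesce` during training pulls two input patterns toward the same attractor, which is exactly the mechanism used in Chapter III to place the six fonts of one letter into a single equivalence class.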

CHAPTER III
METHOD AND APPROACH

This study was conducted to understand, elucidate, and evaluate the two models developed by Hogg and Huberman (H-H1 and H-H2). These goals were accomplished by applying the two models to the font recognition problem. Font recognition was achieved by following these steps: (1) Problem Formulation, (2) Software Development, (3) Encoding Scheme Development, (4) Application of Models Through Simulation, and (5) Model Comparisons.

Problem Formulation

Dynamic modification of the basins of attractors is very useful in recognition problems that require putting sets of inputs into the same equivalence classes. Thus, font recognition was chosen to explore the models' recognition capabilities. Each English letter has many font representations. Although fonts for a particular letter differ from one another, the overall images are recognized by humans as the same letter. In other words, letters in various fonts are treated as being in the same equivalence class. When the models are applied to font recognition, it is desired that they

recognize letters independent of their font representations. For this study, six fonts were selected for the 26 capital letters of the English alphabet. The fonts, which included (1) Courier, (2) New York, (3) Chicago, (4) Geneva, (5) Times, and (6) Venice, were generated for each letter by a Macintosh computer. Figure 6 lists the 156 target characters. The goal of this study was to discover how well the models can learn and recognize the letters in various fonts. The learning phase was accomplished by submitting the training set to train the model. The training set, in this case, consists of the vector representations of 78 characters (3 fonts for each of the 26 capital letters). The recognition phase was accomplished by submitting the vector representations of all 156 characters (6 fonts for each of the 26 capital letters) to determine whether the models can recognize the letters correctly.

Software Development

In order to explore the behavior of the models, programs were developed in PL/I and were executed on a VAX 8650. The programs were implemented using top-down design and step-wise refinement schemes. The program to simulate Model H-H1 contains 685 statements and is provided in Appendix A; the program to simulate Model H-H2 contains 97 statements and is provided in Appendix B. The inputs to the models, one-dimensional character vectors that represent the

Courier: ABCDEFGHIJKLMNOPQRSTUVWXYZ
New York: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Chicago: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Geneva: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Times: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Venice: ABCDEFGHIJKLMNOPQRSTUVWXYZ

Figure 6: The 156 Target Characters

target characters, will be discussed in detail in the following sections.

Encoding Scheme Development

The principal difficulty encountered in the pattern recognition problem is finding a way to present the patterns in a formalized manner. In many successful character recognition systems, a character is first normalized (e.g., aligned in position), then preprocessed (e.g., by feature extraction), and then classified. In order to produce satisfactory results, a good preprocessing algorithm (encoding scheme) is needed. A good encoding scheme must be able to express the objects under consideration in a compact way, without losing any information. The steps to encode the input characters are (1) normalizing fonts and generating character matrices; (2) selecting character properties and generating property matrices; (3) computing the filter matrix; and (4) extracting properties and constructing character vectors. These steps are discussed in the following paragraphs.

Normalizing Fonts and Generating Character Matrices

A total of 156 capital English characters were translated into 18x18 character matrices. In the character matrices, 0s represent the background and 1s represent the

character image. The matrices are the inputs to the next process. An example of the character 'A' in Courier font and its matrix representation are shown in Figure 7.

Selecting Character Properties and Generating Property Matrices

The inputs to models H-H1 and H-H2 are one-dimensional vectors. Therefore, the character matrices (18x18) have to be transformed into vector representations. To accomplish this, 14 selected properties (Fujii and Morita, 97) that constitute the basic building blocks of a character were extracted from each character matrix. Each property was represented by a 3x3 matrix. The selected character properties and their corresponding property matrices are shown in Figure 8. The 14 properties are extracted from each character to build a 14-tuple (the character vector C) in which c_i is associated with the ith property. A combined property matrix (X) is a 14x9 matrix which represents the 14 selected character properties. Each row of X is a 1x9 vector (x_i) that represents a character property (see Figure 8). The nine elements in x_i are obtained by chaining the three rows of the 3x3 ith property matrix into a 1x9 vector. For example, the vertical-line property matrix

  0 1 0
  0 1 0
  0 1 0

chains into x_i = [0 1 0 0 1 0 0 1 0].
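The row-chaining construction described above can be sketched in a few lines. This is an illustrative sketch (in Python rather than the thesis' PL/I); the two property matrices shown are assumed shapes, not the thesis' full set of fourteen:

```python
import numpy as np

# Two hypothetical 3x3 property matrices (vertical line, horizontal line);
# the thesis uses fourteen such matrices.
vertical_line = np.array([[0, 1, 0],
                          [0, 1, 0],
                          [0, 1, 0]])
horizontal_line = np.array([[0, 0, 0],
                            [1, 1, 1],
                            [0, 0, 0]])
properties = [vertical_line, horizontal_line]

# Chain the three rows of each 3x3 matrix into a 1x9 vector and stack
# the results row by row to form the combined property matrix X.
X = np.vstack([p.reshape(1, 9) for p in properties])
print(X[0])       # [0 1 0 0 1 0 0 1 0]
print(X.shape)    # (2, 9) -- (14, 9) with all fourteen properties
```

With all fourteen property matrices stacked, X has the 14x9 shape used in the filter computation below.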

Figure 7: The Courier 'A' Font and Its Matrix Representation

Figure 8: The Selected Character Properties and Their Matrix Representations

To be specific, X is constructed by putting the vector corresponding to the first property (x_1) in the first row of X, the vector corresponding to the second property (x_2) in the second row of X, and so on. That is,

  X = [x_1; x_2; ...; x_14]   (the rows x_1 through x_14 stacked)

Computing the Filter Matrix

In this section, the concepts of the property recognition matrix (Y) and the filter matrix (W) will be introduced and their functions will be explained. The property recognition matrix (Y) is a 14x9 matrix which represents a simplification of the property matrix. Matrix Y is an arbitrarily chosen matrix that is constructed to be as simple as possible. For example,

  Y = [y_1; y_2; ...; y_14], where a row such as y_14 might be [0 0 0 0 1 1 0 0 0]

The filter matrix (W) is a 9x9 matrix which maps X (the combined property matrix) to Y. Their relationship is

Y = XW.

Given X and Y, W would equal X^(-1)Y if X were square and non-singular. In this case, X is a 14x9 rectangular matrix (there are more equations than unknowns). Therefore, W is over-determined and no exact solution exists. In this project, the minimum squared error (MSE) technique was adopted (Duda and Hart, 1973) to approximate W. The MSE procedure minimizes the squared error between Y and XW. Using this procedure, the pseudoinverse of X (X^+) is computed by

  X^+ = (X^T X)^(-1) X^T,

where X^T is the transpose of X and (X^T X)^(-1) is the inverse of X^T X; thus

  W = X^+ Y.

After W is computed by minimizing ||Y - XW||^2, the exact Y is recomputed as the product of X and W.

Extracting Properties and Constructing Character Vectors

Recall that each character is represented by an 18x18 character matrix. The purpose of this step is to extract the selected properties from the character matrix and to construct the character vector C. To extract character properties, let A be the search area (character matrix) and

let w be a 3x3 window matrix, as shown in Figure 9. The window starts moving from the upper left corner of the search area all the way across the first three rows (rows 1, 2, and 3). It then moves down one row (rows 2, 3, and 4) and scans all the way across, and so on. Basically, the movement is made from left to right and from top to bottom. The search is stopped when the lower right corner of the window coincides with the lower right corner of the character matrix. Let z be a 1x9 vector whose elements are obtained by chaining the three rows of elements in w. For example, a window

  0 1 0
  0 1 0
  0 1 0

yields z' = [0 1 0 0 1 0 0 1 0].

Then z is recognized as one of the selected properties if it maps to y' (y' = z'W) which is one of the rows in Y. If z is recognized, say y' = Y_k, then the kth element (c_k) in the character vector is incremented by a weighting factor. The weighting factor is a function of the property as well as the property location. For those properties that occur rarely, such as the cross property, their locations in the character matrices were identified by assigning different weights. The values of the weights are arbitrarily assigned and have no meaning other than for distinguishing purposes. To determine the weighting factor, the character matrix was divided into nine 6x6 portions, with the upper left portion assigned score 1, 2 to its right, and so forth (see Figure

Figure 9: The Search Matrix and the Windows
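The filter-matrix computation from the preceding section (W = X^+ Y, with X^+ the pseudoinverse) can be sketched as follows; the X and Y used here are random stand-ins with the thesis' 14x9 shapes, not the actual property matrices:

```python
import numpy as np

# Random 14x9 stand-ins for the combined property matrix X and the
# property recognition matrix Y.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(14, 9)).astype(float)
Y = np.eye(14, 9)   # an arbitrary "simple" target matrix

# Pseudoinverse X+ = (X^T X)^{-1} X^T; np.linalg.pinv computes this
# when X^T X is invertible, and a least-squares generalization otherwise.
X_pinv = np.linalg.pinv(X)
W = X_pinv @ Y      # 9x9 filter matrix minimizing ||Y - XW||^2

# The "exact Y" of the text is then the least-squares fit X @ W.
Y_fit = X @ W
print(W.shape)      # (9, 9)
```

Because W is a least-squares solution, the residual Y - XW is orthogonal to the columns of X, which is the defining property of the MSE fit.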

10). Table 1 lists the weighting factors. On the other hand, for those character properties that occur frequently, their locations were ignored; a score of one was assigned to every occurrence of the property no matter where it was. The character vector C is a 1x14 vector (C = [c_1, c_2, ..., c_14]) and each element in C is an accumulated score for its corresponding property. The character vector for each character matrix is extracted according to the following algorithm:

  do i=1 to 16 by 1
    do j=1 to 16 by 1
      z' = [A(i,j) A(i,j+1) A(i,j+2) A(i+1,j) A(i+1,j+1) A(i+1,j+2)
            A(i+2,j) A(i+2,j+1) A(i+2,j+2)]
      y' = z'W
      if y' exactly matches one of the rows in Y, say y' = Y_k,
        then c_k is incremented by the weighting factor of the kth property
    end j
  end i

The encoding scheme is affected by the relative position of the properties. Also, the thickness of the lines of the characters is restricted to one unit. Only the sign distribution of the inputs affects the output of Model H-H2. To increase the variability of Model H-H2's output, the character vector (all of its elements

location 1   location 2   location 3
location 4   location 5   location 6
location 7   location 8   location 9

Figure 10: Location Assignment in the Character Matrix

Table 1: Assignment Table of Weighting Factors
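The window scan and property matching described in the preceding pages can be sketched as below; the function and the demonstration data are illustrative (an identity filter and a single recognition row), not the thesis' actual W, Y, or weight table:

```python
import numpy as np

def extract_character_vector(A, W, Y, weight=lambda k, i, j: 1):
    """Scan A with a 3x3 window, map each chained window z through W,
    and accumulate weighted scores for rows of Y that match exactly."""
    C = np.zeros(Y.shape[0])
    for i in range(A.shape[0] - 2):           # the thesis' do i=1 to 16
        for j in range(A.shape[1] - 2):       # the thesis' do j=1 to 16
            z = A[i:i+3, j:j+3].reshape(9)    # chain the three window rows
            y = z @ W
            for k, row in enumerate(Y):
                if np.allclose(y, row):       # exact match against a row of Y
                    C[k] += weight(k, i, j)   # weighting factor
                    break
    return C

# Demonstration: an 18x18 matrix with one full-height vertical stroke,
# an identity filter, and a single recognition row for "vertical line".
A = np.zeros((18, 18)); A[:, 5] = 1
W = np.eye(9)
Y = np.array([[0, 1, 0, 0, 1, 0, 0, 1, 0]], dtype=float)
print(extract_character_vector(A, W, Y))      # [16.]
```

The stroke is matched at each of the 16 vertical window positions whose middle column covers it, so the single accumulated score is 16.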

are positive before the process) is modified according to the following rules:

  do i=1 to 14 by 1
    if threshold[i] = 0
      then if C[i] = 0 then C[i] = -4
                       else C[i] = C[i]
      else C[i] = C[i] - threshold[i]
  end i

where threshold = [ ]. The values of the threshold vector were chosen according to the frequency of occurrence of the selected character properties for the 156 characters. The codes that go through the above processes are the inputs to the models.

Application of Models Through Simulation

During these simulations, the training set was composed of the vector representations of 78 characters: three out of the six fonts for each letter of the alphabet were randomly selected. All 156 characters were candidates for the recognition set. For Model H-H1, the training set was fed to the model repeatedly during the training phase. The model computed the outputs and adjusted the memory matrix values according

to its output function and the learning rule until there was convergence. After the model stabilized, the memory matrix values were fixed and the official outputs were generated by running the model one more time, using the training set. The model recorded the outputs as well as their associated letters. Thus the inputs were divided into several categories. In general, several letters fell into the same category. This happened when two or more letters created the same output. Figure 11 shows an example of the decision tree. In this example, character E and character F produced the same output. The learning process was started again to create a child model using the vector representations of E's and F's as inputs. Another memory matrix was created to distinguish E's from F's. This process continued recursively, building the decision tree, until either each output was associated with only one letter or the depth of the tree reached 7. The depth of the decision tree was limited to 6 to avoid having the learning process run endlessly. For Model H-H2, three characters (three fonts for the same letter) of the training set were submitted to the model at one time. The model computed the outputs and adjusted the memory matrix values based on the output function and coalescence rule described in Chapter II. After Model H-H2 converged, three vectors that represented another letter were sent to the model and the previous process was started

Figure 11: An Example of the Decision Tree

over again. After all of the 78 characters were learned, the values of the memory matrix were fixed. The training set was submitted to the model to obtain the official outputs. The model recorded the outputs as well as their associated letters. Normally, the model divided the inputs into several categories, with those that produced the same output in one category. If two or more letters fell into the same category, the vector representations for those letters were again submitted to the model. A child model was created to discriminate among the different letters that fell into one category by adopting the dissociation rule. These processes were run recursively to build the decision tree until either each output was associated with only one letter or the depth of the tree reached 7. When building the decision tree, the coalescence rule and the dissociation rule were used alternately, with the coalescence rule at odd depths and the dissociation rule at even depths. Again, the depth of the decision tree was limited to 6 to prevent the process from running forever. In the recognition process, all 156 codes were input to the models. The recognition set was submitted to the memory matrix at the root of the decision tree. If the output matched one of the outputs previously recorded and the output was associated with only one letter, then the model indicated that the input was the recorded character. On the other hand, if the output matched one of the outputs but

two or more letters were associated with it, the input code was sent to its child memory matrix for further processing. The search started from the root of the tree, and it stopped either when the memory matrix was a leaf node or when the output did not match any of the recorded outputs. As a result, the simulation programs generated the following information:

1. overall recognition rate,
2. number of rejected characters,
3. number of correctly recognized characters,
4. recognition rate of the trained characters,
5. rejection rate of the trained characters,
6. recognition rate of the untrained characters,
7. rejection rate of the untrained characters,
8. depth of the decision tree,
9. number of memory matrices used in the decision tree,
10. learning speed (in terms of number of iterations),
11. recognition speed (in terms of the number of times the input data were submitted),
12. number of correctly recognized characters for each letter of the alphabet,
13. number of rejected characters for each letter of the alphabet,
14. number of correctly recognized characters for each font, and
15. number of rejected characters for each font.
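The root-to-leaf search just described can be sketched as a small traversal; the Node class and the toy models standing in for memory matrices are hypothetical:

```python
# Each node holds a model (a stand-in for a memory matrix, mapping a
# character code to an output) and a table from recorded outputs to
# either a single letter (recognized) or a child node (two or more
# letters shared that output).
class Node:
    def __init__(self, model, table):
        self.model = model
        self.table = table

def recognize(root, code):
    node = root
    while True:
        out = node.model(code)
        entry = node.table.get(out)
        if entry is None:
            return None          # output matches no recorded output: reject
        if isinstance(entry, str):
            return entry         # output associated with only one letter
        node = entry             # descend to the child memory matrix

# Toy tree: the root separates most letters; a child distinguishes E from F.
child = Node(lambda c: c[0] % 2, {0: "E", 1: "F"})
root = Node(lambda c: c[0] // 10, {0: "A", 1: child})
print(recognize(root, [3]))      # prints A
print(recognize(root, [15]))     # prints F
```

A code whose output at some node matches nothing recorded is rejected, mirroring the stopping conditions in the text.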

Model Comparisons

The parameters, such as memory matrix size, Smin_Smax, and Mmin_Mmax, were manipulated to determine the models' performance. The memory matrix sizes were set to 4x14, 6x14, 8x14, and 10x14. Smin_Smax was assigned to be ±2, ±5, and ±8. For Model H-H1, Mmax was set to 4, 6, and 8 with Mmin fixed at 0; for Model H-H2, Mmin_Mmax was set to ±8, ±10, and ±12. Thus 36 observations were obtained for each model. These levels were determined by expanding the levels used by Hogg and Huberman (1985). Table 2 presents the levels of the parameters. The performances of the models were determined using the following criteria: accuracy, required memory space, learning speed, and recognition speed. These four criteria are defined below.

(1) The accuracy is scored as the fraction of correctly recognized characters with respect to the 156 characters.

(2) The required memory space is defined by the number of memory matrices used to build the decision tree. The less discriminative the model is, the more memory matrices are needed to build the tree. Generally speaking, the number of memory matrices required depends on the shape of the tree.

(3) The learning speed is determined by the number of iterations required to build the decision tree. The fewer

Table 2
Levels of Parameters

Model   Parameter            Levels
H-H1    Memory Matrix Size   4x14, 6x14, 8x14, 10x14
        Smin_Smax            ±2, ±5, ±8
        Mmin_Mmax            0_4, 0_6, 0_8

H-H2    Memory Matrix Size   4x14, 6x14, 8x14, 10x14
        Smin_Smax            ±2, ±5, ±8
        Mmin_Mmax            ±8, ±10, ±12
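As a sanity check on the experimental design, the full grid over the parameter levels yields 4 x 3 x 3 = 36 observations per model; a sketch, with the size labels and H-H2 levels written as assumptions:

```python
from itertools import product

# Assumed parameter levels for Model H-H2; the grid size, 4 * 3 * 3 = 36,
# matches the 36 observations per model reported in the text.
sizes = ["4x14", "6x14", "8x14", "10x14"]   # assumed memory matrix sizes
smin_smax = [2, 5, 8]                       # magnitudes of the +/- bounds
mmin_mmax = [8, 10, 12]                     # magnitudes of the +/- bounds

grid = list(product(sizes, smin_smax, mmin_mmax))
print(len(grid))    # 36
```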

iterations the model needs in the learning phase, the faster the learning speed is.

(4) The recognition speed is determined by the number of times the input data have to be submitted to the memory matrices in the decision tree in order to recognize the 156 characters. The fewer times the input data need to be submitted, the faster the recognition speed is. The recognition speed depends on the shape of the decision tree. Normally, the shallower the tree is, the faster the recognition speed is.

CHAPTER IV
SIMULATION RESULTS

To analyze the performances of the models, the Statistical Analysis System (SAS), a computer system of software products for data analysis, was used. Basically, descriptive statistics (mean and standard deviation: STD) were obtained from the performance data for each model. The effects of the model parameters on model performance were determined statistically using an ANalysis Of VAriance (ANOVA) procedure. The performance comparison was conducted with paired t-tests. In the first section, the extracted properties of the input characters are summarized. The principal component analysis of the character vectors is presented in the second section. The behavior of the models as functions of the size of the memory array, Smin_Smax, and Mmin_Mmax is presented in the third and fourth sections, respectively. Finally, model comparisons are discussed in the last section.

Characteristics of Collected Characters

Recall that 156 machine-printed characters were created for determining the recognition performances of the models and fourteen selected properties were extracted from each

character. Figure 12 presents the frequency of occurrence of the properties over the 156 letters. The property occurrence ranges from 7 for "cross," which is .6% of the total number of occurrences, to 754 for "vertical line," which is over 4.44% of the total number of occurrences.

Principal Component Analysis

The dimensionality of the data can be reduced by removing or combining highly correlated data (Duda and Hart, 1973). Principal component analysis was used to reduce the dimensionality by forming linear combinations of the character vector features. SAS was used for this purpose. Table 3 illustrates the statistical results. The eigenvalues indicate that nine components, which account for 92.55% of the standardized variance, provide a good summary of the data. The eigenvectors provide the principal components as linear combinations of the fourteen elements in the character vectors. The results of the computer simulations show that the recognition rate is lower when using the nine principal components as the inputs than when using the 1x14 vector as the input code. The results also show that more memory space is required and that the learning and recognition speeds are much slower. As a result, only 1x14 vectors were used as the input data to the models in this study. The performance data are presented in Table 4.
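The principal component computation described above can be sketched via an eigendecomposition of the correlation matrix; random data stands in for the thesis' 156 character vectors, and the 92% cutoff mirrors, but does not reproduce, the reported 92.55%:

```python
import numpy as np

# Random stand-in for the 156 x 14 matrix of character vectors.
rng = np.random.default_rng(1)
data = rng.normal(size=(156, 14))

# Standardize each feature and form the correlation matrix.
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = (z.T @ z) / len(z)

# Eigendecomposition; np.linalg.eigh returns ascending order, so reverse.
evals, evecs = np.linalg.eigh(corr)
evals, evecs = evals[::-1], evecs[:, ::-1]

# Keep the leading components accounting for ~92% of standardized variance.
explained = np.cumsum(evals) / evals.sum()
k = int(np.searchsorted(explained, 0.92)) + 1
components = z @ evecs[:, :k]    # reduced 156 x k representation
print(components.shape)
```

For standardized data the eigenvalues sum to the number of features, so the cumulative fractions here play the role of the "standardized variance" quoted in the text.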


More information

Planning a Smart Card Deployment

Planning a Smart Card Deployment C H A P T E R 1 7 Planning a Smart Card Deployment Smart card spport in Microsoft Windows Server 2003 enables yo to enhance the secrity of many critical fnctions, inclding client athentication, interactive

More information

Research on Pricing Policy of E-business Supply Chain Based on Bertrand and Stackelberg Game

Research on Pricing Policy of E-business Supply Chain Based on Bertrand and Stackelberg Game International Jornal of Grid and Distribted Compting Vol. 9, No. 5 (06), pp.-0 http://dx.doi.org/0.457/ijgdc.06.9.5.8 Research on Pricing Policy of E-bsiness Spply Chain Based on Bertrand and Stackelberg

More information

Motorola Reinvents its Supplier Negotiation Process Using Emptoris and Saves $600 Million. An Emptoris Case Study. Emptoris, Inc. www.emptoris.

Motorola Reinvents its Supplier Negotiation Process Using Emptoris and Saves $600 Million. An Emptoris Case Study. Emptoris, Inc. www.emptoris. Motorola Reinvents its Spplier Negotiation Process Using Emptoris and Saves $600 Million An Emptoris Case Stdy Emptoris, Inc. www.emptoris.com VIII-03/3/05 Exective Smmary With the disastros telecommnication

More information

11 Success of the Help Desk: Assessing Outcomes

11 Success of the Help Desk: Assessing Outcomes 11 Sccess of the Help Desk: Assessing Otcomes I dread sccess... I like a state of continal becoming, with a goal in front and not behind. George Bernard Shaw Key Findings Respondents help desks tend to

More information

Executive Coaching to Activate the Renegade Leader Within. Renegades Do What Others Won t To Get the Results that Others Don t

Executive Coaching to Activate the Renegade Leader Within. Renegades Do What Others Won t To Get the Results that Others Don t Exective Coaching to Activate the Renegade Leader Within Renegades Do What Others Won t To Get the Reslts that Others Don t Introdction Renegade Leaders are a niqe breed of leaders. The Renegade Leader

More information

MUNICIPAL CREDITWORTHINESS MODELLING BY NEURAL NETWORKS

MUNICIPAL CREDITWORTHINESS MODELLING BY NEURAL NETWORKS 0 Acta Electrotechnica et Informatica Vol. 8, No. 4, 008, 0 5 MUNICIPAL CREDITWORTHINESS MODELLING BY NEURAL NETWORKS Petr HÁJEK, Vladimír OLEJ Institte of System Engineering and Informatics, Faclty of

More information

Facilities. Car Parking and Permit Allocation Policy

Facilities. Car Parking and Permit Allocation Policy Facilities Car Parking and Permit Allocation Policy Facilities Car Parking and Permit Allocation Policy Contents Page 1 Introdction....................................................2 2.0 Application

More information

Designing an Authentication Strategy

Designing an Authentication Strategy C H A P T E R 1 4 Designing an Athentication Strategy Most organizations need to spport seamless access to the network for mltiple types of sers, sch as workers in offices, employees who are traveling,

More information

I Symbolization J,1 II e L~ "-"-:"u"'dll... Table I: The kinds of CGs and their classification, (where, t - a local neighbourhood topology)

I Symbolization J,1 II e L~ --:u'dll... Table I: The kinds of CGs and their classification, (where, t - a local neighbourhood topology) POSTER SESSIONS 484 REPRESENTATION OF THE GENERALIZED DATA STRUCTURES FOR MULTI-SCALE GIS M.O.Govorov Dept. of Cartography,' Siberian State Academy of Geodesy Plahotnogo 10, Novosibirsk, 630108, Rssia

More information

aééäçóáåö=táåççïë= péêîéê=ommp=oéöáçå~ä= açã~áåë

aééäçóáåö=táåççïë= péêîéê=ommp=oéöáçå~ä= açã~áåë C H A P T E R 7 aééäçóáåö=táåççïë= péêîéê=ommp=oéöáçå~ä= açã~áåë Deploying Microsoft Windows Server 2003 s involves creating new geographically based child domains nder the forest root domain. Deploying

More information

Candidate: Kyle Jarnigan. Date: 04/02/2012

Candidate: Kyle Jarnigan. Date: 04/02/2012 Cstomer Service Manager Assessment Report 04/02/2012 www.resorceassociates.com To Improve Prodctivity Throgh People. Cstomer Service Manager Assessment Report 04/02/2012 Prepared For: NAME Prepared by:

More information

Faster Inversion and Other Black Box Matrix Computations Using Efficient Block Projections

Faster Inversion and Other Black Box Matrix Computations Using Efficient Block Projections Faster Inversion and Other Black Box Matrix Comptations Using Efficient Block Projections Wayne Eberly 1, Mark Giesbrecht, Pascal Giorgi,, Arne Storjohann, Gilles Villard (1) Department of Compter Science,

More information

EMC VNX Series. EMC Secure Remote Support for VNX. Version VNX1, VNX2 300-014-340 REV 03

EMC VNX Series. EMC Secure Remote Support for VNX. Version VNX1, VNX2 300-014-340 REV 03 EMC VNX Series Version VNX1, VNX2 EMC Secre Remote Spport for VNX 300-014-340 REV 03 Copyright 2012-2014 EMC Corporation. All rights reserved. Pblished in USA. Pblished Jly, 2014 EMC believes the information

More information

STI Has All The Pieces Hardware Software Support

STI Has All The Pieces Hardware Software Support STI Has All The Pieces Hardware Software Spport STI has everything yo need for sccessfl practice management, now and in the ftre. The ChartMaker Medical Site Incldes: Practice Management/Electronic Billing,

More information

Isilon OneFS. Version 7.1. Backup and recovery guide

Isilon OneFS. Version 7.1. Backup and recovery guide Isilon OneFS Version 7.1 Backp and recovery gide Copyright 2013-2014 EMC Corporation. All rights reserved. Pblished in USA. Pblished March, 2014 EMC believes the information in this pblication is accrate

More information

Planning and Implementing An Optimized Private Cloud

Planning and Implementing An Optimized Private Cloud W H I T E PA P E R Intelligent HPC Management Planning and Implementing An Optimized Private Clod Creating a Clod Environment That Maximizes Yor ROI Planning and Implementing An Optimized Private Clod

More information

Borrowing for College. Table of contents. A guide to federal loans for higher education

Borrowing for College. Table of contents. A guide to federal loans for higher education Borrowing for College A gide to federal loans for higher edcation Table of contents Edcation loan basics 2 Applying for edcation loans 3 Repaying edcation loans 3 Controlling edcation loan debt 5 Glossary

More information

HSBC Internet Banking. Combined Product Disclosure Statement and Supplementary Product Disclosure Statement

HSBC Internet Banking. Combined Product Disclosure Statement and Supplementary Product Disclosure Statement HSBC Internet Banking Combined Prodct Disclosre Statement and Spplementary Prodct Disclosre Statement AN IMPORTANT MESSAGE FOR HSBC CUSTOMERS NOTICE OF CHANGE For HSBC Internet Banking Combined Prodct

More information

The Boutique Premium. Do Boutique Investment Managers Create Value? AMG White Paper June 2015 1

The Boutique Premium. Do Boutique Investment Managers Create Value? AMG White Paper June 2015 1 The Botiqe Premim Do Botiqe Investment Managers Create Vale? AMG White Paper Jne 2015 1 Exective Smmary Botiqe active investment managers have otperformed both non-botiqe peers and indices over the last

More information

Roth 401(k) and Roth 403(b) Accounts: Pay Me Now or Pay Me Later Why a Roth Election Should Be Part of Your Plan Now

Roth 401(k) and Roth 403(b) Accounts: Pay Me Now or Pay Me Later Why a Roth Election Should Be Part of Your Plan Now Reprinted with permission from the Society of FSP. Reprodction prohibited withot pblisher's written permission. Roth 401(k) and Roth 403(b) Acconts: Why a Roth Election Shold Be Part of Yor Plan Now by

More information

An unbiased crawling strategy for directed social networks

An unbiased crawling strategy for directed social networks Abstract An nbiased crawling strategy for directed social networks Xeha Yang 1,2, HongbinLi 2* 1 School of Software, Shenyang Normal University, Shenyang 110034, Liaoning, China 2 Shenyang Institte of

More information

Herzfeld s Outlook: Seasonal Factors Provide Opportunities in Closed-End Funds

Herzfeld s Outlook: Seasonal Factors Provide Opportunities in Closed-End Funds VIRTUS HERZFELD FUND Herzfeld s Otlook: Seasonal Factors Provide Opportnities in Closed-End Fnds When it comes to investing in closed-end fnds, a comprehensive nderstanding of the inefficiencies of the

More information

Apache Hadoop. The Scalability Update. Source of Innovation

Apache Hadoop. The Scalability Update. Source of Innovation FILE SYSTEMS Apache Hadoop The Scalability Update KONSTANTIN V. SHVACHKO Konstantin V. Shvachko is a veteran Hadoop developer. He is a principal Hadoop architect at ebay. Konstantin specializes in efficient

More information

EMC VNX Series Setting Up a Unisphere Management Station

EMC VNX Series Setting Up a Unisphere Management Station EMC VNX Series Setting Up a Unisphere Management Station P/N 300-015-123 REV. 02 April, 2014 This docment describes the different types of Unisphere management stations and tells how to install and configre

More information

3. DATES COVERED (From- To) Technical 4. TITLE AND SUBTITLE

3. DATES COVERED (From- To) Technical 4. TITLE AND SUBTITLE REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-01-0188 l ne pblic reporting brden tor this collection of information is estimated to average 1 hor per response, inclding the time tor reviewing instrctions,

More information

2.1 Unconstrained Graph Partitioning. 1.2 Contributions. 1.3 Related Work. 1.4 Paper Organization 2. GRAPH-THEORETIC APPROACH

2.1 Unconstrained Graph Partitioning. 1.2 Contributions. 1.3 Related Work. 1.4 Paper Organization 2. GRAPH-THEORETIC APPROACH Mining Newsgrops Using Networks Arising From Social Behavior Rakesh Agrawal Sridhar Rajagopalan Ramakrishnan Srikant Yirong X IBM Almaden Research Center 6 Harry Road, San Jose, CA 95120 ABSTRACT Recent

More information

Evolutionary Path Planning for Robot Assisted Part Handling in Sheet Metal Bending

Evolutionary Path Planning for Robot Assisted Part Handling in Sheet Metal Bending Evoltionary Path Planning for Robot Assisted Part Handling in Sheet Metal Bending Abstract Xiaoyn Liao G. Gary Wang * Dept. of Mechanical & Indstrial Engineering, The University of Manitoba Winnipeg, MB,

More information

FaceTrust: Assessing the Credibility of Online Personas via Social Networks

FaceTrust: Assessing the Credibility of Online Personas via Social Networks FaceTrst: Assessing the Credibility of Online Personas via Social Networks Michael Sirivianos Kyngbaek Kim Xiaowei Yang Dke University University of California, Irvine Dke University msirivia@cs.dke.ed

More information

6 Funding and Staffing the Central IT Help Desk

6 Funding and Staffing the Central IT Help Desk 6 Fnding and Staffing the Central IT Help Desk Money may kindle, bt it cannot itself, or for very long, brn. Igor Stravinsky Key Findings At most instittions the central IT bdget is a major sorce of help

More information

KEYS TO BEING AN EFFECTIVE WORKPLACE PERSONAL ASSISTANT

KEYS TO BEING AN EFFECTIVE WORKPLACE PERSONAL ASSISTANT 5 KEYS TO BEING AN EFFECTIVE WORKPLACE PERSONAL ASSISTANT by: John Barrett Personal assistants (PAs) and their ability to effectively provide essential spports at the workplace are extremely important

More information

Research on Staff Explicitation in Organizational Knowledge Management Based on Fuzzy Set Similarity to Ideal Solution

Research on Staff Explicitation in Organizational Knowledge Management Based on Fuzzy Set Similarity to Ideal Solution Send Orders for Reprints to reprints@benthamscience.ae The Open Cybernetics & Systemics Jornal, 015, 9, 139-144 139 Open Access Research on Staff Explicitation in Organizational Knowledge Management Based

More information

Phone Banking Terms Corporate Accounts

Phone Banking Terms Corporate Accounts Phone Banking Terms Corporate Acconts If there is any inconsistency between the terms and conditions applying to an Accont and these Phone Banking Terms, these Phone Banking Terms prevail in respect of

More information

On a Generalized Graph Coloring/Batch Scheduling Problem

On a Generalized Graph Coloring/Batch Scheduling Problem Reglar Papers On a Generalized Graph Coloring/Batch Schedling Problem Giorgio Lcarelli 1, Ioannis Milis Dept. of Informatics, Athens University of Economics and Bsiness, 104 34, Athens, Greece, {glc, milis}@aeb.gr

More information

Kentucky Deferred Compensation (KDC) Program Summary

Kentucky Deferred Compensation (KDC) Program Summary Kentcky Deferred Compensation (KDC) Program Smmary Smmary and Highlights of the Kentcky Deferred Compensation (KDC) Program Simple. Smart. For yo. For life. 457 Plan 401(k) Plan Roth 401(k) Deemed Roth

More information

Stability of Linear Control System

Stability of Linear Control System Stabilit of Linear Control Sstem Concept of Stabilit Closed-loop feedback sstem is either stable or nstable. This tpe of characterization is referred to as absolte stabilit. Given that the sstem is stable,

More information

Comparative Studies of Load Balancing With Control and Optimization Techniques

Comparative Studies of Load Balancing With Control and Optimization Techniques 5 American Control Conference ne 8-, 5 Portland, OR, USA WeC5 Comparative Stdies of Load Balancing With Control and Optimization Techniqes Yixin Diao, Chai Wah W, oseph L Hellerstein, Adam Storm, Maheswaran

More information

Practical Tips for Teaching Large Classes

Practical Tips for Teaching Large Classes Embracing Diversity: Toolkit for Creating Inclsive, Learning-Friendly Environments Specialized Booklet 2 Practical Tips for Teaching Large Classes A Teacher s Gide Practical Tips for Teaching Large Classes:

More information

Pgrading To Windows XP 4.0 Domain Controllers and Services

Pgrading To Windows XP 4.0 Domain Controllers and Services C H A P T E R 8 Upgrading Windows NT 4.0 Domains to Windows Server 2003 Active Directory Upgrading yor domains from Microsoft Windows NT 4.0 to Windows Server 2003 Active Directory directory service enables

More information

Make the College Connection

Make the College Connection Make the College Connection A college planning gide for stdents and their parents Table of contents The compelling case for college 2 Selecting a college 3 Paying for college 5 Tips for meeting college

More information

Configuration Management for Software Product Lines

Configuration Management for Software Product Lines onfigration Management for Software Prodct Lines Roland Laqa and Peter Knaber Franhofer Institte for Experimental Software Engineering (IESE) Saerwiesen 6 D-67661 Kaiserslatern, Germany +49 6301 707 161

More information

Dimension Debasing towards Minimal Search Space Utilization for Mining Patterns in Big Data

Dimension Debasing towards Minimal Search Space Utilization for Mining Patterns in Big Data Volme: 3 Isse: 8 59-594 Dimension Debasing towards Minimal Search Space Utilization for Mining Patterns in Big Data Dr. M. Naga Ratna Dept. of Compter Science JNTUH College of Engineering Email: mratnajnt@jnth.ac.in

More information

Mining Social Media with Social Theories: A Survey

Mining Social Media with Social Theories: A Survey Mining Media with Theories: A Srvey Jiliang Tang Compter Science & Eng Arizona State University Tempe, AZ, USA Jiliang.Tang@as.ed Yi Chang Yahoo!Labs Yahoo!Inc Snnyvale,CA, USA yichang@yahooinc.com Han

More information

Towers Watson Manager Research

Towers Watson Manager Research Towers Watson Manager Research How we se fnd performance data Harald Eggerstedt 13. März 212 212 Towers Watson. All rights reserved. Manager selection at Towers Watson The goal is to find managers that

More information

Health Benefits Coverage Under Federal Law...

Health Benefits Coverage Under Federal Law... covers Labor Compliance 2014mx.pdf 1 11/19/2014 2:05:01 PM Compliance Assistance Gide Health Benefits Coverage Under Federal Law... The Affordable Care Act Health Insrance Portability and Accontability

More information

Inferring Continuous Dynamic Social Influence and Personal Preference for Temporal Behavior Prediction

Inferring Continuous Dynamic Social Influence and Personal Preference for Temporal Behavior Prediction Inferring Continos Dynamic Social Inflence and Personal Preference for Temporal Behavior Prediction Jn Zhang 1,2,3,4 Chaokn Wang 2,3,4 Jianmin Wang 2,3,4 Jeffrey X Y 5 1 Department of Compter Science and

More information

7 Help Desk Tools. Key Findings. The Automated Help Desk

7 Help Desk Tools. Key Findings. The Automated Help Desk 7 Help Desk Tools Or Age of Anxiety is, in great part, the reslt of trying to do today s jobs with yesterday s tools. Marshall McLhan Key Findings Help desk atomation featres are common and are sally part

More information

Chapter 2. ( Vasiliy Koval/Fotolia)

Chapter 2. ( Vasiliy Koval/Fotolia) hapter ( Vasili Koval/otolia) This electric transmission tower is stabilied b cables that eert forces on the tower at their points of connection. In this chapter we will show how to epress these forces

More information

Resource Pricing and Provisioning Strategies in Cloud Systems: A Stackelberg Game Approach

Resource Pricing and Provisioning Strategies in Cloud Systems: A Stackelberg Game Approach Resorce Pricing and Provisioning Strategies in Clod Systems: A Stackelberg Game Approach Valeria Cardellini, Valerio di Valerio and Francesco Lo Presti Talk Otline Backgrond and Motivation Provisioning

More information

Primary Analysis of Effective Permeability of the Flame in Burning Natural Gas

Primary Analysis of Effective Permeability of the Flame in Burning Natural Gas Jornal of etals, aterials and inerals. Vol.7 No. pp.63-66. rimary Analysis of Effective ermeability of the Flame in Brning Natral Gas Rakoš JAROSAV * and Repasova AGDAENA * Department of Thermal Technology,

More information

Candidate: Kevin Taylor. Date: 04/02/2012

Candidate: Kevin Taylor. Date: 04/02/2012 Systems Analyst / Network Administrator Assessment Report 04/02/2012 www.resorceassociates.com To Improve Prodctivity Throgh People. 04/02/2012 Prepared For: Resorce Associates Prepared by: John Lonsbry,

More information

Manipulating Deformable Linear Objects: Characteristic Features for Vision-Based Detection of Contact State Transitions

Manipulating Deformable Linear Objects: Characteristic Features for Vision-Based Detection of Contact State Transitions Maniplating Deformable Linear Objects: Characteristic Featres for Vision-Based Detection of Contact State Transitions Jürgen Acker Dominik Henrich Embedded Systems and Robotics Lab. (RESY) Faclty of Informatics,

More information

Galvin s All Things Enterprise

Galvin s All Things Enterprise Galvin s All Things Enterprise The State of the Clod, Part 2 PETER BAER GALVIN Peter Baer Galvin is the CTO for Corporate Technologies, a premier systems integrator and VAR (www.cptech. com). Before that,

More information