Introduction to Fuzzy Logic Control

1 Introduction to Fuzzy Logic Control 1

2 Outline General Definition Applications Operations Rules Fuzzy Logic Toolbox FIS Editor Tipping Problem: Fuzzy Approach Defining Inputs & Outputs Defining MFs Defining Fuzzy Rules 2

3 General Definition Fuzzy Logic Lotfi Zadeh, Berkeley A superset of conventional (Boolean) logic that has been extended to handle the concept of partial truth. The central notion of fuzzy systems is that truth values (in fuzzy logic) or membership values (in fuzzy sets) are indicated by a value in the range [0.0, 1.0], with 0.0 representing absolute falseness and 1.0 representing absolute truth. Fuzzy logic deals with real-world vagueness.
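As a minimal illustration of partial truth (a sketch assuming the Fuzzy Logic Toolbox is installed; the temperature range and breakpoints are made up for the example):

t = 0:5:40;                     % temperatures in degrees C
mu_warm = trimf(t, [15 25 35]); % triangular membership in "warm": 0 below 15, peak of 1 at 25, 0 above 35
disp([t' mu_warm'])             % membership values between 0.0 (not warm at all) and 1.0 (fully warm)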

4 Applications Expert Systems Control Units Bullet train between Tokyo and Osaka Video Cameras Automatic Transmissions

5 Operations [diagrams of fuzzy set operations on sets A and B]

6 Controller Structure Fuzzification: scales and maps input variables to fuzzy sets. Inference Mechanism: approximate reasoning; deduces the control action. Defuzzification: converts fuzzy output values to control signals.

7 MATLAB fuzzy logic toolbox The MATLAB Fuzzy Logic Toolbox facilitates the development of fuzzy-logic systems using graphical user interface (GUI) tools and command-line functionality. The tool can be used for building fuzzy expert systems and Adaptive Neuro-Fuzzy Inference Systems (ANFIS). 7

8 Graphical User Interface (GUI) Tools There are five primary GUI tools for building, editing, and observing fuzzy inference systems in the Fuzzy Logic Toolbox: Fuzzy Inference System (FIS) Editor Membership Function Editor Rule Editor Rule Viewer Surface Viewer 8

9 MATLAB: Fuzzy Logic Toolbox 9

10 MATLAB: Fuzzy Logic Toolbox 10

11 Fuzzy Inference System There are two types of inference system: the Mamdani inference method and the Sugeno inference method. *Mamdani's fuzzy inference method is the most common methodology. 11

12 FIS Editor: Mamdani's inference system 12

13 Fuzzy Logic Examples using MATLAB Goal: control the speed of a motor by changing the input voltage. When a set point is defined, if for some reason the motor runs faster than the set point, we need to slow it down by reducing the input voltage. If the motor slows below the set point, the input voltage must be increased so that the motor speed reaches the set point. 13

14 Input/Output Let the input status words be: too slow, just right, too fast. Let the output action words be: less voltage (slow down), no change, more voltage (speed up). 14

15 FIS Editor: Adding Input / Output 15

16 FIS Editor: Adding Input / Output 16

17 Membership Function Editor 17

18 Input Membership Function 18

19 Output Membership Function 19

20 Membership Functions 20

21 Rules Define the rule base: 1) If the motor is running too slow, then more voltage. 2) If the motor speed is about right, then no change. 3) If the motor speed is too fast, then less voltage. 21
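The same system can also be built from the command line. The sketch below uses the classic Fuzzy Logic Toolbox functions (newer releases also provide mamfis/addInput/addMF); the speed range, voltage range, and membership-function parameters are assumed for illustration and are not taken from the slides:

fis = newfis('motor');                                     % Mamdani FIS
fis = addvar(fis, 'input',  'speed',   [0 5000]);          % motor speed (rpm), assumed range
fis = addmf (fis, 'input',  1, 'too_slow',   'trapmf', [0 0 1500 2500]);
fis = addmf (fis, 'input',  1, 'just_right', 'trimf',  [1500 2500 3500]);
fis = addmf (fis, 'input',  1, 'too_fast',   'trapmf', [2500 3500 5000 5000]);
fis = addvar(fis, 'output', 'voltage', [-10 10]);          % change in input voltage, assumed range
fis = addmf (fis, 'output', 1, 'less',      'trimf', [-10 -5 0]);
fis = addmf (fis, 'output', 1, 'no_change', 'trimf', [-2.5 0 2.5]);
fis = addmf (fis, 'output', 1, 'more',      'trimf', [0 5 10]);
rules = [1 3 1 1;    % rule rows: [input MF, output MF, weight, AND/OR]; too slow   -> more voltage
         2 2 1 1;    %                                                   just right -> no change
         3 1 1 1];   %                                                   too fast   -> less voltage
fis = addrule(fis, rules);

Once the FIS exists, its membership functions can be inspected with plotmf and the system evaluated with evalfis, as on the later slides.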

22 Membership Function Editor: Adding Rules 22

23 Rule Base 23

24 Rule Viewer 24

25 Surface Viewer 25

26 Save the file as one.fis. Now type the following in the command window to get the result:
>> fis = readfis('one');
>> out = evalfis(2437.4, fis)
out =

27 Sugeno-Type Fuzzy Inference The Takagi-Sugeno-Kang method of fuzzy inference is similar to the Mamdani method in many respects: fuzzifying the inputs and applying the fuzzy operator are exactly the same. The main difference between Mamdani and Sugeno is that the Sugeno output membership functions are either linear or constant. 27

28 FIS Editor: Sugeno inference system 28

29 Add Input/output variables 29

30 Define Input/output variables 30

31 Add Input MF 31

32 Define Input MF 32

33 Add output MF 33

34 Define output MF 34

35 Add rules 35

36 Define Rule Base 36

37 View rules 37

38 Rules viewer 38

39 Surface viewer 39

40 Advantages of the Sugeno Method A Sugeno system is a more compact and computationally efficient representation than a Mamdani system. It works well with linear techniques (e.g., PID control). It works well with optimization and adaptive techniques. It has guaranteed continuity of the output surface. It is well suited to mathematical analysis. 40

41 Advantages of the Mamdani Method It is intuitive. It has widespread acceptance. It is well suited to human input. 41

42 Support Vector Machine & Its Applications

43 Overview Introduction to Support Vector Machines (SVM) Properties of SVM Applications: Gene Expression Data Classification, Text Categorization (if time permits) Discussion

44 Support Vector Machine (SVM) The fundamental principle of classification using the SVM is to separate two categories of patterns: map the data x into a higher-dimensional feature space via a nonlinear mapping; linear classification (regression) in the high-dimensional space is then equivalent to nonlinear classification (regression) in the low-dimensional space.

45 Linear Classifiers [diagram: input x → f(x,w,b) → estimated label y est] f(x,w,b) = sign(w x + b); w x + b > 0 denotes +1 and w x + b < 0 denotes -1. How would you classify this data?
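A minimal numerical sketch of this decision rule in MATLAB (the weight vector, bias, and test point are arbitrary values chosen for illustration):

w = [2; -1];  b = 0.5;        % assumed weight vector and bias
x = [1; 3];                   % an arbitrary test point
y_est = sign(w' * x + b)      % returns +1 or -1 (0 only if x lies exactly on the boundary)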

46 Linear Classifiers f(x,w,b) = sign(w x + b). How would you classify this data?

47 Linear Classifiers f(x,w,b) = sign(w x + b). How would you classify this data?

48 Linear Classifiers f(x,w,b) = sign(w x + b). Any of these would be fine... but which is best?

49 Linear Classifiers f(x,w,b) = sign(w x + b). How would you classify this data? (One point is misclassified into the +1 class.)

50 Classifier Margin f(x,w,b) = sign(w x + b). Define the margin of a linear classifier as the width that the boundary could be increased by before hitting a datapoint.

51 Maximum Margin f(x,w,b) = sign(w x + b). The maximum margin linear classifier is the linear classifier with the, um, maximum margin. This is the simplest kind of SVM (called an LSVM, Linear SVM). Support Vectors are those datapoints that the margin pushes up against. 1. Maximizing the margin is good according to intuition and PAC theory. 2. It implies that only support vectors are important; other training examples are ignorable. 3. Empirically it works very very well.

52 Linear SVM Mathematically M = margin width. What we know: w · x+ + b = +1 and w · x- + b = -1, so w · (x+ - x-) = 2. Therefore M = (x+ - x-) · w / ||w|| = 2 / ||w||.

53 Linear SVM Mathematically Goal: 1) Correctly classify all training data: w x i + b ≥ 1 if y i = +1, w x i + b ≤ -1 if y i = -1, i.e., y i (w x i + b) ≥ 1 for all i. 2) Maximize the margin M = 2 / ||w||, which is the same as minimizing ½ w^T w. We can formulate a Quadratic Optimization Problem and solve for w and b: minimize Φ(w) = ½ w^T w subject to y i (w x i + b) ≥ 1 for all i.

54 Solving the Optimization Problem Find w and b such that Φ(w) = ½ w^T w is minimized and for all {(x i, y i )}: y i (w^T x i + b) ≥ 1. We need to optimize a quadratic function subject to linear constraints. Quadratic optimization problems are a well-known class of mathematical programming problems, and many (rather intricate) algorithms exist for solving them. The solution involves constructing a dual problem in which a Lagrange multiplier α i is associated with every constraint in the primal problem: find α 1 ... α N such that Q(α) = Σα i - ½ ΣΣ α i α j y i y j x i^T x j is maximized and (1) Σα i y i = 0, (2) α i ≥ 0 for all α i.
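As a rough sketch of how this dual could be solved numerically in MATLAB with quadprog (Optimization Toolbox), assuming X is an N-by-d matrix of training points and y an N-by-1 vector of +/-1 labels (both hypothetical inputs, not defined in the slides):

N   = size(X, 1);
H   = (y * y') .* (X * X');          % H(i,j) = y_i y_j x_i' x_j (add a tiny ridge to H if the solver warns)
f   = -ones(N, 1);                   % maximizing sum(alpha) equals minimizing -sum(alpha)
Aeq = y';  beq = 0;                  % equality constraint: sum_i alpha_i y_i = 0
lb  = zeros(N, 1);  ub = [];         % alpha_i >= 0, no upper bound (hard margin)
alpha = quadprog(H, f, [], [], Aeq, beq, lb, ub);
sv = alpha > 1e-6;                   % support vectors have non-zero alpha
w  = X' * (alpha .* y);              % w = sum_i alpha_i y_i x_i
b  = mean(y(sv) - X(sv,:) * w);      % b = y_k - w' x_k, averaged over the support vectors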

55 The Optimization Problem Solution The solution has the form: w = Σα i y i x i and b = y k - w^T x k for any x k such that α k > 0. Each non-zero α i indicates that the corresponding x i is a support vector. Then the classifying function will have the form: f(x) = Σα i y i x i^T x + b. Notice that it relies on an inner product between the test point x and the support vectors x i (we will return to this later). Also keep in mind that solving the optimization problem involved computing the inner products x i^T x j between all pairs of training points.

56 Dataset with noise Hard Margin: so far we require all data points to be classified correctly - no training error. What if the training set is noisy? Solution 1: use very powerful kernels - but this leads to OVERFITTING!

57 Soft Margin Classification Slack variables ξ i can be added to allow misclassification of difficult or noisy examples. What should our quadratic optimization criterion be? Minimize ½ w · w + C Σ k=1..R ξ k.

58 Hard Margin vs. Soft Margin The old formulation: find w and b such that Φ(w) = ½ w^T w is minimized and for all {(x i, y i )}: y i (w^T x i + b) ≥ 1. The new formulation, incorporating slack variables: find w and b such that Φ(w) = ½ w^T w + C Σξ i is minimized and for all {(x i, y i )}: y i (w^T x i + b) ≥ 1 - ξ i and ξ i ≥ 0 for all i. The parameter C can be viewed as a way to control overfitting.
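In MATLAB, the soft-margin trade-off appears as the BoxConstraint (the parameter C) of fitcsvm in the Statistics and Machine Learning Toolbox; a minimal sketch, assuming X and y are hypothetical training data:

mdl  = fitcsvm(X, y, 'KernelFunction', 'linear', 'BoxConstraint', 10);  % C = 10
yhat = predict(mdl, X);   % a larger C penalizes slack more, i.e. tolerates fewer training errors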

59 Linear SVMs: Overview The classifier is a separating hyperplane. The most important training points are the support vectors; they define the hyperplane. Quadratic optimization algorithms can identify which training points x i are support vectors with non-zero Lagrangian multipliers α i. Both in the dual formulation of the problem and in the solution, training points appear only inside dot products: find α 1 ... α N such that Q(α) = Σα i - ½ ΣΣ α i α j y i y j x i^T x j is maximized and (1) Σα i y i = 0, (2) 0 ≤ α i ≤ C for all α i; f(x) = Σα i y i x i^T x + b.

60 Non-linear SVMs Datasets that are linearly separable with some noise work out great. But what are we going to do if the dataset is just too hard? How about mapping the data to a higher-dimensional space, e.g., x → (x, x²)? [1-D examples plotted on the x axis and after mapping to the (x, x²) plane]

61 Non-linear SVMs: Feature spaces General idea: the original input space can always be mapped to some higher-dimensional feature space where the training set is separable: Φ: x → φ(x)

62 The Kernel Trick The linear classifier relies on the dot product between vectors: K(x i, x j) = x i^T x j. If every data point is mapped into a high-dimensional space via some transformation Φ: x → φ(x), the dot product becomes K(x i, x j) = φ(x i)^T φ(x j). A kernel function is a function that corresponds to an inner product in some expanded feature space. Example: for 2-dimensional vectors x = [x 1, x 2], let K(x i, x j) = (1 + x i^T x j)². We need to show that K(x i, x j) = φ(x i)^T φ(x j): K(x i, x j) = (1 + x i^T x j)² = 1 + x i1² x j1² + 2 x i1 x j1 x i2 x j2 + x i2² x j2² + 2 x i1 x j1 + 2 x i2 x j2 = [1, x i1², √2 x i1 x i2, x i2², √2 x i1, √2 x i2]^T [1, x j1², √2 x j1 x j2, x j2², √2 x j1, √2 x j2] = φ(x i)^T φ(x j), where φ(x) = [1, x 1², √2 x 1 x 2, x 2², √2 x 1, √2 x 2].
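A quick numerical check of this identity in MATLAB (the two test vectors are arbitrary):

phi = @(x) [1; x(1)^2; sqrt(2)*x(1)*x(2); x(2)^2; sqrt(2)*x(1); sqrt(2)*x(2)];
xi = [1; 2];  xj = [3; -1];          % arbitrary 2-dimensional test vectors
k_direct = (1 + xi' * xj)^2;         % kernel evaluated directly in the input space
k_mapped = phi(xi)' * phi(xj);       % inner product in the expanded feature space
abs(k_direct - k_mapped)             % zero (up to round-off): the two agree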

63 What Functions are Kernels? For some functions K(x i, x j), checking that K(x i, x j) = φ(x i)^T φ(x j) can be cumbersome. Mercer's theorem: every positive semi-definite symmetric function is a kernel. Positive semi-definite symmetric functions correspond to a positive semi-definite symmetric Gram matrix:
K = [ K(x 1,x 1) K(x 1,x 2) K(x 1,x 3) ... K(x 1,x N);
      K(x 2,x 1) K(x 2,x 2) K(x 2,x 3) ... K(x 2,x N);
      ...
      K(x N,x 1) K(x N,x 2) K(x N,x 3) ... K(x N,x N) ]

64 Examples of Kernel Functions Linear: K(x i, x j) = x i^T x j. Polynomial of power p: K(x i, x j) = (1 + x i^T x j)^p. Gaussian (radial-basis function network): K(x i, x j) = exp(-||x i - x j||² / (2σ²)). Sigmoid: K(x i, x j) = tanh(β 0 x i^T x j + β 1).
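For example, the Gaussian kernel's Gram matrix can be computed in MATLAB roughly as follows (toy data and kernel width are assumed; pdist2 is from the Statistics and Machine Learning Toolbox):

X = [0 0; 1 1; 2 0];           % assumed toy data, one point per row
sigma = 1;                     % assumed kernel width
D2 = pdist2(X, X).^2;          % squared pairwise Euclidean distances
K  = exp(-D2 / (2*sigma^2));   % Gram matrix: K(i,j) = exp(-||x_i - x_j||^2 / (2*sigma^2))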

65 Non-linear SVMs Mathematically Dual problem formulation: find α 1 ... α N such that Q(α) = Σα i - ½ ΣΣ α i α j y i y j K(x i, x j) is maximized and (1) Σα i y i = 0, (2) α i ≥ 0 for all α i. The solution is f(x) = Σα i y i K(x i, x) + b. The optimization techniques for finding the α i 's remain the same!

66 Nonlinear SVM - Overview The SVM locates a separating hyperplane in the feature space and classifies points in that space. It does not need to represent the space explicitly; it simply defines a kernel function. The kernel function plays the role of the dot product in the feature space.

67 Properties of SVM Flexibility in choosing a similarity function. Sparseness of the solution when dealing with large data sets - only support vectors are used to specify the separating hyperplane. Ability to handle large feature spaces - the complexity does not depend on the dimensionality of the feature space. Overfitting can be controlled by the soft margin approach. Nice math property: a simple convex optimization problem which is guaranteed to converge to a single global solution. Feature selection.

68 SVM Applications SVM has been used successfully in many real-world problems - text (and hypertext) categorization - image classification - bioinformatics (Protein classification, Cancer classification) - hand-written character recognition

69 Weakness of SVM It is sensitive to noise - a relatively small number of mislabeled examples can dramatically decrease the performance. It only considers two classes - how to do multi-class classification with SVM? Answer: 1) With output arity m, learn m SVMs: SVM 1 learns Output==1 vs. Output!=1, SVM 2 learns Output==2 vs. Output!=2, ..., SVM m learns Output==m vs. Output!=m. 2) To predict the output for a new input, just predict with each SVM and find out which one puts the prediction the furthest into the positive region.
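A rough MATLAB sketch of this one-vs-rest scheme, assuming hypothetical variables X (N-by-d training data), labels (N-by-1 class indices 1..m), and Xnew (test data); it uses fitcsvm from the Statistics and Machine Learning Toolbox:

classes = unique(labels);
m = numel(classes);
svms = cell(m, 1);
for k = 1:m
    svms{k} = fitcsvm(X, labels == classes(k));   % SVM k learns "class k" vs "not class k"
end
scores = zeros(size(Xnew, 1), m);
for k = 1:m
    [~, s] = predict(svms{k}, Xnew);              % s(:,2) is the score for the positive class
    scores(:, k) = s(:, 2);
end
[~, idx] = max(scores, [], 2);                    % pick the SVM that is most confidently positive
ypred = classes(idx);

MATLAB also provides fitcecoc, which wraps multi-class SVM schemes of this kind.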

70 Some Issues Choice of kernel: a Gaussian or polynomial kernel is the default; if ineffective, more elaborate kernels are needed, and domain experts can give assistance in formulating appropriate similarity measures. Choice of kernel parameters: e.g., σ in the Gaussian kernel; σ is the distance between the closest points with different classifications. Optimization criterion: hard margin vs. soft margin. In practice, a lengthy series of experiments in which various parameters are tested is often required.

71 Wind Power Forecasting (WPF) WPF is a technique that provides information on how much wind power can be expected at a given point in time. It is needed because of the increasing penetration of wind power into the electric power grid. Good short-term forecasting will ensure grid stability and a favorable trading performance on the electricity markets.

72 ε-SVM The objective function of the ε-SVM is based on an ε-insensitive loss function. The formula for the ε-SVM is given as follows:
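The slide's formula is not reproduced in the transcript; for reference, the standard ε-insensitive support vector regression objective (a common formulation, assumed here rather than taken from the slide) is:

minimize ½ ||w||² + C Σ i=1..N (ξ i + ξ i *) subject to
y i - (w · φ(x i ) + b) ≤ ε + ξ i ,
(w · φ(x i ) + b) - y i ≤ ε + ξ i *,
ξ i , ξ i * ≥ 0 for all i,

where the slack variables ξ i and ξ i * allow deviations larger than ε from the regression function.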

73 Structure of SVM

74 Data Resolution The resolution of the dataset is 10 minutes. Each data point represents the average wind speed and power within one hour. The data values between two adjacent samples x i and x i+1 are assumed to change linearly, that is: x̂ j (t) = x i + ((x i+1 - x i ) / dt i ) t, 0 ≤ t ≤ dt i , where dt i is the time interval between x i and x i+1 .
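A tiny MATLAB sketch of this interpolation between two adjacent samples (the sample values and the 10-minute interval are assumed for illustration):

xi  = 5.2;  xip1 = 6.0;                 % two adjacent wind-speed samples (assumed values, m/s)
dti = 10;                               % minutes between the samples
t   = 0:1:dti;                          % query times within the interval
xhat = xi + (xip1 - xi) / dti .* t;     % x_hat_j(t) = x_i + ((x_{i+1} - x_i) / dt_i) * t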

75 Data Value The average value of the data within T s can be calculated as x̄ j (t i ) = (1 / T s ) ∫ from t i to t i + T s of x̂ j (t) dt, where T s = 60 minutes is used in very short-term forecasting (less than 6 hours) and T s = 2 hours is used for short-term forecasting.

76 Fixed-Step Prediction Scheme For a prediction horizon of h steps, fixed-step forecasting means that only the value of the h-th sample ahead is predicted using the historical data: ŷ(t + h) = f(y t , y t-1 , ..., y t-d ), where f is a nonlinear function generated by the SVM. y t+h is predicted with the data before y t (the red blocks), and y t+h-1 is predicted with the data before y t-1 (the green blocks).

77 Wind speed normalization

78 Autocorrelations of the wind speed samples

79 SVM model and the RBF model

80 1h-ahead wind power prediction using the SVM model.

81 CONCLUSIONS The SVM has been successfully applied to problems of pattern classification, particularly the classification of two different categories of patterns. The SVM model is more suitable for very short-term and short-term WPF. It provides a powerful tool for enhancing WPF accuracy.
