# Nonlinear Regression in Parametric Activation Studies

Büchel et al.


…tion of a number of basis functions. Well-known examples are the Fourier expansion and the polynomial expansion. We adapt this technique to characterize rCBF responses in terms of a set of basis functions of a given parameter (e.g., study parameter or response) and show how rCBF responses can be approximated by a small number of these basis functions.

In general, the goodness of fit of the regression depends on the type and number of basis functions chosen and on the number of data points to be approximated. We expect rCBF responses in cognitive activation studies to be smooth and monotonic. In the context of this paper we restrict ourselves to a PET activation study with 12 runs per subject, resulting in 12 data points for the regression. Given the smoothness and the small number of data points, a small set of basis functions is appropriate to characterize the relationship. The first two basis functions of the discrete cosine transformation and a second-order polynomial expansion were studied. Comparison of the goodness of fit for the two approaches showed no marked difference. The cosine basis functions gave slightly better results in modeling ceiling or floor effects (see Figs. 1 and 2); however, the second-order polynomial expansion was chosen because its coefficients are easier to interpret. Concerning the number of basis functions, additional tests showed that increasing the order of the polynomial beyond 2 did not increase the goodness of fit dramatically, but did lower the degrees of freedom available for statistical inference. In general, the optimal number and type of basis functions is a question of model selection, and this will be the topic of a subsequent paper.

SPMs (Friston et al., 1995) can be considered spatially extended statistical processes. In general, SPM uses the general linear model to build t statistics.
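The comparison between the two candidate basis sets can be illustrated numerically. The sketch below (variable names, the simulated response, and the noise level are illustrative assumptions, not data from the study) fits both a second-order polynomial basis and the first two discrete-cosine basis functions to 12 points:

```python
import numpy as np

# Hypothetical 12-run parametric study: word rate as the study parameter.
n_runs = 12
rate = np.linspace(0, 90, n_runs)
x = (rate - rate.mean()) / rate.std()      # centred/scaled parameter

# Second-order polynomial basis: linear and quadratic terms.
poly_basis = np.column_stack([x, x**2])

# First two (non-constant) discrete-cosine basis functions.
k = np.arange(n_runs)
dct_basis = np.column_stack([
    np.cos(np.pi * (2 * k + 1) * 1 / (2 * n_runs)),
    np.cos(np.pi * (2 * k + 1) * 2 / (2 * n_runs)),
])

def goodness_of_fit(basis, y):
    """Residual sum of squares after a least-squares fit (with intercept)."""
    G = np.column_stack([np.ones(len(y)), basis])
    beta, *_ = np.linalg.lstsq(G, y, rcond=None)
    resid = y - G @ beta
    return resid @ resid

# A smooth, monotonic response with a ceiling, plus noise (simulated).
rng = np.random.default_rng(0)
y = 1 - np.exp(-rate / 30) + 0.05 * rng.standard_normal(n_runs)

print(goodness_of_fit(poly_basis, y), goodness_of_fit(dct_basis, y))
```

With only 12 smooth data points, both two-function bases leave small residuals, which mirrors the observation that the two approaches show no marked difference in fit.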
In the special case of parametric studies it is used to make statistical inferences about the correlation between rCBF and a study parameter. The basic equation of the general linear model is

$$x_{ij} = g_{i1}\beta_{1j} + \cdots + g_{ik}\beta_{kj} + e_{ij}. \qquad (1)$$

Here the β_kj are the k unknown parameters for each voxel j, and the coefficients g are the explanatory or modeled variables under which the observation (i.e., scan) was made. Comparing a standard polynomial expansion,

$$p(x) = p_1 x^{n} + p_2 x^{n-1} + \cdots + p_n x + p_{n+1}, \qquad (2)$$

with (1), it is evident that this is another representation of the general linear model. Equation (1) can be written in matrix format:

$$X = G\beta + e. \qquad (3)$$

G is the design matrix, which has one row for every scan and one column for every modeled (i.e., designed) effect; β is the parameter matrix, which has one row for each column of G and one column for every voxel in the volume; and e is the matrix of error terms. For different explanatory variables the matrix G can be partitioned into several matrices or column vectors of interest, G₁ and G₂, and of no interest, H, with G = [G₁ G₂ H], each with a corresponding partition of β (β = [β₁ β₂ γ]). In our case the polynomial basis functions are the explanatory variables used to model rCBF responses at different voxels. As we restricted our analysis to a second-order polynomial expansion, the column vector G₁ contains the explanatory variable itself and G₂ contains the squared values of this variable. The least-squares estimates b₁ and b₂ of β₁ and β₂ are the coefficients p = [p₁ p₂] of the second-order polynomial [see Eq. (2)] that best models the relation between rCBF and the variable in question. As mentioned above, the design matrix also contains a partition H of confounding effects, where effects of no interest, such as global blood flow and block effects, are modeled.
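A minimal sketch of Eqs. (1)–(3), assuming a toy design in which a constant column stands in for the confound partition H (all names and the simulated data are illustrative):

```python
import numpy as np

# 12 scans, a handful of voxels; word_rate is a hypothetical study parameter.
rng = np.random.default_rng(1)
n_scans, n_voxels = 12, 5
word_rate = np.linspace(0, 90, n_scans)
x = (word_rate - word_rate.mean()) / word_rate.std()

G1 = x[:, None]                 # explanatory variable itself
G2 = (x**2)[:, None]            # its squared values (quadratic effect)
H = np.ones((n_scans, 1))       # confound partition (here only a constant;
                                # global flow and block effects would go here)
G = np.hstack([G1, G2, H])      # design matrix: one row per scan

# Simulated data X = G*beta + e: linear + quadratic response plus noise.
beta_true = np.array([[1.0], [-0.5], [50.0]])
X = G @ np.tile(beta_true, (1, n_voxels)) \
    + 0.1 * rng.standard_normal((n_scans, n_voxels))

# Least-squares estimates: one column of parameters per voxel.
beta_hat, *_ = np.linalg.lstsq(G, X, rcond=None)
print(beta_hat[:, 0])   # close to beta_true for voxel 0
```

The first two rows of `beta_hat` are exactly the polynomial coefficients p = [p₁ p₂] of Eq. (2), estimated independently at every voxel.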
To test the overall significance of the effects of interest we test the null hypothesis that their introduction does not significantly reduce the error variance, which is equivalent to testing the hypothesis that both β₁ and β₂ are 0. Under the null hypothesis the model contains only the confounds H, and the error sum of squares, after discounting the effects of interest, is given by

$$R(\Omega_0) = (X - H\hat{\gamma})^{T}(X - H\hat{\gamma}), \qquad (4)$$

where γ̂ is the parameter estimate of γ. The alternative hypothesis includes the effects of interest. The error sum of squares and products under the alternative hypothesis is therefore given by

$$R(\Omega) = (X - G\hat{\beta})^{T}(X - G\hat{\beta}), \qquad (5)$$

and the F statistic for voxel i is given by

$$F_i = \frac{\bigl(R_i(\Omega_0) - R_i(\Omega)\bigr)/(r_0 - r)}{R_i(\Omega)/r}, \qquad (6)$$

where F has the F distribution with r₀ − r and r degrees of freedom. In general r₀ = N − rank(H) and r = N − rank(G), where N is the total number of scans. In our special case the F statistic reflects the goodness of the regression. An image of the F statistic at every voxel constitutes an SPM{F}. The SPM{F} can be interpreted as an image of the significance of the variance explained by the parameters of interest (i.e., the polynomials in G) relative to error. Due to the extremely large number of nonindependent univariate analyses, the probability that a region
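The extra-sum-of-squares F test of Eqs. (4)–(6) can be written compactly. The sketch below assumes the data X, the full design G, and the confound-only design H are already available as NumPy arrays (names are illustrative):

```python
import numpy as np

def spm_f(X, G, H):
    """Return F values (one per column/voxel of X) and degrees of freedom."""
    N = X.shape[0]
    r0 = N - np.linalg.matrix_rank(H)   # df under the null (confounds only)
    r = N - np.linalg.matrix_rank(G)    # df under the full model

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, X, rcond=None)
        resid = X - design @ beta
        return np.sum(resid**2, axis=0)

    R0, R = rss(H), rss(G)              # Eqs. (4) and (5), diagonal terms
    F = ((R0 - R) / (r0 - r)) / (R / r) # Eq. (6)
    return F, (r0 - r, r)

# Toy example: 12 scans, 3 voxels, a real effect in voxel 0 only.
rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 12)
H = np.ones((12, 1))
G = np.column_stack([x, x**2, np.ones(12)])
X = 0.1 * rng.standard_normal((12, 3))
X[:, 0] += 2 * x + x**2

F, (df1, df2) = spm_f(X, G, H)
print(F, df1, df2)   # voxel 0 has a much larger F than voxels 1 and 2
```

In this toy design df = 2, 9; in the study itself additional confound columns (global flow, blocks) reduce the denominator df to the reported values such as df = 2, 8.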

reaches an uncorrected threshold by chance is very high. On the basis of the theory of Gaussian fields we characterize a local excursion of the SPM by its maximal F value. This simple characterization has a certain probability of chance occurrence. The probability of getting one or more voxels with a certain F value in a given SPM{F} is the same as the probability that the largest F value of the SPM{F} is greater than this F value. At high thresholds this probability equals the expected number of maxima. Therefore the problem of calculating a corrected P value can be reduced to finding the expected number of maxima at or above this threshold. In a theoretical paper, Worsley (1994) derived an equation for the probability that the largest F value of the SPM{F} is greater than a threshold f, on the basis of the smoothness of the underlying Gaussian component processes. The result in three dimensions is

$$
P(F_{\max} \geq f) \approx \lambda(C)\,\det(\Lambda)^{1/2}\,
\frac{\Gamma\!\left((m+n-3)/2\right)}{(2\pi)^{3/2}\,\Gamma(m/2)\,\Gamma(n/2)}
\left(\frac{nf}{m}\right)^{(n-3)/2}
\left(1+\frac{nf}{m}\right)^{-(m+n-2)/2}
\left\{(m-1)(m-2)\left(\frac{nf}{m}\right)^{2} - (2mn-m-n-1)\frac{nf}{m} + (n-1)(n-2)\right\}, \qquad (7)
$$

where f is the threshold, n and m are the degrees of freedom of the F statistic, λ(C) is the Lebesgue measure of the region C (here the number of voxels in the volume), and Λ is the variance–covariance matrix of the first derivative of the underlying Gaussian component processes.

FIG. 1. SPM{F} for linear regression in the patient who had recovered from an ischemic infarction in the territory of the left middle cerebral artery. Voxels over a threshold of F = 15 are shown. The plot shows adjusted activity in relation to word rate. The voxel investigated has the coordinates x = 54, y = −18, and z = 4 mm with respect to the standard space defined by Talairach and Tournoux (1988). F = 75; df 1, 9; P < … (uncorrected).

FIG. 2. SPM{F} for nonlinear regression using polynomials. Voxels over a threshold of F = 15 are shown. The regression for a voxel in a left frontal region (x = −34, y = 40, and z = 24 mm) that was not apparent in the linear regression is shown. F = 20; df 2, 8; P < … (uncorrected).

P(F_max ≥ f) therefore corresponds to a corrected P value for a voxel with f = F. This general approach to parametric studies can also be extended to compare the nonlinear responses of different groups. This comparison can be subdivided into statistical inferences on similarities and on differences between groups. To test for differences, the polynomials appear twice in the design matrix. In the first partition the functions are replicated in both groups, whereas in the second partition the polynomials are inverted for the second group. This mirror-like appearance of the polynomial basis functions models differential responses. The second, mirrored part of the design matrix is the partition of interest, whereas the first, symmetrical part is a confounding covariate. The SPM{F} is therefore an image of the significance of the differences in rCBF responses between the two groups. Defining the similarities as a confounding covariate is necessary in this case to make specific inferences about different rCBF responses in the two subjects. To test for similarities, the symmetric polynomials (replicated for both groups) are the covariates of interest, and the SPM{F} highlights where rCBF responses are similar in both groups.

FIG. 3. As for Fig. 1 but using cosine basis functions instead of the linear regression. The regression for the same voxel as in Fig. 1 is plotted. F = 63; df 2, 8; P < … (uncorrected). Note the ability of the cosine basis functions to better model floor and ceiling effects. The first two columns of the design matrix show the cosine basis functions.

FIG. 4. SPM{F} for nonlinear regression using polynomials in the control subject. Note the cluster of voxels in the left temporal region, absent in the patient. The plot for the same voxel as in Figs. 1 and 3 in the right superior temporal region is shown. F = 37; df 2, 8; P < … (uncorrected).

## AN EXAMPLE
The single case and the comparison are illustrated with a 12-run H₂¹⁵O PET activation study using an auditory paradigm with word presentation rates increasing from 0 to 90 words per minute. The data were obtained with a CTI PET camera (Model 935B; CTI, Knoxville, TN). A patient who had recovered from severe aphasia after an infarction largely confined to the left temporal lobe, and involving the whole of the superior temporal gyrus, was studied. To illustrate the comparison of regressions between different subjects, a normal volunteer was

FIG. 5. SPM{F}, regression plot, and design matrix for the comparison of patient and control subject. Voxels over a threshold of F = 15 are shown. The regression for similar rCBF responses at a voxel (x = 62, y = −22, z = 12 mm) is shown. F = 49; df 2, 19; P < … (uncorrected).

investigated using the same paradigm. This paper emphasizes the implementation of the technique; full results and their neurobiological implications will be presented separately.

Figure 1 illustrates the standard linear approach and shows the SPM{F} in a maximum intensity projection, the corresponding design matrix in image format, and the regression plot for the patient. The SPM{F} shows voxels over a threshold of F = 15. Significant regions include the right perisylvian area and parts of the anterior temporal lobe. The linear regression for a voxel in the periauditory cortex (x = 54, y = −18, and z = 4 mm; F = 75; df 1, 9; P < … [uncorrected]) is shown. Comparing the nonlinear approach shown in Fig. 2 with this SPM{F}, an additional area in the left frontal region reaches significance at the same threshold. As expected, the regression for this voxel (x = −34, y = 40, z = 24 mm) shows a highly nonlinear ("U-shaped") rCBF response in relation to word rate (Fig. 2, bottom). As described above, the F values represent the significance of the variance introduced by the parameters of interest; in our case the F values directly reflect the goodness of fit. The solid line is the graphical representation of the regression and demonstrates the ability of the technique to approximate nonlinear rCBF responses.

Figure 3 shows the SPM{F} and the regression plot for the same voxel as in Fig. 1 using cosine basis functions. The design matrix contains the two cosine basis functions and the values of global blood flow, defined as a covariate of no interest. The cosine basis functions model floor and ceiling effects slightly better than the polynomials: F = 63 for the cosine basis functions and F = 43 for the polynomials (regression not shown), both with df 2, 8. The higher F value (F = 75) for the linear regression at this voxel, compared with the nonlinear techniques (polynomials F = 43, cosine basis functions F = 63), reflects the degrees of freedom lost by introducing a second covariate of interest in the nonlinear case. When the regressions of Figs. 1 and 3 are compared visually, the better fit obtained with the cosine basis functions is evident.

Figure 4 shows a corresponding plot at the same voxel as in Figs. 1 and 3 for the normal subject. Note the

similar rCBF response in the right temporal region. The control subject also shows a significant area in the left temporal region (an area that was involved by the infarction in the patient).

The SPM{F} for similarities, plots of word rate against rCBF at a voxel in the right temporal region [x = 62, y = −22, z = 12 mm with respect to the standard space defined by Talairach and Tournoux (1988)] for both subjects, and the design matrix in image format are shown in Fig. 5. The analysis reveals areas where rCBF responses to the study parameter are similar in the patient and in the control subject. The first two columns of the design matrix are the covariates of interest (i.e., the commonalities). Confounding covariates are global blood flow (column 5) and block effects (columns 3 and 4). Since the patient did not show any significant regression in the left temporal region (see Fig. 1), the significant similarities are restricted to the right superior temporal region, as shown in Fig. 5.

Figure 6 demonstrates the statistical inference about differences in rCBF responses to the study parameter between groups. The first two columns of the design matrix are the covariates of interest (i.e., the differences). Confounding covariates are the commonalities (columns 3 and 4), global blood flow (column 7), and block effects (columns 5 and 6). Two maxima are apparent in the maximum intensity projection of the SPM{F}. To demonstrate the nonlinear regression we have chosen a voxel in the left hippocampus (x = −18, y = −26, z = −12 mm). Note the decrease of rCBF with increasing word rate in the control subject, whereas the patient shows an increase.

FIG. 6. SPM{F}, regression plot, and design matrix for the differences in rCBF responses between the patient and the control subject. Voxels over a threshold of F = 10 are shown. The regression for differential rCBF responses at a voxel (x = −18, y = −26, z = −12 mm) is shown. F = 17; df 2, 17; P < … (uncorrected).

## DISCUSSION

The application of a general nonlinear fitting technique allows the detection of rCBF responses in brain regions that might not have been evident using simple (i.e., linear) correlation coefficients. The general approach using polynomial expansions avoids predefined fit functions and can model a variety of nonlinear rCBF responses without specifying the form of the expected regression. Using this technique, different brain areas can show differential responses to a study parameter, which can then be used to characterize each area involved in the task. The technique may also improve the discrimination of different, but spatially adjacent, areas that are active in the same task but with different rCBF responses. A clinical application in this respect could be the investigation of cortical reorganization after cerebral injury (e.g., trauma or stroke), as well as of physiologic correlates of learning. A possible study design could look for different responses in anatomically defined language areas (e.g., Broca's and Wernicke's areas) in normal controls and compare those findings with data from patients suffering from different types of aphasia after a stroke, as shown in our example.

Although we have restricted our model to a second-order polynomial regression, other basis functions could be used. The use of cosine basis functions has some advantages in modeling ceiling or floor effects; however, the interpretation of polynomial coefficients (i.e., a decomposition into linear and nonlinear effects) is more intuitive than the interpretation of cosine coefficients. This may be important in experimental analyses where the introduction of nonlinearity (the second-order polynomial) considerably improves the fit. In general, the question of the number and type of basis functions is an issue of model selection.
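The two-group design matrices described above (mirrored polynomials for the differences, replicated polynomials for the similarities) can be sketched as follows. All names and the 12-scan layout per subject are illustrative, and the global-flow confound column is omitted for brevity:

```python
import numpy as np

n = 12
x = np.linspace(-1, 1, n)                  # mean-centred study parameter
poly = np.column_stack([x, x**2])          # second-order polynomial basis

common = np.vstack([poly, poly])           # replicated: same response in both
mirror = np.vstack([poly, -poly])          # inverted for subject 2: differences

# Block (subject) effects as confounds.
blocks = np.vstack([
    np.column_stack([np.ones(n), np.zeros(n)]),
    np.column_stack([np.zeros(n), np.ones(n)]),
])

# Differences: the mirrored partition is of interest, the replicated one a confound.
G_diff = np.hstack([mirror, common, blocks])
# Similarities: the replicated polynomials become the covariates of interest.
G_sim = np.hstack([common, blocks])

print(G_diff.shape, G_sim.shape)   # (24, 6) and (24, 4)
```

A convenient property of this construction is that the mirrored and replicated partitions are mutually orthogonal, so the differential effect can be assessed while the common response is treated purely as a confound.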
Statistical inferences on the basis of the SPM{F} reported here are uncorrected for multiple comparisons. In those circumstances where a precise a priori anatomical

hypothesis exists, the correction of P values for multiple comparisons is not necessary. However, in cases where it is impossible to predict the spatial localization of the activation, or where activation falls outside the hypothesized area, reporting uncorrected P values is not statistically appropriate. Based on the equation given by Worsley (1994), it should be possible to estimate corrected P values on the basis of the theory of Gaussian fields. This depends on estimating the smoothness of the components of the F field and will be the subject of a subsequent article.

We have demonstrated nonlinear regression in the context of a PET activation study. Other major applications are functional magnetic resonance imaging and magnetoencephalography. These imaging techniques allow repeated measurements on the same subject over a short period of time, which makes them especially suitable for correlational or parametric designs. In general, the technique presented here can be applied to such large data sets, although the number and type of basis functions might have to be adjusted according to the nature of the response. In conclusion, we hope that this novel approach will provide a richer characterization of nonlinear brain responses to stimulus or task parameters.

## ACKNOWLEDGMENTS

We thank Keith Worsley for his comments on this manuscript. R.J.S.W., K.J.F., and C.B. were funded by the Wellcome Trust. C.J.M. was funded by the Medical Research Council, and J.B.P. was funded by the European Community.

## REFERENCES

Friston, K. J., Holmes, A. P., Worsley, K. J., Poline, J. B., Frith, C. D., and Frackowiak, R. S. J. 1995. Statistical parametric maps in functional imaging: A general linear approach. Hum. Brain Map. 2.

Grafton, S., Mazziotta, J., Presty, S., Friston, K. J., Frackowiak, R. S. J., and Phelps, M. 1992. Functional anatomy of human procedural learning determined with regional cerebral blood flow and PET. J. Neurosci. 12.

Price, C., Wise, R. J. S., Ramsay, S., Friston, K. J., Howard, D., Patterson, K., and Frackowiak, R. S. J. 1992. Regional response differences within the human auditory cortex when listening to words. Neurosci. Lett. 146.

Talairach, J., and Tournoux, P. 1988. Co-planar Stereotaxic Atlas of the Human Brain: 3-Dimensional Proportional System: An Approach to Cerebral Imaging. Thieme, Stuttgart/New York.

Worsley, K. J. 1994. Local maxima and the expected Euler characteristic of excursion sets of χ², F and t fields. Adv. Appl. Prob. 26: 13–42.


### Chapter 14: Analyzing Relationships Between Variables

Chapter Outlines for: Frey, L., Botan, C., & Kreps, G. (1999). Investigating communication: An introduction to research methods. (2nd ed.) Boston: Allyn & Bacon. Chapter 14: Analyzing Relationships Between

### Using the Singular Value Decomposition

Using the Singular Value Decomposition Emmett J. Ientilucci Chester F. Carlson Center for Imaging Science Rochester Institute of Technology emmett@cis.rit.edu May 9, 003 Abstract This report introduces

Lecture 11: Outliers and Influential data Regression III: Advanced Methods William G. Jacoby Michigan State University http://polisci.msu.edu/jacoby/icpsr/regress3 Outlying Observations: Why pay attention?

### 1.The Brainvisa Hierarchy for fmri databases

fmri Toolbox of Brainvisa.The Brainvisa Hierarchy for fmri databases The Brainvisa software defines a directory structure in order to help the selection of various files for the available processing steps.

### Supplemental Information: Structure of Orbitofrontal Cortex Predicts Social Influence

1 Supplemental Information: Structure of Orbitofrontal Cortex Predicts Social Influence Daniel K Campbell Meiklejohn, Ryota Kanai, Bahador Bahrami, Dominik R Bach, Raymond J Dolan, Andreas Roepstorff &

### CHAPTER 11 CHI-SQUARE: NON-PARAMETRIC COMPARISONS OF FREQUENCY

CHAPTER 11 CHI-SQUARE: NON-PARAMETRIC COMPARISONS OF FREQUENCY The hypothesis testing statistics detailed thus far in this text have all been designed to allow comparison of the means of two or more samples

Lecture 4: Transformations Regression III: Advanced Methods William G. Jacoby Michigan State University Goals of the lecture The Ladder of Roots and Powers Changing the shape of distributions Transforming

### Module 5: Multiple Regression Analysis

Using Statistical Data Using to Make Statistical Decisions: Data Multiple to Make Regression Decisions Analysis Page 1 Module 5: Multiple Regression Analysis Tom Ilvento, University of Delaware, College

### Solving Systems of Linear Equations Using Matrices

Solving Systems of Linear Equations Using Matrices What is a Matrix? A matrix is a compact grid or array of numbers. It can be created from a system of equations and used to solve the system of equations.

### Factor Analysis. Chapter 420. Introduction

Chapter 420 Introduction (FA) is an exploratory technique applied to a set of observed variables that seeks to find underlying factors (subsets of variables) from which the observed variables were generated.

### X X X a) perfect linear correlation b) no correlation c) positive correlation (r = 1) (r = 0) (0 < r < 1)

CORRELATION AND REGRESSION / 47 CHAPTER EIGHT CORRELATION AND REGRESSION Correlation and regression are statistical methods that are commonly used in the medical literature to compare two or more variables.

### 1. What is the critical value for this 95% confidence interval? CV = z.025 = invnorm(0.025) = 1.96

1 Final Review 2 Review 2.1 CI 1-propZint Scenario 1 A TV manufacturer claims in its warranty brochure that in the past not more than 10 percent of its TV sets needed any repair during the first two years

### Partial Least Squares (PLS) Regression.

Partial Least Squares (PLS) Regression. Hervé Abdi 1 The University of Texas at Dallas Introduction Pls regression is a recent technique that generalizes and combines features from principal component

### Self Organizing Maps: Fundamentals

Self Organizing Maps: Fundamentals Introduction to Neural Networks : Lecture 16 John A. Bullinaria, 2004 1. What is a Self Organizing Map? 2. Topographic Maps 3. Setting up a Self Organizing Map 4. Kohonen

### Variance of OLS Estimators and Hypothesis Testing. Randomness in the model. GM assumptions. Notes. Notes. Notes. Charlie Gibbons ARE 212.

Variance of OLS Estimators and Hypothesis Testing Charlie Gibbons ARE 212 Spring 2011 Randomness in the model Considering the model what is random? Y = X β + ɛ, β is a parameter and not random, X may be

### SAS Software to Fit the Generalized Linear Model

SAS Software to Fit the Generalized Linear Model Gordon Johnston, SAS Institute Inc., Cary, NC Abstract In recent years, the class of generalized linear models has gained popularity as a statistical modeling

### SPECIAL PERTURBATIONS UNCORRELATED TRACK PROCESSING

AAS 07-228 SPECIAL PERTURBATIONS UNCORRELATED TRACK PROCESSING INTRODUCTION James G. Miller * Two historical uncorrelated track (UCT) processing approaches have been employed using general perturbations

### Integration and Visualization of Multimodality Brain Data for Language Mapping

Integration and Visualization of Multimodality Brain Data for Language Mapping Andrew V. Poliakov, PhD, Kevin P. Hinshaw, MS, Cornelius Rosse, MD, DSc and James F. Brinkley, MD, PhD Structural Informatics

### Multivariate Analysis of Variance. The general purpose of multivariate analysis of variance (MANOVA) is to determine

2 - Manova 4.3.05 25 Multivariate Analysis of Variance What Multivariate Analysis of Variance is The general purpose of multivariate analysis of variance (MANOVA) is to determine whether multiple levels

### Thresholding of Statistical Maps in Functional Neuroimaging Using the False Discovery Rate 1

NeuroImage 15, 870 878 (2002) doi:10.1006/nimg.2001.1037, available online at http://www.idealibrary.com on Thresholding of Statistical Maps in Functional Neuroimaging Using the False Discovery Rate 1

### Least Squares Estimation

Least Squares Estimation SARA A VAN DE GEER Volume 2, pp 1041 1045 in Encyclopedia of Statistics in Behavioral Science ISBN-13: 978-0-470-86080-9 ISBN-10: 0-470-86080-4 Editors Brian S Everitt & David

### Metric Multidimensional Scaling (MDS): Analyzing Distance Matrices

Metric Multidimensional Scaling (MDS): Analyzing Distance Matrices Hervé Abdi 1 1 Overview Metric multidimensional scaling (MDS) transforms a distance matrix into a set of coordinates such that the (Euclidean)

### Dimensionality Reduction: Principal Components Analysis

Dimensionality Reduction: Principal Components Analysis In data mining one often encounters situations where there are a large number of variables in the database. In such situations it is very likely

### Appendix 4 Simulation software for neuronal network models

Appendix 4 Simulation software for neuronal network models D.1 Introduction This Appendix describes the Matlab software that has been made available with Cerebral Cortex: Principles of Operation (Rolls

### Unit 31 A Hypothesis Test about Correlation and Slope in a Simple Linear Regression

Unit 31 A Hypothesis Test about Correlation and Slope in a Simple Linear Regression Objectives: To perform a hypothesis test concerning the slope of a least squares line To recognize that testing for a

### 496 STATISTICAL ANALYSIS OF CAUSE AND EFFECT

496 STATISTICAL ANALYSIS OF CAUSE AND EFFECT * Use a non-parametric technique. There are statistical methods, called non-parametric methods, that don t make any assumptions about the underlying distribution

### Simple Regression Theory II 2010 Samuel L. Baker

SIMPLE REGRESSION THEORY II 1 Simple Regression Theory II 2010 Samuel L. Baker Assessing how good the regression equation is likely to be Assignment 1A gets into drawing inferences about how close the

### Pearson s Correlation

Pearson s Correlation Correlation the degree to which two variables are associated (co-vary). Covariance may be either positive or negative. Its magnitude depends on the units of measurement. Assumes the

### December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B. KITCHENS

December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B KITCHENS The equation 1 Lines in two-dimensional space (1) 2x y = 3 describes a line in two-dimensional space The coefficients of x and y in the equation

### Multivariate Logistic Regression

1 Multivariate Logistic Regression As in univariate logistic regression, let π(x) represent the probability of an event that depends on p covariates or independent variables. Then, using an inv.logit formulation

### EEG COHERENCE AND PHASE DELAYS: COMPARISONS BETWEEN SINGLE REFERENCE, AVERAGE REFERENCE AND CURRENT SOURCE DENSITY

Version 1, June 13, 2004 Rough Draft form We apologize while we prepare the manuscript for publication but the data are valid and the conclusions are fundamental EEG COHERENCE AND PHASE DELAYS: COMPARISONS

### Effective Connectivity Modeling with the eusem and GIMME

Effective Connectivity Modeling with the eusem and GIMME Benjamin Zinszer PhD candidate, Department of Psychology MAS candidate, Department of Statistics Center for Language Science Pennsylvania State

### Outline of Topics. Statistical Methods I. Types of Data. Descriptive Statistics

Statistical Methods I Tamekia L. Jones, Ph.D. (tjones@cog.ufl.edu) Research Assistant Professor Children s Oncology Group Statistics & Data Center Department of Biostatistics Colleges of Medicine and Public

### Lecture 7: Factor Analysis. Laura McAvinue School of Psychology Trinity College Dublin

Lecture 7: Factor Analysis Laura McAvinue School of Psychology Trinity College Dublin The Relationship between Variables Previous lectures Correlation Measure of strength of association between two variables

### This chapter will demonstrate how to perform multiple linear regression with IBM SPSS

CHAPTER 7B Multiple Regression: Statistical Methods Using IBM SPSS This chapter will demonstrate how to perform multiple linear regression with IBM SPSS first using the standard method and then using the

### Auditory memory and cerebral reorganization in post-linguistically deaf adults

Auditory memory and cerebral reorganization in post-linguistically deaf adults Implications for cochlear implantation outcome D Lazard, HJ Lee, E Truy, AL Giraud Ecole Normale Supérieure, Inserm U960,

### UNDERSTANDING THE TWO-WAY ANOVA

UNDERSTANDING THE e have seen how the one-way ANOVA can be used to compare two or more sample means in studies involving a single independent variable. This can be extended to two independent variables

### False discovery rate and permutation test: An evaluation in ERP data analysis

Research Article Received 7 August 2008, Accepted 8 October 2009 Published online 25 November 2009 in Wiley Interscience (www.interscience.wiley.com) DOI: 10.1002/sim.3784 False discovery rate and permutation

### MATHEMATICAL METHODS FOURIER SERIES

MATHEMATICAL METHODS FOURIER SERIES I YEAR B.Tech By Mr. Y. Prabhaker Reddy Asst. Professor of Mathematics Guru Nanak Engineering College Ibrahimpatnam, Hyderabad. SYLLABUS OF MATHEMATICAL METHODS (as

### Geostatistics Exploratory Analysis

Instituto Superior de Estatística e Gestão de Informação Universidade Nova de Lisboa Master of Science in Geospatial Technologies Geostatistics Exploratory Analysis Carlos Alberto Felgueiras cfelgueiras@isegi.unl.pt