NEUROIMAGE 4 (1996)

Nonlinear Regression in Parametric Activation Studies

C. BÜCHEL, R. J. S. WISE, C. J. MUMMERY, J.-B. POLINE, AND K. J. FRISTON

The Wellcome Department of Cognitive Neurology, Institute of Neurology, Queen Square, London WC1N 3BG, United Kingdom

Received March 13, 1996

Parametric study designs can reveal information about the relationship between a study parameter (e.g., word presentation rate) and regional cerebral blood flow (rCBF) in functional imaging. The brain's responses to a study parameter may be nonlinear; the (linear) correlation coefficient often used in the analysis of parametric studies may therefore not characterize them adequately. We present a noninteractive method that fits nonlinear functions of stimulus or task parameters to rCBF responses, using second-order polynomial expansions. The technique is implemented in the context of the general linear model and statistical parametric mapping. We also consider the usefulness of statistical inferences, based on F fields, about similarities and differences of these nonlinear responses in different groups. The approach is illustrated with a 12-run H₂¹⁵O PET activation study using an auditory paradigm of increasing word presentation rates. A patient who had recovered from severe aphasia and a normal control were studied. We demonstrate the ability of this new technique to identify brain regions where rCBF is closely related to increasing word presentation rate in both subjects, without constraining the nature of this relationship, and to identify where these nonlinear responses differ. © 1996 Academic Press, Inc.

INTRODUCTION

Based on the premise that regional cerebral blood flow (rCBF) varies with the amount of cortical processing engaged by an experimental task, parametric or correlational designs can reveal information about the relationship between a study parameter (e.g., word presentation rate), a behavioral response (e.g., reaction time), and rCBF.
The variable being correlated can be either continuous (e.g., reaction time) or discrete (e.g., word presentation rate). For example, Grafton et al. (1992) demonstrated correlations between rCBF and the performance of a motor tracking task in the supplementary motor area and thalamus. Price et al. (1992) investigated the correlation between rCBF and the frequency of aural word presentation in normal subjects. They found a linear relationship between rCBF and word rate in the periauditory regions. rCBF in Wernicke's area, however, was not associated with increased word rate but with the presence or absence of semantic content. This example illustrates the ability of a parametric approach to characterize and differentiate brain regions by the slope of their rCBF response in relation to the task parameters. Because the form of the relationship between parameters and rCBF varies across brain regions and is unknown in advance, the a priori definition of a fit function (e.g., the linear function implicit in the correlation coefficient) may yield an insufficient or partial result. We present an extension of the current analysis of parametric studies to overcome this restriction. A noninteractive method, which regresses nonlinear functions of stimulus or task parameters on rCBF responses (using multilinear regression with second-order polynomial expansions), is introduced. The proposed regression model is nonlinear in the explanatory variables but not in the unknown parameters. This differs from nonlinear regression in the statistics literature, where the model is nonlinear in the unknown parameters. In the first part of this paper we show how the approach is implemented in the context of the general linear model and statistical parametric mapping (SPM). The second aim of this work is to introduce the SPM{F} as a useful tool in this context. We assess the goodness of fit using spatially extended F statistics [i.e., SPM{F} instead of SPM{Z} (Friston et al., 1995)].
Because parametric studies may reveal differences and similarities of rCBF response patterns in different groups, we also consider statistical inferences about similarities and differences of those nonlinear rCBF response patterns in group studies. The new approach is demonstrated and compared to a linear correlation with a parametric H₂¹⁵O positron emission tomography (PET) activation study.

METHODS

Digital signal processing uses various techniques to characterize discrete signals by a combination of a number of basis functions. Well-known examples are the Fourier expansion and the polynomial expansion. We adapt this technique to characterize rCBF responses in terms of a set of basis functions of a given parameter (e.g., study parameter or behavioral response) and show how rCBF responses can be approximated by a small number of these basis functions. In general the goodness of fit of the regression depends on the type and number of basis functions chosen and on the number of data points to be approximated. We expect rCBF responses in cognitive activation studies to be smooth and monotonic. In the context of this paper we restrict ourselves to a PET activation study with 12 runs per subject, giving 12 data points for the regression. Given the smoothness and the small number of data points, a small set of basis functions is appropriate to characterize the relationship. The first two basis functions of the discrete cosine transformation and a second-order polynomial expansion were studied. Comparison of the goodness of fit for the two approaches showed no marked difference. Cosine basis functions gave slightly better results in modeling ceiling or floor effects (see Figs. 1 and 3). However, the second-order polynomial expansion was chosen because its coefficients are easier to interpret. Concerning the number of basis functions, additional tests showed that increasing the order of the polynomials beyond 2 did not increase the goodness of fit appreciably, but lowered the degrees of freedom available for statistical inference. In general the issue of an optimal number and type of basis functions is the domain of model selection and will be the topic of a subsequent paper.

SPMs (Friston et al., 1995) can be considered spatially extended statistical processes. In general SPM uses the general linear model to construct statistic images.
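The two candidate basis sets — a second-order polynomial expansion and the first basis functions of the discrete cosine transform — can be sketched in a few lines. This is a minimal illustration in Python/NumPy; the function names are ours, and only the 12-scan, 0–90 words/min grid is taken from the example below:

```python
import numpy as np

def polynomial_basis(param, order=2):
    """One column per power of the (mean-centred) study parameter."""
    c = np.asarray(param, dtype=float)
    c = c - c.mean()  # centring reduces collinearity with the constant term
    return np.column_stack([c ** k for k in range(1, order + 1)])

def dct_basis(n_scans, n_funcs=2):
    """First n_funcs discrete cosine transform basis functions."""
    i = np.arange(n_scans)
    return np.column_stack(
        [np.cos(np.pi * k * (2 * i + 1) / (2 * n_scans))
         for k in range(1, n_funcs + 1)])

rates = np.linspace(0.0, 90.0, 12)   # 12 scans, word rate 0-90 words/min
G_poly = polynomial_basis(rates)     # columns: rate, rate**2 (centred)
G_dct = dct_basis(12)                # two smooth cosine regressors
```

Either matrix supplies the two columns of interest of the design matrix; the smooth, saturating shape of the first cosine function is what gives the cosine set its slight edge for ceiling and floor effects.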
In the special case of parametric studies it is used to make statistical inferences about the correlation of rCBF and a study parameter. The basic equation of the general linear model is

x_ij = g_i1 β_1j + … + g_ik β_kj + e_ij.   (1)

Here the β_kj are the k unknown parameters for each voxel j. The coefficients g are the explanatory or modeled variables under which the observation (i.e., scan) was made. Comparing a standard polynomial expansion,

p(x) = p_1 x^n + p_2 x^(n−1) + … + p_n x + p_(n+1),   (2)

to (1), it is evident that the expansion is simply another instance of the general linear model. Equation (1) can be written in matrix form:

X = Gβ + e.   (3)

G is the design matrix, which has one row for every scan and one column for every modeled (i.e., "designed") effect; β is the parameter matrix, which has one row for each column of G and one column for every voxel in the volume; and e is the matrix of error terms. For different explanatory variables the matrix G can be partitioned into column vectors of interest, G_1 and G_2, and of no interest, H, so that G = [G_1 G_2 H], each with unique parameters in β (β = [β_1 β_2 γ]). In our case the polynomial basis functions are the explanatory variables used to model rCBF responses at different voxels. As we restricted our analysis to a second-order polynomial expansion, the column vector G_1 contains the explanatory variable itself and G_2 contains the squared values of this variable. The least-squares estimates b_1 and b_2 of β_1 and β_2 are the coefficients p = [p_1 p_2] of the second-order polynomial [see Eq. (2)] that best models the relation between rCBF and the variable in question. As mentioned above, the design matrix also contains a partition of confounding effects, H, in which effects of no interest, such as global blood flow and block effects, are modeled.
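A concrete sketch of this design, its least-squares fit, and the partitioned F test of Eqs. (4)–(6) for a single voxel follows. The synthetic rCBF values and the stand-in global-flow covariate are invented for illustration; only the 12-scan quadratic design reflects the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 12
rate = np.linspace(0.0, 90.0, n_scans)     # word presentation rate
c = rate - rate.mean()                     # centred study parameter

G1 = c[:, None]                            # the explanatory variable itself
G2 = (c ** 2)[:, None]                     # its squared values
H = np.column_stack([np.ones(n_scans),     # block (constant) effect
                     np.cos(np.arange(n_scans)) + 50.0])  # mock global flow
G = np.hstack([G1, G2, H])                 # design matrix of Eq. (3)

# synthetic rCBF response with a mild ceiling-like quadratic component
X = 50.0 + 0.05 * c - 4e-4 * c ** 2 + rng.normal(0.0, 0.2, n_scans)

def rss_df(M, X):
    """Residual sum of squares and residual df for design matrix M."""
    b, *_ = np.linalg.lstsq(M, X, rcond=None)
    res = X - M @ b
    return float(res @ res), len(X) - np.linalg.matrix_rank(M)

R0, r0 = rss_df(H, X)                      # reduced model, cf. Eq. (4)
R, r = rss_df(G, X)                        # full model, cf. Eq. (5)
F = (r / (r0 - r)) * (R0 - R) / R          # cf. Eq. (6)
```

With a constant and one global-flow confound in H, the degrees of freedom come out as (r0 − r, r) = (2, 8), matching the df 2, 8 reported for the nonlinear fits in the figures.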
To test the overall significance of the effects of interest we test the null hypothesis that their introduction does not significantly reduce the error variance, which is equivalent to testing the hypothesis that both β_1 and β_2 are 0. The error sum of squares under the null hypothesis, i.e., discounting the effects of interest, is given by

R(Ω_0) = (X − Hg)ᵀ(X − Hg),   (4)

where g is the least-squares estimate of γ. The alternative hypothesis includes the effects of interest. The error sum of squares under the alternative hypothesis is therefore given by

R(Ω) = (X − Gb)ᵀ(X − Gb),   (5)

and the F statistic for voxel i is given by

F_i = [r / (r_0 − r)] · [R_i(Ω_0) − R_i(Ω)] / R_i(Ω),   (6)

where F has the F distribution with r_0 − r and r degrees of freedom. In general r_0 = N − rank(H) and r = N − rank(G), where N is the total number of scans. In our special case the F statistic reflects the goodness of the regression. An image of the F statistic at every voxel constitutes an SPM{F}. The SPM{F} can be interpreted as an image of the significance of the variance explained by the parameters of interest (i.e., the polynomials in G) relative to error.

Due to the extremely large number of nonindependent univariate analyses, the probability that a region
reaches an uncorrected threshold by chance is very high. On the basis of the theory of Gaussian fields we characterize a local excursion of the SPM by its maximal F value. This simple characterization has a certain probability of occurring by chance. The probability of obtaining one or more voxels at or above a given F value in an SPM{F} is the same as the probability that the largest F value of the SPM{F} exceeds that value. At high thresholds this probability approximates the expected number of maxima. The problem of calculating a corrected P value therefore reduces to finding the expected number of maxima at or above the threshold. In a theoretical paper, Worsley (1994) derived an expression for the probability that the largest F value of the SPM{F} exceeds a threshold f, on the basis of the smoothness of the underlying Gaussian component processes. The result in three dimensions is

P(F_max ≥ f) ≈ λ(C) det(Λ)^(1/2) · {Γ[(m + n − 3)/2] / [(2π)² Γ(m/2) Γ(n/2)]}
               × (nf/m)^((n−3)/2) (1 + nf/m)^(−(m+n−2)/2)
               × [(m − 1)(m − 2)(nf/m)² − (2mn − m − n − 1)(nf/m) + (n − 1)(n − 2)],   (7)

where f is the threshold, n and m are the degrees of freedom of the F statistic, λ(C) is the Lebesgue measure of the region C (here the number of voxels in the volume), and Λ is the variance–covariance matrix of the first derivative of the underlying Gaussian component processes.

FIG. 1. SPM{F} for linear regression in the patient, who had recovered from an ischemic infarction in the territory of the left middle cerebral artery. Voxels over a threshold of F = 15 are shown. The plot shows adjusted activity in relation to word rate. The voxel investigated has the coordinates x = 54, y = −18, z = 4 mm with respect to the standard space defined by Talairach and Tournoux (1988). F = 75; df 1, 9; P uncorrected.

FIG. 2. SPM{F} for nonlinear regression using polynomials. Voxels over a threshold of F = 15 are shown. The regression for a voxel in a left frontal region (x = −34, y = 40, z = 24 mm) that was not apparent in the linear regression is shown. F = 20; df 2, 8; P uncorrected.
P(F_max ≥ f) therefore corresponds to a corrected P value for a voxel with f = F.

This general approach to parametric studies can also be extended to compare the nonlinear responses of different groups. The comparison can be subdivided into statistical inferences on similarities and on differences between groups. To test for differences, the polynomials appear twice in the design matrix. In the first partition the functions are replicated in both groups, whereas in the second partition the polynomials are inverted for the second group. This mirror-like arrangement of the polynomial basis functions models differential responses. The second, mirrored partition of the design matrix is the one of interest, whereas the first, symmetric partition is a confounding covariate. The SPM{F} is therefore an image of the significance of the differences in rCBF responses between the two groups. Defining the similarities as a confounding covariate is necessary here to make specific inferences about differing rCBF responses in the two subjects. To test for similarities, the symmetric polynomials (replicated in both groups) are the covariates of interest, and the SPM{F} highlights where rCBF responses are similar in the two groups.

AN EXAMPLE

FIG. 3. As for Fig. 1 but using cosine basis functions instead of the linear regression. The regression for the same voxel as in Fig. 1 is plotted. F = 63; df 2, 8; P uncorrected. Note the ability of the cosine basis functions to better model floor and ceiling effects. The first two columns of the design matrix show the cosine basis functions.

FIG. 4. SPM{F} for nonlinear regression using polynomials in the control subject. Note the cluster of voxels in the left temporal region, absent in the patient. The plot for the same voxel as in Figs. 1 and 3 in the right superior temporal region is shown. F = 37; df 2, 8; P uncorrected.
The single case and the comparison are illustrated with a 12-run H₂¹⁵O PET activation study using an auditory paradigm with word presentation rates increasing from 0 to 90 words per minute. The data were obtained with a CTI PET camera (Model 935B; CTI, Knoxville, TN). A patient who had recovered from severe aphasia after an infarction largely confined to the left temporal lobe, involving the whole of the superior temporal gyrus, was studied. To illustrate the comparison of regressions between different subjects, a normal volunteer was
investigated using the same paradigm. This paper emphasizes the implementation of the technique; full results and their neurobiological implications will be presented separately.

FIG. 5. SPM{F}, regression plot, and design matrix for the comparison of patient and control subject. Voxels over a threshold of F = 15 are shown. The regression for similar rCBF responses at a voxel x = 62, y = −22, z = 12 mm is shown. F = 49; df 2, 19; P uncorrected.

Figure 1 illustrates the standard linear approach and shows the SPM{F} in a maximum intensity projection, the corresponding design matrix in image format, and the regression plot for the patient. The SPM{F} shows voxels over a threshold of F = 15. Significant regions include the right perisylvian area and parts of the anterior temporal lobe. The linear regression for a voxel (x = 54, y = −18, z = 4 mm; F = 75; df 1, 9; P uncorrected) in the periauditory cortex is shown. Comparing the nonlinear approach shown in Fig. 2 with this SPM{F}, an additional area in the left frontal region reaches significance at the same threshold. As expected, the regression at this voxel (x = −34, y = 40, z = 24 mm) shows a highly nonlinear ("U-shaped") rCBF response in relation to word rate (Fig. 2, bottom). As described above, the F values represent the significance of the variance introduced by the parameters of interest; in our case they directly reflect the goodness of fit. The solid line is the graphical representation of the regression and demonstrates the ability of the technique to approximate nonlinear rCBF responses. Figure 3 shows the SPM{F} and the regression plot for the same voxel as in Fig. 1 using cosine basis functions. The design matrix contains the two cosine basis functions and the values of global blood flow, defined as a covariate of no interest.
Cosine basis functions model floor and ceiling effects slightly better than the polynomials: F = 63 for the cosine basis functions and F = 43 for the polynomials (regression not shown), both with df 2, 8. The higher F value (F = 75) for the linear regression at this voxel, compared with the nonlinear techniques (polynomials F = 43, cosine basis functions F = 63), reflects the degrees of freedom lost by introducing a second covariate of interest in the nonlinear case. When the regressions of Figs. 1 and 3 are compared visually, the better fit obtained with the cosine basis functions is evident. Figure 4 shows a corresponding plot at the same voxel as in Figs. 1 and 3 for the normal subject. Note the
similar rCBF response in the right temporal region. The control subject also shows a significant area in the left temporal region (an area involved by the infarction in the patient). The SPM{F} for similarities, plots of word rate against rCBF at a voxel in the right temporal region [x = 62, y = −22, z = 12 mm with respect to the standard space defined by Talairach and Tournoux (1988)] for both subjects, and the design matrix in image format are shown in Fig. 5. This analysis reveals areas where rCBF responses to the study parameter are similar in the patient and the control subject. The first two columns of the design matrix are the covariates of interest (i.e., commonalities). Confounding covariates are global blood flow (column 5) and block effects (columns 3 and 4). Since the patient did not show any significant regression in the left temporal region (see Fig. 1), the significant similarities are restricted to the right superior temporal region, as shown in Fig. 5.

Figure 6 demonstrates the statistical inference about differences in rCBF responses to the parameter between groups. The first two columns of the design matrix are the covariates of interest (i.e., differences). Confounding covariates are the commonalities (columns 3 and 4), global blood flow (column 7), and block effects (columns 5 and 6). Two maxima are apparent in the maximum intensity projection of the SPM{F}. To demonstrate the nonlinear regression we have chosen a voxel in the left hippocampus (x = −18, y = −26, z = −12 mm). Note the decrease of rCBF with increasing word rate in the control subject, whereas the patient shows an increase.

FIG. 6. SPM{F}, regression plot, and design matrix for the differences in rCBF responses between the patient and the control subject. Voxels over a threshold of F = 10 are shown. The regression for differential rCBF responses at a voxel x = −18, y = −26, z = −12 mm is shown. F = 17; df 2, 17; P uncorrected.

DISCUSSION

The application of a general nonlinear fitting technique allows the detection of rCBF responses in brain regions that might not have been evident using simple (i.e., linear) correlation coefficients. The general approach using polynomial expansions avoids predefined fit functions and can model a variety of nonlinear rCBF responses without specifying the form of the expected regression. With this technique, different brain areas can show differential responses to a study parameter, which can then be used to characterize each area involved in the task. The technique may also improve the discrimination of different, but spatially nearby, areas that are active in the same task but with different rCBF responses. A clinical application could be the investigation of cortical reorganization after cerebral injury (e.g., trauma or stroke), as well as of the physiological correlates of learning. A possible study design could look for differential responses in anatomically defined language areas (e.g., Broca's and Wernicke's areas) in normal controls and compare those findings with data from patients suffering from different types of aphasia after a stroke, as in our example. Although we have restricted our model to a second-order polynomial regression, other basis functions could be used. Cosine basis functions have some advantages in modeling ceiling or floor effects; however, the interpretation of polynomial coefficients (i.e., a decomposition into linear and nonlinear effects) is more intuitive than the interpretation of cosine coefficients. This may be important in experiments where the introduction of nonlinearity (the second-order polynomial) considerably improves the fit. In general the question of the number and type of basis functions is an issue of model selection.
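The two-group designs used in this comparison — replicated ("symmetric") basis functions to test similarities, sign-flipped ("mirrored") copies to test differences — can be sketched as follows. The stacking of two 12-scan subjects and all names are illustrative:

```python
import numpy as np

def two_group_partitions(basis, group):
    """Return the symmetric partition (basis replicated in both groups,
    modelling commonalities) and the mirrored partition (basis
    sign-flipped in the second group, modelling differences)."""
    sign = np.where(np.asarray(group) == 0, 1.0, -1.0)[:, None]
    return basis, basis * sign

# two subjects, 12 scans each, word rate 0-90 words/min
rate = np.tile(np.linspace(0.0, 90.0, 12), 2)
group = np.repeat([0, 1], 12)
c = rate - rate.mean()
basis = np.column_stack([c, c ** 2])    # second-order expansion

common, mirror = two_group_partitions(basis, group)
# differences:  G = [mirror | common | confounds], mirror is of interest
# similarities: G = [common | confounds],          common is of interest
```

Testing the mirrored partition while confounding out the common one yields an SPM{F} of group differences; testing the common partition yields the similarities map, as in Figs. 5 and 6.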
Statistical inferences on the basis of the SPM{F}s reported here are uncorrected for multiple comparisons. In those circumstances where a precise a priori anatomical
hypothesis exists, correction of P values for multiple comparisons is not necessary. However, where it is impossible to predict the spatial localization of the activation, or where activation occurs outside the hypothesized area, reporting uncorrected P values is not statistically appropriate. Based on the equation given by Worsley (1994), it should be possible to estimate corrected P values on the basis of the theory of Gaussian fields. This depends on estimating the smoothness of the components of the F field and will be the subject of a subsequent article.

We have demonstrated nonlinear regression in the context of a PET activation study. Other major applications are functional magnetic resonance imaging and magnetoencephalography. These imaging techniques allow repeated measurements on the same subject over a short period of time, which makes them especially suitable for correlational or parametric designs. In general the technique presented here can be applied to such large data sets, but the number and type of basis functions might have to be adjusted according to the nature of the response. In conclusion, we hope that this novel approach will provide a richer characterization of nonlinear brain responses to stimulus or task parameters.

ACKNOWLEDGMENTS

We thank Keith Worsley for his comments on this manuscript. R.J.S.W., K.J.F., and C.B. were funded by the Wellcome Trust. C.J.M. was funded by the Medical Research Council, and J.B.P. was funded by the European Community.

REFERENCES

Friston, K. J., Holmes, A. P., Worsley, K. J., Poline, J. B., Frith, C. D., and Frackowiak, R. S. J. 1995. Statistical parametric maps in functional imaging: A general linear approach. Hum. Brain Map. 2.

Grafton, S., Mazziotta, J., Presty, S., Friston, K. J., Frackowiak, R. S. J., and Phelps, M. 1992. Functional anatomy of human procedural learning determined with regional cerebral blood flow and PET. J. Neurosci. 12.

Price, C., Wise, R. J. S., Ramsay, S., Friston, K. J., Howard, D., Patterson, K., and Frackowiak, R. S. J. 1992. Regional response differences within the human auditory cortex when listening to words. Neurosci. Lett. 146.

Talairach, J., and Tournoux, P. 1988. Co-planar Stereotaxic Atlas of the Human Brain: 3-Dimensional Proportional System: An Approach to Cerebral Imaging. Thieme, Stuttgart/New York.

Worsley, K. J. 1994. Local maxima and the expected Euler characteristic of excursion sets of χ², F and t fields. Adv. Appl. Prob. 26: 13–42.
Lecture 4: Transformations Regression III: Advanced Methods William G. Jacoby Michigan State University Goals of the lecture The Ladder of Roots and Powers Changing the shape of distributions Transforming
Using Statistical Data Using to Make Statistical Decisions: Data Multiple to Make Regression Decisions Analysis Page 1 Module 5: Multiple Regression Analysis Tom Ilvento, University of Delaware, College
Solving Systems of Linear Equations Using Matrices What is a Matrix? A matrix is a compact grid or array of numbers. It can be created from a system of equations and used to solve the system of equations.
Chapter 420 Introduction (FA) is an exploratory technique applied to a set of observed variables that seeks to find underlying factors (subsets of variables) from which the observed variables were generated.
CORRELATION AND REGRESSION / 47 CHAPTER EIGHT CORRELATION AND REGRESSION Correlation and regression are statistical methods that are commonly used in the medical literature to compare two or more variables.
1 Final Review 2 Review 2.1 CI 1-propZint Scenario 1 A TV manufacturer claims in its warranty brochure that in the past not more than 10 percent of its TV sets needed any repair during the first two years
Self Organizing Maps: Fundamentals Introduction to Neural Networks : Lecture 16 John A. Bullinaria, 2004 1. What is a Self Organizing Map? 2. Topographic Maps 3. Setting up a Self Organizing Map 4. Kohonen
Variance of OLS Estimators and Hypothesis Testing Charlie Gibbons ARE 212 Spring 2011 Randomness in the model Considering the model what is random? Y = X β + ɛ, β is a parameter and not random, X may be
SAS Software to Fit the Generalized Linear Model Gordon Johnston, SAS Institute Inc., Cary, NC Abstract In recent years, the class of generalized linear models has gained popularity as a statistical modeling
AAS 07-228 SPECIAL PERTURBATIONS UNCORRELATED TRACK PROCESSING INTRODUCTION James G. Miller * Two historical uncorrelated track (UCT) processing approaches have been employed using general perturbations
Integration and Visualization of Multimodality Brain Data for Language Mapping Andrew V. Poliakov, PhD, Kevin P. Hinshaw, MS, Cornelius Rosse, MD, DSc and James F. Brinkley, MD, PhD Structural Informatics
2 - Manova 4.3.05 25 Multivariate Analysis of Variance What Multivariate Analysis of Variance is The general purpose of multivariate analysis of variance (MANOVA) is to determine whether multiple levels
Metric Multidimensional Scaling (MDS): Analyzing Distance Matrices Hervé Abdi 1 1 Overview Metric multidimensional scaling (MDS) transforms a distance matrix into a set of coordinates such that the (Euclidean)
Appendix 4 Simulation software for neuronal network models D.1 Introduction This Appendix describes the Matlab software that has been made available with Cerebral Cortex: Principles of Operation (Rolls
Unit 31 A Hypothesis Test about Correlation and Slope in a Simple Linear Regression Objectives: To perform a hypothesis test concerning the slope of a least squares line To recognize that testing for a
496 STATISTICAL ANALYSIS OF CAUSE AND EFFECT * Use a non-parametric technique. There are statistical methods, called non-parametric methods, that don t make any assumptions about the underlying distribution
Pearson s Correlation Correlation the degree to which two variables are associated (co-vary). Covariance may be either positive or negative. Its magnitude depends on the units of measurement. Assumes the
December 4, 2013 MATH 171 BASIC LINEAR ALGEBRA B KITCHENS The equation 1 Lines in two-dimensional space (1) 2x y = 3 describes a line in two-dimensional space The coefficients of x and y in the equation
1 Multivariate Logistic Regression As in univariate logistic regression, let π(x) represent the probability of an event that depends on p covariates or independent variables. Then, using an inv.logit formulation
Version 1, June 13, 2004 Rough Draft form We apologize while we prepare the manuscript for publication but the data are valid and the conclusions are fundamental EEG COHERENCE AND PHASE DELAYS: COMPARISONS
Statistical Methods I Tamekia L. Jones, Ph.D. (email@example.com) Research Assistant Professor Children s Oncology Group Statistics & Data Center Department of Biostatistics Colleges of Medicine and Public
Lecture 7: Factor Analysis Laura McAvinue School of Psychology Trinity College Dublin The Relationship between Variables Previous lectures Correlation Measure of strength of association between two variables
CHAPTER 7B Multiple Regression: Statistical Methods Using IBM SPSS This chapter will demonstrate how to perform multiple linear regression with IBM SPSS first using the standard method and then using the
UNDERSTANDING THE e have seen how the one-way ANOVA can be used to compare two or more sample means in studies involving a single independent variable. This can be extended to two independent variables
Research Article Received 7 August 2008, Accepted 8 October 2009 Published online 25 November 2009 in Wiley Interscience (www.interscience.wiley.com) DOI: 10.1002/sim.3784 False discovery rate and permutation
Instituto Superior de Estatística e Gestão de Informação Universidade Nova de Lisboa Master of Science in Geospatial Technologies Geostatistics Exploratory Analysis Carlos Alberto Felgueiras firstname.lastname@example.org
Your consent to our cookies if you continue to use this website.