Journal of Modern Accounting and Auditing, ISSN 1548-6583, September 2013, Vol. 9, No. 9, 1275-1279. David Publishing.

Efficiency Rankings of MBA Programs in Indian Top Public Colleges

Patrick Jaska, University of Mary Hardin-Baylor, Texas, USA
Vedamuthu Kulandai Swamy, St. Joseph's PG College, Hyderabad, India

This study examines the relative efficiency of the top 21 Indian public colleges that offer MBAs. These colleges were chosen from a list provided by Careers 360, a magazine in India known for its university rankings. The purpose of this study was to evaluate the colleges on an efficiency basis rather than on the total-score ranking scale that is the common practice of most publications that rank universities or programs. The ranking method used in this study is based on data envelopment analysis (DEA), a non-parametric procedure for evaluating entities by examining inputs in relation to outputs achieved. The rankings produced by DEA differed somewhat from those given by Careers 360. The DEA analysis of this study ranks the universities that are most efficient at getting students the best salaries and return on investment (ROI) based on the inputs of diversity, work experience, and residency. The authors conclude, as previous studies have shown, that DEA is a useful and unbiased method of comparing university programs.

Keywords: university academic rankings, relative efficiency, data envelopment analysis (DEA)

Introduction

Much emphasis worldwide has been directed at ranking universities based on criteria important to university reputation and students' expectations. Recent studies have looked at rankings of universities in different regions of the world. In this study, the authors look at the rankings of MBA programs at 21 top public universities in India. Data for this study were obtained from Careers 360, a popular magazine known for its ranking of universities.
One methodology used for ranking is data envelopment analysis (DEA), a non-parametric method based on inputs and outputs. The strength of this method is that multiple inputs and outputs can be used to rank entities, which are called decision-making units (DMUs). The DEA procedure has been used to rank DMUs based on technical efficiency. The organization of this paper is as follows. A brief literature review is given in the next section. The third section discusses the data and the DEA. The fourth section explains the results. The fifth section presents the conclusions, and the final section gives suggestions for further research.

Patrick Jaska, chair, Business Computer Information Systems, College of Business, University of Mary Hardin-Baylor. Email: pjaska@umhb.edu. Vedamuthu Kulandai Swamy, principal, Department of Management, St. Joseph's PG College.
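To make the technique concrete, the standard input-oriented CCR envelopment model can be sketched as one small linear program per DMU. This is an illustrative implementation only, not the specific software or model variant used in the paper; the helper name `dea_ccr` and the use of SciPy's `linprog` are choices of this sketch.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr(X, Y):
    """Input-oriented CCR efficiency scores (illustrative sketch).

    X: (n, m) array of inputs; Y: (n, s) array of outputs, one row per DMU.
    For each DMU o, solve:  min theta
        s.t.  sum_j lam_j * x_j <= theta * x_o   (composite uses no more input)
              sum_j lam_j * y_j >= y_o           (composite produces no less output)
              lam >= 0
    Efficient DMUs score 1.0; inefficient DMUs score below 1.0.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lam_1, ..., lam_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                      # minimize theta
        A = np.zeros((m + s, n + 1))
        b = np.zeros(m + s)
        A[:m, 0] = -X[o]                # input rows: sum lam_j x_ij - theta x_io <= 0
        A[:m, 1:] = X.T
        A[m:, 1:] = -Y.T                # output rows: -sum lam_j y_rj <= -y_ro
        b[m:] = -Y[o]
        res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)
```

Note that this basic model caps efficient units at 100%; scores above 100%, as reported later in Table 2, suggest a super-efficiency variant in which the evaluated DMU is dropped from its own reference set.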
Literature Review

A sampling of the studies using DEA to rank universities includes Abbott and Doucouliagos (2003) for Australian universities, Izadi, Johnes, Oskrochi, and Crouchley (2002) for Great Britain universities, Johnes and Yu (2008) for Chinese universities, Fandel (2007) for German universities, and Jaska and Hogan (1995) for United States of America (USA) universities. This is just a small sample of the many studies using DEA to rank universities. To illustrate the wide variety of inputs and outputs used in ranking universities, the inputs and outputs of the studies mentioned above are summarized here. Abbott and Doucouliagos (2003) compare universities using four inputs: total number of academic staff (full-time equivalent (FTE)), number of non-academic staff (FTE), expenditure on all inputs other than labor, and value of non-current assets. Four outputs were used: number of equivalent full-time students (EFTS), number of postgraduate and undergraduate degrees enrolled, number of postgraduate degrees conferred, and number of undergraduate degrees conferred. Izadi et al. (2002) compare Great Britain universities using one input, total expenditure per year, and four outputs: undergraduate student load in arts subjects, undergraduate student load in science subjects, postgraduate student load, and value of research grants and contracts received. The Johnes and Yu (2008) study on Chinese universities uses five inputs in the DEA analysis: full-time staff-to-student ratio, percentage of the faculty with associate professor position or higher, proportion of all students who are postgraduates, research expenditure, and capital inputs (an index of library books and an index of the area of the buildings).
Outputs include an index of research output per person, an index of volume of research output, and an index of the prestige of research activity. Fandel's (2007) study on German universities uses three inputs: number of students, number of personnel, and outside funding. Outputs include number of graduates and number of doctorates. The Jaska and Hogan (1995) study on USA universities uses inputs of current fund expenditures per full-time student enrollment, educational and general expense per full-time student enrollment, and average salary of instructors. For outputs, they used FTE student enrollments and number of degrees granted.

Data and DEA

The choice of inputs and outputs depends upon the nature and focus of the study. Some studies seek to determine monetary efficiency, teaching efficiency, and/or research efficiency. In the present study, the focus is on determining which universities are the most efficient at helping students get the best salaries and return on investment (ROI) from their education. The inputs include diversity of students, work experience of students, and percentage of students in residence. The outputs include average salary of graduates and a measure of ROI. These inputs and outputs were chosen based on the available data given in Careers 360 (2011) for December 2011. This study includes the top 21 universities listed in the December 2011 issue of Careers 360 magazine, known for its annual ratings of universities and degree programs. The public institutions used in this study are: Indian Institute of Management in Bengaluru (IIMB), Indian Institute of Management in Lucknow (IIML), Indian Institute of Technology (IIT) (Department of Management Studies (DMS)), Indian Institute of
Management in Calcutta (IIMC), Indian Institute of Technology (IIT) (Department of Management Studies (DoMS)), National Institute of Industrial Engineering (NITIE), Indian Institute of Foreign Trade (IIFT), Indian Institute of Technology (IIT) (Vinod Gupta School of Management (VGSOM)), Indian Institute of Technology (IIT) (Department of Management Studies (DOMS)), Indian Institute of Management in Kozhikode (IIMK), Indian Institute of Technology (IIT) (Bombay) (Shailesh J. Mehta School of Management (SJSoM)), Indian Institute of Management in Indore (IIMI), Indian Institute of Technology (IIT) (Department of Industrial & Management Engineering (DIME)), Faculty of Management Studies (FMS), National Institute of Agricultural Extension Management (MANAGE), Jamnalal Bajaj Institute of Management Studies (JBIMS), Motilal Nehru National Institute of Technology (MNNIT), Institute of Public Enterprise (IPE), National Institute of Technology (NIT) Trichy, Sydenham Institute of Management Studies, Research & Entrepreneurship Education (SIMSREE), and Assam University (Department of Business Administration).

First, the data obtained from the December 2011 issue of Careers 360 (2011) are given in Table 1 along with the university ranking given by Careers 360.

Table 1
Rankings of MBA Programs at Indian Public Universities Based on Sum of Scores

Rank  Name of institute  City        Diversity  Work exp.  Resid.  Salary   ROI     Score
 1    IIML               Lucknow     20.87      26.59      70      76.36     8.47   202.29
 2    IIMB               Bengaluru   21.8        9.44      67      91.7      6.56   196.5
 3    JBIMS              Mumbai      18         17.33      17.99   94.18    41.77   189.27
 4    IIMC               Kolkata     19.8        0         70      92.85     6.1    188.75
 5    IIT (DMS)          New Delhi   19.67      21.33      70      38.13    34.47   183.6
 6    IIT (DOMS)         Roorkee     16.06       9.72      70      45.45    39.53   180.76
 7    IIT (DoMS)         Chennai     15.66      23.19      70      48.48    17.34   174.67
 8    NITIE              Mumbai      14.9        5.69      46      56.36    50      172.95
 9    IIT (DIME)         Kanpur      16.72      14.29      70      47.27    23.3    171.58
10    IIMK               Kozhikode   20.68       8.05      55      68.79     6.1    158.62
11    IIT (VGSOM)        Kharagpur   14.83      12.06      70      48.48     8.27   153.64
12    FMS                Delhi       20.5        0         16      66.67    49.28   152.45
13    IIMI               Indore      19.5        0         61      66.67     4.93   152.1
14    IIFT               Delhi       20.9       13.02      42      60.613    7.68   144.213
15    IIT (Bombay)       Mumbai      14.02      22.73       0      84.06     8.47   129.28
16    MNNIT              Allahabad   15.76       2.8       59.5    27.88    12.37   118.31
17    SIMSREE            Mumbai      16.79      15.21       2.08   51.09    32.84   118.01
18    MANAGE             Hyderabad   23.37       1.25      31.97   41.76     5.13   103.48
19    Assam University   Silchar     21.41       4.12      35.41   10.36    12.42    83.72
20    IPE                Hyderabad   15.26      20         10      21.82     3.23    70.31
21    NIT Trichy         Trichy       6.67      17.5        0      30.3     14.06    68.53

Note: Diversity = student diversity; Work exp. = students with two years' work experience; Resid. = residential; Salary = avg. domestic salary; ROI = ROI of flagship program.

Results of the DEA from this study are given in Table 2 with rankings compiled by using DEA. The DEA rankings are somewhat different from the Careers 360 rankings, owing to the criteria of each ranking method. The method in Table 1 uses the sum of the scores, whereas the DEA method in Table 2 uses technical efficiency analysis comparing each university to the other 20 universities in this group based on inputs and outputs.
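The Careers 360 "Score" column is simply the sum of the five category values. As a quick check against the first three rows of Table 1 (a sketch in plain Python; the dictionary layout is for illustration only):

```python
# Careers 360 score = diversity + work experience + residential
# + avg. domestic salary + ROI (category values taken from Table 1).
rows = {
    "IIML":  [20.87, 26.59, 70.00, 76.36,  8.47],
    "IIMB":  [21.80,  9.44, 67.00, 91.70,  6.56],
    "JBIMS": [18.00, 17.33, 17.99, 94.18, 41.77],
}
scores = {name: round(sum(vals), 2) for name, vals in rows.items()}
print(scores)  # matches the published totals: 202.29, 196.5, 189.27
```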
Table 2
DEA Rankings of MBA Programs at Indian Public Universities

DEA   DMU (Name of       DEA score  Diversity  Work exp.  Resid.  Salary  ROI    Benchmarks
rank  institution)        (%)
 1    FMS                3,534.43                                                 6
 2    NIT Trichy           348.92                                                 1
 3    IIT (Bombay)         213.59                                                 8
 4    SIMSREE              176.31                                                 0
 5    NITIE                137.52                                                 5
 6    IIMC                 137.16                                                10
 7    JBIMS                116.73                                                 8
 8    IIMB                  83.49   0          0           8.78   0       0.49   4 (0.67), 11 (0.35)
 9    IIMI                  76.37   0          0           0      0       4.27   4 (0.64), 14 (0.11)
10    IIT (DOMS)            74.00   0          2.46       15.82   0       0      6 (0.77), 16 (0.02)
11    IIMK                  66.68   0          0           0      0       0      4 (0.52), 11 (0.22), 14 (0.00), 16 (0.02)
12    IIML                  64.14   0          0          35.06   0       0      4 (0.13), 11 (0.72), 16 (0.04)
13    IIT (VGSOM)           61.82   0          0          28.09   0       0      4 (0.19), 11 (0.23), 16 (0.12)
14    IIT (DoMS)            57.28   0          3.31       33.16   0       0      11 (0.14), 16 (0.39)
15    IIFT                  56.82   0          0           0      0       0      4 (0.32), 11 (0.29), 14 (0.03), 16 (0.04)
16    IIT (DIME)            56.66   0          0          26.64   0       0      4 (0.01), 6 (0.10), 16 (0.43)
17    IIT (DMS)             52.22   0          7.22        4.84   0.72    0      6 (0.69)
18    MANAGE                48.79   0          0           0      0      16.28   4 (0.13), 11 (0.03), 14 (0.41)
19    MNNIT                 42.33   0          0           4.12   0       0      4 (0.16), 6 (0.21), 14 (0.02)
20    IPE                   25.05   0          0           0.19   0       0      4 (0.02), 11 (0.19), 16 (0.03)
21    Assam University      20.98   0          0           5.27   0              6 (0.12), 14 (0.13), 19 (0.01)

Note: Blank cells for the seven efficient programs (and for Assam University's ROI) are as in the source.

Explanation of Results

DEA has been applied in many studies and is a viable method for comparing DMUs such as universities. Its uniqueness is that the analysis rests on technical efficiency, which measures outputs relative to inputs while comparing all universities to one another. DEA also yields other important information, including efficiency ratings and criteria for improving efficiency. The first seven schools in Table 2 are efficient: FMS, NIT Trichy, IIT (Bombay), SIMSREE, NITIE, IIMC, and JBIMS; each has a DEA score of over 100%. The other universities are inefficient at delivering average salary and ROI to students. The DEA results in Table 2 also identify the universities that the inefficient universities should emulate in order to become efficient.
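The benchmark weights can be read as a recipe for a composite reference unit: an inefficient program's target is the weighted combination of its efficient peers. Taking IIMB's listed peers, 4 (SIMSREE, weight 0.67) and 11 (IIMK, weight 0.35), a sketch of the arithmetic using the raw Table 1 category values (the original analysis may have normalized or rescaled these columns, so the raw composite is illustrative only):

```python
import numpy as np

# Category order: diversity, work experience, residential,
# avg. domestic salary, ROI (values from Table 1).
simsree = np.array([16.79, 15.21,  2.08, 51.09, 32.84])
iimk    = np.array([20.68,  8.05, 55.00, 68.79,  6.10])

# Composite reference unit for IIMB per the Table 2 benchmark weights.
composite = 0.67 * simsree + 0.35 * iimk
print(composite.round(2))
```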
For example, looking at column nine (benchmarks), IIMB should emulate 4 (SIMSREE) at weight 0.67 and 11 (IIMK) at weight 0.35 in order for IIMB to be efficient. Another feature of the DEA output is the five slack columns in Table 2, which give the amount of change in each input and output needed to become efficient. For example, IIMB needs to decrease residential students by 8.78 and increase ROI by 0.49 to be efficient.

Conclusions

Using DEA as a method for comparing universities is viable. It has been used to compare various DMUs, from bank branches (Haag & Jaska, 1995) to agricultural production units (Haag, Jaska, &
Semple, 1992) to universities (Jaska & Hogan, 1995). In this study, DEA was used as an unbiased method of comparing MBA programs in India. The results show not only how the MBA programs compare, but also how those that are not efficient can be improved relative to efficient programs. The results differed from the Careers 360 results: the Careers 360 analysis was based on the sum of the scores in each category, whereas the DEA analysis compares the programs based on the efficiency with which inputs are used and outputs produced. The DEA analysis of this study ranks the universities that are most efficient at getting students the best salaries and ROI based on the inputs of diversity, work experience, and residency. DEA is a proven method for evaluating entities that have similar inputs, outputs, and objectives. This study shows that there is a discrepancy between an arbitrary method of summing ratings and an efficiency analysis of ratings. The authors conclude, as previous studies have shown, that DEA is a useful and unbiased method of comparing university programs.

Suggestions for Further Research

The DEA methodology has been used extensively for comparing universities around the world. A comparison of universities from one country to another could be performed. Also, another study on private universities in India could be done. Private universities could be compared with public universities as an extension of this study.

References

Abbott, M., & Doucouliagos, C. (2003). The efficiency of Australian universities: A data envelopment analysis. Economics of Education Review, 22(1), 89-97.
Careers 360. (2011, December). Best MBA universities in India. Careers 360, 58-67.
Fandel, G. (2007). On the performance of universities in North Rhine-Westphalia, Germany: Government's redistribution of funds judged using DEA efficiency measures.
European Journal of Operational Research, 176(1), 521-533.
Haag, S., & Jaska, P. (1995). Interpreting inefficiency ratings: An application of bank branch operating efficiencies. Managerial and Decision Economics, 16(1), 7-14.
Haag, S., Jaska, P., & Semple, J. (1992). Assessing the relative efficiency of agricultural production units. Applied Economics, 24(5), 559-565.
Izadi, H., Johnes, G., Oskrochi, R., & Crouchley, R. (2002). Stochastic frontier estimation of a CES cost function: The case of higher education in Britain. Economics of Education Review, 21(1), 63-71.
Jaska, P., & Hogan, P. (1995). Benchmarking in public institutions of higher education. Proceedings of the Annual Meeting of the Decision Sciences Institute, Boston.
Johnes, J., & Yu, L. (2008). Measuring the research performance of Chinese higher education institutions using data envelopment analysis. China Economic Review, 19(4), 679-696.