2012 Proceedings of the Information Systems Educators Conference, ISSN: 2167-1435

A Database Management Assessment Instrument

Jeffrey P. Landry
jlandry@usouthal.edu

J. Harold Pardue
hpardue@southalabama.edu

Roy Daigle
RDaigle@southalabama.edu

Herbert E. Longenecker, Jr.
longeneckerb@gmail.com

School of Computing
University of South Alabama
Mobile, AL 36688, USA

Abstract

This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this paper. We describe the use of a pre/post test, item validation, and correlation techniques for the purpose of validation and assessment. Although the instrument was developed for local assessment, its design is based on international information systems curriculum guidelines, rendering it suitable for use in any program which incorporates database management in its curriculum.

Keywords: assessment, database, data management, exams, outcomes

1. INTRODUCTION

Universities are increasingly being required to demonstrate that student learning is occurring at their institutions in measurable, documented ways, and that these measurable results are being used to improve their educational programs. Assessment of learning has become a requirement of institutional and program accreditation. Many methods of assessment are possible, including internally/externally developed, direct/indirect measures of performance, and formative/summative indicators. Often these assessment approaches are developed for local use, i.e., they are not designed to be generalized for use by similar programs at peer institutions.
This paper describes the development, validation, use, and results interpretation of a database exam, an internally developed, direct-assessment, formative indicator of student learning in a four-year information systems (IS) degree program that we believe can be used for assessment in any program requiring a database management course. In the sections that follow, we describe the foundation for the exam, the approach taken

2012 EDSIG (Education Special Interest Group of the AITP) Page 1
for developing and verifying exam items, the approach taken for validating that the exam is a useful instrument for student outcomes assessment, and a discussion of the several uses that we have made of the instrument.

2. BACKGROUND

The exam was developed in the mid-2000s as an outgrowth of a national certification exam project, and for use at the co-authors' university, the University of South Alabama (USA), located in Mobile, Alabama. Available from the Center for Forensics, Information Technology, and Security, the USA-CFITS DB Exam consists of 25 multiple choice items, 16 of which appear on the IS 2002 exit exam, a national certification exam for information systems exit skills (Landry, Reynolds, & Longenecker, 2003).

The original reason for creating the exam was to address a graduate program placement issue. Students admitted to the information systems master's program had traditionally been placed into the graduate data management course based on the prerequisite of having passed an undergraduate database course. Despite having transcript evidence of an undergraduate database management course at other institutions, some students were not prepared to succeed in our graduate database course. Since our undergraduate course was designed to satisfy course objectives consistent with learning units in IS 2002, and since graduate students who successfully completed our undergraduate database course also successfully completed the graduate database course, we concluded that a placement exam was needed to accurately determine when the undergraduate course should be a required prerequisite. Subsequently, the database placement exam was created to be given to incoming master's students, and used as a placement mechanism. Students making a passing score were admitted to the graduate data management course, while students making a failing score were advised to complete the undergraduate database course with a passing grade of C or better.
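The placement mechanism just described amounts to a simple decision rule. The sketch below is illustrative only (the function name is invented, not from the paper); the 44% cut score it uses is the one the authors report deriving later in the paper.

```python
# Illustrative sketch of the placement rule described above.
# CUT_SCORE is the 44% cut score the paper reports; the function
# name and return strings are invented for this example.
CUT_SCORE = 44  # percent correct on the USA-CFITS DB Exam

def place_student(exam_score_pct: float) -> str:
    """Return the recommended placement for an incoming master's student."""
    if exam_score_pct >= CUT_SCORE:
        return "graduate data management course"
    return "undergraduate database course (grade of C or better required)"

print(place_student(52))  # a passing score
print(place_student(36))  # a failing score
```

A student at or above the cut proceeds directly to the graduate course; anyone below it must first pass the undergraduate course.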
Development and Validation of the Exam

The USA-CFITS DB Exam was originally designed to be a measure of data management knowledge and skills, one of the fundamental core areas of Information Systems curricula (Landry, Longenecker, Haigood, & Feinstein, 2000; Haigood, 2001; Colvin, 2008). The foundations for the exam are the database-related learning units (LU) of the IS curricula models IS'90, IS'97, and IS 2002 (Longenecker & Feinstein, 1991; Longenecker, Feinstein, Couger, Davis, & Gorgone, 1995; Davis, Gorgone, Couger, Feinstein, & Longenecker, 1997; Gorgone, Davis, Valacich, Topi, Feinstein, & Longenecker, 2003). The continuing relevance of database skills and knowledge in the IS curricula models is further supported by the results of two surveys, one targeting faculty and industry partners (Landry et al., 2000) and a second targeting IS professionals two to four years beyond graduation (Colvin, 2008).

Specific knowledge and skill areas used to motivate item writing for the USA-CFITS DB Exam were drawn from prior work reflecting an intersection of academic and professional needs. Henderson, Champlin, Coleman, Cupoli, Hoffer, Howarth, Sivier, Smith, & Smith (2004) published a framework for Data Management curricula intended for postsecondary education and sponsored by a professional society, the Data Management Association (DAMA). Longenecker, Henderson, Smith, Cupoli, Yarbrough, Smith, Gillenson, & Feinstein (2006) studied this framework in detail and found that the skills were compatible with the IS 2002 and IS 2010 curriculum guidelines. Table 5 in the appendix reflects a synthesis of the DAMA framework, the IS model curriculum guidelines, and a job ad analysis (Landry et al., 2000; Haigood, 2001).

In developing the USA-CFITS DB Exam to reflect both professional skills and curriculum guidelines, the authors wrote items that assessed the intersection of a data management sub-skill area and an IS 2002 learning unit. The learning objectives for each of the 25 items on the USA-CFITS DB Exam are as follows:

1.
Given a piece of data to programmatically manipulate, choose the appropriate data type
2. Given a real-world application, determine appropriate fields to be stored in a file
3. Choose and defend the correct data type for representing a common data attribute
4. Differentiate between entities and attributes when developing an ERD
5. Recognize the need either for an intersection table in a M:N relationship or the need to revisit requirements to determine if there is a missing entity
6. Given a relational database description, evaluate the architecture
7. Given a system need, such as access control to a database, identify the necessary information
8. Differentiate among alternatives for enforcing data integrity constraints
9. Compare and contrast the processes involved in data modeling
10. Recognize the implication of a cascade delete
11. Recognize the notation of standard ER models
12. Recognize and describe a correct three-entity solution to a problem expressed as a many-to-many relationship between two entities
13. Recognize that many-to-many relationships require a third, linking table in a relational DB
14. Apply the knowledge of using a stored procedure to enhance the performance in a database environment
15. Given database design goals, identify correct techniques for implementation
16. Normalize (redesign) an unnormalized (poorly designed) table
17. Recognize correct syntax and correct use of views
18. Recognize the implication of using views in a client application
19. Recognize the advantages and disadvantages of implementation with stored procedures
20. Trace and debug SQL syntax
21. Recognize the correct formulation of a query
22. Differentiate normal forms as part of database design
23. Recognize which tasks are associated with discovering and eliciting database design requirements in the initial phase of requirements analysis
24. Recognize relevant factors involved in the purchasing decision of a major enterprise-level DBMS package
25. Recognize properties of the Entity-Relationship Model, particularly the concept of minimum cardinality

Since the development of the USA-CFITS DB Exam, a revision of the information systems curriculum guidelines has been issued. IS 2010, available at http://www.acm.org/education/curricula, defines core course IS 2010.2 as Data and Information Management. All 25 USA-CFITS DB Exam items map to a stated course objective of the IS 2010.2 course.
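Several of the objectives above (5, 10, 12, and 13) concern the same design fact: a many-to-many relationship in a relational database is resolved with a third, linking (intersection) table, and cascade deletes propagate through it. The sketch below illustrates that idea; it is not an exam item, and all table and column names are invented for the example.

```python
# Illustrative sketch (not an exam item): a M:N relationship between
# student and course resolved with an intersection table, plus the
# effect of a cascade delete. All names here are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per connection

conn.executescript("""
CREATE TABLE student (student_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE course  (course_id  INTEGER PRIMARY KEY, title TEXT NOT NULL);
-- The intersection table resolves the M:N relationship: each side
-- contributes a foreign key, and the pair forms the primary key.
CREATE TABLE enrollment (
    student_id INTEGER NOT NULL
        REFERENCES student(student_id) ON DELETE CASCADE,
    course_id  INTEGER NOT NULL
        REFERENCES course(course_id) ON DELETE CASCADE,
    PRIMARY KEY (student_id, course_id)
);
""")

conn.execute("INSERT INTO student VALUES (1, 'Ada'), (2, 'Grace')")
conn.execute("INSERT INTO course VALUES (10, 'Databases'), (20, 'Systems')")
conn.executemany("INSERT INTO enrollment VALUES (?, ?)",
                 [(1, 10), (1, 20), (2, 10)])

# Deleting a student cascades to the intersection rows (objective 10).
conn.execute("DELETE FROM student WHERE student_id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM enrollment").fetchone()[0]
print(remaining)  # only Grace's enrollment is left
```

Note that the foreign keys live in the intersection table, never in the two related tables themselves, which is the point of the "which table gets the foreign key?" item the experts mention later in the paper.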
Of the 25 items, 13 of them map to ourse objetives 6, 8, and 12, dealing with eptual data modeling, designing a high quality database, and various SQL ommands, and 13 of the 21 ourse objetives are overed by at least e exam item. The exam item objetives were also mapped to ABET student outomes riteria (ABET, 2007, p. 14). The outomes riteria, alg with the number of exam items mapped to eah, are shown in Table 1. See Table 5 in the appendix for a grand mapping of the 25 item objetives with IS 2002, IS 2010 and ABET. Table 1 - Coverage of ABET Student Outomes The program has doumented measurable outomes that are based the needs of the program s stituenies Student Outomes that must be enabled (a) An ability to apply knowledge of omputing and mathematis appropriate to the disipline (b) An ability to analyze a problem, and identify and define the omputing requirements appropriate to its soluti () An ability to design, implement and evaluate a omputer-based system, proess, ompent, or program to meet desired needs (i) An ability to use urrent tehniques, skills, and tools neessary for omputing Number of assoiated exam item objetives It is important that an internal exam designed for assessment be mappable into multiple assessment frameworks. Doing so strengthens the validity of the exam s tent as being relevant outside of the loal unit s needs. For more the approah used to map multiple assessment frameworks, write items, and validate exams, see related papers (Landry et al., 2003; Landry, Daigle, Lgeneker, & 2012 EDSIG (Eduati Speial Interest Group of the AITP) Page 3 1 5 12 7
Pardue, 2010; Reynolds, Longenecker, Landry, Pardue, & Applegate, 2004).

Exam Construction

The multiple mappings established a useful foundation for item writing, which was carried out using these and other good practices in educational assessment (Hogan, 2007; Crocker & Algina, 1986). The writers wrote items and objectives in alignment with mapped frameworks. An item consisted of a stem with four possible answers, with one correct answer. Good item writing was difficult, and multiple reviewers were utilized in the item review process. The entire item-writing and review process was supported by a web-based exam delivery system developed by the co-authors and their graduate students at the University of South Alabama. The candidate items were pilot tested, revised, and validated with statistical techniques, including test item statistics. See Section 3, Validation, below for details. A summary of recommended practices includes the following:

- Define objectives, and write items that target the objectives
- Map items into other outcomes for assessment value
- Don't write items that are too difficult
- Make sure items are based on knowledge
- Get multiple reviewers to rigorously review items, and correct them
- Pilot test the exam
- Use test item statistics to validate
- Make the exam easy to administer and score
- Select an appropriate passing score
- Develop good security policies

See Figure 1 for an overview of the item construction process.

Figure 1 - Item Construction Process: write items & objectives → align with skill and curriculum frameworks → review & revise items → conduct pilot tests → validate with statistics → make revisions and publish

A cut score for passing was set at 44% correct responses. The success rate of students in our graduate database course correlated with whether the student made at least a 44. A score of 44 correlated with a midrange C performance in our undergraduate database course. While a score of 44 would seem low for a student who has taken a database management course, an explanation is that scores on this external exam are predictably lower than scores on internal assessments that reflect an individual instructor's preferences in instructional approach and topic emphasis. Furthermore, we designed the items on the exam to be discriminating, that is, to differentiate between those who know and those who don't, perhaps to a higher degree than instructors do in general.

Multiple Uses of the Exam

The faculty eventually found multiple uses for the exam in addition to graduate data management course placement. In the undergraduate database course, the exam is given as a pre-test at the beginning of the course and as a post-test incorporated as part of the final exam. This practice provides the capability of assessing the degree to which the undergraduate database course is achieving its intended learning outcomes, independent of instructor assignment (especially part-time instructors) and in different delivery formats (traditional, blended, fully online). The results are used as a formative program assessment method for both ABET and regional accreditation agencies (e.g., SACS).

3. VALIDATION

The results of using the exam over three years are described next. The first test described is a test using content experts. This test was
intended as a face validity test, but also demonstrated content validity. The panel of experts, which consisted of professors from the university using the exam, took the test as a student would, in a proctored lab environment. Overall, observations made by the experts included a perception that the test items are discriminating, that is, they are effective at discriminating between whether someone knew the answer or would have to guess. The perception among the content experts is testable; see the discussion of item validation and pre/post testing below.

Another positive reaction from an expert after taking the test was that "I knew what the item was about, but didn't know if I got it right." This comment was interpreted as meaning the item was about a relevant database concept familiar to the expert, but that the item was also challenging. Another expert said that it was helpful that the exam had a consistent format of diagrams and tables that accompanied some of the items, as well as reuse of data in tables. Such consistency cuts down on cognitive overload for takers. The eight items (of 25) that use tables or figures depict ER models, queries, or tables/views of data. One expert liked the normalization item; another liked the item on intersection tables ("which table gets the foreign key?").

More critically, the experts thought that four or five items need revisiting (more review). Some jargon was recognized as being potentially confusing to students, including the use of United States zip codes on a data types item. The toughest items were believed to be those on triggers and constraints. The experts were skeptical of items that presumed a specific order of database life cycle activities. Another item asked about the best way to do something, and was believed to be too normative.

The second set of tests we conducted was to run statistical analyses on the most recent set of test-taker data. We calculated summary and item statistics, conducted pre/post tests, and ran correlations of test vs.
course performance.

Summary and Test Item Statistics

From January 2008 until May 2010, a total of 246 USA students, a combination of graduate and undergraduate students, English-speaking and ESL students, took the USA-CFITS DB Exam. Over this period, the mean score was 53.4, with a standard deviation of 14.6. This score is consistent with national norms for the information systems exit exam. The highest score was a 92, and the lowest score was a 16. Eight test takers, or a little more than 3 percent of all takers, scored below 25, or worse than guessing. The KR-20, which measures internal item consistency, was 0.62. This is just above the minimally acceptable value of 0.60 recommended for tests in a subject domain taken by those trained in that domain.

Test item statistics are provided in Table 2 below. The table indicates the percentage of subjects getting each item correct, which varies from 26% to 87%, and the point biserial, which varies from .12 to .51. The percent-correct scores indicate item difficulty on a 100-point scale, with 100 representing the easiest (least difficult) item, that is, with 100% of takers answering it correctly. Higher point biserials are indicative of items that correlate well with the exam as a whole, especially when values are 0.40 and higher.

Table 2 - Item Statistics
Pct Correct   Point Biserial
43            0.45
64            0.36
58            0.24
65            0.46
40            0.40
50            0.51
80            0.30
54            0.26
58            0.25
34            0.20
40            0.12
81            0.41
75            0.43
86            0.19
32            0.34
58            0.14
72            0.26
28            0.21
87            0.29
30            0.51
39            0.36
53            0.34
26            0.30
28            0.30
46            0.44
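The two statistics reported above can be computed directly from a matrix of scored (0/1) responses. The sketch below is a minimal pure-Python illustration on a tiny synthetic response matrix; it does not reproduce the exam's actual data, and it assumes every item has at least one correct and one incorrect response.

```python
# Sketch of KR-20 and the point biserial on synthetic 0/1 response data
# (rows = test takers, columns = items). The real exam data is not shown.
from math import sqrt

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

def kr20(matrix):
    """Kuder-Richardson Formula 20: internal consistency of 0/1 items."""
    k = len(matrix[0])
    totals = [sum(row) for row in matrix]
    n = len(totals)
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n  # population variance
    pq = sum((p := sum(row[j] for row in matrix) / n) * (1 - p)
             for j in range(k))
    return (k / (k - 1)) * (1 - pq / var)

def point_biserial(matrix, j):
    """Correlation between item j (0/1) and the total score (0 < p < 1)."""
    totals = [sum(row) for row in matrix]
    n = len(totals)
    p = sum(row[j] for row in matrix) / n
    mean_all = sum(totals) / n
    sd = sqrt(sum((t - mean_all) ** 2 for t in totals) / n)
    mean_1 = sum(t for row, t in zip(matrix, totals) if row[j] == 1) / (p * n)
    return (mean_1 - mean_all) / sd * sqrt(p / (1 - p))

print(round(kr20(responses), 3))
for j in range(len(responses[0])):
    print(j, round(point_biserial(responses, j), 3))
```

An item whose point biserial is high is answered correctly mostly by high scorers, which is exactly the "discriminating" property the content experts perceived.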
Pre- and Post-tests

The purpose of a pre/post test is to demonstrate that learning took place between the two measurements. In our case, we gave the USA-CFITS DB Exam to incoming graduate students. Those (25 students) who failed to make a passing score were required to take an undergraduate database course, and three other students who barely passed also decided to take the database course. At the end of the database course, they again took the placement exam. The two sets of scores were compared using a paired t-test, using PASW Statistics. There were 28 students in the sample. The pre/post test scores are shown in Table 3.

Table 3 - Pre/Post Test Results
Taker #   Pretest score   Posttest score   Difference
 1        24              52               28
 2        32              48               16
 3        36              56               20
 4        28              52               24
 5        16              56               40
 6        40              56               16
 7        28              60               32
 8        36              68               32
 9        40              76               36
10        48              68               20
11        44              68               24
12        32              44               12
13        24              44               20
14        40              48                8
15        40              48                8
16        20              40               20
17        40              48                8
18        32              32                0
19        64              72                8
20        24              56               32
21        40              68               28
22        36              36                0
23        32              48               16
24        32              44               12
25        40              52               12
26        40              60               20
27        40              56               16
28        36              44                8
# Failed             25              3
# Passed              3             25
Total takers         28             28
Pct passed          11%            89%
Mean score (0-100) 35.1           53.6     18.4

By the end of the course the results were reversed. There were now 25 passing scores and three that were still below passing (although one of those improved by 20 points), for a pass rate of 89%. The pre-test mean was 35.1, compared to a post-test mean of 53.6. The mean difference was 18.4 points, and the result of a paired-differences test was statistically significant at the .001 level (p = .000). Such a result is a strong indicator of learning taking place in the course. It was particularly remarkable that the increase in scores occurred despite the fact that many of the students in the sample had prior database experience and scored close to passing on the pre-test.
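The paired-differences test reported above can be re-checked by hand from the Table 3 scores. The sketch below computes the paired t statistic in pure Python (the authors used PASW Statistics; this is only an independent arithmetic check).

```python
# Recomputing the paired t-test from the Table 3 pre/post scores.
from math import sqrt

pre  = [24, 32, 36, 28, 16, 40, 28, 36, 40, 48, 44, 32, 24, 40,
        40, 20, 40, 32, 64, 24, 40, 36, 32, 32, 40, 40, 40, 36]
post = [52, 48, 56, 52, 56, 56, 60, 68, 76, 68, 68, 44, 44, 48,
        48, 40, 48, 32, 72, 56, 68, 36, 48, 44, 52, 60, 56, 44]

n = len(pre)
diffs = [b - a for a, b in zip(pre, post)]
mean_d = sum(diffs) / n
sd_d = sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))  # sample SD
t = mean_d / (sd_d / sqrt(n))  # paired t statistic, df = n - 1 = 27

print(round(sum(pre) / n, 1))   # 35.1, the pre-test mean
print(round(sum(post) / n, 1))  # 53.6, the post-test mean
print(round(mean_d, 1))         # 18.4, the mean difference
print(round(t, 2))              # far above the two-tailed .001 critical value
```

The resulting t (about 9.3 on 27 degrees of freedom) comfortably exceeds the .001 critical value, consistent with the p = .000 the paper reports.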
If the test maps well to the objectives of the course, and the pre-test is given to those with little knowledge of the subject matter, a pre/post test design ought to detect whether learning is taking place. In this way, we can use the USA-CFITS DB Exam to verify that the undergraduate course is achieving its planned learning outcomes, over time, especially as the instructor changes. Once a pre/post relationship is established, it might be sufficient just to give the post-test, and compare the post-test mean to historical post-test averages.

Correlations of Test Taker Performance vs. Database Course Performance

Over time (see Table 4), we determined that scores on the exam correlated with course grades as follows:

Table 4 - Exam-Course Correlations
USA-CFITS DB Exam score (% correct)   Associated letter grade in the course
60-100                                A
50-59                                 B
40-49                                 C
30-39                                 D
0-29                                  F

The grading scale on an exam like this is not the same as a typical 10-point scale used commonly
in universities, with 90-100 A, 80-89 B, etc. The items on the exam, while representative of a first database course, are not particular to a specific institution's database course or its instructor. We believe that the instructors taught the database course in a manner unbiased towards the exam. It should be noted that the data includes scores from students in sections taught by two of the co-authors, one of whom also developed questions for this exam. That co-author's approach in teaching the course was not to teach to the test, nor to use exam items elsewhere in the course. The other instructors had no access to the exam items before, during, or after the pre/post tests.

4. CONCLUSION

In summary, the benefits of using the exam are as follows:

- Maps to ABET outcomes
- Provides instructor-independent assessment of learning
- Can be used as a placement exam for graduate program or transfer students
- Useful for outcomes assessment for ABET accreditation
- Useful for course assessment

With the growing demand for more outcomes-based assessment in higher education, the use of this type of internally developed exam, while becoming necessary, will offer many benefits. Among these are instructor-independent course and program outcomes assessment that supports multiple frameworks. We have shown that the USA-CFITS DB Exam is aligned with international curriculum models, ABET outcomes, and job-related skills from two surveys (Landry et al., 2000; Colvin, 2008). With the specific exam being described, the USA-CFITS DB Exam, we have provided evidence that success in a first database course is most closely correlated with mastery of a specific subset of learning outcomes in data management. We described how we were able to converge on a cut score that predicted whether or not a graduate student needed to take a database prerequisite course. We provided evidence that post-test student scores parallel their local course performance, while trending lower than local scores for predictable reasons (i.e., the exam is not specific to an instructor or the local course). All this made the exam useful for student placement and course assessment.

We believe that the need for more and better assessment helps make efforts like ours worthwhile. To inquire about use of the exam, contact the University of South Alabama Center for Forensics, Information Technology, and Security (USA-CFITS, http://www.usacfits.org).

5. REFERENCES

ABET Computing Accreditation Commission - ABET CAC (2007). Self-Study Questionnaire. ABET, Inc., Baltimore, Maryland, August 2007.

Colvin, R. (2008). Information Systems Skills and Career Success. Master's Thesis, University of South Alabama, School of Computer and Information Sciences.

Crocker, L., & Algina, J. (1986). Introduction to Classical and Modern Test Theory. Holt, Rinehart and Winston, Orlando, Florida.

Davis, G., Gorgone, J., Couger, J., Feinstein, D., & Longenecker, H. (1997). IS'97: Model Curriculum and Guidelines for Undergraduate Degree Programs in Information Systems. ACM SIGMIS Database, 28(1).

Gorgone, J., Davis, G., Valacich, J., Topi, H., Feinstein, D., & Longenecker, H. (2003). IS 2002 Model Curriculum and Guidelines for Undergraduate Degree Programs in Information Systems. Data Base, 34(1).

Haigood, B. (2001). Classification of Performance Level Requirements of Current Jobs Within the Field of Information Systems. Master's Thesis, University of South Alabama, School of Computer and Information Sciences.

Henderson, D., Champlin, B., Coleman, D., Cupoli, P., Hoffer, J., Howarth, L., Sivier, K., Smith, A., & Smith, E. (2004). Model Curriculum Framework for Post Secondary Education Programs in Data Resource Management. The Data Management Association International Foundation, Committee on the Advancement of Data Management in Post Secondary Institutions, Sub-Committee on Curriculum Framework Development.

Hogan, T. (2007). Educational Assessment: A Practical Introduction. John Wiley & Sons, Inc., Danvers, Massachusetts.

Landry, J., Daigle, R., Longenecker, H., & Pardue, H. (2010). IS 2002 and ABET Accreditation: Meeting the ABET Program Outcome Criteria. Information Systems Education Journal, 8(67). http://isedj.org/8/67/

Landry, J., Longenecker, H., Haigood, B., & Feinstein, D. (2000). Comparing Entry-Level Skill Depths Across Information Systems Job Types: Perceptions of IS Faculty. Americas Conference on Information Systems (AMCIS 2000), Long Beach, California, August 2000.

Landry, J., Reynolds, J., & Longenecker, H. (2003). Assessing Readiness of IS Majors to Enter the Job Market: An IS Competency Exam Based on the Model Curriculum. Americas Conference on Information Systems (AMCIS 2003), Tampa, Florida, August 2003.

Longenecker, H., & Feinstein, D. (Eds.) (1991). IS'90: The DPMA Model Curriculum for Information Systems for 4 Year Undergraduates. Data Processing Management Association, Park Ridge, Illinois.

Longenecker, H., Feinstein, D., Couger, J., Davis, G., & Gorgone, J. (1995). Information Systems '95: A Summary of the Collaborative IS Curriculum Specification of the Joint DPMA, ACM, AIS Task Force. Journal of Information Systems Education, 6(4), 174-187.

Longenecker, H., Henderson, D., Smith, E., Cupoli, P., Yarbrough, D., Smith, A., Gillenson, M., & Feinstein, D. (2006). A Recommendation for a Professional Focus Area in Data Management for the IS2002 Information Systems Model Curriculum. Proceedings of the Information Systems Education Conference 2006, 23 (Dallas): 2115. ISSN: 1542-7382.

Reynolds, J. H., Longenecker, H., Landry, J., Pardue, J., & Applegate, B. (2004). Information Systems National Assessment Update: The Results of a Beta Test of a New Information Systems Exit Exam Based on the IS 2002 Model Curriculum. Information Systems Education Journal, 2(24). http://isedj.org/2/24/
2012 Proeedings of the Informati Systems Eduators Cferene ISSN: 2167-1435 APPENDIX: Table 5 - Grand Mapping of the USA-CFITS DB Exam Skill 1.1.3 Data Types and File Strutures Skill Words analysis, design, development, debugging, testing, simple data strutures (arrays, reords, strings). # ABE T Out ome LU IS2002 LU-Title IS2002 LU-Goal IS 2010 Outome Item Objetive % Corret PtBi Ser Group Avg% Corre t 1 b Analy ze 58 Problem Solving, with Files and Database to present and ensure problem solving involving files and database representatis 2.113 Implement a relatial database design using an industrial-strength database management system, inluding the priniples of data type seleti and indexing. given a piee of data to programmatially manipulate, hoose the appropriate data type 0.43 0.45 2 b Analy ze 42 Informati Measuremen ts/ Data /Events to present the ept that data is a representati and measurement of real-world events 2.05 Apply informati requirements speifiati proesses in the broader systems analysis & design text. given a real-world appliati, determine appropriate fields to be stored in a file 0.64 0.36 0.55 3 b Analy ze 58 Problem Solving, with Files and Database to present and ensure problem solving involving files and database representatis 2.11 Implement a relatial database design using an industrial-strength database management hoose and defend the orret data type for representing a omm data attribute 0.58 0.24 2012 EDSIG (Eduati Speial Interest Group of the AITP) Page 9
2012 Proeedings of the Informati Systems Eduators Cferene ISSN: 2167-1435 system, inluding the priniples of data type seleti and indexing. 1.3.1 Modeling and design, struti, shema tools, DB systems Data modeling, SQL, struti, tools -top down, eptual, logial and physial designs; sripts; bottom up designs; shema development tools; desk-top/enterprise versis; systems: Aess, SQL Server/Orale/Sybase, data warehousing & mining; sripts, GUI tools; retrieve, manipulate and store data; tables, relatiships and views 13 a Basis 89 ADTs: Database Models and Funtis to develop awareness of the syntatial and theoretial differenes between database models 2.11 Implement a relatial database design using an industrial-strength database management system, inluding the priniples of data type seleti and indexing. reognize that many-tomany relatiships require a third, linking table in a relatial DB 0.75 0.43 23 b Analy ze 111 IS Requirement s and Database to develop requirements and speifiatis for a database requiring multi-user informati system 2.07 Link to eah other the results of data/informati modeling and proess modeling. reognize whih tasks are assoiated with disovering and eliiting database design requirements in the initial phase of requirements analysis 0.26 0.30 0.50 25 b Analy ze 111 IS Requirement s and Database to develop requirements and speifiatis for a database requiring multi-user informati system 2.08 Design high-quality relatial databases. reognize properties of the Entity-Relatiship Model, partiularly the ept of minimum ardinality 0.46 0.44 2012 EDSIG (Eduati Speial Interest Group of the AITP) Page 10
2012 Proeedings of the Informati Systems Eduators Cferene ISSN: 2167-1435 6 Build 81 Appliatis Development to develop appliati skills for implementing databases and appliatis by operating and testing these databases 2.06 Use at least e eptual data modeling tehnique (suh as entity-relatiship modeling) to apture the informati requirements for an enterprise domain given a relatial database desripti, evaluate the arhiteture 0.50 0.51 8 Build 81 Appliatis Development to develop appliati skills for implementing databases and appliatis by operating and testing these databases 2.08 Design high-quality relatial databases. differentiate amg alternatives for enforing data integrity straints 0.54 0.26 10 Build 88 IS Data Modeling to develop skill with data modeling whih desribe databases 2.06 Use at least e eptual data modeling tehnique (suh as entity-relatiship modeling) to apture the informati requirements for an enterprise domain reognize the notati of standard ER models 0.34 0.20 11 Build 88 IS Data Modeling to develop skill with data modeling whih desribe databases 2.06 Use at least e eptual data modeling tehnique (suh as entity-relatiship modeling) to apture the informati requirements for an enterprise domain reognize and desribe a orret three-entity soluti to a problem expressed as a many-to-many relatiship between two entities 0.40 0.12 12 Build 88 IS Data Modeling to develop skill with data modeling whih desribe databases 2.15 Understand the basi mehanisms for aessing relatial databases from various types of appliati development envirments. ompare and trast the proesses involved in data modeling 0.81 0.41 16 Build 90 and IS to develop skill in appliati of database systems development and retrieval failities needed to failitate reati of informati system appliatis 2.10 Design a relatial database so that it is at least in 3NF. normalize (redesign) an unnormalized (poorly designed) table 0.58 0.14 2012 EDSIG (Eduati Speial Interest Group of the AITP) Page 11
2012 Proeedings of the Informati Systems Eduators Cferene ISSN: 2167-1435 21 Build 92 Appliati to develop skill with appliati and physial implementati of database systems, using a programming envirment 2.12 Use the data definiti, data manipulati, and data trol language ompents of SQL in the text of e widely used implementati language. reognize the impliati of using views in a lient appliati 0.30 0.51 4 i Tools 58 Problem Solving, with Files and Database to present and ensure problem solving involving files and database representatis 2.06 Use at least e eptual data modeling tehnique (suh as entity-relatiship modeling) to apture the informati requirements for an enterprise domain differentiate between entities and attributes when developing an ERD 0.65 0.46 21 i Tools 92 Appliati to develop skill with appliati and physial implementati of database systems, using a programming envirment 2.12 Use the data definiti, data manipulati, and data trol language ompents of SQL in the text of e widely used implementati language. reognize orret syntax and orret use of views 0.39 0.36 1.3.2 Triggers, Stored Proedures, Audit Ctrols: Design / Development Triggers, audit trols-stored proedures, trigger epts, design, development, testing; audit trol epts/standards, audit trol ; SQL, epts, proedures, embedded programming (e.g. C#) 5 Build 81 Appliatis Development to develop appliati skills for implementing databases and appliatis by operating and testing these databases 2.06 Use at least e eptual data modeling tehnique (suh as entityrelatiship modeling) to apture the informati requirements for an enterprise domain reognize the need either for an interseti table in a M:N relatiship or the need to revisit requirements to determine if there is a missing entity 0.40 0.40 0.57 2012 EDSIG (Eduati Speial Interest Group of the AITP) Page 12
15 | Build | LU 90, … and IS: to develop skill in application of database systems development and retrieval facilities needed to facilitate creation of information system applications | 2.11 Implement a relational database design using an industrial-strength database management system, including the principles of data type selection and indexing. | given database design goals, identify correct techniques for implementation | 0.86 | 0.19 |
15 | Build | LU 90, … and IS: to develop skill in application of database systems development and retrieval facilities needed to facilitate creation of information system applications | 2.14 Understand the concept of database transaction and apply it appropriately to an application context. | apply the knowledge of using a stored procedure to enhance the performance in a database environment | 0.32 | 0.34 |
17 | Build | LU 92, Application: to develop skill with application and physical implementation of database systems, using a programming environment | 2.12 Use the data definition, data manipulation, and data control language components of SQL in the context of one widely used implementation language. | recognize the advantages and disadvantages of implementation with stored procedures | 0.72 | 0.26 |
18 | Build | LU 92, Application: to develop skill with application and physical implementation of database systems, using a programming environment | 2.12 Use the data definition, data manipulation, and data control language components of SQL in the context of one widely used implementation language. | trace and debug SQL syntax | 0.28 | 0.21 |
19 | Build | LU 92, Application: to develop skill with application and physical implementation of database systems, using a programming environment | 2.12 Use the data definition, data manipulation, and data control language components of SQL in the context of one widely used implementation language. | recognize the correct formulation of a query | 0.87 | 0.29 |
22 | Build | LU 95, Conceptual/Logical Models: to show how to design a conceptual relational database model and a logical database model, convert the logical database designs to physical designs, develop the physical database, and generate test data | 2.09 Understand the purpose and principles of normalizing a relational database structure. | differentiate normal forms as part of database design | 0.53 | 0.34 |

1.3.3 Administration: security, safety, backup, repairs, replicating, monitoring. Safety and security, administration, replication, monitoring, repair, upgrades, backups, mirroring, security, privacy, legal standards, HIPAA; data administration, policies

24 | b Analyze | LU 111, IS Requirements and Database: to develop requirements and specifications for a database-requiring multi-user information system | 2.01 Understand the role of databases and database management systems in managing organizational data and information. | recognize relevant factors involved in the purchasing decision of a major enterprise-level DBMS package | 0.28 | 0.30 |
7 | Build | LU 81, Applications Development: to develop application skills for implementing databases and applications by operating and testing these databases | 2.17 Understand the key principles of data security and identify data security risks and violations in data management system design | given a system need, such as access control to a database, identify the necessary information | 0.80 | 0.30 | 0.54

1.3.6 Data Quality: dimensions, assessment, improvement. Data accuracy, believability, relevancy, resolution, completeness, consistency, timeliness; data definition quality characteristics, data model / requirements quality characteristics; clean-up of legacy data; mapping, transforming, cleansing legacy data; data defect prevention; referential integrity; data quality employee motivation; information quality maturity assessment, gap analysis
9 | b Analyze | LU 88, IS Data Modeling: to develop skill with data modeling which describes databases | 2.18 Understand the core concepts of data quality and their application in an organizational context. | recognize the implication of a cascade delete | 0.58 | 0.25 | 0.58

Average % Correct: 0.53

Note: The table is organized by sub-skills. Each row shows the item number, the mapping of the item to the ABET program outcomes, the IS 2002 Learning Unit (LU) number, LU title, and LU goal statement, followed by an IS 2010 learning outcome from the IS2010.2 course. The item objective (in bold) was mapped to the IS 2010 learning outcome. The last three fields show the percent correct, the point bi-serial correlation coefficient, and the average percent correct for each sub-skill. Test items (not shown) were derived by first developing the item objectives (while studying the sub-skill and LU data) and then writing the test items.
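The last two per-item columns are standard item-analysis statistics: percent correct (item difficulty) and the point-biserial correlation between the 0/1 item score and each student's total test score. As a rough illustration of how these could be computed (a sketch under stated assumptions, not code from the paper: the function name and the response-matrix layout are hypothetical, and the sketch uses the uncorrected total score rather than excluding the item itself):

```python
import math

def item_stats(responses):
    """Compute (percent_correct, point_biserial) for each test item.

    responses: one list of 0/1 item scores per student, e.g. four
    students answering a two-item test. Point-biserial is
    (M1 - M0) / s * sqrt(p * q), where M1/M0 are mean total scores of
    students who got the item right/wrong, s is the (population) std
    of total scores, p is the proportion correct, and q = 1 - p.
    """
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]              # total score per student
    mean_t = sum(totals) / n_students
    sd_t = math.sqrt(sum((t - mean_t) ** 2 for t in totals) / n_students)
    stats = []
    for i in range(n_items):
        p = sum(r[i] for r in responses) / n_students  # percent correct
        right = [t for r, t in zip(responses, totals) if r[i] == 1]
        wrong = [t for r, t in zip(responses, totals) if r[i] == 0]
        if not right or not wrong or sd_t == 0:
            # correlation undefined if everyone (or no one) got the item
            stats.append((p, 0.0))
            continue
        m1 = sum(right) / len(right)
        m0 = sum(wrong) / len(wrong)
        r_pb = (m1 - m0) / sd_t * math.sqrt(p * (1 - p))
        stats.append((p, r_pb))
    return stats
```

Some item analyses instead use a corrected point-biserial that excludes the item from the total score, which avoids inflating the correlation on short tests; the table does not indicate which variant was used.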