The Citadel's undergraduate and graduate programs rely heavily on the use of portfolios. The portfolio is a compilation of course assignments that demonstrates a given student's fulfillment of the SLOs for the program. It can include a variety of assignment types, such as in-class assignments, group projects, or papers. This compilation of student work, originally completed as part of their course grades, is then repurposed as a means of assessing the program's learning objectives. This approach requires that courses in the program use a common set of rubrics to assess areas such as critical thinking, communication skills, and ethical reasoning. In addition to the graded coursework that students select for the portfolio, they must also pass an oral exam before graduation that demonstrates competency in the field of study (Jensen 2015). While this assessment measure is primarily intended to demonstrate student achievement with regard to the program's learning objectives, it also contains an indirect measure: students are asked to write a reflective essay summarizing their perception of the knowledge, skills, and attitudes related to intelligence and homeland security that they have developed as a result of their educational experience (Jensen 2015).

The programs at the University of Texas at El Paso, the University of Arizona South, and Coastal Carolina University identify the use of writing samples as an assessment method. This approach uses a sample of student work drawn from different courses in the curriculum and assessed with a common rubric. This allows the assessment to track the development of writing skills across introductory, intermediate, and advanced level courses. However, as noted earlier, this approach raises concerns about sample bias.

The University of Texas at El Paso utilizes an external reviewer in its program assessment. The reviewer is provided with the SLO, course materials, and examples of student work and asked to evaluate how well the program achieved the SLO. This approach has the benefit of an unbiased evaluator who has substantive expertise in the field of study. One of the programs reviewed in this study was considering drawing on personnel from its advisory board for this role. Another intended to seek program certification from the IAFIE as a substitute for an external reviewer (Background, Telephone interview with author, January 24, 2018).

The use of an external review in the field of intelligence studies does have one potential drawback: the lack of qualified external reviewers. As noted in other studies, the field of intelligence studies is constrained by the limited number of qualified faculty available to staff these programs (Smith 2013, 28). The small number of programs in the field leads to a similarly small number of potential reviewers. Beyond the question of who is qualified to serve, for which there does not appear to be a clear standard, the difficulty of using external reviewers can be further exacerbated if the institution places additional limitations on the selection, such as requiring the reviewer to represent a peer institution.