Global Security and Intelligence Studies Volume 5, Number 1, Spring / Summer 2020 | Page 112
the evaluators, did not measure metacognition.
Test C had questions that required test-takers to consider their biases, assumptions, and evaluations, but in a multiple-choice format, which was the format for all three tests. The drawback of multiple-choice questions is that they do not provide qualitative data. Even so, it was determined that Test B was the most effective at measuring critical thinking, and it included real-world scenarios that applied directly to defense professionals.
Critical Thinking Test B Findings
Test B was administered to a random
sampling of twenty junior
and senior IAs employed by a
defense contractor and working at numerous
military locations. Interestingly,
in this cohort, individuals who had
training in analytic methodologies had
low scores on the critical thinking test.
In fact, only three IAs out of twenty stated
that they used analytical methodologies
on the job. Out of those three, one
had low scores and two had moderate
scores (Marangione and Long 2019).
Also concerning are the test results measuring precise contexts: 53 percent of the cohort of IAs did not manifest this skill, as illustrated in Figure 8.
Because of the cost factor, the
test was only administered to a small
sample. Understandably, conclusions cannot be drawn from such a small sample; however, the results appear
to support the conclusions drawn
by the IC. It should be cautioned that
critical thinking skill tests might not
predict job-related performance and
this is an area for further study. Critical
thinking tests are a tool, but only
one tool in the toolbox for measuring
an employee’s critical thinking aptitude
or at least their skill level when hired.
Some researchers have also postulated that general intelligence does not predict an individual's critical analytic thinking skills; one study instead found that "critical thinking predicts task performance above and beyond the ability of general intelligence" (Eslon 2018). Also,
according to Statistics and Research
Methods Professor Hilary Campbell,
assessment tests are inherently and seriously
flawed, and their results cannot
be evaluated in a silo. For example,
she feels that assessment tests may just
measure a person’s ability to take tests
(Campbell 2019). Critical thinking tests can inform evidence-based hiring decisions, but they are not the only means of assessment and certainly should not be used exclusively. Their results suggest the potential benefits of measuring critical thinking skills in the hiring process and of testing before and after analytical training to gauge the training's effectiveness.
The Way Forward
Significant research shows that metacognition is a skill that can be developed over time, and it must be fostered by employers who encourage and welcome strategies for employing critical thinking in the workplace. This