The Role of Assessment and Its Potential within a Military Training Establishment

By Paul Sylvester, May 15, 2019
...the developments of the strategic environment and technology demand a more focussed monitoring of trends in education and training, and an accompanying approach to innovation that allows assessment and implementation of new and relevant approaches to learning.
The Ryan Review, 2016
The term 'assessment' is broad and, because of this, is commonly misunderstood or inconsistently applied. Army's use of the term similarly places greater emphasis on assessment as a process than on assessment as a tool to support learning. In order to better support the delivery of training and education to today's modern learners, this brief paper argues that the role of assessment within a military training environment can, and should, be better understood and applied. To that end, it works towards an improved practical application of assessment by examining, in turn: the definitions of assessment; current and common practices; the limitations of those practices; the modern learner; and how authenticity and stress contribute to effective assessment.
The common definition of assessment is the action of making a judgement about something (1). This definition is broadened in Army's two key documents on the delivery of instruction and assessment: ATI 1-3 (2) and The Instructor's Handbook (3). Within these documents, assessment is defined as a process of gathering evidence in order to make a considered judgement against a prescribed standard (ATI 1-3 para 14d; The Instructor's Handbook p. 223). While the nature and importance of debriefing is discussed at length, assessment is implicitly treated as an event separate from learning.
A broadened definition of assessment recognises that the event may be incorporated into the delivery of training and education so that it contributes more actively to effective learning. Educational academics such as Graue (4), Shepard (5) and numerous others have demonstrated that assessment, through its associated reflective opportunities, is more effective in supporting a learner's understanding and application of information and skills.
Based upon ATI 1-3, the conduct of training follows a cyclic process: delivery, monitoring, preparation for assessment, conduct of assessment, and moderation/evaluation of the employed methodologies prior to recommencing delivery. The Instructor's Handbook broadens this: the conduct of assessment is to be supported with a thorough debrief in order to highlight areas for improvement to the trainee. This debrief is commonly supported by a written (and acknowledged) record of results that is attached to, or forms part of, the assessment tool. In order to ensure the integrity of the process, assessment tools (including the trainee's written responses and/or documented feedback) are often removed from the trainee's possession. Even where staff are aware of the trainee's learning requirements, assessment is thus often applied as an administrative process distinctly separate from the delivery of instruction. Further complications may occur where a training establishment employs assessment as a 'gate' in support of an administrative function (e.g. back squad/class or removal from training).
While effective in supporting administration, this application of assessment is less supportive of effective learning. Shepard (6) outlines that effective learning is not achieved through the delivery and assessment of discrete elements; where appropriate, those elements should be linked, by the learner, within an active and constructive holistic model (7). Under Army's common application of assessment, trainees receive limited time with any written feedback and must rely upon memory to recall areas for improvement. Although this may be supported by detailed verbal feedback, the lack of later access to this information in written (or recorded) form may hinder effective reflection. Functionally, this may cause a level of dissociation similar to that described by Shepard. The problem is especially acute within a course complicated by competing topics and/or strict time constraints. Treating assessment as an end rather than a means (8) increases anxiety around a learning process that is being employed as an administrative tool. As a result, learning may be hindered.
The modern learner
As technology and learner engagement grow within the modern education system, so do the expectations of today's students. Through ready access to information, learners are increasingly aware of, and connected with, their environment. With such a sense of connectivity, learners seek to be better engaged and to take more responsibility for their own learning. Facilitated through effective instruction/education, the modern learner can be focussed onto a task that may be developed outside of face-to-face contact hours while still meeting the demands of course constraints (e.g. competing subjects and time restraints).
The value of stress in assessment
Assessment under the above administrative model may promote unnecessary stress in trainees (9). Linked to the fear of administrative repercussions, trainees may develop a level of anxiety that itself affects performance. However, stress does have a valid role within military-based training. Driskell and Johnston (10) demonstrate that the emotional and psychological stress of selected work environments may, and should, be simulated in order to assess an individual's ability to function under such conditions. This type of stress differs significantly from that associated with the administrative requirements of a course.
Stress related to course constraints may still benefit trainees by developing self-management skills such as time management and prioritisation of tasks. Such conditions can (and do) support the identification of a trainee's attitude within a course and so may hold some value for graduation standards. These conditions are also expected within the broader military context, as soldiers and officers are required to meet regular timings as part of operational and procedural taskings. Employing a more effective assessment model may maintain this appropriate stress while reducing the stress associated with fear of adverse administrative action.
How to employ an expanded role
Three key elements (11) are associated with the development of an assessment model that better supports learning, namely: appropriate assessment tools; visibility; and instructor/assessor facilitation. Through effective assessment tools, trainees are provided a clearly recorded outcome and a transparent justification for their result. Although the practice is diminishing, many Army training establishments maintain simplified assessment checklists that reflect the learning outcomes prescribed within a Learning Management Package. Such checklists are problematic: they provide an abbreviated representation of those learning outcomes and assume the assessor is fully conversant with them and with the underpinning skills and knowledge of the delivered training. Where this is not the case, assessors may be uncertain what the expected standard should look like. Simplified assessment tools therefore demand excessive administrative support (e.g. moderation) and invite subjective assessment. Additionally, trainees receive simplified or confusing feedback that does not support understanding. This may adversely affect later training that relies upon fundamental principles incorrectly understood during previous training and assessment.
In order to address this concern, assessment tools may be given greater fidelity through more detailed criteria. Although such a tool may appear less user friendly, once familiar with the criteria assessors can conduct assessment more quickly and accurately, with less time spent on moderation. Noting ATI 1-3's requirements for assessment, a rubric (the assessment tool) may be developed that clearly outlines both an unacceptable and an acceptable standard. Positive deviance from the acceptable standard may be further detailed in order to identify gradations of competence (i.e. satisfactory through to excellent). Such a tool will (12): support less experienced assessors with a clear standard; reduce subjectivity; provide improved transparency of the assessment process; and support detailed feedback to trainees (13).
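The structure of such a rubric can be sketched in simple terms. The following is a minimal illustration only; the criterion, level descriptors and competency labels are invented for demonstration and do not reflect any actual Learning Management Package or RMC-D tool:

```python
# Illustrative sketch of a graded rubric: one criterion with descriptors
# running from unacceptable (0) through acceptable (1) to higher gradations
# of competence. All names and wording here are hypothetical.

RUBRIC = {
    "Orders delivery": {
        0: "Unacceptable: orders omit key elements or lack coherent sequence.",
        1: "Acceptable: all required elements present and logically sequenced.",
        2: "Satisfactory: elements clearly linked to the supporting plan.",
        3: "Excellent: orders are concise, complete and tailored to the audience.",
    },
}

def assess(criterion: str, level: int) -> str:
    """Return a recorded outcome with a transparent justification,
    pairing the competency decision with the matching descriptor."""
    descriptor = RUBRIC[criterion][level]
    result = "Competent" if level >= 1 else "Not yet competent"
    return f"{criterion}: {result} (level {level}) - {descriptor}"
```

Because each level carries its own descriptor, the recorded result doubles as the written feedback: the trainee sees not only the decision but the standard it was judged against.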
Regardless of the nature and detail of the assessment tool, visibility of feedback is important because it supports reflection. In line with the profile of the modern learner, trainees benefit where they can refer back to previously provided (ideally detailed) feedback (14). Where feedback (detailed or otherwise) is provided as part of the assessment event (e.g. immediately afterwards), trainees may be unable to properly assimilate all of the information. Similarly, where feedback is provided at a later date, the trainee may be unable to concentrate on it in light of more immediate and competing demands. As such, the feedback provided may be of little benefit, and may not be appreciated until a related assessment becomes the focus of the trainee's attention.
An increased online presence and participation in training is offered through the Australian Defence Education Learning Environment (ADELE). Tapping into the expectations of 'digital natives', ADELE provides an opportunity to manage assessments so that they remain available to trainees, in their own time, in order to better support reflection and application to later aspects of training.
The culture in which assessment is conducted shapes how conducive the learning environment is (Carless, p. 40 (15)). Where the training establishment, through its instructors/assessors, focuses upon assessment as 'The Tool' to determine suitability (or not), trainees may be more concerned with the administrative repercussions. Where assessment is discussed and understood to be an opportunity primarily for learning and secondarily for administration, the focus may be placed upon learning. Although appreciated by training establishments and staff, this concept is less often applied in light of the resource constraints of many courses (namely time).

Conclusion

Assessment as a tool for learning is overshadowed by its administrative importance within Army. Though the term is inconsistently understood, the modern learner responds better to an environment where assessment supports learning: through effective tools, maintained visibility of feedback, and delivery as an opportunity for development. While resource constraints are anecdotally cited as limiting factors to the development of such a model, the consequences of inaction may be detrimental to the development of current and future members of the Army. Failure to address the requirements of the modern learner may cause a sense of alienation and continue an attitude of forcing a 'square peg into a round hole'. Where the modern learner is more effectively involved in their own learning, however, results (and consequently administrative requirements) can be improved.
1. Refer Google search “Assessment definition”
2. ATI 1-3/17 The Management of Army Training
3. LWP-G-7.1.2 The Instructor’s Handbook 2017
4. Graue, ME (1993) Integrating theory and practice through instructional assessment, Educational Assessment, 1, pp. 293-309
5. Shepard, L (2000) The Role of Assessment in a Learning Culture, Educational Researcher, Vol 29, pp. 4-14
6. Shepard, L (1989) Why we need better assessments, Educational Leadership, 46, pp. 4-9
7. Marzano, R, Pickering, D & McTighe, J (1993) Assessing Student Outcomes, ASCD
8. Bassett, D (26 Sep 2013) The role of assessment in education, The Great Education Debate (website)
10. Driskell, J & Johnston, J (1998) Stress exposure training, in Making decisions under stress: Implications for individual and team training, pp. 191-217
11. Further elements do exist (e.g. assessment scores) but this essay limits its scope to three
12. RMC-D has effectively trialled such tools across the ARA Officer Commissioning Course in 2018
13. Rubrics can support most forms of assessment – RMC-D has developed effective rubrics that support ‘complex’ assessments such as Tactical Exercises Without Troops (TEWTs)
14. Kirchner, M & O’Connor, K (2018) Incorporating Reflection Exercises to Identify Soft Skills in Army Education, Journal of Military Learning, Article 6, Oct 2018
15. Carless, D (2005) Prospects for the implementation of assessment for learning, Assessment in Education, Vol 12, No 1, Mar 2005, pp. 39-54