
Program Overview

In 2009 the Army Reserve Army Communities of Excellence (AR ACOE) program was on life support. Only two of the ten eligible commands had submitted packets the previous year, and confidence in the program was minimal. The choices were to renew our commitment to the program or to kill it and use the annual $934,000 budget elsewhere. It was agreed that one more effort to salvage the program was warranted.

The Identified Requirements

The AR program manager, with assistance from the course instructors, quickly identified two requirements for improving the program:

  1. Find out why our eligible commands (also referred to as our customers) were not participating in a centrally funded program designed to help them improve their business practices.
  2. Redefine the goals and plan of execution for the program so it could be adapted to the needs of our customers while adhering to the ACOE program guidelines.

The first requirement, asking our commands to explain why they weren’t participating, carried some irony: one of the tenets of our ACOE training is to focus on the needs of the customer, and we had to admit that we were not practicing what we preached. Once we did, the commands responded with enthusiasm. They asked for more emphasis on the strategic-operational link, so their employees could appreciate how their day-to-day actions were tied to the Commander’s stated goals. And they asked for feedback reports that provided real direction, so they could use the data to improve their internal practices. Their feedback became the focus of our initial efforts.

The second requirement was a systematic review of our program execution at the headquarters level, with improved practices and the elimination of wasted program funds as top priorities. A Lean Six Sigma (LSS) project completed in 2009 identified changes that would make the program more efficient. They included:

a. Restructured the training plan.

In the past only a few “organizational self-assessment team members” were trained for each command. Two or three classes were scheduled each year at central locations, and travel costs were covered for each student. The costs were prohibitive, and the commands were only getting a few people trained. Worse, the commands were sending lower-level personnel who could be spared for a week. As a result, the packet writers had the ACOE training needed to write the packet but not the experience and organizational situational awareness needed to write one that represented how the command actually operated.

A request to include strategic-operational understanding was incorporated, and the commands quickly recognized the benefit and increased enrollment. As a result, mobile training teams could be sent to any command that could guarantee at least 15 students; commands with more limited enrollment were offered seats at those locations to fill out the classes. Four organizational self-assessment classes were offered in 2010, double the previous year's number. Five were conducted in 2011, and seven, one for each participating command, are scheduled for 2012.

b. Reevaluated the cost-benefit of program execution.

Moving the classes to the commands had two benefits. First, the commands could send more students because participation became more affordable, both fiscally and in terms of lost productivity. Second, it significantly reduced our program costs: instead of paying TDY for an instructor and 20 students, two instructors were sent to conduct training on-site, for an average savings of $40,000 per class. In 2011, five self-assessment courses were conducted for a total cost of approximately $120,000; the same courses, before the program review, would have cost as much as $360,000.
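For readers who want to check the math, the per-class figures implied by those totals can be worked out directly. The per-class breakdown below is derived from the article's totals, not separately published:

```python
# Rough sanity check of the per-class savings using the figures from the text.
# Dollar totals come from the article; the per-class breakdown is derived.

classes_2011 = 5
total_cost_new = 120_000   # five on-site courses, 2011 (approximate)
total_cost_old = 360_000   # same courses under the old model ("as much as")

per_class_new = total_cost_new / classes_2011   # 24,000 per class
per_class_old = total_cost_old / classes_2011   # 72,000 per class (upper bound)

max_savings_per_class = per_class_old - per_class_new
print(f"per-class savings (upper bound): ${max_savings_per_class:,.0f}")
```

The stated average savings of $40,000 per class sits below this upper bound of $48,000, consistent with the "as much as" qualifier on the old cost.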

c. Increased the pool of instructors.

Because the program had stagnated, only one instructor was needed in 2009. He is an exceptional instructor but, as a result, he was in great demand. The plan of execution for our classes was dictated by his schedule and was set at the start of the fiscal year to ensure he would be available. This produced a high rate of disenrollment, as command needs often conflicted with a commitment, made months earlier, to attend the training.

In 2010, with the assistance of the existing instructor, five more qualified trainers were added to the rolls. Classes can now be scheduled with far greater flexibility, and quality is assured by assigning two instructors to each course: one instructs while the other provides input and class oversight.

d. Reduced demand on command personnel.

In the past, five to eight personnel from each command were required to attend the organizational self-assessment course. Of those trained, three had to attend an additional week of instruction to become examiners, and then travel for another two weeks to attend the packet review event at a central location. For most commands this represented a 20-week investment of personnel time and a significant allocation of resources.

In an effort to reduce lost time, the two-week down-select (packet review) event was restructured. The majority of the work is now completed remotely over a four- to six-week period. Commands are asked to give the examiners dedicated time during work hours to complete the requirements, but can schedule that time around other command obligations. At the end of the process the team members meet for a three-day event (as opposed to two weeks) to finalize the feedback report.

In the past, commands got five trained personnel for a 20-week investment; now the same 20 weeks yield 18 trained personnel. The direct result has been an immediate improvement in packet quality. In addition, commands now have a significant pool of personnel versed in the Malcolm Baldrige criteria and assessment tools with which to improve their organizations.
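The training-throughput gain described above can be sketched as a quick calculation, using the person-week figures as given in the text:

```python
# Training throughput before and after the restructured down-select process.

old_trained, old_weeks = 5, 20    # old model: 5 trained per 20-week investment
new_trained, new_weeks = 18, 20   # new model: 18 trained for the same 20 weeks

old_rate = old_trained / old_weeks   # 0.25 trained personnel per week invested
new_rate = new_trained / new_weeks   # 0.90 trained personnel per week invested

improvement = new_rate / old_rate    # 3.6x more personnel trained per week
print(f"throughput improvement: {improvement:.1f}x")
```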

e. Restructured the site visits.

The final step in the command process is a site visit to the two top scoring commands. In the past, two separate teams performed the site visits, lasting five days each, and involving five to six team members from other commands. This was an additional commitment by the participating commands, and most of the participants had little or no experience in reviewing the practices of another command. The commitment of time and personnel was not justified by the result.

In 2010 the process was changed. Three of our instructors are now assigned to review the packets of the two finalists, then spend two days at each command (Monday-Tuesday at one location, Thursday-Friday at the other). Because these participants have far more experience, the results have markedly improved. The revised process is also more efficient: the commands no longer invest an additional week per participant, and the two locations now see only two days of interruption to their normal operations instead of a full week. With six instructors now participating in the AR ACOE program, a different team of three can be rotated in every other year to ensure a fresh perspective.

An added benefit is that the same three instructors review both commands in a given year, so we get a direct comparison. This helps to calibrate the scoring and ensures that each command is rated on the same scale.

The Result

The benefits of these changes were immediate and include:

  1. Increased Participation: Six of ten eligible commands are now participating, and one more is considering participation in 2012.
  2. Increased Training: In 2009, 54 Reserve personnel were trained; in 2010 and 2011 the combined total was 283. In 2011 the average class had 37 students, up from 21 in 2009.
  3. Improved Business Practices: Participating commands are now using the Malcolm Baldrige criteria (the basis for our training doctrine) during their strategic conferences and to improve operational practices, such as the design of action plans and a serious review of what they measure and how it relates to their stated goals.
  4. Leveraging Best Practices: An environment of free and open exchange of ideas has begun to develop. Though the prize money was always attractive, the commands have bought into the philosophy of the AR ACOE program manager: “The goals to improve as individual commands, and as a single Army organization, are inseparable. We cannot sustain one without the other.” As a result, the winning packet is now posted on the AR ACOE website for all to review, along with many other worthwhile files. In addition, many of the participating commands have paired up and are sharing their feedback reports in an effort to solicit outside counsel. This is unprecedented and was initiated solely by the commands themselves; we never would have expected commands to choose to share their failures with one another. It is a perfect illustration of their level of commitment.
  5. Cost Reduction: The AR ACOE program can now be executed for 50% of the original annual budget. Assuming the funds remain within the program, a plan has been developed to implement an internal prize structure to replace the OSD award, which will be discontinued this coming cycle. In addition to a champion and runner-up, a “most improved” winner will be recognized annually, so the focus is not just on winning but also on constant improvement.
  6. Organizational Improvement: Finally, the program is about systematic, rather than localized, improvements. In 2011, among the five eligible competitors, the average score increased by 34% over the previous year. All five had competed the year prior and the same team leaders led the packet review process, so the data can be considered comparable. In the previous five years, AR ACOE packet scores (which are never published) improved a total of 12%; the 2011 single-year gain was nearly triple that five-year total. This is a direct reflection of the team effort made to improve the process and of the renewed commitment of participating commands.
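The scoring comparison in the final item above can be checked with simple arithmetic, using the percentages as given:

```python
# Year-over-year score gains, per the figures in the text.
avg_increase_2011 = 34    # percent: 2011 vs. prior year, five repeat competitors
increase_prior_5yr = 12   # percent: total improvement over the previous five years

ratio = avg_increase_2011 / increase_prior_5yr
print(f"2011 gain vs. prior five-year total: {ratio:.1f}x")  # ~2.8x, "nearly triple"
```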

Conclusion

The ACOE program that was all but dead two years ago is now a tremendous positive influence on the Army Reserve community. Process improvements will remain a focus of the program, but they will be instituted cautiously so as not to put at risk the strides achieved over the past two years. As stewards of the ACOE program, we owe it to our Soldiers to strive for continual improvement. The ACOE program is an integral piece of that effort and should remain so for the foreseeable future.
