As a former exec, I ran the numbers on how long it takes one OPR to run through the process from start to finish. A conservative estimate is at least 80 total man-hours. Multiplied across the total number of officers in the USAF, that costs the Air Force over $80 million in man-time per year. This figure assumes all the work is done by a first-year O-1, so it is very conservative. If you instead assume 160 man-hours (a very reasonable estimate from my experience) and that a six-year O-3 is doing all the work on average (this averages out all the salaries involved, which is realistic since an OPR passes through many higher-ranking reviewers), the man-time spent is over $308 million. That means the Air Force spends between $80 million and $308 million per year to create, process, and file all of its OPRs.
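To make the arithmetic transparent, here is a minimal sketch of that back-of-envelope model; the officer count and hourly rates below are illustrative assumptions on my part, not official figures:

```python
# Back-of-envelope OPR cost model. The officer count and hourly
# rates are illustrative assumptions, not official figures.
TOTAL_OFFICERS = 63_000            # rough size of the USAF officer corps

def annual_cost(hours_per_opr, hourly_rate):
    """Yearly man-time cost if every officer gets one OPR per year."""
    return TOTAL_OFFICERS * hours_per_opr * hourly_rate

O1_RATE = 16.60   # ~first-year O-1 base pay spread over 2,080 duty hours
O3_RATE = 32.70   # ~six-year O-3 base pay spread over 2,080 duty hours

low = annual_cost(80, O1_RATE)     # conservative case
high = annual_cost(160, O3_RATE)   # upper-bound case
print(f"${low/1e6:.0f}M to ${high/1e6:.0f}M per year")  # ~$84M to ~$330M
```

The exact inputs can be argued, but the order of magnitude, hundreds of millions of dollars per year, is hard to escape.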
@Thomas, compelling numbers. I would also submit that changes in processes can increase the money spent by orders of magnitude. A good example is the switch from EMS to vPC. Just the time alone spent learning a new system eats up more man-hours for some folks. Additionally, vPC doesn't show you where an OPR is in the chain the way EMS did, so you can't go to a specific person to push the process along, again costing more man-hours. Worse still, when someone returns to home station from a year-long deployment having earned an OPR, it has to work its way through the system, and the number of folks in the chain who must look at it may not be conducive to a productive workflow (again, more man-hours).
Great stats. I agree with Steven that these numbers help us frame the problem clearly. Nice work here!
In my experience as an exec, 80 hours was a very conservative estimate. Some things to add for consideration:
1- We change jobs frequently due to the current evaluation system. If you close out twice with the same job, it "looks bad," so we must keep track of Change of Reporting Official (CRO) reports, adding additional man-hours.
2- Again, we change jobs frequently, so we are almost always operating from inexperience rather than experience, which makes us take longer than we would if we were comfortable in our jobs.
3- Our execs were often chosen from whoever was available amid TDYs, deployments, group and wing manning requests, necessary training, and PCS moves. This means less experienced personnel are often chosen simply because they are available, again increasing the time required.
Factoring in CROs done to avoid short-term reports, CROs done to prevent future short-term reports, and the CRO reports actually accomplished, I personally spent approximately 95 hours per OPR, with an average of 1.5 CROs required (1 being the minimum, since we don't let people close out with the same job twice).
Compare that to the enlisted evaluation system, where a change in rater does not require an additional report; that alone would have saved me around 1.5 hours per evaluation. We already have a functional means of recording accomplishments when an OPR is not required: the non-mandatory LOE (Letter of Evaluation). It demands no significant time investment outside the rater and ratee, and its content can be included in the overall OPR.
Great comment! This is the value of a large number of people reviewing and commenting on each other's posts. I am sure it grated on you to see 80 hours when you have experienced upwards of 95. Every hour you needlessly spend on an OPR is time you could be doing your primary job or spending with your family. Thanks for adding value to the discussion. I am looking forward to hearing your solutions to the problem in the next stage.
As a wing exec, I work directly for our Senior Rater, who is also a Development Team (DT) and promotion board member. He has stated that during DT and promotion board record reviews, members get so little time per record that a large portion of it gets overlooked. On PRFs, for example, he said board members only have enough time to adequately read and understand the award line, the strats down the side, and the SR's push line. Tying this to @Thomas Meyer's comment above: if it takes on average 80 hours to generate an OPR (and arguably just as long to generate a PRF), yet the board spends less than one minute reviewing it, there is clearly unnecessary complexity in the OES.
Additionally, since the PRF (roughly one bullet from each OPR) is what the promotion board spends most of its time reviewing, one could argue that for an OPR to be effective, it needs just one strong, PRF-worthy bullet.
I think this is a good point, but it assumes that OPRs are only valuable for promotion purposes. Aside from being a source of material for PRFs, are there other uses for the OPR that justify the investment in a more thorough record?
Good point, @Matt. I think one of the other main uses is to create an accurate record documenting performance and potential (for a variety of purposes, including job opportunities and other reviews). In that case, though, we run into the inflation COG. I'll try to do some more digging to find facts about what OPRs are primarily used for.
Replace "teachers" with "raters" and "students" with "ratees." This article provides seven research-backed principles of feedback that could help simplify the current complex process.
ReplyDelete"In higher education, formative assessment and feedback are still largely controlled by and seen as the respon- sibility of teachers; and feedback is still generally conceptualised as a transmission process, even though some influential researchers have recently challenged this view- point (Sadler, 1998; Boud, 2000; Yorke, 2003). Teachers ‘transmit’ feedback messages to students about what is right and wrong in their academic work, about its strengths and weaknesses, and students use this information to make subsequent improvements.
There are a number of problems with this transmission view when applied to formative assessment and feedback."
Nicol, David J., and Debra Macfarlane-Dick. "Formative assessment and self-regulated learning: A model and seven principles of good feedback practice." Studies in Higher Education 31.2 (2006): 199–218.
I recognize this is anecdotal, but just today an O-5 mentor told me to make sure that someone who is good at writing OPRs was doing mine, because it would affect my career if my rater didn't understand how to use the right kind of language for OPRs.
Formatting and language are indeed complex. The need to squeeze as many supporting numbers and facts as possible into bullet form has made for some interesting abbreviations and rules. When reviewing evaluations, I had to use Wing and Numbered Air Force writing guides that didn't always agree. This further adds to the complexity and man-hours needed to get the evaluation into the specified format. I can only imagine the confusion these abbreviations and acronyms cause at a promotion board, where they must be decoded.
From my experience as a squadron exec, the routing process begins four to six weeks prior to closeout (depending on how high the evaluation needs to be routed). Sadly, for most of that time the evaluation just sits and waits to be reviewed. More often than not, no significant changes are made, and the OPR won't reflect the last six weeks of the rating period.
The complexity and time demands of OPRs are especially difficult for reservists. Because of our limited time, completing only four evaluations takes approximately 10% of my rater's standard duty hours for the entire year.
This doesn't count the time spent by the active duty writer who was asked to help get the code words correct.
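A quick sanity check on that 10% figure; the annual duty-hours number below is an assumption for a traditional reservist, not an official figure:

```python
# Sanity check on the reservist's 10% claim. Annual duty hours is an
# assumed figure (~39 days x 8 hours for a traditional reservist).
ANNUAL_DUTY_HOURS = 39 * 8   # ~312 hours per year
EVALS = 4
SHARE = 0.10                 # claimed fraction of the rater's year

hours_per_eval = SHARE * ANNUAL_DUTY_HOURS / EVALS
print(f"Implied rater time per evaluation: {hours_per_eval:.1f} hours")
# ~7.8 hours of the rater's time per evaluation, before counting the
# ratee, execs, and the rest of the routing chain.
```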
The system is so complex that raters are never able to complete OPRs the way the system says they should. I have never met anyone whose rater actually wrote their OPR rather than the ratee writing it themselves. That means hours are consumed first by the ratee writing their own report and then by the rater polishing and changing the bullets, contributing to the conservative 80+ hour estimate cited above. A less complex system would cut much of that time, and your rater would actually be able to write your OPR.
I myself have spent somewhere in the two-to-three-hour range writing bullets for each of the two OPRs I have had.
During my time as a squadron ADO, I spent nine months handling the routing, editing, and signing of performance reports. Including returns for typos, mislabeled titles/signature blocks, and incorrect verbiage, each OPR made a minimum of three visits to each major and minor rung in the chain. For example, the OPR chain included the rater, usually the flight chief (though it could start lower) -> ADO -> DO -> Sq/CC -> Gp/CCE -> Gp/CC -> Wg/CCE -> Wg/CC. This chain was never traversed in a single pass, because at each stage reports were returned to some previous level for changes, with some changes made at a previous level being reversed or overridden by a higher-up. Even controlling for returns attributable to inattention to detail, that still left us with an average of two stops per rung. This number could sometimes double on the signature end if anyone at a higher level changed their mind about a strat or push line.
Here is information I just received from the exec of AFMC's SOF/ISR directorate (led by a one-star acquisitions officer). There are 108 CGOs in the directorate, and the directorate's front office spends on average 15 minutes reviewing each OPR. There are approximately another 10 minutes per OPR of "mouseclicks" to move the document to the general and for him to sign off. Finally, the admin does final tweaking at roughly 5 minutes per OPR. In total, that is roughly 30 minutes per OPR, so in this small directorate 54 hours are spent by leadership just signing these documents. This covers only the CGOs, not the large number of FGOs, and it does not account for any of the time spent at the individual level, but it does give us insight into the cost at the top.
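Restating that arithmetic explicitly (all figures are taken from the exec's note above, nothing assumed beyond them):

```python
# Leadership time spent on CGO OPRs in the AFMC SOF/ISR directorate,
# using the per-OPR figures reported above.
CGOS = 108
REVIEW_MIN = 15    # front-office review per OPR
ROUTING_MIN = 10   # "mouseclicks" to move the document and sign
ADMIN_MIN = 5      # final admin tweaks per OPR

minutes_per_opr = REVIEW_MIN + ROUTING_MIN + ADMIN_MIN
leadership_hours = CGOS * minutes_per_opr / 60
print(f"{minutes_per_opr} min/OPR x {CGOS} CGOs = {leadership_hours:.0f} hours")
# 30 min/OPR x 108 CGOs = 54 hours, before counting FGOs or any of
# the time spent below the front office.
```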