Welcome to the first stage: The problem with the OES is multi-faceted.
In order to attack this problem let's identify the Centers of Gravity (COGs).
Reply to this post with your "Gut Feel" of the problem. Recommend < 150 words in your reply.
- Think Twitter ... summarize your Think Tank Application into a BLUF
- BLUF = TAG LINE + few lines that summarize the problem.
Goal is to create "Tag Words" that can be used to develop COGs.
- Example of "Tag Words" used in previous Think Tanks included:
- Religion, Asymmetric, Women, ISIL, Education, Humanitarian, Social_Media, etc.
*If using phrases, words should be connected by an underscore, e.g. SOCIAL_MEDIA
- Idea is to use rapid brainstorming to create a Word Cloud that reveals trending words/phrases.
- Tagged words that trend will be used to ID COGs, and points will be awarded to the team.
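For illustration only, here is a minimal sketch of how the tagged replies could be tallied into the trending counts behind the Word Cloud; the sample tags, function name, and plain-text input are assumptions for illustration, not an existing tool:

```python
from collections import Counter
import re

def count_tags(replies):
    """Tally single-word and UNDERSCORE_JOINED tags across all replies (case-insensitive)."""
    counts = Counter()
    for reply in replies:
        # Tags are single words or underscore-joined phrases, per the posting rules above
        for tag in re.findall(r"[A-Za-z]+(?:_[A-Za-z]+)*", reply):
            counts[tag.lower()] += 1
    return counts

# Hypothetical tag lines pulled from replies, for illustration only
replies = [
    "Inflation, Stratification, Social_Media",
    "Feedback, Inflation, Time_Consuming",
    "Stratification, Inflation, Feedback",
]
print(count_tags(replies).most_common(3))
# [('inflation', 3), ('stratification', 2), ('feedback', 2)]
```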
Rewards conformism and ignores merit
ReplyDeleteThe Air Force needs outside-the-box thinkers, but if we incentivize conformism and ignore "real value" when identifying those who should promote, we will get groupthink and officers who have been conditioned to manage, not lead.
DeleteFor better or worse, the OPR system has little quantifiable data. This comes as a product of the varied AFSCs that compete against each other, but it works to the detriment of people who may be overlooked because they 1. don't have a strat and 2. don't have PRF-type bullets.
ReplyDeletefeedback - I agree with the earlier statement; we need better feedback. Who cares who my wingman is...my supervisor sits down with me every six months...grow me as an officer. We need more frequent feedback with developmental milestones. I don't know about the rest of the AFSCs, but I have had minimal mentorship in the medical corps. I am expected to lead airmen, yet I have no guidance on how to be a better leader; instead I get questions like who is my wingman and what are my goals??? A good leader would know all of that.
DeleteCompletely agree. I don't think that the subjective parts should be completely removed from the reports, but there should definitely be more quantitative metrics that are applicable across AFSCs.
DeleteIf we make all the effort to have performance reports, why do they have so little weight in promotion from O-1 to O-4? Statistically, most of those reports won't ever matter or be read, because the member will separate before they are used. Perhaps consider BTZ to O-4?
ReplyDeleteJust playing devil's advocate here, but do you think a quicker path to O-4 will keep people in? I don't think getting a BTZ slot to Major would be a good idea; it would potentially shorten the tactical experience that a Major would have before crossing over into the FGO realm.
DeleteMajor_Revision, Simplify, Less_Time, Easier, Valuable, Delete_myPERS - I believe the current OES needs to be majorly revised. It requires too much time and effort to produce a single OPR, PRF, or feedback statement. The transition to myPERS has made the process even more difficult, since it is not fully functional and has many errors. Create a survey-like feedback system that parallels the OPR. Gather quantitative data throughout the officer's career and promote off of simple, easy-to-understand numbers garnered from the OPR.
ReplyDeleteI concur!
DeleteYou are providing great "tag" solutions, which are useful for COAs, but I want to know the problem "tags".
DeleteTime_Consuming, Costly, Difficult_to_Understand, Not_Fair - The OPR, PRF, and feedback system is too time-consuming to continue to use. In my previous experience as an exec, it took over 80 hours of man time to process one OPR. Across the force, that equates to an enormous amount of money in lost man-hours. It also doesn't fairly capture and display the performance of those it rates.
DeleteNailed it. Nice work.
DeleteI had the same experience with OPRs as an exec, and most of the time was completely wasted. I spent the majority of my time dealing with formatting and finding ways to make bullets fit nicely rather than share valuable information.
DeleteI can understand that bullets are appealing as they can summarize a large amount of work into a single line, but that is also their downfall. Perhaps an officer was ranked #1 of 13 in his previous placement, but out of context and without a narrative relating his leadership impact on others, this could mean he was the best in a field of chaff.
DeleteEducation
ReplyDeleteMore education needs to be presented on OPRs as well, and possibly more importantly on what goes into making a good PRF. First, many officers, especially early in their careers, do not know how to properly compose an OPR bullet, whether for quarterly awards or the OPR itself. Many of the people who do know are in your A1 shop, flight commanders, and Execs. If you never hold one of these positions, you don't get much practice. Additionally, more education on what actually goes into your PRF needs to be disseminated as well. It is important to know what kinds of bullets and strats you need, and what looks good to the board.
DeleteI agree that there needs to be more education, but I also think that the bullets shouldn't be so difficult to write. Instead of crafting wordy bullets that make everyone sound like superheroes, the language should be simpler and more accurately convey the true job performance and impact. Quantitative metrics would go a long way to creating a common language and baseline from which to start and would simplify the process, reducing the need for education - not getting rid of it (!) just making the system less complicated.
DeleteI definitely agree with Marcus. The process is too difficult for the average supervisor to successfully complete for their Airmen. Did you know the AFI prohibits the ratee from writing his/her own OPR for review? Depending on how you interpret this, it means the supervisor has to create all of the bullets. In my experience, supervisors cannot do such a thing. They simply don't have all of the data.
Delete@Marcus, hit the nail on the head. One of the quickest potential solutions to the OES system would be to educate everyone on how to write bullets that can be understood across AFSCs (i.e. if a non-mil spouse/friend gets it, should be good). This would have to be accomplished from the top-down as you would encounter middle management that could potentially push back on a "non-superhero" bullet.
DeleteThe superhero bullets have always struck me as particularly odd. I felt a bit awkward selling the idea that I "spearheaded" something when simply fulfilling the duties of my job. In any event, they have become the standard, and one obstacle to changing this bullet format is that suddenly OPR bullets will seem much less impressive. A rater will see a normal, non-war-winning bullet and think "hmm, I guess this guy must have been pretty unimpressive. Let's give it to the guy that oversaw the successful execution of 13 CASEVACs (as a scheduler)." There would need to be unified and simultaneous oversight on this transition.
DeleteAmplifying_Data
ReplyDeleteOPR bullets, while some can be good, are composed mainly of abbreviations and acronyms. In general, if you gave an OPR to a normal person outside the military, and even to some inside it, it would not make much sense to them at all. Furthermore, when you really think about it, how much does this really show how good of a leader someone is, or how much they should be promoted? Adding a free-text area, with a minimum word requirement, where your primary and secondary raters write amplifying data about you could be one possible way to add more helpful information to the OES process.
DeleteI think you make an important point about the varying language, but I think the fix is to standardize the language at a higher level. If the standards were exactly the same for everyone instead of wing or unit specific standards, then everyone would (theoretically) understand everything that would go on the reports.
DeleteI agree completely that our bullets are difficult to decipher; we fill them with acronyms and strats because that's what is required in order to get promoted. This means that OPRs do not reflect performance. Moreover, bullets are often edited by people without expertise in the field due to their role as executive officers for commanders. Combined, these mean that the current system does not include the data necessary to reflect performance or establish expectations for future performance, and it unnecessarily punishes those in diverse career fields.
DeleteRequire_Feedback
ReplyDeleteOne of the best ways you can learn how you are doing is through constant feedback, not from how your OPR looks. Your OPR is made essentially to help you stand out from your peers, and while this is not a bad thing, if opinions of you or of how your career is going are based solely on that, a huge piece is missing. Find some way to instill in people the want and need for feedback so they can hold their superiors accountable for it, not because it is required but because they want the feedback to learn and grow from it.
DeleteI definitely agree with you, Ty. I think there is a way to incorporate the feedback, the OPR, and the PRF all into one similar document that can be used repeatedly. Such a source document would give feedback simply by comparison with previous versions, because it would be easy to see what progression he/she is making.
DeleteWith regard to how you will be rated for performance, however, the feedback you need would be from your rater. I desire to receive feedback from multiple people in my field and get a larger picture of how I am performing against my timeline in my job and against others in my position. If my rater has an isolated opinion, that will still be the one that impacts my chances of advancement.
DeleteRethink_Stratifications
ReplyDeleteIf someone doesn't get a high strat, or one at all, does that automatically mean they are a bad officer? I would argue no. In addition to the amplified information I suggested, finding a new way to do strats should be looked at. One issue is that for a strat to 'look good' you can't have something like "he's my #8/10 Captains," because at face value it looks bad. However, in certain areas this person could really excel. Additionally, it has come to the point where, if a commander really wants to give a #1 strat, in some cases a nonstandard category is used just so this person can get a #1 strat, because of the weight that is put on it. Allowing only certain acceptable categories for strats, as well as adding scaled areas (rated 1-10, for example) on leadership, work ethic, etc., could also help.
DeleteTy, these are good thoughts -- keep the ideas coming! I think you've hit on some key areas that will spur some great discussions into the next phases. Col Lass
DeleteTy, let me add to this...now that I've read more. Be thinking about the repercussions that lie in any kind of quantitative feedback or performance evaluation system. I like your thoughts -- just be thinking about how to fix this while dealing with human nature.
DeleteThe enlisted EPR system has a solution that provides standard scales on a 1-5 system, where not every enlisted person is a 5, but some of them are. My group no longer recognizes "soft strats" meaning any strat below a #1/XX no longer counts.
DeleteMy question would be #8/10 Captains to what end? All around? As you suggest, maybe there is a talent this person has that places him leagues above his peers in a specialized area. Do we then squander his talents by rating him against the wrong scale? It would be a shame to waste talent when it is discovered. A short narrative could also shed some light on that and perhaps relate that this candidate has a lot to offer that will not be represented in the overall rank.
Deletesubordinate_peer_input - I believe that as officers, we are leaders regardless of whether we lead a flight or just the individuals we come in contact with. Way too often, officers are being promoted to high-ranking positions and they are horrible leaders. We need a way to convey that to the boards.
ReplyDeleteI agree; often the superiors don't sit down to provide the feedback sessions, or are distant from the process, and many of us end up writing our own OPRs with the leadership unsure of what actually transpired.
DeleteRatings
ReplyDeleteIt would be beneficial for the Air Force to incorporate a system similar to the new enlisted performance evaluation process. The only difference between a high performer and someone who should probably be getting out is stratification and some stronger written bullets. How much weight does that strat carry? The strat doesn't help you until you come up on Major. We need a system that is quantifiable. A lot of top civilian corporations give their people ratings between 1 and 5. Currently, no one wants to give someone a bad OPR because they don't want to "hurt" someone's career. They are hopeful that the individual turns it around and gets serious about their career. There is too much on the line to sit back & wait on someone to care and start performing. Based on the ratings, a captain could potentially make Major BTZ on high ratings. There should be limited 4s and 5s that only the best of the best get.
DeleteRichard, as you research through this... you'll want to look into the older systems we used 25 years ago. I think you'll find it interesting as to how we got here, from there. Good luck! Col Lass
DeleteRichard, in looking at the Navy and Marine Corps Fitness Reports, or FITREPs, they already have in place a way to measure character and give it a numeric value (1-5). They also have a way to capture a rater's entire rating profile, so you can be ranked against everyone they have ever rated. Quickly and easily we can turn that data into quantitative info for the boards.
DeleteInflated, underutilized
ReplyDeleteOES is intended to provide meaningful feedback and establish a record of performance that can be assessed to help predict future potential. However, nearly every OPR states that officers are "outstanding" or "excellent," which impairs the ability to provide adequate feedback and fails to create an accurate record over time of the officer's abilities and future potential. Additionally, OPRs are essentially used to derive 1 PRF bullet. Then, promotion boards often only have time to look at the top line, bottom line, and strats of a PRF. The rest of the bullets on an OPR go relatively unused.
DeleteI completely agree, Brent. Everyone is "outstanding" and a "must for PDE". We need a system that is quantifiable.
Delete"OPRs are essentially used to derive 1 PRF bullet. Then, promotion boards often only have time to look at the top line, bottom line, and strats of a PRF. The rest of the bullets on an OPR go relatively unused." Very astute observations, Brent. Remember that as you look to create new systems, you may accidentally create new cultures that will manifest themselves just as you described our current one. The trick will be to make it as immune to such derivations as possible. Looking forward to hearing everyone's thoughts in this Tank! Col Lass
ReplyDeleteConformism, status_quo, inflation, exaggeration, stratification, tedious, time_consuming, missing_the_point
ReplyDeleteAs competition grows fiercer, manning is reduced, and the mission evolves, the Air Force is going to “better leverage the variety of experiences, special skills and exceptional potential of Airmen,” not by continuing to inflate the system, but by revolutionizing the way job performance, stratification, and significant self-improvement are captured in the body of the OPR.
DeleteInflation, artificial_stratification, style_over_substance, feedback, standards
ReplyDeleteAs competition for stratification defines promotion recommendations, OPR bullets that do not include strats are increasingly exchanged in favor of ones that do include them. Additionally, the bullet format requirements lead to a focus on using just the right word and fitting a bullet perfectly in the space rather than capturing the performance picture. This means that OPRs focus on what strats they can include in how much space, not delivering feedback or accurately capturing performance or establishing expectations for future performance.
DeleteRalph, these are great points! Be thinking about second- and third-order effects that occur because of what you've highlighted: PRFs, promotion, key job postings, etc. So, if you solve one problem... do you solve others or create new ones? Don't need an answer... just keep thinking! Col Lass
DeleteThis comment has been removed by the author.
ReplyDeleteBias, feedback, no_standard, stratification, timing
ReplyDeleteBias - multiple types of cognitive bias exist in the current system, including: availability heuristic/bias, as in a "what have you done for me lately" attitude or "he is about to go up for promotion and needs a good strat"; base-rate heuristic/bias - inflated stratification of pilots by pilots, maintainers by maintainers, etc, rather than in equal measure across AFSCs because of internal assumptions. Inclusion of more quantitative metrics along with the existing qualitative ones could give a more complete picture of where an officer stands across AFSCs, making the OPR more truly representative and making selection for promotion less complex.
DeleteFeedback - rarely occurs as prescribed, and even when it does, many feedbacks are not useful for true measurement of performance. The current open format allows flexibility, but is often used incorrectly, providing low quality feedback without specificity or concrete ways to allow comparison with other officers. More standardization of the feedback process, including some forcing functions to control the quality, would make it easier to make quantitative assessments instead of the current subjective process.
DeleteNo_Standard - Each wing and many squadrons have their own writing guide with approved acronyms, verbiage, etc. The reports are used at all Air Force levels, where the local language may be unfamiliar. If the reports are used at the highest level, that's where the standards for content should be set, and lower levels should not be able to modify the guidance. This would clarify language for PRF and promotion board raters of different career fields, reducing uncertainty and the possibility of cognitive bias affecting decisions.
DeleteStratification - Strats are not tracked over a career, can change multiple times per year, are sometimes used to give pushes instead of accurately representing standing among peers, are mostly subjective or based on arbitrary criteria instead of quantitative data, and there are many different categories (#1 CGO, #1 Capt, #1 Flight Commander, #1 Squadron Bartender, etc) instead of one agreed-upon baseline for comparison.
DeleteTiming - Close-out dates spread over different times of year create inconsistencies in how people are assessed and stratified. Instead of having one time per year when everyone is assessed and stratified together against the same standards, different op-tempos, shifting priorities, supervisory changes, and life-altering events in the chain of command can all negatively affect the consistency of assessment quality.
DeleteLack of Standardization, Bias, Inconsistencies, Useless -- very strong comments, Marcus. Will make for great discussions later! Col Lass
Deletestratification, inflation, artificial, no_real_substance
ReplyDeleteThe ideas, goals, and purpose behind the officer evaluation system are solid; it is the execution and the fear behind the numbers which have warped the outcome. The stratification system needs to be changed to show true stratifications - there can only be one number one. Being creative with numbers to give multiple people the same stratification only does a disservice to all members involved. And we should put less emphasis on being "number one" and more on being the best officer we can be - it's okay to be number two, three, etc.
DeleteAnd there needs to be a true evaluation of skills through testing or an interview panel, rather than letting inflated bullets create a picture that is not accurate. The inflation of the information bullets contain creates an artificial reality lacking true substance. It is all fluff to sound pretty.
You would think that honesty and integrity in a system that defines who we are and what we've accomplished would be second-nature, wouldn't you? Very good points, Alicia. Honesty definitely needs to be a keyword going forward. Col Lass
DeleteInflation is a very real problem in OPRs. Many of us have experienced reading our OPR after our rater works on it and feeling guilty because of the inflation process, and being told "don't believe your own OPR, but we have to do this because it is part of the game."
DeleteIt is hard to take it coming from another service but I found a valuable link to a paper written by a Navy officer on the culture of inflation in the Air Force OES. Here is the link http://www.dtic.mil/dtic/tr/fulltext/u2/a514309.pdf . I highly recommend reading this paper. It has good research on evaluation theory and presents summaries of other systems. It is from 2009 so it is a little out of date especially on the Army side but still extremely helpful.
Matthew, I've looked at the paper you recommended -- I think you've found a good piece of research here. The author hypothesizes, "The organizational structure of the military, officer-specific reward system, processes and tools of evaluation, promotion system, organizational culture, and the interaction between individuals influence personnel to inflate evaluations over time. Addressing the root cause of organizational factors and/or implementing controls on known factors in the various tools for evaluation would reduce the inertia towards evaluation inflation. In addition, addressing one element is insufficient to stem inflation; it requires a "whole of system" approach." I think she (then, Major Wolfgeher) understood just how many factors are required JUST to handle the inflation issue alone. Her recommendations on page 79 are especially worth reviewing. I think -- after your brainstorming session -- many of you will find her conclusions helpful. Great find! Col Lass
Matt, great work identifying the facts, be sure to include them in the next section. It doesn't take special analytical tools to see that inflation is a problem that we will have to deal with... thanks for getting ahead of the curve.
DeleteAlicia, you hit my argument right on. I agree that there needs to be a standardized skills test that shows quantifiable data on the Officer, similar to how the WAPS testing evaluates Enlisted members on skills tests (SKT) and professional development guide (PDG) to generate scores for promotion.
DeleteTransparency, Subordinate_feedback, Peer_feedback, Simplification, Mandatory_education
ReplyDeleteMore transparency in the OES would allow a ratee to improve his or her performance...secret ranking charts from my boss to his or her boss don't help me get better.
Incorporating subordinate feedback into an officer's rating would help increase the officer's motivation to take care of his or her people.
Mandatory_education on the system would help an officer better prepare for his or her future.
Completely agree! Ratees who are simply shown the "meets standards" / "clearly exceeds" boxes don't really take away a lot, because the stratification process compares members to each other, not to the standards. Who's to know which flight member is buddy-buddy with the Commander and thus receives a #1 strat out of the blue? Providing a quantitative review on feedback provides a paper trail of positive or negative ratings that ratees can actually utilize to improve themselves and know what to expect on their upcoming OPR.
DeleteI'm hesitant to incorporate subordinate feedback into a rater's rating. If you have a large number of subordinates, you'll have a large data pool, but it will be difficult for each subordinate to truly know their commander (i.e., a Flt/CC with 300 people). I'd be concerned the Flt/CC spends more time schmoozing his people to get good feedback from them than doing what he should be. The other side of the coin is equally a slippery slope. If you have a single subordinate who dislikes you because you had to discipline him for reason 'X', then you'll get poor feedback from him/her and now you're screwed. This could still be doable; we would just have to be extremely careful with how we choose to implement and incorporate it.
DeleteShawn, I agree we need to be careful on the subordinates feedback but I think it is critical. Supervisors see what we want them to see or in my case I write my own OPR...if my duty as an officer is to lead and mentor then somehow we need to find a way to capture that. Sounds like we need to figure out the balance but great points!
Delete"Schmoozing" with subordinates would be Individual Consideration taken to excess. The proper amount would be an excellent leadership tool. People can identify the excess just as we can identify those kissing up to their bosses. The subordinate feedback is critical, as it is our only direct feedback as to how the candidate is perceived as a leader. It is true you cannot make everybody happy, but the risk of unhappy subordinates can be mitigated by isolating outlier reviews if substantial positive feedback is received.
Deleteno_bullets, narratives
ReplyDeleteBullet format is the main problem in OPRs - they don't actually say a lot, and it's very difficult to condense what could be up to a year's worth of work and effort into a single line.
DeleteA narrative format as used by other services would go a long way in better communicating an officer's collective impact on the mission and the organizations, plus better provide details on individual accomplishments.
The formatting efforts, along with the effort to condense a sizeable amount of data down to a single line, are exceptionally time consuming, further increasing the administrative opportunity costs for units.
This is what I suggested in my paper as well. I definitely agree.
DeleteI agree that narrative has to be part of the solution.
DeleteI'm concerned with the idea of using narrative. During selection boards, the board members have to review a lot of records in little time. The current bullet format is designed to assist with that, so if we change how we do bullets to something narrative, this will have an effect on how we do promotion boards down the line. Just something to consider with making this kind of change. I think we can make bullets better without going completely narrative. Perhaps a halfway point of making the bullet less restrictive in length, format, etc.? I do agree spending so much time formatting bullets is a waste and poor design.
DeleteThis is what I suggested in my paper as well. I definitely agree.
DeleteBullets convey checked boxes; narratives paint a picture of the man. Perhaps the two can co-exist to an extent? The board will have more work? Maybe put more people on the board. Maybe have two board groups. I feel that the narrative is essential.
DeleteInflation, Rater, thesaurus, Word-count, TFI
ReplyDeleteWhen we spend more time counting letters and searching the thesaurus for the right words than we do providing verbal feedback to our ratee then something is wrong.
DeleteIt is very unhelpful to just use Yes/No checkboxes in evaluating Job Knowledge/Performance. Just as in our Wing Inspections there should be a grading scale for how well an officer does in these areas. The Wing Inspection gives 4 graded areas. Managing Resources, Improving the Unit, Leading People, and Executing the Mission. It seems that these would be helpful categories to grade.
DeleteAs a reservist, I am very concerned to make sure that the OES is done in a way that promotes Total Force Integration. With current budget realities and mission needs, the importance of TFI will only increase. In the current system I have heard several times (from O-6s) that a Reserve Captain who has been in for more than 4 years could not change to Active Duty because their bullets would not be as good as an Active Duty captain's. I questioned this, since Reservists so often embed with Active Duty. I was told that it isn't about job knowledge or execution but about the fact that Reserve raters don't know the kind of language "code" that is needed for good OPRs. If someone is good at their job and has high potential, then it shouldn't require a "code" to communicate that.
DeleteWhile officers should be looking for extra opportunities in community involvement and special programs, these should be deemphasized in the OPR process. Officers who are not as effective in their primary job can attempt to make up for that by looking for “good bullet” opportunities. Good officers will do these things regardless of whether or not they get a bullet point.
DeleteIn order to increase the value of the feedback for officers and their supervisors, the stigma of anything negative in the report should be removed. No matter how perfect an OPR looks, everyone knows that mistakes are made. How an officer has responded to mistakes, grown in an area of weakness, or shown teachability are all more important than a superhuman OPR.
DeleteMatthew, you make some great points. Be thinking about things that will make your suggestions difficult to do in today's culture -- even next-to-impossible. I've mentioned human nature before in an earlier post, but getting people to give honest, constructive, and actionable feedback and evaluations shouldn't be that hard...but it is when looked at in the greater aggregate. Inflation occurs for a reason, as does block checking, letter counting, and so on. The question is "why?", and how do you combat it in a culture that has not only grown accustomed to it, but may not even recognize they're doing it.
DeleteMore_data, physical_fitness, scaled_categories
ReplyDeleteAnother problem is the lack of comprehensive data on OPRs - they just don't say a lot about the officer. If PT is as important to officers as it is to the enlisted force, then rate officers by their physical fitness level. There is a big difference between an officer who scores 80s on the PFT and one who scores 100s. None of this is apparent on current OPRs.
DeleteStandardization seems to be a huge issue, and the lack of codified, scaled categories to rate our officers makes comparing one Air Force officer to another extremely murky and difficult. The best officer evaluation tool I have ever seen is the NAVMC 10835 - the USMC fitness report. It is in-depth, standardized, respected, and it is very easy to see whether an officer is garbage or the "eminently qualified" officer.
Andrew,
DeleteI posed our question to the group at the USAF Officer Mentorship Forum on Facebook. USMC Maj Blake Veath answered with:
Mirror the Marine Corps evaluation system. I'm not joking. Briefly, each rater carries a subordinate rating profile, by rank, through their entire career. For example, I have a profile on how I have rated all of the captains that I have ever rated. They are thusly stratified. On the master brief sheet, which is briefed at the board, they can see if I am consistently in the lower mid or upper third of all my previous raters' profiles. If you have only all stars one rating cycle, you are not trashing one or some of them by having a bottom guy, you can put them at the upper end of your historic profile. If all subordinates need work, you can indicate that with an appropriate stratification on your historical profile.
Yes, your profile is a living set of numbers and needs to be managed and kept track of EVERY time one rates a subordinate.
Let me know if you want more on this. It actually works really well for us.
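To make that mechanic concrete, here is a rough sketch of how a rater's historical profile could place a new rating -- this is an illustration of the concept only, not the actual USMC tool; the 1-5 scale, function names, and sample numbers are assumptions:

```python
from bisect import bisect_left, bisect_right

def profile_placement(history, new_score):
    """Place a new rating against a rater's full history of ratings for that rank.

    Returns the mid-rank percentile of the new score within the rater's historical
    profile and a rough third ('upper', 'middle', 'lower').
    """
    if not history:
        return None, "no history"
    ordered = sorted(history)
    below = bisect_left(ordered, new_score)
    at_or_below = bisect_right(ordered, new_score)
    percentile = 100.0 * (below + at_or_below) / (2 * len(ordered))
    third = "upper" if percentile >= 66.7 else "middle" if percentile >= 33.3 else "lower"
    return percentile, third

# Hypothetical history of every Captain this rater has ever rated (assumed 1-5 scale)
captain_history = [3.0, 3.5, 4.0, 4.0, 4.5, 3.0, 2.5, 4.0, 3.5, 5.0]
print(profile_placement(captain_history, 4.5))  # (85.0, 'upper')
```

The point of the sketch is that even a "1 of 10" strat gains meaning once it is compared against everything the rater has ever signed, rather than against one year's small pool.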
no_AADs, no_volunteering, pseudostrats, unofficial_requirements
ReplyDeleteAdvanced academic degrees - do we need them or not? Gen Welsh says that as far as the USAF is concerned, they are great but we really don't need them until maybe O-5/O-6. BUT they are essential to winning awards, which are essential to getting good strats, which are essential to good OPRs, which are essential to good PRFs. I don't get paid extra for having them, nor are they visible to my boards - or even a part of the OPR. It is an unofficial job requirement that weasels its way into being an actual requirement. The USAF needs to make a decision - are they an expectation and a formal requirement for CGOs to pursue? If so, they need to be compensated for, represented, and given credit for on OPRs. If not, then they need to be removed from the whole system - awards, strats, boards.
DeleteThe same goes for volunteering - it's not a requirement for OPRs, but in practice it is, because volunteering equals winning awards, which equals good strats, which equals good OPRs, which equals good PRFs. Again - another huge decision point for the USAF: is volunteering an important category to rate officers on? If so, then put it on the OPR and make it a formal requirement. If not, then remove the requirement from the whole system - awards, strats, etc.
DeleteI think AADs should be kept as a differentiating factor. I'm indifferent on whether or not they're on the OPR but we shouldn't take it away as a tool to differentiate between two individuals. If they're good for the AF, which I think we can all agree they are, then they should stay a part of the system, though not necessarily a big part of the system.
DeleteVolunteering, on the other hand, should count for zero. I don't think there should be awards for it, and I don't think it should count for anything on one's OPR, PRF, etc. I don't want someone leading the charge who spent more time in a soup kitchen than at work while all his coworkers are picking up the slack. I'm not saying volunteering is a bad thing - it's a great thing - I just don't think it should count towards forwarding an officer's career.
I agree that AADs should be kept in your PRF only as a source for deciding school slots. If the AF wants me to obtain an AAD, then they should send me to a school to obtain one. If you want to do it for your personal advancement, then that is your decision, but having it on any kind of board only leads to checking a box by most people which is a core issue with the Officer Evaluation System itself.
DeleteGood conversation, everyone. Be thinking about this: is developing professionally a responsibility of the individual, and should it be important to the AF? What does an AAD really mean, and should what it's in and where you get it (and how) make a difference? How have we gone from where we were (where not everyone had a Bachelor's degree) to where we are (or were recently)? Deciding what we hold dear -- aka, what we value -- as an AF is going to be a critical factor in how this think tank goes forward. Your challenge is really to identify some of these areas in ways that NO ONE has really looked at before, or identify methods that we've just plain misread. This is exciting stuff -- keep writing! Col Lass
DeleteI believe developing professionally is the responsibility of the individual. The AF should be concerned with how they are developing in the form of performance in their primary job. In today's "do more with less" mentality, we demand more out of our people, so we need to look at what we really want people to focus on. To me, that should be Air Force-specific activities. If we want to use AADs, we need a way to take the degrees people earn and use that knowledge. The Air Force seems to refuse to tap the knowledge or talent its members have. We are not efficient at using the talent we have access to.
DeleteAndrew, I would expand this to professional certifications. Certain career fields have the opportunity to further their education and receive state or federal certifications (e.g., Professional Engineer, MD, etc.), but our OES system gives a Commander no mechanism to highlight an individual with X cert as a more capable leader in that career field. We value PME, but do not seem to encourage or value education from non-DoD classrooms.
DeleteThe only time I see an AAD brought into the OPR mix is when the Commander sends his push note to the Group CC. Even then, I can't say how much of a real impact it has on whether or not the Group CC uses that to legitimize a strat. Our Commander has always said just to knock out the degree because who knows if the next Chief of Staff will mandate it for promotion. I do agree that it should count for something rather than just checking a box until it becomes a mandatory requirement. I'd certainly say there should be a line added to the OPR perhaps beneath the remarks box that allows ratees to add their degree that gets kept on all subsequent OPRs. If nothing else it gives credit to the hard work individuals have to do in addition to their daily jobs
DeleteFeedback
ReplyDeleteIn order for the OPR and PRF packages to build upon what both initial and midterm feedback have developed, the three facets must have a seamless transition among each other. Stratifications have become the bread and butter of representing a successful ratee in the OPR/PRF but aren't present at all in feedback for individuals. The current system of "meets standards" "clearly exceeds" needs to be replaced with a stratification system showing ratees where they fall with regards to their peers in the flight. Stop allowing the rater to blindly check a box and force them to legitimately rate individuals to emphasize both honest feedback and growth potential
DeleteGood point about bringing feedback inline with the OPR itself, I hadn't thought of that. I had a similar idea (with intent towards a different goal) about updating feedback to be utilized as a way ahead, with actual goal setting and accountability for missing said goal.
DeleteI do agree feedback is useless. There is no mechanism that encourages a supervisor to give valuable feedback, document it, and praise the individual when they improve post-feedback. If the idea of the final OPR is to review a year's performance, then wouldn't we value the data point of knowing where they were mid-way versus where they are now? I think this was a great addition to expand our aperture.
DeleteCode, outdated, checkbox
ReplyDeleteCheckbox - I do see a problem with people checking the box in many areas to include job titles. Commanders move people into and out of positions to "check boxes" and show "career progression" which overall has a negative impact on the AF. By moving people so often, we keep our people in a state of incompetence. By the time they become efficient at their job, we move them to another position.
DeleteFeedback, Standardization
ReplyDeleteFeedback: Digitize current form, add sections for member and rater to develop way ahead with goals set for days, weeks, months ahead. Export that to outlook calendar, allowing ratee and rater to monitor progress that can then be utilized for midterm feedback and provides data for final OPR
DeleteStandardization: We need more standardized verbiage, duty titles, etc. to make it easier to compare individuals across multiple AFSCs for promotion.
Limited, feedback, standardization
ReplyDeleteLimited: There should be more options than "Meets" or "Does Not Meet" in a rating. While we get stratifications from our raters, additional rating options would broaden the applicability of ratings across AFSCs with different missions and positions.
DeleteFeedback: Should be included in the member's file, and there needs to be stronger accountability for raters to provide feedback.
Standardization: Each career field has different "boxes to check" at different ranks, positions, etc., which can be difficult to place standard value on strats across AFSCs. As a result, promotion boards with individuals who don't have the experience of a particular AFSC may not have knowledge of the significance of different strats. Also, the OPR form should break down into different categories, such as leadership separate from job duties, so that those who are standing out as leaders vs just doing their job are able to be properly recognized.
Jillian, I struggle with this area -- you may as well. I talked about it further up in the threads, but Boolean box checking is "easy", where stratifications are technically hard if done honestly. Can you envision a system that allows you to do what you've described and be done fairly, honestly and accurately? If so, I'd be interested to see where that thought process goes. I think there's a lot of value here, but where we'll get frustrated is when we hit cultural roadblocks/barriers to entry because our system isn't as flawed as we are. Food for thought. Col Lass
ReplyDeleteThe current OES is stagnant, and it severely cuts a number of deserving airmen out of the promotion system. The OES should become a process with "the whole person concept" as the underlying factor. Using regular supervisor feedback, peer feedback/ratings, and creating challenges that promote innovation should all be a part of this process. The OES should be a MAJOR contributing factor to strengthening our Air Force with firm, active, innovative leaders. With the direction the fight is going today, we have no choice.
ReplyDeleteaccountability - The military spends millions of dollars training people in their AFSCs, yet we have individuals who are unable to carry their weight or complete their required tasks but are rated outstanding because they volunteered for a few functions. I feel we need some way to hold people accountable. If they are not able to complete their duties or are constantly missing deadlines, then the OPR should reflect that.
ReplyDeleteIneffective_Feedback, Ratings, Group_Performance, Recycled
ReplyDeleteInflated, Differentiate, Mediocre, Accomplishments, True_Measure
ReplyDeleteIt creates an incentive to lie about/inflate the most mundane accomplishments, and in doing so makes it impossible to differentiate between true achievers and the mediocre. It's not a true measure of an officer's value.
DeleteBrandon, I couldn't agree more. The whole system is inflated. Commanders/bosses have to exaggerate because everyone else is. They have to keep their people in good standing. It's the same issue as the "firewall 5" in the old EPR system.
DeleteThe metrics seem to be more important than understanding whether an individual actually developed. Mid-term feedbacks have become useless footnotes that supervisors must complete. The OES system relies too heavily on metrics that do not always represent the true accomplishments or abilities of the individual.
DeleteTime_Consuming, Time_Wasting, Number_Focused, Stratification
ReplyDeleteNo one cares about the words, only the numbers. Countless man-hours go into a product that ultimately only matters if there is a stratification attached to it.
DeleteLanguage, Writing_Guide, Standardize
ReplyDeleteThere are different writing guides for the different levels that the evaluation goes to. The differences in language can create gaps where officers aren't compared equally. On the flip side, if you can understand the shorthand, why do we need such detailed guides?
DeleteFocus on Primary_Jobs, do away with Progression_must_be_shown, remove Personal_Identifying_info from boards
ReplyDeleteWe need to put the emphasis back on the primary jobs. More often than not, there are comments about needing more "volunteer" or "self-improvement" bullets while performance in the primary AFSC is being overlooked. As a result, it is well known that our secondary, volunteer, or even self-improvement bullets far outweigh the effect of doing your primary job well. For this reason, we have safety trends showing degraded aircrew ability due to the importance placed on secondary jobs for career advancement.
DeleteDo away with the idea that we must show progression with each OPR. From an operations standpoint, progression is shown in our secondary jobs. It doesn't matter that you've made instructor or even evaluator if you haven't been a shop chief or flight commander, as well as several other minor roles in the squadron. On top of this, we also must show breadth of experience. This keeps our people operating at a level of incompetence throughout their entire careers! It "doesn't look good" to have more than one OPR close out in the same position, therefore we must move to a different shop every year. This negatively impacts the Air Force in that, about the time you really learn your job and can perform it effectively, leadership forces you to a new position where you are essentially incompetent because you have to learn everything all over again. Because of this inefficiency, officers now have to focus more on the new job rather than spending that time bettering their primary job skills, furthering their education, or, most importantly, their family.
How I would fix this is by removing personally identifying information. For example, who cares if the applicant is male, female, black, white, Hispanic, or even what year group they are in. All leadership should be concerned with is “Is this person the best for the position?” If you really want to be impartial, then we need to remove this information from the board. This will put better leaders in place and I believe they will affect the best changes if leadership is blind to any kind of bias information.
Exaggerate, embellish, inflated, biased, partiality
ReplyDeleteRaters complete OPRs using bullet tactics that fit their specific practice. It is also well known that most bullets are over-embellished relative to the work that was completed. This impacts future decisions for evaluations and promotions. It also creates a problem for people in AFSCs whose positions take more time to accomplish than those of other AFSCs.
DeleteI agree with you that we over-exaggerate and embellish bullets more often than not. One thing I don't see much of, however, is writing an OPR that accurately reflects the individual's performance. I can come up with 12 big events that a person participated in, but that just describes an overall event, not how the member performed in that event. I think we need to come up with a better way of providing that feedback.
DeleteInflexible_Feedback
ReplyDeleteFeedback - The vast majority of the valuable, actionable feedback received in my career has been informal. The feedback received at the "set times" either has been non-existent or fairly vague. Allowing supervisors and commanders the flexibility to fill in the ACA over a period of time, annotating informal feedbacks that have occurred on specific subjects in the appropriate blocks of the ACA as they happen would make things simpler for the rater, and the ratee receives the important, timely feedback that they need. There have been several instances where I have had to sit outside of a rater’s office waiting for them to complete the form for the feedback session; I have also had formal feedbacks that restated information from the informal feedbacks I had previously received, which did not help me much. If the rater were to just annotate on the form that feedback on a specific topic was given at a certain date and time, it could prevent wasted time and duplication of information in the formal feedback that was already provided informally.
fraudulent, inaccurate, misrepresentation, unstandardized, horizontal_raters
ReplyDeleteThis came from flight members.
DeleteThere is a problem with having horizontal raters, with a rater having less experience than the ratee. The way the information is presented within OPRs leads to inaccuracies due to inflation of information. The information that comes back to a member for signature is a skewed misrepresentation of their actual achievements, which leads the member to feel like what they actually did is not good enough. There is no AF-wide standardized way to write OPRs, which leads to differences from base to base. There is more concentration on format, semantics, and symbols than on actual content.
Your point about AFSCs is interesting--I hadn't really thought of it that way. My one counter to that is that we are all officers first, and while our specializations do cause some variance in evaluations, performance is performance. I'm in my second career field, and my OPRs may have different terminology on them, but the performance reflected is pretty much the same. I think it would be a challenge to differentiate by AFSC, especially in smaller career fields where you would have less senior individuals to evaluate the records--sometimes it's beneficial to have a different, outside perspective.
ReplyDeletetransparency, inconsistency, career_track, career_path, requirements_for_success
ReplyDeleteThe USAF officer evaluation system by design enables current senior leaders to establish a record of performance and, to an extent, assists selection boards in identifying the best-qualified officers for competitive opportunities. As it is currently executed, there are inherent inconsistencies within the system which must be overcome in order to make sure the future force places the best-qualified individuals into senior leadership positions. Current shortfalls of the officer evaluation system are twofold: the lack of a rater/senior rater clearly identifying a career path for an officer (outside of a push bullet), and the lack of transparency for the ratee to understand what requirements must be accomplished in order to set themselves up for success in their preferred career track.
Deleteinadequacy in the board process
ReplyDeleteOften when a package is taken to the board, it is reviewed by individuals outside of the officer's career field who do not have a true understanding of what the officer's OPRs or appraisals mean for leadership or impact. Is it possible to pull a larger portion of the board and evaluation process inside the career field, or to have larger input from like-career personnel? A lack of understanding of mission sets can negatively impact the rating of the officers.
DeleteKate, you should look at the Navy's system -- see if that meets your expectations. Promotion within career fields can be a double-edged sword, especially if there's not enough room as you move up. I don't disagree with the difficulty in understanding all career fields at the board...but look into how this works in other systems. I'll be interested to hear your thoughts. Col Lass
Deletemeaningless, too_political
ReplyDeleteBullets are mostly fluff and over-exaggerated. They don't matter if there is no strat.
DeleteI agree with your point on the two-rater system. There can really be a wide gap between your rater and additional rater; for example, in units that are geographically separated, the additional rater may have never met you. If you already have a weak relationship with a primary rater, then how are they truly able to know what kind of officer you are?
ReplyDeleteTeam, I briefed the General earlier on the volume and depth of your brainstorming -- he was truly impressed, as am I. Thank you for your inputs, and please keep them coming. I wish I had this opportunity when I came through here almost 20 years ago, but this is your time...and soon, this will be your Air Force to run. I greatly look forward to hearing more of your thoughts, and your thoughts on others' thoughts. All the best, Col Lass
ReplyDeleteProfessional_Certification, stratification, metrics
ReplyDeleteThe current OES system does not benefit the individual pursuing higher learning or professional certifications - I may be a state or nationally licensed X, but the Air Force does not recognize me as more capable as a senior leader in that same field.
DeleteStratifications and metrics have become the highest priority. The factual capabilities of each individual are lost. The OES system leans on inflated statistics, which can be molded to meet any agenda or make a young CGO seem like they carry the responsibilities well beyond actuality.
DeleteMatt, I don't know about your career field, but in 17D (Cyber), certifications definitely have weight and bearing in competitive schools and assignments, which ultimately and drastically shape your career. What career field are you in so I understand your perspective?
DeleteI am in acquisitions, and becoming a program management professional definitely doesn't help me.
DeleteBullets, Checked Boxes, Misrepresentation, Shallow Criteria
ReplyDeleteIn our focus on streamlining and comparing candidates by common benchmarks / achievements (Distinguished Graduates) the actual functional capability of the candidate can lose its representation. The presence or aura of authority that an individual can exude / impress upon their followers will not be relayed.
DeleteIt has been said before, so perhaps it will start snowballing; there needs to be a concise narrative detailing what strengths the candidate has that would otherwise not be represented in the traditional OPR format. Yes, it will take a bit longer to review, but it is necessary. If I have in my squadron an officer who does not stand out in any other way except that he can lead his subordinates to the ends of the Earth, I would like to be able to convey that.
DeleteJustin, great point of the OES being AFSC agnostic. I do feel, and this may be a stereotype, that rated officers get recognized out of boards by default, just because of how much impact their bullets can have. Dropping bombs and flying sorties will always show more impact than closing thousands of IT tickets or managing an acquisition program.
ReplyDeleteUnstandardized, Subjective ratings versus merit, lack of continuous education, no equivalent WAPS standard
ReplyDeleteThe Officer Evaluation System is unstandardized because of the stratifications and push lines that supervisors have to make. These comments are all subjective, based on the rater's opinion, and have no parameters for what makes someone qualified to be promoted other than a rater/senior rater's perspective. I think it would be beneficial to hold officers to the same standards that we hold enlisted members to by taking skills tests and professional development tests to show our knowledge, expertise, and dedication to our profession, similar to how enlisted members have the Weighted Airman Promotion System (WAPS). These tests would be primarily leadership- and management-based instead of technical, but would generate a score that reflects an officer's skill set in his/her AFSC.
DeleteI concur on a standardized examination component to officer evaluation and advancement -- while I don't think it can be the only factor to consider for evaluation and promotion, it should be considered as a component that provides a part of larger quantitative-level of assessment. Other federal agencies (FBI/NSA) use scientifically validated personality exams to screen individuals for certain positions -- surely a scientifically validated examination can be developed that evaluates leadership aptitude -- a precedent already exists for evaluating would-be officers -- the AFOQTS.
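Purely as a thought experiment, a WAPS-style composite for officers might look something like the sketch below; the factor names, weights, and 0-100 scales are hypothetical illustrations, not drawn from WAPS or any existing officer system:

```python
def composite_score(factors, weights):
    """Combine normalized factor scores (0-100 each) into a single weighted composite."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[name] * factors[name] for name in weights)

# Hypothetical factors and weights, for illustration only
weights = {"leadership_exam": 0.35, "pdg_style_exam": 0.25, "opr_rating": 0.30, "fitness": 0.10}
factors = {"leadership_exam": 82.0, "pdg_style_exam": 90.0, "opr_rating": 75.0, "fitness": 95.0}
print(round(composite_score(factors, weights), 1))  # 83.2
```

The design question such a sketch raises is exactly the one discussed above: which factors deserve weight, and whether a single number can carry the leadership picture at all.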
DeleteCryptic language; trumped-up bullets; quantitative_evaluation; bell_curve_distribution
ReplyDeleteI propose that an evaluation system that relies primarily on quantitative evaluation -- numeric scales and bell curve distribution, rather than qualitative measures, cryptic narratives, or simple binary (meets/does not meet standards) criteria -- would give all parties (rater, ratee, and the Air Force as an organization) a more accurate picture of where people stand as officers, especially relative to one another. I propose that each OPR would have a total cumulative score covering multiple areas of evaluation (including a fitness score), and that total cumulative score should fall within a bell curve distribution for all ratees of the same rank evaluated by a given senior rater.
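A minimal sketch of the bell-curve idea, assuming each senior rater's pool of same-rank ratees is normalized separately; the function name, sample scores, and the z-score approach are my own illustration, not an existing Air Force tool:

```python
from statistics import mean, stdev

def curve_scores(raw_scores):
    """Convert a senior rater's raw cumulative scores into z-scores, so each pool
    is centered on its own average rather than drifting toward the inflated top."""
    mu = mean(raw_scores)
    sigma = stdev(raw_scores)
    if sigma == 0:
        return [0.0 for _ in raw_scores]  # everyone identical: no differentiation possible
    return [round((s - mu) / sigma, 2) for s in raw_scores]

# Hypothetical cumulative OPR scores for one senior rater's Captains
pool = [88, 92, 75, 95, 83, 90]
print(curve_scores(pool))  # [0.12, 0.67, -1.69, 1.09, -0.58, 0.39]
```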
360_Feedback
ReplyDeleteThe current feedback and OPR structure does not include input from subordinates. Under the current system, leaders who are widely viewed as horrible by their subordinates can still have very positive OPRs from supervisors. This allows poor leaders to have great paperwork and get promoted. Dedicating a portion of the feedback / OPR to the thoughts of subordinates could correct this issue.
DeleteAgree. This has been a common theme. I know of several people who are great on paper but inert in the squadron.
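(A back-of-the-envelope sketch of the 360-feedback idea above: blend the supervisor's rating with the average subordinate rating so a glowing top-down score can't fully mask poor marks from the section. The 1-5 scale and the 70/30 weighting are assumptions for discussion, not a proposed policy.)

from statistics import mean

def blended_rating(supervisor_score, subordinate_scores, supervisor_weight=0.7):
    """Weighted blend of the rater's score and the average subordinate score (1-5 scale)."""
    subordinate_avg = mean(subordinate_scores)
    return round(supervisor_weight * supervisor_score
                 + (1 - supervisor_weight) * subordinate_avg, 2)

# A leader with great paperwork (5.0 from the rater) but poor marks from subordinates
print(blended_rating(5.0, [2.0, 1.5, 2.5, 2.0]))  # -> 4.1, pulled below the top-down 5.0; the weighting clearly matters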
DeleteCentralized
ReplyDeletePromotion decisions in the current evaluation system are held at too high a level. You have Medical Officers, Pilots, and Security Forces officers looking at PRFs from other career fields. While we are all officers, our languages and our understanding of job performance are all different. BL: decentralizing promotion boards will allow Pilots to promote Pilots, Medical Officers to promote Medical Officers, and Security Forces to promote Security Forces.
DeleteMost people probably agree that the OES needs to be changed, and I certainly have my own ideas about how to change it. However, if changes are going to be pursued, the first step needs to be figuring out a way to measure the potential value added by such changes. Beyond "feel good" perceptions of a potential new system, if a reformed system doesn't ultimately lead to quantifiably different results -- i.e., selecting different (presumably better) people for promotion/retention than would otherwise have been selected under the current system -- then what's the point of pursuing a new system?
ReplyDeleteLt Gen (Ret) Dan Leaf's response to this question in the USAF Officer Mentorship Forum on Facebook was:
ReplyDeleteI currently rate or senior rate under seven systems - all four services, two DoD civilian systems, and Department of State. The Air Force system is BY FAR the worst. Reasons? 1.) Reliance on apples-to-oranges stratification. 2.) Brevity. Too short to tell the story of the airman being evaluated. 3.) Norms...Air Force boards have come to expect "dumbed-down" laundry lists of facts/accomplishments with no means to fact-check or compare/contrast. I spent 33 years in the AF and have watched our system evolve and then devolve. I think the primary causes have been failure to teach raters/senior raters how to write...in plain English; and a misguided attempt to make the job of selection boards easier. The PRF/board process is better, pretty good in fact, but tied to the fatally-flawed EER/OER structure. The system needs a total redo.
Unfair, More_Quantifiers, Apples_Oranges
ReplyDeleteIn the acquisition world we will be rated against 4-10 of our peers, so even if we were the best Captain ever, we would still show as 1 out of 10. The Marine Corps utilizes a tool called a comparative assessment, which automatically places the Marine into a percentile based on the rater's entire profile. Even if you are only 1 out of 10 this year, if you are the greatest Marine ever, you will show up in the 100th percentile on this assessment, which means a lot more. This is by no means the only fix, but it is one more quantifier that is needed.
This is why quantitative evaluations are so key. As an acquisitions guy I will never have an OPR that looks as good as a fighter pilot's. I would love to have a system in place that rates each of us quantitatively on the characteristics we actually need to be a leader, and have that score placed on the OPR as well. If the scale is 1-5, the answer is 3 by default unless a higher or lower mark can be justified on that document. I don't want another 1-5 system like the old EPR where everything is inflated, but if the Navy can do it now, I don't see why we can't.
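(A sketch of the Marine-style comparative assessment described above, assuming the rater's full history of scores is available: the ratee is placed at the percentage of that history falling at or below their score, rather than being shown only as "1 of 10" this year. The numbers are invented for illustration.)

def comparative_percentile(ratee_score, rater_history):
    """Percent of the rater's historical scores at or below the ratee's score."""
    at_or_below = sum(1 for s in rater_history if s <= ratee_score)
    return round(100 * at_or_below / len(rater_history), 1)

# A Captain who is only "1 of 10" this year but outscores the rater's entire history
history = [3.1, 3.4, 3.6, 3.8, 4.0, 4.2, 4.3, 4.5, 4.6, 4.8]
print(comparative_percentile(4.9, history))  # -> 100.0, a far more meaningful signal than "1 of 10"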
ReplyDeleteFeedback - More often, better quality, and hold the rater responsible
ReplyDeleteFeedback is the cornerstone of the entire evaluation system, but it's not happening consistently. Gen Welsh stated in a memo, dated 8 Jun 14, that feedback is the most important part of the evaluation system. If you ask ten different Captains about whether or not their boss provides feedback, you'll receive inconsistent answers. Why? Because Raters are not held accountable for providing feedback; consequently, there is no incentive for Raters to prioritize feedback over other taskers.
DeleteFeedback is definitely something that can simply be pencil-whipped. It's difficult, though, for commanders to hold the leadership of their squadrons accountable, because they don't have the time to physically sit in on each session to ensure the leader is doing their job. My solution is to have stratifications implemented into the feedback sheet that align with how raters plan to strat their folks on the upcoming OPR. This is one means of proving to the commander that feedback was accomplished, and ratees have the ability to fix their shortcomings or at least have an idea of what to expect when their OPR drops.
DeleteAFSC_balance
ReplyDeleteWhen writing the OPR, guidance from senior leadership is to use as few acronyms as possible because most members of a board reviewing the OPR likely won't share your AFSC. The issue then becomes that cramming as much information as possible into a bullet without acronyms takes away from the action, the impact, and especially the result. Ratees cannot fully capture all they've accomplished in such a small space without using tools to shorten words. Solution: categorize OPRs by AFSC, and then have only officers from that AFSC sit on the board for their own people.
DeleteSpecifically regarding the feedback portion of your post: feedback should not only be documented but should also be a two-way street that then gets put into your OPR package when it's sent to the Commander for final review. Only then will the ratee be able to provide insight for the Commander as to whether their rater is actually doing their job: providing feedback on the pre-set timeline, providing useful feedback, and knowing what the ratee does and how they do it. This would guarantee ratees are receiving feedback and would also hold raters accountable, because their members are providing direct insight for the commander to see and heed.
ReplyDeleteClassified_Personnel_Management_System
ReplyDeleteThe OES does not allow members working in classified programs or organizations to accurately portray their duties on their reports and, ultimately, to the promotion board. As officers, we are promoted based on leadership potential at the next grade. I have witnessed multiple situations where an individual was performing extremely well and leading at a high standard, but their OPR could not reflect it because of the classification level it would require to do so. Several times I have seen individuals passed over for promotion because of this issue.
DeleteProposal:
Our sister services already have systems in place to track, manage, and promote their members through a classified system. The Air Force should follow suit and implement a like system to do due diligence for its members.