Peer-Level Calibration of Performance Evaluation Ratings: Are There Winners or Losers?

dc.contributor.author Bol, Jasmijn
dc.contributor.author de Aguiar, Andson Braga
dc.contributor.author Lill, Jeremy
dc.date.accessioned 2019-12-06T18:32:56Z
dc.date.available 2019-12-06T18:32:56Z
dc.date.issued 2019-08-27
dc.description.abstract In this study, we examine the common practice of employee performance rating calibration, the process in which calibration committee members discuss, compare, and potentially adjust supervisors’ preliminary subjective employee performance ratings. We highlight the inherent incentive conflict between the organization and supervisors with respect to calibration: the organization wants calibration to increase the consistency of performance ratings, while supervisors are also interested in adjustments that benefit themselves. We show that in peer-level calibration, where supervisors are involved in the calibration of their own employees’ ratings, supervisors strategically use this opportunity to influence the calibration process. Specifically, we show that incentive-driven supervisor rating behavior predicts the winners and losers of the peer-level calibration process. The adjustments (or lack thereof) made during calibration are driven not solely by the organizational objective of increased rating consistency but also by supervisors’ incentives. Our research has important implications for the designers of performance evaluation and compensation plans. It highlights the importance of the structural design and composition of calibration committees, and it cautions against overestimating the accuracy of post-calibration performance ratings when using them for important decisions such as promotions and resource allocation.
dc.identifier.uri http://hdl.handle.net/10125/64844
dc.subject Calibration
dc.subject Subjectivity
dc.subject Performance Evaluation
dc.title Peer-Level Calibration of Performance Evaluation Ratings: Are There Winners or Losers?
Files
Original bundle
Name: HARC_2020_paper_119.pdf
Size: 817.19 KB
Format: Adobe Portable Document Format