A novel approach to challenging consensus in evaluations: The Agitation Workshop
Keywords: consensus, false consensus, workshop, groupthink, evaluation, hidden, sensemaking, shared understanding
Abstract: As researchers evaluate organisations, projects, and teams, there is a desire for consensus from those within the organisations who are participating in the research. A common, consensual perspective from a team appears to reflect an optimal state, where those being evaluated share an understanding of the current state of events within the context of their environment. The question arises, though, whether an evaluation that finds consensus reflects reality: there are a variety of reasons why a common understanding may be a false consensus. Hidden behind this false consensus may be a variety of unaddressed issues which are actually the core of the problem. This paper proposes an evaluation method incorporating the principles of sensemaking and devil's advocacy, where a consensus of perspectives is challenged before it is considered valid. This is achieved in a workshop where participants reflect on their own perception of reality and represent this reality in a matrix of influencing and relevant factors. The individual matrices are then combined into a single matrix visualisation used to highlight disparities in the participants' perspectives. Discussion in the workshop then focuses on the areas, highlighted by the matrix, where differences of perspective are identified. In effect, the consensus presented by those being evaluated is challenged, and a new common understanding has to be created. Problems such as groupthink can create a false consensus, and it is proposed herein that the workshop provides a mechanism for challenging this. The objective of the research herein was to determine the feasibility and potential benefits of the proposed workshop. The workshop itself is evaluated in this paper to determine whether it has value. The benefits of such a workshop are described, showing how an organisation moved from a false consensus concerning problems within the organisation to the start of a process to address the real underlying issues.
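The abstract does not specify how the individual matrices are combined or how disparities are computed; the sketch below is an illustrative assumption, not the paper's method. It shows one plausible approach: stacking each participant's factor-influence matrix, taking the per-cell spread of ratings as a disparity measure, and flagging the cells with the greatest disagreement for workshop discussion. The factor names, rating scale, and threshold are all hypothetical.

```python
# Illustrative sketch only: combining participants' factor-influence matrices
# and flagging cells where ratings diverge most. Not taken from the paper.
import numpy as np

def combine_matrices(matrices):
    """Stack each participant's factor-influence matrix and return the
    per-cell mean and standard deviation across participants."""
    stacked = np.stack(matrices)          # shape: (participants, factors, factors)
    return stacked.mean(axis=0), stacked.std(axis=0)

def flag_disparities(std_matrix, threshold=1.0):
    """Return (row, column) positions where the spread of ratings exceeds
    the chosen threshold, i.e. where participants disagree most."""
    rows, cols = np.where(std_matrix > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

if __name__ == "__main__":
    factors = ["leadership", "resources", "communication"]   # hypothetical factors
    # Each participant rates how strongly factor i influences factor j (0-5 scale).
    alice = np.array([[0, 4, 5], [1, 0, 2], [3, 1, 0]])
    bob   = np.array([[0, 1, 5], [1, 0, 2], [0, 4, 0]])
    mean, spread = combine_matrices([alice, bob])
    for i, j in flag_disparities(spread, threshold=1.0):
        print(f"Discuss: influence of {factors[i]} on {factors[j]} "
              f"(mean {mean[i, j]:.1f}, spread {spread[i, j]:.1f})")
```

In a facilitated session, the flagged cells would simply be the regions of the combined matrix visualisation that are highlighted for discussion.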
License
Open Access Publishing
The Electronic Journal of Information Systems Evaluation operates an Open Access Policy. This means that users can read, download, copy, distribute, print, search, or link to the full texts of articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself. The only constraint on reproduction and distribution, and the only role for copyright in this domain, is that authors control the integrity of their work, which should be properly acknowledged and cited.
This Journal is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.