Introduction
The evaluation of organizational interventions targeting employee health and wellbeing has been found to be a challenging task (Murta et al., 2007). The use of process evaluation, defined as the evaluation of "individual, collective or management perceptions and actions in implementing any intervention and their influence on the overall result of the intervention" (Nytrø et al., 2000), has served to increase focus on the evaluation of the specific intervention processes and not only the outcomes. Although several evaluation frameworks have been suggested (Nielsen and Abildgaard, 2013; Nielsen and Randall, 2013), it has proven methodologically challenging to evaluate the processes of implementation of organizational interventions (OIs; Nielsen and Randall, 2013).
Two distinct approaches to process evaluation data collection are normally used. One is a quantitative approach where either standardized or intervention-specific questionnaire items are included in a follow-up questionnaire and are later integrated into statistical models of implementation and outcome (e.g., Nielsen et al., 2007; Nielsen and Randall, 2009, 2012). The other is the collection of qualitative data, often specifically as a supplement to quantitative data, using semi-structured interviews with employees and managers (Dahl-Jørgensen and Saksvik, 2005; Nielsen et al., 2006), observations of intervention activities (Brannan and Oultram, 2012), or long-term field observations (Czarniawska-Joerges, 2007). Qualitative process evaluation has been used extensively to understand the context of intervention outcomes (e.g., Mikkelsen and Saksvik, 1998; Saksvik et al., 2002; Nielsen et al., 2006; Aust et al., 2010). Each data source has its methodological strengths and weaknesses, and the concurrent mixed methods use of both quantitative and qualitative approaches has been proposed as a potential middle ground (Dahl-Jørgensen and Saksvik, 2005; Nielsen and Randall, 2013). Mixed methods is here defined "as a method [which] focuses on collecting, analyzing and mixing both quantitative and qualitative data in a single or series of studies. Its central premise is that the use of [both] approaches in combination provides a better understanding of research problems than either approach alone" (Creswell and Plano Clark, 2011, p. 5).
Although much is written about evaluation research in general (Lipsey and Cordray, 2000; Rossi et al., 2004; Pawson, 2013), and about mixed methods evaluation in general (Rallis and Rossman, 2003; Nastasi et al., 2007), the particularities and methodological considerations of using qualitative and quantitative data in mixed methods based process evaluation have been sparsely addressed (Nastasi et al., 2007), especially concerning the specifics of evaluating OIs (Nielsen and Abildgaard, 2013). Using a case of an OI in the Danish Post where questionnaires and semi-structured interviews were used for process evaluation data collection, we compare the epistemological properties of both methods and assess the benefits of different ways to collect process data.
The aim of the present study is to examine the type of knowledge about the intervention process that may be produced by quantitative and qualitative data and to discuss how these sources can best be applied in mixed methods designs. It is hence not a study of different forms of mixed methods designs (for such literature see Nastasi et al., 2007; Teddlie and Tashakkori, 2009; Creswell and Plano Clark, 2011) but instead an assessment of the properties and potential roles of specific data sources in mixed methods OI evaluation. We use a sequential mixed methods analysis to identify a set of factors in the quantitative data that function as an analytical framework with which we comparatively analyze the qualitative data. This approach helps us accentuate what knowledge about the intervention each data collection method may provide, and allows us to discuss differences and similarities.
Mixed Methods OI Evaluation
Though OI evaluation has historically focused on whether interventions improve working conditions on quantitatively measured outcomes (Griffiths, 1999), mixed methods approaches have become a commonly chosen evaluation design. A typical design would be the use of surveys to measure effects of the intervention (Bambra et al., 2007; Egan et al., 2007) and a, often minor (Egan et al., 2009), degree of interviews/observation to assess the process and implementation. Though this approach will encompass process and effect evaluation, researchers are advocating the use of more methodologically rigorous qualitative methods (Griffiths, 1999; Egan et al., 2009; Nielsen and Abildgaard, 2013), as well as more integrated mixed methods approaches (Nielsen et al., 2010) that iteratively collect and analyze data from different methods to improve the assessment of the intervention process (such as Nielsen et al., 2015). Additionally, in recent years scholars have more extensively included quantitative process measures (Havermans et al., 2016), which is a further argument for the necessity of increased clarity about which methods are most appropriate for different mixed methods evaluation tasks. To complement the focus on stronger mixed methods methodology in OI evaluation, the present study serves to shed light on what type of knowledge of the intervention is gained from qualitative and quantitative process evaluation data.
Quantitative Process Evaluation Data Collection
A commonly used way to quantify perceptions of intervention processes is the development and use of process evaluation scales (Havermans et al., 2016). Although generic scales to measure, for instance, managerial behavior and leadership (Carless et al., 2000) exist, the quantitative process evaluation approach focuses on developing scales to measure managerial attitudes and actions related directly to the intervention in question. Established intervention measures include the Intervention Process Measure (IPM; Randall et al., 2009) and the Healthy Change Process Inventory (Tvedt et al., 2009). Other approaches include using items to quantitatively assess certain key aspects of the intervention such as employees' participation in activities (Füllemann et al., 2015), perceived legitimacy of a change program (Biron et al., 2010), stakeholder support (Sørensen and Holman, 2014) or degree of implementation (Eklöf and Hagberg, 2006; Hasson et al., 2014). A review of the process variables used in organizational stress management intervention evaluation showed substantial heterogeneity in the level of measurement and the constructs that are assessed (Havermans et al., 2016).
On one hand, caution is needed when using unvalidated or tailored scales (Cox and Ferguson, 1994); on the other, using context-specific measures has been recommended by Randall et al. (2009), and seems especially promising as many strongly emphasize the need to take contextual differences into account (Johns, 2001; Biron et al., 2010; Nielsen and Abildgaard, 2013; Nielsen et al., 2014).
To demonstrate the potential use of quantitative process data, we analyze the questionnaire data for psychometrically valid factors, hence identifying scales. Identifying process factors via questionnaires offers opportunities to (1) ask the entire population about the intervention process, (2) link processes to outcomes and (3) test whether the process factors are generic, e.g., that line manager support is an important process factor across a range of interventions. This will contribute to our understanding of how process questionnaires are best put to use in the evaluation of complex OIs, and we hence pose the following research question:
Research question 1: What information about the intervention process is gained from quantitative process evaluation?
Qualitative Process Evaluation Data Collection
The other approach, qualitative evaluation, is based on collecting and analyzing data of a very different nature. Interviews, focus groups, logbooks, observations, field notes, documents, photographs, video and sound are all valid sources, though semi-structured interviews seem to be the conventional method used in numerous studies (Mikkelsen and Saksvik, 1998; Nielsen et al., 2006, 2007; Aust et al., 2010; Biron et al., 2010; Greasley and Edwards, 2015). The semi-structured interview, being based on a prefixed interview guide with the possibility of additional follow-up questions (Kvale, 2007), allows the researcher to cover both contextual factors and intervention implementation. Other methods of choice include logbooks of activities (Gilbert-Ouimet et al., 2011; Hasson et al., 2012), consultants' written reports of activities (Aust et al., 2010), electronic communication (Biron et al., 2010) and workplace observations supplemented with field notes or unstructured interviews (Mikkelsen and Saksvik, 1998).
Qualitative process evaluation has often been used to explain puzzling results from quantitative effect evaluation. For instance, in Aust et al. (2010), the intervention group's working conditions deteriorated compared to the control group. Interviews indicated this deterioration was likely caused by disappointment that the OI did not deliver the expected improvements in working conditions. Nielsen et al. (2006) demonstrated how compensatory rivalry caused one control group to improve whereas unpopular concurrent changes caused the intervention to fail in one intervention group. Greasley and Edwards (2015) used extensive qualitative interviews pre- and post-intervention to assess managerial commitment and its relation to intervention success. Studies such as these demonstrate the usefulness of qualitative methods in explaining unexpected effects and advancing our understanding of intervention mechanisms.
In summary, it is well established that qualitative data can shed light on novel phenomena relevant to interventions, but the type of knowledge it provides and how it differs from quantitative methods has not yet been addressed in relation to OI projects. To assess the characteristics of the knowledge gained from conducting process evaluation interviews, we aim to analyze the same constructs identified in the quantitative analysis to make comparison possible, and pose the second research question:
Research question 2: What information about the intervention process is gained from qualitative process evaluation?
By answering these two research questions we contribute to the growing and diverse literature on the use of qualitative and quantitative process evaluation data in mixed methods designs by providing conceptual clarity about the epistemological properties of both methods. As we analyze the same concepts in the same intervention with different data sources, we are able to compare the contributions, strengths and weaknesses of both methods. We subsequently discuss the extent of, and limits to, data collection, and how these methods can be combined in mixed methods designs for OI projects specifically.
Materials and Methods
The Organizational Intervention
The OI used a cluster randomized design in four postal areas divided into two Regions in the Danish Post. Postal mail carriers and their line managers participated in the intervention. The OI was implemented in a participatory manner where activities were adjusted to suit the participating employees and managers. The researchers randomized the two Regions into an initial intervention group (Region 1) and a waitlist control group (Region 2) that would implement an adjusted version, based on experiences from the initial OI in Region 1. In both regions the OI focused on addressing current work environment challenges as well as improving the systems for managing the long-term development of working conditions. The key intervention components comprised an interview- and questionnaire-based assessment of working conditions, a detailed evaluation of health and safety practices, a prioritization workshop, and a daylong action planning workshop. In addition, ongoing steering committee meetings were held to monitor the progress of activities and make decisions regarding the OI. A detailed presentation of the intervention can be found in Nielsen et al. (2013).
Quantitative Evaluation
Process Items
The process questionnaire contained 22 items based on the IPM questionnaire but tailored to the specific context as recommended by Randall et al. (2009). Response options were five-point Likert-type scales ranging from "strongly disagree (1)" to "strongly agree (5)." A list of the process items can be found in Table 1.
TABLE 1. Exploratory factor analysis (EFA) factor structure.
Statistical Analyses
The existence of distinct scales within the items was examined using exploratory factor analysis (EFA) with varimax rotation (N = 285, response rate 89%). Several items displayed a significant (p < 0.05) right-skewed trend; these were included based on a visual inspection, but due to the skewness, principal component analysis was chosen over maximum likelihood estimation, as recommended by Fabrigar et al. (1999). The EFA analyses followed the procedures from the original IPM development.
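For readers who wish to run a comparable analysis, the following is a minimal sketch, not the authors' actual code, assuming the process items sit in a pandas DataFrame (the file and variable names are illustrative assumptions) and using the Python factor_analyzer package; its "principal" extraction is used here as a stand-in for the principal component approach described above.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical file holding the 22 Likert-type process items, one column per item.
items = pd.read_csv("process_items.csv")

# Suitability checks of the kind reported below (Bartlett's test of sphericity and KMO).
chi2, p_value = calculate_bartlett_sphericity(items)
_, kmo_model = calculate_kmo(items)
print(f"Bartlett chi2 = {chi2:.1f}, p = {p_value:.4f}; KMO = {kmo_model:.2f}")

# Varimax-rotated extraction of four factors, mirroring the reported factor solution.
efa = FactorAnalyzer(n_factors=4, method="principal", rotation="varimax")
efa.fit(items)
print(efa.loadings_)              # item-factor loadings (cf. Table 1)
print(efa.get_factor_variance())  # variance explained per factor

Such a script would typically be run iteratively, dropping cross-loading items between runs, which is how the exclusion of items described in the Results could be handled in practice.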
Qualitative Evaluation
The Interviews
At least two employees from each team were interviewed; in larger teams one individual and one group interview (with three employees) were conducted. The interviews were conducted at the end of the implementation phase, 3 months prior to the follow-up questionnaire, and followed a semi-structured interview guide. For each work team the research team selected at random a number of informants equivalent to 10% of the work team; if an informant was not available on the day of the interview, the next person on the personnel list was selected. In total, 22 employees in Region 1 (16 individual and 2 group interviews) and 28 employees in Region 2 (19 individual and 3 group interviews) were interviewed. The interviews were tape recorded and lasted between 45 min and 1 h. The interview guide focused on the following three major topics: the intervention program and perceptions about the OI (sample question: "Which changes do you see the OI has brought about?"), changes in the workplace (sample question: "How have your daily work tasks and schedule changed during the last year?"), and hindering and facilitating factors in the context (sample question: "Which conditions in your workplace have made it difficult to achieve positive outcomes from the OI?").
Qualitative Analytic Approach
Thematic analysis (Boyatzis, 1998; Braun and Clarke, 2006) was used to analyze the interview material. To assess the difference in methodological properties between qualitative and quantitative process measures, we developed a thematic framework based on the factors derived from the exploratory factor analysis. The analysis focused on what qualities of the OI the interview data could illuminate. Once the thematic factors were identified in the factor analysis, all interviews were thoroughly read through and all parts relevant to specific themes were collected; afterward, an account illustrating both the breadth and depth of each theme was produced. We aimed to identify aspects relevant for understanding the working mechanisms of the OI and the personal perceptions and narratives of the OI, and in that sense produce detailed contextual accounts of the OI.
Results
RQ1: What Information about the Intervention Process Is Gained from Quantitative Process Evaluation?
To identify what knowledge about the intervention the process items can provide, we initially conducted exploratory factor analysis to identify constructs and scales for further analysis. In order to achieve a good factorial fit, six items were excluded due to high loadings on several factors (loadings > 0.2). The data had acceptable properties for conducting factor analysis (KMO = 0.89; Bartlett's test of sphericity p < 0.001).
The factor analysis revealed a factor structure consisting of four factors explaining 75.4% of the variance in the data. Correlations between factors and statistics from the factor analysis are presented in Table 2.
TABLE 2. Descriptive statistics and inter-correlations between the scales.
Line Manager's Actions and Attitudes
This factor consists of six items measuring managerial actions and attitudes supporting the intervention. Cronbach's alpha = 0.94. This factor explained 46.39% of the variance in the data.
Improved Psychosocial Work Environment
This factor consists of four items and covers the exposure to the intervention as well as proximal measures of intervention mechanisms (e.g., improved dialog and understanding of the psychosocial work environment). Cronbach's alpha = 0.87. This factor explained 13.8% of the variance in the data.
Information about Changes
This factor includes four items focusing on having received adequate information about changes relevant to the team. Cronbach's alpha = 0.89. This factor explained 8.23% of the variance in the data.
Need for OI
The final factor includes two items focusing on the perceived need for the work environment screening questionnaire and for the OI. Inter-item correlation = 0.31. This factor explained 6.67% of the variance in the data.
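As a side note on the reliability figures reported for these scales, the following is a minimal sketch of how Cronbach's alpha could be computed for one of the derived scales; the variable and column names are illustrative assumptions, not the study's actual data.

import pandas as pd

def cronbach_alpha(scale_items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = scale_items.shape[1]
    item_variances = scale_items.var(axis=0, ddof=1)
    total_variance = scale_items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example: six hypothetical "line manager" item columns from the items DataFrame.
# manager_scale = items[["lm1", "lm2", "lm3", "lm4", "lm5", "lm6"]]
# print(round(cronbach_alpha(manager_scale), 2))  # the study reports 0.94 for this scale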
What Knowledge Is Gained from the Questionnaires?
To answer RQ1: reducing the quantitative process questionnaire items into four distinct factors facilitated the development of distinct scales, and hence provided a shortlist of the most important aspects of implementation. The quantitative data emphasized that managerial attitudes and behaviors are of particular importance (by explaining most of the variation, and also having a particularly high internal reliability). The quantitative process questionnaire provided a reduction of the complexity of the intervention into a more manageable number of components representing different aspects of the program.
An argument for the validity of the results is the fact that the factors to a large extent overlap with the results found in the original IPM validation; especially the "Line manager's actions and attitudes" and "Improved psychosocial work environment" factors are comprised of a subset of items from the original IPM scales. The final factor, "Need for OI," is based on two items about the work environment screening questionnaire and the intervention, and has a low inter-item correlation compared to items in the other factors, likely due to the two items being targeted at different, but still somewhat related, areas of perceived need (i.e., the need for a new questionnaire, and the need for the OI in general).
It is a result that supports quantitative measurement of OI processes that the identified constructs correspond with the general literature, which has documented the distinct role of line managers (Nielsen, 2013), the importance of information (Mattila et al., 2006), the necessity of perceived change (Semmer, 2011; Nielsen and Randall, 2012) and needs assessment (Bartholomew et al., 1998). The EFA also provided four psychometrically valid factors for use in subsequent quantitative analysis. Observing the four scales, information about changes was rated more positively by the respondents than the others, which would indicate that employees were more positive with regards to this intervention area compared to the other factors. The fact that information about changes and improvements in the work environment were clearly distinct factors also suggests that perceptions of information about changes in general and perceptions about the outcome of the OI did not stem from the same underlying construct. In summary, the quantitative data identifies constructs, and a quantification of their validity, reliability and interrelatedness, which can be further applied in future studies.
Research Question 2: What Information about the Intervention Process Is Gained from Qualitative Process Evaluation?
To assess the type of information about the process that interviews may provide, we analyzed the four constructs identified in the quantitative results and compared the information to that found in the qualitative data on the same topics. We first analyzed the line manager's actions and attitudes relating to the OI; second, we looked closer at the perceptions about improvements in the psychosocial work environment; third, we assessed experiences relating to information about changes in the workplace; and finally, we analyzed the experiences related to the need for the OI. Quotes illustrating each theme can be found in the appendix labeled "Data Sheet 1."
Line Manager's Actions and Attitudes
When the interviewees were asked about the actions and attitudes of their line manager in relation to the OI, they confirmed the crucial role of line managers. They elaborated on how the actions of line managers both helped and hindered implementing the OI. A majority of employees problematized the scarcity of time and the fact that line managers often prioritized focusing on other tasks than conducting and following up on OI activities.
Interviewees expanded on this perspective and underlined the central role of line managers in making certain that OI progress was taking place and that continuous communication about the intervention process was happening. Some employees expressed positive attitudes toward management's actions during the OI, but often commented negatively on how the line managers had problems keeping their own promises. The interviews, compared to the quantitative factor, demonstrated how these everyday aspects external to the OI affected the employees' perceptions of how capable the line managers were of supporting the implementation of the OI.
Improvements in Psychosocial Work Environment
Many employees experienced positive developments during the implementation of the OI, most concretely improved social relations and team climate. Others agreed on the development but were not sure if it was due to the OI. Some employees expressed frustration with regards to having spent too much time and energy on assessment and too little on developing actions. These disappointments were linked to difficulties regarding what activities stemmed from the OI and how they related to changes in working conditions.
Some expressed a hesitance about ascribing too clear a causality between the OI and the improvements that could be observed, and others commented that the OI did lead to practical improvements, though not on a large scale. Many interviewees also commented on the OI and presented their perceptions of its working mechanisms. This demonstrates how interviews can help researchers explain why and how an OI works. For example, a clear positive factor in the interviews regarding the outcome of the intervention was, for some, a feeling of being involved and participating in the development and follow-up of activities. The clear difference to the quantitative factor is the substantial uncertainty and hesitance expressed by the employees with regards to intervention causality. Similarly, opinions and suggestions regarding the weighing of the energy spent on different components of the intervention are a parameter more easily assessed by explorative qualitative methods.
Information about Changes in the Workplace
When asked about information about changes in the workplace, respondents talked about several interrelated issues: information about the OI activities, problems of assigning time for information distribution, and general information about changes. Regarding the OI, some respondents experienced a lack of information and hence did not know where the process was headed. One interviewee explained that information did not come about by itself; one needed to actively seek out information, and another employee problematized the balancing act of having limited time to seek information.
A consistent theme in the interviews was that changes in the company on both the organizational and team level significantly affected the OI and that information about these changes was insufficient. The interviewees reported not only several cases of restructuring of work tasks but also of layoffs. These disturbances were even seen by interviewees as being used by line and area managers as excuses for not focusing sufficiently on the implementation of the OI. A problem that was raised about concurrent projects, particularly during the layoffs, was that the information and developed practices were fleeting. Several interviewees hence articulated a reluctance to commit themselves to novel projects, as many had substantial previous experience with change failure. This theme demonstrated that though employees rated the information regarding changes positively in the questionnaire, their daily experiences of lacking information and of navigating a complex organization proved difficult. The interviews also highlighted the juxtaposition of wanting more information and the cost of having to spend time acquiring it.
Need for the OI
Interviewees presented many statements about how they perceived the need for specific aspects of the OI, such as the format of being involved, developing action plans and participating. Some experienced that there had been a need for a new way of working with screening and action planning in smaller groups, while others would have preferred that everyone participated in the activities.
In the interviews, talk about the OI was also often linked to experiences with other similar activities and how they had often been forgotten in the long run. Some excused not having had sufficient time and resources for the OI due to concurrent organizational changes such as layoffs, merging teams or changing managers.
A general assessment was that the process and effect questionnaire used in the OI was too long, but some relevant aspects were identified. Some interviewees did not remember completing the questionnaire, but they often explained that they had likely done it and since forgotten about it. A group of interviewees explained that the questionnaire was superseded by concurrent events such as managerial change.
The final theme came out very differently in the interviews than in the two items in the questionnaire. Interviewees in the semi-structured interviews did not restrain themselves to just answering the questions regarding the need for the OI, but instead gave accounts of the contextual setting in which they had to assess the need for an OI. They expressed change fatigue and compared the OI to previous failed projects and an annual attitude survey that suffered from a lack of follow-up. Thus, the interviews provided important information about what factors employees consider before deciding whether to commit to an OI.
What Knowledge Is Gained from the Interviews?
The accounts and narratives identified in the four categories have a quality of being what Geertz (1973) and others have labeled "thick descriptions," meaning that it is not merely the direct thoughts and actions that are covered but also a detailed description of how they fit into a social context. The mental models of how employees perceive the intervention to work in their organizational context are similarly important to uncover in order to establish what mechanisms the participants perceive the OI to be working through (Pawson, 2013).
In the interviews we are offered explanations of how the OI fared in the practical reality of daily postal life, with hindrances such as canceled meetings, forgotten questionnaires, and unsupportive line managers. Such information is paramount in the task of providing a detailed assessment of whether an intervention as such has failed (theory failure), or whether it has not been adopted adequately to have had a chance to be effective (implementation failure; Nielsen et al., 2006). It allowed us to investigate not only the degree of implementation, but also which contextual factors caused the OI to function as it did.
A further central quality of the interviews is that they reveal how the intervention became embedded in the larger narrative of the company and became a part of its intervention history. How the intervention is seen by participants compared to previous similar projects is a key result of the interviews.
Discussion
The aim of this paper was to examine what information about the intervention process is to be gained from quantitative (RQ1) and qualitative (RQ2) process evaluation. The results have shown that for RQ1 the EFA identified four distinct factors in the data, providing a set of scales for potential further research and comparison. The qualitative data assessed in RQ2, in contrast, demonstrated how the intervention fit the organization, and provided colorful, context-specific details about the intervention.
Integrating Qualitative and Quantitative Data
A central question in mixed methods research has been how data are combined and what part different sources play in analyses (Bryman, 2007; Johnson et al., 2007; Nastasi et al., 2007; Teddlie and Tashakkori, 2009; Creswell and Plano Clark, 2011). The relevance of using a thorough qualitative assessment of the context and perceptions as well as a quantitative assessment of implementation and proximal effects of change processes seems to intuitively speak for a methodological approach where both methods are used to gauge the details of the intervention process in question (Greene et al., 1989; Rallis and Rossman, 2003; Nastasi et al., 2007). Studies have shown the potential of mixed methods by drawing on both types of process data in combination with outcome measures to get a precise estimate of processes and effects (e.g., Mikkelsen et al., 2000; Dahl-Jørgensen and Saksvik, 2005; Nielsen et al., 2006, 2015; Aust et al., 2010; Sørensen and Holman, 2014). These studies can be seen as using a form of mixed methods, labeled by Bryman (2006) as complementary mixed methods, which demonstrates how the use of one data type (qualitative in this case) to show depth and detail can complement and enrich the results from another data type showing breadth and representativeness (quantitative in this case). The current study, however, sheds light on specific aspects of the use of qualitative and quantitative data in mixed methods evaluations of organizational interventions.
The Usefulness of Questionnaire Measurement in Mixed Methods Designs
The fact that the quantitative process evaluation results presented a psychometrically valid factor structure, with constructs that were mirrored in the qualitative data, speaks for the validity of this method and points to the following characteristics: First of all, a key quality of quantitative measurement is that researchers can gain valuable data about key problems from a large proportion of the sample using few resources. If intervention outcomes are measured using pre- and post-intervention questionnaires, one should not overlook the practicality of also measuring the process using questionnaire items. Compared to conducting lengthy interviews or focus groups, it is convenient for respondents to also answer a number of process questions that measure key constructs known to be relevant for implementation and that can be linked to quantitative outcome evaluation (Murta et al., 2007; Semmer, 2011; Nielsen and Abildgaard, 2013). Quantitative process measurement also allows for the integration of process and outcome evaluation in longitudinal mediation/moderation models with tools such as structural equation modeling (Ullman and Bentler, 2003).
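To make the last point concrete, here is a minimal sketch of how a process scale could be linked to an outcome through a simple regression-based mediation check; it is a lightweight stand-in for a full structural equation model, and the file and column names (manager_support, improved_env, wellbeing_t1/t2) are illustrative assumptions rather than the study's actual variables.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical merged file with process scale scores and pre/post outcome measures.
df = pd.read_csv("evaluation_data.csv")

# Total effect: does the process factor predict the follow-up outcome, controlling for baseline?
total = smf.ols("wellbeing_t2 ~ manager_support + wellbeing_t1", data=df).fit()

# Path a: does the process factor predict the proposed mediator?
path_a = smf.ols("improved_env ~ manager_support", data=df).fit()

# Paths b and c': attenuation of the manager_support coefficient here is consistent with mediation.
path_bc = smf.ols("wellbeing_t2 ~ manager_support + improved_env + wellbeing_t1", data=df).fit()

for name, result in [("total", total), ("path a", path_a), ("paths b/c'", path_bc)]:
    print(name, result.params.round(3).to_dict())

A full SEM with latent process factors would refine this sketch, but the basic idea of quantitatively linking process and outcome measures is the same.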
Several studies have shown that interventions do not necessarily affect the entire intervention group, or have similar effects in all subgroups (Nielsen et al., 2006; Semmer, 2011). The use of quantitative data also enables the comparison of ratings of implementation across different contexts or intervention instances, which is a substantial quality of quantitative process evaluation data.
Understanding the Qualities of the OI Process and Context
First and foremost, the qualitative interviews provided a more detailed, narrative, contextual account of the themes identified in the factor analysis, which gives the reader a richer understanding of the intervention and its context than the quantitative methods. The qualitative data shed light on how organizations and their members do not exist in a historical vacuum; the intervention is compared to past activities and concurrent events. The assessment of the organizational narratives through which the intervention is seen is a fundamental quality that provides evaluation researchers and their audiences with a more nuanced understanding of the "how" and "why" of intervention processes.
Qualitative data is also central for conducting a thorough process evaluation because aspects not measured in the quantitative questionnaires are likely to be affecting the results. This was seen in quotes where the employees explained nuanced aspects of line managers' actions, how line managers were focusing on other aspects, and how information was somehow both needed, yet not wanted badly enough to call for action. Complex aspects of organizational reality such as these need to be uncovered using a qualitative assessment, as quantitative methods have difficulties illuminating them. Similarly, the interviews revealed a substantial uncertainty about which outcomes are related to which activities, a problem that is not easily assessed with the questionnaires. Identifying such problematic gaps in implementation is a key benefit of explorative qualitative assessment that helps push the implementation and evaluation of OIs further.
Another issue was how employees were focused on the increasing problems of downsizing and organizational change in the postal service. Conducting interviews where questions were posed about the general state of the organization made it possible to clarify how the changes were perceived, and hence how the changes might influence the outcome of the OI.
Implications for Mixed Methods Process Evaluation
The results from this study first of all confirm the relevance of and need for the application of mixed methods designs to the process evaluation of organizational interventions, as different methodological tasks are better handled by applying different methods. Though this study demonstrates that it is possible to combine data sources in a mixed methods analysis of specific constructs, it also puts weight behind the argument that each method would be suboptimal on its own (Greene et al., 1989; Rallis and Rossman, 2003; Nielsen et al., 2006): it is complex to accurately rate and compare degrees of implementation and support among groups of employees using the qualitative data, and with the quantitative data novel contextual events are difficult to assess (Rallis and Rossman, 2003).
A key aspect of intervention evaluation projects is that they are linked to time-limited events (i.e., the specific OI implementation), and it hence appears that researchers often conduct entirely parallel data collection designs (examples include Saksvik et al., 2002; Nielsen et al., 2006; Aust et al., 2010), possibly due to a lack of time for crossover of results and adjustment of the data collection strategy. In contrast to the parallel design, the results from this study suggest that there are potential benefits from sequentially harnessing methods to improve the evaluation, or even using reiterative cycles of mixed methods application (Nastasi et al., 2007). The results from quantitative analyses can be used to guide not only qualitative analysis (as was done in this study) but also qualitative data collection, to ensure that specific aspects that have been found to be puzzling are being qualitatively uncovered (Nastasi et al., 2007; Creswell and Plano Clark, 2011). Likewise, interviews can be used to guide survey development, both to select items and scales and to develop tailored items based on interview content (cf. Nielsen et al., 2014).
Knowing how to balance the utilization of efficient, separate qualitative/quantitative data collection against potentially more complex and time-consuming mixed methods approaches, where results from different data sources are used to inform further data collection, is not an easy job (Bryman, 2007; Mertens, 2011). The question is hence not whether or not mixed methods should be used, but instead which mixed methods design is most appropriate. Here a starting point could be to examine the program theory (Pawson, 2013) underpinning the OI and consider which aspects are most appropriately and comprehensively covered by different methods.
Strengths and Limitations
The present study used data from an OI conducted in two regions in one company. Though this is a clear limitation of the generalizability of the results, the fit with general findings in the literature suggests that the results are still usable for other researchers. As this is a study of evaluation methods, generalizability of the concrete findings is not a key quality of the study, and we therefore consider the amount of data adequate.
Another limitation is that the process data collection in the intervention is very thorough in the qualitative part and possibly not as thorough in the quantitative part, where only 16 items were used to measure the process. The quantitative results presented a limited picture of the intervention, and we might have been able to legitimate more complex analyses if we had included more items. The survey was conducted after the interviews, and hence the adaptation of the IPM may have been influenced by crucial elements of the interviews.
Conclusion
We suggest that researchers venturing into mixed methods evaluation designs carefully consider which aspects of the intervention process should be assessed by which data collection method. Qualitative process data has the potential to tie together the meaning, context and narratives of the intervention and the organization. Quantitative process data, in contrast, has the potential to represent a larger sample of individuals' opinions in a cost-effective way, tie together evaluation across contexts, and link process and outcome measures. Both are applicable in OI evaluation, but researchers must employ them wisely to harness their strengths, as they have different methodological presuppositions and answer different questions.
Author Contributions
JA and KN conducted the intervention and collected the data for the study. JA wrote the draft of the paper and conducted the qualitative and quantitative analyses. PS and KN contributed substantially to its development, refinement of the analyses, and presentation and discussion of the results.
Funding
This research was supported by the following grants: Joint Committee for Nordic Research Councils for the Humanities and the Social Sciences (NOS-HS), grant number 219610/F10; Danish National Work Environment Research Fund, grant no. 14-2009-09.
Supplementary Material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/article/10.3389/fpsyg.2016.01380
Conflict of Interest Statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
References
Aust, B., Rugulies, R., Finken, A., and Jensen, C. (2010). When workplace interventions lead to negative effects: learning from failures. Scand. J. Public Health 38, 106–119. doi: 10.1177/1403494809354362
Bambra, C., Egan, M., Thomas, S., Petticrew, M., and Whitehead, M. (2007). The psychosocial and health effects of workplace reorganisation. 2. A systematic review of task restructuring interventions. J. Epidemiol. Community Health 61, 1028–1037. doi: 10.1136/jech.2006.054999
Bartholomew, L. K., Parcel, G. S., and Kok, G. (1998). Intervention mapping: a process for developing theory- and evidence-based health education programs. Health Educ. Behav. 25, 545–563. doi: 10.1177/109019819802500502
Biron, C., Gatrell, C., and Cooper, C. L. (2010). Autopsy of a failure: evaluating process and contextual issues in an organizational-level work stress intervention. Int. J. Stress Manag. 17, 135. doi: 10.1037/a0018772
Boyatzis, R. E. (1998). Transforming Qualitative Information: Thematic Analysis and Code Development. Thousand Oaks, CA: Sage.
Brannan, M. J., and Oultram, T. (2012). "Participant observation," in Qualitative Organizational Research: Core Methods and Current Challenges, eds G. Symon and C. Cassell (London: SAGE Publications Ltd), 296–313.
Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa
Bryman, A. (2006). Integrating quantitative and qualitative research: how is it done? Qual. Res. 6, 97–113. doi: 10.1177/1468794106058877
Bryman, A. (2007). Barriers to integrating quantitative and qualitative research. J. Mix. Methods Res. 1, 8–22. doi: 10.1177/2345678906290531
Carless, S. A., Wearing, A. J., and Mann, L. (2000). A short measure of transformational leadership. J. Bus. Psychol. 14, 389–405. doi: 10.1023/A:1022991115523
Cox, T., and Ferguson, E. (1994). Measurement of the subjective work environment. Work Stress 8, 98–109. doi: 10.1080/02678379408259983
Creswell, J. W., and Plano Clark, V. L. (2011). Designing and Conducting Mixed Methods Research, 2nd Edn. Los Angeles, CA: SAGE Publications.
Czarniawska-Joerges, B. (2007). Shadowing: and Other Techniques for Doing Fieldwork in Modern Societies. Denmark: Copenhagen Business School Press.
Dahl-Jørgensen, C., and Saksvik, P. Ø. (2005). The impact of two organizational interventions on the health of service sector workers. Int. J. Health Serv. 35, 529–549. doi: 10.2190/P67F-3U5Y-3DDW-MGT1
Egan, M., Bambra, C., Petticrew, M., and Whitehead, M. (2009). Reviewing evidence on complex social interventions: appraising implementation in systematic reviews of the health effects of organisational-level workplace interventions. J. Epidemiol. Community Health 63, 4–11. doi: 10.1136/jech.2007.071233
Egan, M., Bambra, C., Thomas, S., Petticrew, M., Whitehead, M., and Thomson, H. (2007). The psychosocial and health effects of workplace reorganisation. 1. A systematic review of organisational-level interventions that aim to increase employee control. J. Epidemiol. Community Health 61, 945–954. doi: 10.1136/jech.2006.054965
Eklöf, M., and Hagberg, M. (2006). Are simple feedback interventions involving workplace data associated with better working environment and health? A cluster randomized controlled study among Swedish VDU workers. Appl. Ergon. 37, 201–210. doi: 10.1016/j.apergo.2005.04.003
Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., and Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychol. Methods 4, 272–299. doi: 10.1037/1082-989X.4.3.272
Füllemann, D., Jenny, G. J., Brauchli, R., and Bauer, G. F. (2015). The key role of shared participation in changing occupational self-efficacy through stress management courses. J. Occup. Organ. Psychol. 88, 490–510. doi: 10.1111/joop.12124
Geertz, C. (1973). "Thick description: toward an interpretive theory of culture," in The Interpretation of Cultures, ed. C. Geertz (New York, NY: Basic Books), 3–30.
Gilbert-Ouimet, M., Brisson, C., Vézina, M., Trudel, L., Bourbonnais, R., Masse, B., et al. (2011). Intervention study on psychosocial work factors and mental health and musculoskeletal outcomes. Healthc. Pap. 11, 47–66. doi: 10.12927/hcpap.2011.22410
Greasley, K., and Edwards, P. (2015). When do health and well-being interventions work? Managerial commitment and context. Econ. Ind. Democr. 36, 355–377. doi: 10.1177/0143831X13508590
Greene, J. C., Caracelli, V. J., and Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educ. Eval. Policy Anal. 11, 255–274. doi: 10.3102/01623737011003255
Griffiths, A. (1999). Organizational interventions: facing the limits of the natural science paradigm. Scand. J. Work Environ. Health 25, 589–596. doi: 10.5271/sjweh.485
Hasson, H., Brisson, C., Guérin, S., Gilbert-Ouimet, M., Baril-Gingras, G., Vézina, M., et al. (2014). An organizational-level occupational health intervention: employee perceptions of exposure to changes, and psychosocial outcomes. Work Stress 28, 179–197. doi: 10.1080/02678373.2014.907370
Hasson, H., Gilbert-Ouimet, M., Baril-Gingras, G., Brisson, C., Vézina, M., Bourbonnais, R., et al. (2012). Implementation of an organizational-level intervention on the psychosocial environment of work: comparison of managers' and employees' views. J. Occup. Environ. Med. 54, 85–91. doi: 10.1097/JOM.0b013e31823ccb2f
Havermans, B. M., Schelvis, R. M., Boot, C. R., Brouwers, E. P., Anema, J. R., and van der Beek, A. J. (2016). Process variables in organizational stress management intervention evaluation research: a systematic review. Scand. J. Work Environ. Health. doi: 10.5271/sjweh.3570 [Epub ahead of print].
Johnson, R. B., Onwuegbuzie, A. J., and Turner, L. A. (2007). Toward a definition of mixed methods research. J. Mix. Methods Res. 1, 112–133. doi: 10.1177/1558689806298224
Kvale, S. (2007). Doing Interviews. London: SAGE Publications, Ltd.
Lipsey, M. W., and Cordray, D. S. (2000). Evaluation methods for social intervention. Annu. Rev. Psychol. 51, 345–375. doi: 10.1146/annurev.psych.51.1.345
Mattila, P., Elo, A.-L., Kuosma, E., and Kylä-Setälä, E. (2006). Effect of a participative work conference on psychosocial work environment and well-being. Eur. J. Work Organ. Psychol. 15, 459–476. doi: 10.1080/13594320600901729
Mikkelsen, A., and Saksvik, P. Ø. (1998). Learning from parallel organizational development efforts in two public sector settings: findings from personnel research in Norway. Rev. Public Pers. Adm. 18, 5–22. doi: 10.1177/0734371X9801800202
Mikkelsen, A., Saksvik, P. Ø., and Landsbergis, P. (2000). The impact of a participatory organizational intervention on job stress in community health care institutions. Work Stress 14, 156–170. doi: 10.1080/026783700750051667
Murta, S. G., Sanderson, K., and Oldenburg, B. (2007). Process evaluation in occupational stress management programs: a systematic review. Am. J. Health Promot. 21, 248–254. doi: 10.4278/0890-1171-21.4.248
Nastasi, B. K., Hitchcock, J., Sarkar, S., Burkholder, G., Varjas, K., and Jayasena, A. (2007). Mixed methods in intervention research: theory to adaptation. J. Mix. Methods Res. 1, 164–182. doi: 10.1177/1558689806298181
Nielsen, K. (2013). Review article: how can we make organizational interventions work? Employees and line managers as actively crafting interventions. Hum. Relat. 66, 1029–1050. doi: 10.1177/0018726713477164
Nielsen, K., and Abildgaard, J. S. (2013). Organizational interventions: a research-based framework for the evaluation of both process and effects. Work Stress 27, 278–297. doi: 10.1080/02678373.2013.812358
Nielsen, K., Abildgaard, J. S., and Daniels, K. (2014). Putting context into organizational intervention design: using tailored questionnaires to measure initiatives for worker well-being. Hum. Relat. 67, 1537–1560. doi: 10.1177/0018726714525974
Nielsen, K., Fredslund, H., Christensen, K. B., and Albertsen, K. (2006). Success or failure? Interpreting and understanding the impact of interventions in four similar worksites. Work Stress 20, 272–287. doi: 10.1080/02678370601022688
Nielsen, K., and Randall, R. (2009). Managers' active support when implementing teams: the impact on employee well-being. Appl. Psychol. Health Well Being 1, 374–390. doi: 10.1111/j.1758-0854.2009.01016.x
Nielsen, K., and Randall, R. (2012). The importance of employee participation and perceptions of changes in procedures in a teamworking intervention. Work Stress 26, 91–111. doi: 10.1080/02678373.2012.682721
Nielsen, K., and Randall, R. (2013). Opening the black box: presenting a model for evaluating organizational-level interventions. Eur. J. Work Organ. Psychol. 22, 601–617. doi: 10.1080/1359432X.2012.690556
Nielsen, K., Randall, R., and Albertsen, K. (2007). Participants' appraisals of process issues and the effects of stress management interventions. J. Organ. Behav. 28, 793–810. doi: 10.1002/job.450
Nielsen, K., Randall, R., and Christensen, K. B. (2015). Do different training conditions facilitate team implementation? A quasi-experimental mixed methods study. J. Mix. Methods Res. 1–25. doi: 10.1177/1558689815589050 [Epub ahead of print].
Nielsen, K., Stage, M., Abildgaard, J. S., and Brauer, C. V. (2013). "Participatory intervention from an organizational perspective: employees as active agents in creating a healthy work environment," in Salutogenic Organizations and Change – The Concepts Behind Organizational Health Intervention, eds G. F. Bauer and G. J. Jenny (Dordrecht: Springer).
Nielsen, K., Taris, T. W., and Cox, T. (2010). The future of organizational interventions: addressing the challenges of today's organizations. Work Stress 24, 219–233. doi: 10.1080/02678373.2010.519176
Nytrø, K., Saksvik, P. Ø., Mikkelsen, A., Bohle, P., and Quinlan, M. (2000). An appraisal of key factors in the implementation of occupational stress interventions. Work Stress 14, 213–225. doi: 10.1080/02678370010024749
Pawson, R. (2013). The Science of Evaluation: A Realist Manifesto. Thousand Oaks, CA: SAGE.
Rallis, S. F., and Rossman, G. B. (2003). "Mixed methods in evaluation contexts: a pragmatic framework," in Handbook of Mixed Methods in Social and Behavioral Research, eds A. Tashakkori and C. Teddlie (Thousand Oaks, CA: Sage), 491–512.
Randall, R., Nielsen, K., and Tvedt, S. D. (2009). The development of five scales to measure employees' appraisals of organizational-level stress management interventions. Work Stress 23, 1–23. doi: 10.1080/02678370902815277
Rossi, P. H., Lipsey, M. W., and Freeman, H. E. (2004). Evaluation: A Systematic Approach, 7th Edn. Thousand Oaks, CA: Sage.
Saksvik, P. Ø., Nytrø, K., Dahl-Jorgensen, C., and Mikkelsen, A. (2002). A process evaluation of individual and organizational occupational stress and health interventions. Work Stress 16, 37–57. doi: 10.1080/02678370110118744
Semmer, N. K. (2011). "Job stress interventions and organization of work," in Handbook of Occupational Health Psychology, eds J. C. Quick and L. E. Tetrick (Washington, DC: American Psychological Association), 299–318.
Sørensen, O. H., and Holman, D. (2014). A participative intervention to improve employee well-being in knowledge work jobs: a mixed-methods evaluation study. Work Stress 28, 67–86. doi: 10.1080/02678373.2013.876124
Teddlie, C., and Tashakkori, A. (2009). Foundations of Mixed Methods Research: Integrating Quantitative and Qualitative Approaches in the Social and Behavioral Sciences. Los Angeles, CA: SAGE.
Tvedt, S. D., Saksvik, P. Ø., and Nytrø, K. (2009). Does change process healthiness reduce the negative effects of organizational change on the psychosocial work environment? Work Stress 23, 80–98. doi: 10.1080/02678370902857113
Ullman, J. B., and Bentler, P. M. (2003). "Structural equation modeling," in Handbook of Psychology, eds J. A. Schinka and W. F. Velicer (New York, NY: John Wiley & Sons, Inc.).