Assessing Strategic Readiness for Healthcare Analytics: System and Design Theory Implications

  • Sathyanarayanan Venkatraman
  • Rangaraja P. Sundarraj
  • Ravi Seethamraju
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10844)


The adoption of analytics solutions in hospitals is a recent trend aimed at fact-based decision making and data-driven performance management. However, the adoption of analytics involves diverse stakeholder perspectives, and there is currently a paucity of studies on how practitioners assess their organizational readiness for health analytics (HA) and make informed decisions on technology adoption given a set of alternatives. We fill this gap by designing a strategic assessment framework, guided by a DSRM approach that iteratively extends our past artifact. Our approach first entails the use of multiple in-depth case studies, as well as embedded experts from the industry, to inform the objective-setting and design process. These inputs are then supported by two multi-criteria decision-making methods. We also evaluate our framework with healthcare practitioners for both design validity and future iterations of this project. Implications of our work for the theory of design and action are also highlighted.


Keywords: DSRM · IPA · DEMATEL · Health-Analytics · Theory for design and action

1 Introduction

Healthcare organizations (HCOs) around the world have been investing in emerging information technologies to solve their business issues and challenges related to cost reduction, patient care, and performance management. Examples of this trend include the adoption of Health-Analytics (HA) to enable data-driven decision-making capability [1] and to improve healthcare processes [2, 3]. These adoptions have spread across multiple areas of the hospital (including clinical, operational, administrative and strategic business areas to derive business insights [4]) and have contributed to both long-term and short-term goals [5].

While the business case for HA may appear anecdotally sound, as with any emerging technology, hospitals face many adoption challenges related to cost, finance, business case, culture, executive support, skills, clarity, and data availability [6]. For example, the lack of an available EMR system and its data limits hospitals in leveraging HA for clinical decision support. Some researchers have raised doubts about the value creation of information technologies in hospitals [7], and past studies point out that, despite investments in technologies, there is increasing evidence of entrenched inefficiencies and suboptimal clinical outcomes [8]. The key to successful HA adoption is the hospital's readiness to adopt the technology. The HA adoption decision therefore entails multiple stakeholders at multiple levels and due consideration of the factors that impact adoption success. Although the requirements of stakeholders from various departments differ, at a foundational level the hospital needs to be strategically ready and fit to adopt HA. The research question addressed herein is: how can hospitals strategically assess their readiness for HA technology with a system-driven approach and make informed decisions on adoption? The objective of this study is to design a framework to support the above.

Past IS management studies have primarily focused on post-adoption. One example of a system for a particular pre-adoption decision can be found in [9], wherein the focus is on technology-related organizational factors. Given this gap, our research considers a strategic-level assessment. While, as in [9], we follow the Design Science Research Method (DSRM) approach [10], we bring in: (i) the additional richness of fourteen case studies; and (ii) embedded industry experts who evaluate and provide inputs based on their real-life experience. These steps of the design process provide the inputs for the instrument that forms part of our system. In addition, since the adoption of HA is a complex decision-making process, we embed suitable multi-criteria decision-making (MCDM) techniques [11, 12]. A prototype tool based on this design framework is instantiated by a team of practitioners from a hospital. With many hospitals across the world seriously exploring the feasibility of adopting HA solutions, our research is timely and relevant for practitioners.

The paper is organized further as follows: Sect. 2 provides a background to our study; Sect. 3 describes the method and the guidelines of our study; Sect. 4 details the design and development process and the artifact; Sect. 5 details demonstration and evaluation process that we followed; Sect. 6 discusses the design theory construct, learnings, and implications of this study; and Sect. 7 summarizes and concludes.

2 Background

Health-Analytics (HA) is a recent innovation in healthcare IT that enables hospitals to make evidence-based decisions in their various process areas on a day-to-day basis. Several past studies highlight how hospitals leverage HA to analyze clinical data [13], operational data [14], administrative data [15], and strategic data [16]. Hospitals derive value through the systematic use of this data and the related business insights developed through applied analytical disciplines, such as statistical, contextual, quantitative, predictive, and cognitive models, to drive fact-based decision making for planning, management, measurement, and learning [8, 17]. These business insights eventually help transform business processes in many hospital departments [5], resulting in increased efficiency of patient care, reduced healthcare costs, and improved clinical outcomes [3]. Hospitals today also collect significant amounts of structured and unstructured data [1], and this availability of data offers organizations opportunities to innovate their business processes and services through the adoption of HA technology.

In line with our research on developing an artifact to help the adoption of HA, we review past IT adoption studies to evaluate models suitable for use by practitioners in pre-adoption stages. The IT adoption body of knowledge has been made theoretically rich by popular models such as TAM [18, 19], UTAUT [20], and the DeLone & McLean IS Success Model [21]. TAM and UTAUT focus on user-level adoption, and the IS Success Model focuses on post-implementation success. Several studies have used these models in various industry contexts, including the healthcare domain [22, 23]. A few HA-specific models have also been developed, such as Brook's BI maturity assessment model [24] and the HOT-Fit model [25]. Brook's model focuses on maturity assessment of HA, and the HOT-Fit model extends the original IS Success Model.

Practitioners, to a large extent, follow models developed by the industry, such as those of the Healthcare Information and Management Systems Society (HIMSS) [26] or the Healthcare Analytics Adoption Model (HAAM) [27]. The HIMSS EMR model has eight stages that track a hospital's progress towards a paperless patient-record environment, and it integrates closely with the HIMSS analytics model, which measures an organization's analytical maturity across data, enterprise approach, leadership, strategic targets, and analytical-staff capabilities. In contrast, the HAAM model is more technical and defines various stages of HA adoption in hospitals in increasing order of maturity. These industry-developed models are practitioner-oriented, with a focus on maturity assessment after technology adoption. However, neither the academic models nor the practitioner ones cited above can be used in a pre-adoption evaluation and decision scenario, although they provide insights on the phenomenon of technology adoption and methods to measure or evaluate HA systems. Moreover, except for Brook's model, the rest of the cited models are more theoretical, trying to understand the phenomenon of IT adoption rather than being meant for practitioners' use in hospitals. Though there is a need in the industry for pre-adoption-focused models and techniques, past studies have not addressed it.

Secondly, apart from the standard IT adoption studies that model the phenomenon of adoption, we also explore studies [4, 28] that focus on the antecedents of HA adoption; these provide indicators of the factors a practitioner would consider while adopting HA at a strategic level. On similar lines, the study by Ghosh and Scott [29] highlights the technical catalysts and antecedents to developing analytics competency in the healthcare domain.

Myers et al. [30] called for frameworks for IS to assess and justify investments in IT. Developing artifacts, systems, or instruments focused on assessing a concept is not new; we have seen similar studies in IS domains [31, 32], although the objective of most of these instruments was to support quantitative surveys. Ebner et al. [33] developed a practitioner-oriented instrument for IT benchmarking over several iterations of interaction with the industry; however, it is not specific to the healthcare domain. In the healthcare IT domain, Venkatraman et al. [9] developed an artifact for pre-adoption decision support for HA. However, its focus is only on organizational aspects, and it ignores the environmental and economic aspects that impact HA adoption decisions. Hence, we found a clear need to enhance our past study with more exploratory and qualitative work and to embed practitioners in the process to increase rigor. Our current study addresses this need; below we detail the iterative method we adopted and the artifact we developed through that process.

3 Method

As per Hevner et al., design science contributes to building innovative IT artifacts to solve identified business needs [34]. The identification of our design research problem was triggered by the industry's requirement for a tool or framework that executives can use to assess their readiness to adopt HA technology. As the heart of our study lies in design science, our emphasis is on the construction-oriented view of IS. In keeping with this principle, we follow the Design Science Research Process (DSRP) proposed by Peffers et al. [10] to develop an artifact that practitioners in hospitals can use while making decisions on adopting HA technology.

DSRP, a gated methodology with six stages, emphasizes continuous evaluation and feedback to previous steps to refine the design. Past studies [33] have developed instruments (artifacts) through multiple iterations of field testing with organizations. Our current study (Ref. Fig. 1) aims at a similar goal with two iterations of the design process, but provides additional rigor through a case-study approach and novel methods like embedding experts into the design. The objective of this research is to refine the outcome from iteration one [9] and produce an artifact closer to industry requirements. In doing so, we now explain how our study followed the guidelines of design science research, as codified by Hevner et al. [34]:
  1. Design as an Artifact: The objective of our research is to produce a tangible artifact in the form of a Strategic Assessment Framework for HA Adoption Decisions.

  2. Problem Relevance: The problem we intend to solve is helping executives make informed decisions on HA adoption to minimize risks. Its relevance is high, as the industry, amidst uncertainties about the business value of IT investments [7], is on the cusp of adopting HA into mainstream business to enhance clinical outcomes and streamline hospital-management processes while optimizing the cost of operations.

  3. Design Evaluation: The design was carried out in two iterations (Ref. Fig. 1). Through our engagements with industry practitioners in the case-study organizations, we received feedback on the initial version of the artifact from the past study [9], which led us to enhance the objective of our study. The new artifact includes strategic assessment of readiness in addition to the maturity (tactical) assessment in the original artifact. Section 6 details the further evaluation of the enhanced artifact and the feedback from practitioners. We will continue to evolve our design through further evaluations and refinements.

  4. Research Contributions: There is a paucity of research on the antecedents that can explain adoption decisions and on artifacts that practitioners can use to assess their readiness to adopt HA technology. Our research intends to fill this gap.

  5. Research Rigor: We use a qualitative case-study approach to explore the landscape of HA adoption factors, antecedents, and challenges before developing the artifacts. In our first iteration we had three participants from industry; in the current study we expanded our reach to twenty-seven practitioners to understand the aspects of the problem and possible solutions. We also embedded experts from a hospital in our design process with the intent of constructing an artifact closer to industry requirements.

  6. Design as a Search Process: We understand the iterative nature of design science research; hence every artifact (typology, framework, assessment tool) that we produce is planned to be tested and validated in the field, with a feedback loop to modify the artifacts based on actual usage.

  7. Communication of Research: We will use conferences and publications to present both the technical and management views of the synthesized information, research data, findings, and the artifact.

Fig. 1. Iterative design of artifact.

4 Design and Development of the Artifact

In this stage, the core of our design science research, we started by determining the artifact's functional requirements (Ref. Fig. 2) and then designed the actual artifact by closely engaging with a team of expert practitioners from one of our cases. We built the framework iteratively, with our team meeting regularly to validate and enhance the technical design and the accuracy of the algorithms. Design science research relies on the application of rigorous methods in both the construction and evaluation of the design artifact [34], and we relied on past literature and detailed case-study analysis to determine the functional requirements. The review of past research [4, 5, 35] gave us insights into the areas where hospitals implement analytics, and we used that as a base to explore and conduct our case-study interviews.
Fig. 2. Determining the core functional requirements of the artifact.

Our case-study analysis confirmed that, broadly, HA is adopted in: (a) clinical areas, to make decisions pertaining to diagnosis and medical interventions; (b) operational areas, such as labs, pharmacies, theatres, and all functions supporting the core clinical function; (c) administrative areas, such as HR, resources, and facilities; and (d) strategic areas, such as finance, planning, and customer relations. Apart from validating the functional areas of HA, our discussions with the cases in the second iteration provided the following three crucial inputs on the considerations for assessing HA adoption readiness:
  1. The readiness of hospitals can vary with the functional area in which they want to adopt HA. As an example, many cases in our study were ready for HA adoption in operational and administrative areas, but not on the clinical side.

  2. While assessing their readiness for HA, hospitals consider several factors related to economics, technology, organization, and environment; HA adoption is thus a complex multi-criteria decision.

  3. Since the span of applications for HA is vast, the requirements of stakeholders from various departments differ. Moreover, the functional responsibility (CIO/CTO/CFO/COO/CEO) of a stakeholder affects how he/she evaluates HA and assesses the readiness to adopt.


Our expanded case-study analysis in the second iteration provided fifteen antecedents or "factors of consideration" (compared with the five basic factors considered in the first iteration) that hospitals should use to assess their readiness to adopt HA. We derived these factors by coding text segments in the interview transcriptions using the MAXQDA 2018 software. Initially, a priori codes were set up based on past literature and the TOE theoretical framework [36]. We then applied axial coding to select the core thematic categories present in the interview transcripts and discovered common patterns and relations. The codes were grouped under technological, organizational, and environmental themes, and based on further grouping we added economic factors as an additional theme. Out of a total of 1,232 coded segments in the transcripts from interviews with 27 respondents, the codes with high frequency that were widely cited by participants as important factors were short-listed.

The details of the codes, coding frequency, and their definitions in the context of the artifact that we plan to develop are given below (Ref. Table 1). The table also contains additional information on the primary focus area (marked as P in the table) of the stakeholders by their functional responsibility. Our CxO-level case-study participants had a mix of technical, operational, and business profiles. In our cases, the technical executives (CIO, CTO) were focused on ensuring the readiness from a HA technology
Table 1.

Short-listed factors and stakeholders' focus. Factors and their definitions in the artifact, with code frequency in brackets (the stakeholder-priority columns of the original table mark the primary focus areas by role):

Economic factors:
  • F1 Cost (28): The extent to which the HA cost is affordable
  • F2 ROI (26): The extent to which HA delivers return on investments
  • F3 Benefits (58): The extent to which HA provides qualitative benefits
  • F4 Economies of scale (20): The extent to which HA offers economies of scale

Technological factors:
  • F5 IS maturity (40): The current maturity of the IS (EMR, HIS) that needs to integrate with HA
  • F6 IT maturity (40): The current maturity of the underpinning IT that needs to integrate with HA
  • F7 Medical infra maturity (46): The current maturity of the available medical equipment that needs to integrate with HA
  • F8 Data quality (76): The extent to which the current systems produce quality data analysable by HA

Organizational factors:
  • F9 Leadership & vision (56): The extent to which HA aligns with the leadership vision and would gain support for adoption
  • F10 Competency (29): The extent of the skills required to develop, use, and sustain HA competency
  • F11 Culture (24): The extent to which the organizational culture supports fact-based decision making with HA
  • F12 User adoption (84): The extent to which users would accept and adopt HA

Environmental factors:
  • F13 Govt. regulation (38): The extent to which HA helps achieve compliance with government regulations
  • F14 Competition (16): The extent to which HA helps the hospital operate in a competitive environment
  • F15 Supply-demand state (13): The extent to which the current supply-demand state of resources would cause issues in HA adoption
perspective; the operational executives (COO) were concerned about the role of HA in enhancing hospital performance; and the business executives (MD, CEO) were focused on the alignment of HA with their long-term vision. Based on the above, we derived two critical functional requirements for the artifact:
  • Strategic assessment to include fifteen factors (F1–F15) (Ref. Table 1).

  • Preference elicitation methods to be used to seek inputs from the group of stakeholders.

4.1 The Artifact: Strategic Readiness Assessment Framework for HA

We now describe the artifact development stages (Ref. Fig. 3). The case-study data provided inputs on the factors of consideration for readiness assessment, the functional areas of the hospital, and the types of stakeholders involved in decision making. To construct the framework with the desired output, these inputs had to be coded into mathematical derivations and algorithms. To ensure the reliability and accuracy of such algorithms, we depended on reusable, time-tested techniques proven in many empirical studies. A key benefit of reusable design artifacts is that they can be instantiated and combined in different ways to produce concrete designs [37, 38]. Our objective was also to develop an artifact for practitioners that puts techniques from the academic world to use. Towards that objective, we embedded the IPA [11] and DEMATEL [12] techniques in our artifact, which we explain now.
Fig. 3. Artifact development stages.

IPA, a well-known technique, was first introduced by Martilla and James [11] as a means of management diagnosis of new-product success in marketing. Due to its simplicity and the ease of interpreting IPA maps, it has been used in many studies, including in IS [39]. In our study, we use it as a framework to capture stakeholders' judgments on the perceived importance ("importance") of each readiness factor (F1–F15) (Ref. Table 1) and on the envisaged performance ("performance") of HA adoption in the context of that factor. We elicit judgments from multiple stakeholders on a 1–7 Likert scale and calculate the mean of the responses for each of the fifteen factors, so every readiness factor carries a pair of importance and performance mean scores. Based on these scores, we draw an IPA graph (Ref. Fig. 4) plotting the factors in four quadrants, with the logic that: (a) factors with high importance and low performance need more focus; (b) those with high importance and high performance should maintain the status quo; (c) those with low importance and high performance should be defocused because of possible overkill and undue current focus; and (d) those with low importance and low performance can be deprioritized. The IPA graph gives executives a simple interface to understand their overall readiness for HA adoption and the areas they need to focus on to get business value from HA investments.
Fig. 4. IPA: importance–performance graph.
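The IPA computation described above can be sketched in a few lines; the factor names, the sample stakeholder ratings, and the use of the grand means as quadrant crosshairs are illustrative assumptions for this sketch, not data or conventions fixed by the artifact itself.

```python
# Minimal IPA sketch: mean importance/performance per factor, then quadrant labels.
# Ratings are on a 1-7 Likert scale; the quadrant crosshair used here is the
# grand mean of each axis (a common IPA convention, assumed for illustration).

def ipa(ratings):
    """ratings: {factor: [(importance, performance), ...]} from stakeholders."""
    means = {
        f: (sum(i for i, _ in rs) / len(rs), sum(p for _, p in rs) / len(rs))
        for f, rs in ratings.items()
    }
    imp_bar = sum(i for i, _ in means.values()) / len(means)
    perf_bar = sum(p for _, p in means.values()) / len(means)
    quadrant = {}
    for f, (imp, perf) in means.items():
        if imp >= imp_bar and perf < perf_bar:
            quadrant[f] = "concentrate here"       # high importance, low performance
        elif imp >= imp_bar:
            quadrant[f] = "keep up the good work"  # high importance, high performance
        elif perf >= perf_bar:
            quadrant[f] = "possible overkill"      # low importance, high performance
        else:
            quadrant[f] = "low priority"           # low importance, low performance
    return means, quadrant

# Hypothetical judgments from three stakeholders for two of the fifteen factors:
means, quads = ipa({
    "Data quality": [(7, 3), (6, 2), (7, 4)],
    "Competition":  [(2, 5), (3, 6), (2, 5)],
})
```

In this hypothetical run, "Data quality" (high importance, low performance) lands in the "concentrate here" quadrant, while "Competition" lands in "possible overkill".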

DEMATEL was originally proposed by Gabus and Fontela [12] to study and resolve complicated, intertwined problem groups in decision scenarios with multiple criteria. DEMATEL has been successfully used in many past studies [40, 41] dealing with complex decisions, and HA adoption being a similar one, we embed this technique in our artifact. The advantage of DEMATEL is its ability to investigate the interrelations among criteria and build a Network Relationship Map (NRM) as an outcome (Ref. Fig. 5). The steps involved in this technique are as follows (a detailed explanation of the mathematics involved in deriving the relationships is beyond the scope of this paper):
  • Step 1: Multiple stakeholders' inputs on the impact of each factor on the others are captured through pairwise comparisons and aggregated into the direct relationship matrix (D).

  • Step 2: The direct relationship matrix is then normalized (N).

  • Step 3: A total relationship matrix is computed from N as T = N(I − N)⁻¹, where I is the identity matrix.

  • Step 4: From T, the influence strengths of the factors are calculated.

  • Step 5: Finally the NRMs (causal diagrams) are built based on the influence strengths.

Fig. 5. DEMATEL: network relationship map for the factors impacting HA.
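Steps 1–5 above can be sketched as follows; the 3×3 example matrices, the two-expert setup, and the 0–4 influence scale are illustrative assumptions, not data from the study.

```python
# Minimal DEMATEL sketch following Steps 1-5 of the technique.
import numpy as np

def dematel(expert_matrices):
    """expert_matrices: list of square arrays of pairwise influence scores."""
    # Step 1: aggregate expert judgments into the direct relationship matrix D.
    D = np.mean(np.array(expert_matrices, dtype=float), axis=0)
    # Step 2: normalize by the largest row sum.
    N = D / D.sum(axis=1).max()
    # Step 3: total relationship matrix T = N (I - N)^-1.
    n = D.shape[0]
    T = N @ np.linalg.inv(np.eye(n) - N)
    # Step 4: influence strengths: r (influence given) and c (influence received).
    r, c = T.sum(axis=1), T.sum(axis=0)
    # Step 5: prominence (r + c) and net cause/effect role (r - c) give the
    # coordinates for plotting the network relationship map (NRM).
    return T, r + c, r - c

# Two hypothetical experts scoring mutual influence among three factors (0-4 scale):
e1 = [[0, 3, 2], [1, 0, 3], [2, 1, 0]]
e2 = [[0, 2, 2], [2, 0, 4], [1, 1, 0]]
T, prominence, net_effect = dematel([e1, e2])
```

Factors with a positive net effect (r − c > 0) are plotted as causes in the NRM; those with a negative value are effects.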

In our artifact, we use DEMATEL to assess the interrelationships between the HA factors at two hierarchical levels: (a) group-level interrelationships between the economic, organizational, technological, and environmental aspects; and (b) factor-level interrelationships within each aspect. As an output, we get five NRMs that provide insights on the cause-effect relationships of the factors (Ref. Fig. 5). The combination of the IPA graph and the DEMATEL NRMs provides a strategic view of the organization's readiness to adopt HA.

5 Demonstration and Initial Evaluation

We built the artifact in collaboration with a team of practitioners in one of our cases, with whom we carried out a formal demonstration and walk-through of the Strategic Readiness Assessment Framework for HA. The objective of this initial evaluation was to gather feedback on its usefulness and on how realistically it represents decision scenarios. The hospital chosen for evaluation is one of the largest eye-care hospitals in India and is on the cusp of adopting HA to enhance its clinical outcomes. The evaluation team comprised the CITO (Chief Information and Technology Officer) and two IT managers from his team, who have been working closely with many HA vendors and piloting their solutions. We first presented the system, detailing the objectives and the various factors for which inputs are needed in the framework. Subsequently, we walked the practitioners through the complete system and demonstrated it with test data. The practitioners worked on the prototype hands-on and provided feedback on their perceptions of adopting the framework for making HA adoption decisions. We used the instrument designed by Moore and Benbasat [32] to measure perceptions of adopting an information-technology innovation for our formal evaluation. The constructs used in the evaluation and the expert-review feedback are summarized below (Ref. Table 2).
Table 2. Evaluation of strategic readiness adoption framework for HA.


Constructs evaluated

Expert review comments (verbatim)


Voluntariness: Degree to which the use of RAF is perceived being voluntary

“Use of such assessment framework is voluntary. We already have been doing similar assessments but may not be very structured.”


Relative advantage: The degree to which the use of RAF enhances the job performance

“RAF is useful, but more than a strategic assessment, it would be good to have a technical assessment or evaluation tool for HA products. The demands of each of functional departments which includes clinical and administration drives adoption of Analytics and assessments differently.”


Compatibility: The degree to which the use of RAF is compatible with or requires a change in one’s job

“This tool fits into the kind of strategic work I do on a daily basis. Good for managers who need to make technology investment decisions.”


Ease of Use: The degree to which RAF is easy to learn and use

“Currently the interface is very rustic and not intuitive. More automation is needed to improve the tool.”


Result demonstrability: The degree to which the results from RAF are demonstrable

“The results reflect my assessments i have done earlier using other informal methods. However, the RAF method seems to be more scientific.”


Trialability: The degree to which it is possible to try using RAF

“Need to try out again after the user interfaces are improved.”

Our initial evaluation provided several crucial findings. First, the user interface needs to be improved with more automation to increase ease of use for the practitioner; this validates past research on the impact of "ease of use" [19] in ensuring technology is adopted. Second, one size does not fit all: separate evaluation modules for the clinical, operational, administrative, and strategic functions need to be provided, as the readiness requirements of each differ within a hospital. This was reinforced through a written comment from a practitioner on the evaluation form: "Having one view of the organization maturity will be quite difficult to develop and implement in view of specific challenges and diverse needs of operations." Finally, a technical evaluation of HA products needs to be included as a separate module, to be used at the next level once the organizational readiness for HA is determined; this could also be an independent artifact by itself.

6 Discussions

We have constructed an artifact that can bring efficiency and effectiveness into the HA adoption decision-making process. In the DSR design contribution framework [42], which maps application-domain maturity against solution maturity, our artifact falls under the quadrant "Extended Known Solutions for New Problems", i.e., adopting solutions from other fields and applying them in a new domain. In developing the artifact, we took known solutions from different domains and applied them to our new problem of technology adoption in healthcare: the IPA technique is more prevalent in the marketing domain, and the DEMATEL MCDM technique in social and industrial domains. We applied these techniques to enable healthcare executives to make informed decisions on HA adoption.

6.1 Artifact as a DSR Knowledge

While communicating the process of the design work, it is critical that we also examine how the developed artifact contributes to theorizing in design science [42], in other words, the "artifact as DSR knowledge." Reflecting on the iterative DSR process we went through in designing the artifact, we map the process and the output (Ref. Table 3) to the eight components of the design theory structure proposed by Gregor and Jones [43].
Table 3. Design theory for strategic readiness adoption framework for HA adoption.



Purpose and scope

• Constructing a strategic framework that can be used by healthcare executives to make informed decisions on HA adoption

Constructs


• Semi-structured case study questionnaire

• Interview transcripts

• Algorithm for IPA analysis

• Algorithm for DEMATEL analysis

Principles of form and function

• Strategic assessment to include economic, organizational, technological and environmental factors

• Preference elicitation methods to be used to seek inputs from the multiple stakeholders (CIO/COO/CFO/CEO)

Artifact mutability

• The two iterations that changed/enhanced artifacts provide insights into the mutability:

– The semi-structured questionnaire for case study interviews had to be modified for organizational profiles (health service provider, health service eco-system partner such as insurance)

– Based on where HA is adopted the assessment focus changes

Testable propositions

• P1: The readiness factors change based on the functional area where HA is adopted

• P2: The importance of factors change based on stakeholder profile and responsibility (CIO/CTO/COO/CFO/CEO)

Justificatory knowledge

• The readiness factors/antecedents were derived from past academic literature and further validated with case studies

• The case study was based on TOE framework

• IPA and DEMATEL techniques formed the basis of the artifact

Principles of implementation

• The artifact is designed for group decision making, and the assessment is to be carried out with multiple stakeholders to get an organizational view of readiness

• Recommended sample size (6–15) to get a reliable output

Expository instantiation

• MS Excel based elicitation tool with built-in functions to calculate the IPA and DEMATEL logic

Our artifact aids decision making that is scientific, evidence-based, and involves the relevant stakeholders in the organization. The multi-case-study analysis, iterative design, and "embedded expert" approach to the design process yielded several learnings. First, involving practitioners in the evaluation cycle not only enhanced the artifact but also fundamentally changed the objective of the design, which highlights the importance of evaluation and constant feedback between stages [10]. Second, our study reiterated the importance of user-centric design thinking, as many people look at the same problem in different ways; it is not just "what problem" but also "whose problem" that matters. Third, a design process that embeds practitioners can produce useful artifacts closer to industry requirements.

Finally, in our view, a study focused on the pre-adoption of technology is as important as one focused on post-adoption. As Sherer [7] rightly puts it, “traditional IT value research approaches that deal with the outcome of past IT investments through post hoc analysis will be neither timely nor relevant to influence health care practice now, when substantial investment incentives are spurring adoption and industry change”. Our study is a step towards supporting practitioners who face challenges in demonstrating the business value of IT in healthcare [7] by providing them with a useful framework and tool.

7 Conclusion

Triggered by real-life industry issues in HA adoption and guided by the design science research method, we progressed from the identification of the problem to the construction of an artifact that helps hospitals make an informed decision on HA adoption. We also applied research rigor in engaging with industry and embedding experts in the design process. Our design will be an ongoing activity, with multiple iterations of future enhancements; with the industry seriously exploring the adoption of HA technology, we believe that our study is valuable. For academics, this study opens up the possibility of research focused on developing artifacts for HA technical-maturity assessments and decision-support tools. Another possibility is creating web-based benchmarking tools that assess the industry on technology adoption and provide a comparative view of the organization.


References

  1. Groves, P., Kayyali, B., Knott, D., Van Kuiken, S.: The “big data” revolution in healthcare. McKinsey Q. 22 (2013)
  2. Ammenwerth, E., Brender, J., Nykänen, P., Prokosch, H.U., Rigby, M., Talmon, J.: Visions and strategies to improve evaluation of health information systems: reflections and lessons based on the HIS-EVAL workshop in Innsbruck. Int. J. Med. Inform. 73, 479–491 (2004)
  3. Raghupathi, W., Tan, J.: Information systems and healthcare: charting a strategic path for health information technology. Commun. Assoc. Inf. Syst. 23, 501–522 (2008)
  4. Venkatraman, S., Sundarraj, R.P., Seethamraju, R.: Healthcare analytics adoption-decision model: a case study. In: Proceedings of PACIS 2015 (2015)
  5. Ward, M.J., Marsolo, K.A., Froehle, C.M.: Applications of business analytics in healthcare. Bus. Horiz. 57, 571–582 (2014)
  6. Lavalle, S., Hopkins, M.S., Lesser, E., Shockley, R., Kruschwitz, N.: Analytics: the new path to value. MIT Sloan Manag. Rev. 52(1), 1–24 (2010)
  7. Sherer, S.A.: Advocating for action design research on IT value creation in healthcare. J. Assoc. Inf. Syst. 15, 860–878 (2014)
  8. Cortada, J.W., Gordon, D., Lenihan, B.: The value of analytics in healthcare. IBM Institute for Business Value (2010)
  9. Venkatraman, S., Sundarraj, R.P., Mukherjee, A.: Prototype design of a healthcare-analytics pre-adoption readiness assessment (HAPRA) instrument. In: Parsons, J., Tuunanen, T., Venable, J., Donnellan, B., Helfert, M., Kenneally, J. (eds.) DESRIST 2016. LNCS, vol. 9661, pp. 158–174. Springer, Cham (2016)
  10. Peffers, K., Tuunanen, T., Rothenberger, M.A., Chatterjee, S.: A design science research methodology for information systems research. J. Manag. Inf. Syst. 24, 45–77 (2008)
  11. Martilla, J.A., James, J.C.: Importance-performance analysis. J. Mark. 41, 77–79 (1977)
  12. Gabus, A., Fontela, E.: The DEMATEL observer – DEMATEL 1976 report. Battelle Geneva Research Center, Geneva (1976)
  13. Shneiderman, B., Plaisant, C., Hesse, B.W.: Improving healthcare with interactive visualization. IEEE Comput. Soc. 46, 58–66 (2013)
  14. Songthung, P., Sripanidkulchai, K., Luangruangrong, P., Sakulbumrungsil, R.C., Udomaksorn, S., Kessomboon, N., Kanchanaphibool, I.: An innovative decision support service for improving pharmaceutical acquisition capabilities. In: 2012 Annual SRII Global Conference, pp. 628–636 (2012)
  15. Peck, J.S., Benneyan, J.C., Nightingale, D.J., Gaehde, S.A.: Characterizing the value of predictive analytics in facilitating hospital patient flow. IIE Trans. Healthc. Syst. Eng. 4, 135–143 (2014)
  16. Aktaş, E., Ülengin, F., Önsel Şahin, Ş.: A decision support system to improve the efficiency of resource allocation in healthcare management. Socio-Econ. Plann. Sci. 41, 130–146 (2007)
  17. Davenport, T.H., Harris, J.G.: Competing on Analytics: The New Science of Winning. Harvard Business Press, Boston (2007)
  18. Davis, F.D.: A technology acceptance model for empirically testing new end-user information systems: theory and results (1986)
  19. Venkatesh, V., Davis, F.D.: A theoretical extension of the technology acceptance model: four longitudinal field studies. Manage. Sci. 46, 186–204 (2000)
  20. Venkatesh, V., Morris, M.G., Davis, G.B., Davis, F.D.: User acceptance of information technology: toward a unified view. MIS Q. 27, 425–478 (2003)
  21. DeLone, W.H., McLean, E.R.: The DeLone and McLean model of information systems success: a ten-year update. J. Manag. Inf. Syst. 19, 9–30 (2003)
  22. Hikmet, N., Bhattacherjee, A., Menachemi, N., Kayhan, V.O., Brooks, R.G.: The role of organizational factors in the adoption of healthcare information technology in Florida hospitals. Health Care Manag. Sci. 11, 1–9 (2008)
  23. Yu, P.: A multi-method approach to evaluate health information systems. Stud. Health Technol. Inform. 160, 1231–1235 (2010)
  24. Brooks, P., El-Gayar, O., Sarnikar, S.: A framework for developing a domain specific business intelligence maturity model: application to healthcare. Int. J. Inf. Manage. 35, 337–345 (2015)
  25. Yusof, M.M., Kuljis, J., Papazafeiropoulou, A., Stergioulas, L.K.: An evaluation framework for health information systems: human, organization and technology-fit factors (HOT-fit). Int. J. Med. Inform. 77, 386–398 (2008)
  26. Davis, M.W.: The seven stages of EMR adoption: majority of hospitals are in stage 3 and rising. Healthc. Exec. 25, 18–19 (2010)
  27. Sanders, D., Burton, D., Protti, D.: The healthcare analytics adoption model (HAAM): a framework and roadmap
  28. Malladi, S.: Adoption of business intelligence & analytics in organizations – an empirical study of antecedents. In: Proceedings of AMCIS 2013, pp. 1–11 (2013)
  29. Ghosh, B., Scott, J.E.: Antecedents and catalysts for developing a healthcare analytic capability. Commun. Assoc. Inf. Syst. 29, 395–410 (2011)
  30. Myers, B.L., Kappelman, L.A., Prybutok, V.R.: A comprehensive model for assessing the quality and productivity of the information systems function. Inf. Resour. Manag. J. 10, 6–26 (1997)
  31. Lee, Y.W., Strong, D.M., Kahn, B.K., Wang, R.Y.: AIMQ: a methodology for information quality assessment. Inf. Manag. 40, 133–146 (2002)
  32. Moore, G.C., Benbasat, I.: Development of an instrument to measure the perceptions of adopting an information technology innovation. Inf. Syst. Res. 2, 192–222 (1991)
  33. Ebner, K., Mueller, B., Urbach, N., Riempp, G., Krcmar, H.: Assessing IT management’s performance: a design theory for strategic IT benchmarking. IEEE Trans. Eng. Manag. 63, 113–126 (2016)
  34. Hevner, A.R., March, S.T., Park, J., Ram, S.: Design science in information systems research. MIS Q. 28, 75–105 (2004)
  35. Zhang, N.J., Seblega, B., Wan, T., Unruh, L., Agiro, A., Miao, L.: Health information technology adoption in U.S. acute care hospitals. J. Med. Syst. 37(2), 9907 (2013)
  36. Tornatzky, L.G., Fleischer, M., Chakrabarti, A.K.: The processes of technological innovation (1990)
  37. Purao, S., Storey, V.C.: Evaluating the adoption potential of design science efforts: the case of APSARA. Decis. Support Syst. 44, 369–381 (2008)
  38. Han, T., Purao, S., Storey, V.C.: Generating large-scale repositories of reusable artifacts for conceptual design of information systems. Decis. Support Syst. 45, 665–680 (2008)
  39. Skok, W., Kophamel, A., Richardson, I.: Diagnosing information systems success: importance-performance maps in the health club industry. Inf. Manag. 38, 409–419 (2001)
  40. Ahmadi, H., Nilashi, M., Ibrahim, O.: Organizational decision to adopt hospital information system: an empirical investigation in the case of Malaysian public hospitals. Int. J. Med. Inform. 84, 166–188 (2015)
  41. Amiri, M., Salehi, J., Payani, N., Shafieezadeh, M.: Developing a DEMATEL method to prioritize distribution centers in supply chain. Manag. Sci. Lett. 1, 279–288 (2011)
  42. Gregor, S., Hevner, A.R.: Positioning and presenting design science research for maximum impact. MIS Q. 37, 337–355 (2013)
  43. Gregor, S., Jones, D.: The anatomy of a design theory. J. Assoc. Inf. Syst. 8, 312–335 (2007)

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Sathyanarayanan Venkatraman (1)
  • Rangaraja P. Sundarraj (1)
  • Ravi Seethamraju (2)

  1. Department of Management Studies, IIT Madras, Chennai, India
  2. Business School, The University of Sydney, Sydney, Australia