Human capital analytics: why aren't we there? Introduction to the special issue

Dana Minbaeva (Department of Strategic Management and Globalisation, Copenhagen Business School, Frederiksberg, Denmark)

Journal of Organizational Effectiveness: People and Performance

ISSN: 2051-6614

Article publication date: 5 June 2017


Citation

Minbaeva, D. (2017), "Human capital analytics: why aren't we there? Introduction to the special issue", Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 110-118. https://doi.org/10.1108/JOEPP-04-2017-0035

Publisher: Emerald Publishing Limited

Copyright © 2017, Emerald Publishing Limited



This is a very special special issue. It is organized as a dialogue among HR executives, analytics practitioners, consultants, and academics, and it consists of invited scholar papers, regular submissions, perspectives, and opinion pieces. This structure is necessary given the nature of the phenomenon that is the focus of this special issue: human capital analytics (HCA).

The idea for this special issue came up at a one-day conference organized by the HCA Group at Copenhagen Business School in October 2016. The conference brought together a panel of distinguished speakers from the industry, academia, and consulting, and asked them one question: why do companies struggle to move to analytics?

There has been enormous interest in HCA among businesses, consultants, and academics. Analytics has been called a game changer for the future of HR (van der Togt and Rasmussen, 2017, p. 150). Moreover, as van den Heuvel and Bondarouk (2017, p. 129) state: “Inspired by success stories of organizations generating up to $100 million in savings, while at the same time improving the engagement and productivity of employees, advanced HR analytics is fast becoming mainstream […] and increasingly considered as an indispensable HR [and management] tool”.

Nevertheless, organizations have struggled to move from operational reporting to true analytics. A study undertaken by Deloitte (2015) found that although 75 percent of surveyed companies believed that using HCA was important for business performance, only 8 percent evaluated their organizational capabilities in this area as “strong.” Several consultancy reports and numerous LinkedIn blogs concur: despite the vastness of available corporate data, organizations have been slow to develop their HCA capabilities, and those that have focused on such development have struggled to move from operational reporting for benchmarking and decision making to analytics in the form of statistical analysis, the development of “people models,” and the analysis of dimensions to understand causes and deliver actionable solutions (Bersin et al., 2014).

Why do we need to move from operational reporting to analytics?

One might wonder what the fuss is all about. Plenty of companies advance with operational reporting, and are able to create or insource some well-designed dashboards and distribute backward-looking reports to business units on a regular basis. As van der Togt and Rasmussen (2017) point out, “decent Management Information (i.e. having the facts available) often already generates 80 percent of the value without sophisticated analytics because it allows fact-based diagnostics and decisions” (p. 150).

Why, then, is moving to analytics desirable from the company’s point of view? To answer this question, we can draw an analogy from the knowledge management field, which clearly distinguishes among data, information, and knowledge (Awad and Ghaziri, 2004). Data are a set of unorganized and unprocessed facts. The meaning one brings to the evaluation of data transforms it into information. Unlike data, information has meaning and shape because it is organized for a purpose. Although information is about facts, it is based on reformatted and processed data. Knowledge is an understanding of information based on its perceived importance or relevance for a problem area. Data are available to anyone and, as such, are not relevant for organizational competitive advantage. Information may be a source of short-term performance parity. However, knowledge is the ultimate source of organizational competitive advantage (see the knowledge-based view of the firm; Grant, 1996). Similarly, information derived from operational reporting is based on an aggregation of data that makes processing easier. For example, dashboards assemble data in a format that provides a snapshot of the organization. In contrast, HCA is “the systematic identification and quantification of the people-drivers of business outcomes” (van den Heuvel and Bondarouk, 2017, p. 130) with the purpose of improving day-to-day evidence-based decisions and, ultimately, initiating transformational processes in the organization. Analytics should allow an organization to balance intuition, experience, and beliefs with hard facts and evidence: “It helps focusing on what really matters and on what works and what doesn’t work” (van der Togt and Rasmussen, 2017, p. 150).

In a nutshell, a move from reporting to analytics is desirable, as companies that only use data for reporting are at a significant disadvantage. A reliance on operational reporting may result in short-term performance parity, but will not result in a sustained competitive advantage – “the ability to create more economic value than the marginal (breakeven) competitor in its product market” (Peteraf and Barney, 2003, p. 314).

Why do companies struggle to move to analytics?

At this point, everyone agrees that analytics and evidence-based decisions are the future. The path to the ultimate goal has been identified, steps have been defined, and successful cases and “how we did it” stories are ready to guide our way. So, after five years of extensive conversations, why do we still have so far to go? Perhaps our experts can provide us with some answers.

In this special issue, the dialogue begins with an invited scholar paper by John Boudreau and Wayne Cascio. In tackling the question of why most organizations still find themselves struggling to move from operational reporting to analytics, Boudreau and Cascio distinguish between “push” and “pull” factors. I will borrow this distinction to organize my introduction to this special issue.

Push

The “push” factors, or the conditions necessary for suitable HCA to be available, are analyzed using the LAMP model (Boudreau and Ramstad, 2007). Each element of the model – logic, analytics, measures, and process – highlights a number of reasons why organizations struggle to fully adopt HCA. In relation to “logic,” which is defined as frameworks that articulate the connections between human capital and performance, Boudreau and Cascio talk about the need for a deep understanding of the “relationship between higher performance in individual actions and collective interactions, and their effect on unit and organizational outcomes” (p. 121). Such an understanding will enable analysts to ask “strategically relevant questions” and present them in “a logical framework that shows the intervening linkages between HR investments and critical organizational outcomes” (p. 121). As Levenson and Fink (2017) point out, “the most effective organizations begin every analytics project with a key question or investment decision as the focus […] this approach avoids the dreaded ‘that’s interesting’ response to analytic results” (p. 162).

With regard to the “analytics” element, Boudreau and Cascio point to the lack of “logical depth to clarify these [analyzed] relationships” and the “lack of sufficient sophistication or power in the analytics models” (p. 121). In other words, the models are overly simplistic and underspecified – they are often about whether variable X leads to variable Y, and rarely include model specifications, such as control variables, moderation variables, or mediating variables. Notably, some analysts appear to have been extremely successful with such simplistic models. Boudreau and Cascio give an example of a successful use of a simple correlation between employee engagement and unit-level results, which the analyst presents in a visually striking way.
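The contrast between such a bivariate model and a better-specified one can be sketched with synthetic data (all variable names, effect sizes, and the data itself are invented here purely for illustration): when a confounder such as unit size drives both engagement and unit-level results, the engagement coefficient from the underspecified model is inflated relative to the model that includes the control variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: unit size confounds both engagement and results.
n = 500
unit_size = rng.normal(0.0, 1.0, n)                    # control variable (confounder)
engagement = 0.7 * unit_size + rng.normal(0.0, 1.0, n)
results = 0.9 * unit_size + 0.2 * engagement + rng.normal(0.0, 1.0, n)

# Underspecified model: regress results on engagement alone.
X_simple = np.column_stack([np.ones(n), engagement])
b_simple, *_ = np.linalg.lstsq(X_simple, results, rcond=None)

# Better-specified model: add the control variable.
X_full = np.column_stack([np.ones(n), engagement, unit_size])
b_full, *_ = np.linalg.lstsq(X_full, results, rcond=None)

print(f"engagement effect, no controls:  {b_simple[1]:.2f}")  # inflated
print(f"engagement effect, with control: {b_full[1]:.2f}")    # close to the true 0.2
```

The simple slope absorbs the confounder's influence, which is exactly the trap that visually striking bivariate charts can hide.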

Why is an underspecified model a problem? In any complex system, like an organization, an effect cannot be attributed to a single factor, as numerous factors explain variance in unit-level results and affect the relationship between engagement and those results. However, this is not just about models. As Boudreau and Cascio point out, turning analytical insights into concrete business actions begins with effective storytelling with data. David Green from IBM Watson Talent (2017) is on the same page: “You can create the best insights in the world, but if you don’t tell the story in a compelling way that resonates with your audience then it is highly likely that no action will be taken” (p. 174).

When discussing the third element in the LAMP framework, “measures,” Boudreau and Cascio warn of the danger of making significant investments but failing to progress with analytics. They write:

To be sure, data management remains a significant obstacle to more widespread adoption of HR analytics […]. That said, it is also far too common that the massive databases available are still built and structured to reflect early models of HC analytics […]. At best these kinds of data represent operational or advanced reporting, and not strategic or predictive analytics that incorporate analyses segmented by employee population and that are tightly integrated with strategic planning (p. 122).

Indeed, this is not about big data. It is about smart data – data that is organized, structured, and continuously updated. Poorly organized firm data can be very costly. When formal, centralized coordination of data collection is lacking, we often see such problems as data duplication and incorrect entries. Moreover, such a situation makes it impossible to combine different data sets; creates unexplained breaks in time-series or longitudinal data; and leads to data inconsistencies due to the proliferation of various metrics, codings, or time frames. Accordingly, analyses based on such data are rarely comparable or combinable over time, and seldom show the impact of human capital on organizational outcomes.
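A minimal sketch of such data-hygiene checks, using invented records and field names, might flag duplicated entries and inconsistent codings before any analysis is run; in practice these checks would sit at the start of a data pipeline rather than in a script like this.

```python
from collections import Counter

# Hypothetical HR records drawn from two uncoordinated systems; all field
# names and values are invented for illustration.
records = [
    {"emp_id": "1001", "dept": "Sales", "fte": 1.0},
    {"emp_id": "1001", "dept": "Sales", "fte": 1.0},   # duplicate entry
    {"emp_id": "1002", "dept": "sales", "fte": 1.0},   # inconsistent coding
    {"emp_id": "1003", "dept": "R&D",   "fte": 0.8},
]

# Check 1: duplicated employee IDs caused by uncoordinated data collection.
id_counts = Counter(r["emp_id"] for r in records)
duplicates = sorted(eid for eid, c in id_counts.items() if c > 1)

# Check 2: the same department coded in more than one way.
dept_codings = {}
for r in records:
    dept_codings.setdefault(r["dept"].lower(), set()).add(r["dept"])
inconsistent = sorted(k for k, v in dept_codings.items() if len(v) > 1)

print("duplicate IDs:", duplicates)           # ['1001']
print("inconsistent codings:", inconsistent)  # ['sales']
```

Until such problems are resolved at the source, analyses built on the data remain neither comparable nor combinable over time.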

Data management is expensive, but it is necessary. Levenson and Fink (2017) refer to poor data management as “lack of basic hygiene,” stating “this is especially […] the case where organizations have complex, fragmented ecosystems, and databases that evolved along with their core processes, rather than being designed as part of a coherent data strategy” (p. 165). Olly Britnell, Global Head of Workforce Analytics at Experian, explains: “Your data does not need to be perfect to do people analytics, it does need to be credible” (Green, 2017, p. 173). In other words, the organization must be able to trust the data (see also Levenson and Fink, 2017, on criticizing the data).

Finally, “process” is about “communication and knowledge transfer mechanisms through which the information becomes accepted and acted upon by key organization decision makers” (Boudreau and Cascio, 2017, p. 121). Boudreau and Cascio talk about the mental models of leaders outside the HR profession (see also Boudreau, 2012) and the need to retool HR’s way of telling stories. van der Togt and Rasmussen (2017) explain this idea further: “For us [at Shell], the ultimate two questions we ask ourselves whenever we initiate an analytics project are: (1) Would our senior line management see the value of the insight and proposed intervention in light of our business strategy, and (2) What would it take to suspend long-held beliefs in light of new data?” (p. 153). The first question is about the business value of analytics and about translating science in such a way that it is simple enough to be understood, communicated, and acted upon by practitioners. It is also about “sensing the information that a leader needs at the right time” (Boudreau and Cascio, 2017, p. 123). This is probably why Shell established a portfolio approach for selecting projects (van der Togt and Rasmussen, 2017): “We regularly do a review of potential projects in terms of potential business value (operational, financial), and in terms of the likelihood of deriving actionable insights from the predictive analytics we do” (p. 151, emphasis added). Levenson and Fink (2017) also advise analytics teams to effectively focus on a few core areas “rather than attempting to tackle every interesting question in an organization” (p. 161). In designing a portfolio approach, Green (2017) recommends introducing a prioritization mechanism, and uses Chevron and IBM as examples. When deciding which projects to prioritize, Chevron uses the relationships between business impact and cost, and IBM considers the level of impact of the proposed initiative and ease of implementation.

The second question relates to organizational transformation and change management. Any high-impact analytics project is a change management project. For Shell’s analytics team, “insights from HR analytics challenge beliefs that we all have (individually or collectively) about how the world ought to work […]. Therefore, one has to invest in the analytical skills as well as the consultative skills to make interventions stick through proper change management” (van der Togt and Rasmussen, 2017, p. 153). Accordingly, it is crucial for HCA teams to have an organization-wide network of capable individuals that they can rely on when rolling out changes and integrating the findings from their projects into organizational routines. In describing one of the first successful analytics projects, Garvin (2013) uses an analogy from change management – unfreeze, change, and refreeze. The analysts on Google’s Project Oxygen team were able to identify and connect with leading thinkers in their functions (engineering or sales), so-called “tech advisers,” and to persuade them that the team’s findings were viable and credible (Garvin, 2013). This ensured the socialization of the findings, management’s buy-in and, ultimately, organizational action.

Pull

In addition to the “push” factors, Boudreau and Cascio identify a number of “pull” factors that “may be holding back HC analytics from the perspective of the audience” (p. 123). These factors relate to the ability, motivation, and opportunity (AMO) of analytics users. Note that users may not just be the HR audience, but the entire organization.

In this regard, I argue for the need to develop HCA as an organizational capability. Based on insights from the organizational capabilities perspective (Teece et al., 2000; Winter, 2000) and the micro-foundational view of strategy (Felin et al., 2015; Foss and Pedersen, 2014), I argue for conceptualization of HCA as an organizational capability that is rooted in three micro-level categories (individuals, processes, and structure) and comprises three dimensions (data quality, analytical competencies, and strategic ability to act) (Minbaeva, 2017). As an organizational capability, HCA is collective in its nature, but it is dependent on the AMO of analytics users, and rooted in their actions and interactions.

Several contributors to this special issue make useful suggestions for developing the AMO of analytics users, many of which have been proven in practice. Morten Kamp Andersen (2017) identifies six competencies that every analytics team should have (see also figure 1 in Green, 2017). In their search for these competencies, businesses reach out to academia. van der Togt and Rasmussen (2017) point out that the number of analysts with PhD degrees keeps growing, as does the cooperation between academia and companies.

In addition, Green advises: “One way to accelerate progress is to leverage (beg, borrow or steal if necessary!) resources from outside HR” (p. 173). Analytics is about relationships. In other words, it is not only about having individuals on HCA teams with the right AMO, but also about having an organization-wide network of capable individuals. HCA teams can then insource those individuals if a need for additional, high-order competencies arises.

Beyond “push” and “pull”

However, there is more to it. In the opening of his paper, Morten Kamp Andersen asks a very relevant question: how can we examine why we are not there without discussing what we mean by “there”? As Andersen suggests, “without knowing where we are going and what to aim for, it is difficult to assess if we are there yet” (2017, p. 155) and, if I may add, to determine how we get there. Therefore, before starting to invest significant organizational resources in building HCA, companies should ask a simple question: are we doing this because it is fashionable (e.g. analytics as a management fad; Rasmussen and Ulrich, 2015) or because it is rational (e.g. analytics as a way to do “more for less,” Huselid and Becker, 2005)? Peter V.W. Hartmann, Business Intelligence Expert at Mærsk Drilling, suggests: “Start by asking yourself why you want to use analytics – is it because it is the latest trend or because your business needs it? How is this connected to your business strategy? How does it ensure implementation of your business strategy? Answers to these questions should help you answer the next important question: what do you want to learn from using analytics?”[1].

Throughout this process, the company must define for itself what “being there” means. As I mentioned earlier, a move from operational reporting to analytics is highly desirable. However, for some companies, basic analytics of HR and business data (layer 4 in figure 1) may be sufficient. In fact, basic models with simple regressions can easily be understood by the organization and their results are actionable (see the example from Vestas in “How I Did It: Investigating the Business Case for Staff Diversity,” published by HCA Group at CBS (see footnote 1)). For others, a more scientific approach is necessary for identifying the right measures, and some sophisticated knowledge of statistics is needed to develop the right methods (see the example from Shell in “How I Did It: Unleashing the Business Value of Diversity,” published by HCA Group at CBS (see footnote 2)). Note that both examples focus on the topic of diversity, but Vestas prioritized obtaining easily understood results in order to initiate organizational actions, while Shell wanted to understand the sources of different forms of diversity and their different impacts on performance.

In this regard, Green (2017) stresses the importance of a clearly defined strategy and vision for analytics, and the relevance of analytics for business:

  • “To support Chevron’s business strategies with better, faster workforce decisions informed by data” (Chevron).

  • “Better, faster business decisions enabled by credible data and insight for our business leaders” (Intuit).

  • “All people decisions at Google are based on data and analytics” (Google) and simply

  • “#datadrivenHR” (Microsoft) (p. 173).

When talking about strategy and vision, one cannot ignore the role of senior leadership. Green (2017) warns: “Without CHRO and senior executive involvement your people analytics adventure is likely to be doomed from the start” (p. 172). Boudreau and Cascio (2017) also point out that “a fundamental requirement is that HCA address key strategic issues that affect the ability of senior leaders to achieve their operational and strategic objectives” (p. 122). Given the insights from the various collaborative projects that HCA Group has undertaken with numerous companies in northern Europe, I would argue that one of the decisive factors for the success of Shell’s analytics journey is the close cooperation between Jorrit van der Togt, the Executive Vice President of HR Strategy and Learning, and Thomas Rasmussen, the Vice President of HR Data and Analytics, as well as the strong support from the senior business leaders.

Final thoughts

Many of our experts stress the importance of collaboration between academia and practice. Indeed, “it takes two to tango.” In this light, I would like to discuss “traditional” research and managerial implications but from a “non-traditional” point of view. In other words, I would like to examine what research could do better in order to support practitioners and what practitioners could do better in order to encourage research.

Implications for research: what can research do better to support practitioners?

The obvious priority for researchers focused on the HCA phenomenon is to identify its theoretical roots. van den Heuvel and Bondarouk (2017) provide a detailed overview of the field’s development in which they emphasize its multidimensionality. Boudreau and Cascio (2017) relate HCA to the field of strategic human resource management and the discussion of high-performance work systems. In my previous work, I conceptualized HCA as an organizational capability and unpacked the concept using the micro-foundational view of strategy (Minbaeva, 2017). What other established academic fields and disciplines should support the development of HCA as a concept and inform future research on this phenomenon? The answer to this question will ensure that HCA avoids becoming just a management fad (Rasmussen and Ulrich, 2015).

There are at least three questions that, when answered, would support practitioners in their analytics journey.

Does development of HCA as organizational capability result in enhanced performance?

There is a significant need for further theoretical and empirical work that systematically links HCA with organizational performance in a strategic context. Researchers proposing that HCA can lead to superior organizational performance when developed as organizational capability should comprehensively identify and meticulously theorize about the relevant causal mechanisms and variables involved (Minbaeva, 2017). We will need large-N empirical studies with longitudinal data to prove that the move from operational reporting to analytics results in enhanced performance.

What are the main conceptual models that could be used in HCA?

We used to claim that business leaders do not have access to existing advanced management research and, hence, do not exploit scientific knowledge or the analytics techniques developed by researchers. However, this is changing. When companies want to look at what drives performance, they are starting to seek inspiration in academic research. When describing Shell’s analytics journey, van der Togt and Rasmussen (2017) write:

In Shell, we started looking at what drives individual and company performance, following a rich set of academic literature that looks at HR practices affecting company performance. In line with the findings of Jiang et al. (2012), we used our vast people database while respecting data-privacy requirements, and discovered that the single most important driver of individual performance is employee engagement

(p. 150).

Over the years, those of us in academia have accumulated numerous models that have been tested and retested numerous times and in various settings (i.e. “logics” in LAMP). This knowledge needs to be identified, translated, and communicated to practitioners. A single paragraph at the end of a paper covering managerial implications is not enough. We need to translate our findings for practitioners on our websites[2], on our blogs, and in the media. We must proactively address the “So what?” questions in our publications to not only make our research relevant for practice but also to assist practice in advancing from descriptive metrics to predictive analytics and from correlations to causality.

How can we make analytics actionable?

In management research, we are continuously encouraged to think about relevance: “we read each other’s papers in our journals and write our own papers so that we may, in turn, have an audience […]: an incestuous, closed loop” (Hambrick, 1994, p. 13). Unfortunately, practitioners are seldom invited into this loop. As a result, both sides hit a wall when trying to answer the question of how to make analytics actionable (Cascio and Boudreau, 2011; Rasmussen and Ulrich, 2015). I argue that to move on, practitioners and researchers need each other.

In a conversation with one of HCA Group’s corporate partners, an analytics practitioner noted a need for more than “pretty science” – a need for results that are easily comprehended and useful. A typical researcher might wonder about the benefits of responding to this need. In addition to data access, I would say that researchers would benefit from detailed insights and thorough contextualizations of their findings. These insights are impossible to obtain if the researcher’s aim is to gather as much data as possible, and then run to the office and shut the door after posting a note along the lines of “Do Not Disturb – Running STATA.” Practitioners always start with a business challenge. Therefore, they have insights into what should work but does not, which they might share with good “dance partners.” They do not have answers but they have a lot of questions. As we know from Whetten’s (1989) widely cited paper “What Constitutes a Theoretical Contribution?,” if we start with an interesting question, and answer what, how, and why in our theory development, we will not have (as many) problems justifying our value-added in our responses to editors and reviewers.

Therefore, in order to make analytics actionable, we need to get both researchers and practitioners on the dance floor, where they can tango together.

Implications for practice: what can practice do better to encourage research?

Levenson and Fink (2017) warn of the danger of falling into “the trap of thinking that any progress on quantifying people issues is a step in the right direction, even though there may be little to no actionable insights” (p. 161). They encourage practitioners to learn from the mistakes of academia: “the entire history of social science is marked by glaring examples of how measurement in a vacuum does not necessarily provide actionable insight the organization can use to improve processes and performance” (Levenson and Fink, 2017, p. 161).

However, in practice, the emphasis is still on workforce reporting rather than true analytics. We have seen showcases of operational reporting, examples of creative dashboards, and great examples of real-time descriptives. This is probably acceptable, as many HCA teams aim to create a habit of using data, improve organizational literacy, learn to read and understand numbers, and move organizations from zero knowledge to being informed. However, in many instances, the HCA team claims to be carrying out analytics when in reality they do not go beyond a very basic descriptive analysis. Why is this the case?

One explanation may be that managers working in the field of analytics still do not have a clear definition of the concept. Analytics is about building causal models in which variance in variable X (independent variable, explanatory variable, explanans) causes variation in variable Y (dependent variable, explanandum). It is about impact rather than percentages and about explanations rather than associations. It is important to move beyond correlation and toward causality, as our goal is to explain and predict. Thankfully, every contributor to this special issue emphasizes these aspects, and it is my hope that the definition of what analytics is and is not is now clear.
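The gap between association and explanation can be illustrated with a small synthetic sketch (all variables and effect sizes are invented): a common cause such as tenure can produce a clear correlation between a practice and performance even when the practice has no causal effect at all, and the apparent effect vanishes once the common cause is held fixed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: tenure drives both training hours and performance,
# so training and performance correlate despite training having no effect.
n = 1000
tenure = rng.normal(0.0, 1.0, n)
training = 0.8 * tenure + rng.normal(0.0, 1.0, n)     # X: no true effect on Y
performance = 1.0 * tenure + rng.normal(0.0, 1.0, n)  # Y

# Raw correlation suggests training "works".
r = np.corrcoef(training, performance)[0, 1]

# Partial out the common cause: the apparent effect of training disappears.
X = np.column_stack([np.ones(n), training, tenure])
coef, *_ = np.linalg.lstsq(X, performance, rcond=None)

print(f"correlation(training, performance): {r:.2f}")          # clearly positive
print(f"training effect holding tenure fixed: {coef[1]:.2f}")  # near zero
```

Descriptive reporting would happily present the first number; analytics, in the sense used here, is obliged to produce the second.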

However, there are certain questions that practitioners can help answer to encourage and facilitate research.

Where does analytics belong?

Researchers are still debating whether analytics should belong to HR, line managers, or business intelligence. Andersen (2017) weighs the pros and cons of moving analytics outside of HR. van den Heuvel and Bondarouk (2017) argue that moving analytics to the HR department or to a general business-intelligence department is desirable, as such a placement may considerably influence the analytical models that are developed. We hope to see more cases examining different placements of analytics in organizations and more real-time experiments along these lines to inform the research.

Could we see more innovation and experimentation in analytics?

Research on strategic HRM in general and HR performance in particular is dominated by replication studies. Practitioners seem to lack innovativeness in the kinds of data used for analyses, and in their models, visualizations, and implementation. Practitioners’ openness to new methods of data collection and new ways of cooperating with academia will allow researchers to work across the boundaries of academic disciplines and undertake innovative research that scores high on rigor as well as relevance.

Is analytics a disruption for HR as we know it?

With the introduction of strategic workforce planning and actionable analytics, do line managers need HR business partners to discuss the changes in their workforces driven by market growth and talent supply? Would line managers prefer to obtain their figures by playing with scenario planning in the strategic workforce planning application? Given the expansion of digitalization and the rise of e-HR, what should be outsourced to robots or automated, and what should be kept for HR? How will the rise of analytics shape the employable HR profile over the next three to five years? In the search for answers, researchers will come to practitioners.

One thing is clear: the tremendous advancements in information technology and growing stakeholder expectations of economic gains pose significant challenges to HR as we know it, but they also offer tremendous opportunities for HR. Analytics creates a need for a significant makeover of HR. However, instead of seeing disruption as a negative, undesirable force, HR should grasp the opportunity to reinvent itself in a way that ensures organizational value creation. This will require “hard work, stamina, and the right cross-fertilization between academic rigor and business relevance” (van der Togt and Rasmussen, 2017, p. 153).

Notes

2. For example, HCA Group at CBS has established a website that offers executive summaries of the most recent and relevant articles (see www.cbs.dk/hc-analytics under “Research Insights”).

References

Andersen, M.K. (2017), “Human capital analytics: the winding road”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 155-158.

Awad, E. and Ghaziri, H. (2004), Knowledge Management, Prentice Hall, NJ.

Bersin, J., Houston, J. and Kester, B. (2014), Talent Analytics in Practice: Go from Talking to Delivering on Big Data, Deloitte University Press, available at: https://dupress.deloitte.com/content/dam/dup-us-en/articles/hc-trends-2014-talent-analytics/GlobalHumanCapitalTrends_2014.pdf

Boudreau, J. and Cascio, W. (2017), “Human capital analytics: why are not we there?”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 119-126.

Boudreau, J.W. (2012), “Decision logic in evidence-based management: can logical models from other disciplines improve evidence-based human resource decisions?”, in Rousseau, D. (Ed.), The Oxford Handbook of Evidence-based Management, Oxford University Press, New York, NY, pp. 223-248.

Boudreau, J.W. and Ramstad, P.M. (2007), Beyond HR: The New Science of Human Capital, Harvard Business School Press, Boston, MA.

Cascio, W.F. and Boudreau, J.W. (2011), Investing in People: Financial Impact of Human Resource Initiatives, 2nd ed., Pearson Education, Upper Saddle River, NJ.

Deloitte (2015), Global Human Capital Trends, Deloitte University Press, available at: www2.deloitte.com/content/dam/Deloitte/at/Documents/human-capital/hc-trends-2015.pdf

Felin, T., Foss, N. and Ployhart, R. (2015), “The microfoundations movement in strategy and organization theory”, Academy of Management Annals, Vol. 9 No. 1, pp. 575-632.

Foss, N. and Pedersen, T. (2014), “Micro-foundations in strategy research”, Strategic Management Journal (Virtual issue), available at: http://onlinelibrary.wiley.com/journal/10.1002/%28ISSN%291097-0266/homepage/microfoundations_vsi_intro.htm

Garvin, D. (2013), “How google sold its engineers on management”, Harvard Business Review, December, pp. 74-82.

Grant, R. (1996), “Prospering in dynamically-competitive environments: organizational capability as knowledge integration”, Organization Science, Vol. 7 No. 4, pp. 375-388.

Green, D. (2017), “The best practices to excel at people analytics”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 171-178.

Hambrick, D.C. (1994), “Top management groups: a conceptual integration and reconsideration of the ‘team’ label”, in Staw, B.M. and Cummings, L.L. (Eds), Research in Organizational Behavior, JAI Press, Greenwich, CT, pp. 171-214.

Huselid, M. and Becker, B. (2005), “Improving HR’s analytical literacy: lessons from moneyball”, in Ulrich, D., Losey, M. and Meisinger, S. (Eds), The Future of HR: 50 Thought Leaders Call for Change, John Wiley and Sons, New York, NY, pp. 278-284.

Jiang, K., Lepak, D., Hu, J. and Baer, J. (2012), “How does human resource management influence organizational outcomes?”, Academy of Management Journal, Vol. 55 No. 6, pp. 1264-1294.

Levenson, A. and Fink, A. (2017), “Human capital analytics: too much data and analysis, not enough models and business insights”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 159-170.

Minbaeva, D. (2017), “Building a credible human capital analytics for organizational competitive advantage”, Human Resource Management, forthcoming.

Peteraf, M. and Barney, J. (2003), “Unraveling the resource-based tangle”, Managerial and Decision Economics, Vol. 24 No. 4, pp. 309-323.

Rasmussen, T. and Ulrich, D. (2015), “How HR analytics avoids being a management fad”, Organizational Dynamics, Vol. 44 No. 3, pp. 236-242.

Teece, D., Pisano, G. and Shuen, A. (2000), “Dynamic capabilities and strategic management”, in Dosi, G., Nelson, R. and Winter, S. (Eds), The Nature and Dynamics of Organizational Capabilities, Oxford University Press, New York, NY, pp. 334-362.

van den Heuvel, S. and Bondarouk, T. (2017), “The rise (and fall?) of HR analytics: a study into the future application, value, structure, and system support”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 127-148.

van der Togt, J. and Rasmussen, T.H. (2017), “Toward evidence-based HR”, Journal of Organizational Effectiveness: People and Performance, Vol. 4 No. 2, pp. 149-154.

Whetten, D. (1989), “What constitutes a theoretical contribution?”, Academy of Management Review, Vol. 14 No. 4, pp. 490-495.

Winter, S. (2000), “The satisficing principle in capability learning”, Strategic Management Journal, Vol. 21 Nos 10/11, pp. 981-996.

Acknowledgements

The author hopes that the collection of papers presented in this special issue will inspire researchers to conduct innovative and theoretically grounded investigations in the field of HCA, and practitioners to take their organizations on the analytics journey toward value creation.
