JOHN WUBBEL is a data science software engineer with a decade of experience using JMP in the manufacture of products in the life science pharmaceutical industry. His most recent work includes using structured data in graph database systems for advanced analytics, as well as modeling and structuring data for artificial intelligence and machine learning.
Generalized Context for Decision Process Improvement
- 1.1 Situational Assessment (current state)
- 1.2 Problem Statement
- 1.3 Visualizing State Transition
- 1.4 Metrics On-Demand
DECISION PROCESS IMPROVEMENT FOR CORPORATE PERFORMANCE MANAGEMENT
The business is making clear that, to stay competitive in the market, we need to make decisions quickly, often with disparate data sets. JMP CONNECTIONS should be viewed as a business-oriented data discovery tool, not an information technology (IT) or enterprise SAP®1-centric model, because, as is so often the case, data sets are not under the control of the IT department. Data may reside in silos, dozens of spreadsheets, or proprietary database applications. Thus, we can best describe this exercise as "decision process improvement." If we can improve the way metrics are produced, we can directly improve the timely implementation of actual decisions for corporate performance management.
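To make the silo problem concrete, the sketch below (plain Python, with entirely hypothetical product data) joins a spreadsheet-style CSV export with a small departmental database into a single table ready for analysis. This is the kind of consolidation a data discovery exercise must perform before any metric can be computed:

```python
import csv
import io
import sqlite3

# Hypothetical silos: unit costs live in a CSV export of a spreadsheet,
# while monthly volumes sit in a small departmental database. Neither is
# under the control of the IT department.
csv_export = io.StringIO("product,unit_cost\nA,2.50\nB,4.00\n")
costs = {row["product"]: float(row["unit_cost"])
         for row in csv.DictReader(csv_export)}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE volumes (product TEXT, units INTEGER)")
db.executemany("INSERT INTO volumes VALUES (?, ?)", [("A", 100), ("B", 50)])

# Join the two silos into one table suitable for downstream analysis.
combined = [
    {"product": p, "units": u, "spend": u * costs[p]}
    for p, u in db.execute("SELECT product, units FROM volumes")
]
print(combined)
```

In practice the same consolidation would be done with JMP's data import and join facilities; the point is only that the joined table, not any one silo, is the raw material for decision-ready metrics.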
The Holy Grail of the Information Age, particularly in the IT shop, is the notion of data integration and interoperability. The Institute of Electrical and Electronics Engineers defines interoperability as:
The ability of two or more systems or components to exchange information and to use the information that has been exchanged.
Unfortunately, interoperability has never been entirely achieved across a large enterprise.
However, in support of staying competitive, the popular business press and IT periodicals have been pushing "business intelligence" (BI). Business intelligence is a broad category of applications and technologies for gathering, storing, analyzing, and providing access to data to help enterprise users make better business decisions.
As postulated in the Preface, a tough economy implies a propensity to cut expenditures across a wide cross section of the enterprise, which may include BI software acquisitions. The following pages show precisely how state-of-the-art metrics can be developed with JMP Pro® without the need for a major capital expenditure (CAPEX) project.
1.1 SITUATIONAL ASSESSMENT (CURRENT STATE)
ADVANCEMENT IN METRICS FOR BUSINESS AUGMENTATION
Before describing the common state of affairs typical of businesses from small to large, a framework for visualizing capability maturity with regard to the development and use of metrics is outlined in Figure 1.1.
Figure 1.1 JMP CONNECTIONS Capability Maturity Model Levels 0 and 1
- The lowest level of capability maturity (Level 0) would be a business or organization that may not have an IT department. Most management and reporting of business data is done using spreadsheets, and perhaps the presentation facilities of office software suites. Reporting may be ad hoc or sporadic because, for example, the data is not readily in a form suitable for statistical analysis when required. Companies often have so much data that they realize knowledge is locked up within it; however, they have no practical, inexpensive way to develop and utilize it.
- The first level of maturity (Level 1) is where companies produce dashboards, scorecards, and KPIs on a regular basis. Perhaps annually, metrics are reviewed for relevance as needs change over time. Metrics that are retained may be refined, and presentation and delivery mechanisms are level set2 according to who will receive them and at what levels of the enterprise they will be used. Publishing BI tools such as dashboards (DBs) and scorecards (SCs) has a measurable cycle time.
- The second level of maturity (Level 2) is the realization that some subset of deliverable metrics could be converted to metrics "on-demand." For those on-demand metrics, the cycle time to generate or refresh a set of deliverable dashboards is eliminated entirely. (See Figure 1.2.)
Figure 1.2 JMP CONNECTIONS Capability Maturity Model Levels 2 and 3
- The third and highest level of maturity (Level 3) is a two-part configuration. (See Figure 1.2.)
Level 3, Part 1
- Eliminate cycle time by creating on-demand metrics, resulting in a reduction in full-time equivalents (FTEs).
Level 3, Part 2
- Human capital resource reallocation for:
- Performing advanced statistical analysis
- Predictive analytics and modeling
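The idea of a metric "on-demand" can be sketched in a few lines. In this hypothetical Python sketch (the data source is an in-memory list standing in for live records), the metric is recomputed from the source at the moment it is requested rather than by a scheduled batch job refreshing a dashboard snapshot, so there is no refresh cycle and no stale copy to maintain:

```python
# Hypothetical live source: (period, units shipped) records.
live_records = [("2024-01", 120), ("2024-02", 135), ("2024-03", 150)]

def metric_on_demand(records):
    """Compute the current metric directly from the source at request time."""
    total = sum(units for _, units in records)
    return {"total_units": total, "periods": len(records)}

# Each call reflects the source as it stands right now -- there is no
# scheduled refresh job, hence no refresh cycle time to eliminate.
print(metric_on_demand(live_records))
```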
Level 3, Part 1, focuses on reducing the time it takes (the cycle time) to produce metrics on a scheduled basis, which in turn reduces the number of FTEs required to produce them. One FTE who must update a dashboard every week is left with little time for other metrics production tasks; an FTE's hours are finite. As hours are freed up, other knowledge within the data sets can be developed and utilized. Achieving Part 1 leads into Level 3, Part 2, because predictive analytics and the full power of JMP Pro can then be leveraged, perhaps without adding more FTEs. The graphic in Figure 1.3 summarizes the capability maturity reference model for business intelligence metrics.
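The arithmetic behind Part 1 can be made concrete. Assuming, purely for illustration, a 40-hour work week and a weekly dashboard refresh that consumes 8 of those hours, eliminating the refresh cycle frees a fifth of that FTE's capacity for the advanced analytics of Part 2:

```python
# Hypothetical figures: a 40-hour work week, and a weekly dashboard
# refresh that consumes 8 hours of one FTE's time.
fte_hours_per_week = 40
refresh_hours_per_week = 8

# Converting the dashboard to an on-demand metric eliminates the refresh
# cycle, freeing that fraction of the FTE for advanced analytics.
freed_fraction = refresh_hours_per_week / fte_hours_per_week
freed_hours_per_year = refresh_hours_per_week * 52

print(f"{freed_fraction:.0%} of one FTE freed")  # 20% of one FTE freed
print(f"{freed_hours_per_year} hours per year")  # 416 hours per year
```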
Figure 1.3 JMP CONNECTIONS Capability Maturity Reference Model
The development of JMP CONNECTIONS is applicable to every type of business. All examples cited in this book are fictional, are for illustrative purposes, and can be adapted to any business. The examples are generic in the sense that the common fuel crucial to business execution is the enterprise data, mature knowledge assets, and performance indicators across the spectrum of organizations that desire optimal results. In many circumstances, particularly in larger firms, one expects to find whatever data is needed in the large enterprise database applications. In fact, the information is out there, but access to it is less than ideal, and it may not be in a format that supports any statistical analysis. It lacks the agility needed to manipulate it into BI tools or data; it is "what you see is what you get" because of the hard-coded requirements built into the application. Consequently, a returned query is often a table of data or records that does not necessarily communicate or impart knowledge to the recipient. Something extra needs to be done.
Additionally, one would think that, especially within technology firms or scientific and engineering firms, data management would be state of the art. For many, and perhaps for a majority, business is conducted using spreadsheets, small desktop database applications, web applications, text files, and sticky notes. In fact, spreadsheets proliferate from one year to the next with no version control, and many sheets act as placeholders for data rather than performing any computation or analysis.
Given the standard corporate desktop environment, when a set of metrics is required, it is likely prepared using a combination of office suite applications. These may include the word processor, spreadsheet, and presentation software. A chart or graph with some annotation explaining the meaning of the numbers is the bare minimum, Level 0, of maturity for producing metrics. Thus, it is useful to point out here exactly what types of BI solutions exist.
- Executive scorecards and dashboards
- Online Analytical Processing (OLAP) analysis
- Ad hoc reporting
- Operational reporting
- Data mining
- Customer intelligence
Each BI solution has a data analysis ingredient or function that derives the reported metric for that solution. While features and functions may be alike, what sets these solutions apart is how they are applied to support decision making.
To be more precise in thinking about analytic metrics, there are three areas of data analysis derived from data science, information technology, and business applications that can be categorized as follows:
- PREDICTIVE (Forecasting)
- DESCRIPTIVE (Business Intelligence and Data Mining)
- PRESCRIPTIVE (Modeling, Optimization, and Simulation)
Without efficient sharing of operational business intelligence, a company will suffer breakdowns small and large, be unable to grow properly, and may even be flirting with massive disaster. A small issue, for example, can escalate into something very large very quickly if business intelligence is not shared well. Missing or incorrect operational intelligence means that a company will create and execute strategies and plans (i.e., make decisions) that could inadvertently harm the company.
Beyond Level 3 capability maturity, one may begin to get a sense of the concept of a BI Competency Center. A competency center is inclusive of all three areas of data analysis (PREDICTIVE, DESCRIPTIVE, and PRESCRIPTIVE) with respect to applying BI solutions to support various decision-making units within the enterprise. The competency center concept also can act as a facilitator for efficiently...