DCAM v3 Framework – 8.0 Analytics Management

DCAM Framework Component 8

Upper Matter

Introduction

The Analytics Management component, the eighth element of the DCAM framework, plays a dual role within data management processes by acting as both a data creator and a data consumer. As a data creator, Analytics Management must adhere to the principles set out by the first seven components of DCAM, which define the core capabilities necessary for effective Data Management. This ensures consistency and accuracy in the way data is consumed and insights are generated, supporting the organization's decision-making processes. To assist analytics professionals, there must be clear processes that govern how the Analytics Management function is structured and managed. These processes must outline the necessary governance frameworks for designing, executing, validating, and delivering models, including those related to Large Language Models, to meet organizational needs.

The primary objective of Analytics Management is to formalize how analytics activities are structured, governed, and executed within an organization. This ensures that these activities align with the broader Data Management practices. The organizational structure, whether the Analytics teams are centralized or distributed, will depend on the organization's culture. However, a coherent framework for analytics can enhance synergies, maximize efficiencies, and improve overall effectiveness when aligned with a comprehensive Analytics Strategy.

Definition

The Analytics Management component is an integral part of Data Management. It is a set of capabilities required to structure and manage the analytics activities of an organization. The capabilities align Analytics Management with Data Management in support of business and functional priorities. They address the culture, skills, platform, and governance required to enable the organization to obtain business value from analytics.

The Analytics component outlines the necessary capabilities for success but does not address the different organizational structures that may exist, nor does it delve into the factors to consider when choosing the most suitable structure. Each organization will need to design an appropriate structure to support its analytical needs and capabilities based on its unique resources and capacity.

Scope

  • Develop an Analytics Strategy that aligns with the overarching Business Strategy.
  • Ensure the Analytics Strategy is aligned with the Data Management Strategy.
  • Establish the Analytics Management function.
  • Ensure clear accountability for the analytics created and for their uses throughout the organization.
  • Work with Data Management to align analytics with all DCAM components, especially Data Architecture and Data Quality Management.
  • Establish an analytics platform that provides flexibility and controls to meet the needs of the different stakeholder roles in the Analytics Operating Model.
  • Design and deploy effective governance over the data analysis life cycle, including tollgates for model reviews, testing, approvals, documentation, release plans, monitoring, and the regular review, adjustment, and retirement of models.
  • Ensure that Analytics follows established guidelines for privacy, data ethics, regulatory compliance, model bias, and model explainability requirements and constraints.
  • Manage the cultural change and education activities required to support the Analytics Strategy.

Value Proposition

Organizations that excel in managing their Analytics functions and resources are adept at harnessing the power of analytical methods, advanced algorithms, and high-quality data. By improving the speed and efficiency of converting data into actionable insights, they not only enhance their ability to make quick, informed decisions but also ensure adherence to ethical and legal standards. This process boosts overall performance and competitiveness.

Overview

At its core, analytics is about enhancing decision-making. It goes beyond just crunching numbers; it's about extracting practical insights that can guide strategic decisions. This approach prioritizes the importance of actionable knowledge derived from data, rather than just the analytical process itself.

Analytics-driven, real-time decision making has emerged as a crucial business differentiator, powered by swift advancements in technology, increased data accessibility and processing capabilities, and advanced analytical techniques that cater to stakeholder requirements.

Analytics Enablers & Stakeholders

Diagram 8.1: Analytics Enablers & Stakeholders

Advances in technology and data include:

  • NoSQL and graph database technologies enabling greater varieties and quantities of both structured and unstructured data to be accessed and processed efficiently.
  • The falling costs of processing, coupled with faster processing speeds and the scalability of cloud computing, are making sophisticated analytics techniques more affordable and accessible.
  • Availability of "all-in-one" solutions democratizes access to analytical techniques.
  • Lower data storage costs and increased storage limits increase the quantity of data available to analytics.
  • New sources of data, such as sensors, telematics, and satellite imagery, enable new data sets and new combinations of data sets to be analyzed.
  • Advanced data visualization simplifies how people explore and interpret exceptionally large volumes of analytics results.
  • Increased acceptance and use of Machine Learning in model development has been enabled by an audience of professionals who have embraced and accepted Artificial Intelligence as a productivity enhancer and by the progress made towards a legal and regulatory framework to support it.

To successfully implement and manage their analytical capabilities in alignment with business objectives, organizations must follow the discipline and fundamental principles set forth in the DCAM Analytics Management best practices, which encompass:

  • An Analytics Strategy supporting business needs. The growing volume and diversity of data available to organizations has expanded the potential applications of advanced analytics. As the demand for these analytics increases, it becomes crucial for organizations to prioritize projects based on their significance. To effectively leverage analytics in support of business objectives, organizations need a clear Analytics Strategy that delivers on business needs, supported by a Funding Model established to sustain the effort.
  • A clear Analytics Operating Model. In many organizations, analytics practitioners often work more closely with business units than with traditional technology teams. It's essential for the organization to have a clear operating model that ensures consistency and efficiency in their activities. Those designing this model must ensure it remains aligned with the stakeholders and maintains the necessary agility.
  • Analytics Management aligned to Data Management. In the absence of effective data management, a significant amount of an analytics practitioner’s time is spent manipulating, cleansing, and transforming data in preparation for the analytics. It is important for Analytics to be aligned with the Data Management initiative. Analytics Management should ensure that data is understood and that authoritative sources are used where these are available. Data Quality Management will provide measures of data quality that Analytics practitioners should reference and use to manage data quality issues as they prepare data for their models.
  • An Analytics Platform to meet comprehensive needs. Not all analytics activities will involve the creation of models that, once successfully tested, will run as production processes or services. Some analytics will be one-time exercises to investigate historical issues or to address current, specific questions. There may also be experiments with new data sources or new analytical techniques. The appropriate analytical technique or approach will need to be selected for each problem statement to ensure the organization maximizes the benefits and considers the related costs and risks. The platform that supports analytics must be designed to accommodate these different types of activity and the specific needs of different stakeholders.
  • Analytics Governance and discipline. Model governance and transparency are essential for the responsible development and deployment of advanced analytics, including Machine Learning and Artificial Intelligence. Compliance with privacy regulations is crucial when models are operationalized. There is a growing need for explainability in model decisions, which may also be a regulatory requirement, making it important to address these aspects early in the model development process. Additionally, identifying and controlling potential bias in training data is vital to prevent prejudicial outcomes against specific groups. Ongoing oversight is necessary for data encountered in production as well. Finally, adherence to the organization's Code of Data Ethics is critical to ensure that data usage within models is ethical and appropriate, guiding how organizations act on model-generated decisions and recommendations.
  • Analytics Education to support Analytics Culture. Effectively addressing the challenges of creating an analytics-driven organization requires significant changes in both culture and behavior. It is crucial for everyone involved to obtain the necessary understanding of the benefits that analytics can provide, and conversely, the possible negative repercussions that may result from its application.

Core Questions

  • Does the Analytics Strategy clearly articulate the role that Analytics will play in delivering the business strategy?
  • Is the Analytics Strategy aligned with the Data and Data Management Strategy?
  • Is there organizational alignment and support for the Analytics Strategy, operating model, and funding approach? Are analytics activities prioritized according to business priorities?
  • Is the value of analytics understood and measured?
  • Are Data Management Governance and Analytics Management Governance aligned?
  • Does the analytics platform support the needs of the Analytics practitioners?
  • Are processes in place to control the release of analytics models into production?
  • Is approval and release of models aligned with privacy and data ethics governance?
  • Are there initiatives in place to address the cultural change and analytics practitioner education to enable Analytics success?

Core Artifacts

  • Analytics Strategy
  • Analytics Classification System
  • Analytics Operating Model
  • Analytics Methodology
  • Model Documentation Standard
  • Data Obfuscation Strategies
  • Model Testing, Approval, Release, Regular Reviews, Monitoring, Adjusting, and Sunsetting processes or procedures
  • Cultural Behaviors Gap Analysis and Initiatives Backlog
  • Analytics Skills Gap Analysis

8.1 Analytics Function

For an Analytics function to be sustainable, it must be formally established and recognized. It must be approved by management and supported by an approved Funding Model and an effective governance structure. Roles and responsibilities must be established, and an analytics methodology must be adopted.

8.1.1 Analytics Strategy and Approach

Description

The Analytics Strategy must be defined based on the organization’s strategic objectives and goals while also considering existing analytics capabilities and advances. The role of the Analytics function must be communicated to stakeholders and the Analytics Strategy should be formally empowered by senior management.

Objectives
  • Formally establish the organization’s Analytics strategy.
  • Obtain executive management support for the Analytics strategy.
  • Communicate the role of Analytics across the organization through formal organizational channels.
Advice

The Analytics Strategy must be aligned to the business and operational objectives of the organization and aligned with the Data Management Strategy.

The strategy presents the vision for Analytics, addressing the organization, leadership, techniques, platform, processes, and culture. It describes how to address the gaps and realize the vision. The goal of developing the Analytics Strategy is to capture high-level objectives and translate them into achievable analytics solutions.

Achieving stakeholder and executive buy-in is critical to the success of the strategy. A well-documented Analytics Strategy is both a statement of approach and a marketing document to present to stakeholders and executive management. Effective communication of the strategy is key to empowering a coordinated approach across the organization and avoiding inconsistencies and inefficiencies of different business areas taking different approaches.

Questions
  • Is there a formal, documented strategy and approach for Analytics?
  • Have stakeholders been identified and involved in the creation and approval of the strategy?
  • Are the Analytics Strategy and the vision of the Data Management Strategy aligned?
  • Does the Analytics Strategy support the high-level objectives of the organization?
  • Has executive management support for the strategy been obtained?
  • Has the role of Analytics been formally communicated throughout the organization?
  • Are the different types of analytics that are relevant to the organization documented?
Artifacts
  • Vision statement of the target state for Analytics and how it supports the organization’s objectives
  • Prioritized approach to the initiatives required to realize the Analytics vision
  • Documented alignment of the Analytics Strategy and vision of the Data Management Strategy
  • List of stakeholders and evidence of bi-directional communication
  • Evidence of formal communication of the strategy and approach
  • Formal approval of the strategy by executive management
Scoring

Not Initiated

No formal Analytics Strategy exists.

Conceptual

No formal Analytics Strategy exists, but the need is recognized, and the development is being discussed.

Developmental

The formal Analytics Strategy is being developed.

Defined

The formal Analytics Strategy is defined and has been validated by the stakeholders.

Achieved

The formal Analytics Strategy is established and understood across the organization and is being followed by the stakeholders.

Enhanced

The formal Analytics Strategy is established as part of business-as-usual practice with a continuous improvement routine.

The strategy and approach are reviewed and updated at least annually.

8.1.2 Analytics Operating Model

Description

An operating model is defined to describe how Analytics will be structured within the organization. It establishes the scope and coverage of the different aspects and types of Analytics and designs how they should work together to serve the needs of the business and support the Data Management Strategy. The Analytics Operating Model addresses analytics organization structure, role-level responsibilities, governance structure, funding processes, and technology platforms and is integrated with the Data Management Operating Model.

Objectives
  • Define the Analytics team's structure within the organization, delineating the scope and coverage.
  • Define the roles and responsibilities within each type of Analytics team.
  • Define the interrelationships between the various Analytics teams, between Analytics teams and the Data Management Program and data teams, and between Analytics teams and the business functions.
  • Align the Funding Model with the overall funding approach of the organization.
Advice

The Analytics Operating Model encompasses disciplines from different areas such as IT, business functions, and Analytics. It provides clarity on how these areas work together to provision the required data through analytical techniques and models, and to deliver the models into production. The Analytics Operating Model designers must ensure it is informed by the business requirements and business architecture and is kept current with developing input from business leaders.

The Analytics Operating Model clarifies the ownership of line-of-business Analytics, cross-business functions, and any overarching Analytics program. It is aligned with other operating models within the organization. If there is a strong culture of federated operating models, this should be embraced. Conversely, if there is a strong centralized operating model culture, carefully consider the benefits before deviating from this approach.

Best practices suggest that the ideal make-up of an Analytics team includes people with mathematical, analytical, and technical skills, as well as strong business acumen and project management experience. In practice, there are often multiple localized analytics teams within an organization, each with varying skill levels and delivery approaches. Whether Analytics is operated in a centralized, decentralized or federated model, ensure processes and routines are in place to develop and maintain analytics best-practice standards consistently across the organization. The goal is to develop a discipline around Analytics that instills confidence and level-sets expectations in the user community. Analytics becomes proficient at providing solutions through organization-wide awareness, collaboration, sharing, and best practices.

The Funding Model must address all aspects of Analytics. Types of work beyond business-driven Analytics projects should be addressed. These include experimentation, foundational work, creation of re-usable data assets, ad-hoc high-priority analysis, and more. Funding of software licenses, platform, and infrastructure costs must be considered. When addressing people costs, consider the differentials in costs of Analytics specialists. Alignment with the overall funding approach of the organization may require the model to distinguish between Analytics initiatives aligned with operating units and aspects of the operating model that are not operating-unit specific.

Ensure that Analytics is funded as a sustainable function. Use the Funding Model to distinguish aspects of funding that are discretionary from those that are non-discretionary. Stakeholder involvement in the creation of the Funding Model is critical to obtaining a financial commitment that can be sustained.

Having buy-in from stakeholders is key if the Analytics organization is to be effective. Stakeholders should be involved throughout the creation of the operating model to avoid misaligned expectations.

Questions
  • Has the structure of the Analytics teams within the organization been defined?
  • Have the roles and responsibilities of each type of Analytics team been defined?
  • Does each team structure delineate its scope and coverage?
  • Are the interactions of Analytics teams, the Data Management initiative, and the business defined?
  • Is the Funding Model consistent with the established funding approaches and budget processes of the organization?
  • Does the Funding Model address how Analytics will be funded as an ongoing, sustainable function?
  • Are measurable benefits from Analytics initiatives used as support for the funding requirements?
  • Have stakeholders approved the operating model?
  • Has the functional structure described in the operating model been implemented?
Artifacts
  • Operating Model Terms of Reference
  • Organizational structure of Analytics teams
  • Role definitions and RACIs (both between Analytics teams and between Analytics and other functions)
  • Process models for Analytics, Data Management and business interactions
  • Documented Funding Model
  • A written process to review, modify, and validate resource plans
  • Evidence of alignment with budget processes and organizational cycles
  • Documentation of measured benefits
  • Funding plans and budget allocation
  • List of stakeholders and evidence of bi-directional communication
  • Documented formal approval of the operating model and provisions to review and refresh as needed
Scoring

Not Initiated

The operating model for Analytics does not exist.

Conceptual

The operating model for Analytics does not exist, but the need is recognized, and the development is being discussed.

Developmental

The operating model for Analytics is being developed.

Defined

The operating model for Analytics has been defined, reviewed, and approved.

Achieved

The operating model for Analytics has been implemented, and the organization structure has been established.

Enhanced

The operating model and organization structure for Analytics are established as part of business-as-usual practice with a continuous improvement routine.

8.1.3 Analytics Governance

Description

Explicit governance of analytics should be established for an organization. The governance structure oversees the implementation and sustainment of the operating model for Analytics, the implementation of the analytics platform, and the ongoing initiatives to shape the analytics culture of the organization. It ensures alignment of analytics with business strategy, data ethics, and the Data Management Program.

Objectives
  • Define the governance structure for analytics.
  • Create policies to enforce analytics governance implementation.
  • Establish governance forums with written and approved charters.
  • Implement operating governance structures.
  • Identify and engage with stakeholders.
  • Communicate stakeholder roles and responsibilities.
  • Hold stakeholders accountable for their participation in analytics via performance reviews and compensation considerations.
Advice

The governance structure should complement and align with the Analytics Operating Model. Care should be taken to ensure the governance structure does not constrain timely decision making. Members of the Analytics governance structure need to be clear about their roles and empowered to drive change. Ensure there is representation from all the disciplines required to resolve issues arising from identification of opportunities, data collection and provisioning, and model deployment.

Analytics will have dependencies on other areas of the organization for deliverables such as data collection and model deployment. It is particularly important to agree in advance on clear hand-offs and escalation paths. Consider having members of the governance organization report to senior management, such as the Chief Operating Officer, to gain this empowerment.

Tool selection should be aligned with the business strategy and not driven just by what tooling or expertise exists in the organization.

Ethics and privacy are central themes in the governance of analytics. Start by building on the ethics and privacy data governance processes and structures already in place. Governance processes need to demonstrate fairness, transparency, and security of data usage.

Questions
  • Do terms of reference for analytics governance exist?
  • Do policies exist that enforce analytics governance implementation?
  • Are there metrics for what is governed and measures of successful compliance?
  • Is there a tooling inventory with usage recommendations available?
  • Is there RACI (Responsible, Accountable, Consulted, Informed) documentation for all stakeholders and participants in the analytics governance process?
  • Is the governance process demonstrating that ethics and privacy are governed?
Artifacts
  • Analytics governance terms of reference
  • Evidence of policies written, implemented, and enforced to show that analytics is properly governed
  • Analytics governance metrics and measurements
  • Analytics project tracking
  • Model inventory, including model reviews and updates
  • Tooling inventory and recommendations
  • Analytics governance RACI
Scoring

Not Initiated

No governance structures for Analytics exist.

Conceptual

No governance structures for Analytics exist, but the need is recognized, and the development is being discussed.

Developmental

Analytics governance structures are being developed.

Defined

Analytics governance structures have been defined, reviewed, and approved.

Achieved

Analytics governance structures are established and operational.

Enhanced

Analytics governance structures are established as part of business-as-usual practice with a continuous improvement routine.

8.1.4 Analytics Development Life Cycle

Description

An analytics development life cycle or methodology provides a framework for the activities performed through the analytics life cycle. The life cycle begins with understanding the business problem and runs through deployment, operation, and review of the solution. The framework provides a common language and structure for stakeholders to refer to the different stages and aspects of analytics activities.
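
To make the life cycle concrete, the sketch below enumerates one possible set of stages with tollgate artifacts that tooling could enforce; the stage names, artifact names, and Python representation are illustrative assumptions, not a DCAM-mandated sequence.

```python
from enum import Enum

class AnalyticsStage(Enum):
    """Illustrative life-cycle stages; names are assumptions, not a DCAM standard."""
    BUSINESS_UNDERSTANDING = "understand the business problem"
    DATA_PREPARATION = "source and prepare data"
    MODEL_DEVELOPMENT = "develop and train the model"
    VALIDATION = "test and validate against acceptance criteria"
    DEPLOYMENT = "release into production"
    OPERATION = "operate and monitor"
    REVIEW = "review, adjust, or retire"

# Hypothetical tollgate artifacts required before a stage may be exited.
TOLLGATE_ARTIFACTS = {
    AnalyticsStage.VALIDATION: ["test results", "model documentation", "approval record"],
    AnalyticsStage.DEPLOYMENT: ["release plan", "monitoring plan"],
    AnalyticsStage.REVIEW: ["review minutes", "adjustment or sunset decision"],
}

def can_exit(stage: AnalyticsStage, artifacts: set[str]) -> bool:
    """A stage may be exited only when its required tollgate artifacts exist."""
    return set(TOLLGATE_ARTIFACTS.get(stage, [])).issubset(artifacts)
```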

The spectrum of analytics (e.g., Management Information, Business Intelligence, Artificial Intelligence, Descriptive, Diagnostic, Predictive, Prescriptive) employed by the organization is defined. Names and descriptions of the different types of analytics relevant to the organization are formalized in a categorization system. It provides a common language for the organization to refer to analytics and helps avoid confusion and misunderstanding.

A standard for model documentation ensures consistency across the organization in the way that model provenance, assumptions, inputs, outputs, parameters, and limitations are captured and communicated.
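
As a hedged illustration, such a standard might be expressed as a structured record so that completeness can be checked programmatically; the field names and example values below are assumptions, not prescribed elements of the standard.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ModelDocument:
    """Illustrative model documentation record (field names are assumptions)."""
    model_name: str
    version: str
    owner: str                      # accountable individual or team
    provenance: str                 # where the model came from: in-house, vendor, adapted
    business_purpose: str           # the decision or process the model supports
    assumptions: list[str] = field(default_factory=list)
    inputs: list[str] = field(default_factory=list)       # data elements consumed
    outputs: list[str] = field(default_factory=list)      # scores, classifications, forecasts
    parameters: dict[str, float] = field(default_factory=dict)
    limitations: list[str] = field(default_factory=list)  # known constraints on valid use
    last_reviewed: date | None = None

# Hypothetical example entry for a model register.
doc = ModelDocument(
    model_name="churn_propensity",
    version="1.2.0",
    owner="Retail Analytics",
    provenance="in-house",
    business_purpose="Prioritize retention outreach",
    assumptions=["Training data reflects the current customer mix"],
    inputs=["tenure_months", "product_count", "complaint_count"],
    outputs=["churn_probability"],
    parameters={"decision_threshold": 0.7},
    limitations=["Not validated for customers with tenure < 3 months"],
    last_reviewed=date(2024, 1, 15),
)
```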

Objectives
  • Select or develop an analytics methodology that defines the analytics life cycle for the organization.
  • Define and adopt a standard for model documentation.
  • Formally document the analytics methodology and model documentation standard.
  • Ensure the methodology, categorization system, and model documentation standards are understood and adopted by analytics practitioners.
  • Provide feedback mechanisms for ongoing refinement and improvement of the methodology and model documentation standard.
Advice

Whether the organization should buy or outsource analytics solutions as opposed to internally developing them will depend on the maturity and culture of the organization. Most organizations follow a hybrid approach where for some problems they develop the analytics solution internally (in-house) and for others they outsource or procure externally. For each challenge faced, the organization needs to carefully review the benefits and cost of developing the solution in-house vs. outsourcing and select the most beneficial to the organization.

The analytics methodology should focus on significant steps in the end-to-end process rather than specific analytical or modeling techniques. Analytics tools come and go, while analytics methodology should be sustainable and stable. However, to reach a long-term stable analytics methodology, the organization must be open to methodology improvements based on experience. The analytics methodology should accommodate innovation as well as business use-case-driven analytics.

There will be more flexibility to define and specify the analytics methodology for internally developed analytics than for externally developed analytics. However, in both cases, it may be worthwhile considering external best practice methodologies and adapting them to the organization.

In defining the spectrum of analytics, examples will bring clarity. The boundaries of the analytics spectrum determine the activities to which the analytics governance and best-practice frameworks apply. For example, an organization may define its lowest boundary of analytics such that static (i.e., not automatically refreshable) Management Information reports are out-of-scope, whereas self-service Business Intelligence reports may be considered in-scope.

The categorization system for levels of analytics will not be static. It will work best by allowing for emerging levels of sophistication as Analytics matures and evolves.

To ensure that the analytics methodology and categorization systems are appropriate to the organization and to get stakeholder buy-in, key influencers and stakeholders should be involved in the selection or development of the analytics methodology and categorization system. Form a community of “champions” to act as owners of the methods and drivers of improvement.

Questions
  • Has an analytics methodology that defines the analytics life cycle of the organization been selected or developed?
  • Has a model documentation standard been created?
  • What types of analytics has the organization developed historically?
  • What types of analytics does the organization foresee using in the future?
  • What types of analytics are being used by other organizations?
  • Which stakeholders should be approached for input into the categorization system for levels of analytics?
  • Has it been confirmed that the analytics methodology, categorization system, and model documentation standard have been understood and adopted by analytics practitioners?
  • Have the analytics methodology and model documentation standard been formally documented?
  • Have feedback mechanisms for ongoing refinement and improvement of the analytics methodology, categorization systems and model documentation standard been developed and made available to stakeholders?
  • Is there a register of analytics projects with up-to-date status available?
  • Is there a model register with review dates and update history?
Artifacts
  • Analytics methodology and terminology document
  • Model documentation standard
  • Records of stakeholder approval of the analytics methodology and model documentation standard
  • Analytics categorization approach
Scoring

Not Initiated

No formal analytics development life cycle exists.

Conceptual

No formal analytics development life cycle exists, but the need is recognized, and development is being discussed.

Developmental

Analytics development life cycle is being developed.

Defined

Analytics development life cycle has been defined, reviewed, and approved.

Achieved

Analytics development life cycle is established and in use.

Enhanced

The analytics development life cycle is established as part of business-as-usual practice with a continuous improvement routine and is reviewed regularly.

8.1.5 Analytics Processes

Description

Analytics processes are documented, communicated, and implemented across the organization. The success of an analytics program requires standard organization-wide processes that are repeatable, sustainable, and measurable. The organization should leverage existing industry standards and best practices. The use of the standard processes must be required by policy.

Objectives
  • Establish standardization across all analytics processes and practitioners.
  • Ensure analytics processes are aligned and leverage standard data management processes.
  • Ensure compliance with data management policies.
  • Ensure data created by Analytics integrates into the Data Management ecosystem.
  • Ensure Analytics is an active participant in, and aligned with, data management practices where appropriate (e.g., data quality, data governance, metadata management, data controls, issue management).
Advice

Analytics management must establish standardized processes to ensure that practitioners consistently and regularly document requirements, findings, adjustments, assumptions, and decisions while keeping up with ongoing changes. These standards should account for interactions with stakeholders, the data management team, and other relevant business units within the organization.

Analytics management and data management will depend on one another to effectively support the business in achieving its goals and objectives. This collaboration will require data and analytics practitioners to engage collectively in the established standard practices. The analytics and data management teams must work together to share and leverage processes, avoiding duplication, particularly in areas such as data quality and issue management.

Additionally, it is crucial for analytics practitioners to be aware of and understand the data management policies that pertain to the data they are using in their analyses. The Analytics team will not only utilize existing data resources but will also generate new data through their models, which must be managed according to organizational data management practices.

Questions
  • Are standard analytics processes defined?
  • Are analytics practices aligned and coordinated with data management practices?
  • Do analytics practices reference data management practices where appropriate?
  • Are the analytics processes supported by policy?
  • Are the appropriate stakeholders engaged in the analytics practices?
Artifacts
  • Documented processes for analytics practitioners
  • Evidence of analytics and data management collaboration
  • Evidence of stakeholder involvement in the standard analytics processes as appropriate
  • Policies in support of analytics standards
Scoring

Not Initiated

Analytics processes are not documented.

Conceptual

Analytics processes are not documented, but the need is recognized, and the development is being discussed.

Developmental

Analytics processes are being developed in alignment with all data management requirements.

Defined

Analytics processes have been developed, reviewed and approved and are in alignment with all data management requirements.

Achieved

Analytics processes have been adopted and are successfully meeting data management requirements.

Enhanced

Analytics processes have been adopted and are successfully meeting data management requirements as part of business-as-usual practice with a continuous improvement routine.

8.1.6 Analytics Monitoring

Description

The objective of Analytics is to synthesize data, create views, and provide insights that are used to support business needs (e.g., satisfying regulatory requirements, conducting critical activities such as the closing of books, delivering outcomes of business value). The organization should document the performance indicators to be used to routinely measure the impact and effectiveness of its analytics solutions and tie them back to business and data management strategies.

Objectives
  • Measure and communicate business value created by analytics.
  • Engage the stakeholders to create shared accountability and buy-in to quantified benefits driven by analytics.
  • Measure the business impact of analytics use cases, projects, and experiments.
Advice

Organizations should consider in advance how to define and measure the success of the outcome(s) of any analytical solution. A useful starting point is to establish reliable comparison benchmarks from which to measure incremental impacts. Control groups and randomized experimental methods are good ways to establish these benchmarks. These tools become especially important for diagnosing cause and effect in situations where actions based on analytics insight do not result in the anticipated benefits. For example, an analytics-based marketing campaign that does not deliver expected benefits may not simply reflect a failed analytics model; the shortfall may arise from other factors such as poor execution or low-quality marketing data.
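
As a minimal sketch of such a benchmark, the example below compares conversion rates between a treated group and a randomized holdout using a two-proportion z-test; the campaign figures are hypothetical, and this test is one common choice rather than a prescribed method.

```python
from math import sqrt

def incremental_lift(conv_treat: int, n_treat: int,
                     conv_ctrl: int, n_ctrl: int) -> tuple[float, float]:
    """Return (lift, z) for treatment vs. control conversion rates.

    lift: difference in conversion rates attributable to the analytics-driven
    action; z: two-proportion z-statistic (|z| > 1.96 ~ significant at 5%).
    """
    p_t = conv_treat / n_treat
    p_c = conv_ctrl / n_ctrl
    pooled = (conv_treat + conv_ctrl) / (n_treat + n_ctrl)
    se = sqrt(pooled * (1 - pooled) * (1 / n_treat + 1 / n_ctrl))
    return p_t - p_c, (p_t - p_c) / se

# Hypothetical campaign: 10,000 targeted customers vs. a 10,000-customer holdout.
lift, z = incremental_lift(conv_treat=540, n_treat=10_000,
                           conv_ctrl=470, n_ctrl=10_000)
print(f"incremental lift: {lift:.2%}, z = {z:.2f}")
```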

Some analytics deliver benefits that cannot be quantified. In these cases, the qualitative benefits should be documented and recognized. The value of analytics that inform decisions should also be recognized.

When the incremental value of analytics can be measured, the successful benchmarking approaches can be applied to a whole portfolio of analytical solutions. The Analytics teams quantify the business value of their actions to the organization at large, justifying the initial and additional investments in analytical capabilities. The systematic quantification and communication of business value supports the development of a data-driven culture of inquiry.

Questions
  • Do analytics specification documents define key performance indicators to evaluate the analytics?
  • Are key performance indicators, control groups, or other experimental methods routinely used to prove the incremental value of analytics against a benchmark?
  • Do the control groups or other experimental methods in use successfully isolate the impact of analytic insights versus the impacts of other related business actions?
  • Are the benefits of analytics communicated and understood by senior management and validated by an independent third party/function (e.g., Finance)?
Artifacts
  • Analytics specification documents
  • Evidence of the ROI of analytics solutions
  • Evidence of experimental design to evaluate the incremental value of analytics
  • Post-implementation benefits evaluations
  • Analytics key performance indicator documentation or metadata
  • Evidence of communication of analytics benefits to senior management
Scoring

Not Initiated

Analytics usage is not measured or understood to be driving business value.

Conceptual

Analytics usage is not measured or understood to be driving business value, but the need is recognized, and the development is being discussed.

Developmental

Measurement of analytics usage and business value is being developed.

Defined

Measurement of analytics usage and business value has been defined, reviewed, and approved.

Achieved

Measurement of analytics usage and business value is being performed.

Enhanced

Measurement of analytics usage and business value is established as part of business-as-usual practice with a continuous improvement routine.

8.2 Analytics & Business Alignment

The Analytics and Business functions must be aligned and must jointly support business goals. Analytics activities must be prioritized to meet the needs of business strategy and drive business value.

8.2.1 Analytics and Business Architecture Collaboration

Description

The business architecture of an organization defines its structure, governance, processes and information for decision making. The Analytics organization understands the business architecture of the organization and uses it as an input to define the Analytics strategy and requirements of the organization.

The Analytics organization engages frequently with the business stakeholders to understand their needs. To execute on this vision, the organization has a well-defined process to capture business requirements, estimate and understand the impact on analytics, prioritize them and develop and execute the roadmap to deliver them.

Objectives
  • Review the business/functional architecture (e.g., strategy, structure, governance, processes) regardless of the degree of formalization.
  • Understand and document key analytical requirements appropriate for the business and functions.
  • Align the Analytics capabilities and processes with business requirements.
  • Communicate and anchor analytical requirements to stakeholders.
Advice

Businesses do not all function identically, and decision-making priorities can differ across various segments of the same organization. To effectively address these differences, analytics leaders must collaborate with business leaders to grasp their needs and determine how analytics can best integrate with the overall business architecture, which encompasses a wide range of components beyond just IT systems.

  • Designers of the Analytics Operating Model should ensure that it reflects both business requirements and the broader business architecture, maintaining alignment with ongoing input from business leaders. Given the close relationship between analytics and senior decision-making, certain elements of the analytics organization may need to be decentralized and integrated within various business units or functional areas. Conversely, some analytics functions could be most effectively supported by a centralized center of excellence.
  • The most successful analytics organizations focus on projects that align with business objectives and the overarching business architecture. Engaging deeply with these objectives ensures a well-balanced analytics portfolio that considers strategic and tactical initiatives, compliance alongside performance, business-as-usual support versus innovation, and various metrics.
  • Thorough documentation of analytical requirements fosters a clear and consistent understanding among stakeholders. Continuous engagement with the shifting dynamics of business needs allows Analytics to remain focused on pressing priorities.
Questions
  • Is the Analytics Operating Model and plan documented?
  • Have key dependencies been considered in the design of the Analytics organization and its connectivity to decision making?
  • Does the Analytics plan include specific reference to business requirements?
  • Have the senior executive and line-of-business executive teams endorsed the Analytics plan?
  • Is there a defined process for ongoing updates of the Analytics plan?
  • Does the analytics organization understand the business requirements?
  • Can the analytics teams show connectivity between their business requirements and the business architecture?
  • Are gaps consistently captured and is there a process for acting on them?
  • Does the business understand the connection between their business architecture and the analytics design?
Artifacts
  • Evidence of alignment of Analytics with business strategy and plan
  • Documented alignment of Analytics, Business Architecture and business requirements
  • Details of business analytics requirements
  • Evidence of senior executive input and approval
Scoring

Not Initiated

Dependencies between Analytics and Business Architecture are not understood.

Conceptual

Dependencies between Analytics and Business Architecture are not understood, but the need is recognized, and the development is being discussed.

Developmental

Dependencies between Analytics and Business Architecture are being determined.

Defined

Dependencies between Analytics and Business Architecture have been defined, reviewed, and approved.

Achieved

Dependencies between Analytics and Business Architecture are understood and are being addressed.

Enhanced

Understanding and addressing the dependencies between Analytics and Business Architecture is established as part of business-as-usual practice with a continuous improvement routine.

8.2.2 Business Driven Analytics Prioritization

Description

The business strategy drives the Analytics Roadmap and activities. This keeps the focus on analytics investments and how they can maximize business outcomes. The Analytics organization should be aligned and in collaboration with business stakeholders.

Objectives
  • Understand leadership expectations and explore how analytics can support strategic value creation.
  • Work collaboratively with leadership to build a comprehensive list of current business opportunities and analytics use cases.
  • Co-create an analytics vision and roadmap that aligns with business priorities, recognizes short- and long-term focus areas, and creates senior management buy-in.
  • Develop benefits use cases for all analytics and rank them based on their business value using the analytics prioritization process.
  • Communicate the Analytics vision and the prioritization of analytics use cases to stakeholders.
Advice

Align the Analytics Roadmap with the business strategy by establishing a forum, across the business, to supervise the prioritization of analytics use cases. Review and prioritization must be performed on a regular basis.

A framework to assess the benefits of analytics use cases should have clear dimensions for scoring and grading them. Scoring and grading criteria should include, but not be limited to, the complexity, feasibility, regulatory requirements, and business value of the analytics solution, alignment to the organization’s business objectives, the number of data sources involved, and the size of the user community impacted. The framework should be transparent, allowing the fair assessment of analytics use cases and the communication of use case prioritization with the business stakeholders.
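
One possible realization of such a framework is a transparent weighted-scoring function, sketched below; the dimension names and weights are illustrative assumptions that each organization would calibrate with its stakeholders.

```python
# Illustrative weights per scoring dimension (assumptions; calibrate locally).
WEIGHTS = {
    "business_value": 0.30,
    "strategic_alignment": 0.20,
    "regulatory_requirement": 0.20,
    "feasibility": 0.15,        # higher = easier to deliver
    "complexity": 0.10,         # scored inversely: higher = less complex
    "user_reach": 0.05,         # size of the impacted user community
}

def score_use_case(ratings: dict[str, int]) -> float:
    """Weighted score for a use case; each dimension rated 1 (low) to 5 (high)."""
    return sum(WEIGHTS[dim] * ratings[dim] for dim in WEIGHTS)

# Hypothetical backlog entries to be ranked by the prioritization forum.
use_cases = {
    "churn model refresh": {"business_value": 5, "strategic_alignment": 4,
                            "regulatory_requirement": 1, "feasibility": 4,
                            "complexity": 3, "user_reach": 3},
    "regulatory stress report": {"business_value": 3, "strategic_alignment": 3,
                                 "regulatory_requirement": 5, "feasibility": 3,
                                 "complexity": 2, "user_reach": 2},
}

for name, ratings in sorted(use_cases.items(),
                            key=lambda kv: score_use_case(kv[1]), reverse=True):
    print(f"{name}: {score_use_case(ratings):.2f}")
```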

The organization should ensure that analytics activities include both those that drive business outcomes and those that develop the foundational and future state of analytics capabilities. Create a roadmap that contains a detailed list of initiatives that are sequenced to help the organization deliver on its strategic objectives, and establish processes to revisit sequencing as business priorities and market conditions change.

Questions
  • Is there a formal, documented, and prioritized roadmap for Analytics?
  • Does a process exist to ensure prioritizations are revisited and modified by accountable stakeholders?
  • Does the Analytics Roadmap articulate its support of the organization in its strategic business objectives and does it capture what is needed to support business as usual?
  • Has the Analytics vision and roadmap received appropriate high-level review and approval?
  • Do the analytics use cases document the assessments of their business value as well as their feasibility?
  • What is the process to capture the business strategy and the demand/impact to analytics?
  • Are there clear grading criteria for assessing or scoring the potential business value and feasibility of analytics use cases?
  • Has communication support been established so business stakeholders can prioritize the use cases?
Artifacts
  • Analytics vision statement or plan
  • Prioritized roadmap of Analytics
  • Details of use case benefits
  • Evidence of high-level sponsorship and support
Scoring

Not Initiated

Prioritization of Analytics is not driven by business strategy.

Conceptual

Prioritization of Analytics is not driven by business strategy, but the need is recognized, and the development is being discussed.

Developmental

The approach to business strategy driving the prioritization of Analytics is being developed.

Defined

The approach to business strategy driving the prioritization of Analytics has been defined, reviewed, and approved.

Achieved

Prioritization of Analytics is driven by business strategy.

Enhanced

Prioritization of Analytics driven by business strategy is established as part of business-as-usual practice with a continuous improvement routine.

8.2.3 Analytics Support for Business Needs

Description

Analytics activities properly aligned to business objectives can deliver value in multiple ways. Examples include informing decisions, validating understanding, identifying process effectiveness, or suggesting actions. Analytics teams drive innovation, taking new ideas to the business as well as responding to business-led use cases. Appropriate analytics insights should be embedded in business information and processes. In many cases, this will include automation of analytics with technology for greater speed and efficiency.

Objectives
  • Clarify which business questions, decisions, actions or processes would be impacted by the analytical solution in the formulation of a new analytics use case.
  • Early in the design of any analytics solution, establish which user community will benefit from the analytical solution.
  • Co-develop analytics solutions with relevant business teams.
  • Ensure that analytics outputs and their visualizations are readily available.
  • Communicate and deploy outputs in ways business users can easily consume, including the development of good visualizations and user interfaces to ensure the explainability of the analytics outputs.
Advice

Analytics should both respond to business needs and proactively generate new insights. In each case, there is a responsibility to ensure the business can use the analytics outputs to support its decisions and act on the new insight.

Regular communication between the business and Analytics teams is critical for success. Interaction allows the analytic solutions to be co-developed so that they are focused on the right needs and delivered in a way that is most effective for the users to act. During this process, Analytics can add value by considering a broader array of data inputs and insights that may be derived from a mix of internal and external sources.

When designing new analytics use cases or initiatives, it is important to consider the actionability of insights. In some instances, limits on actionability may arise from ethical, legal, or commercial considerations. For example, while it may be technically feasible to predict which customers are most likely to churn, the business must be extremely careful when deciding whether to act on this insight to avoid unintended consequences of contacting them. The use of personally identifiable information often requires special care for ethical and regulatory reasons.

All Analytics activities must have a purpose. Some analysis is likely to be investigative and so not directly actionable in a business process. However, these analyses should fit into a decision process where exploratory analysis helps the business choose the next course of action.

Analytics teams have multiple options for the deployment of insights depending on the context and time-sensitivity of the use case. For example, the ability to automate or centralize critical algorithms while minimizing manual manipulation is likely to be more appropriate for high volume, close to real-time decision making. In contrast, ongoing business processes with humans in the loop are more likely to need analytic outputs using visualizations and integrated business intelligence tools. In both scenarios, it is important to ensure the explainability of the analytics outputs so that stakeholders can gain trust in using the analytics.

Questions
  • Do formal procedures exist for documenting the purpose, scope, and delivery of analytics use case insights?
  • Does the analytics specification document identify the specific user community impacted?
  • Does the analytics specification document identify how the outputs will be delivered, acted on, or embedded in the business process?
  • Is the analytics tooling readily available to all potential beneficiaries in the organization?
  • Has the organization demonstrated the benefits of automating analytics that supports business processes?
Artifacts
  • Analytics governance policies
  • Analytics specification documents
  • Evidence of alignment with privacy and ethics policies
  • Evidence of analytics automation
  • Analytics output visualizations
Scoring

Not Initiated

The means of ensuring that Analytics support and influence business needs, and are actionable where required, does not exist.

Conceptual

The means of ensuring that Analytics support and influence business needs, and are actionable where required, does not exist, but the need is recognized, and the development is being discussed.

Developmental

The means of ensuring that Analytics support and influence business needs, and are actionable where required, is being developed.

Defined

The means of ensuring that Analytics support and influence business needs, and are actionable where required, has been defined, reviewed, and approved.

Achieved

The means of ensuring that Analytics support and influence business needs, and are actionable where required, is implemented.

Enhanced

The means of ensuring that Analytics support and influence business needs, and are actionable where required, is established as part of business-as-usual practice with a continuous improvement routine.

8.3 Analytics Management & Data Management Ecosystem Alignment

Analytics Management and Data Management must be aligned to ensure they function as an integrated part of the organization's data ecosystem, supporting business goals. Since analytics relies on upstream data and generates insights for decision-making, alignment with Data Management is essential to maintain data reliability and foster confidence in data-driven decisions. Analytics must understand data lineage, adhere to approved business definitions, and follow identification and classification standards.

8.3.1 Analytics Alignment with Data Management and Data Architecture Standards

Description

Data Management and Data Architecture play a crucial role in supporting Analytics Management by ensuring data completeness, quality, and traceability.

Objectives
  • Ensure that Analytics Management, Data Management and Data Architecture are in collaboration to establish and maintain quality data sources for consumption.
  • Ensure that all Analytics Management and Data Management teams are utilizing standard data architecture structures (e.g., business glossary definitions, data catalog, metadata, taxonomies, ontologies).
  • Ensure policy and standards for Analytics Management and Data Management are aligned and understood by both organizations.
  • Align the development life cycles of Analytics Management and Data Management.
Advice

Data Management and Data Architecture play a crucial role in supporting Analytics Management by ensuring data completeness, quality, and traceability. Authoritative data sources must be used whenever possible, but when alternative sources are required, they should be recorded to maintain integrity. Data lineage provides transparency, enabling trust in analytical products and business decisions by ensuring data can be traced from source to final output, with all transformations and aggregations understood and agreed upon. Consistent data mapping to clear business definitions is essential for reliable decision making. In a mature data environment, these definitions are maintained in authoritative sources, captured in metadata and lineage, and governed by data management policies and standards to ensure quality, controlled access, and appropriate usage.
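
As an illustration of preserving lineage through preparation and transformation steps, each operation might append a traceable record from source to output; the record schema below is an assumption for illustration, not a mandated standard.

```python
from datetime import datetime, timezone

lineage_log: list[dict] = []

def record_lineage(source: str, operation: str, output: str) -> None:
    """Append one traceable step from source to output (illustrative schema)."""
    lineage_log.append({
        "source": source,
        "operation": operation,
        "output": output,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    })

# Hypothetical pipeline: each transformation and aggregation is logged,
# so the final output can be traced back to its authoritative source.
record_lineage("crm.customers (authoritative)", "filter active accounts",
               "staging.active_customers")
record_lineage("staging.active_customers", "aggregate monthly spend",
               "analytics.customer_spend_monthly")
```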

The Analytics Management and Data Management organizations must collaborate directly and have a clear understanding of each other's roles and responsibilities within the data ecosystem of the organization. Without alignment between the teams, it will be challenging for either team to contribute effectively towards achieving the business objectives.

Questions
  • Does Analytics Management use data sources supported and published for consumption by Data Management?
  • Is Analytics Management participating in the use and maintenance of the standard data architecture structures and tools (e.g., catalog, glossary, metadata)?
  • Are the policy and standards established for Data Management and Analytics Management aligned and not conflicting?
  • Are the Analytics Management and Data Management teams aware of the roles and responsibilities each has related to data management?
  • Is there a regular cadence of both teams participating together in support of the business objectives?
Artifacts
  • Analytics Management and Data Management Alignment Approach
  • Analytics Management Policy and Standards
  • Data Management Policy and Standards
  • Data Architecture Standards
  • Data Management Tool Standards
  • Meeting artifacts supporting collaboration
Scoring

Not Initiated

No formal approach for alignment exists between Analytics Management and Data Management.

Conceptual

No formal approach for alignment exists between Analytics Management and Data Management, but the need is recognized, and the development is being discussed.

Developmental

The approach for alignment of Analytics Management and Data Management is being developed.

Defined

The approach for alignment of Analytics Management and Data Management has been defined, reviewed, and approved by stakeholders.

Achieved

The approach for alignment of Analytics Management and Data Management is established and supports ongoing collaboration.

    The approach for alignment of Analytics Management and Data Management is established as part of business-as-usual practice with a continuous improvement routine.

    8.3.2 Analytics Data Preparation Standards

    Description

    Data preparation is the process of collecting, structuring, cleansing, and transforming data so it can be readily and accurately analyzed for business purposes. The data preparation process must be defined, and all process steps applied consistently to achieve fit-for-purpose data. Data preparation must be subject to the organization’s Data Governance policy and standards, including the use of the business glossary and metadata. The data must be based on authoritative data sources, where relevant. Data preparation must be performed in a way that preserves data lineage and integrity.

    Objectives
    • Define standards for data preparation that include the identification of data required for the analysis, available authoritative data sources, and data accessibility.
    • Define standards for specification of data elements, data quality, need for data cleansing (including defect tracking and root cause fix), and conformance, transformation, and aggregation.
    • Ensure data preparation follows the organization’s Data Management policy and Data Management standards and preserves data lineage and integrity.
    • Create and maintain adequate documentation on data definitions, data sources and lineage, data usage, and data owners.
    • Maximize the re-usability of any prepared data and data preparation processes to create efficiencies and time-to-market improvements for future data-preparation needs.
    Advice

    Data preparation processes leverage authoritative data sources, business glossary, and metadata to provide accurate, well-defined data for consumption. When new data sets or data elements are sourced, documentation should be completed to ease reuse. A well-documented data catalog supports and significantly accelerates future data sourcing and wrangling processes.

    Understanding data quality and determining whether the data is fit-for-purpose must be a key activity of the Analytics practitioners in the data preparation process. Data of higher quality can be reused in other processes more readily. Profiling data can support data sourcing decisions. Profiling provides a statistics-based understanding of data content and data quality across multiple data sources.
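
    As a minimal illustration of profiling in practice, the sketch below computes per-column completeness, cardinality, and value ranges with pandas; the file name and columns are hypothetical, and real profiling would normally run in the organization's chosen tooling.

        import pandas as pd

        def profile(df: pd.DataFrame) -> pd.DataFrame:
            """Per-column statistics that support fit-for-purpose decisions."""
            stats = []
            for col in df.columns:
                s = df[col]
                stats.append({
                    "column": col,
                    "dtype": str(s.dtype),
                    "null_rate": round(s.isna().mean(), 4),  # completeness
                    "distinct": s.nunique(),                 # cardinality
                    "min": s.min() if pd.api.types.is_numeric_dtype(s) else None,
                    "max": s.max() if pd.api.types.is_numeric_dtype(s) else None,
                })
            return pd.DataFrame(stats)

        # Hypothetical extract; comparing profiles across candidate sources
        # supports the sourcing decision described above.
        print(profile(pd.read_csv("customers.csv")))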

    Data preparation should be a repeatable process and a formalized best practice. Preference should be given to self-service data preparation solutions over spreadsheets, which are affordable but error-prone, difficult to control, and maintenance-heavy. Data-preparation tools can provide an effective and controlled data preparation process and the ability to integrate structured and unstructured data more efficiently. Strategies for the agile creation of re-usable data sets, such as applying obfuscation techniques to reduce the need for repeated internal approvals, should be considered to foster efficiency.
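
    To make the idea of a repeatable, lineage-preserving preparation process concrete, here is a minimal sketch; the file and column names and the in-memory lineage log are illustrative assumptions, and a production pipeline would persist lineage records to governed tooling.

        import pandas as pd

        lineage_log = []  # illustrative in-memory record; persist this in practice

        def step(description):
            """Decorator that records each preparation step for lineage documentation."""
            def wrap(fn):
                def inner(df):
                    lineage_log.append({"step": fn.__name__,
                                        "description": description,
                                        "rows_in": len(df)})
                    out = fn(df)
                    lineage_log[-1]["rows_out"] = len(out)
                    return out
                return inner
            return wrap

        @step("Drop records missing the mandatory account identifier")
        def drop_missing_keys(df):
            return df.dropna(subset=["account_id"])  # hypothetical key column

        @step("Conform trade dates to ISO 8601")
        def conform_dates(df):
            df = df.copy()
            df["trade_date"] = pd.to_datetime(df["trade_date"]).dt.date
            return df

        raw = pd.read_csv("trades_extract.csv")  # hypothetical sourced extract
        prepared = conform_dates(drop_missing_keys(raw))
        print(lineage_log)  # evidence of what was done, in order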

    Questions
    • Do data preparation standards cover the identification of data required for the analysis, available (authoritative) data sources, data accessibility, specification of data elements, data quality metrics, data cleansing, and conformance, transformation, and aggregation?
    • Does data preparation follow the organization’s Data Management policy and Data Management standards and preserve data lineage and integrity?
    • Is adequate documentation maintained on data definitions, data sources, data lineage, data usage, and data owners?
    • Are any prepared data or data preparation processes leveraged across multiple analytical processes, tools, or teams?
    Artifacts
    • Data preparation and analytics guidance and design documentation
    • Evidence of adoption of data preparation processes by Analytics teams
    • Documentation of data architecture and data lineage, including changes made through data preparation
    • Data definition, sourcing, lineage, usage, and ownership documentation supporting data-preparation decisions
    • Evidence of re-use of data sets and data preparation processes
    Scoring

    Not Initiated

    No data preparation standards exist.

    Conceptual

    No data preparation standards exist, but the need is recognized, and the development is being discussed.

    Developmental

    Data preparation standards are being developed.

    Defined

    Data preparation standards have been defined, reviewed, and approved.

    Achieved

    Data preparation standards are being applied consistently.

    Enhanced

    The consistent application of data preparation standards is established as part of business-as-usual practice with a continuous improvement routine.

    8.4 Analytics Platform

    For Analytics to be effective and efficient, it must be supported by a platform that is designed and implemented to meet its needs. The operating model drives many of these needs. The different requirements of production and non-production environments must be addressed, and there must be a version-control regime for models that is appropriate for each of these environments. Strategies for anonymization of sensitive data are required to maximize the reusability of data sets. The platform should provide appropriate flexibility to scale up and down. Both production and non-production environments should be aligned with the organization’s ethics and privacy governance.

    8.4.1 Analytics Platform Supports Analytics Operating Model

    Description

    The analytics platform is a combination of tools, applications, and infrastructure that enables analytics to be created and executed in an organization. It must have the necessary capabilities to support the way that Analytics teams are structured and operated, and the way they engage with the business and stakeholders. It must enable Analytics to develop, test, govern, socialize, maintain, and mature analytical models.

    The platform must support data operation activities and be flexible enough to enable a variety of individuals to access data, conduct analyses, and create visualizations as needed. These interactions must implement any required segregation of duties and the ability to set access rights to different users of the system.

    Beyond the Analytics teams, business users will need to validate and utilize the results of the analytics models.

    Objectives
    • Understand the platform requirements of the different roles defined in the operating model.
    • Identify stakeholders and obtain agreement to the requirements to be supported.
    • Ensure the platform design takes account of the agreed requirements.
    Advice

    The functional and non-functional requirements to support the Analytics Operating Model must be understood before the procurement or development of the platform commences. Any pre-existing infrastructure and solutions should not constrain the requirements.

    The design must facilitate segregation of duties and must support and control different levels of access. The platform design should ensure ease of access and use. It should support the need for platform users to understand both the data being used and the results being produced.

    It is advisable to design the platform to support data operations, ensuring efficient data throughput from applications through various stages to the end user. As data consumption by diverse stakeholders increases, it is crucial to manage data as a continuous flow, integrating seamlessly across different organizational verticals.

    Questions
    • Have the platform requirements of different roles described in the operating model been defined?
    • Have platform requirements been discussed and agreed with business and Analytics stakeholders?
    • Are the requirements understood by those responsible for the procurement and implementation of the platform?
    • Has the platform design been reviewed to confirm it addresses the agreed requirements?
    • Has platform support of enforcement of access rights been validated?
    • Can stakeholders easily review the outputs of the platform?
    • Can the platform support users beyond the data analysts who produce the outputs?
    • Can the platform support data operations?
    Artifacts
    • Documented assessment of the platform design support for the operating model
    • Policies relating to segregation of duties and evidence of their review and approval
    • Evidence of bi-directional communication with the stakeholders/business on the requirements of use of the platform
    • Policies relating to the management of access
    Scoring

    Not Initiated

    The requirement for the platform design to meet the needs of the Analytics Operating Model has not been identified.

    Conceptual

    The platform design does not meet the needs of the Analytics Operating Model, but the need is recognized, and the development is being discussed.

    Developmental

    The design of the platform to meet the needs of the Analytics Operating Model is being developed.

    Defined

    The design of the platform to meet the needs of the Analytics Operating Model has been defined, reviewed, and approved.

    Achieved

    The platform design meets the needs of the Analytics Operating Model.

    Enhanced

    Platform design support for the needs of the Analytics Operating Model is established as part of business-as-usual practice with a continuous improvement routine.

    8.4.2 Analytics Platform Supports Innovation and Production

    Description

    A separate environment is needed for data discovery, development, and testing before models are delivered into production. This low-risk sandbox environment ensures appropriate testing takes place before a model is used and relied upon by the organization. The sandbox must provide appropriate capacity for its computational requirements, as determined by business needs. A sandbox environment requires proper security rules to ensure data protection and controlled user access.

    Objectives
    • Define and support the requirements for the innovation environments.
    • Define and support the requirements to segregate the development and test sandbox from the production environment.
    • Establish distinct design and change control processes for the sandbox, development, and production aspects of the platform.
    • Establish required data protection and controlled user access.
    Advice

    The nature of analytics is based on iterative experimentation, so development environments need a degree of agility that supports the ability to fail fast and fail safely. Non-production environments, especially a sandbox/innovation environment, need a greater level of flexibility made possible by a sufficiently agile change management process that can address multiple scenarios.

    There must be clear segregation of environments to ensure the change management process is appropriate for each level. It is change management that enables the Analytics teams to be productive. Change management processes for sandbox environments must be flexible enough to support adequate turnaround times for experimentation. A strictly separated environment must be provided for non-production model testing, with clear demarcation from production. The change management process for production must be appropriately robust in comparison to the more flexible non-production environments.

    It is crucial to ensure that the innovation or sandbox environment is adequately set up to handle data that resembles production data and complies with all relevant data protection standards. This preparation allows for the migration and testing to be finalized before deploying to a production environment.
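
    The sketch below shows one way to make the segregation and its differing controls explicit; the environment names, control descriptions, and promotion path are illustrative assumptions, and real enforcement belongs in infrastructure and access-management tooling rather than application code.

        # Illustrative environment catalog; values are assumptions, not prescriptions.
        ENVIRONMENTS = {
            "sandbox": {
                "data": "anonymized production-like extracts",
                "change_control": "lightweight, self-service",
                "access": ["analytics_practitioners"],
            },
            "development": {
                "data": "anonymized, versioned test sets",
                "change_control": "peer review required",
                "access": ["analytics_practitioners", "engineers"],
            },
            "production": {
                "data": "governed authoritative sources",
                "change_control": "formal release approval",
                "access": ["service_accounts", "approved_business_users"],
            },
        }

        def can_promote(source: str, target: str) -> bool:
            """Allow promotion only along the agreed path, never straight to production."""
            path = ["sandbox", "development", "production"]
            return path.index(target) - path.index(source) == 1

        assert can_promote("sandbox", "development")
        assert not can_promote("sandbox", "production")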

    Questions
    • Are there documented requirements distinguishing the sandbox/innovation environments and production environments?
    • Are the memory, storage, and processing capabilities of the different environments documented?
    • Have the stakeholders (business, analytics) approved that the environments meet their requirements?
    • Has the design process for each environment been assessed and put into operation?
    • Are different change control processes in place for each environment to support the business requirements and risk appetite of the organization?
    Artifacts
    • Production and non-production environments strategy
    • Documented designs of sandbox, development and production environments
    • Documented change control processes for the sandbox
    • Evidence of engagement with stakeholders of policy review/implementation
    • Evidence of communication of strategy and policies
    • Evidence of user-access rights to production and non-production environments
    Scoring

    Not Initiated

    The separate needs for innovation and production are not understood.

    Conceptual

    The separate needs for innovation and production are not understood, but the need is recognized, and the development is being discussed.

    Developmental

    The separate needs for innovation and production are being defined.

    Defined

    The separate needs for innovation and production have been defined, reviewed, and approved.

    Achieved

    The platform addresses the separate needs for innovation and production.

    Enhanced

    The need for the platform to address the separate needs for innovation and production is established as part of business-as-usual practice with a continuous improvement routine.

    8.4.3 Analytics Platform Version Management

    Description

    There must be controlled management of change to all models developed within the analytics platform and a documented process for recording the changes. Effective governance and change control of the models must be in place and must be auditable.

    Objectives
    • Ensure that changes to the analytics models are made with appropriate authorization.
    • Ensure that changes are tracked and the nature of each change is documented and auditable.
    • Ensure that models are only released into the production environment through the controlled release mechanisms of the appropriate test environments.
    • Ensure that appropriate versioning and archiving are in place for data sets used to train and validate models.
    • Address requirements to re-run analytics on historical data with the version of the model used at that point in time.
    Advice

    Analytical models should follow the change control process and the organization’s software development life cycle process. Seek concurrence through discussion with the Technology function, the change management function, and connected business functions.

    Maintain documentation that explains changes made to models. This documentation must trace version history to know which versions of which models are used in which implementations.

    Document the process for the creation and storage of retrievable backups of previous versions of code and data sets. These backup files are needed in the event of errors in the upgrade or if the analysis provided by previous versions needs to be validated.

    While this section focuses on the need for version control for all analytical models, an Analytics organization needs to ensure that there is version control for all analyses designed and deployed on the platform, including those with simple calculations or data aggregations. This ensures that the most recent figure is referenced and that, if the analytical approach changes, Analytics practitioners and their stakeholders are aware of and understand the changes.
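
    A minimal sketch of such a record follows, assuming a JSON-lines registry file and illustrative field names; hashing the training data ties each model version to the exact data used, which supports re-running historical analyses with the right version.

        import datetime
        import hashlib
        import json

        def register_version(model_name, version, train_file, approver, change_note):
            """Append an auditable record linking model version, data, and approval."""
            with open(train_file, "rb") as f:
                data_hash = hashlib.sha256(f.read()).hexdigest()
            record = {
                "model": model_name,
                "version": version,
                "training_data": train_file,
                "training_data_sha256": data_hash,  # ties the version to its data
                "approved_by": approver,
                "change_note": change_note,
                "registered_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            }
            with open("model_registry.jsonl", "a") as registry:
                registry.write(json.dumps(record) + "\n")
            return record

        # Illustrative entry: a model revalidated on a refreshed data set.
        register_version("credit_risk_scorer", "2.3.1", "train_2024q4.csv",
                         approver="model.governance@example.org",
                         change_note="Recalibrated thresholds after Q4 drift review")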

    Questions
    • Is there a change control process for amendments to analytics models on the platform?
    • Is the defined change control process in place and understood by the responsible individuals within the company?
    • Does documentation exist to show and explain the changes made to models?
    • Are previous code versions and data sets available to ensure understanding of previous model analyses?
    • Does the change control process ensure appropriate authorization for changes?
    Artifacts
    • Model specifications
    • Policies and procedures associated with version control for models and data sets
    • Model release protocols and policies
    • Evidence of model review and approval
    Scoring

    Not Initiated

    No version control regime exists.

    Conceptual

    No version control regime exists, but the need is recognized, and the development is being discussed.

    Developmental

    A version control regime is being developed.

    Defined

    A version control regime has been defined, reviewed, and approved.

    Achieved

    The version control regime has been put in place and is being followed.

    Enhanced

    Adherence to the version control regime is established as part of business-as-usual practice with a continuous improvement routine.

    8.4.4 Data Anonymization Strategy

    Description

    Data anonymization strategies can help to ensure that data used during the development phases of models complies with the appropriate regulatory regimes governing the organization and best practices regarding data privacy. Data anonymization also applies to commercially sensitive confidential data, such as company results before their release.

    The obfuscation capability is important for non-production testing environments and, in some cases, can also be relevant to production environments. Where external professional testers have access to test environments, non-production environments must store the minimum amount of personally identifiable or commercially sensitive information.

    Objectives
    • Implement all relevant policies, and develop the processes and procedures to identify data that should be classed as commercially sensitive or personally identifiable information, using the organization’s data classification system to determine the level and sophistication of obfuscation required.
    • Develop methods to obfuscate data by masking or blurring it, as needed by the business and approved by the compliance and regulatory teams.
    Advice

    Collaborating within a cross-functional team that encompasses Privacy, Legal, Compliance, and other appropriate departments, determine the data classifications that define which information is considered commercially sensitive or classified as Personally Identifiable Information in accordance with the applicable regulatory framework. These classifications will dictate the extent of obfuscation or deletion required for the data. Confirm whether data anonymization is needed for production and non-production models and whether it applies to model outputs as well as inputs.
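
    As an illustration only, the sketch below masks and pseudonymizes hypothetical columns with pandas; note that salted hashing is pseudonymization rather than full anonymization, so the techniques actually applied must be approved by the compliance and regulatory teams.

        import hashlib
        import pandas as pd

        SALT = "rotate-and-store-securely"  # illustrative; real salts belong in a secrets store

        def pseudonymize(value: str) -> str:
            """Replace an identifier with a salted hash so joins work but identity is hidden."""
            return hashlib.sha256((SALT + value).encode()).hexdigest()[:16]

        def mask_email(email: str) -> str:
            """Blur the local part of an email while keeping its domain."""
            _, _, domain = email.partition("@")
            return "***@" + domain

        df = pd.read_csv("customers.csv")  # hypothetical extract
        df["customer_id"] = df["customer_id"].astype(str).map(pseudonymize)
        df["email"] = df["email"].map(mask_email)
        df = df.drop(columns=["date_of_birth"])  # delete what testing does not need
        df.to_csv("customers_nonprod.csv", index=False)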

    Questions
    • Are data anonymization requirements understood?
    • Is there a process documented and followed to ensure regulatory compliance in the use of sensitive data (personal and/or commercial)?
    • Have obfuscation techniques been established and adopted?
    • If sensitive data needs to be used to ensure accurate reporting and prediction optimization, is there a process to ensure that the sensitive or restricted data utilized is obscured or deleted after use?
    Artifacts
    • Evidence of data anonymization policies, referencing data types and detailing specific requirements for production and non-production environments
    • Non-production environments strategy including the management of data replication across environments
    • List of stakeholders and evidence of bi-directional communication and recognition of relevant policies
    • Documentation of obfuscation techniques with guidance on how and when to apply them
    • Evidence of data obfuscation/anonymization of relevant models/model outputs
    Scoring

    Not Initiated

    No data anonymization strategies exist.

    Conceptual

    No data anonymization strategies exist, but the need is recognized and the development is being discussed.

    Developmental

    Data anonymization strategies are being developed.

    Defined

    Data anonymization strategies have been defined, reviewed, and approved.

    Achieved

    Data anonymization strategies are supported and are being used.

    Enhanced

    The use and support of data anonymization strategies are established as part of business-as-usual practice with a continuous improvement routine.

    8.4.5 Platform Scalability Management

    Description

    Business requirements change rapidly. The analytics model environment must have the flexibility to cope with new data without requiring significant redesign. The model environment must be able to scale its computing power to address forecasted business growth and the growing sophistication of models. The costs of computational power need to be tracked. There must be an understanding of how additional requirements could be accommodated, and potential costs must be estimated in advance.

    Objectives
    • Understand the processing power capacity and flexibility potentially needed for the analytics model environment.
    • Support the ability to scale up and down to accommodate planned requirements.
    • Establish a mechanism for estimating and tracking model processing costs.
    Advice

    A process should be established for stakeholders to provide direction on what future requirements the analytics models may be required to support. These requirements provide input for capacity planning. They should include data volumes and an estimate of the analytical workloads.

    The scalability requirements should be reflected in the technology roadmap for Analytics. They should establish a position for the use of both on-premises and cloud infrastructure as appropriate for the organization.

    The costs of the analytics platform should be tracked and reviewed against expectations by the stakeholders to ensure the environments are sized appropriately. Assess and quantify the business benefits obtained from the model outputs to determine whether the costs of the models and their processing, including the cost of idle capacity that supports scalability, are justified.
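
    A simple forward estimate can make the idle-capacity trade-off visible to stakeholders; the workloads, rate, and provisioned hours below are illustrative figures, not recommendations.

        def estimate_monthly_cost(workloads, node_hour_rate, provisioned_node_hours):
            """Rough estimate separating active workload cost from idle headroom cost."""
            used = sum(w["runs_per_month"] * w["node_hours_per_run"] for w in workloads)
            idle = max(provisioned_node_hours - used, 0)
            return {
                "used_cost": used * node_hour_rate,
                "idle_cost": idle * node_hour_rate,  # capacity held for scalability
                "utilization": used / provisioned_node_hours,
            }

        # Illustrative forecast; replace with the organization's own volumes and rates.
        forecast = [
            {"name": "daily_scoring", "runs_per_month": 30, "node_hours_per_run": 4},
            {"name": "quarterly_retrain", "runs_per_month": 1, "node_hours_per_run": 60},
        ]
        print(estimate_monthly_cost(forecast, node_hour_rate=2.40,
                                    provisioned_node_hours=400))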

    Questions
    • Is there business direction on future requirements and strategy that the analytics models are required to support?
    • Are future data volumes and workloads understood and quantified?
    • Are volume and workload requirements documented and reviewed by Analytics?
    • Can the models accommodate increased data volumes or variables without requiring a complete redesign and the related costs?
    • Are the costs of the analytics platform tracked and reviewed against expectations to ensure that environments are sized appropriately?
    • Does the assessment include the costs of idle capacity?
    Artifacts
    • Documentation evidencing production and non-production requirements, including low-level, non-functional requirements
    • Non-production environments strategy
    • Cost tracking of both non-production and production environments, including any utility computing
    • Evidence of spending reviews and ROI for production and non-production environments
    Scoring

    Not Initiated

    Environment scalability requirements are not understood.

    Conceptual

    Environment scalability requirements are not understood, but the need is recognized, and the development is being discussed.

    Developmental

    Environment scalability requirements are being developed.

    Defined

    Environment scalability requirements have been defined, reviewed, and approved.

    Achieved

    Environment scalability requirements are understood and supported.

    Enhanced

    Understanding and support of environment scalability requirements are established as part of business-as-usual practice with a continuous improvement routine.

    8.5 Model Development Life Cycle

    While some analytics activities will be exploratory or one-off, Analytics must be able to deploy models into production in a controlled and governed manner. Testing, approval, and release processes are central to this, along with processes for regular review of deployed models. These must be aligned with the organization’s governance approaches for data ethics and privacy. Requirements to understand and control model bias must be addressed, as must the need to be able to explain how model decisions have been reached.

    8.5.1 Model Development Processes

    Description

    Before release into an operational environment, models must be tested to ensure that they are performing as expected and in consonance with all model specifications. Model validation should include thorough review and testing of assumptions and analytical techniques and reviews to ensure that the data are not misused and remain easily auditable. This validation includes alignment with the code of data ethics and data privacy governance. The satisfactory outcome of testing and approval for release should be formalized by the official designated for this purpose. The model should be released following established release protocols.

    The performance of the model may change as ingested data changes over time, potentially affecting the efficacy of the model’s original intent. Its performance and its continued adherence to model specifications must be reviewed periodically, and any time there is a change to model specifications.

    Objectives
    • Define formal processes for model testing, validation, approval, release, and periodic review.
    • Ensure the authority for model approval is clear and appropriate.
    • Ensure that models (released or pending) deployed into production environments are performing as expected and in consonance with each model’s specifications.
    • Establish safeguards to ensure model release into production minimizes disruption to the organization’s operations and protects against data breaches and corruption of data.
    Advice

    As part of the model testing, a wide range of inputs must be run through the model to ensure that it is performing as expected. Such input sets may include test data, current data, and artificially created representative samples of input data that could theoretically, even if rarely, arise in operations. Model outputs should be checked for any unintended consequence arising from using the model, such as unfair bias or undesirable business outcomes. Model testing and controls should be automated where possible.
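
    A minimal sketch of such automated checks follows, using pytest; the model here is a stand-in stub with hypothetical input fields, but the pattern of running boundary and rare-but-valid inputs through the model, and asserting it fails loudly on malformed input, is the point.

        import pytest

        def predict_default_risk(applicant: dict) -> float:
            """Stand-in for the real model; replace with the deployed scoring function."""
            return min(1.0, applicant["debt"] / (applicant["income"] + 1))

        CASES = [
            {"income": 52_000, "debt": 4_000, "age": 35},   # representative applicant
            {"income": 0, "debt": 0, "age": 18},            # boundary applicant
            {"income": 10_000_000, "debt": 0, "age": 99},   # rare but valid input
        ]

        @pytest.mark.parametrize("applicant", CASES)
        def test_scores_stay_in_valid_range(applicant):
            assert 0.0 <= predict_default_risk(applicant) <= 1.0

        def test_missing_field_fails_loudly():
            # Malformed input should raise rather than silently produce a score.
            with pytest.raises(KeyError):
                predict_default_risk({"income": 52_000})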

    The individuals authorized to approve a model should have input into specifying the portfolio of evidence to be provided for review during approval. They should understand how the model works and the business objectives it is designed to meet. They should also have a broader awareness of the overall business function context in which the model will operate and of governance and ethical parameters.

    To ensure responsible use of data, it is essential to validate all data in models prior to their release into production environments, preventing potential misuse. Additionally, the processes of model development, validation, approval, and release must align with established data ethics governance and privacy governance structures and guidelines. This includes ensuring that all stakeholders involved in the approval and release of models have a clear understanding of privacy governance requirements. Ultimately, a cohesive framework that integrates both data ethics and privacy considerations is crucial for the integrity and compliance of model operations.

    Consideration of data ethics needs to be embedded in the end-to-end process of model development and management.

    Compliance of a model with privacy requirements must be understood and confirmed before a model is released but should already be considered at the design stage.

    Questions
    • Do formal procedures exist for model testing, approval, release, and periodic review?
    • Is the authority for model approval designated at the appropriate level of seniority?
    • Have stakeholders approved the enforcement of the procedures for testing, approval, release, and review?
    • Do models perform as expected and in consonance with their specifications?
    • Have safeguards been established to ensure model release minimizes disruption to operations and protects against data corruption and breaches?
    • Is there a process in place to ensure a formal Code of Data Ethics and associated guidance are reviewed and remain up to date?
    • Is there a process in place to review all models in production when the privacy requirements change?
    Artifacts
    • Model specifications
    • Model testing, approval, and review procedures
    • Evidence of automation of model tests
    • Model release protocols and policies
    • Details of model input datasets and outputs
    • Schedule of periodic model reviews
    • Evidence of model review and approval
    • List of stakeholders and evidence of approval of enforcement of procedures
    Scoring

    Not Initiated

    No model testing, approval, release, and regular review processes exist.

    Conceptual

    No model testing, approval, release, and regular review processes exist, but the need is recognized and the development is being discussed.

    Developmental

    Model testing, approval, release, and regular review processes are being developed.

    Defined

    Model testing, approval, release, and regular review processes have been defined, reviewed, and approved.

    Achieved

    Model testing, approval, release, and regular review processes are in place and effective.

    Enhanced

    Effective model testing, approval, release, and regular review processes are established as part of business-as-usual practice with a continuous improvement routine.

    8.5.2 Model Bias Management

    Description

    Analytics stakeholders need to be aware of any prejudices and unfairness of models and any unintended consequences they may cause. This type of model bias is a governance issue and must be addressed with appropriate checks and balances that include awareness, mitigation, and controls. Models must be subject to active management of bias that provides for ongoing assessment and validation of algorithms, data sets, and model outcomes.

    Objectives
    • Establish processes and controls to ensure that input data sets do not introduce model bias.
    • Ensure that model bias and its effects are identified.
    • Establish procedures for addressing model bias once identified.
    • Ensure that the shortcomings of algorithms are understood and that they are not applied to questions whose answers would be invalidated by algorithmic bias.
    Advice

    Analytics model results are based on the data inputs to the model, either when created and trained or when running in production. Bias arises when that data is not representative of the real world, whether missing key variables that would lead to different decisions or including human-produced content that incorporates biases of those persons. Many organizations are establishing Data Ethics committees charged with reviewing models and their outcomes for signs of intended or unintended bias.

    To minimize the harmful effects of bias, stakeholders need to be aware of the possible bias in any model. They must understand the various types of bias and how each can impact data, analysis, and decisions.

    Identification and management of bias must be incorporated in the formal procedures for model approval and regular review. Regular reviews of models must include monitoring for sudden or creeping bias being introduced as the models are exposed to new data in production.
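
    One common screening check is the disparate-impact ratio, which compares each group's favorable-outcome rate to the highest group's rate; the sketch below uses hypothetical columns, and the 0.8 cut-off is a commonly cited screening convention, a trigger for investigation rather than a verdict.

        import pandas as pd

        def disparate_impact(df, group_col, outcome_col):
            """Ratio of each group's favorable-outcome rate to the best group's rate."""
            rates = df.groupby(group_col)[outcome_col].mean()
            return rates / rates.max()

        # Hypothetical scored output: 1 = favorable decision.
        scored = pd.DataFrame({
            "region":   ["north", "north", "south", "south", "south", "north"],
            "approved": [1, 1, 0, 1, 0, 1],
        })
        ratios = disparate_impact(scored, "region", "approved")
        print(ratios[ratios < 0.8])  # groups whose ratio warrants investigation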

    Questions
    • Are stakeholders aware of the potential sources of model bias?
    • Is the analysis of model bias included in the analytics methodology and model documentation standard?
    • Are there procedures and controls to address bias in model input data?
    • Are model shortcomings documented and considered in decisions on the application of the model?
    • Is model bias addressed in procedures of model approval and regular review?
    • Are judgment-based decisions on model bias risk made at an appropriate level of seniority?
    • Is there a regular review with appropriate specialists to determine if there is drift in the model resulting in bias?
    Artifacts
    • Stakeholder education material on model bias
    • Procedures for the analysis of model bias
    • Procedures and controls for review of bias in input data sets
    • Documented analysis of model bias
    • Documentation of model shortcomings and limitations
    • Evidence of model bias and shortcomings considered in model deployment decisions
    • Procedures for regular audit of models to ensure bias has not arisen
    Scoring

    Not Initiated

    Model bias is not understood or managed.

    Conceptual

    Model bias is not understood or managed, but the need is recognized, and the development is being discussed.

    Developmental

    Processes are being developed to ensure model bias is understood and managed.

    Defined

    Processes have been defined, reviewed, and approved to ensure model bias is understood and managed.

    Achieved

    Processes have been established and implemented to ensure that model bias is understood and effectively managed.

    Enhanced

    Understanding and effective management of model bias are established as part of business-as-usual practice with a continuous improvement routine.

    8.5.3 Model Requirements Traceability

    Description

    Requirements for explaining how a model works and reaches its outcomes should be precise and must be incorporated into the modeling process and the operational processes that support the models. Transparency of how a model works is critical to recognizing bias and aligning business processes and activities to achieve non-biased outcomes.

    Objectives
    • Define requirements for model explainability in business terms.
    • Assign accountability for model explainability during the requirements phase of the modeling process.
    • Perform additional steps to mitigate ambiguity in requirements for model explainability.
    • Incorporate processes to ensure that requirements for model explainability result in a clear understanding of how a model works and in model transparency.
    Advice

    Model explainability requirements must specify how readily models can be understood and how easily the causes of their decisions can be traced. The goal is for stakeholders to be able to explain what models do and how they do it, to both internal and external audiences. Model explainability must be able to stand up to Internal Audit or external regulatory reviews. Explanations should be consistent.
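
    One model-agnostic aid to such explanations is permutation importance, which measures how much a model's performance degrades when each input feature is shuffled; the sketch below is illustrative, with a toy model and made-up feature names, and is not a substitute for the explainability techniques an organization formally adopts.

        import numpy as np

        def permutation_importance(predict, X, y, score, feature_names,
                                   n_repeats=5, seed=0):
            """How much does the score drop when one feature's values are shuffled?"""
            rng = np.random.default_rng(seed)
            baseline = score(y, predict(X))
            importance = {}
            for j, name in enumerate(feature_names):
                drops = []
                for _ in range(n_repeats):
                    Xp = X.copy()
                    rng.shuffle(Xp[:, j])  # break this feature's link to the outcome
                    drops.append(baseline - score(y, predict(Xp)))
                importance[name] = float(np.mean(drops))
            return importance  # larger drop = the model leans more on that feature

        # Toy demonstration: the model uses only the first feature.
        X = np.random.default_rng(1).normal(size=(200, 2))
        y = (X[:, 0] > 0).astype(int)
        predict = lambda X: (X[:, 0] > 0).astype(int)
        accuracy = lambda y, p: float((y == p).mean())
        print(permutation_importance(predict, X, y, accuracy, ["income", "debt"]))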

    Identify and consult the stakeholders who can understand and discuss the requirements for model explainability.

    Include requirements for model explainability in the organization’s analytics methodology and model documentation standard. Embedding standards into the fabric of an organization is a critical step in achieving model explainability.

    Update the organization’s processes, standards, and policies as the adoption of requirements for model explainability increases. Keep documentation and processes current. The alternative results in tribal knowledge, operational silos, and increased risk that intellectual property and knowledge are lost, making model explainability more difficult.

    Establish methods to monitor and continuously improve model explainability. The goal of continuous improvement is to provide opportunities for stakeholders to enhance model explainability to meet business objectives. Sources of improvement ideas include surveys, interviews and peer reviews, success stories, and analysis of gaps highlighted by monitoring and measurement. Bring new ideas and thinking into the organization by keeping pace with emerging and waning industry standards, technology improvements, and by comparing processes and outcomes to those of competitors, allies, and other industries.

    Questions
    • Are models explainable in business terms?
    • Can the requirements for model explainability be traced to business processes, activities, and outcomes?
    • Is there evidence of roles and responsibilities?
    • Was accountability assigned during the requirements phase of the modeling process?
    • Have steps or processes been identified to mitigate ambiguity?
    • Are models easily interpreted?
    • Is it easy to understand the basis of decisions in the models?
    Artifacts
    • Policies for requirements for model explainability including the types of models acceptable for business use cases
    • Defined steps to mitigate ambiguity, evidenced through documentation or direct observation
    • Requirements for model explainability documented in analytics methodology and model documentation standard
    • Minutes from peer-review sessions of model explainability
    • Documentation of rationale for model choice
    • Analytics practitioner role definitions that include responsibilities relating to model explainability
    • Evidence of monitoring and continuous improvement efforts for model explainability
    Scoring

    Not Initiated

    Requirements for explainability are not understood.

    Conceptual

    Requirements for explainability are not understood, but the need is recognized, and their development is being discussed.

    Developmental

    Requirements for explainability are being developed.

    Defined

    Requirements for explainability have been defined, reviewed, and approved.

    Achieved

    Requirements for explainability are understood and incorporated.

    Enhanced

    Understanding and incorporation of explainability are established as part of business-as-usual practice with a continuous improvement routine.

    8.6 Analytics Education and Adoption Program

    An analytics education program should empower all individuals involved in the analytics process, not just specialized analytics practitioners. While these experts require skills in data collection, analysis, and interpretation to support decision-making, education should also extend to business analysts, casual analytics users, and business leaders who rely on and request analytics insights. Training helps participants better understand analytics capabilities, enabling business leaders to ask more informed questions and analysts to deliver deeper insights. Since analysis occurs at all levels of an organization, equipping employees with analytics knowledge fosters more informed decision-making. Ultimately, effective analytics requires change, and education is key to driving consistency and adoption across the organization.

    8.6.1 Analytics Education Approach and Plan

    Description

    A program is established to develop personnel for their roles within the organization by equipping them with the appropriate Analytics concepts, skills, and accountabilities. The skills required to perform different analytics roles and responsibilities are understood, and the education program has been created to develop these skills.

    Objectives
    • Identify and align the skills required to fulfill roles and responsibilities at different levels.
    • Create a learning map to provide analytic roles with the right skills and tools to perform their activities.
    • Review and confirm that the learning map fulfills role definitions.
    • Embed a process to refresh the learning map based on industry standards and developments.
    Advice

    Analytics practitioners need to be provided with the right education and training to build their skill set to maximize the value an organization gets from its Analytics capability. When developing the learning map, Analytics practitioners across all levels should be involved to ensure all the required skills are represented across Analytics. Roles and responsibilities should be used as a starting point to identify the skills and tools that will enable practitioners to perform these effectively. Conduct a current state assessment and gap analysis of existing skills, tools, and training in Analytics to identify and address gaps and common challenges, guiding the focus of learning experiences.

    The learning map should be designed to provide Analytics practitioners with both practical and theoretical experiences. Consider experiences from structured learning, social learning, and experiential learning activities. The various experiences should be built into the learning map, with enough time committed to each activity and a clear delivery roadmap with milestones to monitor learning progress. The learning map must be refreshed at intervals aligned with the operating model and the learning experiences updated to reflect industry developments and best practices.

    The education initiatives for Analytics practitioners are established with a process for continuous improvement. The success of the education initiatives must be measurable and monitored to ensure skills gaps are addressed on an ongoing basis.

    Questions
    • Is there an approach and plan to support organization-wide analytics education?
    • Have skills requirements across Analytics practitioner levels been defined and agreed upon?
    • Has a gap analysis to identify skills gaps been performed?
    • Have stakeholders reviewed and approved the learning map?
    • Does the learning map fulfill role definitions across all roles and levels?
    • Do the learning maps cover all types of learning?
    • Is there a defined process to refresh the learning map?
    • Has the education initiative been designed to provide practical and theoretical experiences?
    • Has a learning experience roadmap been defined?
    • Have success metrics been approved?
    • Is a feedback mechanism in place?
    • Is the Analytics Education Program aligned and supported by the organization’s central training function, if applicable?
    Artifacts
    • Analytics Education Approach and Plan
    • Analytics Roles and Responsibilities
    • Analytics Skills Requirements to Role Matrix
    • Analytics Role Learning Paths
    Scoring

    Not Initiated

    No formal Analytics Education Approach and Plan exist.

    Conceptual

    No formal Analytics Education Approach and Plan exist, but the need is recognized, and the development is being discussed.

    Developmental

    The Analytics Education Approach and Plan are being developed.

    Defined

    The Analytics Education Approach and Plan have been defined, reviewed, and approved.

    Achieved

    The Analytics Education Approach and Plan are implemented and understood across the organization.

    Enhanced

    The Analytics Education Approach and Plan are established as part of business-as-usual practice with a continuous improvement routine and are reviewed regularly.

    8.6.2 Analytics Change Management

    Description

    Analytics Change Management should be approached as a structured program with essential initiatives established. The focus is to understand the skills and behaviors the organization must develop to optimize the value of analytics. To support progression to the optimized future state for Analytics, the team can develop and implement prototype initiatives that promote and encourage the necessary skills and behaviors. Once these initiatives have demonstrated their value, they should be integrated and maintained as part of ongoing operations.

    Objectives
    • Develop an Analytics Change Management Approach in support of promoting improved analytics across the organization.
    • Establish the desired analytics skills and behaviors for the organization to optimize the value of analytics.
    • Develop a plan to support enabling the organization to improve analytics skills and behaviors.
    Advice

    Leveraging and promoting the analytics education program can be a primary activity to move an organization forward in the journey to improved analytics understanding and capability. Documenting and communicating successful analytics initiatives that promote the desired skills and behaviors can help to address gaps and continue to build awareness.

    It is best to work with the Analytics community to define the key performance indicators associated with the behaviors and build business case(s) for implementation of the solution as a business-as-usual activity. Obtain buy-in and sponsorship to drive Analytics initiatives. Following implementation, measure the change in behavior and its impact at regular milestones.

    Questions
    • Have the analytic skills and behaviors preferred by the organization been established?
    • Are successful analytics use cases documented and communicated to the organization?
    • Is analytics change management collaborating with the analytics education program to support analytics success?
    • Does a process exist to determine if new behaviors are sustainable and providing business value?
    • Has a process to measure behaviors over time been defined?
    Artifacts
    • Analytic Change Management Approach and Plan
    • Analytics Skill and Behavior Standards
    • Analytics Case Study(ies)
    • Analytics key performance indicators and scorecards, with targets
    • A documented process to determine sustainability and business value of new behaviors
    • Evidence of behavioral measurement
    Scoring

    Not Initiated

    No formal Analytics Change Management Approach exists.

    Conceptual

    No formal Analytics Change Management Approach exists, but the need is recognized, and the development is being discussed.

    Developmental

    A formal Analytics Change Management Approach is being developed.

    Defined

    The Analytics Change Management Approach has been defined, reviewed, and approved.

    Achieved

    The Analytics Change Management Approach is established and supports continued improvement in analytics skills and behaviors.

    Enhanced

    Analytics Change Management Approach is established as part of business-as-usual practice with a continuous improvement routine. Plans are regularly reviewed.
