Data Governance Policy

Effective from: 1 June 2023, Last updated: 1 June 2023 

1. Purpose

Textile Exchange (“Textile Exchange,” “we,” “us,” or “our”) data are organizational assets critical to supporting our mission – to inspire and equip people to accelerate the adoption of preferred materials – and they must be accessible, well managed, and properly secured. 


We envision an organizational data management system that provides secure, defined, quality data whenever and wherever needed in a cost-effective, reusable, and repeatable manner to ensure Textile Exchange remains the trusted authority of preferred materials in the textile industry. 


Our data governance mission is to define and manage a quality data resource that enables Textile Exchange to inspire and equip people to accelerate the adoption of preferred materials in line with the Climate+ goals. 

The Data Governance Policy establishes a framework to manage Textile Exchange data efficiently and effectively by: 

2. Scope

This policy applies to all staff, service providers, suppliers, and others (collectively, “users”) provided with access to Textile Exchange data and systems, and must be adhered to when processing all Textile Exchange data. 

The policy provides a framework to govern data and is supplemented by operational policies and procedures (i.e., standards, registers, guidelines, forms, and data agreements), which collectively operationalize data management in Textile Exchange. 

Operational Policies

  • Data Compliance Policy
  • Data Sensitivity Classification Policy
  • Data Access Policy
  • Data Quality Policy
  • Data Provider Engagement Policy
  • Data Security Policy
  • Data Security Incident Policy
  • Change Management Policy
  • Data Retention and Recovery Policy
  • Privacy Policy

3. Principles

The following principles set expectations for how data should be managed in Textile Exchange. They guide the consistent development of data governance policies and procedures across the organization.  

Textile Exchange recognizes that information is a corporate asset and must be identified, protected, used, and managed consistently throughout its life cycle, meeting required levels of information quality and in alignment with these data governance policies.  

Adherence to this policy is mandatory across all Textile Exchange departments that process (create/acquire/collect, use, transform, store) critical data. The data governance framework as defined in this policy applies to all data across the organization.   

Departments must monitor and manage the quality of the critical data that they originate. Clear accountabilities for managing data must be formalized by departments through the roles outlined in this policy. This ensures that critical data is managed consistently across the organization for effective governance.      

Information assets must be managed in keeping with their risk and value. Textile Exchange must have the means to measure and ensure that data quality standards are in place for defined critical data, and commensurate with the acceptable risk for data management. 

All processes and systems that handle critical data must have the means to ensure that data quality standards are met in the originating systems, as well as in the Enterprise Data Warehouse.  

Common standards and definitions are essential for data harmonization.  For all critical data, there must be a single, traceable authoritative source, clarity of ownership, and a documented data lineage. Where critical data is duplicated, Enterprise Data Warehouse sourcing principles will guide the selection of an authoritative source and placement of data. 

3.1 Data is an organizational asset. All structured data and unstructured data are organizational assets. Secure and effective management of these assets is critical to our success. 

3.2 Single Source of truth. Textile Exchange processes data in different ways from many sources. Creating a single source of truth for data is essential for data integrity and maximizing the value of data. 

3.3 Data processing must comply with the law. Textile Exchange will take all necessary steps to comply with relevant data-related rules, laws, and regulations, particularly the General Data Protection Regulation (GDPR). 

3.4 Data is of high quality and usability. Textile Exchange delivers quality (accurate, complete, reliable, relevant, and timely) data from which users can easily derive useful information to meet the organization’s strategic objectives. 

3.5 Data is safe and secure. Textile Exchange will take all necessary steps to ensure that data is protected from unauthorized access and loss, and that recovery measures are in place to manage data security incidents. 

3.6 Data is used properly. Textile Exchange encourages the use of data for assessments and decision-making. The use of data (both internally and shared externally) must be legal, ethical, and appropriate. 

4. Data Governance Framework

Textile Exchange’s data governance framework lays out the role delegations, rules of engagement, and normative resources (i.e. Data Governance RACI Chart, Data Governance Policies, Critical Data Identification, Data Performance Metrics, Business Data Glossary, Business Rules, Data Classifications, Data Identifiers, Data Inventory, Data Model, Data Value Chain, Data Processes/Procedures/Guidelines) required for the proper management of organizational data.  

The following diagram outlines how the data governance framework shall be implemented in Textile Exchange.

5. Data Governance Operating Model

Textile Exchange’s data governance operating model summarizes the roles of key data stakeholders within the organization. It is based on a hybrid approach that combines centralized governance with decentralized management of data. 

5.1 The centralized component is a single Data Governance Committee and Data Governance Office that governs all critical data.  

The committee, headed by the Executive Sponsor, comprises the relevant Heads of Department from across the organization alongside the Data Governance Director and the Data Governance Manager.  

The Data Governance Office is made up of the Data Governance Director, Data Governance Manager, and Data Custodians of key systems within the organization, including Benchmark, Data Intelligence, Data Analytics, Technology, Internal Systems, and Finance. The office has the primary responsibility for developing and gaining approvals for normative resources and implementing them across the organization. The office also manages communications, training, reporting, budgeting, and escalation of issues to ensure that data is managed consistently across the organization.  

5.2 The decentralized aspect is that all functional data management activities remain within the reporting lines of each department. The data stewards who manage data report to their Heads of Department while executing the standards, policies, and processes defined by the Data Governance Office and approved by the central Data Governance Committee. In departments with more than one Data Steward for a data domain, a Lead Data Steward should be appointed. 

6. Roles and Responsibilities  

The data governance roles defined below speak only to the responsibility of a position pertaining to data governance and do not necessarily correspond to the actual functional and hierarchical organizational structure in Textile Exchange. Depending on resource availability, one role may be assigned to multiple persons or multiple roles may be assigned to one person. Additionally, a role that rightly falls under the purview of an individual may be delegated to another.    

6.1 Data Governance Executive Sponsor provides strategic directives and oversight on the overall development and performance of the data governance framework. The Executive Sponsor is the final authority on data governance.  Responsibilities include: 

6.2 Data Governance Committee is chaired by the Executive Sponsor and comprises the relevant Heads of Department alongside the Data Governance Director and the Data Governance Manager. It reviews, approves, and resolves data governance policies as well as normative resources to consistently manage data across the organization. Responsibilities include: 

6.3 Data Governance Director leads the overall development, implementation, and update of the data governance framework. Responsibilities include: 

6.4 Data Governance Manager manages the development, implementation, and update of the data governance framework. Responsibilities include: 

6.5 Head of Department leads an operational unit accountable for the critical data of one or more data domains (e.g., Climate+, Standards & Assurance, Fibers & Materials, Benchmark, Membership, Impact Incentives). The Head of Department is accountable for executing the data governance framework and all data management activities within the unit, including access management.  Responsibilities include: 

6.6 Data Steward manages an operational unit for the critical data of a data domain, where data originates or is first collected, regardless of format (e.g., Preferred Fiber & Materials Report, Climate+ Impact Modelling, Assurance). The Data Steward is responsible for all data management and quality within the unit, as well as the full lifecycle of the data.  Responsibilities include: 

6.7 Data Custodian manages one or more systems (or technical environments) where data is stored. This can be a technology-managed application, a business-managed system, or a manual system. He/she processes, stores, and/or maintains data in a system and/or technical environment for a data domain. Responsibilities include: 

7. Responsible, Accountable, Consulted, Informed (RACI) Chart 

8. Critical Data Identification 

Not all data in Textile Exchange pose the same degree of risk.  This policy assigns accountabilities for critical data, defined as data that is governed by or required for risk management, regulatory compliance, and financial reporting, or data that is essential for business growth or senior leadership decision-making (e.g., key business leadership forums).  

Data domains that hold critical data for Textile Exchange are identified below: 

9. Data Value Chain & Processes 

9.1 Data Value Chain

The following diagram outlines Textile Exchange’s role in the data value chains for various data domains.  

9.2 Data Lifecycle 

The following diagram outlines key considerations to be taken at each stage of the data lifecycle.  

9.3 The role of assurance in data governance 

Each department is responsible for defining monitoring and evaluation data for the program under its purview: 

(a) Monitoring data means ongoing measurement of a set of indicators that are tracked regularly over time. The focus is generally on tracking the use of inputs, activities, outputs, and short-term outcomes of an intervention. (ISEAL) 

(b) Evaluation data means the comparison (evaluation) of actual results and impacts obtained against plans or objectives. Evaluations may look at efficiency, effectiveness, medium-term outcomes, or impacts. Unlike monitoring, each evaluation may be a stand-alone activity, looking at a different set of evaluation questions. (ISEAL) 

The Operational Compliance department plays a unique role in data governance in that it provides the program assurance and oversight by ensuring that (a) the program is implemented consistently, competently, and impartially, (b) the program risks are managed, (c) the assurance model is fit for purpose, and (d) the program assurance system is accessible and adds value to its stakeholders. To do so, Operational Compliance is responsible for legal compliance and defining assurance data required for a program.  

(c) Assurance data means demonstrable evidence that specified requirements relating to a program, product, process, system, person, or body are fulfilled (Adapted from ISO 17000).  

10. Communications and Training 

10.1 Communications. This policy will be communicated to stakeholders publicly via the organization’s website.  

10.2 Training. Textile Exchange should foster a data culture in the organization.  

(a) Data governance training should be provided as part of staff induction and continuous development. 

(b) All data users should be trained in relevant data governance policies before access is given. This should include understanding the potential consequences of non-compliance.  

11. Review Process

This policy will be reviewed and updated at least annually to keep pace with any data and system developments. In periods of rapid change, this policy may be modified and updated as needed to reflect current priorities.  

12. Escalations and Exceptions 

Data governance issues should first be escalated within a department. Unresolved issues may be escalated to the Data Governance Office, then the Data Governance Committee. The Executive Sponsor is the final authority over all data governance issues.   

There are no exemptions from implementing and sustaining data governance across Textile Exchange. All exceptions to this policy should be documented and brought to the Data Governance Director for approval. He/she may delegate decisions for non-material exceptions to members of his/her team, provided all exceptions are tracked and monitored. Affected departments should be able to review exceptions and provide feedback on the decision process.  Challenges to material exceptions may be referred to and resolved by the Data Governance Committee. 

It is understood that certain circumstances may require additional time and funding to meet the requirements of this policy.  The rationale for deferrals should be documented and approved, and accompanied by a resolution to be actioned within a reasonable period; otherwise, this policy must be revised to permit the exception condition to exist.  

13. Definitions

assurance data means demonstrable evidence that specified requirements relating to a program, product, process, system, person, or body are fulfilled (Adapted from ISO 17000). 

critical data is data that is governed by or required for risk management, regulatory compliance, and financial reporting; or data that is essential to support business growth or senior leadership decision-making (e.g., key business leadership forums).   

data means information, especially facts or numbers, examined, considered, and used in calculating, reasoning, discussion, planning, or decision-making. Data can be considered the building blocks of ‘information.’ ISO defines data as a “reinterpretable representation of information in a formalized manner suitable for communication, interpretation, or processing.” (Adapted from ISO/IEC 2382) 

data access means the ability to access, change or delete data elements stored in a repository such as a database.  

data culture means the collective behaviors and beliefs of people who value, practice, and encourage the use of data to improve decision-making. 

data domain means a high-level categorization of business data defined for the purpose of assigning accountability and responsibility (e.g. Certification, Benchmark, Materials Production). Although data domains may often appear to map to Textile Exchange’s organizational structure, they need not do so. Data domains may comprise smaller sets of data known as sub-domains if the need arises (e.g. scope certification data, organic cotton production data). 

data element means the fundamental data structure in a data processing system or any unit of data that has a precise meaning or precise semantics, such as certified organization name and land area.  

business data glossary means the inventory of business terms, definitions, and taxonomy within a data domain.   

data governance framework means a collection of policies, processes, and role delegations that ensures the integrity, security, and compliance in an organization’s enterprise data management. Synonymous: data governance program. 

data governance means the overall management of the availability, usability, integrity, and security of the data employed in an organization. (ISEAL) 

data governance program means a governing mechanism, which includes a defined set of procedures and a plan to execute those procedures. (Adapted from ISEAL) 

data harmonization is the method of unifying disparate data fields, formats, dimensions, and columns into a composite dataset. 

data identifier means a language-independent label, sign or token that uniquely identifies an object within an identification scheme. 

data inventory means an inventory of all the data assets (source data along with its metadata) maintained by Textile Exchange as well as the information regarding its scope, source, provider, and storage location. 

data lineage is the description of the movements and transformations of data from point of origin or derivation to consumption.   

data process or processing means any operation or set of operations that are performed on data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure, dissemination, erasure or destruction. 

data provider means the entity that is disclosing data for processing and/or use by the data recipient. 

data quality determines the level of confidence in Textile Exchange data and is defined in the context of seven data quality dimensions.  It speaks to dimensions such as completeness, validity, consistency, timeliness, and accuracy that make data appropriate for a specific use.  It is a perception or an assessment of the information’s fitness to serve its purpose in a given context. Data quality is affected by the way data is entered, stored, and managed.  Data quality includes defined data quality thresholds that determine whether data quality is within required tolerances. 

data quality dimensions are based on best practices and include: 

data quality rule means the logic, including a pass/fail criterion, used to analyze the quality of data. For example, under the rule “year must be 4 digits,” a data element with the value “2022” or “2023” passes, while “22” fails. 

data quality standard means the objective and the overall scope of the data quality management framework defined with reference to specific data quality dimensions.  

data quality threshold means the minimum acceptable pass rate for data quality rules. For example, a 100% threshold means that there is no tolerance for errors. 
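The interplay between a data quality rule and a data quality threshold can be sketched as follows. This is a minimal illustration only; the specific rule (“year must be 4 digits”), the field values, and the 95% threshold are assumptions for the example, not requirements defined by this policy.

```python
# Illustrative sketch: evaluating a data quality rule against a threshold.
# The rule and the 95% threshold are example values, not policy requirements.

def year_rule(value: str) -> bool:
    """Pass/fail criterion: the year must be exactly 4 digits."""
    return len(value) == 4 and value.isdigit()

def pass_rate(values, rule) -> float:
    """Share of values that satisfy the rule."""
    return sum(rule(v) for v in values) / len(values)

years = ["2022", "2023", "23", "2021"]
rate = pass_rate(years, year_rule)   # 3 of 4 values pass -> 0.75
meets_threshold = rate >= 0.95       # below the assumed 95% threshold
```

In this sketch the dataset fails the assumed threshold, which would flag the data as outside required tolerances.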

data security incident means an accidental or deliberate event that results in or constitutes an imminent risk of the unauthorized access, loss, disclosure, modification, disruption, or destruction of Textile Exchange data, particularly personal data. 

data taxonomy means a hierarchical structure separating data into specific classes based on common characteristics. The taxonomy represents a convenient way to classify data to prove it is unique and without redundancy. This includes both primary and generated data elements. 

data user means any authorized entity or individual that has been granted access rights to Textile Exchange data and/or systems to perform an agreed set of activities.  

disclosure means the voluntary reporting or sharing of data publicly or to a specific third party.  

evaluation data means the comparison (evaluation) of actual results and impacts obtained against plans or objectives. Evaluations may look at efficiency, effectiveness, medium-term outcomes, or impacts. Unlike monitoring, each evaluation may be a stand-alone activity, looking at a different set of evaluation questions. (ISEAL) 

full backup means backup of the entire database system (including transaction log).   

identifiable data means any data that can be used to distinguish or trace an individual or entity’s identity and any information that is linked or linkable to an individual or entity.  

incremental backup means backup of only the changes that have been made since the last incremental backup. 
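The difference between the two backup definitions can be sketched as follows. This is an assumed illustration only: using the file modification time as the change detector, and the function names, are conveniences for the sketch, not mechanisms defined by this policy.

```python
import os

# Illustrative sketch only: selecting files for a full vs. an incremental
# backup. The modification-time heuristic is an assumption for the example,
# not a mechanism defined by this policy.

def files_for_full_backup(paths):
    """A full backup includes every file, regardless of when it changed."""
    return list(paths)

def files_for_incremental_backup(paths, last_backup_time):
    """An incremental backup includes only files changed since the last
    backup (full or incremental)."""
    return [p for p in paths if os.path.getmtime(p) > last_backup_time]
```

A full backup is self-contained, while restoring from incremental backups requires the last full backup plus every incremental backup taken since.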

information means knowledge concerning objects such as facts, events, things, processes, ideas, or concepts that, within a certain context, have a particular meaning. (Adapted from ISO/IEC 2382) 

metadata is essentially data about data. It is used to describe characteristics such as content, quality, format, location, and contact information of physical or digital data.  Metadata ensures data can be discoverable, citable, reusable, and accessible in the long term. 

monitoring data means ongoing measurement of a set of indicators that are tracked regularly over time. The focus is generally on tracking the use of inputs, activities, outputs, and short-term outcomes of an intervention. (ISEAL) 

payload means the carrying capacity of a packet or other transmission data unit. 

periodicity means the frequency of data transfer by the data provider to Textile Exchange.  

personal data means data in any format that relates to an identified or identifiable living person. An identifiable living person is someone who can be identified directly or indirectly from an identifier such as a name, an identification number, location data, an online identifier, or one or more factors specific to the physical, physiological, genetic, mental, economic, cultural, or social identity of that person. 

project means a temporary endeavor undertaken to create a unique product, service, or result. (PMBOK® Guide) 

program means a group of related projects managed in a coordinated way to obtain benefits and control not available from managing them individually. Programs may contain elements of work outside of the scope of the discrete projects in the program. (PMBOK® Guide) 

raw data means the original data (record), which can be described as the first capture of information, whether recorded on paper or electronically. Synonymous: Source data. 

record means information created, received, and maintained as evidence and as an asset by an organization or person, in pursuit of legal obligations or in the transaction of business. (Adapted from ISO 15489-1:2001) 

retention means the agreed length of time an organization will keep different types of records. Retention schedules are policy documents that support compliance with legislative and regulatory requirements. 

sensitive personal data means personal data on racial/ethnic origin, commission/alleged commission of an offense, political opinions, religious or philosophical beliefs, trade union membership, genetic/biometric data, data concerning health, or data concerning a natural person’s sex life/sexual orientation.   

service provider means any individual or entity that is contracted to process Textile Exchange’s data or to develop, maintain, or update its systems.  

single source of truth (SSOT) is the practice of aggregating the data from many systems within an organization to a single location. A SSOT is not a system, tool, or strategy, but rather a state of being for an organization’s data in that it can be found via a single reference point.  

system means any information or data system designed to collect, process, store, and distribute data, such as technology platform, database, data warehouse, websites, applications, computer hardware, and computer equipment.