Data is a known fact or collection of known facts, which can include numbers, words, sounds or images.
At UNSW, data includes all administrative, learning, teaching and research data.
About data management
Data management is the set of activities involved in planning, developing, implementing and administering systems for the acquisition, storage, security, retrieval, dissemination, archiving and disposal of data. Such systems are commonly digital, but the term applies equally to paper-based systems, where the practice is usually called records management.
Data management covers all forms of data, whether the datasets are simple paper forms, the contents of relational databases, multimedia datasets such as images, or research data.
Key data management activities include:
- Data policy development
- Data ownership
- Metadata compilation
- Data lifecycle control
- Data quality
- Data access and dissemination
Why is good data management important?
Effective data management underpins top-level aims of good stewardship of public resources and responsible communication of research results.
Storing the right data in the right place protects staff, teaching and research work, and the University’s reputation. Failing to do so may mean recreating documents, redoing experiments, paying to repeat procedures, hours of tedious administrative clean-up and searching through multiple backup tapes, or even retracting a published paper.
The benefits of good data management
Aside from complying with data storage requirements associated with institutional, funding and publishing bodies, there are many benefits to making the effort to establish strong data management plans and adopt good data storage practices for your activities.
UNSW Data Management Policies and Procedures
UNSW staff must manage their data in accordance with the following UNSW policies and procedures:
- Data Governance Policy
- Data Breach Policy
- Data Classification Standard
- Data Handling Guideline
- Recordkeeping Policy
- Recordkeeping Standard
- zID Usage Guideline
- IT Security Policy - Information Security Management System (ISMS)
- IT Security Standard - Secure Algorithm (SAL)
Researchers should refer to the Research Data Management at UNSW page for information on research data management obligations.
Measurements of Data Quality
Every organisation is unique, but there are a number of quantitative Data Quality measures that are universal:
- Completeness: The degree to which all required occurrences of data are populated
- Uniqueness: The extent to which all distinct values of a data element appear only once
- Validity: The measure of how well a data value conforms to its domain value set (i.e. a set of allowable values or a range of values)
- Accuracy: The degree of conformity of a data element or data set to an authoritative source deemed correct, or the degree to which the data correctly represents the truth about a real-world object
- Integrity: The degree of conformity to defined data relationship rules (e.g. primary/foreign key referential integrity)
- Timeliness: The degree to which data is available when it is required
- Consistency: The degree to which a unique piece of data holds the same value across multiple data sets
- Representation: The format, pattern, legibility, and usefulness of data for its intended use

In addition to quantitative measures, Data Quality should also be assessed with qualitative measures. Some examples include:
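Several of the quantitative measures above can be computed directly once you define what "populated", "distinct" and "allowable" mean for a given field. The sketch below illustrates completeness, uniqueness and validity over a toy record set; the records, field names and allowable-value domain are illustrative assumptions, not UNSW data.

```python
# Illustrative sketch: computing three quantitative Data Quality measures
# (completeness, uniqueness, validity) over a small, made-up record set.

records = [
    {"zid": "z1234567", "faculty": "Engineering"},
    {"zid": "z7654321", "faculty": "Science"},
    {"zid": "z1234567", "faculty": "Law"},        # duplicate zid
    {"zid": None,       "faculty": "Medicine"},   # missing zid
    {"zid": "z9999999", "faculty": "Astrology"},  # value outside the domain
]

# Assumed domain value set for the "faculty" field.
VALID_FACULTIES = {"Engineering", "Science", "Law", "Medicine"}

def completeness(rows, field):
    """Share of rows in which the field is populated."""
    return sum(1 for r in rows if r[field] is not None) / len(rows)

def uniqueness(rows, field):
    """Share of populated values that appear exactly once."""
    values = [r[field] for r in rows if r[field] is not None]
    return sum(1 for v in values if values.count(v) == 1) / len(values)

def validity(rows, field, domain):
    """Share of populated values that fall inside the allowed domain."""
    values = [r[field] for r in rows if r[field] is not None]
    return sum(1 for v in values if v in domain) / len(values)

print(f"completeness(zid) = {completeness(records, 'zid'):.2f}")   # 4 of 5 rows
print(f"uniqueness(zid)   = {uniqueness(records, 'zid'):.2f}")     # 2 of 4 values
print(f"validity(faculty) = {validity(records, 'faculty', VALID_FACULTIES):.2f}")
```

In practice these checks would run against database tables rather than in-memory lists, but the definitions of the measures carry over unchanged.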
- Business Satisfaction Measures: The increase or decrease in business satisfaction, based on surveys
- Collaboration/Improved Productivity Measures: The percentage of times the Data Governance Council detected and eliminated redundant intra- or inter-departmental projects or initiatives
- Business Opportunity/Risk Measures: The business benefit gained from quality data, or the business risk realised due to questionable data; for example, an increase in competitive analytics due to data availability and Data Quality improvements
- Compliance Measures: The degree to which access to update or influence master data is restricted to employees who need it, and have been approved for it, as part of their job functions
It is essential to identify the measures of Data Quality that matter most to UNSW. This is required to establish a baseline for the quality of your data and to monitor the progress of your DQM initiatives. The other foundational components of the Data Quality Cycle are to discover, profile, establish rules, monitor, report, remediate, and continuously improve Data Quality.
Components of DQM
Once in place, these key components provide robust, reusable, and highly effective DQM capabilities that can be leveraged across the enterprise:
- Data Discovery: The process of finding, gathering, organising, and reporting metadata about your data (e.g. files/tables, record/row definitions, field/column definitions, keys)
- Data Profiling: The process of analysing your data in detail, comparing it to its metadata, calculating data statistics, and reporting the measures of quality for the data at a point in time
- Data Quality Rules: The business and technical rules, derived from the business requirements for each Data Quality measure, that the data must adhere to in order to be considered of high quality
- Data Quality Monitoring: The ongoing monitoring of Data Quality: executing the Data Quality rules, comparing the results to defined error thresholds, creating and storing Data Quality exceptions, and generating appropriate notifications
- Data Quality Reporting: The reports, dashboards, and scorecards used to report and trend ongoing Data Quality measures and to drill down into detailed Data Quality exceptions
- Data Remediation: The ongoing correction of Data Quality exceptions and issues as they are reported
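The rules-monitoring-reporting loop described above can be sketched in a few lines: each rule is a named check with an error threshold, monitoring executes the rules and compares failure rates to the thresholds, and reporting flags the breaches. The rules, thresholds and sample table below are illustrative assumptions, not actual UNSW rules.

```python
# Illustrative sketch of the rules -> monitoring -> reporting loop.
from dataclasses import dataclass
from typing import Callable

@dataclass
class DataQualityRule:
    name: str
    check: Callable[[dict], bool]   # returns True if a row passes the rule
    error_threshold: float          # maximum tolerated failure rate

# A made-up table of course records with two deliberate quality problems.
rows = [
    {"course_code": "COMP1511", "enrolled": 812},
    {"course_code": "",         "enrolled": 40},   # missing course code
    {"course_code": "MATH1131", "enrolled": -3},   # negative enrolment
]

rules = [
    DataQualityRule("course code populated",  lambda r: bool(r["course_code"]), 0.0),
    DataQualityRule("enrolment non-negative", lambda r: r["enrolled"] >= 0,     0.10),
]

def monitor(rows, rules):
    """Execute each rule and compare its failure rate to the threshold.
    Returns (rule name, failure rate, threshold breached?) tuples."""
    report = []
    for rule in rules:
        failures = [r for r in rows if not rule.check(r)]
        rate = len(failures) / len(rows)
        report.append((rule.name, rate, rate > rule.error_threshold))
    return report

for name, rate, breached in monitor(rows, rules):
    status = "ALERT" if breached else "ok"
    print(f"{status:5} {name}: {rate:.0%} failures")
```

A production implementation would store the exceptions and send notifications rather than printing, but the structure (rule definitions, threshold comparison, exception reporting) mirrors the components listed above.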