Data Maturity Framework For The Not-For-Profit Sector



Data Orchard | info@dataorchard.org.uk | www.dataorchard.org.uk

Version 2 created by Data Orchard CIC September 2019, replacing version 1 published in January
2017 in partnership with DataKind UK. Freely available for non-profit use, licensed under Creative
Commons Licence: Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
The Data Maturity Framework

[Diagram: the seven key themes (Uses, Data, Analysis, Leadership, Culture, Tools, Skills) arranged around the five stage journey: Unaware, Emerging, Learning, Developing, Mastering.]

7 KEY THEMES | 5 STAGE JOURNEY
Data Maturity Themes

USES
• Purposes for collecting and analysing
• Benefits and rewards

DATA
• Collection
• Quality
• Sources
• Assets

ANALYSIS
• Type
• Technique
• Joining
• Presenting

LEADERSHIP
• Attitudes
• Plans
• Capability
• Investment

CULTURE
• Team approach
• Self-questioning
• Openness
• Protection

TOOLS
• Storage
• Type
• Quality
• Sharing
• Integration

SKILLS
• Capacity
• Skills
• Training
• Access to knowledge and expertise
USES Purposes for collecting and analysing data | Benefits and rewards

UNAWARE (SCORE 0-1)
• Records basic client information and activities/work delivered in order to operate at a basic level.
• Little or no benefits or rewards in relation to services.
• Collect and use data for requisite purposes e.g. basic financial management and legal/funder/contract compliance reporting.
• Continued funding may be seen as the only reason for collecting data.

EMERGING (SCORE 1-2)
• Collect more data than required by legal/funders/contracts.
• Most data collection and analysis relates to capturing activities, measuring outputs, and basic financial analysis and forecasts.
• Rewards mostly around improved understanding of beneficiaries and ability to articulate the scale of activities delivered.
• Raising income and understanding fundraising performance likely to be key focus for additional data collection e.g. fundraising events, donors, grants, contracts, and sales.
• Able to feed back information to funders, partners and networks around specific projects/services and the scale of activities. May struggle with multiple and/or repeat service use data.
• Starting to explore the difference between outputs and outcomes.

LEARNING (SCORE 2-3)
• Collect data to be able to understand and evidence the types of clients/needs and problems the organisation addresses. Uses both internal and reliable external data sources to do so.
• Starting to use data to understand different ways beneficiaries initially contact and engage with services over time (or not).
• Able to demonstrate activities being delivered for specific types of users across a range of projects and services.
• Capturing some outcomes data and learning to measure outcomes consistently.
• Starting to capture feedback, monitoring and evaluation data on service/product quality and performance to inform improvements.
• Use data for income generation and some forecasting of sales and donations leading to more effective fundraising and commercial income.
• Starting to lead conversations with funders and partners using data.
• Building internal knowledge and expertise based on the analysis of data and dialogue on how to act on this.
• Strategic planning, particularly around efficiency and service development, is becoming more data informed.

DEVELOPING (SCORE 3-4)
• All client, activity, output, and outcomes data is routinely collected.
• Services/products/campaigns are monitored to show performance on how, when and where these are used and by whom.
• Services/products/campaigns are starting to be targeted at specific demographics and/or geographic locations in response to better understanding of needs/problems.
• Services/products/campaigns are regularly reviewed and adapted in response to data to optimise outcomes.
• Beginning to test assumptions on how the organisation has an impact and for whom.
• Starting to differentiate between approaches and understand what's working and what's not.
• Able to differentiate and explain attribution and contribution in evidencing and communicating impact.
• Exploring and learning how to measure long term impact.
• Data is starting to be used to inform efficiency savings (resources, processes and service/product design).
• Can coherently make the case to funders/investors/clients for existing and new services/products/campaigns.
• Learning and evaluation is embedded internally and informs both service and staff/volunteer performance.
• Strategic planning and decision making is becoming considerably data informed.

MASTERING (SCORE 4-5)
• Data is used extensively, and in inter-related strategic ways, for a wide range of purposes.
• Sophisticated use of analysis delivers insights and predictions to influence service and organisational success.
• Evidencing and improving outcomes and impact is a primary focus. Experiments to identify differentiated impact and how to optimise this.
• Predict user needs and service/product options based on understanding client behaviours and how to influence these for the best outcomes.
• Design and delivery of services/products/campaigns is optimised at an individual/personal level.
• Services and interventions are highly targeted, possibly in collaboration with other partners/service providers.
• The organisation is embedded in networks of knowledge and research in the context of its work. Partnerships and networks are strengthened through collaborative data sharing.
• Use data to increase efficiencies (resources, processes, services/product delivery).
• Robust evidence ensures credibility and is used to influence external policy and decision makers.
• Learn, evaluate and build knowledge: harness data for continuous improvement of products/services/campaigns.
• Strategic planning and decision making is highly informed by data and based on past, present and future analyses.
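The five score bands used throughout the framework (0-1 Unaware through 4-5 Mastering) can be expressed as a simple lookup. A minimal sketch in Python; the function name and the tie-breaking rule for boundary scores are assumptions of this example, not part of the framework:

```python
# Map a 0-5 maturity score to its stage, following the framework's bands
# (0-1 Unaware, 1-2 Emerging, 2-3 Learning, 3-4 Developing, 4-5 Mastering).
# Assumption: a boundary score (e.g. exactly 2.0) is assigned to the higher
# stage; the framework itself does not specify tie-breaking.

STAGES = ["Unaware", "Emerging", "Learning", "Developing", "Mastering"]

def stage_for_score(score: float) -> str:
    if not 0 <= score <= 5:
        raise ValueError("score must be between 0 and 5")
    # int(score) picks the band; cap at 4 so a score of exactly 5 is Mastering
    return STAGES[min(int(score), 4)]
```

For example, an organisation scoring 2.4 on a theme would sit in the Learning stage for that theme.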
DATA Collection | Quality | Sources | Assets

UNAWARE (SCORE 0-1)
• Limited data (if any) collected.
• Collected manually, mostly on paper, only when needed for specific purpose.
• Data isn't meaningful or useful to the organisation.
• Infrequently updated (if ever).
• Not checked for validity or accuracy.
• Data isn't shared internally.
• No external data sources used.
• Nobody is aware or interested in the data assets in the organisation.

EMERGING (SCORE 1-2)
• Data collection is both on paper and in digital forms though there may be inconsistencies and inefficiencies in the approach.
• Staff and volunteers may not be trained in data collection.
• Data is regarded as meaningful or useful primarily for meeting external legal/funder/contract requirements.
• It may be patchy and inconsistent with gaps and duplications.
• Data is rarely updated and cleaned.
• Mixed levels of confidence and trust in data.
• Starting to find out what data is available internally. Know where most data assets are but there may be more squirrelled away in parts of the organisation.
• Occasional use of external information sources relating to the wider context of the organisation's work.

LEARNING (SCORE 2-3)
• Though errors remain, data collection methods and processes are being improved.
• Staff and volunteers are trained in data collection.
• Data is reviewed to assess how relevant, meaningful, and necessary it is.
• The organisation knows how good or bad its different data sets are, and knows which data sources can/can't be trusted.
• Gaps, overlaps, and mismatches in the available data have been identified.
• Data becoming richer, more relational and therefore versatile.
• Starting to explore how internal data (perhaps in multiple, fragmented or isolated locations) can be shared and utilised.
• Additional internal and external data is sourced.
• All data assets are known but may not be formally recorded.

DEVELOPING (SCORE 3-4)
• Data requirements defined and consistently collected via a range of methods.
• Investment in staff and volunteer training to ensure consistency and quality of data collection.
• Tests and refines how relevant, meaningful, and necessary its data is.
• Focuses on collecting the right data for clearly specified purposes.
• Data is monitored for quality including completeness, accuracy, and validity. Tools and systems exist for cleaning and maintenance.
• Richer data collection with more integration/alignment between systems reduces duplication, inefficiency and error.
• Data sharing internally is becoming the norm.
• Occasional commissioning of independent research and evaluation.
• Extensive use of externally available research and evidence around needs and successful interventions.
• Open datasets are occasionally used.
• Exploring shared measures and benchmarks with other organisations.
• Recorded list of all data assets, whether they include personal/sensitive data, and assigned responsibility.

MASTERING (SCORE 4-5)
• Focuses effort on collecting the minimal amount of the most important and valuable data.
• Staff and volunteers are trained in data collection and collection is automated where possible.
• Knows all its data is relevant, meaningful and necessary.
• Monitors and fully understands the quality of the data it holds and hence has high levels of confidence and trust in its data.
• Data is versatile and re-usable for multiple purposes and audiences.
• Invests in resources to collect, clean, maintain, and manage data well across the organisation.
• Shares data internally from different teams, departments and services.
• Compares its data with other organisations through shared measures and benchmarks.
• Uses publicly available external research (e.g. government/academic) and may contribute to this.
• Regular use of valuable open/public data sets.
• Commissions external independent research and evaluation.
• Maintains full inventory of data assets across the whole organisation with clearly defined variables, ownership, review periods, and development plans for each.
ANALYSIS Type | Technique | Joining | Presenting

UNAWARE (SCORE 0-1)
• Limited analysis of financial, contract-related, and legal compliance data.
• Mainly simple manual counts from paper forms or digital records are used.
• Strategic discussions are not informed at all by analysis of data.
• Data is not used in reports. Anecdotes and observations are preferred.
• People verbally report on data as part of strategic discussions.

EMERGING (SCORE 1-2)
• Analyses starting to explore service users/clients/customers and target audiences.
• May include use of external research relating to the context e.g. to evidence scale of need/problems.
• Basic analysis, using counts and spreadsheets.
• Analysis is mostly descriptive about what happened e.g. summarising the overview, averages, variation, range.
• Analysis and report creation skills are being explored though results may be variable.
• Bar charts and pie charts may be the only types of analytical presentations.

LEARNING (SCORE 2-3)
• Whole-organisation analyses are beginning to be performed on an ad-hoc basis.
• Beginning to focus on what analysis is meaningful and useful.
• Starting to identify what data should be routinely analysed and potentially automated.
• Comparative trend analysis is starting to be conducted over time (perhaps on an annual basis).
• Filters and cross tabs are being explored and used to delve further into data.
• Data is manually collated in reports using data from different sources.
• Data is manually reworked for presentation in written reports for different internal/external audiences.
• Learning how to create more sophisticated graphs and presentations of the data (though audiences may find them difficult to interpret and understand).

DEVELOPING (SCORE 3-4)
• Embedding systems for analysing meaningful and useful data.
• More consistent and regular approach to data reporting and trends analysis on users/needs, activity, outcomes and impact.
• Monitors what's happening in the present as well as what's happened in the past. Some forward looking analysis/forecasts may challenge views of future performance.
• Analysis is more diagnostic about where/why things happened e.g. exploring root causes, clustering patterns, anomalies, discovering differences and trends. Some attempts at A/B testing. Occasional use of predictive analytics in some areas e.g. timeseries/forecasting.
• Aware of difference between correlation and causality.
• Routine data analysis is partially automated and partially manually collated from different sources.
• Presentation and communication of data is honed to ensure its meaning is understood.
• Some use of dashboards and/or business intelligence systems.
• Beginning to explore interactive data visualisation.
• Both static and real-time dynamic reporting conducted for different audiences, some may be available for non-specialists to independently access.

MASTERING (SCORE 4-5)
• Analysis extends beyond the organisation to its wider context with cooperative analyses performed with partners/other agencies.
• Analysis extends through descriptive, diagnostic, predictive, and prescriptive techniques.
• May conduct Randomised Control Trials.
• More focus on what will happen in the future with forecasting and predictive models to plan for the future needs of beneficiaries, target services, increase income, and maximise impact.
• Advanced prescriptive analytics support decision-making on how to do things in the best way e.g. optimisation of location, staff/volunteer capacity, recommending decisions for effective intervention, experimental design, simulation, artificial intelligence.
• Data from different sources is brought together and analysed in an automated way to provide an organisation-wide analysis.
• Dashboards, business intelligence systems and visualisation tools draw from a data warehouse.
• Data visualisation delivers meaningful analysis to different internal and external audiences.
• Non data specialists are able to interactively explore, analyse and report on the organisation's data.
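The descriptive analysis described at the Emerging stage (counts, averages, variation, range) needs nothing more than a spreadsheet, but as a sketch it can also be done with the Python standard library. The attendance figures below are invented purely for illustration:

```python
import statistics

# Hypothetical monthly attendance counts for one service (invented data)
attendance = [34, 41, 29, 52, 47, 38, 44, 31, 49, 36, 40, 45]

summary = {
    "count": len(attendance),                      # how many data points
    "mean": statistics.mean(attendance),           # average
    "stdev": statistics.stdev(attendance),         # variation
    "range": max(attendance) - min(attendance),    # range
}
```

These four numbers are exactly the "overview, averages, variation, range" summary the framework describes; the diagnostic and predictive stages build on them rather than replace them.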
LEADERSHIP Attitude | Plans | Capability | Investment

UNAWARE (SCORE 0-1)
• Not interested in data at all and not seen as a priority.
• No business plan or plans around data.
• Don't use data for decision making, instead rely on gut feeling, experience and what seems to work.
• No data or analytics expertise or understanding among leadership.
• Don't invest in data and analytics.

EMERGING (SCORE 1-2)
• Some recognition of the importance of data but don't see the value of it.
• Little awareness of the potential uses of data so not seen as a priority.
• There may be a business plan though perhaps with no defined or measurable organisational goals.
• Limited or very basic data and analytics knowledge and experience among leadership.
• Typically use data about what happened in the recent past and verbal accounts of what's happening for decision-making.
• Very little investment though some may occur under the radar within specific projects/parts of the organisation.
• People are expected to learn 'on the job' and there's no investment in specific roles, tools, or training.

LEARNING (SCORE 2-3)
• Know data is important, curious to learn about its potential uses and benefits.
• Leadership occasionally ask questions about the data they are given but are not entirely convinced about its value.
• Data is an interest of the organisation but not a priority.
• There is a business plan with some defined and measurable goals, though data collection/analysis may not align.
• Learning about impact e.g. exploring theory of change.
• Learning through experience, building 'adequate' skills.
• Assessing data and analytics skills, knowledge, and roles across the organisation.
• Might use past and current data for decision making with some simple trend analysis.
• Invest small amounts in some basic/existing tools and staff training on an ad hoc basis.
• Exploring ideas and needs for the future though no formal plan related to data.

DEVELOPING (SCORE 3-4)
• Becoming engaged, supportive, ask the right questions of the data, and active in harnessing its value.
• Data is becoming more of a priority for the organisation as a whole (and considerably so in some projects/teams).
• Data is becoming aligned to overarching business plan and desired impact.
• Monitors what's happening in the present as well as past trends. Some exploratory forward-looking research and predictions.
• Data champion within senior management.
• Addressing skills gap in leadership as a whole including understanding around impact measurement and management.
• Starting to plan and prioritise data organisation-wide.
• Beginning to commit significant investment in people (job roles), skills, learning, and tools.

MASTERING (SCORE 4-5)
• Value, plan and prioritise data and analytics as a vital organisational resource.
• Fully understand how to use data to improve what the organisation does. Data drives questions and the organisation is influenced by what data tells them.
• Viewed as a major organisational priority.
• Overarching business plan with clearly defined goals based on outcomes and differentiated impact, forecasting, and prediction of need.
• Use past, present and forward looking data for business planning and decision making (including forecasting, modelling, prediction and optimisation).
• Range of people with data and analytics expertise in leadership including at Board level.
• Invest substantially and continuously in improving data collection and analysis aligned to business plan.
CULTURE Team approach | Self-questioning | Openness | Protection

UNAWARE (SCORE 0-1)
• Nobody is interested in data.
• Data only accessible to a single person or team, usually junior staff.
• Opinion, observation, passion, and belief are used for decision making.
• Data is seen as a chore with questions/requirements mostly externally driven.
• Data is never shared internally or externally.
• There are no policies related to data.
• Minimal, if any, security and protection of data on paper, computers, or devices.
• Data never shared externally except for legal/contractual reporting purposes.

EMERGING (SCORE 1-2)
• Data is seen as the responsibility of 'someone else', usually in an administrative role.
• Recognition that data should be collected but it is not seen as a 'whole team' activity.
• Data mostly sought out and used to support and evidence what the organisation already believes or knows.
• Organisation's culture doesn't encourage data sharing across teams, though this may occur occasionally verbally or via reports.
• Basic policies for data protection and security may be in place but not monitored or enforced.

LEARNING (SCORE 2-3)
• Data is starting to be recognised as important at a more senior level.
• People across the organisation are starting to talk about how they can work together to deliver better data for decision making.
• Beginning to ask challenging questions of the data: why? what's changing? what difference are we making? (mostly about the past.)
• Access to data may be limited by default rather than design.
• People would like to share more but may be constricted by access/permissions/cultural barriers.
• Some data insights are shared with partners, networks, and in the public domain.
• Data protection and security policies are in place.
• Senior management have some limited understanding of legislation and the organisation's responsibilities.

DEVELOPING (SCORE 3-4)
• The whole organisation is starting to use and share data. People from different teams/levels of seniority regularly discuss data and how to act on it.
• Specialist staff in some teams are starting to use data to ask difficult questions, challenging assumptions, practices and impact.
• Concepts of right and wrong (ethics) are being considered, particularly in relation to personal data.
• Data and analysis is becoming more available and accessible to staff though may require some intervention by specialists to provide this.
• External data sharing is done on an aggregated basis and insights are shared including some shared measures and benchmarks.
• Exploring how data could be shared with beneficiaries on an individual basis as part of service delivery.
• Policies and practices are well established to ensure data is safeguarded (e.g. rules on passwords, how data is stored, rights/privileges to access organisational and beneficiary data).
• Risks have been identified though may not have been tested.
• Systems have been created to ensure data about identifiable individuals is deleted when no longer necessary and to respond to subject access requests.
• Trustees and senior management keep abreast of current legislation and best practice.

MASTERING (SCORE 4-5)
• Data seen as a team effort and critical asset for every part of the organisation.
• Everyone in the organisation is committed to ensuring quality data is available to support decision-making.
• Very comfortable using data to ask difficult and complex questions, to challenge practices and preconceived notions about the past and future.
• Aware of the practical difference between: correlation and causality; attribution and contribution; 'known and unknown' unknowns.
• Explores potential negative impacts of interventions as well as data ethics.
• Internal openness and data sharing is fundamental to the culture, subject to data protection/security.
• Everyone can access analysis they need when they need it.
• Data insights/evidence publicly available.
• Extensive data sharing, with protocols in place with partners, networks, stakeholders to address shared problems and solutions.
• Data may be shared with beneficiaries as part of service/support.
• High levels of confidence about the security of data held in the organisation.
• Systems, automated if possible, in place to delete personal data no longer necessary and respond to subject access requests.
• Risks monitored and tested to improve data security and protection.
• Widespread knowledge/skills sharing.
• Trustees and senior management keep abreast of future changes in legislation and best practice, and regularly check Data Protection compliance.
TOOLS Storage | Type | Quality | Sharing | Integration | Investment

UNAWARE (SCORE 0-1)
• Data is disorganised and unmanaged and stored in a range of places: on desks, in filing cabinets, individual people's email inboxes, computers, phones, or other devices.
• Nobody has any formal responsibility for managing any data in their job. Any organising, archiving, updating or data cleaning is performed in an ad hoc way by individuals.
• Data is mostly collected via paper, email, SMS messages. This may not get transferred onto spreadsheets or structured document filing systems.
• Typically uses a spreadsheet/s or basic finance software package, though not in any analytical way.
• No data sharing happens in the organisation.
• No planned investment of time or money in any tools, systems, or infrastructure.

EMERGING (SCORE 1-2)
• Data is stored in designated physical locations, in paper and digital formats, in organised ways.
• Some data is managed and controlled by people with clear responsibility for maintenance and cleaning (e.g. administration, finance, HR, data managers).
• Data is mostly collected on paper/phone/in person and then entered into a database or spreadsheet for basic analytical and reporting tasks.
• Unstructured data is disparately held on different individuals' computers and devices.
• Data may be shared mostly by emailing spreadsheets and documents as attachments with duplication, version control, and security issues.
• Some people or teams may use cloud-based document storage to share some data (e.g. OneDrive, Google Drive, Dropbox, Box). Note these may be personally or organisationally owned.
• Tools are limited. May not be up-to-date, don't meet current needs, and are not documented or supported.
• Tools are acquired on a 'needs-must' basis e.g. for a specific purpose/project or when a crisis/system failure occurs.

LEARNING (SCORE 2-3)
• Digital data is mostly centrally stored on a (secured, backed-up) cloud-based system or local server with managed access. Some may remain inaccessible on computers, central shared drives or devices.
• Unstructured data is becoming better organised and searchable (e.g. folder structures/file naming conventions).
• Data held in a range of systems all separately managed. Tools likely to include at least one database or CRM system. There are also likely to be multiple spreadsheets in use.
• Tools mostly used operationally rather than analytically. May allow some basic inbuilt analysis and reporting but most often data has to be exported for analysis in another tool. Possible advanced analytical tool used for basic data processing or descriptive statistical analysis.
• Joining data or analysis across teams/services/functions requires manual exporting and re-stitching.
• Tools likely to be purchased or built as 'one-offs' for specific purposes with limited flexibility for change or improvement. Replacements and upgrades are being researched and costed. Possibly some regular expenditure on cloud-based software tools and ad-hoc hardware replacement.

DEVELOPING (SCORE 3-4)
• Data fully centrally stored in secure digital systems with managed access.
• Data held in appropriate databases/CRMs or other technologies accessible by expert users and some non-experts.
• People are formally responsible for managing the storage, cleaning and maintenance, security, and backup of all data. Where possible this is becoming routine and/or automated.
• Data is collected and automatically stored digitally wherever possible e.g. online forms/apps directly into databases.
• An inventory of tools and systems (including hardware, software, licence, passwords and access) is managed and maintained.
• Most tools are up to date with support available. Work-arounds are understood and replacements planned for poorer tools.
• Advanced tools being used for sophisticated analytics in some parts of the organisation e.g. R, SAS, SPSS, Python.
• Some integration is beginning to occur between systems with automated/aligned reporting e.g. basic use of business intelligence tools.
• Significant investment is being committed to new tools/integrations across the organisation. Staff time, upgrades, and replacement are being built into annual budgets.

MASTERING (SCORE 4-5)
• Data is drawn from a minimal number of sub-databases and systems and held in a singly accessible database e.g. data warehouse.
• Capacity is available to store, manage, and analyse increasingly large volumes of data from multiple sources.
• The organisation has a robust infrastructure, integrating tools wherever possible.
• Analytical infrastructure to support effective decision making is a priority.
• All tools are documented and supported (internally and/or externally).
• Tools for delivering 'batch analytics' and real-time 'streamed' data are used.
• Advanced analytics and data science tools present throughout the organisation.
• Tools able to access and utilise internal and external data directly, for both experts and non-experts.
• Automated reporting e.g. through dashboards.
• Self-service analytics available both inside the organisation and to some external partners/stakeholders.
• Ongoing investment of time and money in developing and improving tools, systems and infrastructure to support data and analytics.
SKILLS Internal capacity | Skills and training | Access to knowledge and expertise

UNAWARE (SCORE 0-1)
• No staff commitment beyond basic administrative level and finance roles.
• Little or no internal skills, training, or expertise.
• No access to external knowledge or expertise around data or analytics.
• No real understanding of the needs and skills required for building data capabilities.

EMERGING (SCORE 1-2)
• Responsibility for data collection and control is at administrator and finance officer level.
• Different staff collect, manage and use data within other roles e.g. fundraising, projects.
• Data literacy is patchy, mostly low, amongst staff.
• Basic/adequate skills and training in using data for operational and administrative purposes.
• Little or no staff/volunteer awareness or training in data protection and security, though perhaps at least one person has completed a course.
• Occasional support from trustee/volunteers perhaps mostly relating to database/finance or reporting.

LEARNING (SCORE 2-3)
• Beginning to understand needs around data skills and capabilities.
• Dedicated person/team in charge of data, perhaps a data manager or senior administrator. Some skilled data people in other roles, though perhaps with limited capacity to fulfil the task.
• Adequate data analysis/reporting skills with some investment in more advanced skills e.g. Database/CRM administrator.
• Commitment to improving data literacy.
• Exploring up-skilling and recruitment to fill skills gaps.
• In house or externally provided training for using data systems.
• Staff and volunteers have basic data protection and security training though might not be very confident.
• Establishing relationships with external support and advice, mostly around specific tools, systems or projects with some skills development.

DEVELOPING (SCORE 3-4)
• Understand different skill sets within data and analytics.
• Dedicated skilled analytics roles established with several people responsible for data in different roles/teams.
• Possibly a senior person/team bringing organisation-wide data together.
• Increased data literacy/responsibility across the organisation.
• Individuals responsible for data have advanced training and skills and regularly engage in learning to develop and improve systems and embed these across the organisation.
• Staff know how to respond to subject access requests (where individuals request a copy of their data) or changes in preferences on personal data.
• Staff know how to respond to a data breach, potential breach, or near miss.
• Regular use of advanced external expertise and specialist suppliers.

MASTERING (SCORE 4-5)
• High levels of staff commitment at senior, specialist, technical, and administrative levels.
• Senior data strategist embedded at heart of leadership decision making.
• Able to independently manage/drive and maximise data analytics to an advanced level.
• All staff trained with ongoing investment in developing data skills with high levels of data literacy across the organisation.
• Becoming experts that other partners/peers use as a resource.
• Specialist staff regularly update skills and knowledge through training and conferences.
• Active in online learning networks and data and analytics communities of practice exploring new tools and skills.
• Awareness about openness and protection of data are embedded throughout the organisation with regular training, and in-house learning network to develop and maintain best practice.
• Ongoing relationships with a range of trusted suppliers providing advanced support and specialist expertise, periodically reviewed.
Glossary

A/B Testing
A randomised experiment with two variants, A and B. It includes application of statistical hypothesis testing, or "two-sample hypothesis testing". A/B testing is a way to compare two versions of a single variable, typically by testing a subject's response to variant A against variant B and determining which variant is more effective.

Anonymised
Removal of identifying details from data before sharing for statistical or other purposes, to preserve privacy.

Artificial Intelligence
An area of computer science that emphasises the creation of intelligent machines that work and react like humans, e.g. speech recognition.

Batch Processing
The processing of previously collected jobs in a single batch.

Business Intelligence System
Technology for, and practice of, the collection, integration, analysis and presentation of business information to support better decision-making.

CRM
Customer Relationship Management system: technology for managing all of an organisation's relationships and interactions with customers and potential customers. Note this could be cloud-based.

Cross Tabulations
Data tables that present the results of an entire group of data (e.g. survey respondents) as well as results from subgroups. Used to examine relationships within the data that might not be apparent when analysing the whole group (e.g. all survey respondents).

Dashboard
An information management tool that visually tracks, analyses and displays key performance indicators and measures to monitor the health of an organisation or department. Dashboards can be cloud-based and customised to meet the specific needs of a department or organisation.

Data Analysis
The process of cleaning, analysing and summarising data to discover useful information, inform conclusions and support decision-making.

Data Analytics
The process of data analysis (compiling and analysing data) and the tools and techniques used to do so, to support decision-making. Could be basic counts and/or charts; descriptive (about what happened); diagnostic (about why it happened); predictive (about what will happen in future); or prescriptive analytics (about how you can do it in the best way).

Data Assets
A collection of data that holds valuable information or knowledge. This can include databases, CRM systems, spreadsheets, mailing lists, records of transactions or bookings, and collections/libraries of documents or images.

Data Collection Methods
Various ways in which data is gathered and measured to answer relevant questions in an accurate and systematic way. Methods vary according to the field of research, but some examples are: observations; interviews; questionnaires and surveys; focus groups; ethnographies; oral history; case studies; experiments; randomised controlled trials.

Data Infrastructure
A digital infrastructure promoting data sharing and consumption. It includes hardware (computers, phones, devices, storage and backup) and software tools, which might be cloud-based.

Data Mining
The process of discovering new information from large data sets, involving methods at the intersection of machine learning, statistics and database systems. Also referred to as knowledge discovery in databases, or KDD. Often used to find anomalies, patterns and correlations to predict outcomes.

Data Protection
Legal control over access to and use of data held by an organisation.

Data Security
A set of standards and technologies that protect data from intentional or accidental destruction, modification or disclosure.

Data Warehouse
A system that pulls together data from many different sources within an organisation for reporting and analysis. The reports created from complex queries within a data warehouse are used to make business decisions.

Database
A structured collection of data, generally stored and accessed electronically (including cloud-based), that is organised to be easily accessed, managed and updated.

Forecasting
The process of making predictions about the future based on past and present data, most commonly by analysis of trends. Forecasting helps management cope with the uncertainty of the future, starting with certain assumptions based on experience, knowledge and judgement.

Impact
The overall difference you make to those you're trying to help; the long-term, big-picture general change you make as an organisation.
Machine Learning
An application of artificial intelligence that gives systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programmes that can access data and use it to learn for themselves, relying on patterns and inferences.

Modelling
The process of creating an abstraction of the real world in order to understand it. A model organises elements of data and standardises how they relate to one another and to properties of the real-world entities.

Monitoring Performance
Measuring activities in progress to assess the operational health of an organisation.

Open Data
Data that anyone can access, use and share. Used to bring about social, economic and environmental benefits.

Optimisation
The process of making something as good or effective as possible.

Outcome
The difference an activity makes to those you're trying to help; the short-term, specific change you make.

Profiles
Customer insights: characteristics of people you connect with, e.g. demographic data like age, gender, ethnicity or nationality, or geographic data, e.g. where people live or what the geographic reach/need is by area.

Qualitative
Relating to, measuring, or measured by the quality of something rather than its quantity, using non-numerical data. For example, describing an emotion or feeling.

Quantitative
Information about quantities; that is, information that can be measured and written down with numbers. For example, how many people respond to a survey.

Randomised Controlled Trials
A study where subjects (usually people) are randomly assigned to one of two groups: one (the experimental group) receives the intervention that is being tested, and the other (the comparison or control group) receives an alternative (conventional) treatment.

Raw Data
Data that has not been processed for meaningful use, either manually or using computer software.

Relational Data
Usually in the form of a set of tables or a database, where data points can be related to one another using a unique identifier, so that the data can be accessed or reassembled in many different ways.

Rich Data
Usually qualitative data which reveals the complexities and richness of what is being studied, particularly in understanding human behaviour. For example, combining customer data with social media and market demographics to understand consumer behaviour.

Service Quality
An assessment of how well a delivered service conforms to the client's expectations, carried out in order to identify problems, improve the service and increase client satisfaction.

Stream Processing
Processing data in real time, analysing it as soon as it is produced and received. For example, instant analytics of website activity.

Structured Data
A standardised format for providing information in a fixed field within a record or file. This includes data contained in relational databases and spreadsheets.

Transactional Data
Data recorded when an interaction or event takes place: for example, when somebody registers for a service, signs an attendance register, books to attend an event, or makes a purchase or donation.

Unstructured Data
Information that does not have a pre-defined data model or is not organised in a pre-defined manner (e.g. not in a database), and often includes text and multimedia content. Examples include e-mail messages, word-processing documents, videos, photos, audio files, presentations, webpages and many other kinds of business documents. Experts estimate that 80 to 90 per cent of the data in any organisation is unstructured.

Versatile Data
Data that can be used or adapted for more than one purpose, e.g. you might report on one set of beneficiary characteristics over one time period for a funder, and use the same dataset to report on different characteristics for a manager or board of trustees.
