Data Maturity Framework For The Not-For-Profit Sector
Version 2 created by Data Orchard CIC, September 2019, replacing version 1 published in January 2017 in partnership with DataKind UK. Freely available for non-profit use, licensed under Creative Commons Licence: Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0).
The Data Maturity Framework

[Diagram: the seven key themes — Uses, Data, Analysis, Leadership, Culture, Tools, Skills — mapped against the five-stage journey: Unaware, Emerging, Learning, Developing, Mastering.]

7 KEY THEMES | 5 STAGE JOURNEY
Data Maturity Themes

USES
• Purposes for collecting and analysing
• Benefits and rewards

DATA
• Collection
• Quality
• Sources
• Assets

ANALYSIS
• Type
• Technique
• Joining
• Presenting

LEADERSHIP
• Attitudes
• Plans
• Capability
• Investment

CULTURE
• Team approach
• Self-questioning
• Openness
• Protection

TOOLS
• Storage
• Type
• Quality
• Sharing
• Integration

SKILLS
• Capacity
• Skills
• Training
• Access to knowledge and expertise
USES Purposes for collecting and analysing data | Benefits and rewards

UNAWARE
• Records basic client information and activities/work delivered in order to operate at a basic level.
• Little or no benefits or rewards in relation to services.
• Collect and use data for requisite purposes e.g. basic financial management and legal/funder/contract compliance reporting.
• Continued funding may be seen as the only reason for collecting data.

EMERGING
• Collect more data than required by legal/funders/contracts.
• Most data collection and analysis relates to capturing activities, measuring outputs, and basic financial analysis and forecasts.
• Rewards mostly around improved understanding of beneficiaries and ability to articulate the scale of activities delivered.
• Raising income and understanding fundraising performance likely to be key focus for additional data collection e.g. fundraising events, donors, grants, contracts, and sales.
• Able to feed back information to funders, partners and networks around specific projects/services and the scale of activities. May struggle with multiple and/or repeat service use data.
• Starting to explore the difference between outputs and outcomes.

LEARNING
• Collect data to be able to understand and evidence the types of clients/needs and problems the organisation addresses. Uses both internal and reliable external data sources to do so.
• Starting to use data to understand different ways beneficiaries initially contact and engage with services over time (or not).
• Able to demonstrate activities being delivered for specific types of users across a range of projects and services.
• Capturing some outcomes data and learning to measure outcomes consistently.
• Starting to capture feedback, monitoring and evaluation data on service/product quality and performance to inform improvements.
• Use data for income generation and some forecasting of sales and donations leading to more effective fundraising and commercial income.
• Starting to lead conversations with funders and partners using data.
• Building internal knowledge and expertise based on the analysis of data and dialogue on how to act on this.
• Strategic planning, particularly around efficiency and service development, is becoming more data informed.

DEVELOPING
• All client, activity, output, and outcomes data is routinely collected.
• Services/products/campaigns are monitored to show performance on how, when and where these are used and by whom.
• Services/products/campaigns are starting to be targeted at specific demographics and/or geographic locations in response to better understanding of needs/problems.
• Services/products/campaigns are regularly reviewed and adapted in response to data to optimise outcomes.
• Beginning to test assumptions on how the organisation has an impact and for whom.
• Starting to differentiate between approaches and understand what’s working and what’s not.
• Able to differentiate and explain attribution and contribution in evidencing and communicating impact.
• Exploring and learning how to measure long term impact.
• Data is starting to be used to inform efficiency savings (resources, processes and service/product design).
• Can coherently make the case to funders/investors/clients for existing and new services/products/campaigns.
• Learning and evaluation is embedded internally and informs both service and staff/volunteer performance.
• Strategic planning and decision making is becoming considerably data informed.

MASTERING
• Data is used extensively, and in inter-related strategic ways, for a wide range of purposes.
• Sophisticated use of analysis delivers insights and predictions to influence service and organisational success.
• Evidencing and improving outcomes and impact is a primary focus. Experiments to identify differentiated impact and how to optimise this.
• Predict user needs and service/product options based on understanding client behaviours and how to influence these for the best outcomes.
• Design and delivery of services/products/campaigns is optimised at an individual/personal level.
• Services and interventions are highly targeted, possibly in collaboration with other partners/service providers.
• The organisation is embedded in networks of knowledge and research in the context of its work. Partnerships and networks are strengthened through collaborative data sharing.
• Use data to increase efficiencies (resources, processes, services/product delivery).
• Robust evidence ensures credibility and is used to influence external policy and decision makers.
• Learn, evaluate and build knowledge – harness data for continuous improvement of products/services/campaigns.
• Strategic planning and decision making is highly informed by data and based on past, present and future analyses.
DATA Collection | Quality | Sources | Assets

UNAWARE
• Limited data (if any) collected.
• Collected manually, mostly on paper, only when needed for specific purpose.
• Data isn’t meaningful or useful to the organisation.
• Infrequently updated (if ever).
• Not checked for validity or accuracy.
• Data isn’t shared internally.
• No external data sources used.
• Nobody is aware or interested in the data assets in the organisation.

EMERGING
• Data collection is both on paper and in digital forms though there may be inconsistencies and inefficiencies in approach.
• Staff and volunteers may not be trained in data collection.
• Data is regarded as meaningful or useful primarily for meeting external legal/funder/contract requirements.
• It may be patchy and inconsistent with gaps and duplications.
• Data is rarely updated and cleaned.
• Mixed levels of confidence and trust in data.
• Starting to find out what data is available internally. Know where most data assets are but there may be more squirrelled away in parts of the organisation.
• Occasional use of external information sources relating to the wider context of the organisation’s work.

LEARNING
• Though errors remain, data collection methods and processes are being improved.
• Staff and volunteers are trained in data collection.
• Data is reviewed to assess how relevant, meaningful, and necessary it is.
• The organisation knows how good or bad its different data sets are, and knows which data sources can/can’t be trusted.
• Gaps, overlaps, and mismatches in the available data have been identified.
• Data becoming richer, more relational and therefore versatile.
• Starting to explore how internal data (perhaps in multiple, fragmented or isolated locations) can be shared and utilised.
• Additional internal and external data is sourced.
• All data assets are known but may not be formally recorded.

DEVELOPING
• Data requirements defined and consistently collected via a range of methods.
• Investment in staff and volunteer training to ensure consistency and quality of data collection.
• Tests and refines how relevant, meaningful, and necessary its data is.
• Focuses on collecting the right data for clearly specified purposes.
• Data is monitored for quality including completeness, accuracy, and validity. Tools and systems exist for cleaning and maintenance.
• Richer data collection with more integration/alignment between systems reduces duplication, inefficiency and error.
• Data sharing internally is becoming the norm.
• Occasional commissioning of independent research and evaluation.
• Extensive use of externally available research and evidence around needs and successful interventions.
• Open datasets are occasionally used.
• Exploring shared measures and benchmarks with other organisations.
• Recorded list of all data assets, whether they include personal/sensitive data, and assigned responsibility.

MASTERING
• Focuses effort on collecting the minimal amount of the most important and valuable data.
• Staff and volunteers are trained in data collection and collection is automated where possible.
• Knows all its data is relevant, meaningful and necessary.
• Monitors and fully understands the quality of the data it holds and hence has high levels of confidence and trust in its data.
• Data is versatile and re-usable for multiple purposes and audiences.
• Invests in resources to collect, clean, maintain, and manage data well across the organisation.
• Shares data internally from different teams, departments and services.
• Compares its data with other organisations through shared measures and benchmarks.
• Uses publicly available external research (e.g. government/academic) and may contribute to this.
• Regular use of valuable open/public data sets.
• Commissions external independent research and evaluation.
• Maintains full inventory of data assets across the whole organisation with clearly defined variables, ownership, review periods, and development plans for each.
ANALYSIS Type | Technique | Joining | Presenting

UNAWARE
• Limited analysis of financial, contract-related, and legal compliance data.
• Mainly simple manual counts from paper forms or digital records are used.
• Strategic discussions are not informed at all by analysis of data.
• Data is not used in reports. Anecdotes and observations are preferred.

EMERGING
• Analyses starting to explore service users/clients/customers and target audiences.
• May include use of external research relating to the context e.g. to evidence scale of need/problems.
• Basic analysis, using counts and spreadsheets.
• People verbally report on data as part of strategic discussions.
• Analysis and report creation skills are being explored though results may be variable.
• Bar charts and pie charts may be the only types of analytical presentations.

LEARNING
• Whole-organisation analyses are beginning to be performed on an ad-hoc basis.
• Beginning to focus on what analysis is meaningful and useful.
• Starting to identify what data should be routinely analysed and potentially automated.
• Analysis is mostly descriptive about what happened e.g. summarising the overview, averages, variation, range.
• Comparative trend analysis is starting to be conducted over time (perhaps on an annual basis).
• Exploration and use of filters and cross tabs to delve further into data.
• Data is manually collated in reports using data from different sources.
• Data is manually reworked for presentation in written reports for different internal/external audiences.
• Learning how to create more sophisticated graphs and presentations of the data (though audiences may find them difficult to interpret and understand).

DEVELOPING
• Embedding systems for analysing meaningful and useful data.
• More consistent and regular approach to data reporting and trends analysis on users/needs, activity, outcomes and impact.
• Monitors what’s happening in present as well as what’s happened in the past. Some forward looking analysis/forecasts may challenge views of future performance.
• Analysis is more diagnostic about where/why things happened e.g. exploring root causes, clustering patterns, anomalies, discovering differences and trends. Some attempts at A/B testing. Occasional use of predictive analytics in some areas e.g. timeseries/forecasting.
• Aware of difference between correlation and causality.
• Routine data analysis is partially automated and partially manually collated from different sources.
• Presentation and communication of data is honed to ensure its meaning is understood.
• Some use of dashboards and/or business intelligence systems.
• Beginning to explore interactive data visualisation.
• Both static and real-time dynamic reporting conducted for different audiences, some may be available for non-specialists to independently access.

MASTERING
• Analysis extends beyond the organisation to its wider context with cooperative analyses performed with partners/other agencies.
• Analysis extends through descriptive, diagnostic, predictive, and prescriptive techniques.
• May conduct Randomised Control Trials.
• More focus on what will happen in the future with forecasting and predictive models to plan for the future needs of beneficiaries, target services, increase income, and maximise impact.
• Advanced prescriptive analytics support decision-making on how to do things in the best way e.g. optimisation of location, staff/volunteer capacity, recommending decisions for effective intervention, experimental design, simulation, artificial intelligence.
• Data from different sources is brought together and analysed in an automated way to provide an organisation-wide analysis.
• Dashboards, business intelligence systems and visualisation tools draw from a data warehouse.
• Data visualisation delivers meaningful analysis to different internal and external audiences.
• Non data specialists are able to interactively explore, analyse and report on the organisation’s data.
LEADERSHIP Attitudes | Plans | Capability | Investment

UNAWARE
• Not interested in data at all and not seen as a priority.
• No business plan or plans around data.
• Don’t use data for decision making, instead rely on gut feeling, experience and what seems to work.
• No data or analytics expertise or understanding among leadership.
• Don’t invest in data and analytics.

EMERGING
• Some recognition of the importance of data but don’t see the value of it.
• Little awareness of the potential uses of data so not seen as a priority.
• There may be a business plan though perhaps with no defined or measurable organisational goals.
• Limited or very basic data and analytics knowledge and experience among leadership.
• Typically use data about what happened in the recent past and verbal accounts of what’s happening for decision-making.
• Very little investment though some may occur under the radar within specific projects/parts of the organisation.
• People are expected to learn ‘on the job’ and there’s no investment in specific roles, tools, or training.

LEARNING
• Know data is important, curious to learn about its potential uses and benefits.
• Leadership occasionally ask questions about the data they are given but are not entirely convinced about its value.
• Data is an interest of the organisation but not a priority.
• There is a business plan with some defined and measurable goals, though data collection/analysis may not align.
• Learning about impact e.g. exploring theory of change.
• Learning through experience, building ‘adequate’ skills.
• Assessing data and analytics skills, knowledge, and roles across the organisation.
• Might use past and current data for decision making with some simple trend analysis.
• Invest small amounts in some basic/existing tools and staff training on an ad hoc basis.
• Exploring ideas and needs for the future though no formal plan related to data.

DEVELOPING
• Becoming engaged, supportive, ask the right questions of the data, and active in harnessing its value.
• Data is becoming more of a priority for the organisation as a whole (and considerably so in some projects/teams).
• Data is becoming aligned to overarching business plan and desired impact.
• Monitors what’s happening in the present as well as past trends. Some exploratory forward-looking research and predictions.
• Data champion within senior management.
• Addressing skills gap in leadership as a whole including understanding around impact measurement and management.
• Starting to plan and prioritise data organisation-wide.
• Beginning to commit significant investment in people (job roles), skills, learning, and tools.

MASTERING
• Value, plan and prioritise data and analytics as a vital organisational resource.
• Fully understand how to use data to improve what the organisation does. Data drives questions and the organisation is influenced by what data tells them.
• Viewed as a major organisational priority.
• Overarching business plan with clearly defined goals based on outcomes and differentiated impact, forecasting, and prediction of need.
• Use past, present and forward looking data for business planning and decision making (including forecasting, modelling, prediction and optimisation).
• Range of people with data and analytics expertise in leadership including at Board level.
• Invest substantially and continuously in improving data collection and analysis aligned to business plan.
CULTURE Team approach | Self-questioning | Openness | Protection

UNAWARE
• Nobody is interested in data.
• Data only accessible to a single person or team, usually junior staff.
• Opinion, observation, passion, and belief are used for decision making.
• Data is seen as a chore with questions/requirements mostly externally driven.
• Data is never shared internally or externally.
• There are no policies related to data.
• Minimal, if any, security and protection of data on paper, computers, or devices.

EMERGING
• Data is seen as the responsibility of ‘someone else’, usually in an administrative role.
• Recognition that data should be collected but it is not seen as a ‘whole team’ activity.
• Data mostly sought out and used to support and evidence what the organisation already believes or knows.
• Organisation’s culture doesn’t encourage data sharing across teams, though this may occur occasionally verbally or via reports.
• Data never shared externally except for legal/contractual reporting purposes.
• Basic policies for data protection and security may be in place but not monitored or enforced.

LEARNING
• Data is starting to be recognised as important at a more senior level.
• People across the organisation are starting to talk about how they can work together to deliver better data for decision making.
• Beginning to ask challenging questions of the data: why? what’s changing? what difference are we making? (mostly about the past.)
• Access to data may be limited by default rather than design.
• People would like to share more but may be constricted by access/permissions/cultural barriers.
• Some data insights are shared with partners, networks, and in the public domain.
• Data protection and security policies are in place.
• Senior management have some limited understanding of legislation and the organisation’s responsibilities.

DEVELOPING
• The whole organisation is starting to use and share data. People from different teams/levels of seniority regularly discuss data and how to act on it.
• Specialist staff in some teams are starting to use data to ask difficult questions, challenging assumptions, practices and impact.
• Concepts of right and wrong (ethics) are being considered, particularly in relation to personal data.
• Data and analysis is becoming more available and accessible to staff though may require some intervention by specialists to provide this.
• External data sharing is done on an aggregated basis and insights are shared including some shared measures and benchmarks.
• Exploring how data could be shared with beneficiaries on an individual basis as part of service delivery.
• Policies and practices are well established to ensure data is safeguarded (e.g. rules on passwords, how data is stored, rights/privileges to access organisational and beneficiary data).
• Risks have been identified though may not have been tested.
• Systems have been created to ensure data about identifiable individuals is deleted when no longer necessary and to respond to subject access requests.
• Trustees and senior management keep abreast of current legislation and best practice.

MASTERING
• Data seen as a team effort and critical asset for every part of the organisation.
• Everyone in the organisation is committed to ensuring quality data is available to support decision-making.
• Very comfortable using data to ask difficult and complex questions, to challenge practices and preconceived notions about the past and future.
• Aware of the practical difference between: correlation and causality; attribution and contribution; ‘known and unknown’ unknowns.
• Explores potential negative impacts of interventions as well as data ethics.
• Internal openness and data sharing is fundamental to the culture, subject to data protection/security.
• Everyone can access analysis they need when they need it.
• Data insights/evidence publicly available.
• Extensive data sharing, with protocols in place with partners, networks, stakeholders to address shared problems and solutions.
• Data may be shared with beneficiaries as part of service/support.
• High levels of confidence about the security of data held in the organisation.
• Systems, automated if possible, in place to delete personal data no longer necessary and respond to subject access requests.
• Risks monitored and tested to improve data security and protection.
• Widespread knowledge/skills sharing.
• Trustees and senior management keep abreast of future changes in legislation and best practice, and regularly check Data Protection compliance.
TOOLS Storage | Type | Quality | Sharing | Integration | Investment

UNAWARE
• Data is disorganised and unmanaged and stored in a range of places: on desks, in filing cabinets, individual peoples’ email inboxes, computers, phones, or other devices.
• Nobody has any formal responsibility for managing any data in their job. Any organising, archiving, updating or data cleaning is performed in an ad hoc way by individuals.
• Data is mostly collected via paper, email, SMS messages. This may not get transferred onto spreadsheets or structured document filing systems.
• Typically uses a spreadsheet/s or basic finance software package, though not in any analytical way.
• No data sharing happens in the organisation.
• No planned investment of time or money in any tools, systems, or infrastructure.

EMERGING
• Data is stored in designated physical locations, in paper and digital formats, in organised ways.
• Some data is managed and controlled by people with clear responsibility for maintenance and cleaning (e.g. administration, finance, HR, data managers).
• Data is mostly collected on paper/phone/in person and then entered into a database or spreadsheet for basic analytical and reporting tasks.
• Unstructured data is disparately held on different individuals’ computers and devices.
• Data may be shared mostly by emailing spreadsheets and documents as attachments with duplication, version control, and security issues.
• Some people or teams may use cloud-based document storage to share some data (e.g. OneDrive, Google Drive, Dropbox, Box). Note these may be personally or organisationally owned.
• Tools are limited. May not be up-to-date, don’t meet current needs, and are not documented or supported.
• Tools are acquired on a ‘needs-must’ basis e.g. for a specific purpose/project or when a crisis/system failure occurs.

LEARNING
• Digital data is mostly centrally stored on a (secured, backed-up) cloud-based system or local server with managed access. Some may remain inaccessible on computers, central shared drives or devices.
• Unstructured data is becoming better organised and searchable (e.g. folder structures/file naming conventions).
• Data held in a range of systems all separately managed. Tools likely to include at least one database or CRM system. There are also likely to be multiple spreadsheets in use.
• Tools mostly used operationally rather than analytically. May allow some basic inbuilt analysis and reporting but most often data has to be exported for analysis in another tool. Possible advanced analytical tool used for basic data processing or descriptive statistical analysis.
• Joining data or analysis across teams/services/functions requires manual exporting and re-stitching.
• Tools likely to be purchased or built as ‘one-offs’ for specific purposes with limited flexibility for change or improvement. Replacements and upgrades are being researched and costed. Possibly some regular expenditure on cloud-based software tools and ad-hoc hardware replacement.

DEVELOPING
• Data fully centrally stored in secure digital systems with managed access.
• Data held in appropriate databases/CRMs or other technologies accessible by expert users and some non-experts.
• People are formally responsible for managing the storage, cleaning and maintenance, security, and backup of all data. Where possible this is becoming routine and/or automated.
• Data is collected and automatically stored digitally wherever possible e.g. online forms/apps directly into databases.
• An inventory of tools and systems (including hardware, software, licences, passwords and access) is managed and maintained.
• Most tools are up to date with support available. Work-arounds are understood and replacements planned for poorer tools.
• Advanced tools being used for sophisticated analytics in some parts of the organisation e.g. R, SAS, SPSS, Python.
• Some integration is beginning to occur between systems with automated/aligned reporting e.g. basic use of business intelligence tools.
• Significant investment is being committed to new tools/integrations across the organisation. Staff time, upgrades, and replacement are being built into annual budgets.

MASTERING
• Data is drawn from a minimal number of sub-databases and systems and held in a single accessible database e.g. a data warehouse.
• Capacity is available to store, manage, and analyse increasingly large volumes of data from multiple sources.
• The organisation has a robust infrastructure, integrating tools wherever possible.
• Analytical infrastructure to support effective decision making is a priority.
• All tools are documented and supported (internally and/or externally).
• Tools for delivering ‘batch analytics’ and real-time ‘streamed’ data are used.
• Advanced analytics and data science tools present throughout the organisation.
• Tools able to access and utilise internal and external data directly, for both experts and non-experts.
• Automated reporting e.g. through dashboards.
• Self-service analytics available both inside the organisation and to some external partners/stakeholders.
• Ongoing investment of time and money in developing and improving tools, systems and infrastructure to support data and analytics.
SKILLS Internal capacity | Skills and training | Access to knowledge and expertise

UNAWARE
• No staff commitment beyond basic administrative level and finance roles.
• Little or no internal skills, training, or expertise.
• No access to external knowledge or expertise around data or analytics.
• No real understanding of the needs and skills required for building data capabilities.

EMERGING
• Responsibility for data collection and control is at administrator and finance officer level.
• Different staff collect, manage and use data within other roles e.g. fundraising, projects.
• Data literacy is patchy, mostly low, amongst staff.
• Basic/adequate skills and training in using data for operational and administrative purposes.
• Little or no staff/volunteer awareness or training in data protection and security, though perhaps at least one person has completed a course.
• Occasional support from trustees/volunteers perhaps mostly relating to database/finance or reporting.

LEARNING
• Beginning to understand needs around data skills and capabilities.
• Dedicated person/team in charge of data, perhaps a data manager or senior administrator. Some skilled data people in other roles, though perhaps with limited capacity to fulfil the task.
• Adequate data analysis/reporting skills with some investment in more advanced skills e.g. Database/CRM administrator.
• Commitment to improving data literacy.
• Exploring up-skilling and recruitment to fill skills gaps.
• In house or externally provided training for using data systems.
• Staff and volunteers have basic data protection and security training though might not be very confident.
• Establishing relationships with external support and advice, mostly around specific tools, systems or projects with some skills development.

DEVELOPING
• Understand different skill sets within data and analytics.
• Dedicated skilled analytics roles established with several people responsible for data in different roles/teams.
• Possibly a senior person/team bringing organisation-wide data together.
• Increased data literacy/responsibility across the organisation.
• Individuals responsible for data have advanced training and skills and regularly engage in learning to develop and improve systems and embed these across the organisation.
• Staff know how to respond to subject access requests (where individuals request a copy of their data) or changes in preferences on personal data.
• Staff know how to respond to a data breach, potential breach, or near miss.
• Regular use of advanced external expertise and specialist suppliers.

MASTERING
• High levels of staff commitment at senior, specialist, technical, and administrative levels.
• Senior data strategist embedded at heart of leadership decision making.
• Able to independently manage/drive and maximise data analytics to an advanced level.
• All staff trained with ongoing investment in developing data skills with high levels of data literacy across the organisation.
• Becoming experts that other partners/peers use as a resource.
• Specialist staff regularly update skills and knowledge through training and conferences.
• Active in online learning networks and data and analytics communities of practice exploring new tools and skills.
• Awareness about openness and protection of data is embedded throughout the organisation with regular training, and an in-house learning network to develop and maintain best practice.
• Ongoing relationships with a range of trusted suppliers providing advanced support and specialist expertise, periodically reviewed.
Glossary