When you have all this information about your business or project saved and tracked, what do you do with it? That’s where data interpretation comes in. It helps people with limited statistical or programming skills quickly become productive in an increasingly digitized workplace.
Data analysis and interpretation is the process of assigning meaning to the collected information and determining the conclusions, significance, and implications of the findings. The steps involved in data analysis depend on the type of information collected; however, returning to the purpose of the assessment and the assessment questions will provide a structure for organizing the data and a focus for the analysis.
Numerical vs. Narrative – Quantitative vs. Qualitative
The analysis of numerical (quantitative) data is represented in mathematical terms. The most common statistical terms include:
- Mean – The mean represents the numerical average for a set of responses. For a data set, the terms arithmetic mean, mathematical expectation, and sometimes average are used synonymously to refer to a central value of a discrete set of numbers: specifically, the sum of the values divided by the number of values. If the data set is based on a series of observations obtained by sampling from a statistical population, the arithmetic mean is termed the sample mean to distinguish it from the population mean.
- Standard deviation – The standard deviation represents the distribution of the responses around the mean. It indicates the degree of consistency among the responses. The standard deviation, in conjunction with the mean, provides a better understanding of the data. For example, if the responses are roughly normally distributed and the mean is 3.3 with a standard deviation (SD) of 0.4, then about two-thirds of the responses lie between 2.9 (3.3 – 0.4) and 3.7 (3.3 + 0.4).
- Frequency distribution – Frequency distribution indicates the frequency of each response. For example, if respondents answer a question using an agree/disagree scale, the percentage of respondents who selected each response on the scale would be indicated. The frequency distribution provides additional information beyond the mean, since it allows for examining the level of consensus among the data.
- Higher levels of statistical analysis (e.g., t-test, factor analysis, regression, ANOVA) can be conducted on the data, but these are not frequently used in most program/project assessments.
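The three basic measures above can be computed with nothing more than the Python standard library. The following sketch uses a hypothetical set of responses on a 1–5 agree/disagree scale; the numbers are illustrative only.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical survey responses on a 1-5 agree/disagree scale
responses = [3, 4, 3, 3, 4, 2, 3, 4, 5, 3, 2, 4]

m = mean(responses)    # numerical average of the responses
sd = stdev(responses)  # sample standard deviation around the mean

# Frequency distribution: share of respondents choosing each option
counts = Counter(responses)
n = len(responses)
distribution = {score: count / n for score, count in sorted(counts.items())}

print(f"mean = {m:.2f}, std dev = {sd:.2f}")
for score, share in distribution.items():
    print(f"  {score}: {share:.0%}")
```

Note how the frequency distribution adds information beyond the mean: two data sets can share a mean of 3.3 while one clusters tightly around 3 and the other splits between 1s and 5s.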
The analysis of narrative (qualitative) data is conducted by organizing the data into common themes or categories. It is often more difficult to interpret narrative data since it lacks the built-in structure found in numerical data. Initially, narrative data may appear to be a collection of random, unconnected statements. The assessment purpose and questions can help direct the focus of the data organization. The following strategies may also be helpful when analyzing narrative data.
- Focus groups and interviews: Read and organize the data from each question separately. This approach permits focusing on one question at a time (e.g., experiences with tutoring services, characteristics of the tutor, student responsibility in the tutoring process). Alternatively, group the comments by themes, topics, or categories, which allows for focusing on one area at a time (e.g., characteristics of the tutor – level of preparation, knowledge of the content area, availability).
- Documents: Code the content and characteristics of documents into various categories (e.g., training manual – policies and procedures, communication, responsibilities). This approach keeps your information organized and easily accessible when you need it.
- Observations: Code patterns from the focus of the observation (e.g., behavioral patterns – amount of time engaged/not engaged in an activity, type of engagement, communication, interpersonal skills).
Data Interpretation and Analysis Techniques
The analysis of the data via statistical measures and/or narrative themes should provide answers to your assessment questions. Interpreting the analyzed data from the appropriate perspective allows for determination of the significance and implications of the assessment.
Analysis of data is a process of inspecting, cleaning, transforming, and modeling data with the goal of discovering useful information, suggesting conclusions, and supporting decision making. Data analysis has multiple facets and approaches, encompassing diverse techniques under a variety of names, in different business, science, and social science domains.
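The inspect/clean/transform/model cycle described above can be sketched with hypothetical survey records and only the Python standard library; the records, scale, and rescaling choice are illustrative assumptions, not a prescribed method.

```python
from statistics import mean

# Hypothetical raw survey records; None and "n/a" mark missing responses
raw = [4, None, 3, 5, "n/a", 2, 4, None, 3]

# Inspect: how much of the data is unusable?
invalid = [v for v in raw if not isinstance(v, int)]

# Clean: keep only valid integer responses
cleaned = [v for v in raw if isinstance(v, int)]

# Transform: rescale the 1-5 responses to a 0-100 score
rescaled = [(v - 1) / 4 * 100 for v in cleaned]

# Summarize: a simple descriptive statistic to support decision making
print(f"dropped {len(invalid)} records; mean score = {mean(rescaled):.1f}")
```

Even a toy pipeline like this makes the point that cleaning decisions (here, dropping invalid records) happen before any statistic is computed and should be reported alongside the results.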
Data mining is a particular data analysis technique that focuses on modeling and knowledge discovery for predictive rather than purely descriptive purposes. Business intelligence covers data analysis that relies heavily on aggregation, focusing on business information. In statistical applications, some people divide data analysis into descriptive statistics, exploratory data analysis (EDA), and confirmatory data analysis (CDA). EDA focuses on discovering new features in the data and CDA on confirming or falsifying existing hypotheses. Predictive analytics focuses on application of statistical or structural models for predictive forecasting or classification, while text analytics applies statistical, linguistic, and structural techniques to extract and classify information from textual sources, a species of unstructured data. All are varieties of data analysis.
Some Data Interpretation and Analysis Tips
- Consider the data from various perspectives. Whatever your project may be, and whatever data you have collected from your business, it is always best to ask what that data means for the various actors or participants involved.
- Think beyond the data, but do not stray too far from it. Be mindful that you are not making too much of your data, or too little. Make the link between the data and your interpretations clear, and ground your interpretations in your research.
- Make visible the assumptions and beliefs, or mental models, that influence your interpretation. We each carry images, assumptions, and stories in our minds about ourselves, others, the organizations we work in, etc. As a composite, they represent our view of our world. Because these models are generally unarticulated, i.e., below the level of our awareness, these assumptions and beliefs can lead to incorrect interpretations if left unexamined. Reflect on your own thinking and reasoning, and individually and/or collectively list your assumptions about the focus of the inquiry.
- Take care not to disregard outlying data or data that seems to be the exception.
- Data that is surprising, contradictory, or puzzling can lead to useful insights (insites.org).