For today's Interview Wednesday, we talk with Elizabeth Oyer, Director of Research at Eval Solutions in Carmel, Indiana, USA. You can read Elizabeth's blog here. Her Twitter account is @elizabethoyer.
#1. What got you interested in evaluation and evidence?
I believe my work in evaluation has been my destiny for a while. I began my undergraduate studies in Psychology with a minor in French (because I loved studying French!). By my sophomore year, the practicality of a French minor set in and I decided to double major in Communications. To satisfy my passion for French, I began teaching in a weekend program for gifted students (3rd and 4th graders). In my senior year of teaching this class, I was paired with an Education major. We decided to divide up the teaching duties, alternating throughout each class. Well, instantly I could see how much better she was at teaching!
I was fascinated by the impact of our different abilities on the students. Staring down the barrel of graduate school, I was elated to discover the field of Educational Psychology and my path to evaluation began at Indiana University's program with an Inquiry emphasis.
What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
For most of my projects, my work incorporates quantitative data (e.g., achievement data and survey data) as well as qualitative data (e.g., observations, interviews) and extant records (e.g., lesson plans) in education. The majority of my clients are in the field of education, although I also work with businesses (e.g., needs assessments, strategic planning) and non-profit organizations.
What is your involvement with evidence and evaluation: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
I would say there is an element of all of these in my work! Most of my projects incorporate reviewing literature and work in the field, modifying instruments to meet clients' needs, and reporting results for accountability requirements. But they also involve translating trends in the data to answer local questions about progress, as well as disseminating the results (generally at conferences).
Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
When I am looking for evidence to help me process and contextualize results, I rely mostly on formal evidence (like journals). However, the informal conversations I have with my colleagues are priceless! When we are together we are always thinking, analyzing, evaluating our work to improve our impact on the field.
#2. On a scale of 1 to 10, where 10= ‘It’s crystal clear.’ and 1=’We have no idea why things are happening.’, how would you describe the overall “state of the evidence” in your primary field?
I would say with the advances in brain research and contributions coming from effectiveness of comprehensive reform, our understanding of what we're supposed to do is an '8' - but how to get systems to their optimal states of effectiveness consistently and across contexts is about a '4'.
Work on implementation fidelity and systems thinking is getting us there, but it certainly appears we have a long road ahead of us.
Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.
I would say a healthy level of 'c' and 'd' are common. As expectations and accountability standards at the federal level increase, the field responds. When there are vague policies, there are too many opportunities for excuses for why standards of evidence are too difficult to implement.
#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
I believe there are a couple of sizeable barriers to our knowledge in the field. The first relates to access to information about the work of our peers. For a large number of workers in the field (evaluators who are not in higher education), access to journals of published work is difficult. There is no model for open access to this work in the way there is easy access to other resources (like music and videos). Paying $20 to read what may be 1 out of 50 articles you need for a project is simply not feasible.
Secondly, I believe access to individual-level student data (in accordance with FERPA standards, of course) continues to be more difficult than necessary. In this digital age, access is more often a matter of the will of the community stakeholders (administrators, teachers, school boards, project leadership) than a practical issue. It is vital that our understanding of progress in education target observable impact on individual students; we must continue to implement evaluation frameworks that test our theory of change from system-level factors (e.g., school policies) to classroom-level factors (e.g., teachers and resources) on clearly articulated student outcomes.
Where do you see technology making things better?
Technology has the potential to improve access to work universally when we have a business model that allows for the work to be reasonably profitable. As we move toward instruction that is grounded in data, technology can provide a vehicle for information flow, especially with automated tools like web services and data clouds.
#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
Well, many times there are reporting requirements set by the program officers funding the project, so those are always followed. However, the standards for evaluation have guidelines for communicating results, and I try to incorporate these into my reports. I also try to create additional formats (like press releases and executive summaries) to provide access to information. Finally, I meet with all of my clients to discuss what the evaluation is saying about the project's progress (or how the evaluation needs to be improved to better inform the progress!).
I also have begun sharing my work in my blog and I try to present at conferences annually. Professionally, I would like to do more published work, so that will be my next target for myself.
What mistakes do you see people making when they explain evidence?
I would say poor evaluation frameworks or poor implementation of sound frameworks is at the root of a report that doesn't give the guidance that is needed. Spending time clearly articulating the theory of change down to the observable, measurable implementation goals for the core program elements is more often the problem than the data itself. Often folks go way beyond the evidence at hand to make generalizations that don't hold water.
#5. What do you want your legacy to be?
I'd like to be remembered for pursuing excellence in the field with integrity and respect for stakeholders while delivering high quality, useful information that changes the way things work. A very lofty goal that keeps me hopping!
#6. What question do you wish we'd asked?
"What is your most important resource?" My colleagues are my most important resource. I couldn't do my work without the hard work of my subcontractors, project partners, and clients. My colleagues provide formal support (through partnerships on grants), serve as an informal resource (to bounce ideas off, to confirm interpretations of evidence in tight spots), and generally feed my evolution as a professional. My clients, with their passion for their projects, keep me motivated to provide the best evaluation services and seek out the best resources to meet the needs of each project.
Thanks for sharing these insights, Elizabeth.
Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.