1. Your research started over two years ago, and we have seen a few great headlines on its outcomes. One of the findings is that educators’ resistance to trying out new technology is linked to their ‘fear of looking stupid’ in front of their students. Can you tell us more about the background of the project, your research, and your expectations? Were there any surprises or any findings you did not anticipate?
I work at Carnegie Mellon University, where innovation is a way of life, and where learning science is a research focus for many faculty members. When I arrived on campus, some of my colleagues had been asking for years why proven, science-based educational technologies weren’t in wider use. Folks at Carnegie Mellon have developed tools and technologies that promote marked improvements in student achievement, supporting more students as they learn better and faster—so why aren’t these tools ubiquitous in higher ed?
Anthropologists study people, and for two years I’ve had the privilege of studying the people and places of higher education. I took a deep dive into this complex landscape of culture, policy, and practice to build a nuanced picture of the interests and concerns that impact the use of educational technology on campus. Over the past two years, I have spent thousands of hours taking notes, photographing and videotaping meetings, listening to conference calls, watching collaborative work, following projects from ideation through classroom use, sitting in classes, observing students and educators as they grapple with university teaching, and interviewing people from across the university.
I used a complex research design including ethnographic methods, which have the power to tell us things you simply can’t learn through other, less intensive means. This study used quantitative approaches like a survey, as well as interviews and ethnographic observation. Ethnographic studies require the researcher to set aside any bias towards the subject, with the goal of avoiding hasty conclusions and preconceptions that might unduly cloud the analytic process, and they take time to do right. The integration of massive amounts of thick, qualitative data in a process-oriented systems analysis can capture relationships among different people, policies, and practices that are often surprising. We’ve learned a lot!
Educators innovate in myriad ways and stick with what they believe to be tried-and-true teaching tools and practices whenever they can. What we found is that educators’ resistance to trying out new technology reflects a weighing of perceived risks against perceived benefits. Educators care about student outcomes, about their own fields of specialisation, and about professionalism. They are excited to try new things if they believe that students will fare better, it will advance their field, and they will maintain high standards of professional performance. Educators are reluctant to try new things that they think will take up a lot of valuable time; make it harder for students to understand key concepts; get in the way of student-faculty relationships; fail when students or faculty need it; leave students feeling bored or dissatisfied; waste students’ time; or give students the impression that they don’t know what they’re doing. Educators are glad to try out new technology if it will improve experiences for students and instructors, and benefit their field and institution. But teaching and learning are complex, and innovations always entail some uncertainty. There has to be a good reason to take risks with something that’s already working, and that reason is going to be different in different situations.
To date, the specificity of each context has proven a really thorny problem: If each faculty member is different, each tool or technology adds something new, each student is unique, and each institution is quirky in its own way, how can we identify generalisable guidelines? A key result of this research is the acknowledgement that we need a coordinated, evidence-based approach to identifying the best ways to support faculty, promote improved learning outcomes for students, and enhance educational offerings across the board.
2. Can you take us beyond the headlines: What are the main challenges related to this resistance to adopting new tools, pedagogical styles, and methods?
Educators will adopt new tools and practices if they believe that they, their students, and their institution will benefit from the change. This means that the biggest challenge is making sure that an educator sees the value of the innovation clearly and believes that using it will not risk wasting students’ time, create more work for educators, or otherwise cause problems. The development, adoption, implementation, and sustained use of innovative teaching tools and practices constitute a really complicated endeavour. Each instance involves lots of experts who have to work together. Collaboration is challenging, especially when it involves working with people who have really different expertise, perspectives, and priorities. Because of this, policy initiatives that focus on a single cause will fail: they don’t take this complexity into account. Embracing the complexity and identifying strategies to address it in the implementation of new teaching methods may be daunting, but my results say that supporting innovation means doing just that.
3. These outcomes are important due to the complexities of implementing innovations in higher education. They suggest new ways in which faculty might be supported in their efforts to improve learning. What practical recommendations or suggestions do you have for educators, senior-level management in HE, education-technology providers, or technology implementers and administrators, etc.?
The pace of change in the state of the art is really rapid: The newest and best is different every semester. Developers, policymakers, technologists, and support personnel who are familiar with educational technologies must keep in mind that many educators are encountering these tools and practices for the first time. What seems intuitive to a learning scientist or a technologist might need to be learned by faculty whose expertise lies elsewhere. To effectively support faculty, give them a way to express their concerns, and address those concerns. Don’t assume that a minor change is minor; you might be asking more than you realise of your educators or support staff. Provide educator-centred support for new tools and approaches. Help educators to envision their own use of a tool or practice, and provide just-in-time support for faculty who are experimenting. Experimentation always entails risk; if we want educators to take those risks willingly, they must believe that someone has their back and that their students, career, and institution will benefit, even if things don’t work out exactly as planned.
4. In the process of your research and talking to the educators involved, did you observe a change in their attitude? Did you see them let go of their reluctance as they perhaps became aware of their own fears? Have you gone back to those people and seen changes in their attitudes toward the use of technology as a result of your research and time spent with them?
Educators never start with a blank slate. Even first-time teachers have been students themselves and have been anticipating their roles as educators for many years. They have ideas about what works and what doesn’t, as well as plans and experiences that shape their teaching philosophies and teaching practices. And educators are constrained by their students’ needs, curricular needs, and institutional policy and culture, as well as their own personal and professional values. Every single educator I spoke with over the course of a year developed as an instructor during that time—as do all educators. Attitudes and practices change with experience and time. One of the biggest pitfalls for policymakers and ed-tech developers is to imagine that they are making decisions for a static system. Educators, students, technology, policy, and culture are always moving targets.
5. Can you let us know more about your anthropological approach and how it helps decode our behaviour?
Anthropologists study people, wherever and whenever they are. A holistic, systems approach to understanding the cultural context of any phenomenon can be incredibly illuminating. It’s stunning how much we don’t recognise about our own behaviour simply because we live it every day. Two people can have a conversation that both feel completely satisfied with, and yet they walk away from it with very different experiences of, and understandings derived from, the same exchange. Anthropology helps us to put these disconnects into context, to explain the underlying patterns that shape the different experiences people have of the same phenomenon. Anthropology gives us the tools to make the familiar strange and to apply rigorous methods to understanding the way that experience, behaviour, power, policy, and the world around us intersect.
I specialise in material culture, in technology, and so I lean heavily on my training as an anthropological archaeologist. Archaeologists famously learn about the lives and cultures of ancient people by studying the things they left behind. Archaeological methods, which were developed to study technological change in the past, can also be used to understand technological innovation in the present. The emergence, form, and social roles of today’s cutting-edge technologies are shaped by the same forces that affected the maintenance of water-management technologies in Southeast Asia 3,000 years ago, the development of pottery in the American Southwest 1,500 years ago, and the emergence and disappearance of electric cars in the United States a century ago. Technological innovations take root or fade from memory, and anthropological methods give us a way to understand the emergence of trends, as well as the key factors that shape people’s engagement with each other and with the world around them. Anthropology provides a lens through which to examine the production, use, reuse, repair, and discard of technologies as part of a more complete and integrated understanding of the relationship between labour, the value and exploitation of available resources, and the emergence, transformation, and diffusion of traditions.
6. Where will you/your research be going next?
While each faculty member is different, each tool or technology adds something new, and each institution is quirky in its own way, we can create some generalisable guidelines to rapidly identify significant quirks and effective policy levers. A coordinated, evidence-based approach to identifying these actionable levers is a big task, but we can take heart in the fact that it’s been done before. A call to extend evidence-based medicine throughout the global healthcare system was first framed about 25 years ago. It took a decade of trial, error, and extensive collaboration to develop the organisational bodies and roadmaps necessary for the creation of implementation science, an evidence-based approach to the effective dissemination, adoption, implementation, and sustained, faithful application of evidence-based medicine. Today, researchers in areas such as healthcare delivery, epidemiology, and public health work together to break out of silos and devise methods and theories that can bridge the gap between research and practice. If we want to find the best ways to support faculty, promote improved learning outcomes for students, and enhance educational tools and practices across the board, we can learn from the model of biomedicine. Education research stands to learn a great deal from the method and theory developed in implementation science.
7. Could you please tell us a bit about your background and the person or people who have been your biggest personal influences?
I love anthropology—people are fascinating! I’m a methodologist at heart, and I’ve spent my career working across sectors, in quantitative, qualitative, and mixed-methods research design and execution, in public, private, and non-profit contexts, and across a variety of disciplinary domains. I’ve worked in healthcare, political science, and criminal justice research, and I spent two years working for a university centre for teaching and learning, where I taught future faculty how to teach at the university level. Carnegie Mellon’s Simon Initiative, inspired and named for Nobel and Turing Award laureate Herbert Simon, harnesses a cross-disciplinary, learning-engineering ecosystem that has developed over decades and gives me the opportunity to do the kinds of creative research that might not be possible elsewhere. Cutting-edge research in learning science and innovative educational technologies abound, and there’s a genuine excitement about improving access to quality education for all students. It’s an inspiring place to work.
8. Is there anything specific you are looking forward to at OEB?
I’m excited to attend a cross-sector conference that draws engaged people from around the world.