While backward design has been widely accepted as the ‘right’ way to proceed when designing new courses or units, John Mullins and Arun Pereira argue that, in some business school settings, an alternative approach, which they call effectual design, may be superior.
It has been widely accepted in recent years that backward design is the ‘right’ way to go when designing new courses or units. Two ardent supporters of this approach, Wiggins and McTighe, argued that backward design is a good way of ensuring that what is eventually taught remains student-centred. To this end, backward design encourages the instructor to establish the purpose behind an activity before baking it into the curriculum or course design.
We argue here that, for certain types of courses and programmes, an alternative approach, which we call effectual design, may make good sense. We will first define backward design and discuss its strengths and limitations, as reported in the learning literature. Then, drawing on the work of Saras Sarasvathy and Benjamin Bloom, we define and explore our alternative approach, and note some settings where we believe an effectual design approach is likely to have merit. Finally, we show why effectual design, when combined with project-based learning, is likely to work better in these settings.
Backward design, sometimes referred to as understanding by design, is a method of designing courses and their constituent units by first setting learning goals and then choosing the methods of instruction and forms of assessment. It is akin to choosing one’s destination before charting a path to reach it. The idea involves a three-step process (See Figure 1).
This approach is reported to offer several advantages, namely that
- it makes designing learning activities and assessments easier and more logical, as these are grounded in clear, well-founded learning objectives
- it forces the instructor to assess learning and understanding
- assessment is designed before lesson planning, so instruction drives students toward exactly what they need to know
- it forces tough decisions about what’s in and what’s out
- it moves the instructor’s focus from content mastery to learning.
In summary, a key strength of backward design is that it establishes a tight connection between learning goals and assessing the extent to which such goals have been attained.
Yet despite its apparently clear merits, numerous drawbacks or limitations to backward design have also been reported, including the fact that
- what is taught is not necessarily what is learned, due to variation in students’ experience and prior knowledge
- its excessive rigidity may fail to take proper advantage of the rich diversity of today’s classroom populations and can sap dynamism from the classroom environment
- in pre-determining an end goal, students are not empowered to reach for their own goals or to follow a process that may lead to results that surprise both the student and the teacher. For the master teacher, one could argue that it is also worthwhile to move beyond fixed goals and establish processes that lead students to relevant yet indeterminate outcomes
- backward-designed courses or curricula may fail to encourage or develop critical thinking and the pursuit of higher-order learning objectives.
In essence, critics argue, backward design is akin to a one-size-fits-all approach to learning. Further, we note that the typical forms of assessment in backward-designed courses can be problematic for two additional reasons. First, exams, tests, and quizzes do not necessarily measure learning; arguably, they measure (at best) short-term memory. Second, they bear an opportunity cost, taking time away from other activities that could lead to learning.
Nonetheless, this approach may be well suited to many business schools’ core curricula, and to learning content which is hierarchical by nature. Learning the material for Statistics 1, for example, is a prerequisite for tackling Statistics 2. As we’ll now argue, however, a backward design approach may be less well suited to other important elements in many business schools’ curricula, indeed to elements that may be critical to the schools’ survival in today’s rapidly changing education industry.
Knowing vs. Doing
As in other professions including medicine, engineering, architecture and more, we who teach in business schools are – or should be – more concerned with what our soon-to-be graduates or executive education participants can ‘do’ than with what they ‘know’ or ‘understand’. In business, our students need to be able to gather evidence and think critically about how best to weigh options and determine what their business should do next, often amid considerable uncertainty. Which market should be targeted? Is the return sufficient to justify the investment? And so on.
All these tasks fall at the middle and higher end of Bloom’s taxonomy of learning objectives (See Figure 2).
Unfortunately, evidence suggests that the backward design process often results in courses that are focused primarily on the bottom two learning objectives, remembering and understanding. Learning objectives in backward-designed courses and curricula can, of course, include ‘doing’, and it is possible to assess such learning. But implementing assessments for this can be time-consuming and resource-intensive, especially in large core classes, and when compared to the ease and efficiency of other means of assessment that focus on Bloom’s lower-order learning objectives.
Of course, it is crucial to recognise that business school students must understand tools, frameworks and concepts in order to ‘do’. This understanding, however, is not the end game. It is applying those tools, frameworks and concepts, evaluating the evidence that feeds into them, and carrying out the analysis that follows that enables students to become more effective doers – as leaders, managers and entrepreneurs in today’s uncertain world – and to create innovations that make their firms successful and the world a better place.
A well-established stream of literature in entrepreneurship argues that many entrepreneurs don’t begin their journeys with explicit goals in mind. Instead, as Saras Sarasvathy finds, they ask themselves and their partners what means are at hand. By drawing on these means, they determine a multiplicity of possible paths that might lead to one or more worthwhile destinations (See Figure 3).
In Sarasvathy’s words, “While causal thinkers are like great generals seeking to conquer fertile lands (Genghis Khan conquering two-thirds of the known world), effectual thinkers are like explorers setting out on voyages into uncharted waters (Columbus discovering the new world).”
According to Sarasvathy, “All entrepreneurs begin with three categories of means: (1) Who they are – their traits, tastes and abilities; (2) What they know – their education, training, expertise, and experience; and, (3) Whom they know – their social and professional networks”. Sarasvathy argues that the effectual approach is particularly valuable when working under uncertain conditions, such as developing a new product in a new market, where one’s ability to predict the future is limited.
Effectual reasoning and effectual design
Might Sarasvathy’s approach be useful for designing new courses or elements thereof, and, if so, where? We argue that there are two important settings in business schools to which an effectual approach might be particularly well-suited: the design of new elective courses, and the design of short-form, non-degree executive education programmes.
In designing new electives, instructors may lack a clearly defined set of learning objectives. “Big data is important; we need a new elective there,” someone might say. A backward design approach would have us first determine all the settings in which big data plays a role and the variety of forms it can take, then decide how to assess what students should learn for each of them, and only then determine what learning activities, materials, and resources should be assembled. What if, instead, we were to approach the design of a new big data course as Sarasvathy might:
- Who am I, as an instructor? Do I lecture? Do I teach with cases? Is my classroom alive with in-class activities? Do I find projects effective?
- What do I know about big data? Does some of my tried and tested material fit such a course?
- Who do I know that might contribute to such a course? Has someone in my network already taught such a course, or developed sessions that would fit? Do I know potential guest speakers or companies amenable to the development of new cases that deal with big data?
Approaching course design in this manner allows the instructor to avoid the problem of needing to find or develop perfectly round pegs to fit into the perfectly round holes that a backward design approach would identify. Instead, they are free to assemble building blocks that fit their skills as an instructor (Who am I?), and draw on any prior knowledge of big data (What do I know?), as well as taking advantage of what others in their network have taught before (Who do I know?).
By relying on trusted partners, this approach mitigates the risk of having some ‘dud’ sessions (or a ‘dud’ course!), thereby limiting possible losses, in just the same way as effectual entrepreneurs do. Over time, as the course unfolds, or as subsequent iterations are updated, the course can evolve as contingencies – i.e., surprises that arise along the way – are leveraged.
In short-form, non-degree executive education settings, similar logic holds true. We often know little about our audience’s prior knowledge or experience, and executive education cohorts are often highly heterogeneous in nature. Thus, it’s difficult to formulate suitable learning objectives that make sense for all, despite the fact that the learning staff of our corporate clients typically ask us to do just that!
Moreover, because participants are likely to selectively take away only those snippets of learning that are relevant for them, it is arguably pointless, and perhaps even inappropriate, in such settings to define a uniform set of learning objectives for all. Indeed, in our work in executive education, we find that when they arrive for a short course, many participants’ managerial plates are typically full (and will be even fuller when they return to their workplaces!). They are thus thrilled to come away from an executive education programme with just one or two meaningful and actionable take-aways (out of the many things that were taught) that they can implement at once.
Effectual design in action
As we think back to the various courses we’ve designed over our many years of business school teaching, we realise that rarely (if ever) have we used a strictly backward design approach. Instinctively, what we actually did most of the time was put into practice an effectual design approach:
- We considered who we are: our own most effective approaches to our teaching and learning craft.
- We considered what we knew already: what existing materials – cases, articles, exercises, projects, and more – were already in our repertoires and could comprise part of what was to be included in the new course.
- We considered who we knew: faculty elsewhere who had taught something like what we were preparing to teach and were willing to share their course outlines and course materials; and individuals or companies about whom we could develop compelling cases or other activities for classroom use.
As we think about it now, would we even consider designing something new for an elective or executive education audience any other way? Why wouldn’t we draw on these resources? Building upon our strengths, our knowledge and our past successes, and standing on the shoulders of others who have ‘done it before’ makes enormous sense, doesn’t it?
Anderson, L.W. & Krathwohl, D. 2000. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.
Aviles, N. & Grayson, K. 2017. Backward Planning – How Assessment Impacts Teaching and Learning, IDRA Newsletter, August.
Bacon, D. 2005. The effect of group projects on content-related learning. Journal of Management Education, 29(2): 248-267.
Bacon, D. & Stewart, K.A. 2006. How fast do students forget what they learn in consumer behavior? A longitudinal study. Journal of Marketing Education, 28(3): 181-192.
Bowen, R.S. 2017. Understanding by Design. Vanderbilt University Center for Teaching and Learning. Retrieved April 14, 2020 from link
Brown, J.S. and Duguid, P. 1996. Stolen Knowledge, link
Burkholder, P. 2018. Backward Design, Forward Progress.
Cho, J. & Trent, A. 2005. “Backward” Curriculum Design and Assessment: What Goes Around Comes Around, or Haven’t We Seen This Before? Taboo: The Journal of Culture and Education, 9(2): 105-122.
Hitchcock, D. 2018. Critical Thinking. Stanford Encyclopedia of Philosophy, Fall. Zalta, E.N. (Ed.).
Kliebard, H. 1975. The rise of scientific curriculum making and its aftermath. Curriculum Theory Network. 5: 27-38.
McEvoy, G.M. 1991. Examining the exam: Ruminations on the development of a management action-skills assessment procedure. In Managerial Skills: Explorations in Practical Knowledge. Bigelow, J.D. (Ed.), Newbury Park, CA: Sage.
Pfeffer, J, & Sutton, R.I. 2000. The Knowing-Doing Gap, Harvard Business School Publishing.
Sarasvathy, S.D. 2001. What Makes Entrepreneurs Entrepreneurial? Working paper, University of Washington.
Sarasvathy, S.D. 2001. Causation and effectuation: Toward a theoretical shift from economic inevitability to entrepreneurial contingency. Academy of Management Review, 26(2): 243-263.
Sarasvathy, S.D. & Read, S. 2012. Co-creating a course ahead from the intersection of service dominant logic and effectuation. Marketing Theory, 12(2): 225-229.