Maciej Workiewicz

Research and teaching in business schools: Can the two go together?


Dividing business schools into teaching- and research-oriented institutions is a common practice among academics, particularly those on the job market. The idea is that a research-oriented institution provides the right environment, resources, and support for scholars who pursue a deeper understanding of their discipline, or at least the best proxy we have for it: publications in top peer-reviewed journals. This belief has long been held more strongly in the US, but it has recently found new adherents among European and Asian business schools [see my other blog post here].

However, the idea that scientific research in business academia can produce useful knowledge has its opponents. It would take a separate and lengthy blog post to even sketch the outlines of that debate, and that is not my goal here. Curious readers can turn to Mie Augier and James G. March’s excellent book “The Roots, Rituals, and Rhetorics of Change”. Here I will mention only one of the numerous accusations: that management research has little to do with management practice, and that there is very little a true entrepreneur, manager, or leader can learn from an article in one of the top management journals. This is probably true in some sense, as scientific articles are written for other scientists, are full of often-necessary jargon, and fully absorbing the information requires some training on the part of the reader. However, this criticism extends to the authors of these articles as well, and casts doubt on whether an entrepreneur, a manager, or a business leader can learn anything useful from a management scholar at all. Should business schools be hiring researchers, or is doing so merely a fetish imposed by the arbitrary choices of business school ranking bodies?

In this post I want to share some data that suggest at least a partial answer to the following question: can hiring researchers help a business school offer better and more relevant business education? I will use two sources for my inquiry: a) the excellent UT Dallas Research Ranking, and b) the Financial Times MBA and Executive Education rankings.

In my previous post I showed that European and Asian business schools have been catching up in terms of publications in the top management journals. Below is yet another way to show the same trend, but this time we will also look at the performance of these schools in the FT MBA ranking. I split the list of highest-ranked MBA programs into those based in North America and those in the rest of the world (mostly Europe, Asia, and Australia), and look at the relationship between MBA rank and research rank for each group as reported in the FT MBA ranking in 2008 and 2018. Figure 1 shows the scatter plots and trend lines. The first observation is that research rank and MBA rank are positively correlated. This is to some extent expected: for one, research rank is one of the factors in the MBA ranking (it accounts for 10% of the total score, right after salary and salary increase). However, remember that the argument presented by some is that research and teaching are substitutes and a school excels in one or the other, so the relationship could have gone the other way, or at least been flat.

Figure 1. Relationship between Global MBA rank and Research Score for 2008 and 2018 by region
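
For readers who want to reproduce this kind of figure from the published FT tables, here is a minimal sketch. It assumes a hand-assembled file ft_mba.csv with columns year, school, region, mba_rank, and research_rank; the file name and column names are my own illustration and do not come from the FT.

```python
# Sketch: scatter of FT Global MBA rank vs FT research rank, split by region and year.
# Assumes a hand-built file "ft_mba.csv" with columns:
#   year, school, region ("North America" / "Rest of world"), mba_rank, research_rank
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("ft_mba.csv")

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
for ax, year in zip(axes, [2008, 2018]):
    for region, colour in [("North America", "tab:blue"), ("Rest of world", "tab:green")]:
        sub = df[(df["year"] == year) & (df["region"] == region)]
        ax.scatter(sub["mba_rank"], sub["research_rank"], color=colour, label=region)
        # Simple least-squares trend line per group
        slope, intercept = np.polyfit(sub["mba_rank"], sub["research_rank"], 1)
        xs = np.linspace(sub["mba_rank"].min(), sub["mba_rank"].max(), 100)
        ax.plot(xs, slope * xs + intercept, color=colour, linestyle="--")
    ax.set_title(str(year))
    ax.set_xlabel("FT Global MBA rank")
axes[0].set_ylabel("FT research rank")
axes[0].legend()
plt.tight_layout()
plt.show()
```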

One potential interpretation of this relationship is that the top schools use the resources obtained from the MBA and other programs to pay their faculty to do research instead of teaching. In most top business schools, faculty spend only a small portion of their time on teaching. Thus, according to this argument, the causality runs the opposite way: better teaching brings in more money, and more money can be spent on research. In other words, research is what professors like to do, so more time for research is simply one of the perks that come with the job. This is probably true to some extent.

The second observation is that while for the US schools the relationship between MBA rank and research rank remains more or less the same between 2008 and 2018, elsewhere, particularly in Europe, more and more schools moved towards the North American model over that period. While in 2008 it would have been relatively easy to separate the North American and non-North American clusters, by 2018 they had become much more intermixed. This happened not so much because non-North American business schools improved their MBA rank (although that happened to some extent too), but because their research rank improved markedly (i.e., the green dots have moved up). Thus, Figure 1 shows that non-North American business schools have been slowly approaching the model applied by their North American cousins.

However, this does not yet answer our question: does management research published by faculty in top peer-reviewed journals enhance, at least in some meaningful way, our understanding of the key issues in the management domain, and in turn provide business practitioners with useful knowledge and skills?

Let’s take a look at the FT ranking of executive programs. The neat thing about those is that participants are asked to evaluate the teaching quality and the relevance of what is being taught (FT Open Enrollment Executive Program ranking 2018). More specifically, participants have to answer several questions. Here I focus on three that, I think, best approximate the relevance and quality of teaching that interest us:

  1. Teaching methods and materials: the extent to which teaching methods and materials were contemporary and appropriate, and included a suitable mix of academic rigor and practical relevance.

  2. Faculty: the quality of teaching and the extent to which teaching staff worked together to present a coherent programme.

  3. New skills and learning: relevance of skills gained to the workplace, the ease with which they were implemented and the extent to which the course encouraged new ways of thinking.

I pair these with the UT Dallas research score for each school, which I calculated the same way as in my previous post, but this time for all 24 journals. I then checked the correlations between each of the FT questions and the research score calculated from the UT Dallas ranking. I used the UT Dallas score here instead of the FT research rank from the MBA rankings because not all the schools in the Executive Education ranking are present in the MBA one; the UT Dallas Research Ranking covers any school where at least one faculty member has co-authored a paper in one of the 24 journals tracked by the ranking. A sketch of this calculation is shown below.
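
A minimal sketch of the correlation step, assuming a hand-merged file exec_ed.csv that already pairs each school’s UT Dallas research score with its three FT survey scores; the file name and column names are my own illustration, not the FT’s or UT Dallas’s:

```python
# Sketch: Pearson correlations between a school's UT Dallas research score and
# its FT Executive Education survey scores. Assumes a hand-merged "exec_ed.csv"
# with columns: school, research_score, teaching_methods, faculty, new_skills.
import pandas as pd

df = pd.read_csv("exec_ed.csv")

survey_cols = ["teaching_methods", "faculty", "new_skills"]
correlations = df[survey_cols].corrwith(df["research_score"])  # Pearson by default
print(correlations.round(3))
```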

It turns out that the correlations are all positive:

Correlation of research output in top journals with:

  1. Teaching methods and materials: 0.507

  2. Faculty quality: 0.568

  3. New skills and learning (relevance): 0.470

Good news? Well, correlation is not causation, and these results should not be taken as proof that rigorous research leads to impactful teaching. However, these numbers do suggest that business schools can do both relatively well and don’t have to choose between the pursuit of knowledge and the pursuit of relevance in management teaching. They also make the alternative story, that articles in top journals have nothing to do with actual management issues, less plausible. It seems that management research may be more than just debates between academics sitting in their ivory towers while the real world outside gets on with its business. It looks like rigor and relevance can go together. At the end of the day, as someone once remarked, can you be truly relevant without being rigorous?

