OR/MS Today - December 2002
The Paper Chase
Comparing the research productivity of quantitative/technical departments in schools of business
By Ravi Bapna and James R. Marsden
During the last two decades, as information systems/information technology developed in importance within business schools, a variety of department structures emerged. While in some cases, a new focused department was created to house IS/IT, a frequent tactic was to co-locate IS/IT with management science, operations management or a combination of the two. One argument for doing so was linked to the view that research in these areas was growing more and more integrated. Colloquially, such departments are also referred to as quantitative/IS departments, operations/IS departments or, in some instances, as technical/IS departments.
Given the emergence of these departments, we felt it was appropriate to ask how they are performing in the research arena. To address this question, we present a summary analysis comparing the research productivity of such departments in terms of publication activity in four key INFORMS journals that span the topics and frequently publish multi-disciplinary research from these areas. We chose the four INFORMS journals because of their prestige and because INFORMS, through its conferences and publications, represents high quality in quantitative/technical research. Our presentation is meant as an addition to available information and not as an exclusive or "ultimate" measure of research productivity.
To many of us, university administrators and external constituents seem overly interested in analytical comparisons. Yet we academics are hardly immune from this curiosity. We state our specific choice of "who," "what" and "over what period" to compare as follows:
WHAT TO COMPARE: publication productivity in four key INFORMS publications: Management Science, Information Systems Research, Operations Research and INFORMS Journal on Computing. These journals, with the exception of the INFORMS Journal on Computing, are part of the very limited and exclusive list used by BusinessWeek in ranking the best business schools.
We begin by defining the variables "operationalized" for data collection. This is followed by a summary of our data collection activities and how we dealt with anomalies or data conflicts that arose. Data summaries are provided in a series of tables that include, for comparison and completeness, information on research productivity since 1990 by university, school/college and departments falling into our selected set.
For our purposes, the required information is the author's name, department/group and university/organization. For the sake of completeness, we also recorded citation information such as article title, volume, issue, date and page numbers. Given that the vast majority of articles are co-authored (74.5 percent in our data set, shown later), we primarily base our comparisons on a weighting scheme that counts each article as an entity contributing one point to the pool. This point is then split evenly among the authors, with each author's share credited to his or her department. For instance, an article in Management Science with three authors, two from department A and one from department B, would credit department A with two-thirds of a point and B with one-third of a point.
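To make the weighting concrete, here is a minimal Python sketch of the computation; the department labels are hypothetical, and this is our illustration rather than code used in the study.

    from collections import defaultdict

    def weighted_credit(author_departments):
        """Split one point per article evenly among its authors,
        crediting each author's share to his or her department."""
        credit = defaultdict(float)
        share = 1.0 / len(author_departments)
        for dept in author_departments:
            credit[dept] += share
        return dict(credit)

    # The three-author example from the text: two authors in
    # department A, one in department B.
    print(weighted_credit(["A", "A", "B"]))
    # approximately {'A': 0.667, 'B': 0.333} -- two-thirds and
    # one-third of a point, as described above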
In order to compare department productivity, we analyzed authorship of articles appearing in the four relevant flagship INFORMS journals (MS, OR, ISR, IJOC) from 1990 to the present. We created a normalized relational database in Microsoft Access that allowed us to record the complete affiliation details of all the authors of all the articles published in this time frame. In many instances, however, information about department, school and university was only partially available; in several cases, only the university affiliation was given. These inconsistencies appeared across all four journals and throughout the period. We pursued several additional information-gathering steps, detailed below, to resolve them:
Incomplete information resolution. The primary source of anomaly in the data is the lack of full affiliation information on the first page of each article. We considered the information full if it carried three levels roughly corresponding to department, school and university. These levels may or may not exist, depending on the organizational structure of the particular university. If the department information for a particular author was missing, we took the following steps (a rough sketch of this lookup cascade follows the list):
a) We looked to see if the author already existed in our database from a prior publication and cross-checked that the author had the same university affiliation. If so, we considered it safe to assume that the person's department affiliation was also the same.
b) If step (a) failed, we checked the school's Web site to determine whether it had distinct departments or academic groups and, if so, to which the author belonged.
c) If steps (a) and (b) failed, we attempted to contact the author directly through e-mail.
d) If none of the first three steps yielded the information, we searched for the author's name using Google and tried to reach the author by telephone.
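As a rough Python sketch of this cascade, the following function walks the fallback order; the data structures are hypothetical stand-ins for the manual checks described above, and steps (c) and (d) remain manual.

    def resolve_department(author, university, prior_records, website_index):
        """Resolve a missing department affiliation via the fallback steps.

        prior_records: {author: (university, department)} from articles
        already in the database (step a). website_index: {(university,
        author): department} compiled by hand from school Web sites
        (step b). Both are hypothetical stand-ins for our manual checks."""
        # (a) Reuse a prior record if the university affiliation matches.
        if author in prior_records:
            prior_univ, prior_dept = prior_records[author]
            if prior_univ == university:
                return prior_dept
        # (b) Consult the school's Web site listing.
        if (university, author) in website_index:
            return website_index[(university, author)]
        # (c) and (d) -- e-mail, then Google and telephone -- are manual
        # steps; flag the record as unresolved for follow-up instead.
        return None

    # Hypothetical illustration: a prior record resolves the first case.
    prior = {"J. Smith": ("Univ. X", "Operations and Information Management")}
    print(resolve_department("J. Smith", "Univ. X", prior, {}))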
After all these steps, if we determined that a particular school, such as the Sloan School of Management at MIT, did not organize itself into such departments, we excluded that school from the department-level comparisons but included it in the school and university comparisons.
Identifying the comparison set of "IS/IT/Management Science" departments. Given the broad appeal of journals such as Management Science, we wanted to narrow our comparison set by identifying departments that have an IS/IT and OR/MS focus and belong to a business school. We did not want to compare such departments with, for instance, the pure OR departments found in many mathematics and industrial engineering programs. In several cases, the categorization is explicit in the department's name, such as the Operations and Information Management departments at The Wharton School (University of Pennsylvania) and at the University of Connecticut's School of Business. In other cases, we made the determination by examining departmental Web sites. For instance, the University of Washington Business School's Web site lists a Management Science Department, but indicates that it includes information systems, operations management and quantitative methods. Thus, the department comparison represents the relative productivity of those academic groups that have chosen to organize themselves into quantitative/IS groups within business schools. As mentioned earlier, if such a departmental organization is neither explicit nor implicit, we include those authors only in the school and university comparisons.
In summary, we constructed a database consisting of 3,338 articles published by 3,971 different authors belonging to 1,692 uniquely identifiable academic units. The comparison information derived from querying this database is presented below.
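For readers interested in the mechanics, here is a minimal sketch of the kind of normalized schema and weighted-measure query the text describes, rendered in Python with sqlite3 rather than Microsoft Access; the table and column names are our own assumptions, not those of the actual database.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Hypothetical normalized schema; the study used Microsoft Access.
        CREATE TABLE article (id INTEGER PRIMARY KEY, journal TEXT,
                              title TEXT, year INTEGER);
        CREATE TABLE unit (id INTEGER PRIMARY KEY, department TEXT,
                           school TEXT, university TEXT);
        CREATE TABLE authorship (article_id INTEGER REFERENCES article(id),
                                 author TEXT,
                                 unit_id INTEGER REFERENCES unit(id));
    """)

    # Weighted measure by department: each article is one point, split
    # evenly among its authors and credited to their units.
    weighted = conn.execute("""
        SELECT u.department, SUM(1.0 / n.cnt) AS points
        FROM authorship a
        JOIN unit u ON u.id = a.unit_id
        JOIN (SELECT article_id, COUNT(*) AS cnt
              FROM authorship GROUP BY article_id) n
             ON n.article_id = a.article_id
        GROUP BY u.department
        ORDER BY points DESC
    """).fetchall()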
Presentation of Results
Joint authorship. The co-authorship distribution is presented in Chart 1. The unimodal distribution peaks at two and indicates that while co-authorship is widely prevalent, the vast majority of articles have at most three authors. Given the extensive level of co-authorship, we also decided to investigate whether, among the productive departments, some had higher levels of intra-departmental collaboration than others. This was done using an absolute comparison scheme, the results of which are presented after the primary weighted comparisons.
Chart 1: Three-quarters of articles are co-authored.
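Under the hypothetical sqlite3 schema sketched earlier, the distribution behind Chart 1 reduces to a single grouping query:

    # Number of articles for each author count, reusing the in-memory
    # connection from the schema sketch above.
    distribution = conn.execute("""
        SELECT cnt AS num_authors, COUNT(*) AS num_articles
        FROM (SELECT article_id, COUNT(*) AS cnt
              FROM authorship GROUP BY article_id)
        GROUP BY cnt ORDER BY cnt
    """).fetchall()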
Department comparisons (weighted measure). Table 1 presents the 36 departments achieving the highest total weighted measure for publications in the four IS/IT-related flagship INFORMS journals over the period from 1990 to the present. A bold line separates the top 10 most productive performers from the rest.
Table 1: IS/IT/OR department comparisons (1990-2002).
It should be emphasized that the Table 1 comparisons relate only to those academic units that have chosen, either explicitly or implicitly, to organize themselves as such departments or groups. Notably productive research units such as MIT's Sloan School and CRITO at the University of California-Irvine do not come under this umbrella, but figure prominently in the business school comparisons shown below. [At Arizona State, the IS group was recently shifted into the Accounting Department; for the majority of our time period, however, the departmental structure fit the grouping analyzed here. We included relevant publications from both the Accounting and IS and the Decision and IS departments.]
In addition, we ran the queries separately for the first half (1990-1996) and the second half (1997-present) of the period in order to see whether productivity shifts had occurred. These results are reported in Tables 1a and 1b.
Table 1a: IS/IT/OR department comparisons (1990-1996).
Table 1b: IS/IT/OR department comparisons (1997-present).
Observe that several departments made large strides in the productivity comparisons during the last six years. Several jumped into the top 10, including Management Science and Information Systems at the University of Texas at Dallas (from 43rd to 6th), Operations and Information Management at the University of Connecticut (from 13th to 8th), and the Decision Sciences Department at the National University of Singapore (from 32nd to 9th).
Business school comparisons (weighted measure). Table 2 presents the 36 most productive business schools publishing in the four IS/IT-related flagship INFORMS journals. These totals include contributions from departments such as accounting and finance. MIT's Sloan School of Management outshines all others in this comparison.
Table 2: Business school comparisons (1990-2002).
University comparison (weighted measure). Table 3 presents the 36 most productive universities publishing in the four IS/IT-related flagship INFORMS journals. These totals include contributions from departments such as accounting and finance, as well as from applied mathematics and industrial engineering programs.
Table 3: University comparisons (1990-2002).
Comparisons using absolute measures. The comparisons above use a weighting scheme to account for co-authorship. For completeness, we also report comparisons using an absolute mechanism in which each author receives one point for being listed as an author of a published article in our data set. For co-authored papers, this means every co-author contributes a point to the pool. The absolute measure is thus biased in favor of co-authored articles: an article with five co-authors contributes five points, while a single-author article contributes only one. Table 4 presents the results.
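To make the contrast with the weighted scheme concrete, here is a small Python sketch of the absolute count applied to the same hypothetical three-author article used earlier (two authors in department A, one in department B):

    def absolute_credit(author_departments):
        # Each listed author contributes one full point to his or her
        # department, regardless of how many co-authors there are.
        credit = {}
        for dept in author_departments:
            credit[dept] = credit.get(dept, 0) + 1
        return credit

    print(absolute_credit(["A", "A", "B"]))   # {'A': 2, 'B': 1}
    # Under the weighted measure the same article yields only
    # {'A': 2/3, 'B': 1/3} -- three points in total versus one.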
Table 4: IS/IT/OR department comparisons (absolute measure, 1990-2002).
These comparison results are remarkably similar to the weighted measure results presented in Table 1. The departments at Penn State and University of Texas-Dallas move into the top 10, while Vanderbilt drops out.
As we noted in our introductory remarks, the results contained here are presented as information for discussion. They are not an attempt at an exclusive comparison process, but rather an initial look at how a set of relatively new department structures is performing in research output. We hope the information provided in this article is useful to readers and offers some insight into the research performance of the departments studied.
We have developed an accompanying Web site that will provide quarterly comparison updates and will include limited querying capability for our database. While we have made every effort to ensure the accuracy of the data and the department classifications, in a project of this magnitude some anomalies are bound to persist. We hope to work with the universities to resolve these, and we expect the Web site to be particularly useful in this regard. We are including individual measures in the Web site's design in the hope that researchers will not only look themselves up, but will also notify us of any inconsistencies in the recorded information. We would also like to suggest that INFORMS adopt a consistent affiliation-reporting policy across all of its journals.
Jim Marsden (email@example.com) is the Shenkman Family Chair and head of the Department of Operations and Information Management at the University of Connecticut. His research has appeared in Management Science, IEEE Transactions, American Economic Review, Journal of Economic Theory, Journal of Political Economy and numerous other outlets. Ravi Bapna (firstname.lastname@example.org) is an assistant professor in the Department of Operations and Information Management at the University of Connecticut. His research has appeared in Management Science, Decision Sciences, Naval Research Logistics, CACM and Decision Support Systems, among other outlets.
OR/MS Today copyright © 2003 by the Institute for Operations Research and the Management Sciences. All rights reserved.