How we arrived at the rankings
Before you read further, we should remind you that the PaGaLGuY B-school Rankings are perception rankings that measure the preferences of the population polled. We believe that it is logically impossible to rank abstract entities such as b-schools by quality and be accurate at the same time (read why). What a ranking can do, however, is pick one problem faced by aspirants and solve it really well. In that sense, the PaGaLGuY B-school Rankings are an accurate measure of the brand value of a b-school relative to others, a parameter that aspirants hold paramount when applying to and joining MBA programs.
Salient features of the PaGaLGuY B-school Rankings, 2010
Total number of valid respondents polled (sample size): 9,576
Total number of b-schools in the initial list: 184
Total number of b-schools in the final rankings: 92
Of which Nationally Aspired-to b-schools: 55*
And Regionally Aspired-to b-schools: 37
(5 from Jaipur
14 from Mumbai
10 from Hyderabad
8 from Chennai)
* Two b-schools have been disqualified for trying to rig the rankings.
Selection of b-schools
For this year’s PaGaLGuY B-school Rankings, we took a bold leap by adding 99 more business schools to our initial list, taking the total number of b-schools ranked to 184 (as opposed to 85 in the previous year).
Where did the new b-schools come from? To the 85 b-schools from the 2009 rankings, we added prominent b-schools from the National Capital Region (New Delhi, Ghaziabad, Noida and Gurgaon), Jaipur, Kolkata, Mumbai, Pune, Hyderabad, Bangalore and Chennai. These cities were chosen because they are among the fastest-growing markets in business education, both in the number of MBA aspirants and in the proliferation of new business schools.
However, not all of these 184 b-schools catered to the same market, so we divided them broadly into ‘Nationally Aspired’ and ‘Regionally Aspired’ b-schools. The 57 b-schools that applicants from across the country broadly aspired to went into the National list, while those known primarily in their local regions, or with a channel of entry restricted to locals, went into the Regional lists. To identify b-schools known primarily in their local regions, we used voting pattern data from the previous two editions of the PaGaLGuY Rankings, along the lines sketched below.
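PaGaLGuY has not published the exact classification rule, but the idea lends itself to a simple sketch: treat a school as regional when one city dominates its historical votes. The function name, data layout and the 0.6 cutoff below are all illustrative assumptions, not the actual parameters used.

```python
def is_regional(votes_by_city, dominance=0.6):
    """Classify a b-school as 'Regionally Aspired' when a single
    city accounts for most of the votes it drew in the previous
    two surveys. The 0.6 cutoff is a hypothetical threshold."""
    total = sum(votes_by_city.values())
    top_city, top_votes = max(votes_by_city.items(), key=lambda kv: kv[1])
    return top_votes / total >= dominance, top_city

# Example: a school whose past votes came mostly from Mumbai.
print(is_regional({"Mumbai": 820, "Pune": 90, "Delhi": 60}))  # (True, 'Mumbai')
```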
The survey
The nucleus of the PaGaLGuY B-school Rankings is the CommunityRank methodology, which crowdsources the comparison of items in a list (such as b-schools) to a large group of people and aggregates their answers into a highly accurate picture of how the group collectively ranks those items.
(Certain parts of the text from here onwards require prior understanding of how ‘CommunityRank’ – our proprietary ranking methodology – works. So you might want to first get a basic understanding of CommunityRank and then come back here to resume reading.)
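CommunityRank itself is proprietary, so the sketch below is not its actual algorithm; it only illustrates the general family of techniques it belongs to: collecting pairwise votes and aggregating them into an ordering, here by a naive win-rate score. All names in it are hypothetical.

```python
from collections import defaultdict

def rank_by_win_rate(votes):
    """Aggregate pairwise votes into a ranking by win rate.

    `votes` is an iterable of (winner, loser) pairs, one per
    comparison a respondent answered.
    """
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    # Win rate = fraction of comparisons a school appeared in and won.
    win_rate = {s: wins[s] / appearances[s] for s in appearances}
    return sorted(win_rate, key=win_rate.get, reverse=True)

# Example: three comparisons involving three schools.
votes = [("IIM A", "School X"), ("IIM A", "School Y"), ("School X", "School Y")]
print(rank_by_win_rate(votes))  # ['IIM A', 'School X', 'School Y']
```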
During last year’s b-school rankings survey, in which 85 b-schools were compared using CommunityRank, we were repeatedly asked why the survey contained comparisons such as ‘IIM Ahmedabad vs Chetana, Mumbai’, when the two b-schools clearly attract different types of serious applicants. We were also questioned on how comparisons such as ‘Chetana, Mumbai vs ISME, Bangalore’ made sense when, in real life, an MBA applicant would very rarely face a choice between joining one of these schools (Chetana admits students through the CET exam regulated by the Maharashtra government, with most seats reserved for candidates residing in Maharashtra; ISME, on the other hand, is a regional player in Bangalore).
This year, therefore, we decided to add a necessary element of relevance to the surveys, and so the regional lists were born, with b-schools segregated by city. The rationale: the National rankings would measure the perception of b-schools desired across the nation, while the regional ranking for a city would rank regionally known b-schools for the benefit of local applicants looking for MBA programs in or near that city. Since familiarity with regional-level b-schools lies best with the local aspirant population, the survey would automatically serve relevant comparison questions about regional b-schools based on a respondent’s location.
In simple terms, the following is how the survey worked:
While everybody’s survey asked them to compare national b-schools, comparisons of a city’s regional b-schools were only given to respondents from that city.
Each time a respondent signed on to the survey, we asked for his city, hometown, or the place where he had stayed the longest. All respondents were given comparison questions about schools from the National list, while those residing in one of the eight cities on the regional list were also given comparison questions about b-schools from that city’s list.
Example: if a respondent entered ‘Mumbai’ as his city, his survey contained 38 comparisons from the National list and a dozen more from the Mumbai list.
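A minimal sketch of this assignment logic, assuming hypothetical school lists and question counts taken from the Mumbai example above:

```python
import random

# Hypothetical school lists; the real survey drew on 57 national
# b-schools and eight city-wise regional lists.
NATIONAL_LIST = ["School A", "School B", "School C", "School D"]
REGIONAL_LISTS = {
    "Mumbai": ["Chetana", "School M2", "School M3"],
    "Jaipur": ["School J1", "School J2"],
}

def sample_pairs(schools, n):
    """Draw n random pairs of distinct schools to compare."""
    return [tuple(random.sample(schools, 2)) for _ in range(n)]

def build_survey(city, n_national=38, n_regional=12):
    """Assemble one respondent's comparison questions: national
    questions for everyone, regional ones only for respondents
    from a covered city."""
    questions = sample_pairs(NATIONAL_LIST, n_national)
    if city in REGIONAL_LISTS:
        questions += sample_pairs(REGIONAL_LISTS[city], n_regional)
    return questions

# A Mumbai respondent gets 38 national + 12 Mumbai comparisons.
print(len(build_survey("Mumbai")))   # 50
print(len(build_survey("Indore")))   # 38 (no regional list)
```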
Based on each respondent’s declared work experience, gender and MBA interest, we also generated simultaneous rankings of the most preferred b-schools according to women, freshers, work-experienced aspirants, MBA students and MBA alumni.
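Such simultaneous sub-rankings reduce to filtering the response set on a declared attribute and re-running the same aggregation over what remains. A sketch, where the record fields and the `rank_fn` hook are hypothetical (`rank_by_win_rate` from the earlier sketch would fit):

```python
def sub_ranking(responses, predicate, rank_fn):
    """Generate a demographic sub-ranking: keep only respondents
    matching a predicate, pool their pairwise votes, and rank.
    Field names such as 'votes' and 'gender' are hypothetical."""
    votes = [v for r in responses if predicate(r) for v in r["votes"]]
    return rank_fn(votes)

# Usage with the win-rate aggregator sketched earlier:
#   women    = sub_ranking(responses, lambda r: r["gender"] == "female", rank_by_win_rate)
#   freshers = sub_ranking(responses, lambda r: r["work_ex_years"] == 0, rank_by_win_rate)
```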
In total, we received 12,077 responses, from which we filtered out 9,576 valid respondents. These 9,576 respondents were used to calculate the PaGaLGuY B-school Rankings 2010, an increase of over 73% over last year’s sample size of 5,510 respondents.
Data cleaning
The following checks and balances were in place to iron out spurious survey responses:
- The survey delivery system automatically kept track of each computer that the survey was delivered to, so if a respondent answered the survey multiple times, all of those responses would show up together.
- Extensive pattern-matching algorithms were applied to the database of responses to uncover unusual similarities in demographic information. These were also applied to multiple responses that originated from a single IP address.
- The majority of spurious responses were uncovered by these two methods.
Additionally, the following extra measures were taken:
- Responses originating from public proxy servers were rejected.
- We looked for suspicious voting patterns throughout the rankings, wherein a person or group of people had taken the survey with the methodical intention of pushing one b-school up and one or more b-schools down. Such rigging usually happens in collusion: a b-school’s students pre-plan their responses, always voting their own b-school up and rival b-schools down. All responses displaying such behaviour were rejected (a simplified sketch of this check appears after this list).
- As an extra measure of diligence, we also manually went through the entire list of responses to uncover abstract, non-machine-readable patterns, and rejected suspicious responses.
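The rigging check in particular reduces to a counting exercise. The sketch below flags respondents for whom one school wins nearly every comparison it appears in; the field names, `min_seen` and the 0.9 threshold are illustrative assumptions, not PaGaLGuY’s actual parameters.

```python
from collections import Counter

def flag_partisan_respondents(responses, min_seen=5, threshold=0.9):
    """Flag (respondent, school) pairs where a school the respondent
    saw at least `min_seen` times won almost every one of those
    comparisons -- the signature of a coordinated up-vote."""
    flagged = []
    for r in responses:
        wins, seen = Counter(), Counter()
        for winner, loser in r["votes"]:
            wins[winner] += 1
            seen[winner] += 1
            seen[loser] += 1
        for school, n in seen.items():
            if n >= min_seen and wins[school] / n >= threshold:
                flagged.append((r["respondent_id"], school))
    return flagged
```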
In particular, two widespread cases of rigging were found: one from the Vinod Gupta School of Management, IIT Kharagpur, and the other from the International Management Institute, Delhi. Both b-schools were disqualified from the rankings for unethical conduct in the survey by sections of their student communities.
Checking for validity of the rankings
Being a crowdsourced survey, the best and most important consistency check is whether the rankings reproduce themselves across subsets of the crowd.
In layman’s terms, we measured this in two ways:
Method A: If a crowd of 9,576 ranks b-schools in a certain way, any sufficiently large, randomly selected slice of that crowd should rank the schools in the same order, with minimal variation.
Method B: Further, the rankings generated from any three randomly selected, disjoint slices of the total crowd should also show minimal variation. For example, out of the total crowd of 9,576, we chose three slices of 3,000 respondents each, with no respondent common to any two slices, and looked for any major variations among the rankings generated from each slice.
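A minimal sketch of Method B, under the same hypothetical data layout as the earlier snippets: split the respondents into disjoint random slices, rank each slice independently, and report the largest position any school moves relative to the first slice.

```python
import random

def slice_consistency(responses, rank_fn, n_slices=3, slice_size=3000):
    """Method B sketch: rank disjoint random slices of respondents
    and report the worst rank shift of any school versus slice 0.
    Assumes every slice yields the same set of schools."""
    pool = random.sample(responses, n_slices * slice_size)  # shuffle, no repeats
    slices = [pool[i * slice_size:(i + 1) * slice_size]
              for i in range(n_slices)]
    rankings = [rank_fn([v for r in s for v in r["votes"]])
                for s in slices]
    base = {school: i for i, school in enumerate(rankings[0])}
    # Small maximum shifts across slices indicate a stable ranking.
    return [max(abs(i - base[s]) for i, s in enumerate(other))
            for other in rankings[1:]]
```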
After consistency checks, the following rankings were validated as fit for publishing:
National b-schools (Overall, Women, Freshers, Work-Ex, Alumni, Students and Aspirants)
Among Regional b-schools:
Mumbai
Jaipur
Chennai
Hyderabad
The following rankings were rejected either due to lack of consistency or too low a sample size:
New Delhi
Bangalore
Pune
Kolkata