The National University of Singapore (NUS) and the Nanyang Technological University (NTU) have generally fared well in the league tables that rank universities in the region and worldwide. But should they now stop playing the often maddening rankings game?
This question is being asked by some as such tables continue to proliferate - sometimes painting starkly different pictures.
In the last two weeks alone, three rankings were released - the annual World University Rankings compiled by London-based education consultancy Quacquarelli Symonds, Asia's 75 most innovative universities list by Reuters and the Times Higher Education World Reputation Rankings.
More will follow in the weeks and months ahead - including the top Asian universities list, the young universities rankings, top 50 under 50 (50 years of age that is), rankings by subject and graduate employability rankings. The estimate is that, currently, there are more than 40 global and regional rankings, with NUS and NTU doing fairly well in many.
But the big news with the QS ranking this year was NTU beating NUS to be placed 11th, while NUS fell three places to be ranked 15th.
In the Reuters list, though, NUS was ahead - it was placed 11th, while NTU was placed 25th.
In the Times Higher Education World Reputation Rankings, NUS was placed 27th, one place lower than last year, while NTU remained in the 81 to 90 band.
The validity of these various rankings is often called into question when universities make significant movements up or down the tables.
This happened two years ago with the QS ranking, which saw both NUS and NTU leap several places into the top 13 in the world. NUS went up from 22 to 12, while NTU's rise was even more dramatic - it went from 39 to 13, just one place behind NUS.
Their dramatic rise was partly due to a change in how research citations were counted. QS said this was to correct a bias arising from the large volume of citations in fields such as the life sciences and medicine, compared with the arts and humanities.
As a whole, the rankings are based on "bad social science", as academics have pointed out. They use a mix of subjective and objective data and some aspects, such as teaching quality, are assessed using proxy measures such as the ratio of academic staff to students and the number of staff with PhDs.
Then, there is the undeniable fact that with the exception of a few rankings, such as the European Commission's U-Multirank ranking, university ratings have always served a commercial purpose.
In the early years, rankings would appear as a supplement, boosting sales for the newspapers or magazines that carried them.
This is still the case for rankings published by magazines or education consultancies, except now they boost readership and advertising online.
Experts on university rankings such as Dr Richard Holmes, author of Watching The Rankings, have highlighted the "prestigious events in spectacular settings" organised by ranking companies.
Dr Holmes describes these events as "a lucrative mix of business opportunities" for ranking companies, which offer a slew of services including consultancy and workshops for world-class wannabes. And universities which want to host these events at their campuses have to fork out tens of thousands of dollars for the privilege.
In return, the hosts are showered with attention from the various media outlets covering the event. And as Dr Holmes points out, ranking companies have been known to recalibrate the weighting given to the indicators in their regional rankings, sometimes to the advantage of the university hosting the prestigious summit.
He quotes the example of Times Higher Education's Asian rankings last year, which featured an increased weighting for research income from industry and a reduced one for teaching and research reputation. This was to the disadvantage of Japan and to the benefit of Hong Kong, where the Asian summit was held.
Yet, despite all these criticisms, there is no denying the growing influence of rankings. Academics use the tables when deciding on positions, and some governments base funding decisions on the rankings of their home institutions. But what is worrying is the growing reliance of students on the rankings.
NTU and NUS officials report that, at their open houses, students and parents have questioned them on their placings. Singapore Management University has surveyed students on their choice of university, and their preference for highly ranked institutions comes through clearly.
Students heading overseas are even more reliant on rankings. According to a study by international student recruitment agency IDP, students around the world considering an overseas education first look at how high a university ranks, rather than at the courses offered or even the fees.
I had previously put the question of the usefulness of these comparisons to NTU president Bertil Andersson. He noted that, before these international comparisons existed, the quality of education provided by universities was simply assumed.
"Young universities like NTU may be making big improvements, but no one was looking at it, so we went unnoticed. So, it has been good for NTU," he argued.
NUS provost Tan Eng Chye, whom I spoke to recently, admitted that the university's rise in the rankings had given it more visibility and helped to attract top academics and students.
The high placings have also helped NUS forge deeper partnerships with top institutions around the world, and this has opened up opportunities for its students, graduates and professors.
I asked SMU provost Lily Kong about students basing their choice of universities on the tables. SMU, as a specialist institution, understandably lags behind - in the 441-450 band - in the QS ranking. But when compared with other specialist universities in the QS rankings, it moves up to 11th position.
SMU has also appeared in a few global rankings by disciplinary areas and done well in them. It was placed No. 1 in Asia in the Brigham Young University Accounting Research Rankings.
Dr Kong said rankings, especially those that compare like for like, can provide useful composite information on a university's progress over time and give students a broad indication of where a university stands in relation to its peers. But, like the other university heads, she advises students to base their choice of university on their interests, aspirations and the quality of the educational programmes and experience.
So to go back to the question posed at the start - it seems that love them or hate them, university rankings are here to stay.
It is true they do not do full justice to the strengths of a university, but they are being used as a basis for comparison for the growing number of globally mobile students, researchers and academics. What's more, league tables affect the accreditation of programmes, sponsorship decisions by donors and even fund allocation by governments.
As universities compete for talent and funds, they cannot ignore such rankings. The trick is not to become obsessed with them.