The latest ranking of British universities produced by The Complete University Guide – 2009 has just been published. It is based on information officially supplied by the universities to the government statistics agency, the Higher Education Statistics Agency (HESA), and then cross-checked with each institution.
The ranking lets each user build their own table based on nine quality factors:
— Student Satisfaction
— Research Assessment
— Entry Standards
— Student–Staff Ratio
— Academic Services Spending
— Facilities Spending
— Good Honours
— Graduate Prospects
— Completion
See below for a description of each of these factors and how it is applied.
The top 10 places in the overall ranking are occupied by the following universities:
Oxford
Cambridge
London School of Economics
Imperial College
Warwick
Durham
St Andrews
University College London
SOAS
Lancaster
How the League Table works
The League Table measures nine key aspects of university activity using the most recent data available at the time of compilation. As we have mentioned, a statistical technique called the Z-transformation was applied to each measure to create a score for that measure. The Z-scores on each measure were then weighted (1.5 for Student Satisfaction and Research Assessment, 1.0 for the rest) and summed to give a total score for the university. Finally, these total scores were transformed to a scale where the top score was set at 1000, with the remainder expressed as a proportion of the top score. This scaling does not affect the overall ranking, but it avoids giving any university a negative overall score. In addition, some measures (Entry Standards, Student–Staff Ratio, Good Honours, and Graduate Prospects) have been adjusted to take account of the subject mix at the institution. The details of how the measures were compiled, together with some advice about their interpretation, are given below.
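The scoring pipeline described above can be sketched in Python. The guide does not publish its exact rescaling step, so the shift applied before scaling the top score to 1000 is an assumption that merely reproduces the stated properties (no negative scores, ranking unchanged):

```python
import statistics

def z_scores(values):
    """Standardise a measure's raw values to zero mean, unit variance."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def league_table_scores(measures, weights):
    """measures: {measure name: list of raw values, one per university};
    weights: {measure name: 1.5 or 1.0}.
    Returns total scores scaled so the top university gets 1000."""
    n = len(next(iter(measures.values())))
    totals = [0.0] * n
    for name, raw in measures.items():
        zs = z_scores(raw)
        for i in range(n):
            totals[i] += weights[name] * zs[i]
    # Assumption: shift so the lowest total is zero, then scale the top
    # to 1000 -- this keeps the ranking and avoids negative scores, as
    # the guide describes, though its exact transformation is not stated.
    low = min(totals)
    shifted = [t - low for t in totals]
    top = max(shifted)
    return [1000 * s / top for s in shifted]
```

For example, with two measures over three universities, the weighted Z-scores produce totals of 2.5, 0 and −2.5, which rescale to 1000, 500 and 0.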
Student Satisfaction
What is it? A measure of the view of students of the teaching quality at the university.
Where does it come from? The National Student Survey, a survey of final-year students in 2007.
How does it work? The National Student Survey asked questions about a variety of aspects of teaching. The average satisfaction score for the first fifteen questions, which relate most directly to teaching quality, was calculated.
What should you look out for? The survey is a measure of student opinion, not a direct measure of quality. It may therefore be influenced by a variety of biases, such as the effect of prior expectations: a top university expected to deliver excellent teaching could score lower than a weaker one that, despite offering lower-quality teaching, exceeds its students' expectations. Six Scottish universities were not included in the survey and were given the average outcome for all universities in the survey.
Research Assessment
What is it? A measure of the average quality of the research undertaken in the university.
Where does it come from? The 2001 Research Assessment Exercise undertaken by the funding councils.
How does it work? Each university department entered in the assessment exercise was given a rating of 5*, 5, 4, 3a, 3b, 2 or 1 (bottom). These grades were converted to a numerical scale and an average was calculated, weighted according to the number of staff in the department getting each rating. Staff not selected for the exercise were assumed to be conducting research at a level two grades below that of the outcome.
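A sketch of that staff-weighted average. The numeric values assigned to the RAE grades are not stated in the text, so the 7-down-to-1 scale below is an assumption:

```python
# Assumed numeric mapping for RAE grades: 5* = 7 down to 1 = 1
# (the guide's actual scale is not stated in the text).
RAE_SCALE = {"5*": 7, "5": 6, "4": 5, "3a": 4, "3b": 3, "2": 2, "1": 1}

def department_score(grade, selected, total_staff):
    """Score one department: staff not selected for the exercise are
    assumed to research two grades below the department's rating."""
    base = RAE_SCALE[grade]
    lower = max(base - 2, 0)
    return (selected * base + (total_staff - selected) * lower) / total_staff

def university_average(departments):
    """departments: iterable of (grade, selected_staff, total_staff).
    Returns the staff-weighted average across the university."""
    total = sum(t for _, _, t in departments)
    return sum(department_score(g, s, t) * t for g, s, t in departments) / total
```

So a 5-rated department that entered 8 of its 10 staff scores (8×6 + 2×4)/10 = 5.6 rather than a full 6.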
What should you look out for? The rating of 5*, 5, etc, is accompanied by a letter which indicates the proportion of staff included in the assessment. Thus, a 5A indicates that most staff were of 5 standard but a 5F indicates that most staff were not included in the return (and so unlikely to be active at that level). In Scotland, the SHEFC announced in advance that it would not distinguish between 5* and 5-rated departments for funding purposes. This may have affected the strategies adopted by some Scottish universities with the result that they obtained fewer 5* ratings than they might otherwise have done. The next RAE results will be available later this year.
Entry Standards
What is it? The average UCAS tariff score of new students under the age of 21.
Where does it come from? HESA data for 2006–07.
How does it work? Each student’s examination results were converted to a numerical score (A level A=120, B=100…E=40 etc; Scottish Highers A=72, B=60 etc) and added up to give a total score. HESA then calculated an average for all students at the university. The results were then adjusted to take account of the subject mix at the university.
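A minimal sketch of the tariff calculation. Only the points quoted in the text (A level A=120 down to E=40; Scottish Higher A=72, B=60) are given; the remaining Higher values are extrapolated and should be treated as assumptions:

```python
# A-level points as quoted: A=120 down to E=40 in steps of 20.
A_LEVEL = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40}
# Scottish Highers: A=72 and B=60 per the text; C and D extrapolated (assumed).
HIGHER = {"A": 72, "B": 60, "C": 48, "D": 36}

def tariff_score(a_levels=(), highers=()):
    """Total tariff points for one student's examination grades."""
    return sum(A_LEVEL[g] for g in a_levels) + sum(HIGHER[g] for g in highers)

def average_entry_standard(student_scores):
    """Average tariff across all young entrants at a university."""
    return sum(student_scores) / len(student_scores)
```

A student with grades AAB at A level therefore contributes 120 + 120 + 100 = 340 points to the university's average.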
What should you look out for? A high average score (it is over 400, or more than three As at A level, at some universities) does not mean that all students score so highly or that you need to take lots of A levels to get in. The actual grades needed will vary by subject, and few if any courses will ask for grades in more than three subjects (even if some students do take more). Universities with a specific policy of accepting students with low grades as part of an access policy will tend to have their average score depressed.
Student–Staff Ratio
What is it? A measure of the average staffing level in the university.
Where does it come from? Calculated using HESA data for 2006–07.
How does it work? A student:staff ratio (i.e. the number of students divided by the number of staff) was calculated in a way designed to take account of different patterns of staff employment in different universities. Again, the results were adjusted for subject mix.
What should you look out for? A low SSR, i.e. a small number of students for each member of staff, does not guarantee good quality of teaching or good access to staff. Universities with a medical school, where SSRs are usually low, will tend to score better.
Academic Services Spending
What is it? The expenditure per student on all academic services.
Where does it come from? HESA data for 2004–05, 2005–06 and 2006–07.
How does it work? A university’s expenditure on library and computing facilities (books, journals, staff, computer hardware and software, but not buildings), museums, galleries and observatories was divided by the number of full-time equivalent students. Libraries and information technology are becoming increasingly integrated (many universities have a single Department of Information Services encompassing both) and so the two areas of expenditure have both been included alongside any other academic services. Expenditure over three years was averaged to allow for uneven expenditure.
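The per-student calculation with three-year smoothing might look like the sketch below. Whether the guide averages per-year ratios or pools the totals first is not stated, so the per-year version is an assumption:

```python
def spend_per_student(spend_by_year, fte_by_year):
    """Average expenditure per FTE student over three years.
    Assumption: each year's spend is divided by that year's FTE count,
    then the three yearly figures are averaged to smooth uneven spending."""
    per_year = [spend / fte for spend, fte in zip(spend_by_year, fte_by_year)]
    return sum(per_year) / len(per_year)
```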
What should you look out for? Some universities are the location for major national facilities, such as the Bodleian Library in Oxford and the national computing facilities in Bath and Manchester. The local and national expenditure is very difficult to separate and so these universities will tend to score more highly on this measure.
Facilities Spending
What is it? The expenditure per student on staff and student facilities.
Where does it come from? HESA data 2004–05, 2005–06 and 2006–07.
How does it work? A university’s expenditure on student facilities (sports, careers services, health, counselling, etc.) was divided by the number of full-time equivalent students. Expenditure over three years was averaged to allow for uneven expenditure.
What should you look out for? This measure tends to disadvantage some collegiate universities as it mostly includes central university expenditure. In Oxford and Cambridge, for example, a significant amount of facilities expenditure is by the colleges but it has not yet been possible to extract comparable data from the college accounts.
Good Honours
What is it? The percentage of graduates achieving a first or upper second class honours degree.
Where does it come from? HESA data for 2006–07.
How does it work? The number of graduates with first or upper second class degrees was divided by the total number of graduates with classified degrees. Enhanced first degrees, such as an MEng awarded after a four-year engineering course, were treated as equivalent to a first or upper second for this purpose, while Scottish Ordinary degrees (awarded after three years rather than the usual four in Scotland) were excluded altogether. The results were then adjusted to take account of the subject mix at the university.
What should you look out for? Degree classifications are controlled by the universities themselves, though with some moderation by the external examiner system. It can be argued, therefore, that they are not a very objective measure of quality. However, degree class is the primary measure of individual success in British higher education and will have an impact elsewhere, such as employment prospects.
Graduate Prospects
What is it? A measure of the employability of a university’s graduates.
Where does it come from? HESA data for 2005–06.
How does it work? The number of graduates who took up employment or further study was divided by the total number of graduates with a known destination, expressed as a percentage. Only employment in an area that normally recruits graduates was counted. The results were then adjusted to take account of the subject mix at the university.
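The calculation as described, counting only destinations known to HESA. The category labels here are illustrative, not HESA's actual destination codes:

```python
def graduate_prospects(destinations):
    """destinations: one label per graduate (illustrative labels, not
    HESA's codes). Graduates with an unknown destination are excluded
    from the denominator; only graduate-level employment and further
    study count towards the numerator. Returns a percentage."""
    known = [d for d in destinations if d != "unknown"]
    positive = sum(1 for d in known if d in ("graduate job", "further study"))
    return 100 * positive / len(known)
```

Note that a graduate in a non-graduate job counts against the measure, while one with an unknown destination is simply left out, which is why a low score does not equal high unemployment.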
What should you look out for? A relatively low score on this measure does not mean that many graduates were unemployed. It may be that some had low-level jobs such as shop assistant, which do not normally recruit graduates. Some universities recruit a high proportion of local students; if they are located in an area where graduate jobs are hard to come by, this can depress the outcome. A measure of the employability of graduates has been included in the HEFCE performance indicators, but this is only available at institution level. The HESA data was used so that a subject-mix adjustment could be made.
Completion
What is it? A measure of the completion rate of those studying at the university.
Where does it come from? HESA performance indicators, based on data for 2005–06 and earlier years.
How does it work? HESA calculated the expected outcomes for a cohort of students based on what happened to students in the current year. The figures in the Table show the percentage of students who were expected to complete their course or transfer to another institution.
What should you look out for? This measure of completion is a projection based upon a snapshot of data. It is therefore vulnerable to statistical fluctuations.