Organizacija, Volume 48, Number 2, May 2015, Research papers
DOI: 10.1515/orga-2015-0008

Quality Evaluation Information Support in Higher Education

Tatjana Kovač
Faculty of Commercial and Business Sciences, Lava 7, 3000 Celje, tanja.kovac@fkpv.si

Background and Purpose: The objective of this research was to develop a proposal for evaluation information support in higher education that encompasses the work of both Quality Assessors and Quality Analysts. The proposal also covers the fields evaluated in the processes of external evaluation of higher education institutes by the Slovenian National Agency for Quality Assurance in Higher Education (hereinafter referred to as "NAKVIS").

Methodology: The Multi-Attribute Utility Theory (hereinafter referred to as "MAUT") method and MS Excel were used to support the work of the Quality Assessors, while expert modelling in the Decision Expert programme (hereinafter referred to as the "Dexi programme") was used to support the work of the Quality Analysts.

Results: The identified criteria allow a uniform evaluation, regardless of the Assessor. An MS Excel template with automatic calculation was made as the technical support for an evaluator, and an expert model in the Dexi programme was designed for a Quality Analyst.

Conclusion: The model of Higher Education Institute quality evaluation presented in this article can provide a comprehensive and transparent consideration of quality at the Faculty and, in particular, facilitate the evaluation process through its information and technical support.

Keywords: quality in higher education, quality evaluation, quality evaluation information support, expert modelling

1 Introduction

Quality in higher education has been an ongoing topic of debate among both expert and political audiences. Higher Education Institutes must monitor their performance quality annually and assess the progress made towards a higher quality.
Every seven (7) years, the quality of a Higher Education Institute in Slovenia and its Study Programmes is verified by the National Agency for Quality Assurance in Higher Education (NAKVIS). The basis for the quality evaluation of a Higher Education Institute and its Study Programmes is the Rules prescribed by NAKVIS (NAKVIS, 2014). The Rules include numerous instructions and recommendations for the Assessors but, on the other hand, offer neither a uniform methodology nor technical assistance for the evaluation. Therefore, in the evaluation processes, various Assessors use different approaches to the evaluation, due to which the results are not comparable. The problem is therefore worth exploring, with the intention of designing an information support model for Higher Education Institute quality evaluation. The following research questions have been posed:
1. Can a uniform quality evaluation be achieved with the aid of appropriate information support, irrespective of the Assessors (internal assessment experts or the NAKVIS experts)?
2. Can transparent monitoring of quality, both inside an individual school and among different schools, be achieved with the help of an expert model?
The second question can be developed into further research, which could also demonstrate that a quality evaluation base of the Higher Education Institutes in Slovenia can be created by NAKVIS through the use of the expert system. The quality evaluation process modelling of Higher Education Institutes achieves the following objectives:
■ It includes all fields of assessment determined by the Rules in the evaluation,
■ It identifies all the crucial criteria for the evaluation of the assessed fields,
■ It determines the effect of individual criteria on the entire assessment of quality and
■ It gives information support to the evaluation process.
Received: December 12, 2014; revised: March 4, 2015; accepted: March 13, 2015

The model represents a scientific contribution to the field of quality management in higher education. There are several opportunities for improvement in the field of Higher Education Institutes' quality evaluation (Paci et al., 2013). Despite the uniform rules and recommendations for accreditation and external evaluation of Higher Education Institutes and study programmes (hereinafter referred to as the "Rules"; NAKVIS, 2014), it has been noted that various experts have different views on quality in higher education (Davies and Ellison, 1997; Caldwell, 2008; Wintersteiner, 2003; Giancola and Hutchison, 2005; Morrison, 1998; Glasser, 1998). Each university in Slovenia has developed its own self-evaluation quality model, but there is no information on whether the processes have information support. The available research shows that evaluation models are engaged primarily with the evaluation content (criteria), for example, the model of internal evaluation at the University of Maribor (Pauko, 2011) or the case of evaluation in adult education (Zoric, 2004; Kovac, 2002). However, there are no clear evaluation methods and techniques (compare with Cret, 2011). Evaluations that result from self-evaluation or external evaluations of institutes are not transparent. Therefore, an understandable and transparent comparison between the evaluation periods of a certain institute, or a comparison between already evaluated institutes, is not possible. The Evaluation Reports usually encompass comprehensive studies that do not answer how good a certain school is, or why and in which fields one school is better than another. Due to this, the Quality Analyst needs a lot of time to extract the essence from them and plan the appropriate actions (Sarrico et al., 2010).
2 Development of the model

The starting points for the development of the model are the Higher Education legislation, the NAKVIS Regulations and the expert knowledge of colleagues (the skilled experts for quality evaluation at NAKVIS). In this research, a systematic approach with the following steps has been used in the development of the model:
■ Identification and hierarchic arrangement of criteria,
■ Determination of the criteria influence,
■ Determination of the evaluation rules,
■ Formulation of the evaluation template as a tool for the Quality Assessors and
■ Creation of the expert model for the support of Quality Analysis.

2.1 Identification and the hierarchic arrangement of criteria

Development of the Higher Education Institute quality evaluation model is based on the Rules (NAKVIS, 2014), which assess the quality of the institute according to the following six fields: integration with the environment, the operation of the institute, human resources, students, material conditions, and the field of quality, innovation and development (Figure 1; compare Sultan and Wong, 2014). The Rules for higher education quality evaluation (NAKVIS, 2014) give recommendations and provisions on what to evaluate in a specified field of assessment. As the criteria in the Rules are not specified explicitly, different understandings of the evaluation could be present among various assessors. Due to this problem, we have identified all the crucial and uniformly defined criteria.
Figure 1: The fields of quality assessment in a Higher Education Institute

For example, the criteria for the evaluation of the institute's operation are the following: mission, vision and strategy in line with the objectives, documented achievement of objectives, internal organisation and a transparent operation of authority bodies, defined competences, guaranteed participation in decision-making, developed scientific-research work (SRW) and professional co-operation with other institutions, scientific-research work in the study programmes and projects, publications of scientific-research work results, integration of scientific-research work into education (reform of teaching contents), and the arrangements or agreements on students' practicum. A list of criteria from all fields of assessment (Figure 1) was arranged based on the hierarchy and the breakdown of criteria into depth. The breakdown into depth is presented with the numbering of levels. As an example of identification and the hierarchic arrangement of criteria, the criteria for the field of »Operation of the Institute« are presented below (Table 1). The process was then repeated for all the assessment fields. Even though the criteria system may seem complicated for the evaluation, only the criteria at the deepest level are actually evaluated, while the evaluations at the higher levels are determined automatically based on the evaluation rules.
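The roll-up just described, where only leaf criteria receive marks and every higher level is derived automatically, can be sketched as follows. The structure mirrors a fragment of Table 1, but the marks and the equal weights are invented purely for illustration; the actual evaluation rules are set by the experts, as described in the next subsections.

```python
# Hypothetical sketch: leaf criteria carry assessor marks directly,
# internal nodes are computed as a weighted average of their children.
# Weights and marks here are illustrative, not the official values.

def score(node):
    if "mark" in node:                  # leaf criterion: evaluated directly
        return node["mark"]
    return sum(w * score(child)         # internal node: derived automatically
               for w, child in node["children"])

operation = {"children": [
    (0.5, {"children": [                # 2.1 ORGANISATION
        (0.5, {"mark": 2}),             # 2.1.1 Mission, vision and strategy
        (0.5, {"mark": 4}),             # 2.1.2 Documented achievement
    ]}),
    (0.5, {"mark": 3}),                 # 2.2 SRW and results (collapsed to one mark)
]}

print(score(operation))  # 3.0
```

Only the two marks on 2.1.1 and 2.1.2 and the single mark on 2.2 are entered by hand; the marks for 2.1 and for the whole field fall out of the rules.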
Table 1: An example of hierarchical criteria arrangement for the field of operation of the institute

2 OPERATION OF THE INSTITUTE
2.1 ORGANISATION
2.1.1 Mission, vision and strategy in line with the objectives
2.1.2 Documented achievement of objectives
2.1.3 Internal organisation and a transparent operation of authority bodies
2.1.4 Defined competences, guaranteed participation in decision-making
2.2 SCIENTIFIC-RESEARCH WORK AND RESULTS
2.2.1 Developed scientific-research work and professional co-operation with other institutes
2.2.2 Scientific-research work in the study programmes and projects
2.2.3 Publication of scientific-research work results
2.2.4 Integration of scientific-research work into education (reform of teaching contents)
2.3 OPERATION FOR STUDENTS
2.3.1 Arrangements and agreements on students' practicum; organisation of practicum at school
2.3.2 Monitoring of students' learning outcomes and competencies of graduates (planned vs. achieved)
2.3.3 Monitoring of students' progress; actions
2.4 INTERNATIONAL ACTION
2.4.1 International researches, programmes, agreements
2.4.2 Projects of integration into the higher education space of the EU
2.4.3 Mobility programmes (students, teachers, and personnel)
2.4.4 Foreign students' enrolment

2.2 Determination of the criteria influence

For the purposes of the automatic calculation of the evaluation criteria at higher levels of the hierarchical tree, it was necessary to determine the evaluation rules covering the impact, or importance, of the individual criteria. The influences of the criteria, i.e. their weights, were determined on the basis of professional opinion: five verified NAKVIS professionals determined the weights of the criteria. In the further development of the model, the average weight values determined by the included experts were used.

2.3 Determination of the evaluation rules

Furthermore, the criteria need values against which to be assessed.
Such values are called measurement scales and usually consist of five or three steps, depending on how precise the assessment should be. In the Excel template, numerical evaluations ranging from 1 to 5 are used for the evaluation of criteria, while the evaluations calculated automatically by the MAUT method are given to two decimal places. Such accuracy suffices for the Institute's quality evaluation and for the statistical comparison between individual evaluations. The scale domain in the Dexi programme consists of semantic values, in order to keep the semantic idea of measuring, comparing and explaining the particular criteria, for example: not suitable, less suitable, suitable, very suitable, and excellent (Zangoueinezhad and Moshabaki, 2011). Namely, the Dexi programme operates on the basis of descriptive evaluations and their combinations. A numerical interval is assigned to each descriptive evaluation (for example, the numerical interval 1-1.58 represents the semantic assessment "not suitable"). For this reason, the conversions from numerical to descriptive evaluations are done automatically. In doing so, a part of the evaluation accuracy is lost, but this is not a crucial factor for the qualitative analysis in the Dexi programme: the transparent information given by the semantic evaluation is much more important.

2.4 Evaluation information support

Evaluation information support for the Higher Education Institute is designed separately for the Commission of experts that evaluates the institute (the Assessors) and for the Analysts who deal with the institute's quality system. The technical support for the first group is made in the MS Excel programme, while the second group can use expert modelling in the Dexi programme. The Excel template uses the MAUT method for the automatic calculation of dependent criteria assessments (Bohanec, 2006; Table 3).
The input data for the automatic evaluation calculations are the average Assessors' evaluations. The technical support for the Quality Analyst is the computer programme Dexi (Jereb, Bohanec and Rajkovic, 2003). The Dexi programme is an expert system shell intended for the support of decision-making in cases where the best solution is chosen from among many, with respect to numerous observed criteria (Adelman, 1992; Benkovic et al., 1998; Rajkovic and Bohanec, 1991; Rajkovic, 1999). Almost all real problems are of this kind, including the evaluation of schools' quality. The programme is freeware (Bohanec, 2014), works in a Windows environment and has a simple user interface. The model in the Dexi programme includes a hierarchic criteria tree (Figure 2), made in accordance with the sample from Table 2. For this purpose, a 5-step evaluation scale with descriptive evaluations was used, viz: not suitable, less suitable, suitable, very suitable, and excellent. Namely, the Dexi programme works on the basis of descriptive evaluations, which gives the analyst, and all who receive the evaluation results, a good notion of the evaluation. For this purpose we have determined the rules for the conversion of numerical intervals into descriptive evaluations. The conversion is done automatically. The model combines the evaluation fields into the so-called »X quality« and »Y quality« (compare with Kovac, 2010). The X quality is represented by the fields that are more or less governed by laws and regulations, due to which it could also be called the objective quality. On the other hand, the Y quality connects the fields that are more subjective or more typical of the culture of a Higher Education Institute. The Y quality is affected by so-called "soft factors", such as democratic leadership (Kohlberg, 2006) and the involvement of students in management, work, and development (Cunningham, 2002).
The fields contribute equally to the X and Y quality.

3 Process and Results of the Evaluation

The Assessors enter their marks into the evaluation sheets (a part of the evaluation sheet is shown below in Table 2). The criteria are evaluated with marks from 1 to 5, with 1 being the worst and 5 being the best mark. The marks of all Assessors are then entered into the Excel template; the subsequent calculations use the average of the marks (a part of an Excel template is shown in Table 3). In the first row of Table 3, the cell on the extreme right shows the calculated mark for the field "institute's operation" (3.27). The marks for the other fields are calculated in the same way. On a separate sheet of the Excel template, the marks from all the fields are gathered automatically, and a final, global estimation of the institute's quality is then calculated based on the evaluation rules (Table 4). The institute's quality mark shown in the sample is 3.42. A separate Excel template is used for each evaluation year. The evaluation process is then continued by the organisation's Quality Analyst.
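The aggregation just described is a weighted average at each level of the hierarchy. A minimal sketch, using the averages and weights from Tables 3 and 4 of this article, reproduces both calculated marks:

```python
# Weighted MAUT aggregation as performed by the Excel template:
# each mark at a higher level is the sum of (average mark x weight)
# over its sub-criteria.

def weighted_mark(pairs):
    """Round the weighted sum to two decimals, as in the template."""
    return round(sum(avg * w for avg, w in pairs), 2)

# Field "institute's operation": sub-field averages and weights (Table 3)
operation = [(2.58, 0.25), (3.50, 0.25), (3.56, 0.27), (3.42, 0.23)]
print(weighted_mark(operation))  # 3.27, as in the first row of Table 3

# Global institute quality: field averages and weights (Table 4)
institute = [(3.83, 0.1433), (3.14, 0.2033), (3.61, 0.1833),
             (3.71, 0.2033), (2.94, 0.1233), (3.18, 0.1433)]
print(weighted_mark(institute))  # 3.42, as in Table 4
```

The same formula applies at every level, which is what makes the template's calculation fully automatic once the leaf marks are entered.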
With the help of the Excel template, the analyst receives the marks of all independent criteria (the ones at the deepest level of the hierarchy; a sample is shown in Table 5).

Figure 2: Expert model in the Dexi programme and the evaluation scale (the hierarchical criteria tree: X quality covers integration into the environment, the operation of the institution and material conditions; Y quality covers human resources, students and quality; each criterion is rated on the five-step scale from »not suitable« to »excellent«)

Conversion from numerical into descriptive grades is needed for further expert modelling and analysis of the marks in the Dexi programme. The conversion of marks is done automatically according to the set rules (for example, 1.59-2.58 means the descriptive grade »less suitable«). These descriptive evaluations are entered into the Dexi programme by the Analyst (Figure 3). Each variant in the model represents one evaluation period, for example, a year. The variant called »simulation« is made by the analyst as a desired or planned state of quality, which presents the basis for planning changes and acting in the direction of quality improvement.
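The automatic numeric-to-descriptive conversion can be sketched as follows. Only the first two interval bounds (1-1.58 and 1.59-2.58) are stated in the article; extending the remaining bounds in equal steps of one is our assumption.

```python
# Conversion of a numerical mark (1-5) into a Dexi descriptive grade.
# Upper bounds beyond 2.58 are assumed, continuing in steps of 1.

BANDS = [(1.58, "not suitable"),
         (2.58, "less suitable"),
         (3.58, "suitable"),        # assumed bound
         (4.58, "very suitable"),   # assumed bound
         (5.00, "excellent")]

def descriptive(mark):
    for upper, label in BANDS:
        if mark <= upper:
            return label
    raise ValueError("mark outside the 1-5 scale")

# These match the examples in Table 5:
print(descriptive(1.33))  # not suitable
print(descriptive(2.33))  # less suitable
print(descriptive(4.33))  # very suitable
```

Under these assumed bounds the conversion reproduces all the descriptive grades shown in Table 5.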
3.1 Model testing

The proposed information support model for Higher Education Institute quality evaluation was tested in practice, in the process of the regular annual self-evaluation of quality at the Faculty of Commercial and Business Sciences in the closing stage of this research. The model was presented in detail to all Faculty employees and its management. The panel of Assessors was composed of the same members as in the previous year, so it was easier to compare the old and new systems. As it turned out, the new evaluation model brings a large saving of time and administrative work, while the proposed information support is user-friendly and does not demand much training. The evaluation is done according to a uniform system, and comparable results are obtained after a few repetitions of the evaluation process. In doing so, we have also answered the research questions posed in the Introduction. The model itself is designed in such a way that it can be supplemented with new evaluation fields, new criteria or different evaluation rules. Additionally, the criteria weights can also be changed if, in practice, it emerges that some quality criteria are more or less important. For the purpose of wider use, the model would have to be tested at more Higher Education Institutes, and additional expert opinions would have to be gathered on the criteria and evaluation rules.

Table 2: An example of the evaluation sheet

2 OPERATION OF THE INSTITUTE (marks: 1-5)
2.1.1 Mission, vision and strategy in line with the objectives
2.1.2 Documented achievement of objectives
2.1.3 Internal organisation and transparent operation of authority bodies
2.1.4 Defined competences, guaranteed participation in decision-making
2.2.1 Developed scientific-research work (SRW) and professional co-operation with other institutions
2.2.2 Scientific-research work in the study programmes and projects
2.2.3 Publication of scientific-research work results
2.2.4 Integration of scientific-research work into education (reform of teaching contents)
2.3.1 Arrangements and agreements on students' practicum; organisation of practicum at school
2.3.2 Monitoring of students' learning outcomes and competencies of graduates
2.3.3 Monitoring of students' progress
2.4.1 International researches, programmes, agreements
2.4.2 Projects of integration into the higher education space of the EU
2.4.3 Mobility programmes (students, teachers, personnel)
2.4.4 Foreign students' enrolment

Table 3: An example of the Excel template for the assessment of the institute's operation

2 INSTITUTE'S ACTIVITY (experts' assessments Ea-1, Ea-2, Ea-3; calculated field mark: 3.27)
2.1 ORGANISATION: average 2.58, weight 25 %, weighted 0.65
  2.1.1 Mission, vision and strategy in line with the objectives: 1, 2, 1; avg. 1.33
  2.1.2 Documented objectives: 2, 3, 2; avg. 2.33
  2.1.3 Internal organisation and transparent operation of authority bodies: 2, 3, 2; avg. 2.33
  2.1.4 Defined competences, guaranteed participation in decision-making: 4, 5, 4; avg. 4.33
2.2 SCIENTIFIC-RESEARCH WORK AND RESULTS (SRW): average 3.50, weight 25 %, weighted 0.88
  2.2.1 Developed scientific-research work and professional co-operation with other institutions: 5, 4, 5; avg. 4.67
  2.2.2 Scientific-research work in the study programmes and projects: 2, 3, 2; avg. 2.33
  2.2.3 Publication of the scientific-research work results: 4, 4, 4; avg. 4.00
  2.2.4 Integration of scientific-research work into education (reform of teaching contents): 3, 3, 3; avg. 3.00
2.3 OPERATION FOR STUDENTS: average 3.56, weight 27 %, weighted 0.96
  2.3.1 Arrangements and agreements on students' practicum; organisation of practicum at school: 3, 4, 3; avg. 3.33
  2.3.2 Monitoring of students' learning outcomes and competencies of graduates (planned vs. achieved): 2, 5, 2; avg. 3.00
  2.3.3 Monitoring of students' progress; actions: 5, 3, 5; avg. 4.33
2.4 INTERNATIONAL ACTION: average 3.42, weight 23 %, weighted 0.79
  2.4.1 International researches, programmes, agreements: 3, 2, 3; avg. 2.67
  2.4.2 Projects of integration into the higher education space of the EU: 4, 3, 4; avg. 3.67
  2.4.3 Mobility programmes (students, teachers, personnel): 2, 4, 2; avg. 2.67
  2.4.4 Foreign students' enrolment: 5, 4, 5; avg. 4.67

Table 4: An example of the automatic mark calculation of the institute's quality

The fields of quality evaluation (average mark, weight, weighted mark):
1 INTEGRATION WITH THE ENVIRONMENT: 3.83, 14.33 %, 0.55
2 THE OPERATION OF THE INSTITUTE: 3.14, 20.33 %, 0.64
3 HUMAN RESOURCES: 3.61, 18.33 %, 0.66
4 STUDENTS: 3.71, 20.33 %, 0.75
5 MATERIAL CONDITIONS: 2.94, 12.33 %, 0.36
6 QUALITY, INNOVATION AND DEVELOPMENT: 3.18, 14.33 %, 0.46
INSTITUTE'S QUALITY (scale 1-5): 3.42

Table 5: An example of descriptive marks of the independent criteria that are entered into the expert model

2.1 ORGANISATION (mark from Excel, descriptive evaluation):
2.1.1 Mission, vision and strategy in line with the objectives: 1.33, not suitable
2.1.2 Documented achievement of objectives: 2.33, less suitable
2.1.3 Internal organisation and transparent operation of authority bodies: 2.33, less suitable
2.1.4 Defined competences, guaranteed participation in decision-making: 4.33, very suitable

Figure 3: The result of evaluation in the Dexi computer programme (descriptive evaluations of all criteria for the variants 2012/13, 2013/14, 2014/15 and the simulation variant 2015/16)

4 Discussion and conclusion

The model of higher education institute quality evaluation, as presented in this article, can provide a comprehensive and transparent consideration of quality at the Faculty and, in particular, facilitate the evaluation process through its information and technical support; this is also the applied contribution of the model. The model includes all evaluation fields in accordance with the NAKVIS Rules. It comprises systematically arranged evaluation criteria and evaluation rules set by the profession, as well as the methods and instruments intended for the evaluation and analysis of quality. The expert model is prepared in such a way that it can be used either for an individual assessed field or for the assessment of the institute as a whole. A higher education institute is obliged to prepare a quality evaluation, a so-called self-evaluation, every year. Therefore, it is sensible to evaluate all fields each year. In this way, a highly transparent analysis of the annual self-evaluation results can be obtained, with a clear view of progress or a comparison of quality achievements in individual years.
The use of these methods and instruments in practice is easy and simple: all one has to do is enter the data into the electronic Excel spreadsheet, and the calculations are carried out automatically. The expert modelling in the Dexi programme enables the Quality Analyst to produce an in-depth analysis of past evaluations, determine the effects of actions on their basis, and propose future changes in the direction of greater quality. It is also possible to produce simulations of the desired quality state and plan the way to its achievement. The presented quality evaluation model fulfils all the set objectives: the evaluation includes all assessment fields, it identifies the crucial criteria for the evaluation of the assessed fields, it determines the effect of individual criteria on the total quality evaluation, and it gives information support to the evaluation process. This content and method present additional value to the science of quality management in higher education. However, it needs to be emphasised that neither the evaluation technique in MS Excel nor the quality evaluation done with the expert model in the Dexi programme replaces the role of the Assessor, the expert for quality. Every Assessor has his or her own perspective on quality (for example, on the meaning (weights) of the criteria in the expert model, which also showed itself in the internal research among some of the NAKVIS experts). The evaluation model provides information support in the evaluation process, but the actual estimation is determined and argued by the expert Assessor. The model has been limited to the quality evaluation of a Higher Education Institute, but it can be supplemented at any time with the Study Programmes' evaluation criteria. The basis is presented in Figure 4.
As in the case of the school quality evaluation, the information support for the Study Programme evaluation is made as an evaluation template in MS Excel, while the expert model is made in the Dexi programme. Only the chosen criteria from the school evaluation model are used (NAKVIS, 2014).

Figure 4: The fields of Quality Assessment in the Higher Education Programme

A unified quality evaluation system in the field of higher education may have wider benefits if used by NAKVIS in the processes of accreditation and evaluation of higher education institutes. With the use of these methods and techniques, the NAKVIS experts' commissions would have an instrument for a unified manner of evaluation; therefore, the evaluations of individual institutes would be comparable. In a few years, we could create a base of evaluated institutes and gain an overview of the overall quality of higher education in Slovenia. The research was limited to the knowledge collected among the experts employed at a certain Faculty. In the case of a wider use of the model, the prototype would have to be "corrected" with a wider expert knowledge base.

Literature

Adelman, L. (1992). Evaluating decision support and expert systems. New York: John Wiley.
Asif, M. & Searcy, C. (2014). A composite index for measuring performance in higher education institutions. International Journal of Quality & Reliability Management, 31(9), 983-1001, http://dx.doi.org/10.1108/IJQRM-02-2013-0023
Benkovič, J., Bohanec, M., Rajkovič, V. & Vrtačnik, M. (1998). Knowledge-based evaluation of higher education institutions. In D. Brandt & J. Černetič (Eds.), Automated systems based on human skill 1997: joint design of technology and organisation: a proceedings volume from the 6th IFAC Symposium, Kranjska gora, Slovenia, 17-19 September 1997 (pp. 157-159). Oxford; New York; Tokyo: Elsevier Science.
Bohanec, M. (2006).
Odločanje in modeli [Decision Making and Models]. Ljubljana: DMFA - založništvo.

Bohanec, M. (2014). Dexi: a program for multi-attribute decision making. Retrieved November 14, 2014, from http://kt.ijs.si/MarkoBohanec/dexi.html

Caldwell, B. J. (2008). Reconceptualizing the Self-managing School. Educational Management Administration & Leadership, 36(2), 235-252, http://dx.doi.org/10.1177/1741143207087775

Cret, B. (2011). Accreditations as local management tools. Higher Education, 61(4), 415-429, http://dx.doi.org/10.1007/s10734-010-9338-2

Cunningham, C. (2002). Engaging the Community to Support Student Achievement. Retrieved March 30, 2004, from http://ericdigests.org/2003-1/student.htm

Davies, B. & Ellison, L. (1997). School Leadership for the 21st Century: a competency and knowledge approach. London: Clys.

Giancola, J. M. & Hutchison, J. K. (2005). Transforming the culture of school leadership: humanizing our practice. California: Corwin Press, A Sage Publication Company.

Glasser, W. (1998). Dobra šola, Vodenje učencev brez prisile [A Good School: Leading students without coercion]. Radovljica: Regionalni izobraževalni center.

Jereb, E., Bohanec, M. & Rajkovič, V. (2003). Dexi: računalniški program za večparametrsko odločanje: uporabniški priročnik [Dexi: a computer program for multi-attribute decision-making: a user manual]. Kranj: Moderna organizacija.

Kohlbergova pravična skupnost [Kohlberg's just community]. (2006). Retrieved March 12, 2007, from http://sl.wikipedia.org/wiki/Kohlbergova_pravi%C4%8Dna_skupnost

Kovač, T. (2002). Odločitveni model za ocenjevanje kakovosti učno-vzgojnega procesa v izobraževanju odraslih. Magistrsko delo [A decision model for assessing the quality of the teaching process in adult education. Master's thesis], University of Maribor, Faculty of Organizational Sciences, Kranj, Slovenia.

Kovač, T., Resman, M. & Rajkovič, V. (2010). The model for evaluating the influence of student participation on school quality. Napredak, 151(3/4), 335-349.
Morrison, K. (1998). Management theories for educational change. London: Paul Chapman Publishing.

NAKVIS (2014). Criteria for the Accreditation and External Evaluation of Higher Education Institutions and Study Programmes. Retrieved June 20, 2014, from http://test.nakvis.si/en-GB/Content/Details/253

Paci, A. M., Lalle, C. & Chiacchio, M. S. (2013). Education for innovation: trends, collaborations and views. Journal of Intelligent Manufacturing, 24(3), 487-493, http://dx.doi.org/10.1007/s10845-012-0631-z

Pauko, M. (2011). Izdelava modela notranjih institucionalnih evalvacij na Univerzi v Mariboru. Magistrsko delo [Developing a model of internal institutional evaluations at the University of Maribor. Master's thesis], Univerza v Mariboru, Fakulteta za organizacijske vede, Kranj, Slovenija.

Rajkovič, V. & Bohanec, M. (1991). Decision support by knowledge explanation. In H. G. Sol & J. Vecsenyi (Eds.), Environments for supporting decision processes: proceedings of the IFIP WG 8.3 Working Conference on Environments for Supporting Decision Processes, Budapest, Hungary, 18-21 June 1990 (pp. 47-57). Amsterdam [etc.]: North-Holland.

Rajkovič, V. (1999). Večkriterijsko modeliranje in kakovost kompleksnih sistemov zdravstva in šolstva [Multi-criteria modelling and the quality of complex systems in health and education]. In R. Bohinc & M. Černetič (Eds.), Civilna družba v Sloveniji in Evropi: stanje in perspektive: zbornik razprav, SAZU, 23. marec 1999 (pp. 300-307). Ljubljana: Društvo Občanski forum: FDV.

Sarrico, C. S., Rosa, M. J., Teixeira, P. N. & Cardoso, M. F. (2010). Assessing Quality and Evaluating Performance in Higher Education: Worlds Apart or Complementary Views? Minerva, 48(1), 35-54, http://dx.doi.org/10.1007/s11024-010-9142-2

Sultan, P. & Wong, H. Y. (2014). An integrated-process model of service quality, institutional brand and behavioural intentions. Managing Service Quality, 24(5), 487-521, http://dx.doi.org/10.1108/MSQ-01-2014-0007

Wintersteiner, W. (2003). Postmoderna, pluralizem in pedagogika - kulturno izobraževanje v časih globalizacije [Postmodernism, pluralism and pedagogy - cultural education in times of globalization]. Sodobna pedagogika, 54(3), 22-38.

Zangoueinezhad, A. & Moshabaki, A. (2011). Measuring university performance using a knowledge-based balanced scorecard. International Journal of Productivity and Performance Management, 60(8), 824-843, http://dx.doi.org/10.1108/17410401111182215

Zoric, M. (2004). Evalvacija v izobraževanju odraslih: Pot do večje kakovosti izobraževanja odraslih [Evaluation in adult education: The path to a better quality of adult education]. Andragoška spoznanja, 10(2), 18-26.

Tatjana Kovač is an Assistant Professor of Business Informatics and a Vice-Dean of the Faculty of Commercial and Business Sciences in Celje. She graduated in the field of Information Systems, received her Master's Degree in Management of Information Systems from the University of Maribor, Faculty of Organizational Sciences, and received her PhD in Pedagogy from the University of Ljubljana, Faculty of Arts. Her research interests include expert modelling and decision support systems in the field of decision processes. She is involved in many Quality Management Projects at the Faculty.

Informacijska podpora evalvaciji kakovosti v visokem šolstvu

Ozadje in namen: Cilj raziskave je bil razviti predlog informacijske podpore evalvaciji v visokem šolstvu, ki bi podprl delo ocenjevalca kakovosti in analitika kakovosti. Zajeta so vsa področja, ki jih tudi NAKVIS (Nacionalna agencija RS za kakovost v visokem šolstvu) ocenjuje v postopkih zunanje evalvacije visokošolskih zavodov.

Metodologija: Za podporo dela ocenjevalcev kakovosti smo uporabili metodo MAUT (Multi-Attribute Utility Theory) in MS Excel, za podporo analitiku kakovosti pa ekspertno modeliranje v programu Dexi (Decision Expert).
Rezultati: Identificirani kriteriji za ocenjevanje posameznih področij presoje omogočajo poenoteno ocenjevanje, ne glede na to, kdo so ocenjevalci. Excelova predloga z avtomatskimi izračuni ocen je namenjena tehnični podpori dela evalvatorja, ekspertni model v Dexu pa je namenjen analitiku kakovosti.

Zaključek: Predstavljeni model za evalvacijo kakovosti visokošolskega zavoda lahko zagotovi celovito in transparentno obravnavo kakovosti na fakulteti, predvsem pa z informacijsko in tehnično podporo olajša procese evalvacije.

Ključne besede: kakovost v visokem šolstvu, evalvacija kakovosti, informacijska podpora ocenjevanju kakovosti