
In the Name of Allah, the Most Gracious, the Most Merciful
Sultanate of Oman, Ministry of Higher Education

Educational Technology At Omani Higher Education Institutions

Presented by:

Dr Ali Sharaf Al Musawi
Dr Hamoud Nasser Al Hashmi

Curriculum and Teaching Methods Dept., College of Education

Center for Educational Technology Sultan Qaboos University

2004

Abstract


The purpose of this research was to address current and prospective views on educational technology (ET) in order to identify difficulties and develop its utilization in Omani higher education. The main instruments were two questionnaires: the faculty members' questionnaire and the technical/administrative staff questionnaire. They were developed by the researchers by generating a list of potential ET issues derived from the literature and from national-level standardized surveys. The face validity and reliability of both instruments were established. In addition, in-depth interviews were conducted to verify some areas of the effectiveness of instructional software/equipment use raised by faculty members in the questionnaire. Data were collected and several statistical treatments were used in the analysis. The participants were 159 ET specialists, administrators, and ET and learning resources centers' (LRCs) staff, representing the educational technologists working in public and private Omani higher education institutions. The findings show no significant differences between the participants' views on their abilities to use instructional equipment/facilities in relation to three variables (job, qualification, and type of institution). The findings also show no significant differences between the participants' views on the impediments to use and the evaluation of instructional technology in relation to two variables (qualification and type of institution), and no significant differences in the frequency of use or the ability to use instructional software in relation to two variables (job and type of institution). However, other findings show significant differences in favor of faculty members in instructional software design/production experience, and in favor of PhD holders in the ability to use instructional software. The views quoted in the interviews confirm that the perceived values of the technology are tangible and that this is an important reason for lecturers' adoption of the technology. Unless faculty members feel the effectiveness and usefulness of the technology, they will not use it. Their attitudes to the use of ET were influenced by these professional considerations in their teaching.

Literature Survey


Educational media and technology (ET) play a significant role in the teaching and learning process in higher education institutions and have become an important part of educational systems and processes. ET helps improve educational methods, delivery, and the quality of teaching and learning. However, ET should be planned strategically so that its capabilities are employed to reach cost-effectiveness standards. Experience shows that Oman needs a vision by which its higher education can adopt ET. The purpose of this research is to review and analyze indicators and trends presenting the future needs for development in ET.

Educational Technology and Learning

Research seems to indicate that educational technologies such as instructional radio and films were as effective as traditional classroom instruction (Hannafin and Savenye, 1993). Computer-based education, when used in "tutorial" or "drill and skill" mode, leads to student achievement equivalent to that of other classroom methods such as personal tutoring (Viadero, 1997). Likewise, one of the first Omani experimental studies of on-line instruction (OLI), conducted at Sultan Qaboos University, concluded that OLI is as effective as traditional teaching methods in terms of students' achievement (Al Musawi & Abdelraheem, 2004b). However, studies have revealed a modest but positive relationship between technology and achievement at all levels of education and across subjects. Research also indicates that students in a technology-rich environment write more, finish units of study more quickly, show more self-motivation, work cooperatively, express positive attitudes about the future, and are better able to understand and represent information in a variety of forms (Viadero, 1997). It could therefore be argued that ET may not directly affect students' achievement, but it improves their learning styles and processes.

Educational Technology Issues

ET in Omani higher education has received improved administrative and faculty support and is actively developing in terms of staffing, equipment, and finance due to two main factors: "improvements and new acquisition of modern technology and software" and "improved administration support for media use in teaching". However, it faces serious challenges, namely "insufficient or limited materials/supplies/resources/space", "inexperienced personnel", "limited/inadequate training of staff", and "need to employ new skilled staff" (Al Musawi, 2002). These results are corroborated by previous studies which concluded that ET in Omani higher education is characterized by underutilization of advanced technology and by staff skills that do not meet the required level (Al-Hajri, 2000:94; Al Khawaldi, 2000:121). This can be partially attributed to administrative leadership because, as technology moves into institutions at an ever faster pace, administrators are feeling overwhelmed (Trotter, 1997). Studies show that teachers, in many instances, lack the preparation time required to apply new educational innovations. In addition, many teachers do not use, and sometimes resist, technology. Possible explanations for such resistance are: poorly designed software, technophobia, doubt that technology improves learning outcomes, fear of redundancy through lecturers' replacement by technology, resentment of the technology as a competitor for students' attention, and complacency with old practice among senior faculty (Hannafin and Savenye, 1993; Akinyemi and Al Musawi, 2002). In response to the above-mentioned issues, Boyd (1997) recommends that institutions employ a technology trainer. A clear trend toward an increase in technical/faculty staff, along with a need to employ staff with specific specializations and qualifications, is reported in the Omani context (Al Musawi, 2002). These findings are supported by the need to initiate college/university media specialist certification/accreditation programs (Abu Jaber & Osman, 1996; Al Khawaldi, 2000:121). Research also explains that new technologies require new skills and that Omani higher education institutions are falling behind in professional development. There is, therefore, a strong need for updating and retraining staff and teachers (Abu Jaber & Osman, 1996). This implies that in-service teachers will need to acquire new skills in order to manage educational innovations. A constant refrain in the literature concerning the use of technology is the need for more and better teacher training (Boyd, 1997; Bialo & Soloman, 1997; Zehr, 1997). To deal with the issues raised above, more higher education institutions are requiring administrators, faculty, and technicians to take technology courses (Trotter, 1997). Teacher preparation programs are important because future teachers will depend on technological skills for both personal productivity and instructional activities, and these should be part of the required courses for prospective teachers (Kook, 1997:58-59).

Educational Technology: New Applications

Omani efforts to utilize educational and information technology in higher education are proceeding rapidly despite some issues arising from technical, logistical, and human factors (Al Musawi and Abdelraheem, 2004a). Nowadays, students navigate easily through the Internet searching for information and knowledge resources and link up with their counterparts in any part of the world (Al Rawahy, 2001). Sultan Qaboos University adopted e-learning by providing its faculty members with WebCT tools combined with face-to-face instruction. An increase in the number of on-line courses and their users is noticeable. Internet instructional uses by SQU faculty members are, however, mostly limited to obtaining information and rich resources available at all times. This suggests that they should be trained and encouraged to broaden their use beyond the present status (Abdelraheem and Al Musawi, 2003; Al Musawi and Abdelraheem, 2004b). Major results show that e-learning is needed and that its standards must be set before it can be developed, disseminated, and diffused. This might help overcome problems of enrolment and access in Omani higher education (Al Musawi and Akinyemi, 2002). In this research, several questions have been raised. They center on how key persons in higher education institutions, such as specialists, technicians, and administrators, perceive current and future ET issues. This should ultimately lead to future indicators and goals by which Omani higher education can improve its investment in and utilization of ET in order to reach effectiveness levels/standards.


Research Objective

This research was designed to assess the current status of ET in order to identify difficulties and develop its utilization in Omani higher education. It also aimed at determining indicators that help formulate a future strategic plan for ET in Omani higher education.

Significance of the Research

The need for such research is vital for the following reasons:

• To explore the extent to which ET services are utilized by Omani higher education institutions.

• To discover the range to which these services are likely to be developed in the next decade.

• To raise the awareness of Omani higher education administrators about the importance of coordinated strategic planning in this field.

• To pave the way for researchers and decision makers to measure the cost-effectiveness of ET services for planning purposes.

Research Questions

Several questions have been raised as follows:

1. What are the current quantitative levels of technical and technological equipment/facilities?

2. How effective are the current design, production, and use of instructional software/equipment?

3. What are the future equipment/facilities/software requirements in relation to the increase in students’ intake?

4. To what extent are the human, financial, and training resources available at present?

5. What are the needs for future human, financial, and training resources and university programs in the ET field?

6. To what extent are ET research funds and mechanisms available?

Research Methods

The main instruments were two questionnaires: the faculty members' questionnaire and the technical/administrative staff questionnaire. They were developed by producing a list of potential ET issues derived from the literature and from national-level standardized surveys. The face validity of the list was established by presenting it to a group of referees in the area of ET. The experts modified some of the original sections and items and added others. The faculty members' questionnaire comprised four sections: (1) demographics; (2) career development; (3) ability to use technology; and (4) training needs. The technical/administrative staff questionnaire contained the same sections plus one more on quantities, budget, and staff issues. While most sections of both questionnaires used rating scales, the first and second sections, along with some other parts, consisted mainly of open-ended questions. The reliability coefficient, measured by Cronbach's alpha, was found to be between 0.85 and 0.89.
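As a companion to the reliability figure quoted above, the sketch below shows one conventional way to compute Cronbach's alpha from a respondents-by-items matrix. It is a minimal illustration on randomly generated 0-3 ratings, not the study's questionnaire data; only the 0-3 coding and the sample size of 159 are borrowed from the paper for realism.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative only: random 0-3 ratings for 159 respondents and 20 items.
rng = np.random.default_rng(0)
fake_responses = rng.integers(0, 4, size=(159, 20))
print(round(cronbach_alpha(fake_responses), 2))
```

With real questionnaire data in place of the random matrix, values in the 0.85-0.89 range reported above would indicate high internal consistency.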


In addition, in-depth interviews were conducted to verify some areas raised in the faculty members' questionnaire concerning the effectiveness of instructional software/equipment use. The format of these interviews was semi-structured.

Research Design and Statistical Analysis

In this research the dependent variable is the current and prospective use of ET, measured by the sample's responses to the questionnaire items, whereas the independent variables are:

• Institution (two levels: university and college).
• Type of institution (two levels: public and private).
• Age (five levels: 20-29, 30-39, 40-49, 50-59, and 60-65).
• Nationality (two levels: Omani and expatriate).
• Gender (two levels: male and female).
• Job area (five levels: educational only, educational/academic, technical only, administrative only, and technical/administrative).
• Qualification (four levels: PhD, MA, BA, and Ed. Diploma).
• Job (two levels: faculty and technical/administrative).

An analytic descriptive approach was used for the questionnaires. The following statistical treatments were used in data analysis: percentages, means, standard deviations, t-test, ANOVA, Pearson correlation coefficient, Scheffe test, and chi-square test. The interviews were analyzed qualitatively through transcription and data encoding, followed by content analysis to find patterns and trends.

Participants

The population of this study included ET specialists, administrators, and ET/learning resources centers' staff, representing the educational technologists working in public and private Omani higher education institutions. The public HE institutions include Sultan Qaboos University, the Colleges of Education at the Ministry of Higher Education, the Higher Technology Colleges at the Ministry of Manpower, and the Health Institutes at the Ministry of Health. The private HE institutions include nine universities and colleges. The two forms of the questionnaire were distributed to all institutions with specific instructions on who should fill them in and how. Participants were 159 faculty, technical, and administrative members of staff who responded to the questionnaire. This sample represents more than 45% of the total population as indicated by the institutions' statistics. It should be noted that some returned questionnaires had missing values, and this slightly affected the data analysis process. Table 1 describes the participants' distribution in relation to the demographic variables: institution, type of institution, age, nationality, gender, job area, and qualification. Only three variables, namely job, type of institution, and qualification, were used for analysis purposes because they are more influential than the other variables.
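To illustrate the descriptive side of this analysis (percentages, means, and standard deviations per item), the sketch below uses pandas on a hypothetical response table. The column names and values are invented for the example and are not the study's data; only the 0-3 item coding and the kind of summary mirror the paper.

```python
import pandas as pd

# Hypothetical layout: one row per respondent, rating items coded 0-3.
df = pd.DataFrame({
    "job": ["Faculty", "Technical/Administrative", "Faculty", "Faculty"],
    "qualification": ["PhD", "BA", "MA", "PhD"],
    "use_presentation_software": [3, 2, 1, 3],
    "use_elearning_software": [1, 0, 2, 2],
})

rating_cols = ["use_presentation_software", "use_elearning_software"]

# Item means and standard deviations, sorted in descending order of the mean,
# mirroring the layout of the tables that follow.
summary = df[rating_cols].agg(["mean", "std"]).T.sort_values("mean", ascending=False)
print(summary)

# Percentage distribution of a demographic variable (cf. Table 1).
print(df["qualification"].value_counts(normalize=True).mul(100).round(1))
```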

Table (1) Participants' Distribution based on Demographic Variables


No. | Variable | Levels | Faculty | Technical/Admin Staff
1 | Institution | University | 11 | 13
  |  | College | 90 | 45
2 | Type of Institution | Public | 60 | 14
  |  | Private | 41 | 44
3 | Age | 20-29 | 10 | 30
  |  | 30-39 | 33 | 19
  |  | 40-49 | 35 | 7
  |  | 50-59 | 21 | 56
  |  | 60-65 | 2 | 2
4 | Nationality | Omani | 24 | 53
  |  | Expatriate | 77 | 4
5 | Gender | Male | 70 | 48
  |  | Female | 31 | 9
6 | Job Specialty | Educationalist | 4 | -
  |  | Educationalist/Academic | 97 | -
  |  | Technician | - | 26
  |  | Administrator | - | 15
  |  | Technician/Administrator | - | 17
7 | Qualification | PhD | 36 | 11
  |  | MA | 44 | 9
  |  | BA | 21 | 28
  |  | Ed. Diploma | - | 7
Total: 101 faculty + 58 technical/administrative staff = 159 participants.

Data Analysis, Findings, and Discussions: Part One


The Analysis of the Questionnaires

I. The Current and Prospective Quantitative Levels of Technical and Technological Equipment/Facilities

I.A. Current Quantities

The technical/administrative staff were asked to summarize their institutions' current instructional equipment/facilities using the quantity categories: (5-20); (20-50); (50-100); (>100); and (not applicable). Table 2 lists the current quantity categories, in percentages, as perceived by the technical and administrative staff.

Table (2) Percentages of the Technical/Administrative Staff Responses on the Current Equipment/Facilities Quantities (n=58)

Item | 5-20 | 20-50 | 50-100 | >100 | NA
OHP | 46.6 | 6.9 | 5.2 | 1.7 | 39.6
Slide Projector | 48.3 | 5.2 | 1.7 | - | 44.8
Audio Recorder | 48.3 | 5.2 | 1.7 | - | 44.8
VCR | 41.4 | 5.2 | 5.2 | - | 48.2
Computer | 19.0 | 3.4 | 6.9 | 25.9 | 44.8
LCD Data Show | 32.8 | 3.4 | 6.9 | 8.6 | 48.3
Monitors | 25.9 | 6.8 | 5.2 | 12.1 | 50.0
Photo/Digital Camera | 41.4 | 3.4 | - | 3.4 | 51.8
Digital Video Camera | 46.6 | 3.4 | - | 1.7 | 48.3
Scanner | 50.0 | 6.9 | - | 1.7 | 41.4
Printer | 29.3 | 17.3 | 1.7 | 1.7 | 50.0
Smart Board | 34.5 | 5.2 | - | 1.7 | 58.6
Microphone | 32.7 | 5.2 | 5.2 | 5.2 | 51.7
Headphones | 27.5 | 5.2 | 5.2 | 12.1 | 50.0
Audio Studio | 29.3 | 3.5 | 1.7 | 1.7 | 63.8
TV Studio | 29.3 | 1.7 | - | - | 69.0
Closed Circuit TV Lab | 27.6 | 1.7 | - | - | 70.7
Language Lab | 32.8 | - | - | - | 67.2
Multimedia Lab | 31.1 | - | 1.7 | - | 67.2
Intranet | 22.4 | - | 1.7 | 10.4 | 65.5
Instructional Software | 17.2 | 15.6 | 6.9 | 3.4 | 56.9

Fig. 1 Current Equipment/Facilities Quantities (bar chart of mean percentages per quantity category)

Table 2 and Fig. 1 above show that, for almost all equipment/facilities, the largest share of responses (under 50% of the sample) falls in the (5-20) quantity range. This indicates that ET equipment/facilities are currently few in number in higher education institutions. The finding is reinforced by another: more than 50% of the sample indicated that their institutions currently lack most of these equipment and facilities. These ratios are alarming, specifically where new technologies are concerned (e.g. Intranet, 65.5%; multimedia and language labs, 67.2% each).

I.B. Future Needs

The technical/administrative staff were asked to estimate their institutions' future needs for instructional equipment/facilities using the quantity categories: (5-20); (20-50); (50-100); (>100); and (not applicable). Table 3 lists the expected quantity categories, in percentages, as perceived by the technical and administrative staff.

Fig. 2 The Expected Increase in Equipment/Facilities Quantities (bar chart of mean percentages per quantity category)


Table (3) Percentages of the Technical/Administrative Staff Responses on the Expected Increase in Equipment/Facilities Quantities (n=58)

Item | 5-20 | 20-50 | 50-100 | >100 | NA
OHP | 25.9 | 17.2 | 6.9 | 1.7 | 48.3
Slide Projector | 31.1 | 6.9 | 6.9 | 1.7 | 53.4
Audio Recorder | 29.4 | 6.9 | 8.6 | 1.7 | 53.4
VCR | 31.1 | 5.2 | 8.6 | 1.7 | 53.4
Computer | 6.9 | 5.2 | 10.3 | 24.2 | 53.4
LCD Data Show | 19.0 | 8.6 | 58.6 | 5.2 | 8.6
Monitors | 12.1 | 8.6 | 8.6 | 10.4 | 60.3
Photo/Digital Camera | 29.3 | 5.2 | 5.2 | 1.7 | 58.6
Digital Video Camera | 29.3 | 8.6 | 5.2 | 1.7 | 55.2
Scanner | 24.1 | 12.1 | 5.2 | 1.7 | 56.9
Printer | 19.0 | 13.8 | 10.3 | 1.7 | 55.2
Smart Board | 17.2 | 8.6 | 5.2 | 6.9 | 62.1
Microphone | 25.9 | 3.4 | 5.2 | 6.9 | 58.6
Headphones | 19.0 | 5.2 | 5.2 | 10.3 | 60.3
Audio Studio | 24.1 | 8.6 | 3.4 | 1.7 | 62.2
TV Studio | 25.9 | 5.2 | 1.7 | - | 67.2
Closed Circuit TV Lab | 25.9 | 1.7 | 1.7 | - | 70.7
Language Lab | 32.8 | 3.4 | 1.7 | - | 62.1
Multimedia Lab | 29.4 | 3.4 | 1.7 | - | 65.5
Intranet | 22.3 | 5.2 | 5.2 | 5.2 | 62.1
Instructional Software | 17.3 | 8.6 | 6.9 | 6.9 | 60.3

Missing values affect these findings, specifically the many non-responses which were analytically categorized as (not applicable). Nevertheless, Fig. 2 and Table 3 above show a noticeable and more balanced increase in the expected quantities. A tendency to expand on portable, new, and less expensive technologies was observed, as well as a tendency to utilize electronic software and equipment. This implies an awareness of the need to expand e-learning, e-classroom, and digital multimedia technologies. With this, the 1st and 3rd research questions are answered.


II. The Effectiveness of the Current Design and Production of Instructional Software

The participants were asked to describe their current experience in designing and producing instructional software on a rating scale of four options: (very advanced), (advanced), (beginner), and (not applicable); these options were analytically given the values of 3, 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 4 shows the descriptive statistics for this question.
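The recoding described above is a simple label-to-value mapping. The sketch below illustrates it with pandas; the item name and the sample answers are hypothetical, and only the 3/2/1/0 coding comes from the paper.

```python
import pandas as pd

# Mapping taken from the paper: scale labels -> analytic values.
scale = {"very advanced": 3, "advanced": 2, "beginner": 1, "not applicable": 0}

# Hypothetical raw answers for a single questionnaire item.
responses = pd.Series(
    ["advanced", "beginner", "very advanced", "not applicable", "advanced"],
    name="transparencies_experience",
)

coded = responses.map(scale)
# Item mean and standard deviation, i.e. one row of a table such as Table 4.
print(round(coded.mean(), 2), round(coded.std(ddof=1), 2))
```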

Table (4) Frequencies, Means and Standard Deviations of the Participants' Responses on Designing/Producing Instructional Software

Item | NA | Beginner | Advanced | V. Advanced | Mean* | SD
Transparencies | 27 | 13 | 40 | 76 | 2.06 | 1.13
E-learning Software | 28 | 20 | 45 | 59 | 1.89 | 1.12
Instructional CDs | 35 | 22 | 40 | 55 | 1.76 | 1.17
Presentation Software | 35 | 23 | 34 | 56 | 1.75 | 1.19
Slides | 37 | 23 | 38 | 57 | 1.74 | 1.19
Databases | 39 | 24 | 38 | 48 | 1.64 | 1.19
Multimedia Software | 40 | 30 | 34 | 39 | 1.50 | 1.17
Videotapes | 53 | 21 | 35 | 40 | 1.42 | 1.23
Statistical Software | 52 | 24 | 35 | 37 | 1.39 | 1.20
Audiotapes | 59 | 23 | 30 | 40 | 1.34 | 1.24
* Theoretical mean between 0 and 3.

Table 4 shows that the means range between 1.34 and 2.06. In light of this, only the experience of designing/producing transparencies is at the advanced level, since its mean is above the theoretical mean (2). Although this finding appears consistent with the Table 11 finding (see III.B.) that the ability to use presentation software is at the advanced level, it also shows that the participants' experience in designing and producing instructional software is not as strong as expected. To shed more light on this finding, faculty members and technical/administrative staff were compared using a t-test. Table 5 below shows the results, which indicate significant differences in favor of faculty members. This means that the technical/administrative staff have considerably less instructional software design/production experience than the faculty members, which in turn implies the need to improve their experience through training and practice so that they can perform their technical/administrative tasks.


Table (5) Designing/Producing Instructional Software and t-test for Job Variable

Job | n | Mean | SD | t | Sig.
Faculty | 100 | 1.77 | .87 | 2.195 | .030
Technical/Administrative Staff | 57 | 1.45 | .90 |  |
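For readers who want to see how a comparison like the one in Table 5 is typically computed, the sketch below runs a pooled-variance independent-samples t-test with SciPy. The scores are randomly generated around the group means and standard deviations reported above, so the output only approximates the published value and is not the study's raw data.

```python
import numpy as np
from scipy import stats

# Hypothetical 0-3 experience scores generated around the group statistics in Table 5.
rng = np.random.default_rng(1)
faculty_scores = rng.normal(loc=1.77, scale=0.87, size=100)
tech_admin_scores = rng.normal(loc=1.45, scale=0.90, size=57)

# Pooled-variance independent-samples t-test (the conventional two-group comparison).
t_stat, p_value = stats.ttest_ind(faculty_scores, tech_admin_scores, equal_var=True)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```

The same call, with different group vectors, covers every job-variable and type-of-institution t-test reported in the tables that follow.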

Another t-test was carried out to compare the participants' design/production experience by type of institution. The results shown in Table 6 reveal no significant differences, indicating that participants' experiences in public and private higher education are similar.

Table (6) Designing/Producing Instructional Software and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 74 | 1.76 | .96 | 1.43 | Non sig.
Private | 83 | 1.55 | .81 |  |


III. The Effectiveness of the Current Use of Instructional Software/Equipment

III.A. Value of Using Instructional Software/Equipment

The participants were asked to evaluate the classroom use of instructional software/equipment on a rating scale of three options: (indispensable), (negligible), and (don't know); these options were analytically given the values of 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. These are listed in Table 7.

Table (7) Frequencies, Means and Standard Deviations of the Participants’ Responses on the

Value of Using Instructional Software/Equipment

Item | DN | Negligible | Indispensable | Mean* | SD
Presentation Equipment | 9 | 6 | 137 | 1.84 | .50
Presentation Software | 14 | 7 | 132 | 1.77 | .60
Computer for Instructor Use | 13 | 10 | 130 | 1.76 | .59
Internet Link | 14 | 11 | 129 | 1.76 | .59
Many Computers for Students Use | 15 | 13 | 123 | 1.72 | .64
Instructional CDs | 18 | 10 | 124 | 1.70 | .67
Instructional Software | 15 | 20 | 118 | 1.67 | .65
Scanner | 24 | 26 | 100 | 1.51 | .80
Monitors | 29 | 15 | 104 | 1.51 | .76
Printer | 28 | 21 | 102 | 1.90 | .79
E-learning Software | 28 | 44 | 79 | 1.34 | .77
Smart Board | 21 | 56 | 70 | 1.33 | .72
Multimedia Software | 34 | 33 | 84 | 1.33 | .76
E-mail Software | 26 | 48 | 76 | 1.33 | .82
Microphone | 38 | 32 | 78 | 1.27 | .85
Headphones | 49 | 31 | 69 | 1.13 | .88
Digital Video Camera | 55 | 44 | 51 | 0.97 | .84
Photo/Digital Camera | 60 | 41 | 49 | 0.93 | .85
* Theoretical mean between 0 and 2. DN = don't know.

Table 7 shows that the means range between 0.93 and 1.84. Comparing these means with the theoretical mean (1), participants perceive the use of 16 of the instructional software/equipment items in the classroom as indispensable to classroom teaching in their institutions. This implies the respondents' belief in, and awareness of, the role of instructional technology in higher education. To probe the role of the participants' job in their evaluation of instructional software/equipment, a t-test was carried out. The results indicate no significant differences in relation to the job variable (see Table 8). This suggests a shared understanding among faculty members and technical/administrative staff of the value of instructional technology for the teaching process.


Table (8) Value of Using Instructional Software/Equipment and t-test for Job Variable

Job | n | Mean | SD | t | Sig.
Faculty | 99 | 1.47 | .42 | .268 | Non sig.
Technical/Administrative Staff | 55 | 1.46 | .47 |  |

The role of qualification in this evaluation was examined with an ANOVA test. The result shown in Table 9 again shows no significant differences, which points to equal consideration across all qualification levels (PhD, MA, and BA) in the evaluation of instructional technology.

Table (9) ANOVA for Value of Using Instructional Software/Equipment

Source of Variance | S.S | df | M.S | F | Sig.
Between Groups | .00962 | 2 | .00481 | .254 | Non sig.
Within Groups | 28.635 | 151 | .190 |  |
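A one-way ANOVA of this kind can be reproduced with SciPy as sketched below. The per-group ratings are invented for illustration; only the three qualification groups (PhD, MA, BA) reflect the study's design.

```python
from scipy import stats

# Hypothetical 0-2 value ratings for the three qualification groups (illustrative only).
phd_ratings = [1.5, 1.4, 1.6, 1.3, 1.5, 1.7]
ma_ratings = [1.4, 1.5, 1.5, 1.6, 1.3, 1.4]
ba_ratings = [1.5, 1.3, 1.4, 1.6, 1.4, 1.5]

# One-way ANOVA across PhD, MA, and BA holders.
f_stat, p_value = stats.f_oneway(phd_ratings, ma_ratings, ba_ratings)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```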

Another t-test for the type of institution variable confirmed that no significant differences exist (see Table 10). This again implies that respondents from both the public and private sectors attach equal value to instructional technology.

Table (10) Value of Using Instructional Software/Equipment and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 72 | 1.41 | .46 | 1.674 | Non sig.
Private | 82 | 1.52 | .40 |  |

In sum, the value the participants place on using instructional software/equipment in their institutions' classrooms is not influenced by their job, qualification, or type of institution.

III.B. Ability to Use Instructional Software/Equipment

• Ability to use instructional software

The participants were asked to describe their ability to use instructional software in teaching/the workplace on a rating scale of four options: (very advanced), (advanced), (beginner), and (not applicable); these options were analytically given the values of 3, 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 11 shows the descriptive statistics of the responses.


Table (11) Frequencies, Means and Standard Deviations of the Participants' Responses on their Ability to Use Instructional Software in Teaching/Workplace

Item | NA | Beginner | Advanced | V. Advanced | Mean* | SD
Search Engines | 8 | 6 | 39 | 85 | 2.46 | .83
Word Processors | 5 | 5 | 55 | 72 | 2.42 | .73
WWW | 7 | 10 | 49 | 70 | 2.34 | .83
Internet Explorer | 11 | 7 | 43 | 71 | 2.32 | .91
Presentation Software | 9 | 19 | 46 | 61 | 2.18 | .91
Instructional CDs | 20 | 10 | 44 | 62 | 2.09 | 1.06
Videotapes | 36 | 11 | 37 | 51 | 1.76 | 1.22
Audiotapes | 31 | 18 | 41 | 46 | 1.75 | 1.15
Database | 34 | 31 | 29 | 39 | 1.55 | 1.16
E-learning Software | 52 | 25 | 24 | 28 | 1.22 | 1.19
Multimedia Software | 50 | 28 | 25 | 23 | 1.17 | 1.14
* Theoretical mean between 0 and 3.

Table 11 shows that the means range between 1.17 and 2.46. Comparing these means with the theoretical mean (2), participants perceive their ability to use six of the instructional software items in teaching/the workplace as advanced. This could be attributed to the fact that these six types of software are the most up to date and the easiest to operate. The abilities were compared across the job variable using a t-test. Table 12 shows that no significant differences exist. This means that the ability to use instructional software is not influenced by job: the abilities of faculty members and technical/administrative staff are the same.

Table (12) Ability to Use Instructional Software in Teaching/Workplace and t-test for Job Variable

Job | n | Mean | SD | t | Sig.
Faculty | 84 | 1.95 | .77 | .439 | Non sig.
Technical/Administrative Staff | 54 | 1.90 | .66 |  |

An ANOVA test was conducted to compare the ability to use instructional software across qualification levels. Significant differences at the 0.045 level were found, as shown in Table 13.

Table (13) ANOVA for Ability to Use Instructional Software in Teaching/Workplace

Source of Variance | S.S | df | M.S | F | Sig.
Between Groups | 3.294 | 2 | 1.647 | 3.182 | .045
Within Groups | 69.881 | 135 | .518 |  |


In order to determine the direction of the difference, the ANOVA was followed by a Scheffe test. The results shown in Table 14 below indicate significant differences at the 0.05 level between BA and PhD holders, in favor of PhD holders. This means that the PhD holders are more able to use instructional software, which could be attributed to their continuous practice of instruction.

Table (14) Multiple Comparisons (Scheffe) for Ability to Use Instructional Software in Teaching/Workplace

(I) | (J) | Mean Difference (I-J) | Sig.
PhD | MA | .00614 | Non sig.
PhD | BA | .3478 | .05
MA | BA | .2864 | Non sig.
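SciPy does not ship a Scheffe post-hoc routine, so the sketch below implements the standard Scheffe pairwise criterion by hand after a one-way ANOVA. The group scores are hypothetical and the function is illustrative, not the authors' code; it simply shows the kind of pairwise comparison summarized in Table 14.

```python
import numpy as np
from scipy import stats

def scheffe_pairwise(groups, alpha=0.05):
    """Scheffe post-hoc comparisons after a one-way ANOVA.

    groups: sequence of 1-D arrays of observations, one per group.
    Returns (i, j, mean difference, significant) for every pair of groups.
    """
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)
    n_total = sum(len(g) for g in groups)

    # Within-groups mean square from the one-way ANOVA decomposition.
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_within = ss_within / (n_total - k)
    f_crit = stats.f.ppf(1 - alpha, k - 1, n_total - k)

    results = []
    for i in range(k):
        for j in range(i + 1, k):
            diff = groups[i].mean() - groups[j].mean()
            # Scheffe criterion: the pairwise F is compared against (k-1) * F_critical.
            f_ij = diff ** 2 / (ms_within * (1 / len(groups[i]) + 1 / len(groups[j])))
            results.append((i, j, round(diff, 3), f_ij > (k - 1) * f_crit))
    return results

# Hypothetical 0-3 ability scores for PhD, MA, and BA holders (illustrative only).
rng = np.random.default_rng(2)
phd = rng.normal(2.1, 0.7, 47)
ma = rng.normal(2.0, 0.7, 53)
ba = rng.normal(1.75, 0.7, 49)
print(scheffe_pairwise([phd, ma, ba]))
```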

A t-test was also conducted to compare the ability to use instructional software by type of institution. The results in Table 15 show no significant differences, meaning that the ability is the same in both sectors' institutions.

Table (15) Ability to Use Instructional Software in Teaching/Workplace and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 72 | 1.41 | .46 | 1.674 | Non sig.
Private | 82 | 1.52 | .40 |  |

• Ability to use instructional equipment

The participants were asked to describe their ability to use instructional equipment and facilities on a rating scale of four options: (very advanced), (advanced), (beginner), and (not applicable); these options were analytically given the values of 3, 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 16 below shows the descriptive statistics of the responses.


Table (16) Frequencies, Means and Standard Deviations of the Participants' Responses on their Ability to Use Instructional Equipment/Facilities

Item | NA | Beginner | Advanced | V. Advanced | Mean* | SD
Computer | 6 | 9 | 42 | 76 | 2.41 | 0.81
OHP | 11 | 9 | 46 | 70 | 2.29 | 0.91
Monitors | 17 | 9 | 42 | 67 | 2.18 | 1.02
Printer | 16 | 13 | 42 | 61 | 2.12 | 1.02
LCD Data Show | 13 | 14 | 53 | 51 | 2.08 | 0.94
Slide Projector | 20 | 16 | 47 | 54 | 1.99 | 1.05
Audio Recorder | 25 | 11 | 39 | 58 | 1.98 | 1.13
VCR | 23 | 18 | 38 | 56 | 1.94 | 1.11
Headphones | 26 | 14 | 40 | 55 | 1.92 | 1.13
Audio Studio | 24 | 17 | 40 | 53 | 1.91 | 1.11
Intranet | 25 | 17 | 47 | 45 | 1.84 | 1.09
Microphone | 28 | 12 | 52 | 42 | 1.81 | 1.10
Photo/Digital Camera | 32 | 14 | 43 | 42 | 1.73 | 1.16
Digital Video Camera | 30 | 28 | 34 | 41 | 1.65 | 1.14
Multimedia Lab | 37 | 24 | 36 | 32 | 1.49 | 1.15
Smart Board | 52 | 17 | 31 | 23 | 1.20 | 1.18
Scanner | 60 | 17 | 26 | 26 | 1.14 | 1.21
Language Lab | 64 | 19 | 26 | 19 | 1.00 | 1.14
TV Studio | 73 | 12 | 24 | 22 | 0.96 | 1.19
Closed Circuit TV Lab | 77 | 15 | 20 | 15 | 0.79 | 1.10
* Theoretical mean between 0 and 3.

Table 16 shows that the means range between 0.79 and 2.41. Comparing these means with the theoretical mean (2), participants perceive their ability to use four of the instructional equipment/facility items as advanced. It is observed that these are new, electronic, and portable technologies, which reflects the participants' awareness, in both sectors, of the importance of these technologies. This finding is substantiated by the findings of Tables 17, 18, and 19 (see below). To verify the role of the participants' job in their ability to use instructional equipment and facilities, a t-test was conducted; Table 17 presents the results. No significant differences were found, which means that job does not play a role in determining the ability to use instructional equipment.

Table (17) Ability to Use Instructional Equipment/Facilities and t-test for Job Variable

Job | n | Mean | SD | t | Sig.
Faculty | 84 | 1.72 | .78 | .380 | Non sig.
Technical/Administrative Staff | 55 | 1.77 | .74 |  |

An ANOVA test was conducted to compare the ability to use instructional equipment and facilities across qualification levels. No significant differences were observed, as presented in Table 18. This shows that qualification has no influence on the respondents' ability to use instructional technology.

Table (18) ANOVA for Ability to Use Instructional Equipment/Facilities

Source of Variance | S.S | df | M.S | F | Sig.
Between Groups | 3.025 | 2 | 1.513 | 2.649 | Non sig.
Within Groups | 666.77 | 136 | .571 |  |

To compare the ability to use instructional equipment/facilities by type of institution, a t-test was conducted. Table 19 presents the results; no significant differences exist due to this variable.

Table (19) Ability to Use Instructional Equipment/Facilities and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 63 | 1.81 | .84 | .934 | Non sig.
Private | 76 | 1.68 | .70 |  |

This generally confirms that the ability to use instructional equipment/facilities is not influenced by job, qualification, or type of institution.

III.B. Frequency of Use of Instructional Software/Equipment

The participants were asked to describe the frequency of their use of instructional software and equipment in teaching/the workplace on a rating scale of four options: (daily), (weekly), (monthly), and (never used); these options were analytically given the values of 3, 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 20 (below) shows the descriptive statistics of the responses. Table 20 shows that the means range between 0.46 and 1.57. Comparing these means with the theoretical mean (2), all means were found to be below it; frequency of use thus seems to be weak. Surprisingly, this contradicts the participants' belief in the value of instructional technology summarized in Table 7 above, revealing that practice in the field does not match the attitudinal level. This finding could be explained by the impact of some of the impediments listed in Tables 23 and 26 (see III.C.), which may obstruct the frequent use of instructional technology in teaching and the workplace.


Table (20) Frequencies, Means and Standard Deviations of the Participants' Responses on the Frequency of Use of Instructional Software/Equipment

Item | NU | Monthly | Weekly | Daily | Mean* | SD
Statistical Packages | 27 | 50 | 27 | 41 | 1.57 | 1.09
Presentation Software | 16 | 65 | 40 | 30 | 1.56 | 0.93
Instructional CDs | 30 | 52 | 23 | 37 | 1.47 | 1.10
Database | 51 | 38 | 14 | 43 | 1.34 | 1.23
Slide Projector | 59 | 17 | 41 | 33 | 1.32 | 1.21
OHP | 51 | 33 | 37 | 30 | 1.30 | 1.14
Scanner | 47 | 38 | 36 | 27 | 1.29 | 1.10
LCD Data Show | 34 | 61 | 31 | 16 | 1.20 | 0.93
Word Processors | 17 | 96 | 23 | 12 | 1.20 | 0.75
Search Engines | 14 | 107 | 20 | 10 | 1.17 | 0.73
Computer | 21 | 95 | 30 | 9 | 1.17 | 0.68
E-mail Software | 16 | 105 | 24 | 9 | 1.17 | 1.05
Monitors | 46 | 54 | 23 | 24 | 1.17 | 1.01
WWW | 40 | 68 | 15 | 25 | 1.17 | 0.68
Internet Explorer | 19 | 101 | 26 | 7 | 1.14 | 0.68
VCR | 69 | 24 | 19 | 34 | 1.12 | 1.24
Multimedia Software | 75 | 18 | 19 | 37 | 1.12 | 1.27
Intranet | 44 | 60 | 35 | 12 | 1.05 | 0.91
Instructional Software | 52 | 54 | 21 | 19 | 1.05 | 1.01
Photo/Digital Camera | 72 | 25 | 25 | 27 | 1.05 | 1.18
Multimedia Lab | 77 | 17 | 22 | 27 | 0.99 | 1.20
Digital Video Camera | 75 | 23 | 18 | 28 | 0.99 | 1.21
Printer | 43 | 77 | 19 | 8 | 0.95 | 0.80
E-learning Software | 72 | 31 | 17 | 22 | 0.92 | 1.12
Audio Recorder | 86 | 17 | 21 | 25 | 0.90 | 1.18
Microphone | 82 | 19 | 26 | 19 | 0.88 | 1.12
Headphones | 91 | 21 | 23 | 14 | 0.73 | 1.04
Smart Board | 91 | 20 | 16 | 17 | 0.72 | 1.07
Audio Studio | 106 | 11 | 8 | 19 | 0.58 | 1.07
Language Lab | 109 | 11 | 9 | 15 | 0.51 | 1.00
Closed Circuit TV Lab | 113 | 11 | 4 | 17 | 0.48 | 1.01
TV Studio | 111 | 10 | 10 | 12 | 0.46 | 0.95
* Theoretical mean between 0 and 3. NU = never used.

The job variable does not seem to influence frequency of use. This finding is derived from the t-test presented in Table 21, which shows no significant differences due to this variable.

Table (21) Frequency of Use of Instructional Software/Equipment and t-test for Job Variable

Job | n | Mean | SD | t | Sig.
Faculty | 99 | 1.04 | .50 | 1.323 | Non sig.
Technical/Administrative Staff | 56 | 1.14 | .44 |  |

To verify whether the type of institution affects the frequency of use, a t-test was carried out. The results, presented in Table 22 below, show no significant differences due to this variable, which means that both sectors' participants have the same frequency of use.

Table (22) Frequency of Use of Instructional Software/Equipment and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 73 | 1.10 | .48 | .614 | Non sig.
Private | 82 | 1.05 | .48 |  |

III.C. Impediments of Use

• As perceived by faculty members

The faculty members were asked to identify the impediments to using instructional software and equipment in their teaching on a rating scale of four options: (definite impediment), (possible impediment), (not an impediment), and (don't know); these options were analytically given the values of 3, 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 23 shows the descriptive statistics of the responses.

Table (23) Frequencies, Means and Standard Deviations of the Faculty Members' Responses on the Impediments to Use Instructional Technology

Item | DN | N.I. | P.I. | D.I. | Mean* | SD
Little Number of Fixed/Portable Equipment | 14 | 4 | 29 | 34 | 2.02 | 1.08
Little Financial Support | 16 | 6 | 27 | 33 | 1.94 | 1.13
Students' Ignorance of How to Use Costly ET | 14 | 3 | 38 | 26 | 1.94 | 1.03
Lack of Internet Links in Classrooms | 17 | 7 | 25 | 30 | 1.86 | 1.15
Insufficient Time to Help Students | 20 | 2 | 36 | 23 | 1.77 | 1.12
Little ET Training | 19 | 3 | 38 | 20 | 1.74 | 1.09
Students' Low Awareness of ET Importance | 25 | 2 | 28 | 27 | 1.70 | 1.22
Little Administrative Support | 25 | 6 | 25 | 26 | 1.63 | 1.22
Small Size/Unequipped Classrooms | 23 | 3 | 39 | 17 | 1.61 | 1.11
ET is Time Consuming in terms of Planning/Designing/Producing Media | 21 | 9 | 34 | 18 | 1.60 | 1.10
Technical Difficulties | 30 | 2 | 30 | 19 | 1.47 | 1.22
Students Become Lazier when Using ET | 28 | 8 | 31 | 14 | 1.38 | 1.14
Inappropriateness of ET for My Academic Specialization | 34 | 4 | 25 | 14 | 1.23 | 1.21
Increase of Cheating Amongst Students when Using ET | 33 | 11 | 30 | 8 | 1.16 | 1.07
Class Management Difficulty Resulting from Students' Disturbance when Using ET | 42 | 4 | 26 | 9 | 1.02 | 1.14
Feeling of the Possibility of Being Replaced by ET | 49 | 9 | 19 | 3 | 0.70 | 0.96
* Theoretical mean between 0 and 3. DN = don't know; N.I. = not an impediment; P.I. = possible impediment; D.I. = definite impediment.

Table 23 shows that the means range between 0.70 and 2.02. Comparing these means with the theoretical mean (2), only one impediment (Little Number of Fixed/Portable Equipment) was perceived by faculty members as important. This reveals the need to increase the quantity of equipment in order to reach more effective levels of instructional technology application in higher education institutions, and it relates to the findings presented in Tables 2 and 3 on current and future quantitative needs. To examine the role of faculty members' qualification in the perceived impediments to using instructional technology, an ANOVA test was conducted. Table 24 shows the results, which report no significant differences due to the qualification variable. This means that faculty members at all levels of qualification (PhD, MA, and BA) recognize the same impediments.

Table (24) ANOVA for Impediments to Use Instructional Technology (as Perceived by Faculty)

Source of Variance | S.S | df | M.S | F | Sig.
Between Groups | .253 | 2 | .126 | .303 | Non sig.
Within Groups | 33.409 | 80 | .418 |  |

A t-test was also conducted to determine the effect of type of institution on faculty members' perception of the impediments to use. The results presented in Table 25 show no significant differences in relation to the type of institution variable, meaning that both sectors' faculty members recognize the same impediments.

Table (25) Impediments to Use Instructional Technology (as Perceived by Faculty) and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 58 | 1.56 | .67 | .176 | Non sig.
Private | 25 | 1.59 | .57 |  |

• As perceived by technical/administrative staff

The technical/administrative staff were asked to identify the impediments to using instructional software and equipment in their workplace on a rating scale of four options: (definite impediment), (possible impediment), (not an impediment), and (don't know); these options were analytically given the values of 3, 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 26 (below) shows the descriptive statistics of the responses. Table 26 shows that the means range between 1.41 and 2.51. Comparing these means with the theoretical mean (2), four impediments were perceived as important by the technical/administrative staff. These impediments are as follows:

1. Little training on ET.
2. Little financial support.
3. Ignorance of how to use costly instructional software/equipment.
4. Low awareness of the importance of ET.


Table (26) Frequencies, Means and Standard Deviations of the Technical/Administrative Staff Responses on the Impediments to Use Instructional Technology

Item | DN | N.I. | P.I. | D.I. | Mean* | SD
Little ET Training | 3 | -- | 15 | 31 | 2.51 | 0.79
Little Financial Support | 5 | 3 | 15 | 27 | 2.28 | 0.97
Ignorance of How to Use Costly ET | 8 | -- | 15 | 25 | 2.19 | 1.08
Low Awareness of ET Importance | 7 | 2 | 16 | 23 | 2.15 | 1.05
Little Administrative Support | 9 | 3 | 23 | 14 | 1.86 | 1.04
Little Number of Fixed/Portable Equipment | 13 | 4 | 15 | 15 | 1.68 | 1.20
Technical Difficulties | 15 | -- | 19 | 14 | 1.67 | 1.21
Inappropriateness of ET for Some Academic Specializations | 14 | 2 | 19 | 12 | 1.62 | 1.17
Lack of Internet Links | 16 | 6 | 14 | 12 | 1.46 | 1.20
ET is Time Consuming in terms of Planning/Designing/Producing Media | 14 | 7 | 17 | 8 | 1.41 | 1.11

* Theoretical mean between 0 and 3. DN = don't know; N.I. = not an impediment; P.I. = possible impediment; D.I. = definite impediment.

This finding seems to reflect the technical/administrative staff's concerns about their roles as planners, designers, and trainers. To enable them to perform these roles, they need to be given adequate training, funding, and instruction in technologies of instruction. There were no significant differences in perceiving the impediments to use due to the technical/administrative staff's qualifications; this was concluded from the ANOVA test presented in Table 27 below.

Table (27) ANOVA for Impediments to Use Instructional Technology (as Perceived by Technical/Administrative Staff)

Source of Variance | S.S | df | M.S | F | Sig.
Between Groups | 3.682 | 3 | 1.227 | 2.378 | Non sig.
Within Groups | 23.220 | 45 | .516 |  |

Further, there were no significant differences in perceiving the impediments to use due to the technical/administrative staff's type of institution. This finding was concluded from the t-test presented in Table 28 below.


Table (28) Impediments to Use Instructional Technology (as Perceived by Technical/Administrative Staff) and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 10 | 1.64 | .87 | .937 | Non sig.
Private | 41 | 1.88 | .70 |  |

This result shows conformity of the respondents' perceptions on the issue of impediments: they identify the same impediments. With this, the 2nd research question is answered.

IV. The Current and Prospective Human, Financial, and Training Resources

IV.A. Training

• Current training

The participants were asked to describe the in-service training workshops they had attended. Table 29 shows the responses and the results of the chi-square test in relation to the qualification variable.

Table (29) In-Service Training Attended and Chi-Square Test for Qualification Variable (n=148)

Qualification | 0-5 Workshops | 6-10 Workshops | >11 Workshops | Chi2 | Sig.
PhD | 28 | 13 | 4 | 6.046 | Non sig.
MA | 35 | 8 | 7 |  |
BA | 42 | 6 | 5 |  |

Table 29 reveals that participants at most levels of qualification attended 0-5 workshops. The chi-square value shows no significant differences, meaning that the qualifications held by the participants have no effect on their in-service training and the number of workshops attended. The participants were also asked to describe the hours allocated weekly to their own career self-development. Table 30 shows the responses and the results of the chi-square test in relation to the qualification variable.

Table (30) Allocated Weekly Self Career Development Hours and Chi-Square Test for Qualification (n=152)

Qualification | 1-9 Hours | 10-19 Hours | 20-29 Hours | Chi2 | Sig.
PhD | 9 | 17 | 21 | 14.134 | .007
MA | 12 | 24 | 17 |  |
BA | 25 | 16 | 11 |  |


Table 30 reveals that most levels of qualification allocated 10-19 weekly hours to self-development. The chi-square value shows significant differences at the 0.007 level in relation to the qualification held by the participants, meaning that qualification positively influences the weekly self-development hours. The participants were also asked to report the number of conferences attended. Table 31 shows their responses and the results of the chi-square test in relation to the qualification variable.

Table (31) Number of Conferences Attended and Chi-Square Test for Qualification (n=137)

Qualification | 5 Conferences | 6-10 Conferences | >11 Conferences | Chi2 | Sig.
PhD | 34 | 8 | 5 | 9.737 | .045
MA | 36 | 7 | 2 |  |
BA | 34 | 1 | 1 |  |

Table 31 reveals that participants at most levels of qualification attended five conferences during their employment. The chi-square value shows significant differences at the 0.045 level in relation to the qualification held by the participants, meaning that qualification has a positive effect on the number of conferences attended.

• Future training needs

The participants were asked to rate the areas of training they need on a rating scale of three options: (very important), (important), and (not important); these options were analytically given the values of 2, 1, and 0, respectively. The participants' responses were then arranged in descending order according to their means. Table 32 (below) shows the descriptive statistics of the responses.

Table (32) Frequencies, Means and Standard Deviations of the Participants' Responses on Future Training Needs

Item | N.I. | I. | V.I. | Mean* | SD
Using Audiovisual Equipment | 44 | 58 | 34 | 1.93 | .76
Using Digital Technology | 52 | 52 | 31 | 1.84 | .77
Using Statistical Packages | 49 | 62 | 24 | 1.81 | .71
Using E-learning Software | 49 | 62 | 21 | 1.79 | .70
Evaluating Instructional Media | 58 | 56 | 18 | 1.70 | .70
Designing Using Software | 57 | 60 | 16 | 1.69 | .68
Using Presentation Equipment | 58 | 61 | 15 | 1.68 | .67
Using Internet | 75 | 38 | 23 | 1.62 | .76
Designing Instructional Software | 74 | 48 | 14 | 1.56 | .65
Designing and Producing Instructional Media | 71 | 52 | 12 | 1.56 | .68
Using Computers | 90 | 33 | 11 | 1.41 | .64
* Theoretical mean between 0 and 2. N.I. = not important; I. = important; V.I. = very important.

Table 32 shows that the means range between 1.41 and 1.93. Comparing these means with the theoretical mean (1), all training areas were perceived by the participants as important. This shows that the participants consider the need for resourceful, multi-purpose training in ET to be persistent. To verify this finding, a t-test was conducted to explore the job variable in relation to future training needs. The results are listed in Table 33. The mean of the faculty members in this test was 1.7597 and the mean of the technical/administrative staff was 1.5127; significant differences were found at the 0.005 level in favor of faculty members. This finding can be attributed to the continuous instruction and research practiced by faculty members.

Table (33) Future Training Needs and t-test for Job Variable

Job | n | Mean | SD | t | Sig.
Faculty | 98 | 1.76 | .49 | 2.834 | .005
Technical/Administrative Staff | 41 | 1.52 | .41 |  |

An ANOVA test was also conducted to examine the role of qualification in the training needs. The results are shown in Table 34; significant differences were observed at the 0.018 level.

Table (34) ANOVA for Future Training Needs

Source of Variance | S.S | df | M.S | F | Sig.
Between Groups | 1.819 | 2 | .910 | 4.121 | .018
Within Groups | 30.021 | 136 | .221 |  |

In order to determine the direction of the differences, the ANOVA was followed by a Scheffe test. The results show significant differences at the 0.012 level, making it clear that BA holders are more in need of training than MA holders (see Table 35). Since BA holders are usually newly employed in higher education institutions, this finding shows that they require more attention in terms of training.

Table (35) Multiple Comparisons (Scheffe) for Future Training Needs

(I) | (J) | Mean Difference (I-J) | Sig.
PhD | MA | -.1654 | Non sig.
PhD | BA | .1048 | Non sig.
MA | BA | .2702 | .012

Another t-test was conducted to compare public and private institutions' participants' perceptions of future training needs. The result presented in Table 36 shows no significant differences due to type of institution, meaning that both sectors' participants view future training needs from the same perspective.

Table (36) Future Training Needs and t-test for Type of Institution Variable

Type of Institution | n | Mean | SD | t | Sig.
Public | 64 | 1.70 | .48 | .405 | Non sig.
Private | 75 | 1.67 | .48 |  |


IV.B. Human and Financial Resources and Research Funds

The technical/administrative staff were asked to give figures on their institutions' current and expected future annual human and financial resources and research funds. Table 37 shows the ranges of their responses.

Table (37) The Participants' Responses on Ranges of Annual Human and Financial Resources and Research Funds (as Perceived by Technical/Administrative Staff)

Item | Current | Future
Technical Staff | 2-50 | 1-80
Annual Budget (R.O.) | 300-50,000 | 400-750,000
Research Funds (R.O.) | 6,000 | 20,000

Table 37 shows a huge increase in future resources in terms of staffing, budgeting, and research funding. This implies that more technicians and administrators should be prepared in the field of ET in order to meet these needs and manage the resources. Since no higher education institution in the Sultanate has a specialized ET department, higher education will need to establish such departments and to recruit more specialized faculty members to teach the courses they offer. A Pearson correlation test was conducted to compare the job specialty variable (technician, administrator, and technician/administrator) with the current/future resources of staffing, budgeting, and research funding. The results of this test are shown in Table 38.

Table (38) Pearson Correlation for Annual Human and Financial Resources and Research Funds

Item | Technical Staff | Annual Budget | Research Funds
Current | -.228 | -.336 | .003
Future | .056 | .481 | -.559

The results in Table 38 show no significant correlations between the current/future resources and the job specialty variable, which means that the three categories of job specialty perceive their current and future resources in the same way. With this, the 4th, 5th, and 6th research questions are answered.
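The Pearson correlations in Table 38 can be computed as sketched below. The paired values are hypothetical, and the numeric coding of job specialty is an arbitrary choice made purely for illustration; only the general procedure, not the data, reflects the study.

```python
from scipy import stats

# Hypothetical paired observations: job specialty coded 1 = Technician,
# 2 = Administrator, 3 = Technician/Administrator (an arbitrary coding for
# illustration) against a reported annual budget in R.O.
job_specialty = [1, 1, 2, 2, 3, 3, 1, 2, 3, 1]
annual_budget = [300, 1200, 5000, 800, 20000, 1500, 700, 9000, 3000, 450]

r, p_value = stats.pearsonr(job_specialty, annual_budget)
print(f"r = {r:.3f}, p = {p_value:.3f}")
```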


Data Analysis, Findings, and Discussions: Part Two

The Analysis of the Interviews

Four faculty members were interviewed to discover the extent to which they understand the value of ET in the teaching and learning processes. Information was sought on how thoughtfully these faculty members use ET, so as to reveal the influential elements behind their use, understanding, and attitudes. The first respondent, who uses technology in his teaching, is an Omani national. He has taught for approximately six years, and his rank at the university is Assistant Professor. The second respondent, who uses ET constantly, is also an Omani national. She has taught for approximately ten years, and her rank at the university is Assistant Professor. The third respondent is an Omani national who has taught for approximately three years; his rank at the university is Lecturer. The fourth, who does not use ET at all in his teaching, is also an Omani national employed as an Assistant Professor. The interview model examines the way faculty members are influenced and encouraged to use ET in their teaching, through the different ways they perceive its value in structuring their lectures and, therefore, their students' learning. It also concerns the way faculty members acknowledge the practical benefits of using ET: for example, how it enhances students' learning; makes ideas accessible, comprehensible, and clear; saves teaching and learning time; and thus aids interaction and participation between lecturers and their students.

I. Faculty Members' Understanding of the Value and Effectiveness of ET in Organizing the Structure of a Lecture

I.A. Interviews

• Interview with faculty member 1
Questioner: "What are the reasons which make you use these particular types of ET?"
Answer: "… It gives me the chance to organize my lecture and present most of the course objectives at the same time."

• Interview with faculty member 2
Questioner: "As an example of ET, where have you found transparencies to be the most useful?"
Answer: "… Transparencies save my time, organize my lecture and help my students take notes. They play a major role in organizing, preparing and arranging the topic."

• Interview with faculty member 3
Questioner: "Through your extensive experience of this technology, where have you found it to be the most useful and effective?"
Answer: "… It helps me arrange and organize my lecture or presentation in the lab in a sequenced and orderly way."


• Interview with faculty member 4
Questioner: "Through your extensive use of educational technology, where have you found it to be the most useful and effective?"
Answer: "… I found it effective in some long lectures, which last more than two hours. These long lectures can become boring if I use only chalk and talk. So in such lectures I try to use ET, which helps me display ideas and their sequence in a systematic manner and to go from one point to another easily and smoothly. I also use video. I found it useful to explain the topic better than I could say it…"

I.B. Interpretation of Interviews

These faculty members understood the general notion that ET helps them keep the essential aspects of the information to be communicated to their students tidy and in sequence. In this regard they understand that ET is a tool which keeps the logic of their lecture in the required sequence. They also understood that ET is useful and effective in structuring the logical sequence of the ideas and facts which they wish to present to their students. ET helped them both organize and prepare their lectures in many different teaching contexts, such as classrooms, laboratories, and lecture theatres. Faculty member 1 illustrated this idea of clear presentation by highlighting the importance of well-designed OHP transparencies in enhancing the clarity and flow of information from which students can readily make notes. Faculty member 2 noted that ET helped her to present and display complex ideas, because she found, from experience, that OHP transparencies can be designed to lay out the sequence of ideas clearly. Thus, she asserted that her use of OHPs was effective and useful in the organization of her teaching, since she was able to display many of the course objectives. Faculty member 3 has found particular technologies useful and effective in teaching science under laboratory conditions; his use of multimedia technology, for example, helped him demonstrate very complex biological ideas to a large group of students, because he was able to relate one fact to another clearly. Faculty member 4 has also found ET, especially the OHP and PowerPoint, useful and effective in managing students' attention and concentration in his lectures. It helped him to display his ideas in a systematic sequence and enabled him to go from one point to another easily and smoothly. He also found that selected videotapes related to his lecture were useful and effective in enhancing the flow of information, as well as in explaining some topics better than he could in his own words. It is clear from the faculty members' responses that they understand the essential purposes of using ET to be to:

• Organize the structure of the lecture;
• Present complex ideas.


II. Faculty members' understanding of the effectiveness and value of ET in presenting complex ideas

II.A. Interviews

• Interview with faculty member 1
Questioner: "Through your extensive experience of this technology, where have you found it the most useful and effective?"
Answer: "… I can say that I have found this technology 95% useful and effective… It reduces students' questions about aspects they have not grasped. Before this technology was available it was more difficult to draw students' attention and to make it clear what I was talking about. Now with this new technology, we can overcome all these problems."

• Interview with faculty member 2
Questioner: "What are the reasons which make you use a particular type of ET instead of other media?"
Answer: "… It gives me the chance to organize my lecture and present most of the course objectives at the same time, because for this course I have to present many objectives at all levels and I used to write them on the board. This took most of my time, especially as I have to designate part of the course for students' practice."

II. B. Interpretation of Interviews
The responses of these two faculty members reflect their understanding of the value and effectiveness of using ET in presenting complex ideas. The first faculty member indicated that the new technology helped him reduce the students' questions, because he used to have to bring every student to the microscope to see the bacteria sample, which used to take most of the lecture time. In addition, his students often had to ask him to repeat what he had said. Before multimedia technology was available it was also difficult for him to convey clearly what he was talking about to a large group of students, because the material contained a lot of visual information which each student needed to see. With the help of this new technology he was able to overcome the problems he previously confronted, because he used ET to project images, whether from the microscope, video or computer, onto a screen for all his class to see. The second faculty member also understood that ET was helpful and effective in presenting complex ideas. She indicated that transparencies gave her the opportunity to present most of her course objectives, instead of writing them on the board, which took most of her time. By listing key ideas on OHP transparencies she could relate one idea to another directly and easily. Thus, the faculty members demonstrated that ET was important in helping them present complex ideas and facts clearly, concisely and economically. This approach of organizing the ideas and information to be learned is an established one.


III. Faculty members' understanding of the effectiveness of ET skills in helping students' concentration

III.A. Interviews

• Interview with faculty member 1
Questioner: "Where have you found transparencies or videotapes effective and useful?"
Answer: "… They attract my students' attention and make them concentrate on the subject, especially if I use more than one medium and display video and audio materials."

• Interview with faculty member 2
Questioner: "Where have you found the overhead projector and video programs, or any other technology you use, effective and useful?"

Answer: "… It helps my students to understand and concentrate their thinking on the topic."

III. B. Interpretation of Interviews
With regard to students' learning, faculty members indicated that ET was useful and effective because it helps their students concentrate and focus directly upon the information being communicated. They found, for example, that students respond very well to both video and audio materials, because these attract interest and subsequently engage the students in the work so that they fully concentrate on the information displayed. Images and words projected on a screen provide students with opportunities to write information at their own pace. This means that they can concentrate better and also see the relationships between ideas more clearly. Faculty member 1 found that when he uses ET his students enjoy the lecture, because he feels that they understand the subject better and therefore concentrate fully on it. Faculty member 2 indicated that ET helped keep her students concentrating on the subject matter. From her experience, her teaching can be designed and planned in advance to focus on key issues. For this reason she claimed that presenting audio and video materials was effective and useful in promoting students' learning because, for example, videos are prepared materials which are designed to communicate specific information. Thus, faculty members who used ET found it useful and effective in improving students' concentration on the subject matter, because ET provides pre-prepared materials which are legible, clean and sequenced. It is these elements which enable the learner to concentrate directly on the subject matter.


IV. Faculty members' understanding of the effectiveness of ET skills in attracting students' attention to the subject matter

IV.A. Interviews

• Interview with faculty member 1
Questioner: "In your extensive use of this particular technology, where have you found it the most useful?"
Answer: "… It occupies my students' time and attracts their attention to the topic."

• Interview with faculty member 2
Questioner: "Where have you found transparencies and videotapes, or any other technology you use, effective and useful?"
Answer: "It was very effective and useful in attracting students' attention…"

• Interview with faculty member 3

Questioner: "If we talk again about your use of ET, what encourages you to make such heavy use of it? What is its value?"
Answer: "… and encourages them to participate more fully and to pay attention to the subject matter."

IV. B. Interpretation of Interviews
Faculty members expressed their understanding of the value and effectiveness of ET in helping them communicate clearly and directly with their students by attracting their attention to the subject they teach. Because students tend to concentrate better, they learn the information being provided better. Faculty member 1 stated that he found multimedia technology useful and effective in attracting students' attention to the topic he is talking about; when using such technology with a large group of students he does not need to repeat what he is explaining, since multimedia helped his students focus more on the subject he teaches. Faculty member 2 also found the use of videotapes effective and useful, and indicated that well-planned and well-designed video and audio materials can attract students' attention to the subject matter, because the power of well-designed visual information and live video should not be underestimated. Faculty member 3 supports his colleagues' views that the use of ET helped him attract his students' attention, because he has noticed that when using ET his students seem to focus more on the subject he teaches. Thus, faculty members demonstrate their belief that ET was effective and useful in promoting students' learning by attracting their students' attention to the subject matter.


V. Faculty members' understanding of the effectiveness of ET skills in facilitating face-to-face interaction

V.A. Interviews

• Interview with faculty member 1
Questioner: "Where have you found transparencies to be the most useful?"
Answer: "… Using transparencies gives me the chance to face my students and interact with them."

• Interview with faculty member 2

Questioner: "What encourages you to continue with the use of ET?"

Answer: “…it creates interaction and a participation atmosphere and also encourages the students to think of using alternative means in their teaching careers in the future”.

• Interview with faculty member 3
Questioner: "What are the reasons that make you use particular types of ET instead of other media?"

Answer: “….When I use transparencies I face my students and interact with them”.

• Interview with faculty member 4
Questioner: "Through your extensive use of ET, where have you found it useful and effective?"
Answer: "… I also found video useful to explain the topic better than me, and to create interaction with the students."

V. B. Interpretation of Interviews
Students' interaction with the subject matter, and with their instructor, is one of the issues in which faculty members demonstrated their understanding of the value of ET in students' learning. Using ET to facilitate interaction with their students also encourages tutors to use it regularly. These faculty members understood that students' engagement is an important element of the learning process, so they use ET to create and manage their students' interaction with the topic and with their instructors. This is made possible by the fact that ET can hold and present complex information in written and visual form which can be viewed clearly. Importantly, the learner can see the interrelationship of ideas more readily. Faculty member 1 noted that ET gave her the opportunity to interact with her students, because the use of OHP presentations helped her face her students, make eye contact, and interact with them.


Faculty member 2 developed this idea by saying that changing the classroom environment through the use of different technologies created interaction and student participation during her lectures. She also asserted that using ET encouraged her students to think about using alternative methods when they become teachers. Faculty member 3 has found ET useful and effective in face-to-face learning, because using transparencies helped face-to-face interaction with students. Faculty member 4 has also found ET useful and effective; he highlights the importance of selecting appropriate videotapes in developing face-to-face interaction. These responses show the faculty members' understanding of the essential role of ET in assisting students' learning. They use ET to create and manage their students' interaction with the topic and with their instructors, and they understand that creating interaction is very important in the learning task.

VI. Faculty members' understanding of the effectiveness of ET skills in helping students' self-directed learning

VI.A. Interviews

• Interview with faculty member 1
Questioner: "What encourages you to use ET continuously?"
Answer: "… It helps in the self-directed learning situation."

• Interview with faculty member 2
Questioner: "What encourages you to use ET continuously?"
Answer: "… It assists students' self-directed learning."

• Interview with faculty member 3
Questioner: "What encourages you to use ET continuously?"
Answer: "… The students now depend on themselves…"

Questioner: “Do you mean it helps in self-directed learning?”

Answer: “Yes, it is very important in developing self-directed learning”


• Interview with faculty member 4

Questioner: "If we talk again about your use of ET, what encourages you to make heavy use of it?"

Answer: "It helps the students to develop self-directed learning…"

VI. B. Interpretation of Interviews
The responses quoted above illustrate faculty members' understanding of the value and effectiveness of ET in self-directed learning. They understand that some of the new technologies offer students the opportunity to teach themselves, or to use ET to extend their knowledge after class, because the students can return repeatedly to the information stored in the technology. Faculty member 1 states that one of the reasons he is encouraged to use ET is that it helps him update his own knowledge through self-learning, as well as being a self-directed learning tool for his students. Faculty member 2 supports this idea and expresses the same feeling by saying that ET assists self-learning. Faculty member 3 has found that particular technologies are useful and effective in students' self-directed learning, because students can now use them to revise the course and enhance their knowledge independently of their teachers. It is clear from the faculty members' responses that they understand ET as being useful and effective in self-directed learning situations, both for themselves and for their students. For this reason they use it regularly in supporting students' learning, since self-directed learning is part of the learning process.


VII. Faculty members' understanding of ET's value and effectiveness in helping to enhance students' achievement

VII.A. Interviews

• Interview with faculty member 1
Questioner: "Have you observed enhanced student achievement through the use of such technology?"
Answer: "Definitely. Before I used this technology I had to bring every student to the microscope to show him the bacteria and to study their movement. It was so difficult to get this image into every student's mind. But now with the technology the student can easily see it from his desk, and that gives him confidence and increases his knowledge, and he really understands what is happening."
Questioner: "How exactly do you know that their achievement was enhanced? Was that through examination?"
Answer: "… What I have noticed about students' achievement is that before I used this technology it was difficult to convey the information about the growth and movement of the bacteria to their minds. This was clear from the questions they kept asking. But now, using the technology, they raise their hands less often, which suggests to me that their achievement has increased."

• Interview with faculty member 2
Questioner: "Have you observed enhanced student achievement while using such technology?"
Answer: "… Definitely, because I have noticed when I use PowerPoint or transparencies or posters in the classroom, the level of student achievement is better than when I use only chalk and talk."

• Interview with faculty member 3
Questioner: "In your opinion, does ET enhance learner achievement?"
Answer: "Yes, it helps the learner's achievement by making things clear for him."

• Interview with faculty member 4
Questioner: "Through your extensive use of ET, have you observed that it has enhanced students' achievement?"
Answer: "Its usefulness and effectiveness is clearly obvious in the level of students' performance in the exam. I noticed that the use of transparencies or any other medium in the classroom was reflected in the arrangement of their answers, especially from those who take notes during the lecture…"


VII. B. Interpretation of Interviews
Enhancing students' achievement is an important aspect of the effectiveness of ET. Those who use technology in their teaching indicated that, from their experience, ET does enhance achievement: grades and examination performance become better when students are taught with the aid of ET. Faculty member 1 has noticed that using multimedia has enhanced his students' understanding of the growth and movement of bacteria. This is clear to him from the fact that his students used to ask him to go over the topic again and again, whereas now they can directly and clearly see what is happening. This is a good example of how a lecturer develops confidence in teaching with ET after seeing a very effective use of it. Faculty member 2 asserted that when she uses PowerPoint, transparencies or posters in her classroom, the level of students' achievement and understanding of the topic is better than when she used only chalk and talk methods. Faculty member 3 has also found that ET enhances learner achievement because it makes the ideas and facts clear to the learner. The response of faculty member 4 extends the idea of enhancing students' achievement. He highlighted the way in which the use of well-designed transparencies was reflected in the students' performance in examinations; he noticed that students who take notes from transparencies achieve better results than those who do not. Thus, it is clear that one of the reasons faculty members use ET is that it enhances their students' achievement, which is an essential part of the learning process.

VIII. Interview Findings

The value and effectiveness of ET can be seen by faculty members in how they organize two aspects of their teaching:

• The organization of the structure of the lecture;
• The presentation of complex ideas.

Furthermore, faculty members who used ET found it useful and effective in enhancing students' learning in various ways:

• It helps students' concentration;
• It attracts attention to the subject matter;
• It allows face-to-face interaction;
• It enhances students' achievement;
• It helps students to develop self-directed learning.


Results and Conclusions

This research was initiated to assess current and prospective views on ET in order to discover the difficulties and develop its utilization in Omani higher education.

On the technical resources level, it was found that almost all equipment/facilities are currently in the range of (5-20) in number, and several institutions lack most of this equipment and these facilities, specifically the new technologies such as Intranet and multimedia labs. A tendency for future expansion towards portable, new, and less expensive technologies was found. The need to increase the quantities of updated technologies/software/equipment was clearly substantiated by the participants' written comments in the open-ended parts of the questionnaires. These findings imply an awareness of the need to expand e-learning, e-classroom, and digital multimedia technologies. They corroborate to a great extent the results reported by Al Rawahy (2001), Al Musawi and Akinyemi (2002), and Al Musawi and Abdelraheem (2004a, 2004b).

On the human and financial resources level, it was found that the participants expect a huge increase in future resources in terms of staffing, budgeting, and research funding. This implies that more technicians and administrators should be prepared in the field of ET in order to meet these needs and manage the resources. Since no higher education institution in the Sultanate has a specialized department of ET, higher education will need to establish specialized university programs in ET and to recruit more specialized faculty members. These findings are substantiated by the research of Trotter (1997), Kook (1997), Boyd (1997), Abu Jaber and Osman (1996), Al Hajri (2000), Al Khawaldi (2000), and Al Musawi (2002).

On the training level, it was found that participants at most levels of qualification had attended (0-5) workshops. All training areas were perceived by the participants as important, and BA holders are more in need of training than MA holders. The need for training was corroborated by the participants' written comments in the open-ended parts of the questionnaires. These findings support those of Abu Jaber and Osman (1996), Boyd (1997), Bialo and Soloman (1997), and Zehr (1997).

On the design and production levels, only experience in transparency design/production was perceived as advanced by the participants. The participants' experience in designing and producing instructional software was not found to be as strong as expected. Technical/administrative staff were found to have far less instructional software design/production experience than the faculty members.

On the use level, the participants perceive (16) items of instructional software/equipment used in the classrooms as indispensable to classroom teaching in their institutions. They also perceive (6) of their abilities to use instructional software in teaching and the workplace as advanced. Further, it was found that PhD holders are more able to use instructional software, and that the participants perceive (4) of their abilities to use new, electronic, and portable instructional equipment/facilities as advanced. This finding partially supports the findings of Abdelraheem and Al Musawi (2003). However, the participants' frequency of using instructional software and equipment was weak.


Moreover, only one impediment (Little Number of Fixed/Portable Equipment) was perceived by faculty members as important, whereas (4) impediments were perceived as important by the technical/administrative staff. Some of these impediments are reviewed in the work of Akinyemi and Al Musawi (2002).

On the correlation level, the findings show agreement between the participants' views in relation to three variables (job, qualification, and type of institution) in terms of their abilities to use instructional equipment/facilities. The findings also show agreement of views in regard to the impediments of use and the evaluation of instructional technology in relation to two variables (qualification and type of institution). There is agreement of views between the participants in regard to the frequency of use and the ability to use instructional software in relation to two variables (job and type of institution).

On the effectiveness level, the views quoted in the interviews confirm that the perceived values of the technology are tangible, and that this is an important reason for lecturers' adoption of the technology. Unless faculty members feel the effectiveness and usefulness of the technology, they will not use it. Their attitudes to the use of ET were influenced by these professional considerations in their teaching. These findings confirm Viadero's (1997) study.

Lastly, participants' written comments show concerns about the following issues:

• Raising the awareness of senior administrators of ET.
• Employing only expatriate faculty who are well versed in ET.
• Keeping abreast of the most up-to-date technology.
• Appointing technical (not academic) supervisors for multimedia labs.
• Speeding up the development of the ET area.
• Increasing coordination amongst HE institutions in regard to ET.

Recommendations

In view of the above, it is recommended that:

1. Omani higher education institutions should be supported with more financial, technical, and human resources to increase the number of new classroom instructional media and equipment and to activate their use.

2. Omani higher education should initiate and fund ET university programs and research.

3. Intensive systematic in-service training programs should be conducted for staff in areas of new ET design, production, use, and evaluation.


References

Abdelraheem, A. & Al Musawi, A. (2003). Instructional Uses of Internet Services by SQU Faculty Members (Part 2), International Journal of Instructional Media, 30(2), 163-176. Available: http://static.highbeam.com/i/internationaljournalofinstructionalmedia/january012003/instructionalusesofinternetservicesbysultanqaboosu/

Abu Jaber, M. & Osman, M. (1996). Utilization of Instructional Technology Services by Faculty Members at Sultan Qaboos University. International Yearbook on Teacher Education (ICET): 1996 Proceedings, (2), 13-21, Amman: Jordan.

Akinyemi, A. & Al Musawi, A. (2002). Re-orientating Instructional Development Practice in Higher Education: A Case of Sultan Qaboos University, In Kelliny, W. (Ed.) Surveys in Linguistics and Language Teaching III, E-Learning and E-Research, European University Studies Journal, (XXI)112, 29-37, Belgium: Brussels.

Al Hajri, M. (2000). The Learning Resources Centers at Ibra Technical Industrial College: Performance Evaluation. Unpublished M.Sc. thesis, University of Sheffield: UK.

Al Khawaldi, H. (2000). Faculty Perceptions towards ET Status at Omani Colleges of Education, Unpublished MA thesis, Yarmouk University: Jordan.

Al Musawi, A. & Abdelraheem, A. (2004a). E-learning at Sultan Qaboos University: Status and future, British Journal of Educational Technology, 35(3), 363-367. Available: http://www.blackwell-ynergy.com/links/doi/10.1111/j. 0007-1013.2004.00394.x/abs/

Al Musawi, A. & Abdelraheem, A. (2004b). The Effect of Using On-Line Instruction on Sultan Qaboos University Students' Achievement and their Attitudes Towards it, Education Journal- Kuwait, 18(70), 11-26.

Al Musawi, A. & Akinyemi, A. (2002). Issues and Prospects of E-Learning in Oman, Proceedings of ED-MEDIA 2002-World Conference on Educational Multimedia, Hypermedia & Telecommunications, (1), 17-18. Available: http://www.aace.org/dl/index.cfm/fuseaction/ViewPaper/id/10017/toc/yes

Al Musawi, A. (2002). The existing Formats and Functions of Media Units in the Omani Higher Education, Journal of Educational and Psychological Sciences- Bahrain, 3 (2), 33-51.

Al Rawahy, H. (2001). E-learning: A Telecommunications Perspective. In: Educational Technology Symposium/Exhibition 2001 Proceedings, SQU.

Bialo, E. & Soloman, G. (1997). Open Your Eyes: The Evidence Is There! Technology and Learning, 18(2), 70-71.

Boyd, E. (1997). Training-On-Demand: A Model for Technology Staff Development, Educational Technology, 37 (4), 46-49.

Hannafin, R.D., & Savenye, S. (1993). Technology In The Classroom: The Teacher's New Role and Resistance To It, Educational Technology, 33 (6), 26-31.


Kook, J. (1997). Computers and Communication Networks in Educational Settings in the Twenty-First Century: Preparation for Educator's New Roles, Educational Technology, 37 (2), 56-60.

Trotter, A. (1997). A Test of Leadership, Education Week on the Web, Available: http://www.edweek.org/sreports/tc/admin/ad-n.htm (1997, December 4).

Viadero, D. (1997). A Tool for Learning, Education Week on the Web, Available: http://www.edweek.org/sreports/tc/class/cl-n.htm (1997, December 4).

Zehr, M. (1997). Teaching the Teachers, Education Week on the Web, Available: http://www.edweek.org/sreports/tc/teach/te-n.htm (1997, December 4).