VOL. 25, No. 2
This study identified the problem solving strategies used by students within a university course designed to teach pre-service teachers educational technology, and whether those strategies were influenced by the format of the course (i.e., face-to-face computer lab vs. online). It also examined to what extent the type of problem solving strategies and/or course format was correlated with students’ expressed level of confidence and competence to integrate technology into their future classroom settings. Data were extracted from surveys of over 1,500 students who had taken the educational technology methods course during one of nineteen semesters at a Midwestern university.
Results revealed the vast majority of students in both the face-to-face and online sections of a technology methods course felt they could integrate technology into their teaching, teach such technology to others, and use technology in their future teaching (technology competence). However, the online students had significantly higher self-reported levels of technology competence. Significant differences were found between face-to-face and online students in the problem solving skills used while taking the course. However, there were no differences in problem solving methods for specific tasks. Small predictive relationships between technology competence, technology use confidence, problem solving skills, and course format were found.
This study identified the problem solving strategies used by students in a university course designed to teach educational technology to pre-service teachers, as well as the influence that the course format (e.g., face-to-face computer lab vs. online) may have had on those strategies. The study also examined the extent to which the type of problem solving strategy and/or the course format correlated with students’ expressed level of confidence and competence to integrate technology into their future classrooms. Data were drawn from surveys of more than 1,500 students who had taken the educational technology methods course during one of nineteen semesters at a Midwestern university.
Results showed that the vast majority of students, in both the face-to-face and online sections of a technology methods course, felt they could integrate technology into their teaching, teach that technology to others, and use technology in their future teaching (technology competence). However, the online students had significantly higher self-reported levels of technology competence. Significant differences were found between face-to-face and online students in the problem solving skills used during the course; however, there were no differences in problem solving methods for specific tasks. Small predictive relationships were found between technology competence, technology use confidence, problem solving skills, and course format.
Today’s educational environments must include technology as an integral component (Lynch, 2009). Educational researchers have discovered that the adoption of technology not only fosters student learning but also has a positive impact on achievement when used appropriately (U.S. Department of Education [DOE], 2008). Technology-based tools can enhance student performance when they are integrated into the curriculum and used with knowledge about learning.
Given ever expanding content and technology choices, from multimedia to video via the Internet, there is an extraordinary need to understand how to achieve success, involving learner, teacher, curriculum, and the school environment in which technology is used (Marshall, 2002). Technology tools must now be an integral part of education, not merely equipment existing in a classroom. In response, teacher education programs in the United States have increased their emphasis on technology integration, attempting to ensure that graduates meet technology standards (Stobaugh & Tassell, 2011). Indeed, the International Society for Technology in Education (ISTE) has developed performance assessment standards for initial and advanced educational computing and technology programs (National Council for the Accreditation of Teacher Education [NCATE], 1997; ISTE, 2000). These technology standards have been adopted by NCATE as a required component of pre-service teacher (i.e., those studying to become teachers) programs. To meet these technology accreditation requirements and performance standards, most colleges and universities have established a required technology course in their teacher preparation programs (Tanguma, Martin, & Crawford, 2002). A goal for these programs is that upon graduation, all pre-service teachers will be proficient in ISTE technology performance standards.
Overall, teacher preparation programs are tasked to create tech-savvy teachers who in turn can assist their students to convert information into meaningful knowledge, develop technological competencies, and participate in meaningful activities using advanced technologies (Kuiper, Volman, & Terwel, 2005). Yet, while it is important for pre-service teachers to graduate with proficient technology skills, this is not enough. Studies have found those who lack competence and/or confidence in technology will not use it (Gulbahar, 2008). Moreover, both pre-service and in-service teachers report moderately low levels of technology competence and confidence (Bers, 2010), and thus may not seek to integrate technology into their future teaching practices. Indeed, only 52% of the 4.0 million teachers working in public schools feel comfortable using the technologies available to them (National Center for Educational Statistics [NCES], 2009).
With over half of American public school teachers feeling uncomfortable with the integration of technology (NCES, 2009), and teachers reporting moderately low levels of technology competence (Bers, 2010), more research is needed regarding the technology-usage confidence and competence of pre-service teachers. Research perspectives in education define confidence as assuredness in oneself and in one’s capabilities (Simon & Durand-Bush, 2009), and competence as a combination of skills, abilities, and knowledge needed to perform a specific task (National Postsecondary Education Cooperative [NPEC], 2002). Confident and competent people feel they are able to tackle things or start new things and finish them, even though there may be learning challenges along the way.
One way to build such confidence and competence is to ensure pre-service teachers understand the material and are able to complete assignments in their technology courses (Karabenick & Newman, 2006). As students are introduced to new technologies, some may face new problems. For example, when a new software program is needed to complete an assignment, students may have difficulties mastering the organization and command options of that software. Yet, previous research has revealed that when faced with new problems, many learners hesitate to participate in the learning process because they lack confidence to solve such problems (Karabenick & Newman, 2006).
Until recently, the notion of seeking help from others (i.e., problem solving) was considered of little value, on the assumption that truly independent learners should not need others to succeed (Karabenick & Newman, 2006). In the literature on this topic, the concept of seeking help to solve problems is referred to in multiple ways (e.g., help seeking strategies, problem solving); however, for consistency within this research, it is referred to as problem solving strategies.
However, problem solving is now listed among the most important activities contributing to university student success (McGee, 2005). Seeking help, or getting the assistance necessary to accomplish tasks independently, is an important self-regulation strategy that has been linked to high academic achievement and learner satisfaction in higher education (Zusho, Karabenick, Bonney, & Sims, 2007).
One standard theory on problem solving, initially outlined by Newell, Shaw, and Simon (1958), focuses on how humans respond when they are confronted with unfamiliar tasks. They found that problem solving is a social behavior grounded in the values and role structures of a given group or culture. Indeed, learners’ use of others to acquire information and master skills plays a central role in theories of development and learning (Vygotsky, 1978). By seeking help from others when necessary, the learner has the skill to take on more challenging tasks. Overall, seeking help when one cannot solve a problem is preferable to giving up (Butler, 1998), and research on problem solving has clearly established the importance of problem solving for students’ learning and mastery (Ames, 1983).
DePaulo, Nadler and Fisher (1983) published an important edited series of research studies that examined how learners seek help and investigated several aspects of the problem solving process. The meta-cognitive aspect of this process focuses mainly on the question-asking behavior or strategies used by students (Karabenick & Knapp, 1998; Puustinen, 1998). Of these question-asking behaviors, four primary strategies to solve problems have been found: (1) seeking instructor assistance, (2) seeking peer assistance, (3) further reading, and (4) trial and error.
In reference to students seeking instructor assistance, Newman and Goldin (1990) found that students often seek such assistance because they believe the teacher is more competent and better able to instruct. Students believe help from the teacher is more likely to foster learning (Newman & Goldin, 1990; Newman & Schwager, 1993). Also, students who seek help from instructors have a tendency to be more successful and master the task at hand (Bembenutty, 2006).
In reference to the second problem solving strategy of seeking peer assistance, there is evidence that this is extremely effective for a wide range of course goals and content, and for students with different levels and personalities (Johnson & Johnson, 1999). According to Karabenick and Knapp (1998), students view their peers as a fruitful source and often turn to them for assistance.
The third major problem solving strategy is doing further reading on one’s own and then working independently to solve course assignment problems (Karabenick & Newman, 2006). One way students work independently is to review information given in class, along with any extra reading materials, for assistance in how to complete an assignment. Another reading source is a software program’s “Help” feature, which students consult to find answers to their questions. Students feel this help option is available to them when and if they need it, without the worry of time constraints (Whitesel, 2002).
The fourth major problem solving strategy is that of students solving problems via a trial and error method to complete a task. They just keep trying different things until they find the correct way to complete an assignment or a project (Johnson & Johnson, 1999). It has been found that using the trial and error method creates a meaningful learning experience, which may result in finding several solutions to a problem (Pavlina, 2005).
Previous research has established that problem solving allows the learner to acquire and master complex skills, thereby supplying the learner with confidence and competence. Studies tell us that if a teacher does not feel confident or competent with a skill, they will not use it (Lin, 2008; NCES, 2005). A principle emerging from research which compares the performance of experts and novices, and from research on learning and transfer, establishes that in order “to develop competency in an area, learners must have a deep foundation of knowledge and be able to organize that knowledge in ways that facilitate retrieval and application” (Bransford, Brown, & Cocking, 2000, p. 16).
Applying this concept to technology, Van Braak and Goeman (2003) identified several determinants of self-perceived technology confidence. Of these, students’ attitudes toward computers were found to be the strongest determinant of self-perceived computer competency. Positive attitudes toward computers, in turn, seemed to be mainly influenced by computer experience and amounts of computer use. Computer experiences and students’ attitudes toward computers were also found to directly impact technology competence. This leads us to believe that problem solving is seen as a task-relevant effort, an investment that increases competence and confidence (Ames, 1983).
Some limited research has been conducted on problem solving issues within different course formats. In a face-to-face course, students have a specific time to be in the classroom, and are able to receive one-on-one instruction, give and receive visual cues, interact socially and personally with peers, view modeling and demonstrations, and have spontaneous discussions (Tutty & Kline, 2008). Students can ask their instructor for assistance with any problems orally and in real time. In the online environment, by contrast, students are able to schedule study time around a work or social schedule, and can join discussions and revisit instructions at any time (Palloff & Pratt, 2007). If students need assistance from their instructor, however, they must rely on e-mail and a possibly delayed response from the instructor, who may or may not be available for real-time chats or other forms of synchronous communication (such as a telephone conversation or instant messaging).
Overall, the research on problem solving issues within different course formats has been mixed. For example, Kitsantas and Chow (2007) found that students in an online environment are more comfortable seeking help than in a traditional classroom. In another study comparing problem solving in an online environment to problem solving in a traditional classroom, Kumrow (2007) found more students in the online course engaged in problem solving. These online students also had higher final grades than those in the face-to-face classroom. Yet, another study on academic problem solving by Oberman (2006) found that physical proximity was an important factor of problem solving behaviors in a high school computer lab.
Given such outcome differences, more research was clearly needed. Therefore, our quantitative study explored the problem solving strategies used by pre-service teachers in learning specific technology skills and whether those strategies were influenced by the format of a course (i.e., face-to-face computer lab vs. online). In addition, it examined to what extent the type of problem solving strategies and/or course format was correlated with students’ expressed level of confidence and competence to integrate technology into their future classroom settings. Specifically, the following research questions were examined:
This study used a causal-comparative ex post facto research design, analyzing five years of data for over 1,500 university students (the Summer I 2005 semester through the Fall 2009 semester). Data were collected from pre-service teachers who completed an educational technology teacher methods course (EDT 3470), through either an online or face-to-face format at Western Michigan University. This educational technology course was created to ensure that all pre-service teachers are proficient in required technology skills, and is based on ISTE Technology Standards. During the course, students learn to use several computer applications and are required to demonstrate their competency in using each application via class assignments.
We used five years of data to enhance the size of our population and statistical power within the results. Over the 5-year period, minimal course changes were made to address new technology applications; however, such revisions were made equally within both the face-to-face and online formats. While one could assume that students may have become more technologically competent and confident during this five-year period, we focused on differences between course formats, not changes over time.
Students enroll in EDT 3470 because it is a required class for all elementary pre-service teachers at this university. These pre-service teachers have all been admitted to the elementary education program after completing 35 credit hours, with a cumulative grade point average of 2.5 or better at the time of application.
Students in this class are requested to complete an inventory of their perceived technology skills at the beginning and end of the course, using an instrument entitled ProfilerPRO. This assessment was created by Advanced Learning Technologies (ALTec), a division of the University of Kansas Center for Research and Learning, with its development partially funded by the U.S. Department of Education (ALTec, 2004).
The ProfilerPRO survey measures what a student leaving a teacher education program should know and be able to perform according to national ISTE standards and performance indicators. The instrument has 17 forced-choice, Likert-scale items on: technology operation and concepts; teaching, learning, and the curriculum; productivity and professional practice; and social, ethical, and human issues. Respondents are asked to answer the questions as if they were currently teaching. The forced choice selections for participants include: (a) no opportunity or exposure, (b) I am aware, but I do not use this in my practice, (c) I am literate and integrate some of the indicators, (d) I integrate this into my teaching, and (e) I am able to teach others.
The second data set used in this study was from an end-of-course evaluation instrument developed for the EDT 3470 course. This evaluation is given online using Zoomerang software, and has forced-choice Likert-scale items measuring a number of items related specifically to the EDT 3470 course, including the problem solving strategies used overall in the course, as well as for specific technology tasks.
Fifteen hundred and twelve students over 19 semesters responded to all questions on the ProfilerPRO survey (both Pre and Post), which measured technology competence. Of those students, 870 (57.5%) took EDT 3470 in the face-to-face computer lab, while 642 (42.5%) took the course online. Thirteen hundred and eighty students over 19 semesters responded to all questions on the course evaluation survey, which measured problem solving approaches and technology use confidence. Of those students, 858 (62.2%) took EDT 3470 in the face-to-face computer lab while 522 (37.8%) took the course online. Table 1 summarizes the number of students from whom data were available for analysis in our study.
Table 1. Research Participants
Survey | F2F n (%) | Online n (%) | Total (N)
ProfilerPRO (measuring technology competence) | 870 (57.5) | 642 (42.5) | 1512
Course Evaluation (measuring problem solving and technology confidence) | 858 (62.2) | 522 (37.8) | 1380
While this study involved the comparison of data from a large number of students who had taken the same university course in one of two formats (i.e., face-to-face or online), a major limitation is that no random selection or assignment of students to such courses was possible. Therefore, students who were more inclined to use technology may have opted to take the online version. Data analysis therefore began by examining the Pre-ProfilerPRO data for any initial differences between the students within the online and face-to-face courses. Some significant differences (p ≤ .05) were found; therefore, an analysis of covariance (ANCOVA) was used to examine all comparison data for the face-to-face and the online groups. Preliminary checks were conducted to ensure that there was no violation of assumptions of normality, linearity, homogeneity of variances, homogeneity of regression slopes, and reliable measurement of the covariate. For each of the analyses, the independent variable was the course format (i.e., face-to-face, online).
Pearson correlation coefficients and regressions were also performed to determine what relationships existed between students’ self-reported technology competencies, self-reported technology usage confidence, and their use of any given problem solving strategies within the online course and the face-to-face computer lab course; and any differences within these relationships between the two course formats.
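The analysis described above can be sketched in code. This is an illustrative sketch only, using a small synthetic dataset (the study’s data are not public); the variable names and effect sizes are hypothetical. ANCOVA is fit here as an ordinary least squares model with course format as the independent variable and the pre-course score as the covariate, and a Pearson correlation is computed between the covariate and the outcome.

```python
import numpy as np

# Synthetic stand-in data (hypothetical): 0 = face-to-face, 1 = online.
rng = np.random.default_rng(0)
n = 200
online = rng.integers(0, 2, size=n)            # course format (independent variable)
pre = rng.normal(3.5, 0.5, size=n)             # pre-course survey score (covariate)
post = 0.6 * pre + 0.2 * online + rng.normal(0.0, 0.3, size=n)  # post-course score

# ANCOVA as OLS: post ~ intercept + course format + pre-course covariate.
# The coefficient on `online` is the format effect adjusted for the covariate.
X = np.column_stack([np.ones(n), online, pre])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)
intercept, format_effect, covariate_slope = beta

# Pearson correlation between the covariate and the outcome.
r = np.corrcoef(pre, post)[0, 1]
```

In practice the study’s covariate was the Pre-ProfilerPRO score; the adjusted format coefficient plays the role of the ANCOVA group effect.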
This section offers key findings for each research question, first offering the respondents’ views via frequencies and means, followed by patterns within the responses. Additional details regarding these findings can be found in Peterson (2010).
Table 2 summarizes data related to self-reported technology competence after taking the EDT 3470 course. Overall, these data reveal that more than 85% of the students in both the face-to-face and online sections felt, after completing the university course (which involved using several computer applications to create a variety of projects), that they were able either to integrate technology into their teaching or to teach such technology to others (i.e., technology competence). For each technology skill, those in the online course reported significantly higher levels of technology competency at the end of the course.
Table 2. Self-Reported Technology Competency
Technology Skill | Format | No Opp n (%) | Aware n (%) | Literate n (%) | Integrate n (%) | Teach n (%) | M
Use to locate information | F2F | 0 (0.0) | 7 (0.8) | 105 (12.1) | 358 (41.1) | 400 (46.0) | 4.32
 | Online | 0 (0.0) | 3 (0.5) | 43 (6.7) | 195 (30.4) | 401 (62.5) | 4.55*
Use to improve products, learning | F2F | 1 (0.1) | 11 (1.3) | 112 (12.9) | 433 (49.8) | 313 (36.0) | 4.20
 | Online | 0 (0.0) | 4 (0.6) | 42 (6.5) | 255 (39.7) | 341 (53.1) | 4.45*
Exhibit positive attitude for use | F2F | 1 (0.1) | 11 (1.3) | 119 (13.7) | 425 (48.9) | 314 (36.1) | 4.20
 | Online | 0 (0.0) | 3 (0.5) | 49 (7.6) | 249 (38.8) | 341 (53.1) | 4.45*
Use to manage/communicate | F2F | 1 (0.1) | 26 (3.0) | 190 (21.8) | 390 (44.8) | 263 (30.2) | 4.02
 | Online | 1 (0.2) | 8 (1.2) | 92 (14.3) | 247 (38.5) | 294 (45.8) | 4.29*
Use variety of media/formats | F2F | 2 (0.2) | 29 (3.3) | 182 (20.9) | 405 (46.6) | 252 (29.0) | 4.01
 | Online | 0 (0.0) | 9 (1.4) | 84 (13.1) | 243 (37.9) | 300 (46.7) | 4.29*
Understand technology systems | F2F | 3 (0.3) | 20 (2.3) | 184 (21.1) | 425 (48.9) | 238 (27.4) | 4.01
 | Online | 2 (0.3) | 5 (0.8) | 95 (14.8) | 277 (43.1) | 263 (41.0) | 4.24*
Use to develop strategies | F2F | 0 (0.0) | 17 (2.0) | 158 (17.2) | 464 (53.3) | 239 (27.5) | 4.06
 | Online | 0 (0.0) | 5 (0.8) | 86 (13.4) | 287 (44.7) | 264 (41.1) | 4.26*
Use to gain higher thinking skills | F2F | 1 (0.1) | 14 (1.6) | 170 (19.5) | 479 (55.1) | 206 (23.7) | 4.01
 | Online | 0 (0.0) | 3 (0.5) | 83 (12.9) | 319 (49.7) | 237 (36.9) | 4.23*
Use content tools to support learning/research | F2F | 0 (0.0) | 16 (1.8) | 164 (18.9) | 427 (49.1) | 263 (30.2) | 4.08
 | Online | 0 (0.0) | 5 (0.8) | 68 (10.6) | 295 (46.0) | 274 (42.7) | 4.31*
Use to process data/report results | F2F | 1 (0.1) | 13 (1.5) | 124 (14.3) | 380 (43.7) | 352 (40.5) | 4.23
 | Online | 1 (0.2) | 2 (0.3) | 76 (11.8) | 222 (34.6) | 341 (53.1) | 4.40*
Observe use in field of study | F2F | 1 (0.1) | 29 (3.3) | 213 (24.5) | 436 (50.1) | 191 (22.0) | 3.90
 | Online | 1 (0.2) | 9 (1.4) | 111 (17.3) | 301 (46.9) | 220 (34.3) | 4.14*
Select proper tool for specific tasks | F2F | 3 (0.3) | 18 (2.1) | 215 (24.7) | 440 (50.6) | 194 (22.3) | 3.92
 | Online | 0 (0.0) | 6 (0.9) | 101 (15.7) | 316 (49.2) | 219 (34.1) | 4.17*
Use to create group projects | F2F | 2 (0.2) | 30 (3.4) | 204 (23.4) | 383 (44.0) | 251 (28.9) | 3.98
 | Online | 1 (0.2) | 11 (1.7) | 116 (18.1) | 271 (42.2) | 243 (37.9) | 4.16*
Use input/output devices | F2F | 7 (0.8) | 40 (4.6) | 270 (31.0) | 376 (43.2) | 177 (20.3) | 3.78
 | Online | 1 (0.2) | 21 (3.3) | 138 (21.5) | 282 (43.9) | 200 (31.2) | 4.03*
Likert scale: 1 = No opportunity or exposure, 2 = I am aware, but do not use this in my practice, 3 = I am literate and integrate some of the indicators, 4 = I integrate this into my teaching, 5 = I am able to teach others. * p ≤ .05
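As a worked check of how the reported means relate to the frequency columns, the mean M in each row is the frequency-weighted average of the scale values 1 through 5. Using the F2F counts for “Use to locate information” from Table 2:

```python
# F2F response counts for "Use to locate information" (Table 2),
# ordered from "No opportunity" (1) to "Able to teach others" (5).
counts = [0, 7, 105, 358, 400]
scale = [1, 2, 3, 4, 5]

n_total = sum(counts)                                   # 870 F2F respondents
mean = sum(c * s for c, s in zip(counts, scale)) / n_total
print(round(mean, 2))                                   # 4.32, matching the reported M
```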
Turning to patterns, a comparison of the percentage differences for each individual technology skill is displayed in Table 3. The table shows how much higher one format was than the other for each competence skill, and which differences were significant.
Table 3. Comparison of Competence for Integration and Teaching of Technology Skills
Technology Skill | Teach Technology (Agree/Strongly Agree) | Integrate Technology (Agree/Strongly Agree)
Use to locate information | Online +17%* | F2F +11%*
Use to improve products, learning | Online +17%* | F2F +10%*
Exhibit positive attitude for use | Online +17%* | F2F +10%*
Use to manage/communicate | Online +16%* | F2F +9%*
Use a variety of media/formats | Online +16%* | F2F +9%*
Understand nature/operation of technology systems | Online +14%* | F2F +6%*
Use to develop strategies | Online +13%* | F2F +8%*
Use to gain higher thinking skills | Online +13%* | F2F +5%*
Use content-specific tools to support learning/research | Online +13%* | F2F +3%
Use to process data/report results | Online +12%* | F2F +9%*
Observe/experience use in field of study | Online +12%* | F2F +3%
Use proper tool for specific tasks | Online +12%* | F2F +2%
Use to create group projects | Online +12%* | F2F +2%
Use input/output devices | Online +11%* | Online +1%
Note: + # = % higher than the % in the other format (the format shown is the higher one); * = significant difference
Even though a majority of the students considered themselves competent enough either to integrate technology into their teaching or to teach it to others, some significant differences emerged between the course formats. Across the 14 technology skills assessed, a consistent pattern appeared: students in the online course felt more competent to teach each skill, while students in the face-to-face course felt more able to integrate it into their teaching. For example, 63% of students in the online course agreed or strongly agreed that they were able to teach the use of technology to locate, evaluate, and collect information from a variety of sources (i.e., use to locate information), compared with 46% of students in the face-to-face course (a difference of 17%). Conversely, 41% of students in the face-to-face course felt able to integrate this competency into their teaching, compared with 30% of the online students (a difference of 11%). The only exception to this pattern was for input/output devices: online students (44%) believed they could integrate their use slightly more than face-to-face students (43%) (a difference of 1%).
Table 4 contains the frequencies, percentages, and means for the three survey questions related to self-reported technology use confidence after taking EDT3470.
Table 4. Technology Confidence
Technology Use | Format | Strongly Disagree n (%) | Disagree n (%) | No Opinion n (%) | Agree n (%) | Strongly Agree n (%) | M
Understands how technology supports student learning | F2F | 6 (0.7) | 21 (2.4) | 85 (9.9) | 444 (51.7) | 302 (35.2) | 4.18
 | Online | 5 (1.0) | 13 (2.5) | 47 (9.0) | 246 (47.1) | 211 (40.4) | 4.24
Will use technology to enhance student learning | F2F | 17 (2.0) | 35 (4.1) | 94 (11.0) | 427 (49.8) | 285 (33.2) | 4.08
 | Online | 14 (2.7) | 31 (5.9) | 45 (8.6) | 217 (41.6) | 215 (41.2) | 4.13
Can update portfolio after course | F2F | 16 (1.9) | 40 (4.7) | 132 (15.4) | 411 (47.9) | 259 (30.2) | 4.00*
 | Online | 17 (3.3) | 54 (10.3) | 76 (14.6) | 205 (39.3) | 170 (32.6) | 3.88
Likert scale: 1 = Strongly disagree, 2 = Disagree, 3 = No opinion, 4 = Agree, 5 = Strongly agree. * p ≤ .05
This study found that most (approximately 85%) of the students in both the face-to-face and online sections felt, after taking EDT 3470, that they had the confidence to use technology in their future teaching. No significant difference was found between course formats for two of the three confidence questions. However, a small significant difference was revealed when students were asked about updating the portfolio created while taking EDT 3470. More students (78%, with 48% who agree and 30% who strongly agree) in the face-to-face course than in the online course (72%, with 39% who agree and 33% who strongly agree) agreed or strongly agreed they could do so (a difference of 6%).
A comparison of the percentage differences for each technology use confidence item is displayed in Table 5. The table shows how much higher one format was than the other and which difference was significant.
Table 5. Comparison of Confidence for Agree and Strongly Agree Respondents
Technology Use | F2F (Agree/Strongly Agree) | Online (Agree/Strongly Agree)
Understands how technology supports student learning | | +1%
Will use technology to enhance student learning | (no difference) | (no difference)
Can update portfolio after course | +6%* |
Note: + # = % higher than the % in the other format; * = significant difference
Students were also asked what problem solving skills they had used while taking EDT 3470. Our study revealed that overall use of the four problem solving skills examined in this research differed significantly between course formats. Most students in the face-to-face course waited for assistance from their instructor or sought an answer from a peer, while most online students chose to discover an answer through trial and error or through further reading. Table 6 summarizes the frequencies, percentages, and means for these data.
Table 6. Overall Problem Solving Strategies Used
Strategy | Format | Never n (%) | Not Very Often n (%) | Often n (%) | Most of the Time n (%) | Always n (%) | M
Wait for instructor’s assistance | F2F | 22 (2.6) | 167 (19.5) | 269 (31.4) | 299 (34.8) | 101 (11.8) | 3.34*
 | Online | 26 (5.0) | 144 (27.6) | 152 (29.1) | 142 (27.2) | 58 (11.1) | 3.12
Finding out answer from peer | F2F | 40 (4.7) | 127 (14.8) | 287 (33.4) | 314 (36.6) | 90 (10.5) | 3.33*
 | Online | 33 (6.3) | 110 (21.1) | 142 (27.2) | 183 (35.1) | 54 (10.3) | 3.22
Trial and error on your own | F2F | 93 (10.8) | 187 (21.8) | 280 (32.6) | 213 (24.8) | 85 (9.9) | 3.01
 | Online | 19 (3.6) | 84 (16.1) | 111 (21.3) | 228 (43.7) | 80 (15.3) | 3.51*
Further reading | F2F | 133 (15.5) | 252 (29.4) | 272 (31.7) | 154 (17.9) | 47 (5.5) | 2.69
 | Online | 30 (5.7) | 120 (23.0) | 240 (46.0) | 100 (19.2) | 32 (6.1) | 2.97*
Likert scale: 1 = Never, 2 = Not very often, 3 = Often, 4 = Most of the time, 5 = Always. * p ≤ .05
A summary of the percentage differences for the overall problem solving skills used during EDT 3470 is displayed in Table 7. The table shows how much higher one format was than the other in the use of each problem solving skill, and which differences were significant.
Table 7. Comparison of Problem Solving Strategies Used Most Overall

| Strategy | F2F (Often/Most of the time/Always) | Online (Often/Most of the time/Always) |
|---|---|---|
| Instructor | +11%* | |
| Peer | +9%* | |
| Trial and Error | | +13%* |
| Further Reading | | +16%* |

Note: + # = % higher than the % in the other format; * = significant difference
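The percentage differences above can be reproduced from the counts in Table 6. The paper does not state which statistical test produced the significance markers; purely as an illustrative sketch, a chi-square test of independence on the collapsed counts (an assumed test choice, not necessarily the authors' method) recovers the "+13%" trial-and-error difference:

```python
from scipy.stats import chi2_contingency

# Counts from Table 6, "Trial and error on your own", collapsed into
# high use (Often + Most of the time + Always) vs. low use
# (Never + Not very often).
# F2F:    280 + 213 + 85 = 578 high, 93 + 187 = 280 low
# Online: 111 + 228 + 80 = 419 high, 19 +  84 = 103 low
table = [[578, 280],
         [419, 103]]

# Chi-square test of independence (assumed test; the paper does not
# specify the procedure behind its asterisks).
chi2, p, dof, expected = chi2_contingency(table)

f2f_pct = 578 / (578 + 280)
online_pct = 419 / (419 + 103)
diff = online_pct - f2f_pct  # matches the "+13%" for online in Table 7
print(round(diff, 2), p < 0.05)
```

The same collapse-and-compare step applied to the other three strategies yields the remaining entries in Table 7.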
In addition to being asked which problem solving strategies they tended to use overall, students were given specific technology "problems" and asked which of the problem solving skills they used for each. For example, when students had trouble creating internal links while building a web site, 31% of the students in the face-to-face course waited for their instructor to assist them, while only 3% of the online students waited for instructor help (a difference of 28%). Table 8 summarizes the data for each of the 15 technology tasks. Note that the course survey was revised over time, with some questions updated to reflect current technologies, so not every task was presented to every cohort.
Table 8. Problem Solving Skills Used for Specific Technology Competencies

| Question | Format | Instructor n (%) | Peer n (%) | Trial & error n (%) | Read n (%) | M |
|---|---|---|---|---|---|---|
| Summer I 2005 to Fall 2009 participants (n = 1380) | | | | | | |
| Getting Interactive PowerPoint to work | F2F | 155 (18.1) | 178 (20.7) | 492 (57.3) | 33 (3.8) | 2.47 |
| | Online | 44 (8.4) | 70 (13.4) | 353 (67.6) | 55 (10.5) | 2.80* |
| Inserting Images into Movie | F2F | 258 (30.1) | 189 (22.0) | 384 (44.8) | 27 (3.1) | 2.21 |
| | Online | 59 (11.3) | 109 (20.9) | 295 (56.5) | 59 (11.3) | 2.68* |
| Exporting slide show into movie | F2F | 328 (38.2) | 173 (20.2) | 320 (37.3) | 37 (4.3) | 2.08 |
| | Online | 63 (12.1) | 118 (22.6) | 274 (52.5) | 67 (12.8) | 2.66* |
| Managing Site | F2F | 281 (32.8) | 209 (24.4) | 333 (38.8) | 35 (4.1) | 2.14 |
| | Online | 63 (12.1) | 173 (33.1) | 225 (43.1) | 61 (11.7) | 2.54* |
| Summer I 2005 to Summer II 2008 participants (n = 1221) | | | | | | |
| Getting images to show up on web site | F2F | 162 (20.0) | 154 (19.1) | 463 (57.3) | 29 (3.6) | 2.46 |
| | Online | 63 (15.3) | 138 (33.4) | 179 (43.3) | 33 (8.0) | 2.46 |
| Getting attachments to show up on web site | F2F | 232 (28.7) | 162 (20.0) | 387 (47.9) | 27 (3.3) | 2.26 |
| | Online | 52 (12.6) | 123 (29.8) | 201 (48.7) | 37 (9.0) | 2.55* |
| Sending files to server for web site | F2F | 304 (37.6) | 166 (20.5) | 303 (37.5) | 35 (4.3) | 2.08 |
| | Online | 56 (13.6) | 121 (29.3) | 190 (46.0) | 46 (11.0) | 2.56* |
| Getting files from the server for web site | F2F | 273 (33.8) | 179 (22.2) | 320 (39.6) | 36 (4.5) | 2.15 |
| | Online | 47 (11.4) | 112 (27.1) | 210 (50.8) | 44 (10.7) | 2.61* |
| Linking to external sites properly | F2F | 262 (32.4) | 156 (19.3) | 350 (43.3) | 40 (5.0) | 2.23 |
| | Online | 48 (11.6) | 86 (20.8) | 227 (55.0) | 52 (12.6) | 2.72* |
| Linking to internal pages properly | F2F | 250 (30.9) | 153 (18.9) | 371 (45.9) | 34 (4.2) | 2.24 |
| | Online | 35 (2.9) | 80 (19.4) | 256 (40.8) | 42 (10.2) | 2.79* |
| Creating "hot links" in Word documents | F2F | 202 (25.0) | 139 (17.2) | 419 (51.9) | 48 (5.9) | 2.39 |
| | Online | 33 (8.0) | 90 (21.8) | 236 (57.1) | 54 (13.1) | 2.76* |
| Adjusting the content areas in Publisher to meet requirements | F2F | 177 (21.9) | 133 (16.5) | 438 (54.2) | 60 (7.4) | 2.48 |
| | Online | 43 (10.4) | 117 (28.3) | 214 (51.8) | 39 (9.4) | 2.56* |
| Fall 2008 to Fall 2009 participants (n = 159) | | | | | | |
| Setting up a blog | F2F | 7 (14.0) | 20 (40.0) | 22 (44.0) | 1 (2.0) | 2.34 |
| | Online | 2 (1.8) | 33 (30.3) | 62 (56.9) | 12 (11.0) | 2.77* |
| Communicating with group members | F2F | 0 (0.0) | 31 (62.0) | 17 (34.0) | 2 (4.0) | 2.42 |
| | Online | 5 (4.6) | 35 (32.1) | 57 (52.3) | 12 (11.0) | 2.70* |
| Combining individual podcasts to create Collaborative Podcast | F2F | 7 (14.0) | 27 (54.0) | 14 (28.0) | 2 (4.0) | 2.22 |
| | Online | 19 (17.4) | 43 (39.4) | 39 (35.8) | 8 (7.3) | 2.33 |
A summary of the percentage differences for the problem solving skills used for each specific task during EDT 3470 is displayed in Table 9, which shows how much higher one format's use of each skill was than the other's for each task, and which differences were significant. Face-to-face students relied on instructor assistance more than online students for nearly every task, whereas online students depended significantly more on peers, trial and error, and further reading for most tasks.
Table 9. Comparison of Problem Solving Skills for Specific Tasks

| Task | Instructor | Peer | Trial and Error | Read |
|---|---|---|---|---|
| Internal Links | F2F +28%* | | F2F +5% | Online +6% |
| Export Movie | F2F +26%* | Online +3%* | Online +16%* | Online +9%* |
| Files to Server | F2F +24%* | Online +8%* | Online +8%* | Online +7%* |
| From Server | F2F +23%* | Online +5%* | Online +11%* | Online +6%* |
| External Links | F2F +20%* | Online +2%* | Online +12%* | Online +8%* |
| Insert Images | F2F +19%* | F2F +1% | Online +12%* | Online +8%* |
| Hot Links | F2F +17%* | Online +5%* | Online +6%* | Online +7%* |
| Attachments | F2F +16%* | Online +10%* | Online +1%* | Online +6%* |
| Publisher | F2F +12% | Online +11% | F2F +2% | Online +2% |
| Setting up Blog | F2F +12%* | F2F +10%* | Online +13%* | Online +9%* |
| Managing Site | F2F +11%* | Online +9%* | Online +4%* | Online +8%* |
| PowerPoint | F2F +10%* | F2F +8%* | Online +11%* | Online +7%* |
| Images on Web | F2F +5% | Online +14% | F2F +14% | Online +4% |
| Group commun. | Online +5%* | F2F +30%* | Online +18%* | Online +7%* |
| Podcast | Online +3% | F2F +14% | Online +8% | Online +3% |

Note: each entry names the format with the higher percentage and shows how much higher it was than the other format; * = significant difference
Using multiple regression, our study revealed some small predictive relationships between technology competence, technology use confidence, problem solving skills, and course format. It also revealed how these relationships differed between the course formats.
We determined that the variables of waiting for instructor assistance, asking a peer for assistance, discovering an answer through trial and error, and discovering an answer through further reading account for 1.8% of the variance in self-reported competence for the online students and 0.3% for the face-to-face students. This supports a prediction that students feel more competent if they engage in some form of problem solving, and that students in an online course feel slightly more competent when they engage in problem solving than students in a face-to-face course.
Another multiple regression revealed that the same four variables account for 1.6% of the variance in confidence for the online students and 3.0% for the face-to-face students. This supports a prediction that students feel more confident if they engage in some form of problem solving, and that students in the face-to-face course feel more confident when they engage in problem solving than students in an online course.
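The variance-explained figures above come from multiple regression. As a minimal sketch of how such an R² is computed (using entirely hypothetical numbers, not the study's survey responses), ordinary least squares with four problem solving predictors looks like this:

```python
import numpy as np

# Hypothetical data: eight students' self-reported use of the four
# problem solving strategies (instructor, peer, trial and error,
# reading; 1-5 scale). Illustrative values only, not from the study.
X = np.array([
    [3, 4, 2, 1],
    [4, 3, 3, 2],
    [2, 5, 4, 3],
    [5, 2, 3, 1],
    [3, 3, 5, 4],
    [4, 4, 2, 2],
    [2, 3, 4, 5],
    [5, 5, 3, 3],
], dtype=float)
# Hypothetical self-reported competence scores for the same students.
y = np.array([3.2, 3.5, 4.1, 3.0, 4.4, 3.3, 4.2, 3.9])

# Add an intercept column and fit ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

# R^2 = 1 - SS_residual / SS_total: the proportion of variance in the
# outcome explained by the four predictors (the "1.8%" style figures
# in the text are this quantity, expressed as a percentage).
resid = y - X1 @ beta
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(round(r2, 3))
```

Fitting the same model separately on the online and face-to-face subsamples, as the authors describe, gives one R² per format for comparison.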
Because students need to be prepared for the 21st century workforce, technology has become an essential element of today's education. Educators face the challenge of not only integrating technology into their curriculum, but also teaching technology itself. One of the biggest obstacles for in-service teachers is a lack of technology competence and confidence, which leads to not using, integrating, or teaching technology. Higher education teacher preparation programs play an integral role in creating technology-competent and -confident educators. As pre-service teachers learn technologies, problems arise. Problem solving is an essential part of the learning process and a critical strategy for learning (Karabenick & Newman, 2006; Ryan & Pintrich, 1997). Knowledge of the relationships between course format, problem solving, technology competence, and technology use confidence is therefore helpful to educators.
In reference to technology confidence and competence, our study revealed that the vast majority of students in both the face-to-face and online sections felt, after completing a technology course as part of their teacher preparation program, that they could integrate technology into their teaching, teach such technology to others, and use technology in their future teaching. This finding aligns with several previous studies showing that students need intensive computer use and experience in order to gain computer competence (Cretchley, 2007; Jih, 2004; Karabenick & Newman, 2006; van Braak & Goeman, 2003). It also supports findings by Albion (2007) and Lambert, Gong, and Cuper (2008) that students who rated themselves as more knowledgeable in using technology were also more confident about using computers and believed them to be useful tools for their future classrooms.
Our results show that, overall, when students encountered a problem with technology, most face-to-face students waited for their instructor to assist them, while most online students used trial and error or further reading. An explanation may be that face-to-face students have immediate access to their instructors, while online students do not. This could also explain why more online students felt they could teach technology, while more face-to-face students felt they could integrate technology into their future classrooms. Because online students do their own problem solving, they may internalize the learning and thus feel confident enough to teach others. Face-to-face students, by contrast, learn by watching the instructor model how he or she has integrated technology into a lesson, which may give them more confidence to do the same.
Our results also revealed that the use of problem solving skills can predict whether a student will feel competent or confident to integrate technology into the curriculum or teach it. Moreover, course format (online or face-to-face) can predict whether more students will feel competent or confident after completing the course. These results confirm previous research showing that students who seek help feel more competent and confident (Ryan & Pintrich, 1997), and that problem solving is an investment that increases competence and confidence (McGee, 2005). However, previous research had not examined course format differences, so our findings of differences between formats are important.
Overall, our research helps demonstrate the value of the problem solving strategies that students in a technology methods course undertake, whether online or face-to-face. With technology a fundamental element of education, and only a little over half of American public school teachers feeling comfortable with technology (NCES, 2009), efforts must continue to help educators use technology with competence and confidence.
Sharon L. Peterson is an Assistant Professor in Educational Leadership, Research, and Technology at Western Michigan University. E-mail: sharon.peterson@wmich.edu
Louann Bierlein Palmer is a Professor in Educational Leadership, Research, and Technology at Western Michigan University. E-mail: l.bierleinpalmer@wmich.edu