Research Hub

Distance learning: How effective is it, according to a meta-analysis of research?

Written By: Gemma Goldenberg
5 min read
Original research by:

Allen M, Mabry E, Mattrey M, et al. (2004) Evaluating the effectiveness of distance learning: A comparison using meta‐analysis. Journal of Communication 54(3): 402–420.



This Research Digest summarises a meta-analysis of over 500 research papers on distance learning. A meta-analysis is a way of combining data from several studies on the same subject. Merging the findings across multiple studies allows researchers to identify common effects.

This meta-analysis looks at whether student performance differs depending on whether students were taught in a traditional classroom or via distance learning.

Distance learning is defined by the authors as any course ‘in which the expectation is that the student and instructor will not be physically copresent in the same location’ (Allen et al., 2004, p. 403).


What is the research underpinning the study?

Existing research on the efficacy of distance learning has been mixed. Although some studies have shown distance learning to be effective (Hiltz, 1986; Hackman and Walker, 1990), some research suggests that teachers perceive distance instruction negatively due to concerns over diminished contact with students and a loss of control over the learning environment (Mottet, 2000).

Student satisfaction was shown to be 22 per cent lower in distance education courses compared to traditional co-present courses (Allen et al., 2002).

Distance learning can be conducted through several different communication channels, including radio, televised lectures, email and online courses. Channels that allow for more interactivity – those that enable the learner to react and respond in a two-way flow of messages – are thought to be more connective than, for example, televised lectures. The level of connectivity and interactivity in each communication method is thought to be a key factor influencing learning outcomes. Some researchers have raised concerns that certain types of communication will advantage some learners over others (Gunawardena and Zittle, 1997).


How did they conduct the research?

Both manual and electronic searches were used to locate over 500 research papers on the topic of ‘distance learning’ and ‘distance education’. Studies were excluded if they used computer-assisted instruction, whereby technology was used in addition to face-to-face teaching.

To be included in the meta-analysis, studies had to meet three criteria. They had to:

  1. involve a comparison between a distance learning course and a traditional format course
  2. include at least one assessment of student performance, related to mastery of a content or skill taught in the course
  3. report enough statistical information to allow an effect size to be calculated.
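Criterion 3 requires that a study report enough statistics for an effect size to be computed. The digest does not say which effect-size metric the authors used; as an illustration only, one common choice is the standardised mean difference (Cohen's d), which can be calculated from each group's mean, standard deviation and sample size. All numbers below are invented for the example:

```python
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Standardised mean difference between two groups,
    divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical study: distance-learning group vs face-to-face group
d = cohens_d(mean_a=74.0, mean_b=71.0, sd_a=10.0, sd_b=10.0, n_a=40, n_b=40)
print(round(d, 2))  # → 0.3, a small positive effect favouring distance learning
```

This is one standard way to put studies that report results on different scales onto a common metric so they can be combined.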

For the studies that met these criteria, the researchers identified variables that could affect whether distance learning or face-to-face teaching was more successful. These variables included:

  • Whether the learning was synchronous (involving two-way audio/visual links and a live instructor with whom the learner can communicate) or asynchronous (the student cannot communicate directly with the instructor in ‘real time’)
  • The channel of delivery – whether the instruction used video, audio or written text, or a combination.
  • Course content – courses were classified as natural sciences and mathematics, military training, foreign language instruction, social sciences, education or combined content across areas.

Statistical analyses were then conducted to calculate the average effect of the method of instruction (distance or face-to-face) while also weighting the results of each study to allow for the sample size (studies with greater numbers of participants are more heavily weighted).
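The weighting step described above can be sketched as a sample-size-weighted average of per-study effect sizes, so that a study with 200 participants counts more than one with 40. This is a minimal illustration, not the authors' actual procedure, and the effect sizes and sample sizes are invented:

```python
def weighted_mean_effect(effects, sample_sizes):
    """Average effect size across studies, weighting each study
    by its number of participants (larger studies count more)."""
    total_n = sum(sample_sizes)
    return sum(d * n for d, n in zip(effects, sample_sizes)) / total_n

# Three hypothetical studies: per-study effect sizes and participant counts
effects = [0.30, -0.10, 0.05]
ns = [40, 200, 100]
print(round(weighted_mean_effect(effects, ns), 3))  # → -0.009
```

Note how the large negative study pulls the weighted average below zero even though two of the three studies found a positive effect; this is exactly why meta-analyses weight by sample size rather than taking a simple average.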


What were the key findings?

  • Overall, courses using distance education technology were found to show a small improvement in performance compared to traditional face-to-face teaching.
  • Whether courses were synchronous or asynchronous did not appear to influence the performance scores of students.
  • Video, as a channel of delivery, demonstrated a slightly higher level of performance when compared to face-to-face teaching.
  • Researchers were not able to conclude from the data in this meta-analysis that the channel used for instruction (video/audio/written) had a significant effect on learning performance.
  • Course content did appear to affect how well students performed using distance learning. Foreign language courses showed the greatest advantage for distance learning compared to face-to-face teaching. This is thought to be because of the opportunities distance learning offered for regular interaction with a native speaker of the language.
  • Overall, the results demonstrated little difference in student performance between face-to-face and distance learning.


Were there any limitations to the study?

  • Only two studies looked at distance learning courses that used audio as the channel of delivery. Due to this small sample size, it was not possible to make reliable conclusions on the effect of using audio channels for instruction.
  • The meta-analysis looked only at student performance and how it was affected by distance learning. There are other factors that may be important to consider, including completion rate of courses, cost, and student satisfaction. Furthermore, student performance was assessed only by grades and test scores. For some types of learning, tests may not have been the most appropriate method of assessment.
  • Distance learning courses were not compared against each other in terms of their structure, the quality of technology used and how much feedback instructors gave throughout the course. Each of these variables could have affected results.
  • Individual differences between students were not considered. For example, the age of the student may affect whether distance or face-to-face learning is more effective for them. Factors such as student motivation and proficiency using the technology required for distance learning were also not examined.


Impact on practice

What questions does the research raise for teachers? 

  • What types of learning, and which subject areas, might be best suited to distance learning?
  • What implications does this have for homework, revision and/or distance teaching during periods of time when schools are closed?
  • Which factors would I want to consider when comparing the effectiveness of distance learning with face-to-face teaching? For example, exam scores, student motivation and satisfaction, or mastery of content?
  • Does the finding that synchronicity does not affect student performance ring true with my own experiences? What are the implications of this? Does it make distance learning easier, cheaper or quicker to organise if a ‘live’ interaction in real time is not required?


What are the limitations of this study for teachers’ practice? 

As this is a single meta-analysis and the participants were almost exclusively college undergraduates, caution should be taken when generalising its results. While the study might raise ideas and questions worth exploring, evidence from a broad range of studies – ideally based on the age phase you teach – would be beneficial for evidence-informed changes to practice.


Want to know more?

Allen M, Bourhis J, Mabry E et al. (2002) Comparing student satisfaction of distance education to traditional classrooms in higher education: A meta-analysis. American Journal of Distance Education 16: 83–97.

Allen M, Mabry E, Mattrey M et al. (2004) Evaluating the effectiveness of distance learning: A comparison using meta‐analysis. Journal of Communication 54(3): 402–420.

Gunawardena CN and Zittle FJ (1997) Social presence as a predictor of satisfaction within a computer‐mediated conferencing environment. American Journal of Distance Education 11(3): 8–26.

Hackman MZ and Walker KB (1990) Instructional communication in the televised classroom: The effects of system design and teacher immediacy on student learning and satisfaction. Communication Education 39(3):196–206.

Hiltz SR (1986) The “virtual classroom”: Using computer-mediated communication for university teaching. Journal of Communication 36(2): 95–104.

Mottet TP (2000) Interactive television instructors’ perceptions of students’ nonverbal responsiveness and their influence on distance teaching. Communication Education 49(2): 146–164.
