ON THE STUDENTS’ QOS-AWARENESS OF VIDEO LECTURES

Video lectures bring flexibility and enable distance learning in education. With the help of videos, educational organizations are able to serve a wider and more heterogeneous group of students. As streaming videos have become an essential part of teaching in higher education, quality of service (QoS) issues should be taken more into consideration. However, to our knowledge, systematic quality monitoring of video lecture delivery is rarely used in a way that would also provide the student with quality information. This paper describes our ongoing research on developing a streaming video quality evaluation tool integrated into the learning environment our master's students use. The tool serves both the education provider and the students.


INTRODUCTION
As the share of video streaming in total Internet traffic constantly rises, it is natural that the role and importance of streaming video lectures in education also grows. O'Callaghan et al. [1] conclude in their literature review on lecture videos that the positives of lecture recordings outweigh the possible negatives, and they recommend the continued use of lecture videos in higher education.
A streaming video lecture producer can affect factors like the conditions of the recording set or the encoding parameters to make videos that meet specified requirements. But when the video is delivered to a student, either live or via on-demand streaming, the Internet transmission conditions or even the student's client machine may cause deterioration in quality. Monitoring quality statistics from streaming servers may be sufficient for getting an overall idea of the quality level the students receive and for gathering information about the Internet connections they use. To get more precise information about the experienced quality, the metrics can be collected near the student, for example with an instrumented media player. Since students have control over some of the factors causing the interference they observe in the video, information about the video quality should be offered to them, too. In addition, students' awareness of adaptive streaming is rising, and they are increasingly eager to know whether they are receiving the best quality possible. Thus, by providing video quality information also to students, our objective is to improve our QoS evaluation practices for lecture video delivery. This paper presents our method for developing an instrument that makes students more aware of the quality of the videos they are receiving and also provides statistics for the lecture video producer. Our approach was to obtain QoS parameters near the students and, based on those values, give feedback about their connection and video playback. For that purpose, we integrated a video quality meter into the learning environment our master's students use. The tool collects quality parameter values from the Flash media player. The quality indicator rules were created based on measurements conducted in a test environment. The application shows the student real-time information about the streaming quality in a simple graphical form. The tool also serves the education provider by producing reports about playback information of the video lectures the students are streaming. A pilot version of the tool is being tested with a small group of students.
The next subsection deals with related work concerning media player monitoring and data collection. Section 2 introduces our way of producing lecture videos and highlights QoS issues related to video lecture transmission. In Section 3, the functioning and the architecture of the Video Quality Meter are presented, and some aspects of developing the meter are considered. Section 4 concludes the paper.

Related Work
A media player's behaviour gives a lot of information about the user's viewing experience. Combining parameters from the player and the network is a natural way to monitor the QoS of video streaming. This strategy has been adopted in various studies. Dalal et al. [2] built an instrumented media player to assess user-perceived quality with objective metrics collected by the player. The player was developed for RTP streaming over UDP, which was topical at the time. In [3], Dalal et al. extended their research to include also RTP over TCP. The ultimate goal of their work was to spot degraded quality of experience (QoE) in real time and, before the end user notices any change in video quality, affect the conditions giving rise to the degradation. In [4], the study was moved to the Flash environment, in which real-time video QoE was assessed for RTMP streams. The quality assessment system originally developed for RTSP/UDP streaming was modified to make it capable of inferring the QoE of RTMP video streams. Their preliminary study indicated that bitrate, in combination with either frame rate or bandwidth, serves as an accurate indicator of QoE for RTMP videos.
Nowadays, HTTP-based streaming technologies have become popular again, especially due to chunk-based delivery, which enables switching between streams. By switching between stream chunks based on bandwidth and the client's CPU capacity, the client can get the best possible quality offered in the current conditions. This is called adaptive streaming. Mok et al. [5] studied the connection between application QoS, network QoS, and QoE with the HTTP streaming protocol. The application performance metrics were collected with a customized Flash video player called FlashTrack. For HTTP streaming, they chose buffering as the best metric for quality evaluation.
YouTube is the most popular multimedia sharing platform. Staehle et al. [6] developed a client-side tool called YoMo. The tool collects information about the YouTube flow and estimates the amount of playtime buffered by the YouTube player. The aim is to predict possible stalling of the video and give valuable information to the ISP.
The purpose of these streaming quality monitoring methods is to provide information to streaming providers and, in the long run, to protocol and player developers. All this is done as invisibly as possible for the end user. However, users are also interested in the streaming quality they get. The popularity of network speed tests indicates that users want to know whether they are getting what they are paying for. For example, one of the most popular test sites, the Ookla Speedtest service, carries out over 50 million tests a month [7]. At the same time, while comparing the features of mobile devices and the use of applications on them, users have become more aware of the different technologies supported by their devices. For these reasons, we wanted to give QoS information to our students as well. Some big streaming providers also offer streaming quality information to their customers. In Netflix, the user can monitor statistics such as bitrate, frame rate, and buffering speed of the video currently playing. YouTube, in turn, notifies the viewer when the quality degrades.

VIDEO LECTURES IN HIGHER EDUCATION
Video lectures are an essential part of teaching in many organizations. They enable distance learning, flipped classrooms, and MOOC courses, and bring flexibility to adult education. In our master's studies in mathematical information technology, targeted at adult and working students, streaming video lectures have been a prerequisite for the whole program. All teaching is still arranged face-to-face to give students the freedom to choose traditional participation instead of videos. Although we have noticed a massive drop in lecture attendance over the years, some students prefer face-to-face teaching over streaming videos. On average, our student attends about one-fourth of the lectures in a course [8]. Similarly, some students tend to favor real-time streaming videos if their schedules allow. On average, a student participates in about one-sixth of the lectures during a course by watching real-time videos [8]. The most popular participation mode in our education program is, however, on-demand videos. Our average student participates in over half of the lectures via on-demand videos, and this share does not even include the use of the videos for revision by students who have attended the lecture or watched it as real-time video [8].

Producing Streaming Video Lectures
When starting to produce lecture videos, the education organizer has to make many decisions affecting the video quality. Video encoding parameters should be selected based on knowledge of the students' terminal devices, their Internet connection speeds, and the content of the videos. Hansch et al. [9] categorized video production styles in education into 18 different content formats. Of those, our education uses Classroom Lecture combined with Presentation Slides with Voice-Over, Picture-in-Picture, Udacity-Style Tablet Capture (in which the lecturer's writing hand is captured), Screencast, and Live Video. Alongside the lecture videos, there are also elements enabling interactivity, including live chat and video conferencing tools. High-motion video content requires higher bitrates to achieve the same quality as more static content. All of our video content can be watched in good quality at relatively small bitrates (under 1 Mbps). Most motion takes place in scenes with a writing hand. This content format is mainly used in our mathematics courses, which form almost half of the study program.
The most popular compression standard for streaming video at the moment is H.264/MPEG-4 AVC, and that is the format used in our lecture videos as well. It requires a smaller delivery data rate while providing the same quality as its predecessors (MPEG-2 Part 2 and MPEG-4 Part 2). The HEVC/H.265 standard will give quality similar to that produced by H.264 with an approximately 50% decrease in bitrate [10]. However, the HEVC standard will not be commonplace until the major companies in the field announce HEVC playback support in their players, browsers, or mobile and desktop operating systems. At the moment, H.264 is compatible with virtually every device consumers use today.
Our lecture videos are encoded to the H.264 format with Adobe's free Flash Media Live Encoder. From the live encoder, the H.264 stream is transmitted with the RTMP protocol to the media server. In our production, the Wowza Streaming Engine is used. It can convert the input stream to a number of formats and is compatible with all standard streaming protocols. Thus, it is possible to stream the resulting video to multiple types of playback clients and devices. At the media server, the incoming H.264 stream is transrated into different bitrates, i.e. quality levels, to enable adaptive streaming for our students.
Nearly all of our students use HTTP adaptive streaming when watching lecture videos. Windows and Linux users use Adobe's HTTP Dynamic Streaming (HDS) protocol, and Mac OS and iOS users stream with Apple's HTTP Live Streaming (HLS) protocol. For Android users, the lecture videos are currently delivered with RTSP streaming, but soon we will make HDS streaming available for them, too. To transmit a video stream over HTTP, the live stream or multimedia file has to be divided into small files (chunks). In our case, the input data is either a raw H.264 stream (live streaming) or an MP4 file (video-on-demand streaming). For HDS streaming, the Wowza Streaming Engine segments the input data into Fragmented MP4 (fMP4) files, and for HLS into MPEG2-TS (M2TS) files. In addition, the server generates manifest files that the player uses to request fragments at the required bitrate.
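As an illustration of this manifest-driven mechanism, a minimal HLS master playlist for a lecture stream could look like the following (the variant URIs and resolutions are hypothetical examples, not values from our production setup):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=340000,RESOLUTION=640x360
lecture_340k/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=690000,RESOLUTION=854x480
lecture_690k/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=1280x720
lecture_1000k/playlist.m3u8
```

The player first fetches the master playlist, then requests the media playlist (and its chunks) of the variant whose declared bandwidth best matches the measured conditions.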

QoS of Video Lecture Transmission
QoS matters cannot be ignored in an educational context, either. For a distance learner, lecture videos may be the only way to participate in teaching. In an education program based on video lectures, the student may watch the videos on a regular basis for several years. QoS degradations that cause a response in the media player reduce the quality of the student's viewing experience. That creates frustration and anxiety, which interfere with learning. Typical QoS parameters in video streaming include bandwidth and the ratio of lost data packets. With the use of mobile terminals, the bandwidth is likely to vary. Lost data packets are usually a result of congestion in the transmission channel. Insufficient bandwidth may cause the user's buffer to fail to maintain enough data, resulting in rebuffering; the student then sees interruptions and unsmooth playback of the lecture video. Packet loss causes defective reconstruction of B- and P-frames, which the student sees as distorted video images. In TCP-based streaming, lost or corrupted packets are retransmitted. Many retransmissions, and especially retransmission timeouts, can cause delay and lead to stalling. However, once data delivery is ensured, video image artifacts such as blockiness or blurring are not typical in TCP streaming.
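The link between bandwidth, video bitrate, and rebuffering can be illustrated with a toy buffer model (a minimal sketch; the one-second granularity and all parameter values are illustrative assumptions, not measurements from our system):

```python
def simulate_buffer(download_kbps, video_kbps, start_buffer_s, duration_s):
    """Count stall seconds for a simple playout buffer model.

    Each second, the buffer gains download_kbps / video_kbps seconds of
    playable video and, if at least one second is available, plays one
    second out; otherwise playback stalls (rebuffering).
    """
    buffer_s = float(start_buffer_s)
    stall_seconds = 0
    for _ in range(duration_s):
        buffer_s += download_kbps / video_kbps
        if buffer_s >= 1.0:
            buffer_s -= 1.0      # one second of video played out
        else:
            stall_seconds += 1   # buffer underrun: the student sees a pause
    return stall_seconds
```

With a 1 Mbps video and only 500 kbps of download bandwidth, a five-second buffer drains in ten seconds, after which playback stalls roughly every other second; with bandwidth at or above the video bitrate, no stalls occur.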
Adaptive streaming was developed to improve users' QoE. A comprehensive overview of the factors affecting HTTP adaptive streaming quality is given in [11]. If the video bitrate is too high for the available bandwidth, the client requests the next segment at a lower bitrate during playback, thus preventing delays and stalling. On the other hand, switching to a higher-bitrate segment in suitable conditions enables better bandwidth utilization and ensures the best possible quality for the user. HTTP streaming protocols are pull-based, meaning that the client is responsible for the adaptation algorithm according to which each segment is requested. The most common approach is to monitor the currently available bandwidth, but the adaptation algorithm may also consider other parameters, such as delay or dropped frames. The strategy the client uses for requesting segments from the server can have a considerable impact on the achieved video quality [12].
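The bandwidth-monitoring approach can be sketched as a simple rate-selection rule (a minimal sketch; the 20% safety margin is an illustrative assumption, and real players such as Flowplayer implement their own, more elaborate adaptation algorithms):

```python
# The five bitrate levels used in our adaptive streaming setup.
BITRATES_KBPS = [340, 690, 1000, 2000, 4000]

def choose_bitrate(measured_bandwidth_kbps, margin=0.8):
    """Pick the highest bitrate that fits within a safety margin of the
    currently measured bandwidth; fall back to the lowest level otherwise."""
    usable = measured_bandwidth_kbps * margin
    candidates = [b for b in BITRATES_KBPS if b <= usable]
    return candidates[-1] if candidates else BITRATES_KBPS[0]
```

A client applying this rule before each segment request would, for example, select the 2 Mbps variant on a 3 Mbps connection and drop to 340 kbps when the measured bandwidth falls below about 425 kbps.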
The server side is responsible for producing properly encoded and segmented HTTP chunks. For example, the decision about segment length is an optimization problem [11]: long segments allow high coding efficiency and keep the amount of overhead low, while short segments enable the client to react quickly to changing network conditions.

VIDEO QUALITY METER
To offer students real-time information about streaming video quality, we built a tool called the Video Quality Meter, which estimates the current quality and gives the student feedback through a simple graphic element beside the media player. The Video Quality Meter was integrated into the virtual campus environment known as CiNetCampus Studies.

CiNetCampus Studies Environment
The lecture videos, as well as other course material, are distributed through a virtual campus environment called CiNetCampus Studies. The core of the environment consists of a commercial learning management system (LMS) and a custom-built web-based video content management system (Video CMS). The LMS provides easy access to written study materials, tools for targeted asynchronous communication and material dissemination, student management, and other services typical of LMSs. Originally, the Video CMS was intended for improving lecture video distribution. To increase interactivity among students and lecturers, we added elements such as chat and video conferencing tools to the system. As we are constantly developing the virtual campus by adding new elements for the students' benefit, the Video CMS nowadays functions as a platform for all the applications and tools we want to integrate. Fig. 1 presents the current modules of CiNetCampus Studies, including the new Video Quality Meter.

The Functioning of the Video Quality Meter and the Test Setup
For Flash media playback, we currently use Flowplayer. Flowplayer is an open-source Flash media player for the web, so we could modify it to collect the information the quality meter uses. At this first stage, we ended up collecting four metrics for streaming video quality assessment: frame rate, buffer fullness, dropped image frames, and the client's download bandwidth. We also monitor the initial delay, but the meter does not yet analyse those values, and information about them is not shown to the students. In the future, the initial delay information can be used, for example, in assessing the starting value for the meter.
Buffer fullness and downlink bandwidth are predictive metrics, meaning their values may change before the user actually sees any degradation in quality. Variations in frame rate and dropped frames affect the video playback right away, which may be visible to the user as degraded QoE. Roughly speaking, buffer fullness and downlink bandwidth are related to network QoS, while frame rate and dropped frames depend more on the CPU performance of the client machine, especially in HTTP streaming. Insufficient CPU performance may cause a frame to miss its decoding deadline; that causes frames to drop, which also affects the frame rate. However, separating interference caused by insufficient bandwidth from interference caused by insufficient CPU power is not that straightforward. For example, too low a bandwidth between the client and the server drains the buffer and causes stalling, but it can also produce frame drops.
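Although the meter does not yet separate the sources of interference, a first-cut heuristic along the lines described above could look like this (a sketch only; the thresholds are illustrative assumptions):

```python
def likely_cause(dropped_frame_ratio, buffer_fullness, bandwidth_kbps, bitrate_kbps):
    """Guess whether observed degradation stems from the network or the
    client CPU, based on the kinds of metrics the meter collects."""
    network_ok = bandwidth_kbps >= bitrate_kbps and buffer_fullness >= 0.5
    if dropped_frame_ratio > 0.05 and network_ok:
        return "cpu"      # frames dropped although bandwidth and buffer look fine
    if not network_ok:
        return "network"  # buffer draining or bandwidth below the video bitrate
    return "none"
```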
During streaming, the quality meter rates the quality on a scale of one to four, 1 being "bad" and 4 being "excellent". The rules were set based on preliminary tests, in which the bandwidth was restricted with a WAN emulator to find limiting values for each metric. Fig. 2 shows how the student sees the quality meter as a small graphical element beside the lecture video. The rated value is shown with an arch on the left side of the meter.
In addition to the four metrics, the quality meter also monitors the video bitrate. Flowplayer provides a bandwidth detection plugin that enables dynamic bitrate switching. Earlier, we used only two bitrates in our adaptive streaming videos. When testing the meter in two study courses, the bitrate options were increased to five: 340 kbps, 690 kbps, 1 Mbps, 2 Mbps, and 4 Mbps. The current bitrate the student is receiving is shown on the right side of the meter icon. The bitrate does not affect the quality rate shown with the arch: every bitrate level can be rated from 1 to 4 according to the incidence of interference, although the video encoded at 4 Mbps is, naturally, sharper than the 340 kbps video.
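The threshold-based rating can be sketched as follows (the concrete limits here are illustrative assumptions; the actual rules were calibrated with WAN-emulator measurements, which this sketch does not reproduce):

```python
def rate_quality(fps, nominal_fps, buffer_fullness, dropped_ratio):
    """Map collected metrics to the meter's 1-4 scale (1 = bad, 4 = excellent)."""
    score = 4
    if fps < 0.9 * nominal_fps or dropped_ratio > 0.02:
        score -= 1  # playback already slightly degraded
    if buffer_fullness < 0.5:
        score -= 1  # predictive: stalling becomes likely
    if fps < 0.5 * nominal_fps or buffer_fullness < 0.2:
        score -= 1  # degradation clearly visible or imminent
    return max(score, 1)
```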
Figure 2. The Video Quality Meter in use on a video view page. In this example, the student is receiving 2 Mbps video with good QoS (the value of the meter is 3/4). The video may be losing some frames or the buffer may not be full all the time, but the user probably will not see any degradation in quality yet.
The architecture of the Video Quality Meter is presented in Fig. 3.

The Video Quality Meter Assessment and Future Work
The Video Quality Meter is currently being assessed with the System Usability Scale (SUS) complemented with a couple of questions of our own. As this paper is being written, too few students have responded to the survey to allow any conclusions about the usability of the meter or the students' satisfaction with it. However, we have already received some good proposals from the students for developing the meter: for example, the possibility for the user to choose the bitrate level manually came up. That would be a good addition to the meter, and it would also be easy to implement, since Flowplayer has a complete bitrate selection plugin for that purpose.
All the respondents considered the information given by the meter to correspond to their own experience of the quality. In addition, the students who stated that they own a fast Internet connection (>10 Mbps) had noticed an improvement in quality due to the higher-bitrate videos added for the testing period. This is a clear indicator for us to start producing video lectures in higher quality on a regular basis. On the other hand, we noticed that in some conditions Flowplayer fails to select the most appropriate bitrate in our environment. To correct this, we have to check our settings carefully and perhaps dig deeper into Flowplayer's adaptation algorithm. In the next development phase of the meter, our goal is also to determine the source of the interference, so that the student would know whether possible problems result from an insufficient Internet connection or an overloaded CPU.

CONCLUSION
As students' knowledge of and experience with streaming media increase, their requirements regarding quality also increase. We wanted to offer students information about the quality they receive when watching our lecture videos. For that purpose, we developed a Video Quality Meter and integrated it into our virtual campus environment. The meter monitors QoS-level metrics and rates the streaming video quality on a scale of one to four. In addition to the score value, the students are shown the current bitrate level out of five options. This tells them whether their current connection corresponds to the speed they are paying for and whether they are getting the best quality among our options. As the meter reacts only to conditions during playout, the students know that any problems the meter indicates are not related to the source video but to the conditions in the communication channel or the client machine, i.e. to factors the students themselves have some control over.

Figure 1. The current modules of the CiNetCampus Studies environment.

The meter is part of the Video sharing module. Flowplayer is embedded in the video view HTML page and streams video from the Wowza Streaming Engine. During playback, Flowplayer collects and sends data to a PHP script hosted on the same Apache HTTP server as the video view. The PHP script handles the analysis of the data and returns the rated value to the meter. If the value is less than four, the script saves the value to the database; thus, only values that indicate possible degradation in quality are stored for later use. Information about bitrate switches is also saved. With the help of the saved information, we can monitor the quality of the video our students are receiving.
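This server-side data flow can be sketched as follows (a minimal sketch in Python with SQLite for illustration only; the real system uses a PHP script with its own database schema, and the placeholder rating rule here does not reproduce the actual rules):

```python
import sqlite3

def rate(metrics):
    """Placeholder rating rule: 4 unless the buffer is low or frames drop."""
    if metrics["buffer_fullness"] < 0.5 or metrics["dropped_ratio"] > 0.02:
        return 2
    return 4

def handle_report(db, video_id, metrics):
    """Rate the metrics reported by the player and store only scores below
    four, so that only possible quality degradations are kept for later use."""
    score = rate(metrics)
    if score < 4:
        db.execute("INSERT INTO quality_log (video_id, score) VALUES (?, ?)",
                   (video_id, score))
        db.commit()
    return score

# In-memory database standing in for the production database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE quality_log (video_id TEXT, score INTEGER)")
```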

Figure 3. Architecture of the Video Quality Meter.