the Technology Interface/Winter 97 - Invited Paper
Instructional Software Development Center
University of Missouri-Rolla
In early October 1996, the first "virtual" Mechanics of Materials class at the University of Missouri-Rolla had its audio and class notes broadcast over the Internet from the Basic Engineering Lecture Hall. A research project, made possible through funding from the Instructional Software Development Center (ISDC) at UMR, culminated in a fully interactive, multi-media class being made available to the students as an alternative to "physically" attending class. Since then, the technology that allowed that first class's audio and class notes to be broadcast has expanded to allow audio, video and class notes to be transmitted to the students in real time. And, whereas the original system required the students to call a dedicated phone number to ask the lecturer questions, the enhanced broadcasting system allows the "virtual" students to ask the lecturer questions over the Internet, in a truly interactive mode. The lecturer's "live" audio, video and class notes are transmitted to the students in a real-time, "live" format, but are also simultaneously recorded digitally for later retrieval by either the students or the lecturer. The broadcasting system's bandwidth requirements are flexible enough to work with either 28.8 or 33.6 kbits per second modem lines, but the system can also transmit over ISDN lines (128 kbits per second), T1 lines (1.544 Mbits per second, divided into 24 channels) or T3 lines (44.736 Mbits per second, equivalent to 28 T1 lines). The streaming media technology that broadcasts the audio, video and class notes over the Internet is prioritized so that the audio receives a dedicated segment of the available communication bandwidth, reflecting its primary importance; the class notes receive the second highest priority; and the video transmission receives the lowest priority within the broadcasting system.
Since the audio has a slightly fluctuating bandwidth requirement of approximately 8 kbits per second, and the interactive whiteboard has a widely varying requirement of approximately 1-5 kbits per second, the bandwidth remaining for the video transmission fluctuates between approximately 15 and 20 kbits per second. The video broadcasting system, though, has the flexibility to adapt to this changing bandwidth availability, responding by decreasing the refresh rate of the video display.
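The bandwidth budget described above reduces to simple subtraction. The following sketch uses the approximate figures quoted in the text (the variable names and the script itself are ours, added only for illustration):

```python
# Approximate bandwidth budget for a 28.8 kbits per second modem line,
# using the figures quoted in the text (all values in kbits per second).
MODEM_CAPACITY = 28.8

AUDIO = 8.0           # RealAudio voice stream, roughly constant
WHITEBOARD_MIN = 1.0  # interactive whiteboard, idle
WHITEBOARD_MAX = 5.0  # interactive whiteboard, lecturer actively writing

# Video is the lowest priority and receives whatever bandwidth remains.
video_max = MODEM_CAPACITY - AUDIO - WHITEBOARD_MIN
video_min = MODEM_CAPACITY - AUDIO - WHITEBOARD_MAX

print(f"Video bandwidth ranges from {video_min:.1f} to {video_max:.1f} kbits/s")
```

Running this recovers the 15-20 kbits per second range quoted for the video transmission.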
The potential uses of this broadcasting system for educational purposes are enormous. Truly interactive "virtual" courses could now be offered to students over modem communication lines that would allow the "virtual" lecturer to come directly into students' homes or offices. Class size should no longer be a concern when offering a course, due to the much greater market that would be available to the educational provider state-wide, nationwide, or even worldwide. Educational institutions that were once restricted to offering courses only to residents of their immediate region could now offer those same courses to any individual that had access to a phone line, a modem and a standard computer.
The encoding and broadcasting of the lecture was developed with three primary design considerations in mind. First, the broadcast and encoding of the lecture had to be as transparent as possible to the lecturer. In order to satisfy this consideration, it was our intention to design the broadcasting system around the traditional lecture format. In this way, the lecturers would not have to change their pedagogical habits at all. Instead, we would change the broadcasting system to accommodate the lecturer. We designed the system so that whatever the lecturer wrote on the board for the "physical" class sitting in front of them would also be what the "virtual" students sitting in front of their computer monitors see. Whatever the lecturer might say in class would be exactly what the "virtual" students attending via the Internet hear. Any handouts that might be given out in class would also be immediately accessible and downloadable by the "virtual" students.
In order to accomplish this, either a digitizing tablet or a large digitizing whiteboard device must be used to record what the lecturer writes down in front of their "physical" class, and then transmit it in real time to the students over the Internet. In combination with the digitizing device, interactive whiteboard computer software must also be used to broadcast the lecturer's class notes to the "virtual" students attending over the Internet. For this purpose, we used WhitePineboard software from WhitePine Software, although there are several other interactive whiteboard broadcasting systems that could also have been used.
In the same way, the audio and video transmission was also designed to be almost transparent to the lecturer. In fact, the only additional requirement placed on the lecturer is the need to wear a tie-clip FM wireless microphone while presenting the lecture. Indeed, for many lecturers that teach large classes, this is already common practice, and is not unfamiliar at all. And, since the lecturer is constrained to stand directly in front of the digitizing device to write on it, the recording and transmission of the video is perhaps the most transparent of all the requirements placed on them. The benefit that they receive from these few requirements is that at the end of their 50-minute lecture, they have recorded the entire presentation as a multi-media event, wherein the audio, video and class notes are saved in temporal sequence, entirely recorded for posterity to replay. And yet, to the lecturer, they have only done what they have always done--present a traditional lecture, with a tie-clip microphone attached to their lapel.
The second constraint that we placed on this development was that the cost to both the university and the students had to be very reasonable and economical. There are many distance-learning programs today that are based on extensive use of video and audio recording and contemporary transmission technology. These systems use a variety of transmission techniques that place large requirements on bandwidth to transmit the voluminous data necessary to send and receive true video signals. The two most popular methods of transmitting educational courses are microwave-downlink systems and satellite-based transmission systems. Both of these systems are incredibly expensive to operate, require expensive video equipment to record the production, and save the information in analog mode, rather than digital mode. The annual cost of these production and broadcasting operations, for the universities affluent enough to have them, runs into the hundreds of thousands of dollars, if not millions.
Also, the cost of the media to record the lecture is quite expensive. If one video tape were required for each lecture, at a cost of $3 per video tape, then 500 lectures would cost approximately $1500 to store, and would take up the entire space of a large closet. If these tapes were to be accessible to students, then there would also be labor costs involved to retrieve the video tapes from the storage closet and deliver them to the students. These costs would include not only labor costs for the library, but also costs associated with delivering the video tapes to the students. Additionally, if the students were not resident on the campus, there would also be a delay associated with mailing the video tape to the students, not to mention the preparation costs. This delay must also be considered an indirect cost to the student, and to the University. However, with digital storage of the information, 500 lectures could be recorded on one 2.5 Gigabyte hard drive, which costs approximately $250. That would make the cost of storing an individual lecture only 50 cents, and the storage space required to house those 500 lectures no bigger than that of a small book. Not only that, since the audio, video and class notes would be stored digitally, any of the 500 lectures could be immediately accessible by students, and not only students resident at the university, but also "virtual" students that might only be able to access the lectures over the Internet. Since the equipment to broadcast and record the lectures in digital mode would cost roughly one-hundredth of what it currently costs to broadcast over satellite-based and microwave-based broadcasting systems, the overall cost reduction to the university would be on the order of two orders of magnitude.
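The storage-cost comparison above is plain arithmetic; the following sketch simply restates the figures quoted in the text (the script and its variable names are ours):

```python
# Storage cost comparison, using the 1996 figures quoted in the text.
LECTURES = 500

# Analog storage: one videotape per lecture.
TAPE_COST = 3.00                    # dollars per videotape
tape_total = LECTURES * TAPE_COST   # total cost for 500 lectures

# Digital storage: one 2.5 GB hard drive holds all 500 lectures.
DRIVE_COST = 250.00                 # dollars for the whole drive
per_lecture = DRIVE_COST / LECTURES # cost per stored lecture

print(f"Tapes: ${tape_total:.0f} total; disk: ${per_lecture:.2f} per lecture")
```

This reproduces the $1500 total for tapes and the 50-cent-per-lecture figure for digital storage.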
A broadcasting system that currently may cost $500,000 annually in broadcasting and labor-related costs could be replaced by a digital broadcasting and recording system that would cost less than $5,000. And, this digital broadcasting system would also have the added advantage of being able to supply instruction to the students interactively in real time, and then to automatically store that information for immediate retrieval from a "digital library" at any time in the future. Of course, in order to be fair and realistic, as this technology replaced old broadcasting methods, there would be additional clerical and administrative costs that can't be foreseen at this time.
The third constraint that was placed on the development of this broadcasting system was that it should be able to operate over all of the communication lines that are currently available. This means that it should not only be capable of operating over ISDN, T1 and T3 communication lines, but that it should also be capable of operating over standard modem communication lines. In many ways, this was the most difficult of all of the constraints placed on the development of this broadcasting system. But, it was also the most exciting, and the single design constraint that would make this type of broadcasting system revolutionary if (when) it was perfected. In order for us to transmit audio, video and class notes within standard modem bandwidth limitations, we were constrained to minimize the bandwidth requirements of each of the individual streaming technologies. In practical terms, this meant that we had to transmit the audio at or below 10 kbits per second, transmit the class notes at or below 5 kbits per second, and use the remaining bandwidth to transmit the video. Only in this way would we be able to successfully create a broadcasting system that would allow students at home or in their offices to participate in courses offered by the university. And, this we did.
We also felt that we should endeavor to minimize the cost to the student of buying software that would allow them to listen to, and see, the lecture that was being broadcast over the system. In this regard, we attempted to limit our choices of multi-media broadcasting technologies to those systems that were standards of the industry, and we also provided players that were either free, or relatively inexpensive, for the students to purchase. For the audio broadcasting platform, this meant that we constrained ourselves to consider only broadcasting systems that would work with standard sound cards and that also had audio player programs that were standard to the computer industry. We used these same criteria for our selection of the class notes distribution method. This meant that we would either have to use a World-Wide Web browser as our class notes distribution method or use one of the interactive whiteboard software programs that are currently being developed and sold commercially. For the video broadcasting system, we constrained ourselves to look at systems that would not require video frame capture boards to be installed in either the lecturer's or student's computer. The video capture industry has not achieved enough maturity and standardization for any one image capture board to be recognized as a standard. Also, the cost of image frame capture boards is exceptionally high, on the order of $1000, and would have placed an untenable financial requirement on the students. For that reason, we decided to constrain ourselves to video broadcasting systems that were capable of using the Connectix QuickCam as their video capturing device. The Connectix QuickCam currently sells for only $80 for the black-and-white version, and only $170 for the color version. Neither of these two QuickCams needs a separate image capture board in order to operate effectively.
Instead, these cameras plug directly into the parallel printer port on the back of any computer. Using these constraints as a basis, we explored several different broadcasting platforms, some of which were specifically directed toward audio broadcasting and others that were directed more toward video broadcasting.
When the ISDC first started investigating techniques for broadcasting over the Internet, we decided to start with technology that already existed at the university. One broadcasting technology that many of us were familiar with was the RealAudio audio broadcasting system that is used by many radio stations across the United States. The RealAudio application software is manufactured by Progressive Networks and is used by the largest audio broadcaster on the World-Wide Web, AudioNet. Amazingly, when we first contacted Progressive Networks to discuss the prices of their RealAudio Encoder and Server, we were told that the University of Missouri-Rolla already had a RealAudio Server on campus. It turned out to be a small 10-stream RealAudio Server that had been purchased originally to broadcast archived lectures on science for the local public radio station, KUMR. When KUMR was contacted, it turned out that the RealAudio Server was an archaic 16-bit version 1.0 that ran under Windows 3.1 and had been purchased for development purposes. The first step in developing the broadcasting system was to upgrade the RealAudio Server to version 2.0, and to buy the RealAudio Encoder that would allow us to encode the audio produced during a lecture. Once this was done, we concentrated on limiting the audio bandwidth requirements to as small a value as possible. There were two modes in which the audio could be encoded: the 14.4 kbits per second mode and the 28.8 kbits per second mode. The 14.4 mode was established by Progressive Networks in order to transmit audio over 14.4 kbits per second modem communication lines and was optimized for voice broadcasting. Although it was called the 14.4 kbits per second mode, it actually transmitted the audio at a rate of 9.6 kbits per second. The 28.8 kbits per second mode utilized much more of the available bandwidth, but its encoding structure was superior for broadcasting music.
For our application, the 14.4 kbits per second mode seemed to be the solution to both of our dilemmas. The quality of its spoken-voice reproduction was superior, and it also used a much smaller amount of the available bandwidth. If we were going to succeed in our attempts to keep the entire broadcast at or below the 28.8 kbits per second modem communication line bandwidth limitation, it would be the only option that we could realistically use. Even more fortunate, when we actually started to experiment with the 14.4 kbits per second mode, we discovered that it actually only used approximately 8.1 kbits per second, with fluctuations as high as 9.1 kbits per second and as low as 7.5 kbits per second. When we experimented with the 28.8 kbits per second mode, we found that the actual bandwidth requirement varied from 19.5 kbits per second to 26.5 kbits per second. Therefore, in actuality, the encoded audio transmission is never really constant, but continuously varies over a range of different bandwidth requirements. This is because the Encoder encodes different packets of the transmitted audio signal with varying effectiveness, sometimes requiring more than its specified bandwidth, and sometimes less.
In addition to the many improvements that were included with the upgrade to the RealAudio 2.0 Server, we also increased the number of available streams from 10 to 50. The logic we employed in making this upgrade, which cost almost $3000, was that if we were going to use the RealAudio audio broadcasting system to create a "virtual" classroom for undergraduate students, then we would need at least 35 available streams to allow that many students to log in simultaneously. In this regard, Progressive Networks was exceptionally helpful in only charging the university the difference between the original price that we paid for the 10-stream RealAudio Server and the price of the upgraded 50-stream Server. In addition to the new version of the RealAudio Server, Progressive Networks also developed a new RealAudio Encoder and another piece of application software, the RealAudio TimeLine editor. The RealAudio TimeLine editor allowed other multi-media applications, such as World-Wide Web browsers, to be launched as the RealAudio Encoder was encoding the audio from the live stream and feeding it to the RealAudio Server. This piece of application software was essential in linking the RealAudio audio broadcasting system with the Netscape World Wide Web browser software that originally delivered the class notes to the students.
When the class first began, the class notes broadcasting system consisted of scanning class notes that the lecturer had created the day before, at a resolution of 400 dpi, with sixteen shades of gray. These scanned class notes were saved as .gif image files and placed on a World-Wide Web homepage so that students could examine and download that day's notes either prior to the class starting at 10:30 a.m. or during the class time while listening to the live audio broadcast. Since the RealAudio Encoder was only using approximately 8 kbits per second of the bandwidth, almost 20 kbits per second was still available for use by the Netscape World Wide Web browser for downloading that day's class notes. In order for the class to be truly interactive, there was also a dedicated call-back line so that students attending the class in a "virtual" mode could contact the lecturer if they had any questions.
This broadcasting system worked very well during the four weeks that it was in operation. Students attending in the "virtual" mode could download that day's class notes prior to launching their RealAudio player and listening to the live lecture being given that day. Then, as the lecture was being broadcast to them, the students could follow along by examining the notes they had already downloaded. However, it was not a perfect process for either the lecturer or the students. The lecturer had to have everything prepared, and scanned, a day prior to the day on which the lecture was being given. Then, when delivering the lecture in class, the lecturer had to be careful about what they said in connection with that day's notes. For instance, it was found that if the lecturer referred to an equation on the board in the classroom with a flippant "Now, let's look at this equation up here", the "virtual" students wouldn't know exactly which equation the lecturer might be referring to. This meant that the lecturer had to be careful in their wording during the lecture, which required a change, however slight, in their pedagogical technique. So, although this type of integrated audio/class notes broadcasting system satisfied some of the more stringent constraints that we had placed on our development, it didn't entirely satisfy all of them. It adequately created a broadcasting system that could teach students over 28.8 kbits per second modem communication lines, but it was not a transparent system to the lecturer.
It also did not allow for the transmission of video to the "virtual" students, which was something that we eventually wanted to include in the final version of our broadcasting system. Although it was not absolutely necessary to see the lecturer's face in order to learn from them, the additional video broadcasting system would allow the students to feel a stronger sense of community with the other students in class and with the course's lecturer. As we continued to investigate different broadcasting technologies, we soon discovered that we could mix the different broadcasting technologies together into a sort of "Broadcasting Cocktail" that would allow us to build not only a more diverse broadcasting system, but also a more robust and flexible one.
In order to incorporate video broadcasting technologies into our broadcasting system, we decided to investigate two different, but similar, video broadcasting software packages. The first of these was the VideoPhone broadcasting system. This system was interesting for us to investigate because it used the Connectix QuickCam imaging system to encode video and transmit it over the Internet. It was also very interesting because it allowed full-duplex audio to be sent and received, in addition to sending either black-and-white or color video at very high frame rates. All this could be performed while still using less than the 28.8 kbits per second bandwidth that is available over standard modem communication lines. When we investigated the VideoPhone broadcasting system, we were very impressed by the quality of the video, whether black-and-white or color, and the quality of the audio transmission. Also, the refresh rate of the video was exceptionally good--on the order of 2-3 frames per second. In fact, we were so impressed with the quality of the VideoPhone software that we originally thought this would be the best video broadcasting system we could link to our current audio and class notes broadcasting systems. However, the VideoPhone software had one major limitation. As the name implied, the VideoPhone was designed to be used as a replacement for a telephone, and not as a teleconferencing system. Although the VideoPhone does have some limited teleconferencing capabilities if it is installed on a Macintosh platform, it has no teleconferencing functions for a Windows 95 platform. So, unfortunately, since we were constrained to design our broadcasting system to broadcast from one computer to many, we reluctantly eliminated the VideoPhone from our quest for a useful video broadcasting system.
With the advice of one of the VideoPhone software development managers, we were pointed in the direction of the CU-seeme teleconferencing software, which is distributed by WhitePine Software. Although the video quality of CU-seeme is by no means comparable to the quality of the VideoPhone software, the new enhanced CU-seeme teleconferencing software allowed true broadcasting capabilities. In addition, the new version of the CU-seeme teleconferencing software also came with a 16-bit interactive whiteboard program that allowed any user at the "virtual" conference to draw diagrams or write notes to anyone else present at the "virtual" conference. However, the audio capabilities of the CU-seeme teleconferencing software left much to be desired. In fact, we found that out of the four different audio compression techniques available, only one could actually be used effectively to broadcast audio to other individuals present at the "virtual" teleconference. When visiting CU-seeme reflector sites, it became immediately obvious that no one was using the audio broadcasting capabilities to communicate with anyone else. Instead, all communication was being done by typing text in a "Chat Window", which seemed to us to be a very archaic way to communicate. This is because the only audio transmission type that works at all requires almost 40 kbits per second of the available bandwidth to operate, which is more than standard modems allow.
That is not to say that the CU-seeme teleconferencing software would not be useful if it were used to broadcast courses to individuals with ISDN, T1 or T3 connections. These types of high-bandwidth communication lines would allow the CU-seeme teleconferencing software to work quite effectively as a broadcasting system, and there are many applications for this type of educational broadcasting. In fact, one of the "virtual" courses that we plan to offer this spring at the University of Missouri-Rolla will use exactly that type of system to connect graduate students at Stanford to a graduate course in Mechanical Engineering here at the University. However, we were not trying to design a broadcasting system that could be used to broadcast courses from one University campus to another. That has already been done in at least two instances.
Instead, what we were trying to design was a broadcasting system that would allow us to broadcast courses over the Internet to ordinary individuals sitting in their homes or offices, who would only have access to standard modem communication lines. In order to accomplish our objective, we decided to marry the three independent broadcasting technologies together to create an integrated broadcasting system that would serve our purposes. In order to do this effectively, we needed to use two computers: one to broadcast the RealAudio encoded audio transmission, and another to run the WhitePineboard interactive whiteboard broadcasting system and the CU-seeme video broadcasting system. It would have been nice to have all three broadcasting systems on one computer working in a true multi-tasking mode under a 32-bit platform, but that wasn't possible because both the RealAudio Encoder and the WhitePineboard broadcasting software were only 16-bit programs. They never worked together in a true multi-tasking mode, always interfering with each other's operation, and therefore had to be separated onto two different computers. However, the enhanced CU-seeme video broadcasting system is a true 32-bit program, and can take advantage of the fact that Windows 95 is a 32-bit, true multi-tasking operating system. By using this "cocktail" combination of the three different broadcasting technologies, we have been able to provide video, audio and an interactive whiteboard to our students as an integrated broadcasting system that works in real time over the Internet.
With our current system, the audio encoding and transmission only requires 8 kbits per second, and transmits telephone-quality audio over the Internet. The WhitePineboard interactive whiteboard uses somewhere between 1-5 kbits per second in order to clearly transmit the notes that the lecturer is writing on the digitizing device, whether that device be a digitizing tablet or a digitizing whiteboard. Also, since the "virtual" students launch both the RealAudio player and the CU-seeme viewer separately on their computers, both of these two broadcasting systems set up their priority structures independently. Since the audio transmission from the RealAudio Server is a continuous function, the audio remains the highest priority in the student's monitoring system. And, since the lecturer is using a separate computer to encode the RealAudio broadcast, the student receives the two streams in parallel, but not in competition. With both the CU-seeme video broadcasting system and the WhitePineboard interactive whiteboard broadcasting system on the other computer, these two broadcasting technologies are prioritized with the WhitePineboard interactive whiteboard receiving the highest priority, and the CU-seeme video broadcasting system using all available bandwidth that remains. This means that the overall priority structure for transmission to the student puts the audio broadcast at the highest priority, as it should be, and the WhitePineboard interactive whiteboard broadcast at the second highest priority. The total bandwidth requirements for these two top priority broadcasting systems may vary from only 8 kbits per second, if only audio is being transmitted, to as much as 14 kbits per second if both the audio and the WhitePineboard are being used simultaneously. However, this still leaves approximately half of the available bandwidth for use by the video broadcasting system. 
If all of the remaining 14 kbits per second were utilized by the video broadcasting system, then the student could still expect to see the video of the lecturer refreshed on the order of 1 frame per second. Although this is a far cry from standard TV video refresh rates, which are on the order of 30 frames per second, these refresh rates are still quite good considering that the major value the video has to the student is to provide a sense of community within the "virtual" class.
On the other hand, if the material were broadcast over ISDN lines, which cost as little as $50 per month in many urban areas throughout the country, then the audio and interactive whiteboard transmission would only require a small percentage of the available bandwidth, and the remaining bandwidth could be dedicated to video transmission. This would allow video refresh rates of approximately 9-10 frames per second, and would make the video seem much more realistic, and similar to black-and-white TV transmission. If this were taken one step further, and the broadcasting system were being transmitted over T1 communication lines which allow 1.544 Megabits per second to be transmitted, then true stereo-quality audio, black-and-white TV-quality video, and real-time interactive whiteboard transmission would be easily possible.
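The refresh-rate figures above follow from a simple linear scaling: if roughly 14 kbits per second of leftover bandwidth yields about 1 frame per second, then each compressed frame costs roughly 14 kbits, and the frame rate grows in proportion to whatever bandwidth remains after the audio and whiteboard are served. The sketch below makes that inference explicit; the per-frame size is our own deduction from the figures in the text, not a measured value, so the results are only rough estimates:

```python
# Rough video frame-rate estimate. If ~14 kbits/s of leftover bandwidth
# yields ~1 frame/s, each compressed frame costs roughly 14 kbits
# (an inference from the figures in the text, not a measured value).
KBITS_PER_FRAME = 14.0
AUDIO = 8.0        # kbits/s, RealAudio voice stream
WHITEBOARD = 5.0   # kbits/s, worst-case whiteboard traffic

def frames_per_second(line_capacity_kbps: float) -> float:
    """Estimated video refresh rate once audio and whiteboard are served."""
    leftover = line_capacity_kbps - AUDIO - WHITEBOARD
    return max(leftover, 0.0) / KBITS_PER_FRAME

print(f"28.8k modem: {frames_per_second(28.8):.1f} frame/s")   # ~1.1
print(f"ISDN (128k): {frames_per_second(128.0):.1f} frame/s")  # ~8.2
```

The modem estimate matches the 1 frame per second quoted above, and the ISDN estimate comes out broadly consistent with the 9-10 frames per second figure.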
The results of our efforts are an educational broadcasting system that will allow the broadcasting of courses over the Internet to individual students located throughout the region, across the country, and around the world. We are not alone in this development. A collaboration between NASA and the University of North Dakota is also developing a course in Telerobotics that should be offered in the coming spring semester. Additionally, students at the University of North Carolina's Asheville campus are currently taking classes from North Carolina State University in Raleigh through an on-line system. The difference between our broadcasting system and the one currently in use in North Carolina is that ours allows students to be at home, or in their office, when they are taking the course, rather than requiring them to go to another venue. The broadcasting system that is being used to offer the Telerobotics course at the University of North Dakota seems very similar to our own. It appears that they have also independently come to the conclusion that, in order to broadcast their course around the world, they need to marry several different broadcasting technologies together to remain within standard modem bandwidth limitations.
We are certain that our integrated broadcasting system will be very useful as educational institutions rise to meet the growing need for life-long education. It will not only finally allow educational institutions to offer courses to individuals across state boundaries, but will also allow them to reach into people's homes and liberate them from the tyranny of distance that denies many of them the ability to attend an educational institution. It will also allow students, as consumers, to decide which educational institution they would like to attend. No longer will students be constrained to take courses from a single educational institution. In the near future, they will be able to pick and choose which educational institution can give them the best value.
For many years, it has been possible for students to transfer certain courses from one educational institution to another and apply them toward their degree requirements. That process may now grow into an even larger enterprise. For instance, if a student attending Sinclair Community College or Kellogg Community College needed a class in heat transfer that was not offered that semester, the student could take the class in "virtual" mode from another educational provider, such as Purdue University, the University of Missouri-Rolla, or even MIT. Similarly, if a student's community college did not offer, or lacked the expertise to offer, a fundamental engineering course the student needed, that student would have the option of "virtually" attending the course at another provider, such as the University of Missouri-Rolla, Stanford, or a host of other institutions.
In the distant future, perhaps ten years from now, this integration of broadcasting technologies may well change the profile of higher education, not only in America but throughout the world. By then, educational institutions will not only be providing courses to students in their homes across the United States, but may very well be providing educational services to students around the entire world. Many overseas students now come to the United States to study at both the undergraduate and graduate levels, but many more could be accommodated if they were able to attend classes in a "virtual" mode while still residing in their home countries. In this way, the recognized superiority of American educational institutions could expand into a world-wide market. The University of North Dakota, with its offering of the Telerobotics course over the Internet, may well be just the tip of the iceberg of what is to come.
We do not anticipate any problem with assessing student performance in these classes, because assessment methods and standards have been established and in place for years for students taking courses in other distance-learning modes. Many universities currently offer courses at both the undergraduate and graduate levels over microwave-based or satellite-based transmission systems; Purdue University and the National Technological University are two good examples. Through years of conscious effort and development, they have helped standardize the methods by which distance-learning students are assessed. Those procedures would still be used with the new broadcasting technologies being developed. Only the means of broadcasting material to students, and the way we interact with them, will change, not the manner in which we assess students involved in distance-learning programs.
We at the Instructional Software Development Center at the University of Missouri-Rolla are also beginning to offer courses over the Internet using our new broadcasting system. We plan to offer courses in Mechanical Engineering and Engineering Management this coming semester, predominantly to serve the needs of students across Missouri who do not have convenient access to institutions of higher education. But all of us, as academics, share the responsibility to make courses as accessible as possible to students of all ages, at any location. Employing audio, video, and interactive whiteboard broadcasting technologies to attain this goal should be perceived as a natural evolution of this, our most sacred, responsibility.
White Pine Software
    Enhanced CU-SeeMe                                        $69
    CU-SeeMe reflector site software package
        (including 10 copies of CU-SeeMe)                    $778
    WhitePineBoard software                                  included with CU-SeeMe

Progressive Networks (RealAudio)
    RealAudio 3.0 live audio encoder                         free with server
    RealAudio TimeLine editor                                free with server
    RealAudio 3.0 player                                     free with server
    RealAudio 3.1 server (10-stream)                         $1,000

Connectix
    QuickCam movie-recording software                        free with QuickCam
    Connectix Color QuickCam                                 $150