
Bridging THE Gap

May 1, 2003
By Charles Conte

Purdue University, founded in 1869 in West Lafayette, Indiana, is built upon its historic strengths in engineering and agriculture. Purdue has a history of technological leadership, not only as an institution but also in the accomplishments of its alumni, such as Neil Armstrong and Gene Cernan, the first and last men to walk on the moon. Agricultural research at Purdue has given the world better golf courses, as well as Prescription Athletic Turf, used at Purdue’s P.A.T. field and at more than 30 other institutions and professional stadiums around the country.

In 1985 “purdue.edu” became one of the first domain names registered on the Internet. Following from that rather recent tradition is Purdue’s participation in the Access Grid — a collaboration of facilities linking Purdue to more than 100 domestic and international research sites.

THE ACCESS GRID

The Access Grid is a National Computational Science Alliance project led by Argonne National Laboratory. The deployment of the Purdue node of the Grid was a joint project of many campus units, including Information Technology at Purdue, the Center for Instructional Services, the Center for Collaborative Manufacturing, the Center for Education and Research in Information Assurance and Security, Krannert Executive Education Programs, and Continuing Engineering Education.

Universities and research institutions across the country have been working together for some time to develop computational grids for the purpose of binding every participating entity’s supercomputer to everyone else’s for the goal of achieving a single, large collective computational powerhouse. Most of this binding occurs by using existing high-speed research networks such as Internet2 (www.internet2.edu).

The Access Grid runs on the same premise, except that it does not bind supercomputers together but actually binds physical spaces together. This in essence creates the potential for a large collaborative space where researchers can see and speak with researchers at other institutions and share information without going farther than across a campus. This may sound like a simple two-way videoconferencing facility, and in some respects it is. But there is more to it than two-way video capabilities. Researchers can also use these facilities to share documents, presentation graphics, and any kind of research data that can be transmitted from one computer to another.

“Access Grid is more a concept than a system or a facility,” says Michael Gay, manager of broadcast networks and services (BNS) within Purdue’s IT Telecommunications department. “Unlike traditional videoconferencing systems and facilities, the Access Grid is an abstraction. Members of the Access Grid community use multicast technology to transmit multiple simultaneous video feeds to and from all around the world. There is no physical network dedicated to connecting rooms together. The multicast video is simply transmitted via existing high-speed research networks such as Internet2. There is no single piece of equipment one goes out to purchase to facilitate connectivity. Although Access Grid is an abstraction, the idea behind it is simple — connect physical spaces in order to create a means for groups to meet with each other remotely.”
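For readers who want a concrete picture of the transport involved, joining one of those multicast feeds boils down to a few lines of code. The sketch below (written in Python for illustration, not taken from the Access Grid toolkit) subscribes to a hypothetical multicast group and counts incoming packets; the group address and port are placeholders, not real venue values.

# Minimal sketch of joining an IP multicast group and receiving packets,
# the transport idea behind Access Grid audio and video streams.
import socket
import struct

GROUP = "224.2.0.1"   # hypothetical multicast group for a venue's video feed
PORT = 50000          # hypothetical UDP port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Ask the kernel to join the multicast group on the default interface.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(65536)   # typically one RTP packet per datagram
    print(f"{len(data)} bytes from {sender}")

Every participating node transmits to and listens on the same groups, which is why no central bridge is needed.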

Purdue’s research mission creates a need for collaboration between both individuals and groups at other institutions around the country and the world. Although face-to-face collaboration in a group environment works best, it is not always practical or economically feasible, especially in a time when travel budgets are hard to come by. These are the circumstances that drove Purdue’s involvement in the Access Grid.

NODES

The space used at each institution for an Access Grid facility is typically referred to as a node. About 150 domestic and international research sites have Access Grid facilities to which Purdue can connect. The facility at each of these sites is a node on the Access Grid. Each node in the Access Grid must meet a minimum set of standards in order to communicate with the other sites.

The minimum standard equipment specification for an Access Grid node provides for a large two-dimensional display space, four live video camera transmissions, and two-way audio between all sites. A node will typically broadcast four separate video images that can be received by any number of remote sites around the world. The real power of this lies in the fact that a node can also receive as many as four separate video images from every other participating site. In essence, every user in every room can see every user in every other room, which can get cumbersome with roughly 150 nodes currently active.

Virtual venues have been created to ease this burden and allow several collaborations to occur simultaneously. A virtual venue is a virtual meeting space. For example, if Purdue wants to participate in a conference with five other rooms around the country, all institutions would direct their respective room’s transmit and receive equipment to a particular virtual venue. Now only rooms within that virtual venue would be able to see and hear each other.

Think of the virtual venue as a common meeting space where all the interested nodes agree to come together to further a common agenda. A virtual venue can even be a virtual lobby where rooms not in use can park their equipment. Some virtual rooms are secure encrypted rooms for private meetings. Technically speaking, a virtual venue is merely a collection of known multicast addresses and IP ports accompanied by private online chat rooms for back-channel communication among node operators.
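Because a virtual venue is, by that definition, just a named set of multicast addresses and ports, it can be captured in a few lines of code. The sketch below is illustrative only; the addresses, ports, and field names are made up and are not taken from the Access Grid venue server.

# A virtual venue reduced to its essentials: a named collection of
# multicast group/port pairs, one per media stream. All values are placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class StreamAddress:
    group: str   # multicast group address
    port: int    # UDP port

@dataclass
class VirtualVenue:
    name: str
    video: StreamAddress
    audio: StreamAddress
    encrypted: bool = False   # some venues are private, encrypted rooms

lobby = VirtualVenue(
    name="Lobby",
    video=StreamAddress("224.2.128.10", 51000),
    audio=StreamAddress("224.2.128.11", 51002),
)

# Every node that "enters" the same venue points its transmit and receive tools
# at the same groups, so only rooms in that venue see and hear one another.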

Multicast technology allows multipoint conferences to be held without the need for a dedicated hardware multipoint control unit (MCU) of the kind used in ISDN or H.323 videoconferencing. That means the only limit on the number of simultaneous users in a conference is the available bandwidth on the research network. Although most MCUs limit the number of simultaneous images viewed on the display screen to four or nine, an Access Grid node can display as many simultaneous video images as it has display space for. In practice, the limit is set by available display area rather than by the conferencing hardware; a typical node drives two or three high-resolution display projectors.
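To get a feel for what “available bandwidth” means in practice, the back-of-the-envelope calculation below estimates the inbound traffic a node might see as a venue grows. The per-stream rates are illustrative assumptions, not Access Grid requirements.

# Rough estimate of aggregate receive bandwidth at one node.
VIDEO_STREAMS_PER_SITE = 4     # per the minimum node specification
VIDEO_KBPS_PER_STREAM = 300    # assumed rate for one compressed camera feed
AUDIO_KBPS_PER_SITE = 64       # assumed rate for one audio stream

def receive_bandwidth_mbps(remote_sites: int) -> float:
    """Total inbound rate when every remote site sends four video feeds plus audio."""
    kbps = remote_sites * (VIDEO_STREAMS_PER_SITE * VIDEO_KBPS_PER_STREAM
                           + AUDIO_KBPS_PER_SITE)
    return kbps / 1000.0

for sites in (5, 20, 100):
    print(f"{sites:>3} remote sites -> ~{receive_bandwidth_mbps(sites):.1f} Mbps inbound")

# With no MCU in the path, the practical ceiling is the research network's capacity
# and the node's ability to decode and display that many streams.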

THE STANDARD

In October 2001, a committee was formed to study the Access Grid specification and come up with a general plan (see the sidebar “Building a Node”). Gay cochaired the committee and led the design team of Roger Mikels, Phil Knobloch, Earle Nay, and John Dietrich, whose goal was to implement the functional desires of the committee. This group of engineers and technicians was already working for Purdue within BNS and would later become the implementation team as well. Access Grid software engineers Leslie Arvin and Jeff Schwab, on staff within the Purdue IT department, were also part of the implementation team; they were responsible for integrating the computer hardware and Access Grid software applications. The fact that Purdue had such qualified people already on staff, with years of experience working in its 200-plus-seat classrooms, was key to the smooth integration of this facility.

Once the committee reviewed the Access Grid standard set of specifications, it decided that the Purdue facility should set a new standard for an Access Grid facility. This decision included installing Draper DMVC 101 rear-projection systems, multiple available sources, the ability to merge a two-way videoconference into an Access Grid session, a simple user interface to the audio and video controls, and an overall clean and finished look within the facility.

While designing the Access Grid node for Purdue, Gay’s design team had to take into account many capabilities requested by the design criteria committee. “We tried to make complex events as easy for the operational staff to manage as a simple remote collaborative gathering,” Gay says.

Because usable space is always an issue at institutions, an existing two-way videoconference facility was chosen as the Grid location. The two-way videoconference functionality needed to remain in the completed facility, and connections to the Video Network Operations Center needed to be added. Most Access Grid nodes connect their display devices (in this case, three Sharp XG-P20XU XGA LCD video projectors) directly to the VGA outputs of the display machine. The Purdue design, however, called for dynamic control of the projector inputs across a mix of VGA and S-video sources. It also required control of as many as 16 microphones, including boundary and wireless types; ultimately, Purdue went with a dozen Shure MX392/C low-profile boundary microphones.

Gay says that in the future, the Access Grid might decide to employ a three-dimensional display image or holographic imaging. “The technologies currently used were chosen because they are both accessible and affordable now,” Gay says. “This is evident in the use of the multimonitor functionality built into Windows 2000 and Windows XP. The Access Grid community could have easily chosen a more sophisticated method using proprietary display devices but instead made a conscious effort to spec off-the-shelf components supported by a variety of manufacturers. Some users have taken cues from the variety of desktop videoconferencing units available on the market and have created desktop Access Grid environments known as Personal Interface to the Grid, or PIG for short. This is indicative of the natural continuing development process of Access Grid technology.”

As the design took shape, Gay worried about some poor operator trying to manage three projector remote controls, a multitude of router buttons, playback devices, and Linux-based video capture software. The facility had the potential to become a real nightmare — a wonderful facility that no one would be able to operate without a degree in rocket science.

IN CONTROL

Purdue already had several advanced technology classrooms outfitted with Crestron touch-screen control systems, and Gay had several skilled Crestron programmers on his staff. As a result, Purdue chose Crestron to meet its system needs.

Programming the Crestron CNMSX-PRO integrated control system was a joint effort by Gay and Mikels. Gay designed the user interface and custom graphics for the control screens, and Mikels programmed the control system end, making the screen designs function as desired. Mikels used a combination of off-the-shelf programming macros from Crestron as well as a few custom macros he designed. Using bidirectional connection to most of the equipment allowed the screens to reflect the state of the system at any given time. The control system automates many of the configuration tasks and gives the operator complete control of equipment in the room. The intuitiveness of the system reduces training time for operators to less than a day. The control system pulls all the pieces together seamlessly.
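Crestron systems are programmed with the company’s own tools, so the snippet below is not Crestron code; it is a plain Python sketch of the bidirectional-feedback idea described above, in which the touch panel is refreshed from the device’s actual state rather than from the last button pressed. The Projector and Panel classes and their methods are hypothetical stand-ins.

# Sketch of bidirectional control feedback: poll the device, then mirror its
# true state on the touch panel. Classes and methods are illustrative only.

class Projector:
    """Stand-in for a display device with a two-way control connection."""
    def __init__(self) -> None:
        self._power = False
        self._input = "VGA1"

    def query_power(self) -> bool:   # e.g., a status poll over RS-232
        return self._power

    def query_input(self) -> str:
        return self._input

    def set_power(self, on: bool) -> None:
        self._power = on

class Panel:
    """Stand-in for the touch panel's on-screen indicators."""
    def show_power(self, on: bool) -> None:
        print("Power button lit" if on else "Power button dark")

    def show_input(self, name: str) -> None:
        print(f"Input indicator: {name}")

def refresh(panel: Panel, projector: Projector) -> None:
    # Push the device's actual state back to the UI, so the screen matches
    # reality even if someone changed the projector with its IR remote.
    panel.show_power(projector.query_power())
    panel.show_input(projector.query_input())

proj, panel = Projector(), Panel()
proj.set_power(True)
refresh(panel, proj)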

Although many Access Grid activities do consist simply of groups of people sitting down to collaborate remotely with each other, complex Access Grid events, such as those incorporating formal presentations from multiple locations, may take weeks of planning and organization. The system was put to the test when Gay was a participant in a complex Access Grid event hosted by Purdue, where he was asked to prepare a formal presentation and was also involved in the planning process.

“The planning for this type of event takes into account the types of media the presenters wish to use, as well as how the media will be displayed at remote sites,” Gay says. “In addition to coordinating speakers and audiences, I was also tasked with coordinating the facilities at the many participating sites along with the technology and the presentations. When we designed our Access Grid facility at Purdue, we tried to make complex events as easy for the operational staff to manage as a simple remote collaborative gathering. I use the word simple somewhat tongue in cheek. To the participant, speaking face-to-face with groups around the world is simple. To our operational staff, this is somewhat more complicated but still a straightforward process.”

Although most standard videoconferencing facilities are designed for operation by the end-user, an Access Grid node uses multiple simultaneous camera angles, distinct presentation tools, and open-source software requiring a skilled operator familiar with the facility. This operator will monitor the camera positions, video routing, and network connection quality and will ready PowerPoint presentations for presenters. Some events in the facility have used two operators, with a production operator focusing on the audiovisual activities (such as camera control and microphone levels) and a network operator focusing on the computer-related activities (such as network monitoring, PowerPoint setup, and audio/video capture software).

This particular event required two operators, who spent a lot of time and effort ensuring everything would go as planned. When the day of the event came, they were well prepared. The room was opened early to make some final preparations and to initiate the connection to the virtual venue in which the event was to occur. The first thing the production operator did was use the Crestron TPS-4500 Isys touch-panel screen to select the mode in which the room was to be operated. The choices are Videoconferencing or Access Grid. Touching the button on the screen for Access Grid initiates the room by powering up the projection system, making some default video routes, and then displaying the control scene for Access Grid operation. The Access Grid operation screen brings up a graphical representation of the room, including camera positions and microphone positions.

The operator can touch the speaker icon to bring up a touch-panel-based audio mixer. The operator uses this screen to make minute adjustments to the microphone levels and outputs to the Access Grid audio codec (open-source software running in a Linux environment). The interesting aspect of this particular part of the process is that there is no traditional mixer in the room. All audio mixing, routing, and control happen inside the digital signal-processing brains of the three Polycom Vortex 2280 matrix mixer/echo cancelers. The audio mixer screen also adjusts playback levels to the room’s Tannoy i8AW loudspeakers, used as point source playback of remote audio and local multimedia audio. The loudspeakers are powered by a Crown D75 power amplifier, and a Symetrix 628 handles voice processing.
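Conceptually, the touch-panel mixer page simply translates an on-screen fader position into a gain value and hands it to the DSP, which does the real mixing. The short sketch below illustrates that mapping; the dB range and the send_to_dsp() command are assumptions for illustration, not the Vortex 2280’s actual protocol.

# Map a 0-100 touch-panel fader position to a gain in dB and hand it to the DSP.
MIN_DB, MAX_DB = -60.0, 10.0   # assumed usable fader range

def fader_to_db(position: int) -> float:
    """Convert a 0-100 fader position to a gain in dB (simple linear mapping)."""
    position = max(0, min(100, position))
    return MIN_DB + (MAX_DB - MIN_DB) * position / 100.0

def send_to_dsp(channel: int, gain_db: float) -> None:
    # Placeholder for whatever serial or IP command the control system sends.
    print(f"set mic channel {channel} gain to {gain_db:+.1f} dB")

send_to_dsp(channel=3, gain_db=fader_to_db(75))   # fader at 75 percent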

“I like the Tannoys for their accurate representation of sound, which gives the facility a more immersive feel when speaking to remote sites,” Gay says. “We definitely did not want the remote audio to sound like it was coming from a drive-through window. For this facility, the i8AWs fit the bill for the room, along with a pair of Reveal Actives in the control area.”

Switching from the audio mixer and returning to the Access Grid operation screen, the operator then touches any one of the five camera positions — all five cameras are Sony EVI-D30s — to bring up a camera control subscreen. The screen gives the operator full control of pan, tilt, and zoom. The operator previews his camera shot on the Wohler/Panorama MON4-3 LCD. “While we could have designed the system to display the video on the Crestron TPS-4500 touch panel, as we do in some of our classrooms, we consciously elected not to do so here in order to maintain a wealth of information on the touch panel as well as give the operator simultaneous preview of all cameras,” Gay says.
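The EVI-D30 is steered over RS-232 using Sony’s VISCA protocol, which is ultimately what the camera control subscreen speaks on the operator’s behalf. The sketch below shows roughly what a pan command looks like using the pyserial library; the port name is hypothetical, and the exact byte values are an assumption to verify against the camera’s VISCA documentation.

# Rough sketch of a VISCA pan/tilt command to an EVI-D30 over RS-232.
import serial  # pyserial

PORT = "/dev/ttyS0"   # hypothetical serial port
CAMERA = 0x81         # VISCA address of camera 1

def pan_tilt(conn: serial.Serial, pan_speed: int, tilt_speed: int,
             direction: bytes) -> None:
    """Send a Pan-tiltDrive command; direction is two VISCA direction bytes."""
    packet = (bytes([CAMERA, 0x01, 0x06, 0x01, pan_speed, tilt_speed])
              + direction + bytes([0xFF]))
    conn.write(packet)

with serial.Serial(PORT, 9600, timeout=1) as conn:
    pan_tilt(conn, pan_speed=0x08, tilt_speed=0x08, direction=b"\x01\x03")  # pan left
    pan_tilt(conn, pan_speed=0x03, tilt_speed=0x03, direction=b"\x03\x03")  # stop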

The goal of the design was to keep it simple. The equipment behind the touch screen is a complex array of video routing, projection, and audio systems, yet the operator needs to know only what end result he or she wishes to accomplish. “This is also true in our production switcher screen,” Gay says. “Some applications of the room such as video recording or videoconferencing might require the fast-paced switching of cameras and other video sources to a common output. The production switcher screen provides a one-touch switch action of a selection of video sources to an array of outputs.”

The screen was designed to look like a professional video production switcher used in television production studios. The action of this “glass console” mimics the operation of the real thing. A preview bus is also provided with the preview output appearing on an 18-inch flat-panel display screen.

The conference went smoothly. “Our operators had full control of every function,” Gay says. “The largest failure in the room was a remote RF mouse used by the presenter to advance the slides. I guess if that is the worst of it, it was a good day. We will check the batteries next time.” The conference had several onsite speakers from the Purdue facility as well as two speakers from remote locations around the country. “The transition from speaker to speaker went as smoothly as, or more smoothly than, one might expect at an in-person national conference,” he says.

Although the room is primarily used for Access Grid events, it is also used for traditional videoconferencing and for meetings requiring multimedia presentation capabilities. The Crestron touch-screen control system allows these functions to coexist in one room and puts the power of these systems literally at the operator’s fingertips.

When all was finished, Purdue not only had an Access Grid node but also a complete multimedia presentation facility capable of remote collaboration across the Access Grid as well as traditional videoconferencing. In addition, links with existing Purdue facilities made satellite uplinks a real possibility. With so many possibilities built into this facility and the upgradeability inherent in facilities with control systems, the Purdue Access Grid node will be a valuable asset to the university for years to come.

Charles Conte heads Big Media Circus, a marketing communications company. He has published extensively in the commercial audio trade press. He can be reached at [email protected].

Building a Node

When it came time to build an Access Grid node at Purdue, a committee was formed in late October 2001 consisting of Information Technology department members, as well as representatives from several academic units on campus. The committee studied the Access Grid specification and came up with a general plan. The process, from the decision to build the facility to completion, went like this:

  • The committee analyzes the basic requirements for an Access Grid node and determines what other tools or functions should be added to make the Purdue node stand out from the pack. The committee evaluates potential rooms on campus and chooses a facility already in use for basic two-way videoconferencing.
  • The committee’s recommendations are forwarded to the design team.
  • The design team specifies the equipment to fill the requirements. Key elements selected: the Crestron control system, Draper rear-projection systems, and Polycom Vortex matrix mixers.
  • The design team’s design is reviewed and approved by the committee.
  • A deadline for completion of mid-February is set so the facility can be unveiled at the 2002 Teaching and Learning Technology Showcase held each year at Purdue.
  • The design team changes modes to become the integration team.
  • Equipment is ordered from MCSi.
  • The Purdue Physical Facilities crew adds more conduits to the room.
  • Equipment arrives and is installed.
  • Finally, the room is ready for its presentation at the Teaching and Learning Technology Showcase.
  • After a week’s worth of 16-hour days, the integration team rests.

For More Information

Crestron Electronics
www.crestron.com

Crown Audio
www.crownaudio.com

Draper
www.draperinc.com

Extron Electronics
www.extron.com

Polycom
www.polycom.com

Sharp
www.sharp-usa.com

Shure
www.shure.com

Sony
www.sony.com

Symetrix
www.symetrixaudio.com

Tannoy
www.tannoy.com

Wohler/Panorama
www.panoramadtv.com
