Live Access Server Description

      The U.S. JGOFS Data Management Office (DMO) has been working with the Live Access Server (LAS) group at the University of Washington and the National Oceanic and Atmospheric Administration (NOAA) Pacific Marine Environmental Laboratory in Seattle to provide an access and visualization interface for its data. The LAS was not originally designed to work with non-gridded data sets. In an exciting collaboration, programmers in Seattle and Woods Hole are putting together the additional modules necessary to allow LAS to work with U.S. JGOFS process study profile data. This development comes at a time when the relatively large merged data products are becoming available. The customized LAS interface will allow scientists to navigate and visualize the merged data products much more conveniently and rapidly.

      The Live Access Server (LAS) has been providing a means of visualizing, reformatting and selecting subsets of multidimensional scientific data for World Wide Web users since 1994. The initial priorities in designing the LAS were to reduce barriers erected by the size, location and format of files for investigators who wanted to get access to gridded data sets. It uses the Ferret program for data analysis and visualization, but other visualization tools can be used.

      LAS is an easily installed, configurable Web application especially suited to the analysis and visualization of large, gridded environmental data sets. The Live Access Server enables the Web user to:

  • visualize data with dynamically generated graphics
  • request custom subsets of variables in a choice of file formats
  • access background reference material about the data (metadata)
  • compare (difference) variables from distributed locations (with DODS networking)

      LAS is really two separate servers -- a user interface server and a data server. Although both servers usually reside on the same computer, they can be distributed to different machines if desired. The user interface server handles the presentation logic for LAS. It consists of a set of Perl objects at the server, as well as static and dynamically generated HTML, JavaScript, and Java code that are downloaded to the user's Web browser. The browser code allows a user to select a dataset and variable to visualize, as well as geographical regions, view planes, visualization styles, and a variety of other parameters for a given variable.

      Once a user decides how a given variable in a given dataset should be presented (as data or visualized), the browser sends a request (composed in XML) to the data server. The data server consists of a set of Perl objects that parse the XML request and call a configured driver for data analysis and visualization. LAS is currently configured to use Ferret as the data analysis and visualization program, but other programs can be substituted. The Ferret driver dynamically creates custom Ferret scripts, which Ferret then runs to visualize or analyze the requested data. The datasets can be on the local machine, or can be accessed remotely through the DODS protocol. The visualized or analyzed data is returned to the user in a variety of formats (as a GIF image, netCDF, ArcView GIS, comma-separated values, or plain ASCII).
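The request/dispatch cycle described above can be sketched in a few lines. This is a minimal illustration only: the element and attribute names (`lasRequest`, `link match=...`, `region`/`range`) are assumptions for the sake of the example, not the actual LAS request schema, and the parser stands in for the server's Perl objects.

```python
# Hypothetical sketch of an XML request and its server-side parsing.
# Element/attribute names are illustrative assumptions, not the real LAS schema.
import xml.etree.ElementTree as ET

REQUEST = """
<lasRequest>
  <link match="/lasdata/operations/shade"/>
  <args>
    <link match="/lasdata/datasets/sst/variables/temp"/>
    <region>
      <range type="x" low="-180" high="180"/>
      <range type="y" low="-90" high="90"/>
    </region>
  </args>
</lasRequest>
"""

def parse_request(xml_text):
    """Extract the operation, variable path, and region from a request."""
    root = ET.fromstring(xml_text)
    # Operation is the last component of the matched path.
    op = root.find("link").get("match").rsplit("/", 1)[-1]
    var = root.find("args/link").get("match")
    # Region axes become {axis: (low, high)} pairs.
    region = {r.get("type"): (float(r.get("low")), float(r.get("high")))
              for r in root.findall("args/region/range")}
    return op, var, region

op, var, region = parse_request(REQUEST)
print(op, var, region)
```

In a real server the parsed operation would select a driver (e.g. the Ferret driver), which would then generate a script to produce the requested product.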

      Both the user interface and data servers are configured with one XML configuration file. XML (the Extensible Markup Language) is a markup language that allows a user to store structured data in a text file. It is similar to HTML, but has the advantage of being extensible (through user-defined tags) and easier for computer programs to use. A LAS administrator can easily change the configuration of the server by editing this configuration file. After the XML file is edited, the administrator runs a program that generates JavaScript for the user interface, and converts the XML file to a set of relational database tables that are utilized by both the data and user interface servers.
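The conversion step described above, from one hierarchical XML file to flat relational rows, can be sketched as follows. The configuration structure shown (`lasdata`/`datasets`/`variables` with dataset and variable tags) is an illustrative assumption, not the actual LAS configuration schema.

```python
# Sketch of flattening a hierarchical XML configuration into
# relational-style rows. The config layout is a hypothetical example.
import xml.etree.ElementTree as ET

CONFIG = """
<lasdata>
  <datasets>
    <sst name="Sea Surface Temperature" url="file:/data/sst.nc">
      <variables>
        <temp name="Temperature" units="degC"/>
      </variables>
    </sst>
  </datasets>
</lasdata>
"""

def config_to_rows(xml_text):
    """Flatten the dataset/variable hierarchy into (dataset, variable, attrs) rows."""
    root = ET.fromstring(xml_text)
    rows = []
    for ds in root.find("datasets"):
        for var in ds.find("variables"):
            rows.append((ds.tag, var.tag, dict(var.attrib)))
    return rows

rows = config_to_rows(CONFIG)
print(rows)
```

Each row could then be loaded into a database table, giving both servers fast, uniform access to the same configuration.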

Why use LAS?

      Visualization makes it possible to explore any given data set in a variety of ways with a web browser. Should further analysis be desirable, the subsetting capacity allows the investigator to download small units of data that move efficiently over the Internet. Reformatting makes it simple to incorporate data into the desktop environment. Recent work on the LAS has focused on support for groups of collaborating researchers at disparate locations. LAS supports collaborations by providing common access to reference data sets, shared access to distributed data sets, and the ability to compare distributed data holdings.

      The LAS creates a shared virtual database as a natural extension of its modular design. LAS sites installed at separate locations exchange copies of the metadata that describe the data sets they serve, making each aware of the data sets held by the others. Each is able to direct users' requests to the LAS that serves a particular data set.
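The metadata-exchange idea above can be sketched as a simple catalog merge. The site URLs, dataset identifiers, and metadata shapes below are hypothetical, invented only to show the principle: each site records which server holds each data set so requests can be directed there.

```python
# Minimal sketch of a shared virtual database: merge local and peer
# metadata into one catalog keyed by dataset id. All names are hypothetical.
local_site = "http://las.site-a.example"
local_meta = {"sst": {"name": "Sea Surface Temperature"}}

# Metadata received from peer LAS installations, keyed by their server URL.
peer_meta = {
    "http://las.site-b.example": {"chl": {"name": "Chlorophyll"}},
}

def build_catalog(local_url, local, peers):
    """Merge local and peer metadata, recording which server holds each data set."""
    catalog = {ds: {"meta": m, "server": local_url} for ds, m in local.items()}
    for url, datasets in peers.items():
        for ds, m in datasets.items():
            catalog[ds] = {"meta": m, "server": url}
    return catalog

catalog = build_catalog(local_site, local_meta, peer_meta)
print(catalog["chl"]["server"])
```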

      The goal of collaborative research is to create a whole that is greater than the sum of its parts. Data systems can support this goal by bringing information from distributed data sets together for merged calculations. Binary-level access to remote data sets is provided by the Distributed Oceanographic Data System (DODS). The most extensive collection of distributed ocean data available through LAS is the Virtual Ocean Data Hub (VODHub).


      The LAS was originally used by U.S. JGOFS to provide access to gridded data sets, in which data points are fairly regularly distributed in time and space. As part of a collaboration among staff members at the University of Washington, PMEL and the U.S. JGOFS Data Management Office in Woods Hole, LAS capabilities have been extended to provide the same sort of access to non-gridded (in situ) data from process studies and the U.S. JGOFS Synthesis and Modeling Project (SMP). This enhanced version of LAS provides access to the DMO merged data products as well as the SMP results.

Note: Much of the above text is taken from U.S. JGOFS Newsletter, volume 11 number 3 (November, 2001) by Steve Hankin (NOAA Pacific Marine Environmental Laboratory in Seattle), and Jonathan Callahan and Joe Sirott (Joint Institute for the Study of the Atmosphere and Ocean, University of Washington).
