Appendix 4: On ECS User Scenarios

A4.1 Introduction and Summary

User scenarios have been gathered by the ECS User Model Team for two purposes: to elicit functional requirements for the "pull" database system and to provide a basis for estimating this system's performance requirements. The workbook ECS User Characterization, May 1994, describes the progress of this effort as of the publication date.

In this appendix we develop and apply an alternative approach to scenario analysis. Our method of analysis is to express the scenarios as sequences of queries in an SQL-like language. Think of this language as a straw man design for Version 0 of SQL-* described in the body of the report.

First, we define the types, tables, and functional embellishments of SQL-*. Then, we describe a number of pertinent query examples, providing a superficial tutorial. Finally, we consider 12 specific scenarios (see Table A4-1) of the workbook from the SQL-* point of view. Together, this information illuminates various aspects of SQL-* in action.

This alternative representation is a compact yet expressive way to understand scenario steps and complements the more procedural data flow method used in Appendix 3. Casting scenarios as queries reveals their differences and similarities, for it focuses attention on types of data rather than specific data sets. The query approach is logically rigorous, and the act of phrasing data activities in SQL-* highlights ambiguities and inconsistencies in the scenarios. A final merit is that SQL queries are the proper starting point for studying the computational and input/output loads that scenarios exert on the database management system. Appendix 5 has an example of this.

The scenarios cover a wealth of applications, but we have focused on the data-intensive examples of research scientists. We have not considered scenarios involving human assistance from the EOSDIS "help desk" or those limited to simple catalog searches. We also have not considered the needs of non-specialists but note that lawyers and insurers are likely to be aggressive users of EOSDIS, as they now are among the heaviest users of NOAA's environmental data services.

A number of the most interesting scenarios end prematurely, just as the relevant data are identified, ready for copying to tape and mailing to the user. Not to consider what happens after delivery is to take too narrow a view of EOSDIS. Our N+2 DAAC architecture includes the user's environment and DBMS as integral parts of the data system. Scenarios should not end with data on tape but with data intimately linked through a DBMS to the application software of scientists.

Table A4-1: Scenarios Studied in This Report

Scenario                          Section   Remarks
1   Literature review             Omitted
2   Lightning data                Omitted
3   Grassland vegetation          Omitted
4   Forest model                  Omitted
5   Electronic journal            Omitted
6   Watershed study               Omitted   Similar to 10a
7   Multi-sensor surface model    A4.4.1
8   Global biomass burning        A4.4.2    No query
9   Student thesis using NDVI     Omitted
10a Hydrologic models             A4.4.3
10b Cloud models                  A4.4.4    No query
11a Sea ice                       A4.4.5
11b Water content of snow pack    Omitted
11c Radiative flux over ice       Omitted
12  Use of assimilated GCM data   A4.4.6
13  Radiation budget review       Omitted
14  Snow mapping                  Omitted   Similar to Project Sequoia 2000, A3.2
15  Lightning                     A4.4.7
16  Air-ocean interaction         A4.4.8
18  Chesapeake watershed          Omitted   Similar to 10a
19  Bio-geochemical fluxes        A4.4.9
20  Global water cycle            Omitted   Similar to Project Sequoia 2000, A3.3
22a Real-time volcano watch       Omitted   Similar to 8
22b Andes tectonics               Omitted
23a Stratosphere chemistry        A4.4.10
23b Precipitation algorithm       A4.4.11
24  Ocean color                   A4.4.12

Another example of incompleteness is the neglect, at least in the workbook available to us, of the computing load on the scientists' machines. There are scenarios that, while running, appear to be as demanding as a typical "push" scenario of the product generation system. These should be considered in the "flop" budget, even if the principal effort is accomplished outside the core system.

The "push-like" scenarios involve numerous analysis steps and fitting such procedural analysis to the query model gives too much emphasis to data relationships and not enough to algorithms and processes. A data flow model would help show how the processes are joined to each other, to data stores and to the DBMS. We have not done this for any of the ECS scenarios, but four examples of this approach are given in Appendix 3. Data flow models are also an excellent basis for estimating performance requirements of non-DBMS operations.

A4.2 Background

ECS designers have collected user requirements from the push community and the pull community. The former are predominantly instrument teams involved with the creation of "standard" products. The latter are a more diverse clientele wishing ad hoc access to the data. The requirements of both communities are combined to estimate performance requirements of the system.

The apportionment of ECS capabilities to the needs of the two communities has fluctuated widely. In May 1994 the Independent Architecture groups were told that the pull load would be 1/9 the push load. Subsequently, we were told informally that now (August 1994) the pull load is estimated to be 5 times the push load. Finally, the Science Requirement Summary merely states that pull capacity will be 6 times the push capacity.

These changes have come from several factors, but perhaps the biggest problem is the inherent difficulty of quantifying the disparate interests and demographics of the pull community. Because the push processing is algorithmic and repetitive, it is more susceptible to metric analysis than the DBMS-intensive needs of the sporadic pull users.

Nevertheless, almost from the inception of the project, user scenarios have been a fundamental tool in EOSDIS design. Most recently, the ECS team has devoted a large effort to eliciting from pull users the data they will want to access and the uses they intend to make of them. As detailed in the workbook, the effort has proceeded through distinct stages. In the beginning, approximately 50 scientists were interviewed. Half the interviews were reduced to a step-wise description of the data items and the processing to be applied.

The final stage was to describe the impact of each step on a model database. For this purpose, an 18-tiered data pyramid was defined and a methodology developed to relate the data needs of the scenarios to the tiers. Presumably, this mapping is a key element by which ECS developers are estimating workloads for the scenarios, but we have not seen those results.

The ECS approach to data modeling is strongly hierarchical and retains vestiges of the waning data/metadata attitude that figured in the concerns expressed about the ECS architecture. At the bottom of the pyramid are the low-level (level 0, level 1A, etc.) raw data. In the middle are the higher-level "processed" data. At the top are the data catalogues. In essence, the "library" model of IMS V.0 has been grafted on top of the "product levels" of the product generation system. This topology inevitably perpetuates the first-find-then-order approach towards data access.

A4.3 Project Sequoia 2000 Approach

Our approach to understanding the scenarios was to invent an EOSDIS database, then express the scenarios as database queries. The database is much more detailed than that modeled by ECS developers and bears little resemblance to their data pyramid. This leads to a concise and elegant way of casting the scenarios into a formal language (SQL-*) that is amenable to quantitative analysis. It also serves to hide irrelevant detail and reveal overarching features that figure in more than one scenario. Our data model is very rudimentary. It includes a few abstract data types, a collection of functions that act on these types, and some simple tables (the schema) that bundle the abstract types with ancillary data.

Compared to the ECS modeling, this approach leads us to de-emphasize some scenarios and to highlight particularly arduous ones. First, chauffeur-driven scenarios predicated on experts at a DAAC assisting the scientist have been passed over; these involve different considerations. Second, the catalog-searching prologues have not been expressed as distinct operations, for they become a natural part of the where clauses of our queries. Finally, the data-order and data-transmission parts of the scenarios have also been eliminated as separate sets of operations. These, too, have been embedded in queries on the assumption that data will move electronically over local-area and wide-area networks with suitable bandwidth.

Many scenarios have steps requiring the display of "browse" images, and this activity is prominent in our queries. We don't know exactly what a browse image is but propose some possibilities. The scenarios also assume that algorithm descriptions and "QA statistics" will be online, but we have omitted their use in our queries. There is a massive effort underway to describe the algorithms of the push system. We think it will be difficult to make the information under development intelligible to a wide range of users through online searches. The QA statistics envisioned by ECS designers have not been described to us. If they are simple numeric quantities, they will be easy to incorporate in queries. The harder part will be to explain to users what they are and how to use them intelligently.

The notion of feature extraction is rare in the scenarios (one user even expresses distrust of any cloudiness estimates that might exist in the database). A concerted effort in this area could well revolutionize the accessibility of EOSDIS data. Some features, such as geographic and cartographic items (e.g., mountains of the world), are static and, once assembled, would never change. Others might change from day to day and season to season and would need to be captured during push processing, or by accessing results of subsequent analysis. An automatic feature-recognition system is part of the Sequoia scenario described in Section A3.3.

A4.3.1 Data Types

The SQL92 standard supports numbers, character strings, time, and time intervals, and most SQL systems have been extended to support arrays and class libraries for image data and text data. ECS needs to go further, as we show with the abstract types introduced below. The simpler ones (e.g., point and raster) are already supported by some existing databases. More elaborate type libraries for spatial data are being implemented by two enterprises, Open Geographical Information Systems (OGIS) and the Petrotechnical Open Software Corporation (POSC).

Multispectral images and fields defined over space-time grids exemplify data of high dimension. Methods of describing data on regular and irregular grids are well known (e.g., netCDF), and schemas for these complex data are a subject of Project Sequoia 2000 research. None of the work we are aware of adequately addresses the subject of data precision and accuracy. These attributes are essential parts of the DBMS.

Several abstract data types are formally defined below and are used throughout the examples. These are conceptual types only and have no significance beyond this appendix. The later part of the section discusses other kinds of data but without formally defining them.

Types for Objects with Low Spatial Dimension

type point

The coordinates of a place in two- or three-dimensional space.

type scalar

A value assigned to a point.

type vector

A directed line segment in two- or three-dimensional space. It is also an ordered list of variables (e.g., the wavelengths of a multispectral imaging system or the scalar variables of a GCM).

type raster

Values of a two-dimensional array of pixels recorded by an imaging system. The x, y values are equally spaced in sensor coordinates. After being mapped onto geographic coordinates, they will be unequally spaced in latitude and longitude.

type Ngrid

A space-grid in two or three dimensions. It may be regular (e.g., a raster image) or irregular (GCM latitude, longitude, altitude grid).

type rectangle

A surface area defined by two corners. In a curved geometry, four corners and other information may be needed.

type polygon

A sequence of points that, when connected, form a closed curve bounding a region of the surface.

Types for Objects with Higher Dimension

EOSDIS queries will frequently involve values defined over two (and three) space dimensions and time. There is a substantial literature on appropriate data structures.

Example 1: GCM data can be viewed as an n-element field vector defined over three space dimensions and time. As many as 100 field variables may exist at each grid point, and the entire combination is a five-dimensional object.

Example 2: Multispectral imagery can be viewed as an n-element vector of electromagnetic wavelengths, lambda, defined over two space dimensions and time. In image coordinates, the spatial variable is equally spaced pixel units. In map coordinates, the spatial variable is unequally spaced geographic locations. There can be several score distinct lambdas. An important part of the product generation system is to combine data of different wavelengths to derive a physically meaningful variable, such as sea surface temperature (SST). Derived products of this sort can be viewed simply as extensions of the lambda vector.

Logical models of high-dimensional data can follow an agglutination principle, lumping all "coordinates" together into a single object, or a dispersion principle, spreading the information across objects of lower dimension. The only difference between these models is one of naming. Elements of agglutinated objects are referenced by subscripts; elements of dispersed objects are referenced by name. In the case of multispectral data, one can write {MSD[lambda][x][y][t]}, {MSD[x][y][t], lambda} and all the obvious permutations. The following definitions view measurement vectors as a type of coordinate but treat time as a separate quantity.

type 3matrix

This type is used for sets of variables defined over a two-dimensional surface. An example is the radiance at a collection of wavelengths recorded by a multispectral imaging system. This type would also be appropriate for many data products derived from image data. When the space grid is irregular, it can also hold the mapping from pixel coordinates to another system, e.g., latitude, longitude.

type 4matrix

This type is used for sets of variables defined throughout a volume of three-dimensional space. An example is the field variables of an assimilation model at a fixed instant of time.

Complex Types

The following conceptual types simplify the queries.

type lineage

This type describes the antecedents of a derived data product, including the precursor items and the algorithms joining them.

type map

This type describes a map projection. It has the information needed to recover the exact latitude and longitude of data stored in a matrix.

type orbit

This type describes the position of a satellite over time and the pointing directions of all its sensors. Orbit has at least enough information to provide the center (latitude and longitude) of a sensor's field-of-view, given the time, as well as the accuracy of the estimate.

Sounding Systems

Sounding systems, e.g., TOPEX/Poseidon, measure properties along a path rather than a swath. This type of data can be modeled in several ways. Each property can be saved as a time-series and the satellite orbit used to relate time to latitude and longitude. Alternatively, the ground track can be saved as an open polygon, with measurements defined at the vertices.

Time

The familiar problem of representing time appropriately is compounded with satellite data. The specifications of simultaneity and duration are affected by the motion of the satellite and, for scanners, the sweeping of the antenna over the earth. These two effects, described below, are ignored in the queries. The EOSDIS system will need to implement the appropriate SQL extension to handle them.

Scanning imagers gather data one pixel at a time (at each electromagnetic frequency) as the sensor sweeps perpendicular to the track and as the satellite's motion carries the scanner along track. Scanned data accumulate as an endless ribbon, but the ribbon is cut into convenient pages, usually with about the same number of pixels in both coordinates. With scanned data, no two pixels are simultaneous. For example, if the image size (on the Earth) is 2 degrees and the orbital period is 90 minutes, it takes 30 seconds (2/360 of an orbit) to accumulate each image.

Sensors that collect data in swaths eventually tile most of the entire Earth. For a satellite in polar orbit this takes at least 12 hours. At lesser inclinations the tiling can take a week.

Cartographic Data

Cartographic data are important to many of the queries. Vector cartography, e.g., continent outlines and political boundaries, are used for data selection and as base-maps for data display. Land elevation is important for the science of several scenarios, and the users assume the availability of a digital elevation model (DEM), a two-dimensional matrix of land elevations (or ocean depths) projected in some coordinate system.

ETOPO5 is a familiar DEM, available on CD-ROM from the U.S. Geological Survey. The resolution of ETOPO5 is 5 arc-minutes (roughly 10 km). DEMs with resolution in the tens of meters have been developed but are not so easily obtainable. New DEMs will be developed from EOS's ASTER data.

A4.3.2 Functions and Operators

The following functions and operators are introduced.

Display(raster [, other parameters]) returns null

This function shows a raster image on a graphics screen. It also accepts point and vector data, casting them to a raster for display.

Covers(polygon, polygon) returns float

This function returns the fraction of the first polygon that is covered by the second polygon. With a suitable casting function, the second variable can also be type raster.
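
Contains(polygon, point) returns boolean

This function returns true if the point lies inside the polygon. It appears throughout the scenario queries below; with suitable casting functions, the first argument can also be a rectangle or a named geographic region (e.g., Arizona).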

Slice(Nmatrix [, other parameters]) returns raster

This function returns a two-dimensional slice along one of the coordinates of the specified data matrix. For image data, this is the way to obtain the raster for a particular lambda. For geophysical fields in three-dimensional space, this is the way to get a field variable along a surface, e.g., sea surface temperature.

Cut(Nmatrix [, other parameters]) returns *matrix

This function extracts a sub-matrix of data from a larger matrix. The topologic dimension of the data is not changed. The "other parameters" define the limits along the coordinate axes.

The Cut() and Slice() functions are simple forms of netCDF's Hyperslab operator, which extracts an arbitrary subspace from an Nmatrix of data.
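
To illustrate how these functions chain, the following sketch extracts one field from a GCM data cube over a study region and displays it. It uses the gcmFields table defined in Section A4.3.3; the cut limits (lat1 .. lon2) and the slice parameters naming the variable and model level are hypothetical.

     select Display(Slice(Cut(G.values, lat1, lat2, lon1, lon2), temperature, level_10), ..)
         from gcmFields G
         where [temperature] in G.variables
             and G.time overlaps (t1, t2)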

Browse(3matrix [, other parameters]) returns raster

This forms a browse image from a multispectral image. The following algorithms are possible, and they can be chained:

  1. Smooth over lambda.
  2. For one lambda, reduce pixel resolution, e.g., from gray scale to black and white.
  3. Combine several lambda values to create a false-color image.
  4. For one lambda, reduce the number of pixels. There are two possibilities: select one pixel out of n (e.g., 1 of 9), or apply a Gaussian filter and then decimate.
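
Since the algorithms can be chained, the parameters of Browse() might simply name the steps to apply in order. A sketch, with hypothetical parameter names falseColor and decimate, that combines three bands into a false-color image (algorithm 3) and then keeps one pixel in nine (algorithm 4):

     select Display(Browse(Cut(M.values, ..), falseColor([lambda1, lambda2, lambda3]), decimate(3)), ..)
         from multiBandImage M
         where M.sensor = Landsat_TM
             and M.time overlaps (t1, t2)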

The ECS data pyramid indicates that "browse" data are to be computed as part of the product generation (push) processing. If a product is typically not used before the computation algorithm is updated, or if it is cheaper to re-compute the product rather than store it, then one should build the product on demand rather than pre-compute it. This eager/lazy approach to push processing is one aspect of optimizing the use of ECS resources. It moves push processing into the pull phase. It is especially attractive if computation is inexpensive compared to storage, or if algorithms change frequently.
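
If SQL-* supports SQL92-style views, the lazy alternative can be stated directly: define the browse product as a view, so the Browse() computation runs only when a query touches it. A minimal sketch, assuming a view mechanism and default Browse() parameters:

     create view browseImage as
         select Browse(M.values, ..) as values,
                M.sensor, M.nadir, M.boundary, M.time
             from multiBandImage M

A query against browseImage then reads exactly like one against a pre-computed table, and the system is free to materialize the view whenever storage proves cheaper than re-computation.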

Write() returns null

This function copies data from ECS to an external computing system.

In

This operator relates two attributes of type "vector" and is used to express set membership. The attribute on the right is usually the set of variables defining the "physics" axis of a data cube; the list on the left is the subset of those variables targeted by the query.

Example

     where [albedo] in M.variables
     and [streams, rainfall] in P.variables

Between and Overlaps

SQL operates on Universal Time (UTC) and displays either GMT or civil time. SQL supports time precise to years, months, days, hours, minutes, seconds, or fractions of seconds (down to picoseconds). Intervals are built by defining two time endpoints and again have a precision. Time can be in an interval, and one can ask if two intervals overlap.

Example

     and M.time between t1 and t2 

If interval denotes the span (t1, t2), then we can have

Example

     and M.time in interval

Or if M.time is also an interval, then we can have

Example

     and M.time overlaps interval

A4.3.3 Schema

Our scenario queries are based on the following schema. There are obvious similarities between the tables of the schema and the "levels" of the ECS product pyramid.

Point Data

table pointData (
/* This table illustrates point data. In general, numerous measurements
at a point are saved in a single data vector, so long as they are
simultaneous within some time precision and adjacent within some space
precision. */
values      vector,      /* the measurements                       */
variables   vector,      /* the independent physics variables      */
location    point,       /* latitude, longitude of the measurement */
time        time,        /* time of measurement                    */
duration    time         /* the measure of simultaneity            */
)

Image Data

table multiBandImage (
/* This table holds multiband data in image coordinates. It might also
be suitable to save multiband data after pixels have been mapped to a
geographic coordinate system. */
values     3matrix,      /* depending on lambda, x-pixel, y-pixel */
variables  vector,       /* the independent physics variables     */
grid       2grid,        /* the pixel variables                   */
platform   text[],       /* e.g., AM1, Landsat                    */
sensor     text[],       /* e.g., AVHRR, SSM/I, MISR              */
time       interval,     /* start and stop time of measurement    */
nadir      point,        /* latitude, longitude of image center   */
boundary   rectangle,    /* corners of a bounding box             */
area       float,        /* square km of bounding box             */
orbit      orbit         /* satellite and sensor navigation       */
/* a function will geo-locate images and pixels from "orbit" data */
)
 

SAR Data

table sarImage (
/* This table holds images formed by Synthetic Aperture Radar. It
appears in the scenario discussed in Section A4.4.5. */
/* attributes are not defined */
)

Sounding Data

table sounderProfile (
/* This table holds sounding data, such as altimeter measurements,
taken along the track. It appears in the scenario discussed in
Section A4.4.8. */
/* attributes are not defined */
)
 

Calculated Data Products

Several hundred products will be calculated from the remote sensing instruments. The following table holds those defined over the Earth's surface.

table mappedSurfaceProduct (
/* This table holds derived products defined on a standard space-time
grid. */
values      3matrix,     /* depending on field-name, latitude, longitude */
variables   vector,      /* physics variables, e.g., NDVI, SST, albedo   */
grid        2grid,       /* the independent space variables              */
sensor      text[],      /* e.g., AVHRR, SSM/I, MISR                     */
projection  map,         /* e.g., Mercator, 1:20M, ...                   */
time        interval,    /* start and stop time of tile measurement      */
paternity   lineage      /* all the up-stream products and algorithms    */
)

GCM Fields

table gcmFields (
/* This table holds scalar geophysical fields defined over three space
dimensions. They will be GCM model fields or assimilated fields. */
values      4matrix,     /* depending on field-name, lat., lon., alt.    */
variables   vector,      /* physics variables, e.g., u-wind, humidity    */
grid        3grid,       /* the independent space variables              */
authority   char[],      /* provider of the (assimilated) data           */
time        interval     /* start and stop time of tile measurement      */
)

A4.3.4 Representative Queries

Queries manipulate the database by invoking the appropriate data type functions to extract results. Following are some examples that illustrate this concept. The queries are written in a loose, pseudo-SQL to show the general style. Software developers would compose code like this, embedding it in a call interface of the application language. For routine research, Earth scientists would be presented with a graphical interface, which would accept the parameters of the queries but hide the details of grammar and syntax. However, many Earth scientists would find it convenient to learn the rudiments of the query language so they could execute simple ad hoc queries interactively.

As a brief tutorial, SQL has a data definition language for creating schemas, domains, tables, views, data constraints, and stored data procedures and triggers. It also has a data manipulation language for selecting, inserting, updating and deleting data. We are primarily concerned with data selection here. The basic SQL select statement has the syntax:

select <data>
    from <list of tables>
    where <predicates>

Each table name in the table list is typically followed by a short alias. So, for example, in the query below, mentioning multiBandImage M allows us to write M.nadir in the rest of the query rather than multiBandImage.nadir.

Example 1a

select  Display(Browse(Cut(M.values, ..), ..), ..)
    from multiBandImage M
    where M.sensor = Landsat_TM
        and Contains(fixed_5_degree_square, M.nadir)
        and M.time overlaps (t1, t2)
 

This query displays a sequence of browse images on the user's screen for a particular sensor and a specified time interval, provided the center of the image is inside a particular geographic region. The parameters of the Browse function will determine how the multiband data are collapsed onto a single raster.

Example 1b

select  Display(Browse(Cut(M.values, ..), ..), ..)
    from multiBandImage M
    where M.sensor = Landsat_TM
        and Covers(fixed_5_degree_square, M.boundary) > 0.75
        and M.time overlaps (t1, t2)

This query has a more precise where clause. Only images that cover 75% or more of the specified five-degree square are selected.

Example 2

select Write(Cut(M.*, ..), path_to_my_computer, ..)
    from multiBandImage M
    where M.sensor = Landsat_TM
        and rest-of-where-clause

This query writes image data to a file on the user's computer network. A direct database-to-database copy would be employed if there is a DBMS on one of the network hosts. It is assumed in this example that all pertinent attributes are contained in table M. Otherwise, additional tables will need to be joined.

Example 3

select Display(Cut(M.values, ..), ..)
       Display(Cut(P.values, ..), ..)
    from multiBandImage M, pointData P
    where M.sensor = Landsat_TM
        and Contains(M.boundary, P.location)
        and rest-of-where-clause

This query shows image data and point data that span the same part of the Earth.

Example 4

select Write(Cut(M.*, ..), path_to_my_computer, ..)
       Write(Cut(MS.*, ..), path_to_my_computer, ..) 
    from multiBandImage M, mappedSurfaceProduct MS
    where M.sensor = *
        and [M.OID] in MS.paternity 
        and [NDVI] in MS.variables 
        and MS.time overlaps (t1, t2) 
        and rest-of-where-clause

The intent of this query is to obtain all the antecedent data products of every NDVI product calculated over a certain space-time interval.

A4.4 Scenarios Cast into Queries

The types, functions, and schema defined above are used to express the intent (as we understand it) of the hard parts of the ECS User Scenarios. Scenario descriptions devote many steps to look-up operations in catalogues. These are not expected to be computationally demanding and so are generally ignored. However, many of these steps are included implicitly in the where clauses of the following examples.

A4.4.1 Scenario 7

This scenario describes a research program to process, at the user's institution, data spanning a 5-degree square of the U.S. recorded by several high-resolution sensors over a multi-week interval. Data with varying spectral, temporal, and spatial resolution will be selected by visual examination of their "browse" images. Full level 1 images will be copied to the user's facility and analyzed to obtain an optimum surface model. All geo-location will be done by the user. The easier steps of the scenario are omitted.

Example 1

select Display(Browse(Cut(M.values, ..), ..), ..) 
    from multiBandImage M
    where M.sensor = Landsat_TM
        and Contains(fixed_5_degree_square, M.nadir)
        and M.time overlaps (t1, t2)

The query displays a set of low-resolution pieces of Landsat imagery, which are screened for suitability. Topography is important in this scenario, so the browse images might need to be draped over a DEM. The browse function defines which bands or combination of bands to display. The browse image may have been pre-computed, as noted previously.
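
The draped display might be phrased with a hypothetical Drape() function that registers a browse raster onto an elevation surface; here DEM stands for a digital elevation model covering the same region (see Section A4.3.1):

     select Display(Drape(Browse(Cut(M.values, ..), ..), Cut(DEM, fixed_5_degree_square)), ..)
         from multiBandImage M
         where M.sensor = Landsat_TM
             and Contains(fixed_5_degree_square, M.nadir)
             and M.time overlaps (t1, t2)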

Example 2

select Write(Cut(M.*, ..), path_to_my_computer, ..) 
    from multiBandImage M
    where M.sensor = Landsat_TM
        and rest-of-where-clause

After visual examination, the cut region from a full-resolution Landsat image is copied to the user's computer system.

Example 3

select Display(Browse(Cut(M.values, ..), ..), ..)
    from multiBandImage M, multiBandImage M1
    where M.sensor = AVHRR_LAC
        and M1.sensor = Landsat_TM
        and M.time overlaps (M1.time.start -30 days, M1.time.start +30 days)
        and M1.time is chosen from Query 2

After examining the browse images returned from Query 1, a "best" image is selected. (As an intermediate step, the user might wish to show the actual data in all bands for this day, but that is an easy query). Query 3 shows data that are coincident in space and time recorded on another sensor system, the AVHRR Local Area Coverage channels. Browse images of the AVHRR data are reviewed for a month either side of the day the best Landsat image was recorded.
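
That easy intermediate query might look like the following sketch, where best_day is the date of the image chosen from the browse review and Slice() extracts the raster for one band at a time:

     select Display(Slice(M.values, lambda), ..)
         from multiBandImage M
         where M.sensor = Landsat_TM
             and Contains(fixed_5_degree_square, M.nadir)
             and M.time overlaps best_day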

Example 4

select Write(Cut(M.*, ..), path_to_my_computer, ..) 
    from multiBandImage M
    where M.sensor = AVHRR_LAC
        and M.time is chosen from Query 3

After examination of the browse images from Example 3, full-resolution AVHRR_LAC data, spanning the cut area for a number of consecutive days around the time of the "best" Landsat image, are copied to the user's institution.

Data from other sensing systems will be selected in a similar manner. The issue of tiling data from different images at different times needs further study. If the review of the browse images shows puzzling features, the user presumably will look first at the full-resolution data, then at the processing lineage. The ECS scenario assumes the user will obtain data in sensor coordinates. Part of the subsequent analysis will involve the "orbit" type to make a provisional geo-location. Detailed analysis may involve precise geo-location using targets on the Earth. Much of the analysis at the scientist's institution could presumably benefit from a DBMS approach, but this aspect of the work is not discussed.
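
The provisional geo-location step might rest on a hypothetical GeoLocate() function of the kind promised in the schema comment for multiBandImage, returning the latitude and longitude of a pixel (and the accuracy of the estimate) from the orbit data:

     select GeoLocate(M.orbit, M.time, x_pixel, y_pixel)
         from multiBandImage M
         where M.sensor = Landsat_TM
             and rest-of-where-clause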

A4.4.2 Scenario 8

This scenario involves the development of a monitoring system to watch continuously the 50 million 1-km squares of the Earth that might burn. Fires are tracked with respect to burn models (15-km or 50-km grid size). Model parameters include initial fuel loading and meteorology. This is a scale-up of research now underway for Africa.

The scenario as described emphasizes the work the investigator would do to become familiar with EOS products, product quality, and product generation algorithms. It is assumed this information will be available for online searching. The culmination of the scenario description is the submission of a standing order for level 1B and level 2 products. The hardest parts of the scenario -- development of the real-time system and then its routine operation -- take place on the user's computers, but this work is not described. The user will obtain some data in sensor coordinates and, thus, will utilize the orbit data type, at least, to geo-register them. Much of the analysis and operation at the user's institution could presumably benefit from a DBMS approach, but this aspect of the work is not discussed.
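
In DBMS terms, the standing order is naturally a trigger. A sketch, assuming SQL-* supports SQL3-style triggers and using a hypothetical sensor and study region:

     create trigger fireStandingOrder
         after insert on multiBandImage
         when (new.sensor = MODIS
               and Contains(fire_study_region, new.nadir))
         begin
             Write(Cut(new.values, ..), path_to_my_computer, ..)
         end

Each newly ingested image covering the study region would then be shipped to the user's facility without further requests.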

This is a complex scenario, and we have not tried to capture its essence with representative queries. It has a distinct push-processing flavor but may not be well enough developed to be amenable to simulation analysis.

A4.4.3 Scenario 10a

This scenario involves hydrologic modeling of watersheds for study areas ranging from 20 km on a side to 20 degrees on a side. The scenario illustrates the need to mix remote-sensing data with ground-based measurements and to merge EOSDIS data with the user's own parochial data.

Example 1

select Display(Browse(Cut(M.values, ..), ..), ..)
       Display(Cut(P.values, ..), ..)
    from mappedSurfaceProduct M, pointData P
    where M.sensor = MODIS 
        and [albedo] in M.variables  
        and [streams, rainfall] in P.variables  
        and Contains(M.boundary, P.location)
        and M.time overlaps (t1, t2)  
        and P.time in (t1, t2)  
        and rest-of-where-clause
 

This query is interesting because the user wants to discover the availability of raster, vector, and point data pertaining to variables that are important in hydrologic modeling. All data would be plotted over a base map of the study area. After review of the browse displays and other pertinent information, such as quality assessments, a specific list of products will be copied to the user's facility for analysis. Data at the highest spatial resolution and for a decade in time are needed. The EOSDIS data will be combined with the user's own data set to develop the hydrologic model. No details are given about this analysis. Much of the analysis and operation at the user's institution could presumably benefit from a DBMS approach, but this aspect of the work is not discussed.
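
The copy step that follows the review is not spelled out in the scenario, but by analogy with Example 2 of Scenario 7 it might look like the following sketch, the 10-year span reflecting the decade of data requested:

     select Write(Cut(M.*, ..), path_to_my_computer, ..)
            Write(Cut(P.*, ..), path_to_my_computer, ..)
         from mappedSurfaceProduct M, pointData P
         where [albedo] in M.variables
             and [streams, rainfall] in P.variables
             and Contains(M.boundary, P.location)
             and M.time overlaps (t1, t1 + 10 years)
             and P.time in (t1, t1 + 10 years)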

A4.4.4 Scenario 10b

This scenario is a study of cloud properties using EOS sensor data, assimilated atmospheric models, and in situ ARM measurements. The study area is of order 10 degrees on a side. Two aspects of the scenario are novel: the user already knows exactly which data are needed, and the volume to be copied is large.

The user is familiar with the available data (CERES products) and knows the specific products to be copied to his home computer network. Data at the highest resolution are required. Several GB will be copied each data-day from EOSDIS, for a total volume of approximately 1 TB/year. The data will be jointly analyzed with the in situ readings, the assimilated models, and the researcher's own cloud models. The hard parts of the scenario take place at the user's computing facility, not the EOS data system.

A4.4.5 Scenario 11a

This scenario, which has affinities to Scenario 10a, involves joint analysis of in situ data and Synthetic Aperture Radar (SAR) data for a 500 x 500 km region in the Arctic. The goal is to study the response of sea ice to changes in atmospheric pressure. It is notable that the query does not involve "assimilated" atmospheric pressure. The query features the need to display small parts of sequential images as a movie loop. Approximately 100 GB of data will be retrieved for the study.

Example 1

select Display(Cut(P.values, ..),..)
    from pointData P
    where [Buoys, IceCores, MetStations, etc.] in P.variables
        and Contains(fixed_500km_region, P.location)
        and P.time overlaps (t1, t2) 
        and rest-of-where-clause

This is a simple request to display point and vector data on a base map. Other display modes, such as contour plots, histograms, etc., would probably be found useful and could be invoked in a similar manner.

Example 2

select  Display(Browse(Cut(S.values, ..), ..), ..)
    from SarImage S
    where S.sensor = SAR
        and Covers(box_around_my_point, S.boundary) = 1.0
        and S.time overlaps (t1, t2)

Having found interesting spots from perusal of the results of Example 1, the user now examines 40 low-resolution images taken with a synthetic aperture radar (10 scenes, each with 4 bands) over a number of specific locations. The sarImage table defined in Section A4.3.3, with a structure similar to multiBandImage, holds these data.
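
The movie-loop display mentioned earlier might be phrased with a hypothetical Animate() function that plays the returned rasters in time order:

     select Animate(Browse(Cut(S.values, box_around_my_point), ..), ..)
         from SarImage S
         where S.sensor = SAR
             and Covers(box_around_my_point, S.boundary) = 1.0
             and S.time overlaps (t1, t2)
         order by S.time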

Example 3

select Write(Cut(S.*, ..), path_to_my_computer, ..)
       Write(Cut(P.*, ..), path_to_my_computer, ..)
    from SarImage S, pointData P
    where S.sensor = SAR
        and [Buoys, IceCores, etc.] in P.variables
        and S.time between t1 and t1 + 1 year
        and P.time between t1 and t1 + 1 year
        and rest-of-where-clause
select Write(Cut(M1.*, ..), path_to_my_computer, ..)
       Write(Cut(M2.*, ..), path_to_my_computer, ..)
    from mappedSurfaceProduct M1, modelOutput M2
    where [Ice_concentration] in M1.variables
        and [Ice_model_output] in M2.variables
        and M1.time between t1 and t1 + 1 year
        and M2.time between t1 and t1 + 1 year
        and rest-of-where-clause

Point data and SAR data covering the study area and lasting for 1 year are copied to the user's computer facility. A similar query is submitted to get SSM/I ice concentration maps and ice growth model outputs. The ice concentration is assumed to be in the mappedSurfaceProduct table. A new table modelOutput is introduced to hold the ice model output.

A4.4.6 Scenario 12

This scenario is a study of precipitation by a GCM researcher. The investigator wants to acquire assimilated (level 4) wind and temperature fields, and copy them to his own computing system for analysis.

Example 1

select Write(G.*, path_to_my_computer, ..)  
    from gcmFields G
    where [u-wind, v-wind, temperature] in G.variables
        and [1_degree_mercator] in G.map
        and G.grid = 20_levels
        and max(G.time) - min(G.time) > 30 days
        and G.timeStep between 6 hrs and 10 hrs
        and G.quality is TBD
        and G.statistics is TBD

It is not clear in the scenario whether the user will accept only data already on the specified space grid or wishes data to be interpolated onto that grid. Assimilated data sets will be rare enough that users desiring this type of data will know enough about them that the qualifiers will probably be superfluous.
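
If interpolation is wanted, it could be requested in the query itself rather than used to restrict the search; the following sketch assumes a hypothetical Regrid() function that resamples the fields onto the desired grid:

     select Write(Regrid(G.values, 1_degree_mercator, 20_levels), path_to_my_computer, ..)
         from gcmFields G
         where [u-wind, v-wind, temperature] in G.variables
             and G.time overlaps (t1, t2)
             and rest-of-where-clause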

The scenario assumes that data features (quality and statistics) are available to help winnow the selection. It is implied that these are stored quantities derived from prior analysis. Should they be computed on-the-fly? What intuition does the investigator bring to understanding what the features mean? Would it not be better for the investigator to design his own quality/statistics algorithms and include them as stored procedures?
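
The stored-procedure alternative might look like the following sketch, assuming SQL-* lets users register external functions; myQuality() and its threshold are the investigator's own:

     create function myQuality(4matrix) returns float
         as external  /* supplied by the investigator */

     select Write(G.*, path_to_my_computer, ..)
         from gcmFields G
         where [u-wind, v-wind, temperature] in G.variables
             and myQuality(G.values) > 0.9
             and rest-of-where-clause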

Example 2

select	Write(Cut(M.*, ..), path_to_my_computer, ..) 
    from multiBandImage M, gcmFields G
    where M.OID in G.paternity
        and M.time overlaps (G.time.start -15 days, G.time.end +15 days)

The investigator observes a feature from perusal of the coarse-scale assimilated data and wishes to tunnel into the database to recover all the antecedents of the level 4 data pertaining to a region covering the feature. As posed, the query is ambiguous: It even appears on the surface to be virtually impossible. Data at level 4 are the result of sophisticated analyses of multiple data sets, often involving in situ measurements on Earth. These data have probably been provided by several different investigators (NMC, ECMWF, and DAO) who will have different standards for maintaining an audit trail. Even if the underlying readings are available in the EOSDIS database, it is a daunting thought to track what may have happened to them throughout the task of forming the level 4 product.

Another important point is raised by this query. Level 1B data will have been calibrated for sensor response but not necessarily mapped onto a regular geographic coordinate system. To register these data, the user will require much of the cartographic software of the push system.

A4.4.7 Scenario 15

This scenario is a study of lightning over regions of the U.S. It features searching, plotting, and contouring point data. Sequences of plots will be run as a slide-show, and a way to navigate within the sequence of pictures is desired.

Example 1

select Display(P.values, ..)
    from  pointData P  
    where [Lightning] in P.variables
        and Contains(Arizona, P.location)
        and P.time overlaps (t1, t2)
        and rest-of-where-clause
 

A number of systems record the time and location of lightning strikes. The user wishes to make slide shows mapping the strike locations with full zoom and pan capability. This scenario can be implemented today with a DBMS and IDL. The scenario is pertinent to other kinds of point data.

A4.4.8 Scenario 16

This scenario is a study of the role of atmospheric forcing in the circulation of the southern ocean. It is an extrapolation of work underway using TOPEX/Poseidon altimeter data. The scenario has several novel features.

Example 1

select Display(Cut(S.values, ..), ..)  
    from sounderProfile S
    where S.sensor = TOPEX/Poseidon
        and Contains(SouthernOcean, S.nadir)
        and S.time overlaps (t1, t2)
        and rest-of-where-clause

This query displays the tracks of a satellite sounding system, as well as the fields measured by the system, in this case ocean elevation, wave height, etc.

Example 2

select Display(EOF(DeTide(S.values), ..), ..)  
    from sounderProfile S
    where S.sensor = TOPEX/Poseidon
        and Contains(SouthernOcean, S.nadir)
        and S.time overlaps (t1, t2)
        and rest-of-where-clause

Data that pass the previous query are analyzed at the user's computer system. The DeTide function removes the effect of ocean tides from the sea-surface elevation. The EOF function expands the corrected elevations into empirical orthogonal functions. It may be inappropriate to use an SQL-like language for what appears to be a job-control problem.

A4.4.9 Scenario 19

This scenario uses EOS and other data to develop models of carbon dioxide exchange between the air and ocean, and the transport of carbon dioxide within the ocean. A novel feature of the scenario is that the model will be installed at a DAAC and run remotely. This scenario can be better described with a data flow model.

Example 1

select  Display(Browse(Cut(M.values, ..), ..), ..)
    from multiBandImage M
    where M.sensor = MODIS
        and [CO2, photoChemical, etc.] in M.variables
        and Contains(area-of-Atlantic-ocean, M.nadir)
        and M.time overlaps (t1, t1 + 1 month)

Several kinds of raster data at several browse resolutions will be scanned in this manner to identify a few 5-degree squares that are candidates for continuous monitoring.

Example 2

select  Display(UserAlgorithm(M.values, ..), ..)
    from multiBandImage M
    where M.sensor = MODIS
        and [CO2, photoChemical, etc.] in M.variables
        and Contains(some-part-of-Atlantic, M.nadir)
        and M.time overlaps (t1, t2)

Carbon dioxide concentration is modeled using the full-resolution spectral data as well as wind and temperature fields. The scenario does not specify the source of these data.

A4.4.10 Scenario 23a

This scenario involves atmospheric chemistry. The researcher wants access to assimilated wind fields to model atmospheric transport of chemical species and compare model results to EOS measurements of species and ground-based meteorological measurements.

One notable feature of the scenario is the desire to pose queries from applications and receive results into the application. Another feature is the need to access data stored outside EOSDIS. Only the first part of the query is discussed.

Example 1 (Executed from an Application)

select G.uwind, G.vwind into a-data-structure
    from gcmFields G
    where G.authority = NMC
        and G.grid = SomeGrid
        and G.time overlaps (t1, t2)
        and rest-of-where-clause

The crux of the query is easy uploading of selected data into the calling application (a-data-structure). If more than one data element is returned, a cursor (iterator) is defined, and the application program makes an open-fetch-fetch-...-fetch-close sequence of calls on the DBMS to get each data item. This approach is modeled on the SAIC Generic Database Interface. It would be convenient if GUI query tools let the scientist fine-tune the query parameters interactively.
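
The call sequence might look like the following embedded-SQL sketch, where :uwind and :vwind are host variables of the calling application; the exact binding syntax would depend on the call interface:

     declare windData cursor for
         select G.uwind, G.vwind
             from gcmFields G
             where G.authority = NMC
                 and G.grid = SomeGrid
                 and G.time overlaps (t1, t2)

     open windData
     fetch windData into :uwind, :vwind    /* repeated once per row */
     close windData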

Wind is not the only assimilated field of interest, and NMC is not the only possible data producer (DAO and ECMWF are also of interest). To obtain NMC data may involve the re-direction of the query from EOSDIS to a NOAA data system.

A4.4.11 Scenario 23b

This scenario involves downloading EOSDIS data to develop a new algorithm to estimate precipitation.

Example 1

select Display(Browse(Cut(M1.values, ..), ..), ..)
       Display(Browse(Cut(M2.values, ..), ..), ..)
    from multiBandImage M1, multiBandImage M2
    where M1.sensor = AVHRR
        and M2.sensor = SSM/I
        and abs(M1.time.start - M2.time.start) < 1 day  
        and Covers( Circle(M1.nadir, 1 degree),
                    Circle(M2.nadir, 1 degree) ) = 1.0

The user wishes to scan low-resolution images from several instrument systems that have viewed the same patch of the Earth at nearly the same time. The scenario asks for three coincident images, but the query has been simplified to find two.

Example 2

select  Display(Cut(P.values, ..), ..) 
    from pointData P  
    where Contains(Query-1-polygon, P.location)
        and [Precipitation] in P.variables
        and Precipitation > 1"/day
        and P.time overlaps (t1, t2)
        and rest-of-where-clause

Ground-truth data, which are required for testing algorithms, will be obtained for the time-space interval chosen from scanning the images returned by the first query. It is assumed that many of the pertinent precipitation measurements will be saved on external (non-EOSDIS) systems.

Example 3

select      Write(Cut(M.*, ..), path_to_my_computer, ..) 
            Write(Cut(P.*, ..), path_to_my_computer, ..)
    from multiBandImage M, pointData P
    where M.sensor = Query-1-sensors
        and P.location = Query-2-locations
        and M.time overlaps Query-1-time
        and P.time overlaps Query-2-time
        and rest-of-where-clause

This query copies pertinent image data and precipitation data to the user's computing facility.

A4.4.12 Scenario 24

The objective of this scenario is to develop a 6-year time-series of chlorophyll in the global ocean, obtained by blending all available data. The procedural steps required to blend images are not described. Only the first part of the scenario is discussed.

Example 1

select count(M.OID)
    from multiBandImage M
    where M.time between t1 and t1 + 6 years
        and M.grid has better than 1 km resolution
        and [radiance] in M.variables
        and [known list of sensors] in M.sensors

The purpose of this query is to see what data are available. It is not clear from the rest of the scenario how these level 1 data are utilized.

Example 2

select Display(Browse(M.values, ..), ..) 
    from multiBandImage M
    where Contains(Pacific-Ocean, M.boundary)
        and [MODIS, MERIS] in M.sensors
        and M.time between t1 and t1 + 6 years
        and rest-of-where-clause

Browse images of pertinent data are viewed. The separate images would need to be tiled to show their relationship to the Pacific Basin. There might be 50 tiles each day.

After viewing the browse images, a "time-series analysis" is performed and the results "sequentially examined." Then, equivalent data from the mappedSurfaceProduct table for each sensor are "analyzed" and then "blended."

It is assumed that the first result from analysis and blending of the mapped data has errors. The source of these is sought, the algorithms changed, and the processing repeated.

At the end of the development cycle, a reliable algorithm is running at the user's institution. Each month a new set of chlorophyll data are downloaded and analyzed.
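
The routine monthly download might reduce to a query of the following form, run with t1 advanced each month; the chlorophyll variable name is an assumption:

     select Write(Cut(M.*, ..), path_to_my_computer, ..)
         from mappedSurfaceProduct M
         where [chlorophyll] in M.variables
             and M.time overlaps (t1, t1 + 1 month)
             and rest-of-where-clause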