NSF APROPOS Workshop: Asilomar, CA, 15-17 December 1997

Instrumentation for Physical Oceanography: the last two decades and beyond

Charles C. Eriksen
School of Oceanography, University of Washington
Box 357940, Seattle WA 98195-7940
email: [email protected]

Abstract

Instrumentation for physical oceanography has evolved over the last two decades to take advantage of major advances outside oceanography in computing, communication, and navigation technology fueled by economic pressure. While ship-based techniques remain vital to development, distributed networks of autonomous platforms will take an ever larger role in ocean observations. Satellites will continue both to measure the ocean surface and to provide communication links to instruments probing the ocean interior. Progress in meeting the daunting challenge of sampling ocean variability will be made through development of small, cheap, autonomous instruments that can be used in large numbers.

1. Introduction

Measuring physical variables in the ocean remains difficult because ocean basins are large compared to the scales over which important processes within them occur and because the ocean is a technically challenging environment for instruments. Historically and at present, most observations of the ocean have relied on ships. Satellite remote sensing techniques remain somewhat “remote” in the sense that they are limited to observing the sea surface. Acoustic techniques, while able to probe the velocity and temperature structure in significant ways, remain limited by somewhat particular sampling strategies. The resolution of physical observations of the ocean commonly remains very coarse. Access to the ocean remains an enormous problem, yet instrument developments in the past couple of decades have made significant progress. While research vessels will continue to be vital to physical oceanography, the outlook is that an ever larger share of oceanic observations will be made with diminishing dependence on them.
The point of reference for this contribution is the review prepared by Baker (1981) as part of the celebration of Henry Stommel’s sixtieth birthday. Stommel himself authored a paper entitled, “Why do our ideas about the ocean circulation have such a peculiarly dream-like quality” (Stommel, 1954), which has the even more apt alternate title for the subject at hand, “Examples of types of observations that are badly needed to test oceanographic theories.” At the time of Stommel’s essay, most of what was known about oceanographic phenomena had been gathered painstakingly on long trips to sea aboard research vessels. These vessels typically made transects consisting of a sequence of wire casts made at roughly fixed geographic positions. Sections had been collected along various tracks around the world, mostly in regions proximate to the nations sponsoring the research. The nature of physical oceanographic observation remains largely the same, for we remain unable to measure everywhere, simultaneously and eternally. Radical change to this state of affairs is unlikely without divine intervention. Historically, the underlying premise of most studies has been the simplest assumption that could be made in the face of ignorance: that the ocean is in a steady state. When sufficient resources made repeat observation possible, of course the ocean was found to vary. Recently, a meteorological colleague complained to this author that “Mr. WOCE” had changed his tune completely from wanting to determine the mean circulation of the ocean to investigating its variability. The surprise should be that the meteorologist expected otherwise. We do what we can afford, proceeding (hopefully) from greater to lesser ignorance while assumptions about the state of the ocean become more sophisticated. Oceanography remains a field where phenomenology inspires theory. The development of observing techniques almost inevitably leads to new insights into how the ocean operates.
In a discussion of the relationship between meteorology and oceanography, Stommel (1955) remarked that determination of climatological conditions in the atmosphere by means analogous to those used by oceanographers would be to use “half a dozen automobiles and kites to which air sounding instruments were attached and by doing all of their work on dark moonless nights when they couldn’t see what was happening in their medium.” Oceanographic instrumentation has progressed a long way since, but we are still limited to relatively few platforms from which to probe the ocean interior. We’ve learned to use much more than the analog of kites (wire casts). We launch profilers, surface drifters, and subsurface drifting floats, probe the ocean with sound, look down on it from space, and even launch autonomous vehicles. But we still rely heavily on the analog of half a dozen automobiles (research ships). They are rarely comfortable and sometimes ineffective platforms (Figure 1).

Figure 1. View of fantail, R/V Melville, during passage of tropical cyclone Reva near Tahiti, March 1983. Oceanographic observations were interrupted, among other activities.

Meteorology has long benefitted from a global network of observing stations for which there is no oceanographic counterpart. The meteorological network exists largely because of the needs of civil aviation: soundings are most routinely done at the world’s airports. Oceanographers largely do not have the benefit of commerce driving the need for observations. However, the desire to understand climate has fueled the establishment of the beginnings of regular monitoring of the ocean interior (examples are the NOAA Tropical Atmosphere Ocean (TAO) array and associated Volunteer Observing Ship (VOS) profile measurements). The use of drifting or deliberately steered instruments, particularly profilers, has made the prospect of regular monitoring of the ocean more realistic.
As a NOAA program manager recently remarked, “Whatever the measurement is, we’ll just assimilate it.” The danger that measurements will be lost into model-dependent assimilated fields is real but inevitable. This article continues by summarizing the state of instrumentation for physical oceanography at about 1980. It then examines the technological changes outside oceanography that have influenced our field in the intervening nearly two decades. A selection of instrument developments that have had significant impacts on oceanography is given next. The last section comments on progress in instrumentation and offers opinions on future developments.

2. A Summary of Instrumentation circa 1980

Baker (1981) noted the contention that oceanographic instruments should be simple and reliable in order to be of most use. This philosophy is not always followed, although it is fair to say that simple and reliable means are also the most affordable and most likely to get wide use. He also pointed out the wisdom of the Norwegian model of ship use: that operating medium-sized vessels locally at great efficiency is more productive than operating one or two large vessels on global-scale expeditions. Much has been learned using both regional and global exploration models, although the advent of large oceanographic programs (e.g. the World Ocean Circulation Experiment (WOCE) and the Joint Global Ocean Flux Study (JGOFS)) has left the U.S. oceanographic fleet with several large vessels and relatively few medium-sized vessels (Class 1 vs. Class 2 vessels in University-National Oceanographic Laboratory System (UNOLS) nomenclature). The big developments in oceanographic instrumentation previous to 1980 came both from the involvement of engineers dedicated to instrument development and from commercial developments beyond the narrow confines of the oceanographic world.
The principal advances were in electronics, where COSMOS (complementary-symmetry metal-oxide semiconductor) circuitry made possible the application of low-power logic to oceanographic measurements. Physical oceanographers luckily can make most relevant measurements electrically, in contrast to the plight of our colleagues in the other branches of oceanography who must gather water and bring it into a laboratory for analysis. While this state of affairs is changing, physical oceanographers have benefitted enormously from this advantage. The advent of low-power electronics and high energy density batteries has made possible unattended operation of oceanographic instruments for up to several years. Lithium batteries were quite new in 1980 and subject to rapid degassing and explosion, but allowed instruments to be deployed for much longer than previously possible. Data storage was routinely made on magnetic tape, in contrast to earlier instruments that relied on paper tape, photo-sensitive tape or even etching on smoked glass. Low-power electronics allowed modest data manipulation before storage (e.g. vector-averaging of current components by a current meter). To put these developments in context, oceanographers were just starting to use pocket calculators in 1972 and by 1980 some were “playing” with the new “toy” personal computers. While the same COSMOS technology is the backbone of current oceanographic instrumentation, data storage and on-board computation have changed dramatically. Baker’s (1981) first examples of modern instrumentation are deep-sea moorings and current meters. He included pictures of the Vector Averaging Current Meter, the Vector Measuring Current Meter, and the Aanderaa RCM-4 current meter. All three remain in use today, some without significant changes. Metal sphere, glass ball and syntactic foam floats all were in frequent use two decades ago, as were acoustic releases.
These have changed relatively little, with the exception of releases, which have become electronically more sophisticated and more reliable. The basic practice of preparing, shipping, deploying from a ship, recovering, shipping again, and debriefing in situ instruments was largely worked out by 1980 and continues with modest modifications. Neutrally buoyant floats, ballasted to ride on particular pressure surfaces, were a relatively new development in 1980. The major advance was to use the SOFAR (SOund Fixing And Ranging) channel to track these floats once launched from ships. Moored autonomous listening stations were used in modest numbers to obviate the use of shore-connected hydrophones. The “spaghetti diagram” (float tracks through mesoscale eddy fields) had become a familiar fixture of open-ocean studies. Expendable probes for measuring temperature profiles became common well before 1980, largely due to military use. The most common of these probes, the ship-launched XBT, is produced in a variety of models suited to different cruising speeds and depth extents. The ease of use and reliability of XBTs made them a favorite for making underway surveys of temperature from military, volunteer observing, and research vessels alike. By about 1980, equipment for automatic digitization of the profiles was available. Baker (1981) reported that the scientific community alone used 65,000 XBTs annually. The aircraft-deployable version, the AXBT, telemeters its profile from a self-scuttling surface buoy and is still most commonly used by military aircraft monitoring submarine operations. The practice of making traditional wire cast hydrographic stations using Nansen bottles and reversing thermometers had largely disappeared by 1980 in favor of using CTD (conductivity-temperature-depth) probes together with rosette bottle samplers.
These systems, where data are transmitted via conducting cable to a recording unit in the ship laboratory, have been the stock-in-trade of modern hydrographers since their acceptance. Together with laboratory salinometers, they were shown to be more accurate than reversing thermometers and chemical titration of seawater samples in determining temperature, salinity and density. Casts require a ship to stop for several hours to complete a full ocean depth cast, since lowering rates are limited by wire and sensor package hydrodynamics. By 1980, the use of free-fall instruments to make profile measurements of temperature, salinity, horizontal current, and turbulent dissipation had become relatively common. These probes are meant to act as drogues to horizontal current so that, by tracking them acoustically or by measuring voltages induced by motion through the geomagnetic field, a profile of velocity can be obtained. Acoustic tracking involves setting a transponder network on the sea floor and surveying it. The electromagnetic method alone gives shear rather than absolute velocity profiles. Free-fall instruments proved particularly useful in making fine and microstructure measurements because of isolation from motions of the sea surface. Airfoil-type shear probes were just coming into use by 1980. A single deep-ocean profile with a free-fall instrument typically takes a few hours, since typical fall rates are less than 1 m/s and a ship must maneuver to pick it up after it returns to the sea surface.

Baker (1981) noted a number of instruments then under development. The first mentioned are deep sea pressure gauges, meant to be able to monitor long term absolute pressure from the sea floor. Their biggest problem was long term drift at levels above the few parts per million stability needed to measure oceanic variability on subinertial time scales. Himself a proponent of these instruments, Baker was hopeful that a solution was forthcoming.
He also mentioned bottom-mounted electric field recorders as a promising method for measuring roughly the depth-averaged horizontal flow. Baker quite correctly pointed to satellite-based techniques as promising for oceanography. Satellite-tracked surface drifters had seen some development, but there remained questions about how well they tagged water parcels, even when their drogues were verified to still be attached. Sea surface temperature images derived from satellite radiometers were still rare in 1980 but provocative for the spatial structure they revealed. SEASAT was launched in 1978 but unfortunately suffered a massive electrical short circuit less than 4 months into its mission. It carried a radar altimeter, a scatterometer, and a synthetic aperture radar in addition to radiometers. One rumor prevalent in the oceanographic community at the time of SEASAT’s demise was that military interests deliberately ordered the satellite disabled because its synthetic aperture radar allowed global tracking of ships at sea. Despite its short mission, SEASAT succeeded in demonstrating the potential contribution of satellites to oceanography, although proponents of these techniques would have to wait a decade or more to see some of them utilized again. Two particular acoustic techniques were being developed by 1980 for application to physical oceanography: Doppler sonar and tomography. Sonar relied on scattering off biota to estimate currents, assuming organisms either did not move relative to water or at least moved little, but randomly with zero mean, so that time and space averages could be used effectively. Tomographic inference of the ocean’s temperature structure was promoted as a means of attaining a desired spatial resolution based geometrically rather than linearly on the number of sensors: the number of source-receiver paths sampling a region grows as the product of the numbers of sources and receivers.

3. Developments outside oceanography

Oceanography is a very small concern in societal terms.
As such it is reliant on developments in the commercial and military worlds at large for new technology. Most of what oceanographers and ocean engineers do to develop instrumentation is to adopt and adapt the techniques and products developed in other sectors of the economy. The development of low-power electronics and high density batteries provided an enormous boost to oceanographic instrumentation in the 1970’s. The advances that have most changed oceanography since 1980 are those in computing, communication, and navigation technology. Oceanographers cannot take credit for but only advantage of them. Computation previous to 1980 was largely done on main-frame machines operated by centralized computing facilities. The era of the punch card was just ending, and being able to log on to a computer via a telephone line was a comparatively new luxury. Most oceanographers now have on their desk computers that are orders of magnitude more powerful in all respects than what was available as a shared resource for an entire university two decades ago. The change in computational capabilities has been incorporated into oceanographic instrumentation with some time lag. For example, the microprocessors now being used in what oceanographers think of as fairly sophisticated instruments are primitive versions of what presently is installed in personal computers. Magnetic tape has given way to solid-state memory as a reliable, fast, low-power method of storing data in instruments. The personal computer industry continues to drive development of ever smaller, faster, lower power components. The camera industry is responsible for miniature memory cards that are enormously useful in oceanographic instruments. A common drawback of laptop computers is that as yet they cannot be operated for the duration of a transcontinental airplane flight. Commercial pressure to solve such problems will certainly bring advances of benefit to oceanographic instrument designers.
Before 1980, the entirety of the daily communication an oceanographic ship had with the shore was the transmission of a noon position, the fuel tank level, some meteorological conditions, and a few words about current operations. Communication satellites have rapidly eroded the solitude of going to sea. Already by 1980, the Applications Technology Satellites, long past their projected lifetimes, were providing voice transmissions and soon afterward data. Anyone with a receiver could listen as ship crew and scientists engaged in conversations of a deeply personal nature (the news of everything from births to divorces was communicated). This somewhat primitive system was abandoned in the late 1980’s in favor of INMARSAT telephones. Most oceanographic ships now have the equivalent of a telephone booth installed on them, just as in a shopping mall. The bandwidth is large enough to transfer files relatively quickly, but at significant cost. Ship operators routinely exchange data files for a few minutes daily. The cost is significant, to a degree that nonessential messages are discouraged. (The respondent to this paper asked that a draft manuscript not be transmitted to him at sea because of cost considerations, or at least so was the claim.) Communications can realistically be expected to become faster, less expensive, and more global. Large corporations are racing to capture the global satellite cellular telephone market (the first promises operation by September, 1998). High bandwidth mobile Internet/World Wide Web connections are planned by consortia of computer software, telecommunications, and aerospace concerns for the early part of the next century. Oceanographers are left only to think of ways to take advantage of the enormous communication infrastructure that is now being developed. The days of blissful isolation at sea are long over. By 1980, navigation of oceanographic vessels relied, at least in part, on fixes provided by Transit satellites.
These fixes typically took 20 minutes or so to acquire, at the end of which a position about that old was reported. While vastly more convenient and accurate than celestial navigation, the effective errors in position were still of order a km or more. The establishment of the Global Positioning System (GPS) by the U.S. Air Force has been of enormous benefit to oceanographers, not only operationally (equipment can be launched at locations known to a few m accuracy), but scientifically (a few meter position accuracy means that ship speed can be estimated to within a few cm/s over a few minutes). Developed as a means of correcting the course of intercontinental ballistic missiles, GPS has become a navigation standard. A small industry has developed around correcting fixes ex post facto for deliberately introduced noise. Receivers have been manufactured in such quantity as to have made them affordable family gifts to soldiers in the 1991 Gulf War. GPS receivers are sufficiently accurate that they can detect ship motion at sea and be used in pairs to determine the absolute heading of a vessel more accurately than can a gyrocompass. They are so small, light, and inexpensive that a handlebar-mounted model is being advertised as a holiday gift for bicycle enthusiasts.

4. Changes in instrumentation since 1980

Physical oceanographic instrumentation falls into five broad categories: measurement systems tied to ships, those anchored to the sea floor, those propelled by ocean currents, those based in space or on land, and those that are autonomously propelled. Lagrangian instruments (drifters and floats) generally depend on a ship (in some cases an aircraft) for transit to a study region. Moored instruments likewise require ships for transit. The following sections summarize developments in each of these categories.
a. Anchored measurements

Much of the temporal structure of the ocean has been studied with instruments anchored to the sea floor recording physical variables at regular intervals over durations (most profitably) long compared to time scales of variability. The virtue of such measurements is that they describe purely temporal variability. Their drawback is that they are poor at resolving spatial variations unless they are sufficiently numerous.

MOORINGS

Deep-sea moorings have changed little since 1980. The anchors and floats are the same, with either jacketed wire rope or braided nylon connecting them. Acoustic releases have become smaller and acoustic codes more secure with the advent of digital electronics, improvements offered by commercial manufacturers. Surface moorings have been developed that can endure heavy seas for long periods while making meteorological measurements (Rudnick et al, 1997). The desire to communicate in situ measurements to the surface has driven the development of inductively coupling signals from instruments into standard wire rope using seawater for the return electrical path (D. Frye, personal communication, 1996). Subsurface moorings have routinely been set for periods as long as two years, while wear on mooring components due to surface wave action has limited the reliable life of surface moorings to 6 months. Regions of the ocean with moderate currents extending in the same sense over substantial fractions of the water column remain challenging environments for moored measurements. Counteracting hydrodynamic drag with buoyancy remains the principal problem. Subsurface moorings have successfully been placed in such regions (under the Gulf Stream, for example), but drag tends to knock the moorings over, carrying instruments fixed to the mooring to greater depths. The problem of interpreting time series gathered from arbitrarily changing depths has been the subject of a number of papers (e.g.
Hogg, 1991). Surface moorings open the possibility of communicating data to shore either directly or through a satellite communication link. Near shore, HF-band radio communication is often used. In the open ocean, the workhorse of this communication to date has been ARGOS, a reliable but somewhat narrow-band one-way system based usually on 3 polar-orbiting NOAA satellites. Typical ARGOS data messages are a few hundred characters per satellite pass, with several passes per day over a given location. The most prominent example of such a use is the TAO array of about 70 moorings in the tropical Pacific. Moorings typically carry recording instruments at a number of depths. Mechanical current meter models used two decades ago are still in use, although not all are currently in production. They are normally inserted between sections of wire rope or nylon and carry the tension of the mooring. Moored temperature measurements have been made easier by the introduction of small self-contained temperature recorders that clip on to mooring wire. Although they do not carry mooring tension, they carry the drawback that their position on the mooring wire is not fixed as rigidly, and they have been known to slip. One of the major changes in moored instrumentation has been the measurement of electrical conductivity, hence salinity. Both electrode and inductive sensors have been used successfully to measure salinity over long periods on moorings. In addition, a number of optical measurements are now routinely made from surface moorings. Fouling continues to be a problem for mechanical and optical measurements in the upper ocean, since sensors are not calibrated for the slow growth of algae or barnacles (Figure 2).

Figure 2. VMCM current sensors fouled with barnacles after 6 months at 5 m depth in the central Arabian Sea, despite generous use of antifoulant agents prior to deployment (photo courtesy R. Weller).
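The conversion from a moored conductivity measurement to salinity is ultimately a polynomial evaluation. A minimal sketch under strong assumptions: it uses only the leading PSS-78 coefficients and is valid solely for a sample at 15 °C and atmospheric pressure, whereas a real sensor processing chain applies the full temperature and pressure corrections; the function name is illustrative.

```python
# Practical salinity from conductivity ratio R at T = 15 degC, p = 0 dbar,
# using the PSS-78 leading polynomial (temperature/pressure terms omitted).
PSS78_A = [0.0080, -0.1692, 25.3851, 14.0941, -7.0261, 2.7081]

def salinity_15C(R):
    """R is the ratio of sample conductivity to that of standard seawater."""
    return sum(a * R ** (i / 2.0) for i, a in enumerate(PSS78_A))

# By construction, standard seawater (R = 1) yields S = 35 exactly,
# since the six coefficients sum to 35.
print(round(salinity_15C(1.0), 4))  # -> 35.0
```

The R = 1 check is the conventional sanity test for any salinity code; drift in a moored conductivity cell appears directly as drift in R, which is why unrecovered sensors are so hard to validate.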
The biggest change in moored instrumentation since 1980 has been the introduction of acoustic Doppler current profilers (ADCPs). These gate the return of scattered acoustic pulses from three or more beams to form a profile of horizontal current (with the assumption that currents are identical at common depths, independent of range along the beams, which are typically inclined 30° from the vertical). Because of the need for scatterers, these operate best in the upper ocean, although near bottom applications have also been successful. Range and resolution trade against one another with acoustic frequency. Typical ranges are a few hundred meters. Several models are available commercially. ADCPs have been used in a variety of configurations, although the measurement can be compromised indirectly by the presence of mooring elements in the depth range of the profile: fish tend to be attracted to a mooring in the upper ocean and their schooling behavior causes swimming speed to contaminate current records (Plimpton et al, 1997). For this reason, ADCP measurements are made on separate moorings in the TAO array. The fishing industry refers to moorings as FADs (fish aggregation devices) (Figure 3).

Figure 3. (left) ADCP mounted looking downward from a surface buoy after 9.5 months in the northeast Pacific. (right) Macroscopic acoustic scatterers schooling around mooring line being hauled during recovery of a surface mooring after 6 months in the Arabian Sea.

Profiles of temperature, salinity, and current can be obtained from moorings by measuring from a moving instrument package. Baker (1981) mentioned the Cyclesonde and the Draper Laboratory Profiling Current Meter, both of which were designed to operate in the upper ocean, employing variable buoyancy to move vertically. A deep ocean instrument that crawls along mooring wire is under development to extend the profiling depth range from near surface to near bottom (D. Frye, personal communication).
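For an opposing pair of ADCP beams (the "Janus" configuration), the conversion from along-beam velocities to horizontal and vertical current is simple trigonometry. A minimal sketch, assuming a 30° inclination from the vertical and a sign convention of radial velocity positive toward the instrument; the function name is illustrative, and real instruments combine three or four beams and apply heading, pitch, and roll corrections.

```python
import math

def janus_pair(b1, b2, theta_deg=30.0):
    """Horizontal and vertical velocity (m/s) from along-beam velocities
    b1, b2 of two opposing beams inclined theta_deg from the vertical.
    Assumes the current is uniform across the beam pair at a common depth."""
    theta = math.radians(theta_deg)
    u = (b1 - b2) / (2.0 * math.sin(theta))  # horizontal component in the beam plane
    w = (b1 + b2) / (2.0 * math.cos(theta))  # vertical component
    return u, w

# A purely horizontal 0.5 m/s current projects as +/- 0.25 m/s on the two beams:
u, w = janus_pair(0.25, -0.25)
print(round(u, 3), round(w, 3))  # -> 0.5 0.0
```

The differencing is what makes the horizontal-uniformity assumption essential: any real difference in current between the two beam footprints is aliased directly into u.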
The principal drawback of moorings is their cost. By their very nature, they are massive, so that it is not uncommon for several tons of materiel to be shipped, loaded, launched, recovered, and reshipped again for an investigation using moorings. A modest crew is needed in most cases to prepare moorings and carry out their deployment and recoveries. Labor costs make the use of mooring equipment as expensive as, or more expensive than, its capital cost. Depending on mooring complexity and remoteness of the site, costs range from one to several hundred thousand U.S. dollars per deep sea mooring, exclusive of ship time. Expendable subsurface moorings have been developed recently in an attempt to reduce costs, although their components are tailored to long-term volume flux investigations where daily averages are all that are required. Since moored measurements are costly, observational programs rarely manage to have many of them at one time. While moorings do well at detecting temporal variability, they do very poorly at detecting horizontal variations, if only because of limited numbers. Coherent arrays in the open ocean with more than a few moorings are rare for academic studies. Because of this, moored arrays often are unable to resolve horizontal gradients well enough to estimate horizontal advection. In the now nearly completed field work of WOCE, moorings were arrayed mostly one dimensionally across boundary currents in an attempt to measure volume transport. Some moored sensors are sufficiently fragile that redundant sets are employed on the same mooring in order to assure data return (Figure 4). This approach, while cheaper than fielding additional moorings, serves to further concentrate resources at the cost of obtaining more spatial information.
Just as moored measurements appear to be becoming less prevalent in physical oceanographic studies, our oceanographic colleagues are learning to make unattended sampling devices suitable for moorings for chemical, biological, and geological studies. Oxygen and nitrate sensor packages are under development, as are devices to take certain trace metal samples. Acoustic and optical measurement packages have been developed to measure such things as plankton abundance and chlorophyll concentration. Sediment traps have become a fixture for carbon flux studies.

BOTTOM-MOUNTED INSTRUMENTS

Bottom-mounted instruments have the advantage that they are often much simpler and more compact than full moorings, hence less costly. A number of instruments of this kind have been developed since 1980, a selection of which are mentioned here. They are suited to measurement of processes confined near the sea floor, to integral measures of the water above, and to acoustic profiling. Examples of these are the bottom stress tripod BASS (Williams et al, 1987), the inverted echo sounder (IES) (Watts and Rossby, 1977), the bottom pressure gauge, the electric field sensor (Filloux, 1987) and the ADCP. The BASS unit is in effect a compact rigid moored array suited to measuring boundary layer properties. Because of its complexity, multiple units are not in use. Multiple copies of the other examples given here are in use, making possible horizontal resolution of various oceanic processes. The IES measures acoustic travel time from the bottom to the ocean surface. Its appeal, that it measures an integral quantity simply and reliably, is also its drawback for interpreting physical processes.

Figure 4. A surface buoy carrying redundant meteorological sensors and recording systems in a western tropical Pacific squall (the buoy is not manned continuously).
IES users generally use CTD data to draw an empirical calibration of travel time against some hydrographic parameter, such as dynamic height, the depth of a particular isotherm, or a baroclinic mode amplitude (Hallock, 1987; Pickart and Watts, 1990; Trivers and Wimbush, 1994). Due to its simplicity, it can be used in substantial numbers so that a degree of horizontal resolution can be obtained; arrays of 10 or more are not uncommon. The drawback is that different physical processes can give the same travel time signal. IES users usually argue that a single mode of variability accounts for most variance in a given set of observations. The problem of sensor drift for deep pressure gauges has yet to be solved. The best attempts have come from simply accepting the drift and trying to predict it for removal in data processing (Watts and Kontoyiannis, 1990). The appeal of bottom pressure is its relationship to depth-integrated geostrophic flow. A prominent use of bottom pressure measurements is in concert with IESs or electric field sensors. The combination with acoustic travel time allows the distinction between steric and dynamic changes to some degree. Horizontal electric field measurements made at the sea floor are proportional to the conductivity-weighted vertical average of horizontal flow (Luther et al, 1991). Filloux (1987) developed bottom-mounted electric field recorders that have become useful in monitoring the otherwise elusive depth-averaged oceanic flow. By contrast, the use of a mooring laden with current meters from surface to bottom appears a brute force approach to measuring this quantity. The combination of electric field recorders with IES and bottom pressure gauges has been a fruitful means of measuring the variation of flow fields with relatively simple vertical structure (or at least assumed so). Cables along the sea bed have been another source of oceanographic measurements.
In contrast to the above examples, they are much more expensive to install and maintain. Abandoned telephone cables have been useful for inferring transport of currents between land stations (Larsen and Sanford, 1985). Bottom hydrophones on cables are useful for acoustic tomography studies (Worcester et al, 1991). The main advantages of active cables are their ability to carry power and data (Chave et al, 1990). Physical oceanographers have been shy to embrace their use, largely because of cost.

b. Lagrangian platforms

The last two decades have seen substantial progress in making current measurements by quasi-Lagrangian means. The single most important enabling technology has been the development of long range communication. The merits of acoustic tracking via the SOFAR channel were well established before 1980, but the technique relied on large neutrally buoyant floats to transmit sufficiently powerful pulses to be heard at remote listening stations. There was no practical means to track surface drifters at remote locations without the use of ships or aircraft. The acoustic method was simply reversed so that floats received rather than emitted acoustic signals. The NOAA satellite-based ARGOS system allowed platforms on the sea surface to be tracked and small amounts of information to be sent from such platforms. The advent of the ARGOS system sparked a resurgence in the use of drifters to track surface currents. ARGOS was used initially to track drifters of a variety of designs, although by 1980 it was clear that these did not track water parcels very faithfully. A number of efforts to decrease displacement due to wind and waves and to improve the reliability of surface drifters resulted in low-cost drifters that could be deployed in large numbers and operate for a year or more in the open sea (Bitterman and Hansen, 1986, Dahlen, 1986, and Niiler et al, 1987). Active satellite-tracked drifters now number in the thousands (Figure 5).
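A minimal sketch of how velocity statistics are commonly formed from such large drifter datasets: fixes are binned into geographic boxes and averaged, an Eulerian treatment of Lagrangian data. All numbers below are synthetic illustrations, not real drifter observations.

```python
import numpy as np

# Hedged illustration: average drifter velocities within 1-degree boxes.
# Synthetic fixes over a 10x10 degree region with a weak mean eastward flow.
rng = np.random.default_rng(0)
lon = rng.uniform(-60.0, -50.0, 500)        # drifter fix longitudes, deg
lat = rng.uniform(20.0, 30.0, 500)          # drifter fix latitudes, deg
u = 0.1 + 0.05 * rng.standard_normal(500)   # eastward velocity, m/s

i = np.floor(lon + 60.0).astype(int)        # column index, 0..9
j = np.floor(lat - 20.0).astype(int)        # row index, 0..9
sums = np.zeros((10, 10))
counts = np.zeros((10, 10))
np.add.at(sums, (j, i), u)                  # unbuffered accumulation per box
np.add.at(counts, (j, i), 1)
u_mean = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
```

Boxes without any fixes are left as NaN, a reminder that drifters sample where the flow carries them rather than where the analyst would like.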
The density of drifter trajectories is sufficient to revise and refine the global climatology of surface velocity. Ironically, though the measurements are Lagrangian, they are most often analyzed in an Eulerian fashion (averages over float velocities within specified areas are the most common statistics). Floats have been modified to report temperature at a fixed depth and sometimes even salinity. Drift of unattended conductivity sensors is difficult to detect, particularly when drifters are not routinely recovered for post-deployment calibration. The distribution of drifters is difficult to control: natural convergences and divergences tend to attract and repel drifters as with any other flotsam.

Figure 5. Lagrangian drifter showing holey sock drogue. In addition to having their positions fixed via ARGOS, drifters commonly report sea surface temperature, barometric pressure, whether or not the drogue is attached, and sea surface salinity (photo from the website of the Drifting Buoy Data Assembly Center, NOAA Atlantic Oceanographic and Marine Laboratory).

On smaller scales, GPS navigation has made accurate tracking of surface drifters practical (George and Largier, 1996). Because of the high data rate needed to report large numbers of positions often, radio links to aircraft or ships have been employed. The recent burgeoning use of cellular telephones has allowed drifters to be tracked close to populated coastal areas (Hitchcock et al, 1996). Rossby et al (1986) introduced the RAFOS system (SOFAR spelled backwards) for tracking neutrally buoyant floats by making them the receivers and transmitting to them from a number of moored sound sources (Figure 6). At a preprogrammed time, floats ascend to the sea surface and transmit their recorded acoustic ranges via ARGOS. The economy of the system over its predecessor lies in the smaller number of expensive transmitters that are necessary. The first neutrally buoyant floats were isobaric.
Newer versions have been made to follow isopycnal surfaces (Rossby et al, 1985) and even to profile successively between isopycnal surfaces.

Figure 6. RAFOS float. An array of sound sources is moored in the general area in which the floats are set for self-tracking (photo courtesy S. Riser).

The low-frequency motions of neutrally buoyant floats can be revealed from successive float positions some days or weeks apart. Davis et al (1992) developed a float that freed itself from the infrastructure needed for acoustic tracking by periodically rising to the sea surface to be tracked by ARGOS. This system, dubbed ALACE (Autonomous LAgrangian Circulation Explorer), has been employed through WOCE to track mid-depth currents on a global scale. Recently, these instruments have been modified to profile temperature and salinity on their trips to the sea surface (Figure 7). Typically these floats are designed for 50 to 150 profiles before failure, giving them lifetimes of years.

Figure 7. PALACE (Profiling Autonomous LAgrangian Circulation Explorer) float carrying conductivity and temperature sensors ready for launch (photo courtesy D. Swift).

Water motions can be tracked vertically as well as horizontally if a float is able to remain neutrally buoyant over even large depth excursions. D'Asaro et al (1996) developed an acoustically tracked float that follows water motions in the surface mixed layer (Figure 8). It demonstrates the dominance of motions at the scale of the mixed layer depth in surface turbulence. Attendant temperature measurements give estimates of vertical heat flux within the mixed layer. A more recent version extends the depth range to follow deep convection and reports measurements via ARGOS.

Figure 8. Two Lagrangian floats ready for simultaneous launch. These maintain near neutral buoyancy during vertical excursions through a hull that has nearly the same compressibility as seawater. A disk is included to increase hydrodynamic drag. (photo courtesy E.
D’Asaro)

c. Ship-based measurement systems

The conventional CTD cast has changed relatively little since 1980, and it continues to be the most common way of probing the ocean interior. CTD units are supplied by one or two private companies, to whom improvements have been left. Redundant and modular sensors have improved overall accuracy. Typical absolute accuracies are a few m°C and a few thousandths of a psu (with resolution an order of magnitude better). Even the standards themselves have been transferred to private companies (the president of one remarks, “Do you really mean for me to have the keys to the castle?”). Rosette samplers have been enlarged to carry up to 32 bottles. Data are still multiplexed up a conducting wire. Although a system has been devised to decouple the CTD cage from ship motion and transmit data acoustically (Toole et al, 1997), it has yet to be widely adopted. Additional sensors for oxygen, chlorophyll, and acoustic backscatter have routinely been added to the CTD fish, although the calibration of such sensors remains a matter of debate. Recent WOCE sections saw the addition of an ADCP hung below the CTD package in an attempt to make current profiles concurrently with hydrography. The accuracy of these profiles appears to be a few cm/s at present, an uncertainty somewhat larger than desirable for general circulation studies. Expendable temperature probes are being used at roughly the same rate as two decades ago. Most are used from ships of opportunity rather than from dedicated research vessels. The accuracy of XBT probes remains about 0.05°C and most of those used are rated to 760 m depth (Roemmich, personal communication, 1997). The manufacturer’s estimated fall rate appears to be in error by about 3.5% (Hanawa et al, 1995). Expendable CTD (XCTD) probes are available (for about 20 times the cost of an XBT) which are more accurate in temperature (0.02°C) and have salinity accuracy of 0.03 psu. XCTDs are rated to 1000 m.
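The fall-rate issue can be made concrete with a small sketch. XBT depth is inferred from elapsed time through a quadratic fall-rate equation z(t) = a·t − b·t²; the coefficients below are those commonly quoted for Sippican T-7 probes (manufacturer) and for the Hanawa et al (1995) revision, and should be treated as illustrative rather than authoritative.

```python
# Sketch of the XBT fall-rate error described above (illustrative coefficients).
def depth(t, a, b):
    """Inferred depth (m) after t seconds of free fall: z = a*t - b*t**2."""
    return a * t - b * t**2

t = 100.0                            # seconds of fall
z_mfr = depth(t, 6.472, 2.16e-3)     # manufacturer's equation
z_rev = depth(t, 6.691, 2.25e-3)     # revised equation
pct = 100.0 * (z_rev - z_mfr) / z_mfr
print(z_mfr, z_rev, pct)             # depth error of roughly 3-4 percent
```

Because the error is proportional, it grows with depth, which is why the revised equation mattered most for deep-reaching climate analyses of archived XBT data.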
Rumor has it that XBTs rated to 2000 m are being sold to the French Navy. Expendable current profilers (XCPs) have also been developed. They use electric field measurements to estimate horizontal velocity relative to an unknown offset. They cost about 40 times as much as an XBT. Shipboard ADCPs are now standard equipment on oceanographic research vessels. By themselves, they give current relative to the ship. The advent of GPS has made them much more useful to oceanography, since absolute current can be estimated given a suitable determination of ship speed and direction. Oceanographers hopeful of finding a method to determine absolute references for geostrophic velocities were disappointed to realize that the instantaneous velocity measured by an ADCP is heavily contaminated by tidal and internal wave velocities. (The same problem confronts lowered ADCP users, plus the additional one of sensor motion relative to the ship.) ADCP surveys are perforce contaminated by superinertial and tidal motions because of the time taken for a ship to execute a survey on interesting horizontal scales. Towed vehicles have experienced a resurgence in use in the last decade or so, due to their commercial availability and their ability to make high horizontal resolution surveys of the upper few hundred meters of the water column (Figure 9). These vehicles are typically towed at about 8 kt and cycle from a few meters below the sea surface to 300-450 m depth, depending on vehicle load. Breaching the sea surface is usually avoided because of sensor exposure to flotsam and bubbles. Typical profile slopes are about 1:5, giving horizontal spacing of about 3 km between profile cycles for a profile depth range of 300 m. These vehicles typically carry a CTD and commonly a number of other sensors, including chemical and optical sensors. Power and data are sent via the faired tow cable. Together with GPS and a vessel-mounted ADCP, towed vehicles make possible fine-scale quasi-synoptic surveys of modest dimensions.
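The sampling geometry quoted above follows from simple arithmetic, sketched here. The assumption that the tow speed approximates the vehicle's horizontal speed over ground is mine, made for illustration.

```python
# Arithmetic behind the towed-vehicle figures above: at a profile slope of
# 1:5 (vertical:horizontal), one down-and-up cycle over a 300 m depth range
# covers 2 * 300 * 5 = 3000 m of horizontal distance.
slope = 1.0 / 5.0            # vertical : horizontal
depth_range = 300.0          # m per one-way profile
tow_speed = 8 * 0.5144       # 8 kt converted to m/s (~4.1 m/s)

horiz_per_cycle = 2 * depth_range / slope        # m between successive profiles
cycle_time = horiz_per_cycle / tow_speed         # s, assuming speed ~ horizontal
print(horiz_per_cycle, cycle_time / 60.0)        # ~3000 m, roughly 12 minutes
```

A dozen minutes per cycle is short compared to tidal and inertial periods at a single point, but a survey of many such cycles spans hours, which is the aliasing problem noted next.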
Tidal, inertial, and internal wave motions are the main sources of unresolved variance in such surveys. Nevertheless, towed systems have proved valuable tools for describing frontal structures in a variety of coastal and deep ocean environments. Measuring meteorological variables was once a job left to the ship’s crew. Bridge observations of wind, air temperature, barometric pressure, and sea state were keyed in Morse Code by radio operators aboard oceanographic ships as recently as 1980. With the advent of shipboard data networks came the prospect of automating the records and their reporting and making them more complete. Also, a number of studies (e.g. Large and Pond, 1981, Fairall et al, 1996) refined algorithms for estimation of air-sea fluxes.

Figure 9. SeaSoar towed vehicle fitted with CTD, fluorometer, transmissometer, and optical instrumentation. Wings provide hydrodynamic lift for depth control. (photo courtesy C. Lee)

Oceanographic ships now routinely carry meteorological packages that automatically record variables from which realistic air-sea fluxes of momentum, heat, and fresh water can be estimated. At least one such package, IMET (Hosom et al, 1995), was designed for use on surface buoys as well as ships (Figure 4). Precipitation remains one of the most difficult (and probably least credible) meteorological variables to measure at sea. Nevertheless, instrument systems and accompanying processing algorithms are approaching absolute overall errors in heat flux as low as 10 W/m2. A number of free-fall profilers have been developed since 1980. These have generally been designed to measure velocity profiles through large parts of the water column and/or microscale turbulence. Many such profilers also carry a CTD. Perhaps the simplest of the velocity profilers is the POGO, an acoustically-tracked dropsonde intended to measure the depth-averaged horizontal velocity from the sea surface to some preselected depth (Rossby et al, 1991).
This technique has been used to estimate absolute velocity profiles from geostrophic shear referenced to POGO depth-averaged velocities. Direct full-depth velocity profiles have been measured using the AVP, a dropsonde that references an electromagnetically-inferred velocity profile to one measured near the sea floor by acoustic bottom tracking (Sanford et al, 1985). By 1980, at least two systems were in use to acoustically track a dropsonde over the entire water column. Setting and surveying an acoustic network on the bottom was a necessary overhead for such systems. With the advent of GPS, it is possible to track a dropsonde from beacons drifting on the sea surface (Leaman et al, 1995), in an attempt to reduce the overall time necessary to make a profile. Fine-scale and microscale profilers have become instruments capable of gathering robust statistics on turbulent mixing over the last two decades. The Cartesian Diver (Duda et al, 1988) drifts freely while making multiple profiles in the upper 1 km, inferring horizontal velocity electromagnetically. A profiler dubbed HRP allows both fine- and microscale velocity and temperature to be measured throughout the water column, making possible valuable deep measurements of mixing rates (Schmitt et al, 1988) (Figure 10). The dropsonde MSP (Winkel et al, 1996) resolves horizontal velocity over scales ranging from the ocean depth to the microscale by combining electromagnetic and acoustic sensing with hydrodynamic lift on small airfoils. HRP and MSP make a single profile before being recovered and relaunched.

Figure 10. HRP launch (photo courtesy K. Polzin)

The ability to make repeated profiles rapidly and continuously made possible the collection of robust statistics on mixing. This was achieved by tethering otherwise free-fall instruments on thin fiber-optic cables (RSVP and AMP are two such profilers; Caldwell et al, 1985, and Nodland and Gregg, 1987, respectively).
The fiber-optic cable allows the profiler to be small and easily handled, since it need not carry batteries or recording electronics, and also allows it to be pulled back aboard a ship without the time-consuming recovery operations that are necessary for larger free-fall vehicles (Figure 11).

Figure 11. AMP launch in the Bosphorus (photo courtesy of M. Gregg)

d. Satellite, aircraft, and land-based measurement systems

Remote sensing techniques for the sea surface have made great strides over the last two decades. Their great contribution has been to provide regional and global synoptic views of the spatial structure of the ocean. Satellites are by far the most prevalent platforms for remote sensing of the ocean. Satellites differ from other oceanographic instrumentation in that oceanographers neither direct their use nor the development of the sensor packages they carry. Instead, teams of engineers managed by space agencies design and implement the hardware and the algorithms for its use. Only at the stage of data interpretation are oceanographers typically involved. A number of data products are routinely processed from algorithms that have undergone various degrees of verification against traditional oceanographic measurements. By 1980, satellite radiometry was in the course of transition between providing provocative images of sea surface temperature (SST) and being a quantitative, scientifically useful tool. The potential use of satellite radar altimetry had likewise been demonstrated by 1980, but development to the stage of providing routine, calibrated data did not occur until somewhat later. Today it is common to be able to load 1 km resolution SST images from Internet sites for specific regions of the ocean where interested parties have acquired and are processing such high resolution data. The most common sources of these data are the Advanced Very High Resolution Radiometer (AVHRR) instruments aboard a few NOAA satellites.
Typically a dedicated satellite link is required to obtain the large volume of data generated, which generally is not stored aboard the satellite. Ground stations have been installed in various remote locations for the purpose of supporting regional studies. Satellite receiving stations are occasionally installed on research ships to provide investigators aboard with context for their in situ measurements or to guide the ship toward a particular SST feature, such as a front. Clouds obscure the sea surface in the frequency bands used for radiometry, so images are often composited over a number of days, tending to compromise their synopticity. Over longer time scales, this is less problematic, and AVHRR data have become a principal source for monitoring low frequency SST variability. Satellite radar altimetry has become increasingly valuable as a means of measuring sea level variation in the ocean. Because of uncertainty in the geoid, it has not been useful in determining the mean ocean circulation. GEOSAT, ERS1, and TOPEX/POSEIDON (Fu et al, 1994) have together made possible a global temporal description of sea level, to varying degrees of accuracy. Altimetric satellites are generally configured to repeat a prescribed orbital track periodically. Because several days or longer are required to complete a track with oceanographically interesting spatial resolution, tidal signals are aliased to much lower frequencies, and interest in the detection and prediction of ocean tides has seen a revival. The tide gauge network that provided the spatially fragmentary description of sea level variability prior to the practice of satellite altimetry remains a valuable calibration standard as well as a primary source for long term sea level variability. Given the relatively short operational lifetimes of satellite altimetry missions, a few years or so, altimetric records are well suited to detecting variability on time scales of weeks to years.
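The connection between repeat-track sampling and tides can be illustrated: a tidal line sampled once per orbital repeat period is aliased to a much lower frequency. The sketch below uses the M2 tidal period and the roughly 9.92-day TOPEX/POSEIDON repeat as an illustration.

```python
import math

# Alias period of a pure sinusoid under regular sampling (illustrative).
def alias_period(signal_period_days, sample_interval_days):
    """Period (days) to which a sinusoid is aliased by sampling at a fixed interval."""
    f = 1.0 / signal_period_days                 # true frequency, cycles/day
    n = round(f * sample_interval_days)          # nearest whole cycles per sample
    f_alias = abs(f - n / sample_interval_days)  # apparent frequency after sampling
    return math.inf if f_alias == 0 else 1.0 / f_alias

m2_days = 12.4206 / 24.0                         # M2 tidal period in days
print(alias_period(m2_days, 9.9156))             # aliased to roughly two months
```

An hourly tide thus masquerades as a two-month oceanographic signal in the altimeter record, which is why accurate tide models became a prerequisite for interpreting altimetric sea level.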
Particularly dramatic examples of the oceanic variability detected by altimetry are baroclinic Rossby and equatorial Kelvin waves (Chelton and Schlax, 1996). The surface wind field remains poorly known on temporal and spatial scales crucial to oceanic processes. Reasonable model interpolations of the wind field have been provided routinely by numerical weather prediction centers in recent years, but these are based on sparse observations and cannot give accurate estimates of spatial and temporal derivatives of wind stress over the ocean. The potential for scatterometry to provide comprehensive high-spatial-resolution winds was demonstrated by SEASAT, but only in 1996, with the launch of NSCAT, were nearly global daily ocean winds measured directly (Figure 12). Unfortunately, the satellite carrying NSCAT failed only 9 months into its mission, leaving oceanographers once again teased by seeing what could be within their grasp. While satellite remote sensing has succeeded in providing new descriptions of the ocean surface in unprecedented detail, its history has been punctuated by devastating failures and long waits between sparkling successes. These may be simply greatly magnified versions of what ocean experimentalists have also endured: the sudden failure of hardware when it is out of reach for repair.

Figure 12. Image of NSCAT winds overlaid on clouds of typhoon Violet (NASA image)

It has been nearly two decades since SEASAT demonstrated the great promise of satellite remote sensing, yet this promise is still only partly fulfilled. The reliance on very few very complex instruments makes for a very vulnerable observing system, no matter how powerful.

e. Autonomous Underwater Vehicles

Ocean engineers have long toyed with remotely operated vehicles (ROVs), but their usefulness to physical oceanography has been limited. By their very nature, ROVs require the presence of a large platform from which they can be operated, usually a ship.
Unlike tethered profilers or towed bodies, ROVs classically power themselves. Their range is limited by tether length, in practice limiting them to a vicinity of their support platform not much greater than the ocean depth. They have received significant use in geological studies, but not in physical oceanography, perhaps because the physical measurements they could make are generally more readily made by other means. Most conventional autonomous underwater vehicles (AUVs) have been limited to mission ranges and durations that are short (tens of km and a few hours) compared to the time and space scales of most oceanographic phenomena. The range of AUVs is limited by stored energy, usually in the form of batteries. They expend this energy against buoyancy and drag forces for propulsion, and going fast necessarily means not going terribly far. In contrast, profiling ALACE drifters might be regarded as very simply controlled AUVs: they drift at will at their neutrally buoyant depth but are controlled to profile to the sea surface periodically. They do not propel themselves horizontally. Stommel (1989) published the fantasy of a future fleet of slowly profiling glider vehicles he called Slocums that would draw power from the thermal stratification of the ocean. His science-fiction tale, set in 2021, had such vehicles first launched in 1994 and operating from a command center on an exclusive Cape Cod area island. Glider vehicles are currently under development. While ultimately a thermal engine version is attractive, current vehicle prototypes use batteries. By operating at low power (a fraction of a watt), they have a design range of several thousand km horizontally and over a thousand km vertically. They maintain a fixed course underwater while diving to a predetermined depth, at which they run a pump to increase their volume, pitch slightly upward, and glide back to the ocean surface where they obtain a GPS fix.
The vehicles calculate a new course and dive again after a few minutes on the surface. On occasional visits to the surface the vehicles call a shore station to send data and receive updated command instructions. A prototype vehicle has operated in Puget Sound using the local cellular telephone network for communication (Figure 13).

Figure 13. Prototype glider viewed from below in a municipal swimming pool. This vehicle is 1.8 m long (exclusive of trailing antenna), has 52 kg mass, and glides at a few tenths m/s along slopes as gentle as 1:5 (photo courtesy J. Osse).

The principal glider measurement package is a CTD; additional packages may be added in the future. Fortunately, their unit cost is projected to be about half what Stommel hoped for, hence a distributed network of vehicles seems possible. Gliders are small enough that they may be launched and recovered without the need for heavy equipment. Their design range is great enough that they may reach most regions of the ocean by launch from a small vessel within sight of land.

5. Perspective

The single biggest problem for oceanographic observations is that of adequate sampling. Baker (1981) noted that the 19th century approach to oceanography was to gather interdisciplinary resources into a very few long global voyages (e.g. the Challenger expedition), whereas by the mid 20th century, physical oceanographers were trying to design observational programs to describe phenomena for which they made assumptions about the space and time scales involved (three examples from Stommel (1963) are discussed). He notes the experimental design work that took place for the MODE and ISOS experiments, where statistical analysis methods were used to guide sampling plans. The MODE experimental plan called for 29 current meter moorings and a grid of 77 CTD stations to be surveyed repeatedly over a four month period.
Even with such resources, the duration of the experiment was woefully short of that needed to construct robust statistics on the mesoscale eddy field. Yet the inflation-adjusted cost of this experiment looms large compared to the costs of recent large programs. There is little argument among physical oceanographers that the basic variables of most interest are temperature, salinity, and velocity. Over the years, the means to measure these variables have improved in accuracy and ease, yet oceanographers are still rarely able to garner the resources necessary to resolve oceanic variability adequately. Baker (1981) wrote:

It is clear now that maintaining enough moorings to monitor large-scale ocean circulation is too expensive, for both equipment and logistics. One must look to satellite-based systems, together with a modest number of moorings and hydrographic observations. The surface drifters will certainly play a major role in any new global observation system, providing surface measurements for calibration of satellite data, giving a direct measurement of surface currents for use with hydrographic data, and providing an interpolation between moorings.

The underlying premise of this statement is that some technologies are more cost-effective than others for making measurements. To Baker, satellites seemed a viable means of addressing the sampling problem facing investigators interested in general circulation problems. At the time, though, it was still felt that altimetry could establish the absolute pressure field of the surface ocean (the distinction between the sea level signals of mean currents and the geoid has yet to be made). Accurate surface drifters were just being developed, and now large numbers of them have proved useful in describing surface circulation. Hundreds of subsurface drifters, also dependent on satellites to calculate and report position, have been deployed and are beginning to add to the description of ocean circulation.
It turns out that satellites are at least as valuable, if not more so, in providing communication and navigation to instruments at sea as in making remote measurements of surface conditions (temperature, sea level, and wind stress). Finding affordable means of making measurements sufficiently densely, widely, and long enough to understand oceanic processes remains a challenge to oceanography (Figure 14).

Figure 14. Oceanographic dream or nightmare, 1981, depending on the point of view. (image courtesy W. “keep ‘em laughin’” Patzert)

Some techniques turn out to be too complex or expensive to find widespread use. Baker (1981) identified tomography as “an acoustic technique of great promise” and laser Raman backscatter to remotely sense temperature as an important idea. The laser technique is an example of a technology that is too expensive and specialized to be useful to oceanography. Acoustic tomography has yet to find widespread use in oceanography, probably both because it requires a large number of complex installations (moorings or cables) to be effective and because multiple rays in the vertical plane between a pair of array elements are insufficient to resolve more than the average conditions between them. While it is true that tomography offers a geometric increase in resolution with the number of array elements, even a modest array requires significant resources. Meanwhile, the important problems of determining the absolute pressure and vertical velocity fields of the ocean, whether globally or regionally, remain outstanding. Fitting techniques utilizing constraints have emerged as the likely solutions to these problems (e.g. Wunsch, 1996), although tools beyond the traditional ones of hydrographic surveys and direct horizontal velocity measurements may provide help. Examples are data from deliberate tracer release experiments and repeated simultaneous measurement of transport and density profiles by autonomous vehicles.
Baker (1981) called for new techniques that could measure currents where they are strong, in the surface mixed layer under storms, and synoptically over large areas. He called for means of monitoring long-term profiles of currents and for a deep neutrally buoyant float that would periodically report to a satellite. Much progress has been made in these directions, as summarized above. A mix of instrumentation will always be important to physical oceanographic measurements, but the geographic extent of the ocean suggests that distributed measurements will lead to greater understanding than a few very sophisticated, complex measurement systems. In an environmental science such as oceanography, there is no apparent equivalent to the physicist’s particle accelerator for finding general truths of universal application. Perhaps an analogy to space science is more apt: ships and autonomous vehicles may be analogous to manned and unmanned space missions. Ships remain essential to most ocean measurements, if only to transport instruments to launch sites. They remain the only means by which many techniques for probing the ocean interior can be developed. Conversely, they cannot provide the extensive spatial and temporal coverage needed to resolve oceanic flow. To address the sampling problem, a distributed network of inexpensive unmanned platforms is appropriate. The expense of ships has led to the development of various “expendable” instruments. They are often expended only because recovering them from a ship is prohibitively expensive. While maritime legal arrangements have prohibited dumping used mooring wire from a ship, it is not uncommon to expend floats costing $15,000 each or more. The advent of long range self-propelled vehicles may ultimately curb this activity by making it cheaper, as well as environmentally friendlier, to reuse sophisticated equipment than to scuttle it on the sea floor.
A possible alternative to the “alternative to satellite oceanography” shown above is a network of autonomous vehicles with the occasional ship-serviced platform (Figure 15). Instrumentation development for physical oceanography has been supported to differing degrees by different government agencies in the U.S. The Office of Naval Research (ONR) has sponsored most new instrument development in the past few decades, but shows signs of slowing down. Whereas ONR once specifically reserved a significant portion of its budget for physical oceanography instrumentation, budget constraints following the end of the Cold War have led to diminished support. The National Aeronautics and Space Administration (NASA) has naturally supported most development of remote sensing techniques. These efforts continue, but satellites are sufficiently expensive to build, launch, and maintain that they offer attractive targets to politicians intent on finding items to cut from the federal budget. The National Science Foundation (NSF) has supported development of new instruments mainly through large programs. Instrumentation development through the Oceanographic Technology and Interdisciplinary Coordination program has been modest due to limited resources, often limited to modifications of existing hardware.

Figure 15. An observing system for the ocean interior relying on autonomous vehicles and only a few ship-serviced elements. Data transmission and control is by satellite.

It is not clear whether large or small programs do a better job of sponsoring the development of instrumentation for physical oceanography. What does matter, though, is that successful instrument development requires the guidance of scientists, not simply engineers. Stommel’s (1966) statement that “A few good and determined engineers could revolutionize this backward field” still rings true.
The great improvements in oceanographic instrumentation in the last two decades have come mainly from taking advantage of technical developments external to oceanography. The role of engineers in this technology transfer has been vital. The maintenance of technical support infrastructure is essential to continued instrumental improvement. Yet, as far as we have come with technical improvements, many of our instruments are too expensive to build or use in the great numbers that adequate sampling demands. Instruments developed by engineers detached from the needs and desires of the scientific community often are never used after a brief engineering demonstration. Others are so specialized that they are used only a few times and spend most of their time on a lab or warehouse shelf, simply because they are too expensive to use. The best instruments for oceanography continue to be the ones that are simple and inexpensive enough to be used by many, often, and for many years. The future of instrumentation, like the past, will be shaped largely by technical developments outside oceanography. The three major developments that have most influenced instrument development over the last two decades have been advances in computing, communication, and navigation. Further advances in these areas are inevitable, hence we can expect oceanographic instruments to become smaller, smarter, and cheaper per unit performance. In the Cold War era, much technical development was accomplished in the military arena. Now, much of it is done in response to commercial pressure. Miniaturization and increased computing power will make possible more sophisticated and extensive use of nontraditional platforms such as VOS, drifters, and autonomous vehicles. They will also make possible more near-real-time reporting of measurements for analysis, remote control of sampling, and use in modelling efforts.

References

Bitterman, D. S., and D. V. Hansen, 1986: The design of a low cost tropical drifter. Proc.
of the 1986 Marine Data Systems Symp., New Orleans, Marine Data Systems, 575-581. Baker, D. J., 1981: Ocean instruments and experiment design. Evolution of Physical Oceanography, B. A. Warren and C. Wunsch, editors. MIT Press, 396-433. Caldwell, D. R., T. M. Dillon, and J. N. Moum, 1985: The Rapid-Sampling Vertical Profiler: An examination. J. Atmos. Oceanic Technol., 2, 615-625. Chave, A. D., R. Butler, and T. E. Pyle, 1990: Workshop on scientific uses of undersea cables. Joint Oceanographic Institutions, Inc., Washington DC. Chelton, D. B., and M. G. Schlax, 1996: Global observations of oceanic Rossby waves. Science, 272, 234-238. D’Asaro, E. A., D. M. Farmer, J. T. Osse, and G. T. Dairiki, 1996: A Lagrangian float. J. Atmos. Oceanic Technol., 13, 1230-1246. Dahlen, J., 1986: The Draper LCD - A calibrated, low cost Lagrangian drifter. Proc. of the 1986 Marine Data Systems Symp., New Orleans, Marine Data Systems, 582-595. Davis, R. E., D. C. Webb, L. A. Regier, and J. Dufour, 1992: The Autonomous Lagrangian Circulation Explorer (ALACE). J. Atmos. Oceanic Technol., 9, 264-285. Duda, T. F., C. S. Cox, and T. K. Deaton, 1988:The Cartesian Diver: A self-profiling Lagrangian velocity recorder. J. Atmos. Oceanic Technol., 5, 16-33. 29 Fairall, C. W., E. F. Bradley, D. P. Rogers, J. B. Edson, and G. S. Young, 1996: Bulk parameterization of air-sea fluxes for Tropical Ocean-Global Atmosphere Coupled Ocean Atmosphere Response Experiment. J. Geophys. Res., 101, 3747-3764. Filloux, J. H., 1987: Instrumentation and experimental methods for oceanic studies. Geomagnetism, v. 1, edited by J. A. Jacobs, Academic, San Diego, Calif., pp. 143-248. Toole, J. M., K. W. Doherty, D. E. Frye, and R. C. Millard, 1997: A wire-guided, free-fall system to facilitate shipborne hydrographic profiling. J. Atmos. Oceanic Technol., 14, 667-675. Fu, L.-L., E. J. Christensen, C. A. Yarmarone Jr., M. Lefebvre, Y. Menard, M. Dorrer, and P. Escudier, 1994: TOPEX/POISEIDON overview, J. Geophys. 
Res., 99, 24369-24381. George, R., and J. L. Largier, 1996: Description and performance of finescale drifters for coastal and estuarine studies. J. Atmos. Oceanic Technol., 13, 1322-1326. Nodland, W.E. and M.C. Gregg, 1987: Advanced Microstructure Profiler(AMP) Engineering Description. Technical Report. Applied Physics Laboratory, Univ. Washington, Seattle, 127 pp. Hallock, Z. R.,1987: Regional characteristics for interpreting inverted echo sounder (IES) observations. J. Atmos. Oceanic Technol., 4, 298-304. Hanawa, K., P. Rual, R. Bailey, A. Sy, and Szabados, M., 1995: A new depth-time equation for Sippican or TSK T-7, T-6 and T-4 expendable bathythermographs (XBT). Deep-Sea Res., 42, 1423-1451. Hogg, N. G., 1991: Mooring motion revisited. J. Atmos. Oceanic Technol., 8, 289-295. Hosom, D. S., R. A. Weller, R. E. Payne, and K. E. Prada, The IMET (Improved meteorology) ship and buoy systems. J. Atmos. Oceanic Technol., 12, 527-540. Large, W. G., and S. Pond, 1981: Open ocean momentum flux measurements in moderate-tostrong winds. J. Phys. Oceanogr., 11, 324-336. Larsen, J. C., and T. B. Sanford, 1985: Florida Current volume transports from voltage measurements. Science, 227, 302-304, 1985. Leaman, K. D., P. S. Vertes, and C. Rocken, 1995: Polaris: A GPS-navigated ocean acoustic profiler. J. Atmos. Oceanic Technol., 12, 541-549. Luther, D. S., J. H. Filloux, and A. D. Chave, 1991: Low-frequency, motionally induced electromagnetic fields in the ocean 2. Electric field and Eulerian current comparison. J. Geophys. Res., 96, 12797-12814. Niiler, P., R. E. Davis, and H. White, 1987: Water following characteristics of a mixed layer drifter. Deep-Sea Res., 34, 18678-1881. 30 Pickart, R. S., and D. R. Watts, 1990: Using the inverted echo sounder to measure vertical profiles of Gulf Stream temperature and geostrophic velocity. J. Atmos. Oceanic Technol., 7, 146156. Plimpton, P. E., H. P. Freitag, and M. J. 
McPhaden, 1997: ADCP velocity errors from pelagic fish schooling around equatorial moorings. J. Atmos. Oceanic Technol., 14, 1212-1223. Rossby, T. R., E. R. Levine, and D. N. Connors, 1985: The isopycnal Swallow float - a simple device for tracking water parcels in the ocean. In Essays in Oceanography: A tribute to John Swallow. Progress in Oceanography, 14, 511-525. Rossby, T., D. Dorson, and J. Fontaine, 1986: The RAFOS system. J. Atmos. Oceanic Technol., 3, 672-679. Rossby, T., J. Fontaine, and J. Hummon, 1991: Measuring mean velocities with POGO. J. Atmos. Oceanic Technol., 8, 713-717. Rudnick, D. L., R. A. Weller, C. C. Eriksen, T. D. Dickey, J. Marra, and C. Langdon, 1997: Moored instruments weather Arabian Sea monsoons, yield data. EOS, Trans. Am. Geophys. Union, 78, 117 & 120-121. Sanford, T. B., R. G. Drever, and J. H. Dunlap, 1985: An acoustic Doppler and electromagnetic velocity profiler. J. Atmos. Oceanic Technol., 2, 110-124. Schmitt, R. W., J. M. Toole, R. L. Koehler, E. C. Mellinger, and K. W. Doherty, 1988: The development of a fine- and microstructure profiler. J. Atmos. Oceanic Technol., 5, 484-500 Stommel, H. M., 1954. Why do our ideas about ocean circulation have such a peculiarly dreamlike quality? or Examples of types of observations that are badly needed to test oceanographic theories. Reprinted in Collected Works of Henry M. Stommel, v. 1, N. G. Hogg and R. X. Huang, editors, Am. Meteorological Soc., Boston, 1995. Stommel, H. M., 1955: Discussions on the relationships between meteorology and oceanography. J. Mar. Res., 14, 504-510. Stommel, H. M., 1963: Varieties of oceanographic experience. Science, 139, 572-576. Stommel, H. M., 1966: The large-scale oceanic circulation. In: Advances in Earth Science, P. M. Hurley, editor, MIT Press, 175-184, 1966. Stommel, H. M., 1989: The Slocum mission. Oceanography, 2, 22-25. Trivers, G., and M. 
Wimbush, 1994: Using acoustic travel time to determining dynamic height variations in the North Atlantic Ocean. J. Atmos. Oceanic Technol., 11, 1309-1316. Watts, D. R., and H. Kontoyiannis, 1990: Deep-ocean bottom pressure measurement: Drift removal and performance. J. Atmos. Oceanic Technol., 7, 296-306. 31 Watts, D. R., and T. Rossby, 1977: Measuring dynamic heights with inverted echo sounders: Results from MODE. J. Phys. Oceanogr., 7, 345-358. Winkel, D. P., M. C. Gregg, and T. B. Sanford, 1996: Resolving oceanic shear and velocity with the Multi-Scale Profiler. J. Atmos. Oceanic Technol., 13, 1046-1072. Williams, A. J. 3rd, J. S. Tochko, R. L. Koehler, W. D. Grant, T. F. Gross, and C. V. R. Dunn, 1987: Measurement of turbulence in the oceanic bottom boundary layer with an acoustic current meter array. J. Atmos. Oceanic Technol., 4, 312-327. Worcester, P. F., B. D. Cornuelle, and R. C. Spindel, 1991. A review of ocean acoustic tomography: 1987-1990. Rev. Geophys., 29 (Supplement Part 2), 557-570., 1991 Wunsch, C., 1996: The ocean circulation inverse problem. Cambridge Univ. Press, Cambridge, 442 pp. 32