
Modeling hyperspectral normalized water-leaving radiance in a dynamic coastal ecosystem

Open Access

Abstract

Next-generation satellite sensors such as the Ocean Color Instrument (OCI) aboard the NASA Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) satellite and the proposed Surface Biology and Geology (SBG) sensor will provide hyperspectral measurements of water-leaving radiances. However, acquiring sufficiently accurate in situ validation data in coastal ecosystems remains challenging. Here we modeled hyperspectral normalized water-leaving radiance ([LW(λ)]N) in a dynamic coastal ecosystem using in situ inherent optical properties (IOPs) as inputs to the Hydrolight radiative transfer model. By reducing uncertainty of modeled hyperspectral [LW(λ)]N (%RMSE ≤ 21%) relative to [LW(λ)]N derived from in situ radiometric measurements (%RMSE ≤ 33%), we introduce modeling as an alternative or complementary method to in-water radiometric profilers for validating satellite-derived hyperspectral data from coastal ecosystems.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Ocean color satellites and airborne sensors characterize aquatic ecosystems spatially. These sensors measure light fields, or apparent optical properties (AOPs), of aquatic ecosystems from above (e.g., [1]). A variety of remote-sensing-based bio-optical algorithms, in turn, estimate biogeochemical ocean properties from these above-water AOPs. These algorithms rest on radiative transfer theory, namely the principle that the optical properties of a water mass are dictated by its contents. However, the above-water AOPs that these algorithms utilize are subject to uncertainties caused by the atmosphere [2], sensor drift over time (e.g., [3]), and ecosystem-specific perturbations [4,5], which introduce uncertainty into their derived data products. Satellite and airborne sensors must therefore be validated with radiometric surface data from the ecosystems that they characterize [4,5].

Validation of satellite and airborne sensors has traditionally relied on above-water AOPs extrapolated from in-water radiometric profilers (profilers), negatively buoyant radiometers that measure AOPs through the water column in situ (see [6,7]). However, coastal ecosystems frequently exhibit elevated concentrations of bio-optically active constituents (e.g., phytoplankton, non-algal particles, and dissolved matter) that can be vertically stratified in layers (e.g., [8]). Furthermore, these constituents often cause light to attenuate sharply in the upper water column, amplifying the importance of accurately measuring the light field within the very depth layer that is most susceptible to wave focusing [7]. For these reasons, profilers require higher vertical resolution to reliably characterize coastal ecosystems than they do for the open ocean. They may also require a high signal-to-noise ratio (SNR) to measure sharply attenuated light signals. Additionally, the spectral shape of AOPs can provide information beyond what traditional band-ratio algorithms retrieve, such as bio-optical constituent composition or dominant phytoplankton taxa [9,10]. Next-generation satellite sensors (e.g., OCI) and algorithms will therefore utilize hyperspectral AOPs [11]. Thus, validating and developing algorithms for next-generation satellite sensors in coastal ecosystems will benefit from hyperspectral ocean surface optical data.

Design trade-offs in SNR, vertical resolution, and spectral resolution make profilers problematic for sampling coastal ecosystems. For example, the Sea-Bird Scientific HyperPro II Optical Profiler’s (HP2) 137 channels characterize spectral shape, but its vertical resolution, as well as the likelihood of exceeding typical (> 5$^\circ $) tilt thresholds in the upper water column, may be problematic in coastal ecosystems [9,12]. Consequently, a single HP2 cast can exhibit depth resolution as coarse as ∼50 cm and uncertainty as high as 25% in its derived products [13]. Although above-water AOPs derived from multiple HP2 casts can characterize coastal ecosystems realistically, mean environmental and uncertainty signals cannot be decoupled [13]. The adoption of multicast, a protocol that artificially increases vertical profiler resolution by combining measurements from multiple casts taken in succession, decreases profiler uncertainty [7,12]. However, Zibordi et al. [7] report that a minimum vertical resolution of 2 cm (e.g., 50 “typical” HP2 casts) is needed to adequately correct (e.g., $\le $ 2% uncertainty) for instrumental and environmental perturbations on derived AOP products, such as the diffuse attenuation coefficient (${k_d}(\lambda )$), in coastal waters. Conversely, the Biospherical Instruments Compact Optical Profiling System (C-OPS) possesses the high SNR and cm-scale vertical resolution needed to characterize coastal ecosystems, allowing high-fidelity measurements with typical deployments of three vertical profiles per station. However, the C-OPS has only 19-20 channels and cannot fully characterize AOP spectral shape [13].

Utilizing radiative transfer theory to model AOPs offers several advantages over measuring them directly. In theory, modeled AOPs can be calculated at any wavelength ($\lambda $), and they are not subject to profiler-associated uncertainties. However, modeled AOPs are only as reliable as the inherent optical properties (IOPs) on which they are based, underscoring the latter’s importance. In situ IOP measurements are also susceptible to biases caused by instrument design [14]. Such biases can be associated with an instrument’s platform (e.g., size, shape, and descent rate), its sensors (e.g., SNR and optical pathlength), or its analog-to-digital converter [13,15,16]. The extent of these impacts varies by ecosystem [17]. Moreover, modeling AOPs in the ultraviolet (UV) and near-infrared (NIR) is problematic due to the dearth of IOPs measured in these regions. Conversely, the C-OPS can measure AOPs reliably in the UV and NIR [13].

WET Labs’ hyperspectral absorption-attenuation meter (ac-s) measures spectral absorption ($a(\lambda )$) and attenuation ($c(\lambda )$) at 87-89 channels. HOBI Labs’ HydroScat-6 spectral backscattering sensor (HS6) measures spectral backscattering at six channels, which can be partitioned into pure-water and particulate backscattering (${b_{bp}}(\lambda )$). Both instruments are ubiquitous in the ocean optics community. The ac-s can be purchased with an optical pathlength of 25 or 10 cm, intended for clear water and coastal ecosystems, respectively. Designed for the open ocean, the HS6 has a 15 cm optical pathlength and uses Oishi’s [18] assumptions of a $\lambda $-independent conversion factor and a single scattering angle (140$^\circ $) to compute ${b_{bp}}(\lambda )$, both of which could be problematic in phytoplankton-dominated coastal ecosystems [19]. WET Labs has more recently introduced the ECO-VSF3, which incorporates a very short optical pathlength to measure ${b_{bp}}(\lambda )$ from three scattering angles at three $\lambda $s; its measurements require no sigma correction for $c(\lambda ) \le $ 5 m$^{-1}$ [20]. While the 10 cm ac-s and ECO-VSF3 are designed for coastal ecosystems, their high costs compel many investigators to continue using the 25 cm ac-s and HS6 in waters where shorter optical pathlengths would be preferable. The reliability of long pathlength instruments in a specific coastal ecosystem can be evaluated by their ability to achieve optical closure: parity between AOPs modeled from in situ IOPs via radiative transfer theory and contemporaneous AOPs measured in situ. On this basis, Tuchow et al. [21] demonstrated that the HS6, in conjunction with Doxaran et al.’s [22] proposed sigma correction, which accounts for absorption within the HS6 optical pathlength, improved optical closure in a Monterey Bay red tide relative to a co-deployed ECO-VSF3. Tuchow et al. [21] therefore concluded that with an adequate sigma correction, the HS6’s increased spectral resolution (six $\lambda $s) compared to the ECO-VSF3 (three $\lambda $s) more than compensates for its reliance on the Oishi approximation [18].
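
To make the single-angle assumption concrete, the sketch below (Python) shows the Oishi-style estimate that instruments like the HS6 rely on: ${b_{bp}}(\lambda )$ is approximated from the particulate volume scattering function at one backward angle via a $\lambda $-independent conversion factor. The function name and the default value of `chi` are placeholders of ours, not the manufacturer's processing chain.

```python
import numpy as np

def bbp_single_angle(beta_p_140, chi=1.1):
    """Estimate particulate backscattering b_bp(lambda) from the particulate
    volume scattering function measured at a single backward angle (~140 deg),
    i.e. b_bp ~= 2*pi*chi*beta_p(140), with a wavelength-independent chi.

    beta_p_140 : array of beta_p(140 deg, lambda) values [1/(m sr)]
    chi        : assumed conversion factor; the appropriate value depends on
                 the instrument and on the particle assemblage
    """
    return 2.0 * np.pi * chi * np.asarray(beta_p_140, dtype=float)
```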

Doxaran et al. [22]-corrected ${b_{bp}}(\lambda )$ improved optical closure overall, but the modeled AOPs underestimated the AOPs observed with the HP2 in the mid-visible (green) range of the spectrum [21]. Similar findings were reported by Bausell and Kudela [13], who compared modeled normalized water-leaving radiance (${[{{L_W}(\lambda )} ]_N}$) with ${[{{L_W}(\lambda )} ]_N}$ derived from both the HP2 and C-OPS across five optically diverse Monterey Bay sampling stations. Following Tuchow et al. [21], Bausell and Kudela [13] utilized a long pathlength ac-s and HS6 and sigma-corrected ${b_{bp}}(\lambda )$ following Doxaran et al. [22]. Because the Doxaran et al. [22] correction minimizes HS6 pathlength errors, and because the C-OPS has proven reliable in coastal ecosystems [13], we investigated whether the underestimation of modeled above-water AOPs in the green stemmed from failure to properly correct for unaccounted scatter inside the ac-s’ absorption (${b_{ua}}$) and attenuation (${b_{uc}}$) flow tubes. The ac-s measures $c(\lambda )$ by focusing a beam of light into a 1 mm aperture; it calculates $c(\lambda )$ from the difference in beam strength between the lamp and this aperture, located on opposite ends of the flow tube. The ac-s measures $a(\lambda )$ by focusing a collimated light beam that is 45$^\circ $ out of phase with the $c(\lambda )$ beam through a highly reflective flow tube. These highly reflective walls reflect as many scattered photons as possible into a detector at the far end of the tube, opposite the lamp. $a(\lambda )$ is therefore calculated as a function of undetected photons, which are assumed to have been absorbed by the medium [23,24]. However, 25 cm flow tubes allow for significant ${b_{ua}}$ and ${b_{uc}}$, which can result in overestimated $a(\lambda )$ and underestimated $c(\lambda )$. As such, a scattering correction approach (SCA) is used to account for these errors. Several SCAs, subject to varying assumptions, have been proposed for the ac-s (e.g., [16,23,25,26]). Using six of these proposed SCAs, we re-evaluate optical closure at five bio-optically distinct stations originally described by Bausell and Kudela [13]. Our study seeks to ascertain whether applying the correct SCA can improve the reliability of modeled hyperspectral ${[{{L_W}(\lambda )} ]_N}$ in characterizing coastal ecosystems.

2. Methods

2.1 Scattering correction approaches

NASA Ocean Optics Protocols list several SCAs that have been proposed for correcting ${b_{ua}}$ [4]. Referred to henceforth as ‘self-contained’ because they require no co-measured IOPs, these SCAs can be classified as flat (“FL”) or proportional (“PR”). “FL” SCAs assume that a single $\lambda $-independent ${b_{ua}}$ value can be subtracted from ac-s $a(\lambda )$ that has not been corrected for ${b_{ua}}$ (${a_m}(\lambda )$). The simplest of these SCAs assumes $a(\lambda )$ to be negligible in the NIR (e.g., 715 nm) and thus takes ${a_m}({715} )$, which is subtracted from ${a_m}(\lambda )$, to be directly representative of ${b_{ua}}$ [26].

$$a_n(\lambda) = a_m(\lambda) - a_m(715) \tag{1}$$
“FL” SCAs can also be utilized without assuming negligible $a(\lambda )$ in the NIR. These SCAs are approximated as:
$$a_n(\lambda) = a_m(\lambda) - \left[ a_m(715) - a(715) \right] \tag{2}$$
where ${a_n}(\lambda )$ is ac-s measured $a(\lambda )$ that has been corrected for ${b_{ua}}$ and $a({715} )$ is actual NIR $a(\lambda )$. “PR” SCAs assume spectrally variable ${b_{ua}}$ that is proportional to the $\lambda $-dependence of spectral scatter ($b(\lambda )$) [26]. Like “FL” SCAs, the simplest “PR” SCA assumes negligible NIR $a(\lambda )$ and is approximated as:
$$a_n(\lambda) = a_m(\lambda) - a_m(715)\,\frac{c_m(\lambda) - a_m(\lambda)}{c_m(715) - a_m(715)} \tag{3}$$
where ${c_m}(\lambda )$ is ac-s measured $c(\lambda )$ uncorrected for ${b_{uc}}$. As with “FL” SCAs, “PR” SCAs can be expanded to account for non-negligible NIR $a(\lambda )$ using the same expression as Eq. (2):
$$a_n(\lambda) = a_m(\lambda) - \left[ a_m(715) - a(715) \right] \frac{c_m(\lambda) - a_m(\lambda)}{c_m(715) - a_m(715)} \tag{4}$$
The ac-s underestimates $c(\lambda )$, which “PR” SCAs utilize to estimate ${a_n}(\lambda )$ (Eqs. (3) and (4)). For this reason, “PR” SCAs cannot utilize ac-s measurements to correct ${c_m}(\lambda )$ without resorting to circular reasoning. Therefore, Röttgers et al. [16] introduce a $\lambda $-independent coefficient (${\epsilon _c}$), which they set to 0.56, to account for ${b_{uc}}$:
$$a_n(\lambda) = a_m(\lambda) - \left[ a_m(715) - a(715) \right] \frac{(1/\epsilon_c)\, c_m(\lambda) - a_m(\lambda)}{(1/\epsilon_c)\, c_m(715) - a_m(715)} \tag{5}$$
We evaluated the ability of these self-contained SCAs to achieve optical closure at five optically distinct Monterey Bay sampling stations (Table 1). We evaluated two “FL” SCAs, one assuming negligible $a({715} )$ ($F{L^0}$; Eq. (1)) and one assuming non-negligible $a({715} )$ ($FL$; Eq. (2)). Similarly, we evaluated two “PR” SCAs, assuming $a({715} )$ to be negligible ($P{R^0}$; Eq. (3)) and non-negligible ($PR$; Eq. (4)). We also evaluated a third “PR” SCA ($P{R_C}$; Eq. (5)) that assumed non-negligible $a({715} )$ but also relied on ${\epsilon _c}$ to correct ${c_m}(\lambda )$. For $FL$, $PR$, and $P{R_C}$, we estimated $a({715} )$ using Röttgers et al.’s [16] empirical relationship:
$$a(715) = 0.212\, a_m(715)^{1.135} \tag{6}$$
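
The five self-contained SCAs reduce to spectrum-level array operations. The sketch below implements Eqs. (1)-(6) for 1-D NumPy arrays of ${a_m}(\lambda )$ and ${c_m}(\lambda )$ sampled on the ac-s wavelength grid; the function names are ours, and the 715 nm reference is taken as the nearest available channel.

```python
import numpy as np

def _ref(spectrum, wavelengths, ref_wl=715.0):
    """Spectrum value at the NIR reference wavelength (nearest ac-s channel)."""
    i = int(np.argmin(np.abs(np.asarray(wavelengths, float) - ref_wl)))
    return spectrum[i]

def a715_estimate(a_m, wavelengths):
    """Empirical non-negligible NIR absorption, Eq. (6)."""
    return 0.212 * _ref(a_m, wavelengths) ** 1.135

def fl0(a_m, wavelengths):
    """Flat SCA assuming negligible a(715), Eq. (1)."""
    return a_m - _ref(a_m, wavelengths)

def fl(a_m, wavelengths):
    """Flat SCA with non-negligible a(715), Eq. (2)."""
    return a_m - (_ref(a_m, wavelengths) - a715_estimate(a_m, wavelengths))

def pr0(a_m, c_m, wavelengths):
    """Proportional SCA assuming negligible a(715), Eq. (3)."""
    a715, c715 = _ref(a_m, wavelengths), _ref(c_m, wavelengths)
    return a_m - a715 * (c_m - a_m) / (c715 - a715)

def pr(a_m, c_m, wavelengths):
    """Proportional SCA with non-negligible a(715), Eq. (4)."""
    a715, c715 = _ref(a_m, wavelengths), _ref(c_m, wavelengths)
    offset = a715 - a715_estimate(a_m, wavelengths)
    return a_m - offset * (c_m - a_m) / (c715 - a715)

def pr_c(a_m, c_m, wavelengths, eps_c=0.56):
    """Proportional SCA with attenuation scaled by 1/eps_c, Eq. (5)."""
    a715, c715 = _ref(a_m, wavelengths), _ref(c_m, wavelengths)
    offset = a715 - a715_estimate(a_m, wavelengths)
    return a_m - offset * (c_m / eps_c - a_m) / (c715 / eps_c - a715)
```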

Table 1. Scattering correction approaches

Our sixth SCA ($M{c_{13}}$) was an iterative SCA that was developed by McKee et al. [23] and improved upon by McKee et al. [25]. $M{c_{13}}$ relies on contemporaneous direct measurements of ${b_{bp}}(\lambda )$ as inputs. It corrects for ${b_{ua}}$ and ${b_{uc}}$, using an iterative method that was created with Monte Carlo simulations of photons traveling through ac-s flow tubes. $M{c_{13}}$ also models the relationship between ${b_{bp}}(\lambda )$ and ${b_{ua}}$ at ten flow tube reflectance efficiencies (${r_W}$) (100% efficiency: ${r_W} = 1$). Because we cannot determine ${r_W}$ retroactively, we computed ${a_n}(\lambda )$ at $0.95 \le {r_W} \le 1$ and used ${[{{L_W}(\lambda )} ]_N}$ derived from C-OPS (Section 2.2) to constrain ${r_W}$.
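
A minimal sketch (assuming NumPy arrays and our own function names) of how ${r_W}$ could be constrained with C-OPS: each candidate ${r_W}$ is scored by the %RMSE between its Hydrolight-modeled ${[{{L_W}(\lambda )} ]_N}$, resampled to the C-OPS wavelengths, and the mean C-OPS spectrum, and the candidate with the lowest score is retained. The relative %RMSE normalization shown here is an assumption (see Section 2.3).

```python
import numpy as np

def percent_rmse(modeled, reference):
    """Percent RMSE of a modeled spectrum relative to a reference spectrum
    (relative normalization assumed; see Section 2.3)."""
    m, r = np.asarray(modeled, float), np.asarray(reference, float)
    return 100.0 * np.sqrt(np.mean(((m - r) / r) ** 2))

def constrain_rw(modeled_by_rw, cops_lwn):
    """Select the flow-tube reflectance efficiency r_W whose Mc13-corrected,
    modeled [LW]_N best matches the mean C-OPS [LW]_N (lowest %RMSE).

    modeled_by_rw : dict mapping candidate r_W (e.g. 0.95 ... 1.00) to a
                    modeled [LW]_N spectrum at the C-OPS wavelengths
    cops_lwn      : mean C-OPS [LW]_N at the same wavelengths
    """
    scores = {rw: percent_rmse(lwn, cops_lwn) for rw, lwn in modeled_by_rw.items()}
    return min(scores, key=scores.get), scores
```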

2.2 Data acquisition and processing

The five Monterey Bay stations were sampled as part of the COAST (CST), OCEANIA (OCE), and C-HARRIER (CHR) NASA airborne campaigns. In accordance with best practices, the C-OPS and HP2 were gently lowered into the water and kited away from the research vessel via their telemetry cables; each descent constituted one cast. Three to four casts were completed consecutively for each profiler [27]. We derived ${[{{L_W}(\lambda )} ]_N}$ from profiler casts using established ocean optics protocols (see [4,28]) and averaged each set of casts by station. In order to enhance vertical resolution, the HP2 was deployed in multicast mode at CHR stations, which involved taking five casts continuously, combining their measurements, and processing the data as a single cast.

To measure surface chlorophyll concentration (Chl) (Table 2), we collected seawater in a pre-rinsed bucket. Following procedures outlined in Kudela et al. [29], three 250 mL seawater aliquots were filtered onto Whatman Glass Fiber Filters (GF/F; nominally 0.7 $\mu $m) under low vacuum, which were stored on dry ice. Chl was extracted and measured following Welschmeyer’s [30] non-acidification technique. We calibrated our Turner Designs 10-AU fluorometer with a Turner Designs chlorophyll standard, which was verified using a spectrophotometric method [31]. We also calculated the first optical depth at 490 nm ($O{D_{490}}$) for each station using the average ${k_d}({490} )$ derived from C-OPS casts (Table 2).
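
For illustration, the sketch below estimates ${k_d}({490} )$ from a single downwelling irradiance profile with a log-linear fit and converts it to the first optical depth, $O{D_{490}} = 1/{k_d}({490} )$. This is a simplified estimator under idealized assumptions, not the protocol-based C-OPS processing used here; all names are ours.

```python
import numpy as np

def kd_from_ed_profile(depth_m, ed_490):
    """Diffuse attenuation coefficient kd(490) from a downwelling irradiance
    profile via a log-linear fit, Ed(z) ~ Ed(0-) * exp(-kd * z)."""
    slope, _intercept = np.polyfit(np.asarray(depth_m, float),
                                   np.log(np.asarray(ed_490, float)), 1)
    return -slope

def first_optical_depth(kd_490):
    """First optical depth at 490 nm: the depth at which Ed(490) falls to 1/e
    of its subsurface value, i.e. 1 / kd(490)."""
    return 1.0 / kd_490
```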

Table 2. Bio-optical descriptors for Monterey Bay stations

Bausell and Kudela [13] describe data acquisition and processing methods in detail. The ac-s, HS6, and an integrated Sea-Bird sonde were affixed to a metal cage that was raised and lowered through the water column multiple times in succession. Prior to applying SCAs, ${a_m}(\lambda )$ and ${c_m}(\lambda )$ were corrected for spectral ‘jumps’ caused by the ac-s’ two asynchronous holographic gratings. They were subsequently corrected for random drift by applying pure-water calibrations, and for directional temperature and salinity biases [32]. We applied SCAs following these preliminary corrections. Self-contained SCAs were applied to ${a_m}(\lambda )$ prior to depth binning (0.5 m), while $M{c_{13}}$ was applied to binned ${a_m}(\lambda )$ and ${c_m}(\lambda )$ because of computational limitations. To process ${b_{bp}}(\lambda )$, we first subtracted pure-water backscattering coefficients, as approximated by Morel [33], from the measured total backscatter. We then applied Doxaran et al.’s [22] sigma correction five times at each station, using each set of binned ${a_n}(\lambda )$ as computed with the self-contained SCAs. Prior to applying $M{c_{13}}$, we sigma-corrected ${b_{bp}}(\lambda )$ using binned ${a_m}(\lambda )$ and linearly interpolated the ${b_{bp}}(\lambda )$ values to correspond with the ac-s’ channels.
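
Two of the routine processing steps above, 0.5 m depth binning of ac-s spectra and linear interpolation of the six-channel HS6 ${b_{bp}}(\lambda )$ onto the ac-s channels, are sketched below assuming NumPy-array inputs. The sigma correction and pure-water subtraction themselves are omitted, and the function names are ours.

```python
import numpy as np

def depth_bin(depth_m, spectra, bin_size=0.5):
    """Average spectra (rows = samples, columns = channels) into fixed-width
    depth bins (0.5 m here, matching the ac-s processing). Returns the bin
    centers and the binned spectra."""
    depth_m = np.asarray(depth_m, float)
    spectra = np.asarray(spectra, float)
    edges = np.arange(0.0, depth_m.max() + bin_size, bin_size)
    idx = np.digitize(depth_m, edges)
    centers, binned = [], []
    for k in np.unique(idx):
        centers.append(edges[k - 1] + bin_size / 2.0)
        binned.append(spectra[idx == k].mean(axis=0))
    return np.array(centers), np.array(binned)

def bbp_to_acs_channels(hs6_wavelengths, bbp_hs6, acs_wavelengths):
    """Linearly interpolate the six-channel HS6 b_bp spectrum onto the ac-s
    channels; np.interp holds the end-channel values outside the HS6 range."""
    return np.interp(acs_wavelengths, hs6_wavelengths, bbp_hs6)
```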

We modeled ${[{{L_W}(\lambda )} ]_N}$ using Hydrolight 5.3 (HE53), computer software that uses radiative transfer theory to simulate underwater light fields. HE53 computes AOPs (e.g., ${[{{L_W}(\lambda )} ]_N}$) from binned ${a_n}(\lambda )$, ${c_n}(\lambda )$, and ${b_{bp}}(\lambda )$, as well as Chl, which was used to model the red fluorescence peak. In all cases, we modeled ${[{{L_W}(\lambda )} ]_N}$ using multispectral rather than linearly interpolated ${b_{bp}}(\lambda )$. We modeled ${[{{L_W}(\lambda )} ]_N}$ for each SCA (Table 1) using the HE53 settings outlined below (Table 3); all unlisted settings remained at their defaults. The CST, OCE, and CHR campaigns were sampled years apart (Table 2); thus, ${r_W}$ could have degraded over time [25]. Because we could not measure ${r_W}$ during sampling, we modeled ${[{{L_W}(\lambda )} ]_N}$ with each set of $M{c_{13}}$ ${a_n}(\lambda )$, corrected at $0.95 \le {r_W} \le 1$. We then constrained ${r_W}$, and by extension $M{c_{13}}$, using C-OPS ${[{{L_W}(\lambda )} ]_N}$ in order to ascertain 1) $M{c_{13}}$’s potential to reliably characterize ${[{{L_W}(\lambda )} ]_N}$ and 2) whether ${b_{ua}}$ can explain the underestimation of above-water AOPs at mid-visible, or “green maxima,” $\lambda $s (e.g., 555 and 560 nm) reported by previous Monterey Bay optical closure studies (e.g., [13,21]).

Table 3. HE53 Settings

2.3 Data analysis

At each station, we evaluated agreement between C-OPS ${[{{L_W}(\lambda )} ]_N}$ and all seven hyperspectral ${[{{L_W}(\lambda )} ]_N}$s (six SCAs plus HP2) (Table 4). We chose C-OPS as the standard by which to evaluate the SCAs and HP2 because it has been demonstrated to be a reliable and accurate profiler in coastal ecosystems (e.g., [9,13]). As such, we limited our analyses to the 12-13 C-OPS-centered $\lambda $s (station-dependent) that coincide with the ac-s and HS6 spectral range ($400 \le \lambda < 715$ nm). We first calculated percent root-mean-square error (%RMSE) between C-OPS and the hyperspectral ${[{{L_W}(\lambda )} ]_N}$s (Table 4), which we used as a proxy for the latter’s overall uncertainty. We then evaluated qualitative and quantitative agreement between C-OPS and the hyperspectral ${[{{L_W}(\lambda )} ]_N}$s at each $\lambda $. Qualitative agreement was achieved for a given $\lambda $ if hyperspectral ${[{{L_W}(\lambda )} ]_N}$ fell within one standard deviation of C-OPS. Quantitative agreement was achieved for a given $\lambda $ if the absolute percent difference ($\%\delta $) was $\le $ 10%, classified by Mueller et al. [5] as (satellite) “validation” quality. We use the term “agreement” as a catch-all for spectral optical closure (e.g., SCAs) and profiler (e.g., HP2) comparisons involving C-OPS. Our methods allowed hyperspectral ${[{{L_W}(\lambda )} ]_N}$s to achieve agreement at some $\lambda $s but not others; our evaluation therefore placed emphasis on the green maxima $\lambda $s.
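
The evaluation metrics reduce to a few array operations, sketched below for spectra already restricted to the shared C-OPS/ac-s/HS6 $\lambda $s. The exact normalizations of %RMSE and of the percent difference are not spelled out above, so the relative forms used here are assumptions; the thresholds follow the text (one standard deviation, 10%).

```python
import numpy as np

def percent_rmse(modeled, reference):
    """Percent RMSE of a hyperspectral [LW]_N spectrum against the C-OPS mean
    (relative normalization assumed)."""
    m, r = np.asarray(modeled, float), np.asarray(reference, float)
    return 100.0 * np.sqrt(np.mean(((m - r) / r) ** 2))

def agreement_flags(modeled, cops_mean, cops_std, quant_threshold=10.0):
    """Per-wavelength agreement with C-OPS.

    qualitative  : modeled value within +/- one standard deviation of C-OPS
    quantitative : absolute percent difference <= 10% ('validation' quality)
    """
    m = np.asarray(modeled, float)
    mu = np.asarray(cops_mean, float)
    sd = np.asarray(cops_std, float)
    qualitative = np.abs(m - mu) <= sd
    quantitative = 100.0 * np.abs(m - mu) / mu <= quant_threshold
    return qualitative, quantitative
```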

Table 4. Hyperspectral vs. C-OPS ${[{{L_W}(\lambda )} ]_N}$ by station (%RMSE). $M{c_{13}}$ and the best-performing SCA are shaded

In order to establish whether SCA selection can improve optical closure in Monterey Bay (e.g., [13,21]) and to determine whether modeling is a reasonable alternative for satellite validation more broadly, we centered our comparisons on $PR$, HP2, $M{c_{13}}$, and the best-performing self-contained SCA (lowest %RMSE) at each station; we also used %RMSE to constrain $M{c_{13}}$. Additionally, we compared the in-water properties of these hyperspectral ${[{{L_W}(\lambda )} ]_N}$s to gain insight into their differences. We compared near-surface (bin: 0.5-1 m) ${a_n}(\lambda )$ and ${b_{bp}}(\lambda )$ between SCAs and performed a sensitivity analysis of ${[{{L_W}(\lambda )} ]_N}$ modeled using $M{c_{13}}$-corrected ${a_n}(\lambda )$ ($0.95 \le {r_W} \le 1$). We selected the 0.5-1 m depth bin for these IOP comparisons because ac-s and HS6 deployment configurations (Section 2.2) precluded consistent measurements above 0.5 m. Lastly, we compared subsurface remote sensing reflectance profiles (${r_{rs}}({\lambda ,z} )$) between C-OPS (binned at 0.1 m), HP2, and SCA-modeled light fields, as a function of $\lambda $ and depth ($z$) (0-5 m).

$$r_{rs}(\lambda, z) = \frac{L_u(\lambda, z)}{E_d(\lambda, z)} \tag{7}$$
where ${E_d}({\lambda ,z} )$ is downwelling irradiance and ${L_u}({\lambda ,z} )$ is upwelling radiance. We evaluated the ${r_{rs}}({\lambda ,z} )$ comparisons and our sensitivity analysis at three $\lambda $s: 443, 490, and 555 nm.
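
Equation (7) applied to co-located profiles is an element-wise ratio; the sketch below also extracts the three evaluation $\lambda $s by nearest channel. Array shapes and function names are assumptions of ours.

```python
import numpy as np

def rrs_profile(lu, ed):
    """Subsurface remote-sensing reflectance r_rs(lambda, z) = Lu / Ed, Eq. (7).
    lu and ed are (n_depths, n_wavelengths) arrays on a common grid."""
    return np.asarray(lu, float) / np.asarray(ed, float)

def rrs_at_wavelengths(lu, ed, wavelengths, targets=(443.0, 490.0, 555.0)):
    """r_rs profiles at the channels nearest the three evaluation wavelengths."""
    wavelengths = np.asarray(wavelengths, float)
    cols = [int(np.argmin(np.abs(wavelengths - t))) for t in targets]
    return rrs_profile(lu, ed)[:, cols]
```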

3. Results

Although constrained with different ${r_W}$ values (Fig. 1), $M{c_{13}}$ exhibited the lowest overall %RMSE across stations (14%), followed by $F{L^0}$ (16%) and $P{R^0}$ (18%). Of the self-contained SCAs, $F{L^0}$ and $P{R^0}$ performed best at CST and CHR stations respectively; they performed equally well at OCE18 (Table 4). Radiative transfer modeling outperformed HP2 at all stations except CHR2 (Table 4); however, HP2 and modeled ${[{{L_W}(\lambda )} ]_N}$s were similarly shaped (Fig. 1).

Fig. 1. Hyperspectral ${[{{L_W}(\lambda )} ]_N}$ derived from HP2 and modeled using $PR$, $M{c_{13}}$, $F{L^0}$, and $P{R^0}$ overlaid with mean C-OPS ${[{{L_W}(\lambda )} ]_N}$ (${\pm} $ standard deviation) and $M{c_{13}}$, with ${r_W}$ constrained by C-OPS.

$M{c_{13}}$ and best-performing SCAs improved agreement across stations relative to $PR$. $F{L^0}$ improved quantitative agreement of modeled hyperspectral ${[{{L_W}(\lambda )} ]_N}$ from eight to ten $\lambda $s at CST17 [Fig. 2(a)] and from one to four $\lambda $s at CST18 [Fig. 2(c)]. $F{L^0}$ improved qualitative agreement at these stations from two to nine $\lambda $s [Fig. 2(a)] and from four to seven $\lambda $s [Fig. 2(c)]. $M{c_{13}}$ improved quantitative and qualitative agreement to ten and eight $\lambda $s respectively at CST17 [Fig. 2(a)] and to seven and 11 $\lambda $s at CST18 [Fig. 2(c)]. $F{L^0}$ improved qualitative agreement from six to nine $\lambda $s but did not improve quantitative agreement at OCE18 [Fig. 2(b)].

Fig. 2. Quantitative (heatmap) and qualitative (circles) agreement between ${[{{L_W}(\lambda )} ]_N}$ measured with C-OPS versus ${[{{L_W}(\lambda )} ]_N}$ measured with HP2 and modeled with different SCAs. Black and white circles denote whether qualitative agreement was ($\surd $) or was not ($\textrm{X}$) achieved.

$P{R^0}$ improved quantitative agreement from five to eight $\lambda $s at OCE18 [Fig. 2(b)] and three to six $\lambda $s at CHR2 [Fig. 2(e)]; $P{R^0}$ improved qualitative agreement by one $\lambda $ at these stations [Fig. 2(b) and 2(e)]. $P{R^0}$ improved agreement by one $\lambda $ at CHR1 [Fig. 2(d)]. Unlike $P{R^0}$, $M{c_{13}}$ did not improve quantitative agreement at OCE18 or CHR1 [Figs. 2(b) and 2(d)], but improved quantitative agreement to seven $\lambda $s at CHR2 [Fig. 2(e)]. $M{c_{13}}$ increased qualitative agreement to nine and five $\lambda $s at OCE18 and CHR2 [Figs. 2(b) and 2(e)] but decreased qualitative agreement by one $\lambda $ at CHR1 [Fig. 2(d)]. Modeled ${[{{L_W}(\lambda )} ]_N}$ outperformed HP2 at all stations but CHR2. Broadly speaking, improved agreement relative to $PR$ and HP2 occurred throughout the spectrum, except for the red fluorescence peak (683 nm) (Fig. 2).

Agreement differences between hyperspectral ${[{{L_W}(\lambda )} ]_N}$s are explained by our sensitivity analysis, which demonstrates increased sensitivity of ${[{{L_W}(\lambda )} ]_N}$ to ${a_n}(\lambda )$ at 555 nm (Fig. 3). Likewise, discrepancies between SCAs were most pronounced for green maxima (Fig. 1) despite relatively consistent offsets between near-surface IOPs across the visible spectrum (Fig. 4). Because %RMSE gives greater weight to larger errors, intrastation rankings of hyperspectral ${[{{L_W}(\lambda )} ]_N}$s strongly reflect their similarity to C-OPS at green maxima $\lambda $s. For example, $M{c_{13}}$ achieved agreement, quantitative and qualitative, for at least one green maxima $\lambda $ at all stations but CHR1 (Fig. 2). Similarly, $F{L^0}$ demonstrated agreement at CST17 and OCE18 [Figs. 2(a)–2(b)], as well as near quantitative agreement ($\%\delta $ $\le $ 11%) at both CST stations [Figs. 2(a) and 2(c)]; $P{R^0}$ exhibited quantitative agreement at OCE18 and CHR1 [Figs. 2(b) and 2(d)]. For green maxima $\lambda $s, at least one SCA outperformed HP2 at all stations except CHR2.

Fig. 3. Sensitivity analysis of near-surface ${a_n}(\lambda )$ on ${[{{L_W}(\lambda )} ]_N}$ at three $\lambda $s: 443, 490 and 555 nm. ${[{{L_W}(\lambda )} ]_N}$ was modeled from ${a_n}(\lambda )$ and ${b_{bp}}(\lambda )$ corrected by $M{c_{13}}$ (0.95 $\le {r_W} \le $ 1). $\mathrm{\Delta }{a_n}(\lambda )$ and $\mathrm{\Delta }{[{{L_W}(\lambda )} ]_N}$ equal zero at ${r_W} = 1$.

Fig. 4. Near-surface ${a_n}(\lambda )$ (a-e) and ${b_{bp}}(\lambda )$ (f-j), as corrected with $M{c_{13}}$, $PR$, $F{L^0}$ (CST), and $P{R^0}$ (OCE and CHR).

Consistent with our sensitivity analysis (Fig. 3) and SCA comparisons (Fig. 2), ${r_{rs}}({\lambda ,z} )$ offsets between SCAs were greatest at 555 nm (Fig. 5). In comparing ${r_{rs}}({\lambda ,z} )$, we observed shape similarities between C-OPS and the SCAs at the three $\lambda $s that we examined (Fig. 5). However, within stations we found no relationship between agreement across SCAs and their ${r_{rs}}({\lambda ,z} )$ offsets with C-OPS (Figs. 2 and 5). Despite its ${r_{rs}}({\lambda ,z} )$ overlapping with C-OPS at all stations, HP2 did not improve agreement compared to the best-performing SCAs (Figs. 2 and 5).

Fig. 5. ${r_{rs}}({\lambda ,z} )$s of C-OPS, HP2, and SCAs (see Fig. 1). The shaded region in each plot denotes the depth below $O{D_{490}}$ (see Table 2). Profiler cast (1-3) and multicast (m) are in legend subscript. Arranged by $\lambda $ (1-3) and station: A) CST17, B) OCE18, C) CST18, D) CHR1, and E) CHR2.

4. Discussion and conclusions

Our findings establish modeling as an alternative or complement to exclusive reliance on profilers for validating hyperspectral satellite data from coastal ecosystems. We reach this conclusion based on 1) spectral shape similarities between modeled and HP2-derived ${[{{L_W}(\lambda )} ]_N}$ (Fig. 1), and 2) reductions in the uncertainty of modeled ${[{{L_W}(\lambda )} ]_N}$ compared with HP2 (Table 4; Fig. 2) for the majority of stations and casts. These findings are consistent with previous analyses estimating that the ac-s and volume scattering functions propagate 8% error, on average, into modeled above-water AOPs [13]. In contrast, a single HP2 cast introduces 25% error into its derived products [14]. Our method has several intrinsic advantages over profilers. It is not susceptible to wave focusing, shading, or tilt [7,12], for which C-OPS compensates with cm-scale depth resolution. In coastal ecosystems, however, profiler depth artifacts as small as 8 cm, which can be caused by an offset pressure transducer or asynchronous sensor integration times, can introduce significant error into derived data products in the visible [9], especially when combined with coarser depth resolution [12]. The reduction in the uncertainty of modeled ${[{{L_W}(\lambda )} ]_N}$ compared with HP2, despite similar vertical resolutions, suggests that modeling may be less susceptible to depth artifacts or diminished resolution than profilers.

The improvement in optical closure achieved with the best-performing SCAs, as opposed to $PR$, suggests that previous failures to achieve optical closure in Monterey Bay (e.g., [13,21]) are attributable to ${b_{ua}}$. This observation underscores the importance of correcting ${a_m}(\lambda )$ with an SCA whose assumptions (Section 2.1) are applicable to the target ecosystem. Tonizzo et al. [14] demonstrate this importance by reporting wide disparities in overall %RMSE between $F{L^0}$, $P{R^0}$, and $M{c_{13}}$ within different ecosystems. Our overall results (Table 4) are similar in scope to those of Tonizzo et al. [14], who report, for example, $M{c_{13}}$ and $F{L^0}$ to be optimal in the New York Bight ($\le $ 24%) and off the San Diego coast (19%), respectively. However, our findings demonstrate that disparities between these three SCAs also exist between water masses within the same ecosystem (Table 4; Fig. 2), which here included low (CST17 and CHR1), medium (OCE18 and CHR2), and high (CST18) phytoplankton biomasses (Table 2).

Water masses with high phytoplankton biomass, such as CST18, are characterized by spatial ‘patchiness’ of phytoplankton and rapid light attenuation in the upper meter of the water column. Bausell and Kudela [13] attribute CST18’s diminished optical closure, relative to the other stations, to these water mass characteristics. Not only was our ac-s/HS6 configuration unable to measure the upper 0.5 m of the water column, it was also deployed asynchronously with the C-OPS (Section 2.2). Consequently, HE53 modeled light fields without IOP input from the upper half meter, which in this case would be expected to drive the reflectance signature. However, our ability to improve agreement at CST18 with $M{c_{13}}$ suggests that ${b_{ua}}$ was also responsible for much of the discrepancy between C-OPS and modeled ${[{{L_W}(\lambda )} ]_N}$ reported by Bausell and Kudela [13]. Spectral shape similarities between C-OPS and SCA ${r_{rs}}({\lambda ,z} )$s at 443, 490, and 555 nm suggest that the modeled AOPs are representative of the vertical structure of underwater light fields (Fig. 5).

In explaining the lack of a relationship between ${r_{rs}}({\lambda ,z} )$ offsets (Fig. 5) and ${[{{L_W}(\lambda )} ]_N}$ agreement (Fig. 2) between SCAs and C-OPS, we note that these two AOPs are derived differently. ${[{{L_W}(\lambda )} ]_N}$ is calculated from ${L_u}(\lambda )$ and ${E_d}(\lambda )$ extrapolated across the air-water interface [13], whereas ${r_{rs}}({\lambda ,z} )$ is computed from in-water ratios (Eq. (7)); small-scale changes in ${L_u}(\lambda )$ and ${E_d}(\lambda )$ may therefore impact the two quantities differently. Similarly, we suggest that HP2’s low vertical resolution, which can increase the extrapolation uncertainty of above-water AOPs, explains how HP2 can overlap better with C-OPS in ${r_{rs}}({\lambda ,z} )$ yet show diminished ${[{{L_W}(\lambda )} ]_N}$ agreement relative to the SCAs (Figs. 2 and 5).

The increased variability between SCAs at 555 nm, both in terms of ${r_{rs}}({\lambda ,z} )$ and agreement with C-OPS, can be explained by the nonlinear relationship between $a(\lambda )$ and AOPs reported by Gordon et al. [34] for phytoplankton-dominated ecosystems. The sensitivity of this relationship decreases as $a(\lambda )$ increases. Because the phytoplankton $a(\lambda )$ signal is weakest in the green [35], ${[{{L_W}(\lambda )} ]_N}$ and ${r_{rs}}({\lambda ,z} )$ are more sensitive to ${a_n}(\lambda )$ there than at $\lambda $s where pigments absorb strongly (e.g., 443 nm) (Fig. 4). Our inability to model the red fluorescence peak (Figs. 1 and 2) is attributable to HE53’s use of Chl, coupled with a quantum efficiency value, to model phytoplankton fluorescence; the default quantum efficiency (0.02) may be inaccurate for Monterey Bay [36].
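
This nonlinearity can be illustrated with the semianalytic reflectance model of Gordon et al. [34], in which subsurface reflectance depends on $u = {b_b}/({a + {b_b}} )$. In the sketch below the quadratic coefficients are the commonly quoted values from that study, and the absorption and backscattering numbers are invented purely for illustration; it is a toy calculation, not a substitute for the full radiative transfer model.

```python
def gordon_rrs(a, bb, g1=0.0949, g2=0.0794):
    """Semianalytic subsurface reflectance r_rs ~= g1*u + g2*u**2 with
    u = bb / (a + bb), after Gordon et al. [34] (coefficients treated as
    nominal values)."""
    u = bb / (a + bb)
    return g1 * u + g2 * u * u

# The same 0.01 1/m perturbation in a(lambda) shifts r_rs far more when
# absorption is weak (green maxima in phytoplankton-dominated water) than
# when it is strong (e.g., near 443 nm).
for a in (0.05, 0.5):                       # weak vs. strong absorption, 1/m
    bb, da = 0.01, 0.01                     # backscatter and perturbation, 1/m
    d_rrs = gordon_rrs(a, bb) - gordon_rrs(a + da, bb)
    print(f"a = {a:4.2f} 1/m: delta r_rs = {d_rrs:.2e} sr^-1")
```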

There are several tradeoffs to consider when modeling ${[{{L_W}(\lambda )} ]_N}$ with long pathlength instruments. Collectively, the self-contained SCAs encompass a diverse array of assumptions about the optical properties of water masses (e.g., [16,26]). These SCAs therefore potentially afford investigators the latitude to model water masses within a diverse array of aquatic ecosystems. However, to select the optimal self-contained SCA without co-measured AOPs, investigators must have a working understanding of the optical properties governing the water masses within their target ecosystems. For example, the observed efficacy of $F{L^0}$ and $P{R^0}$ in modeling ${[{{L_W}(\lambda )} ]_N}$ (e.g., Figs. 1 and 2) suggests that while Monterey Bay phytoplankton assemblages absorb negligibly in the NIR, their phase functions can be both dependent on and independent of $\lambda $ [34]. Reliably modeling AOPs in Monterey Bay (e.g., selecting the correct SCA) without profiler data would thus require an improved understanding of the optical properties associated with its water masses. In contrast to the self-contained SCAs, $M{c_{13}}$ utilizes co-measured ${b_{bp}}(\lambda )$ to correct ${a_m}(\lambda )$; hence, investigators need no working understanding of the optical properties governing their target ecosystems [25]. Moreover, $M{c_{13}}$ corrects ${c_m}(\lambda )$ for ${b_{uc}}$, although it does so at the expense of ${b_{bp}}(\lambda )$, which cannot be sigma-corrected with ${a_n}(\lambda )$ (e.g., [22]) without circular reasoning.

$M{c_{13}}$ relies on an instrument-specific ${r_w}$, which McKee et al. [25] infer by optimizing fits between ac-s measured ${a_n}(\lambda )$ and $a(\lambda )$ co-measured with a Helmholtz-Zentrum Geesthacht point-source integrating cavity absorption meter (PSICAM), which is unperturbed by ${b_{ua}}$. Because the PSICAM measures $a(\lambda )$ from water samples collected at discrete depths, its $a(\lambda )$ can be used to constrain ${r_w}$. However, IOP datasets containing co-measured ac-s ${a_n}(\lambda )$ and PSICAM $a(\lambda )$ are uncommon. Moreover, while co-measured PSICAM $a(\lambda )$ can infer ${r_w}$ for an ac-s over a short time period (e.g., a field campaign), this ${r_w}$ value may not be applicable to past ac-s measurements because the flow tube walls may have degraded over time [25], especially (as in our case) when stations were sampled years apart (Table 2). Therefore, constraining $M{c_{13}}$ with co-measured AOPs (e.g., C-OPS) may be more feasible than doing so with a PSICAM. Using multiple ${r_w}$s, $M{c_{13}}$ outperformed HP2 at four stations (Fig. 6), suggesting that ${b_{ua}}$ can explain previous failures to adequately characterize Monterey Bay water masses with modeled data (e.g., [13,21]). While the C-OPS-constrained ${r_w}$ likely reflects the accumulation of errors through the modeling process rather than the actual flow tube reflectance efficiency, our results speak to the potential of this method to characterize coastal ecosystems. For example, averaging C-OPS-constrained ${r_w}$ across stations (∼0.97) improves modeling, relative to HP2, at three stations (Fig. 6); however, it is important to remember that the true ${r_w}$ at these stations may have differed due to flow tube degradation. We believe that C-OPS-constrained ${r_w}$s across stations sampled within the same period (e.g., a cruise) are less likely to differ significantly. Notwithstanding, variation in C-OPS-constrained ${r_w}$ across stations decreased when stations were partitioned by water mass properties. $M{c_{13}}$ outperformed HP2 at four stations when we averaged ${r_w}$ separately for low-biomass stations (∼0.985) and medium-to-high biomass stations (∼0.96) (Table 2; Fig. 6).

Fig. 6. Heatmap indicating reduced (blue) or increased (red) uncertainty (%RMSE) of $M{c_{13}}$ relative to HP2 for all ten ${r_w}$ values (columns) at each station (rows). Light blue indicates $M{c_{13}}$ exhibited lower uncertainty than HP2 when the 683 nm band was excluded from analysis and black circles indicate ${r_w}$ as constrained by C-OPS.

We conclude that long pathlength instruments, such as the 25 cm ac-s and the HS6, may provide an alternative or complementary method to in-water radiometric profilers for validating satellite-derived hyperspectral data from coastal ecosystems; inclusion of additional sensors such as a PSICAM could further constrain retrieved data. Although limited in scope, our findings show that modeled data exhibited lower overall uncertainty (%RMSE) than HP2 at four of the five stations (Table 4). Modeling ${[{{L_W}(\lambda )} ]_N}$ with $M{c_{13}}$ also produced lower uncertainty than HP2 at CHR2 once the 683 nm band (chlorophyll fluorescence) was excluded from the analyses (Fig. 6). Modeling is inherently advantageous over radiometry in that above-water AOPs can be modeled at any $\lambda $, meaning that satellite validation would no longer be limited to the $\lambda $s on which profiler channels are centered. However, because Monterey Bay water masses typically exhibit low concentrations of colored dissolved organic matter and terrigenous sediments, we believe that applying our method to coastal ecosystems with significant riverine inputs would provide a more robust evaluation of these methods. Such an extension would be an important step in evaluating the utility of $P{R_C}$, $PR$, and $FL$, which assume non-negligible $a(\lambda )$ in the NIR (Eqs. (2), (4), and (5)).

An initial evaluation would suggest that modeling ${[{{L_W}(\lambda )} ]_N}$ with PSICAM-inferred ${r_w}$ may be unreliable, as the average C-OPS-constrained ${r_w}$ (∼0.97) outperformed HP2 at only three stations (Fig. 6); however, the field campaigns occurred years apart, and ${r_w}$ may therefore have differed among them (Table 2). In the absence of a PSICAM, co-measured AOPs would allow investigators to compensate for a limited understanding of the optical properties of water masses within the ecosystems they target. For example, investigators could co-deploy the C-OPS with an ac-s and HS6 at a handful of stations during a field campaign and use the in situ AOPs to 1) determine the optimal self-contained SCA at those stations, or 2) determine the most likely value(s) at which ${r_w}$ will be constrained. This information could then be used to model light fields at additional stations without co-measured AOPs. This ability has important applications for validating hyperspectral next-generation ocean color satellite and aircraft sensors, as our method could potentially characterize water mass light fields in less-than-ideal conditions, such as uneven cloud cover, high solar zenith angle, or at night, for which profilers do not produce high-fidelity data. We believe that, given the ubiquity of ac-s and HS6 data in the ocean optics community, our method creates the potential for an increased volume of above-water AOPs derived from in situ IOPs for validating satellite and airborne imagery in coastal ecosystems. While an ideal scenario would be to co-deploy multiple IOP and AOP instruments to fully constrain the model, using a subset of data still provides significant improvements for calibration, validation, and research when hyperspectral data are required.

Funding

National Aeronautics and Space Administration (NNH11ZDA001N, NNX14AI30A, NNX17AK89G).

Acknowledgments

We thank Dr. Liane Guild (NASA Ames Research Center) for leading the three field campaigns (one of which was also associated with the HyspIRI Preparatory Airborne Activities as part of NNH11ZDA001N), Dr. Stanford Hooker (NASA/Goddard Space Flight Center) and Mr. James Brown (University of Miami) for technical assistance in processing profiler AOP data, and Kendra N. Hayashi (University of California, Santa Cruz) for her important role in field sampling.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. Ryan, M. McManus, R. Kudela, M. L. Artigas, J. Bellingham, F. Chavez, G. Doucette, D. Foley, M. Godin, and J. Harvey, “Boundary influences on HAB phytoplankton ecology in a stratification-enhanced upwelling shadow,” Deep Sea Res., Part II 101, 63–79 (2014). [CrossRef]  

2. R. Amin, D. Lewis, R. W. Gould, W. Hou, A. Lawson, M. Ondrusek, and R. Arnone, “Assessing the application of cloud–shadow atmospheric correction algorithm on HICO,” IEEE Trans. Geosci. Remote Sensing 52(5), 2646–2653 (2014). [CrossRef]  

3. R. A. Barnes, D. K. Clark, W. E. Esaias, G. S. Fargion, G. C. Feldman, and C. R. McClain, “Development of a consistent multi-sensor global ocean colour time series,” International Journal of Remote Sensing 24(20), 4047–4064 (2003). [CrossRef]  

4. J. Mueller, C. Pietras, S. Hooker, D. Clark, A. M. R. Frouin, and G. Fargion, “Ocean optics protocols for satellite ocean color sensor validation, revision 3, volumes 1 and 2,” in NASA tech. memo (2002).

5. J. L. Mueller, S. Brown, D. Clark, B. Johnson, H. Yoon, K. Lykke, S. Flora, M. Feinholz, N. Souaidia, C. Pietras, T. Stone, M. Yarbrough, Y. Kim, and R. Barnes, “Ocean Optics Protocols For Satellite Ocean Color Sensor Validation, Revision 5. Volume VI: Special Topics in Ocean Optics Protocols, Part 2,” (2004).

6. M. E. Ondrusek, E. Stengel, M. A. Rella, W. Goode, S. Ladner, and M. Feinholz, “Validation of ocean color sensors using a profiling hyperspectral radiometer,” in SPIE Sensing Technology + Applications (SPIE, 2014), p. 12.

7. G. Zibordi, D. D’Alimonte, and J.-F. Berthon, “An evaluation of depth resolution requirements for optical profiling in coastal waters,” Journal of Atmospheric and Oceanic Technology 21(7), 1059–1073 (2004). [CrossRef]  

8. J. P. Ryan, M. A. McManus, and J. M. Sullivan, “Interacting physical, chemical and biological forcing of phytoplankton thin-layer variability in Monterey Bay, California,” Cont. Shelf Res. 30(1), 7–16 (2010). [CrossRef]  

9. S. Hooker, J. Morrow, and A. Matsuoka, “The 1% and 1 cm perspective in deriving and validating AOP data products,” Biogeosci. Discussions 9(7), 9487–9531 (2012). [CrossRef]  

10. S. L. Palacios, R. M. Kudela, L. S. Guild, K. H. Negrey, J. Torres-Perez, and J. Broughton, “Remote sensing of phytoplankton functional types in the coastal ocean from the HyspIRI Preparatory Flight Campaign,” Remote Sensing of Environment 167, 269–280 (2015). [CrossRef]  

11. P. J. Werdell, M. J. Behrenfeld, P. S. Bontempi, E. Boss, B. Cairns, G. T. Davis, B. A. Franz, U. B. Gliese, E. T. Gorman, and O. Hasekamp, “The Plankton, Aerosol, Cloud, ocean Ecosystem mission: status, science, advances,” Bull. Am. Meteorol. Soc. 100(9), 1775–1794 (2019). [CrossRef]  

12. D. D’Alimonte, G. Zibordi, and T. Kajiyama, “Effects of integration time on in-water radiometric profiles,” Opt. Express 26(5), 5908–5939 (2018). [CrossRef]  

13. J. Bausell and R. Kudela, “Comparison of two in-water optical profilers in a dynamic coastal marine ecosystem,” Appl. Opt. 58(27), 7319–7330 (2019). [CrossRef]  

14. A. Tonizzo, M. Twardowski, S. McLean, K. Voss, M. Lewis, and C. Trees, “Closure and uncertainty assessment for ocean color reflectance using measured volume scattering functions and reflective tube absorption coefficients with novel correction for scattering,” Appl. Opt. 56(1), 130–146 (2017). [CrossRef]  

15. P. Jørgensen, G. Tilstone, J. Høkedal, and W. Schönfeld, “Intercomparison of Spectral Backscattering Coefficients Measured In Situ Using Several Hydroscat Instruments Results from PlymCal-2 and REVAMP Cruises,” in ESA Special Publication (2003).

16. R. Röttgers, D. McKee, and S. B. Woźniak, “Evaluation of scatter corrections for ac-9 absorption measurements in coastal waters,” Methods in Oceanography 7, 21–39 (2013). [CrossRef]  

17. J. T. Kirk, Light and photosynthesis in aquatic ecosystems (Cambridge university press, 1994).

18. T. Oishi, “Significant relationship between the backward scattering coefficient of sea water and the scatterance at 120°,” Appl. Opt. 29(31), 4658–4665 (1990). [CrossRef]  

19. M. Chami, E. B. Shybanov, G. A. Khomenko, M. E.-G. Lee, O. V. Martynov, and G. K. Korotaev, “Spectral variation of the volume scattering function measured over the full range of scattering angles in a coastal environment,” Appl. Opt. 45(15), 3605–3619 (2006). [CrossRef]  

20. WET Labs Inc., “ECO-VSF 3: Three-angle, Three-wavelength Volume Scattering Function Meter,” (2007).

21. N. Tuchow, J. Broughton, and R. Kudela, “Sensitivity analysis of volume scattering phase functions,” Opt. Express 24(16), 18559–18570 (2016). [CrossRef]  

22. D. Doxaran, E. Leymarie, B. Nechad, A. Dogliotti, K. Ruddick, P. Gernez, and E. Knaeps, “Improved correction methods for field measurements of particulate light backscattering in turbid waters,” Opt. Express 24(4), 3615–3637 (2016). [CrossRef]  

23. D. McKee, J. Piskozub, and I. Brown, “Scattering error corrections for in situ absorption and attenuation measurements,” Opt. Express 16(24), 19480–19492 (2008). [CrossRef]  

24. WET Labs Inc., “Spectral Absorption and Attenuation Meter User's Guide,” (2008).

25. D. McKee, J. Piskozub, R. Röttgers, and R. A. Reynolds, “Evaluation and improvement of an iterative scattering correction scheme for in situ absorption and attenuation measurements,” J. Atmos. Oceanic Technol. 30(7), 1527–1541 (2013). [CrossRef]  

26. J. R. V. Zaneveld, J. C. Kitchen, and C. C. Moore, “Scattering error correction of reflection-tube absorption meters,” in Ocean Optics XII (International Society for Optics and Photonics, 1994), pp. 44–55.

27. Biospherical Instruments Inc., “Compact Optical Profiling System: Hardware User's Manual (Doc 006406UA),” (2010).

28. J. L. Mueller and R. W. Austin, Ocean optics protocols for SeaWiFS validation (National Aeronautics and Space Administration, Goddard Space Flight Center, 1992).

29. R. M. Kudela, W. P. Cochlan, T. D. Peterson, and C. G. Trick, “Impacts on phytoplankton biomass and productivity in the Pacific Northwest during the warm ocean conditions of 2005,” Geophys. Res. Lett. 33(22), L22S06 (2006). [CrossRef]  

30. N. A. Welschmeyer, “Fluorometric analysis of chlorophyll a in the presence of chlorophyll b and pheopigments,” Limnol. Oceanogr. 39(8), 1985–1992 (1994). [CrossRef]  

31. E. J. Arar, “Method 446.0: In Vitro Determination of Chlorophylls a, b, c1 + c2 and Pheopigments in Marine and Freshwater Algae by Visible Spectrophotometry, Revision 1.2,” (U.S. Environmental Protection Agency, National Exposure Research Laboratory, Cincinnati, OH, 1997).

32. J. M. Sullivan, M. S. Twardowski, J. R. V. Zaneveld, C. M. Moore, A. H. Barnard, P. L. Donaghay, and B. Rhoades, “Hyperspectral temperature and salt dependencies of absorption by water and heavy water in the 400-750 nm spectral range,” Appl. Opt. 45(21), 5294–5309 (2006). [CrossRef]  

33. A. Morel, “Optical properties of pure water and pure sea water,” in Optical Aspects of Oceanography, N. G. Jerlov and E. S. Nielsen, eds. (Academic Press, 1974).

34. H. R. Gordon, O. B. Brown, R. H. Evans, J. W. Brown, R. C. Smith, K. S. Baker, and D. K. Clark, “A semianalytic radiance model of ocean color,” J. Geophys. Res.: Atmos. 93(D9), 10909–10924 (1988). [CrossRef]  

35. L. Prieur and S. Sathyendranath, “An optical classification of coastal and oceanic waters based on the specific spectral absorption curves of phytoplankton pigments, dissolved organic matter, and other particulate materials,” Limnol. Oceanogr. 26(4), 671–689 (1981). [CrossRef]  

36. C. D. Mobley and L. K. Sundman, “Hydrolight 5.2 Ecolight 5.2 Technical Documentation,” (Sequoia Scientific, Inc., 2013).
