Accessing SnowEx Data at the NSIDC DAAC

2021 SnowEx Hackweek

Author: Amy Steiker, NSIDC DAAC, CIRES, University of Colorado

This tutorial provides a brief overview of the resources provided by the NASA National Snow and Ice Data Center Distributed Active Archive Center, or NSIDC DAAC, and demonstrates how to discover and access SnowEx data programmatically.

Learning Objectives:

  1. Identify resources available at the NSIDC DAAC to help you work with SnowEx data.

  2. Search and discover the file sizes and coverage of SnowEx data over a time range and GeoJSON region of interest.

  3. Set up Earthdata Login authentication needed to access NASA Earthdata.

  4. Download SnowEx data programmatically using the NSIDC DAAC Application Programming Interface (API).


Explore snow products and resources

NSIDC introduction

The National Snow and Ice Data Center, located in Boulder, Colorado, provides over 1100 data sets covering the Earth’s cryosphere and more, all of which are available to the public free of charge. Beyond providing these data, NSIDC creates tools for data access, supports data users, performs scientific research, and educates the public about the cryosphere.


The NASA National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC) provides access to over 700 data products (over 2 PB of data and growing!) in support of cryospheric research, global change detection, and water resource management. NSIDC is one of 12 DAACs within NASA’s EOSDIS, or Earth Observing System Data and Information System. The DAAC is also the largest program within the National Snow and Ice Data Center, which is part of the University of Colorado’s Cooperative Institute for Research in Environmental Sciences (CIRES).

Select Data Resources

Snow Today

Snow Today, a collaboration with the University of Colorado’s Institute of Arctic and Alpine Research (INSTAAR), provides near-real-time snow analysis for the western United States and regular reports on conditions during the winter season. Snow Today is funded by the NASA Hydrological Sciences Program and utilizes data from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument, along with snow station data from the Snow Telemetry (SNOTEL) network operated by the Natural Resources Conservation Service (NRCS), United States Department of Agriculture (USDA), and from the California Department of Water Resources: www.wcc.nrcs.usda.gov/snow.

NSIDC’s SnowEx pages

Other relevant snow products:


Import Packages

Get started by importing packages needed to run the following code blocks, including the tutorial_helper_functions module provided within this repository.

import os
import io
import json
import netrc
import shutil
import time
import zipfile
import geopandas as gpd
import pandas as pd
import requests
from getpass import getpass
from http.cookiejar import CookieJar
from os.path import join, expanduser
from platform import system
from pprint import pprint
from requests.exceptions import HTTPError
from shapely.geometry import Polygon, mapping
from shapely.geometry.polygon import orient
from urllib import request
from xml.etree import ElementTree as ET

Data Discovery

Start by identifying your study area and exploring coincident data over the same time and area.

Identify area and time of interest

Since our focus is on the Grand Mesa study site of the NASA SnowEx campaign, we’ll use that area to search for coincident data across other data products. From the SnowEx17 Ground Penetrating Radar Version 2 landing page, you can find the rectangular spatial coverage under the Overview tab, or you can draw a polygon over your area of interest in the map under the Download Data tab and export the shape as a GeoJSON file using the Export Polygon icon shown below. An example polygon GeoJSON file is provided in the /Data folder of this repository.

[Figure: Export Polygon icon in the map viewer under the Download Data tab]

Create polygon coordinate string

Read in the GeoJSON file as a GeoDataFrame object, then simplify and reorder it using the shapely package. The result is converted to a comma-separated coordinate string to be applied as our polygon search parameter.

polygon_filepath = str(os.getcwd() + '/Data/nsidc-polygon.json') # Note: A shapefile or other vector-based spatial data format could be substituted here.

gdf = gpd.read_file(polygon_filepath) #Return a GeoDataFrame object

# Simplify polygon for complex shapes in order to pass a reasonable request length to CMR. The larger the tolerance value, the more simplified the polygon.
# Orient counter-clockwise: CMR polygon points need to be provided in counter-clockwise order. The last point should match the first point to close the polygon.
poly = orient(gdf.simplify(0.05, preserve_topology=False).loc[0],sign=1.0)

#Format dictionary to polygon coordinate pairs for CMR polygon filtering
polygon = ','.join([str(c) for xy in zip(*poly.exterior.coords.xy) for c in xy])
print('Polygon coordinates to be used in search:', polygon)
poly
Polygon coordinates to be used in search: -108.2352445938561,38.98556907427165,-107.85284607930835,38.978765032966244,-107.85494925720668,39.10596902171742,-108.22772795408136,39.11294532581687,-108.2352445938561,38.98556907427165
[Output: rendering of the simplified polygon geometry]
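
To visually confirm the simplification, here is a minimal sketch overlaying the original and simplified outlines (assuming matplotlib is available in the environment):

ax = gdf.boundary.plot(color='gray', figsize=(6, 6)) # Original polygon outline
gpd.GeoSeries([poly], crs=gdf.crs).boundary.plot(ax=ax, color='red') # Simplified, reoriented polygon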

Alternatively: Specify bounding box region of interest

Instead of using a vector shapefile to specify a region of interest, you can simply use a bounding box. The following cell is commented out; it can be used in place of the polygon search parameter.

# bounds = poly.bounds # Get polygon bounds to be used as bounding box input
# bounding_box = ','.join(map(str, list(bounds))) # Bounding Box spatial parameter in decimal degree 'W,S,E,N' format. 
# print(bounding_box)

Set time range

This is an optional parameter; set this to specify a time range of interest. In this case we’ll just select all of 2017 to ensure that we receive all files within this data set campaign.

temporal = '2017-01-01T00:00:00Z,2017-12-31T23:59:59Z' # Set temporal range

Create data parameter dictionary

Create a dictionary with the data set shortname and version, as well as the temporal range and polygonal area of interest. Data set shortnames, or IDs, as well as version numbers, are located at the top of every NSIDC landing page.

param_dict = {
    'short_name': 'SNEX17_GPR',
    'version': '2',
    'polygon': polygon,
#     'bounding_box': bounding_box, #optional alternative to polygon search parameter; if using, remove or comment out polygon search parameter
    'temporal':temporal,
}

Determine how many files exist over this time and area of interest, as well as the total volume of those files

We will use the search_granules function defined below to query metadata about each data set and associated files using the Common Metadata Repository (CMR), which is a high-performance, high-quality, continuously evolving metadata system that catalogs Earth Science data and associated service metadata records. Note that not all NSIDC data can be searched at the file level using CMR, particularly those outside of the NASA DAAC program.

def search_granules(search_parameters, geojson=None, output_format="json"):
    """
    Performs a granule search against the CMR granule endpoint
    
    :search_parameters: dictionary of CMR search parameters
    :geojson: filepath to GeoJSON file for spatial search
    :output_format: select format for results https://cmr.earthdata.nasa.gov/search/site/docs/search/api.html#supported-result-formats
    
    :returns: if hits is greater than 0, search results are returned in chosen output_format, otherwise returns None.
    """
    search_url = "https://cmr.earthdata.nasa.gov/search/granules"

    if geojson:
        files = {"shapefile": (geojson, open(geojson, "r"), "application/geo+json")}
    else:
        files = None
    
    parameters = {
        "scroll": "true",
        "page_size": 100,
    }
    
    try:
        response = requests.post(f"{search_url}.{output_format}", params=parameters, data=search_parameters, files=files)
        response.raise_for_status()
    except HTTPError as http_err:
        print(f"HTTP Error: {http_err}")
        return
    except Exception as err:
        print(f"Error: {err}")
        return
    
    hits = int(response.headers['CMR-Hits'])
    if hits > 0:
        print(f"Found {hits} files")
        results = json.loads(response.content)
        granules = []
        granules.extend(results['feed']['entry'])
        granule_sizes = [float(granule['granule_size']) for granule in granules]
        print(f"The total size of all files is {sum(granule_sizes):.2f} MB")
        return response.json()
    else:
        print("Found no hits")
        return

search_granules(param_dict)
Found 3 files
The total size of all files is 209.20 MB
{'feed': {'updated': '2021-10-21T00:29:03.263Z',
  'id': 'https://cmr.earthdata.nasa.gov:443/search/granules.json?short_name=SNEX17_GPR&version=2&polygon=-108.2352445938561%2C38.98556907427165%2C-107.85284607930835%2C38.978765032966244%2C-107.85494925720668%2C39.10596902171742%2C-108.22772795408136%2C39.11294532581687%2C-108.2352445938561%2C38.98556907427165&temporal=2017-01-01T00%3A00%3A00Z%2C2017-12-31T23%3A59%3A59Z',
  'title': 'ECHO granule metadata',
  'entry': [{'producer_granule_id': 'SnowEx17_GPR_Version2_Week1.csv',
    'time_start': '2017-02-08T00:00:00.000Z',
    'updated': '2019-11-20T14:19:39.156Z',
    'dataset_id': 'SnowEx17 Ground Penetrating Radar V002',
    'data_center': 'NSIDC_ECS',
    'title': 'SC:SNEX17_GPR.002:167128516',
    'coordinate_system': 'GEODETIC',
    'day_night_flag': 'UNSPECIFIED',
    'time_end': '2017-02-10T23:59:59.000Z',
    'id': 'G1657541380-NSIDC_ECS',
    'original_format': 'ECHO10',
    'granule_size': '57.3195',
    'browse_flag': False,
    'polygons': [['39.05189 -108.06789 39.04958 -108.07092 39.02644 -108.13422 39.04032 -108.18504 39.0357 -108.2211 39.01719 -108.21534 38.99637 -108.18261 39.00562 -108.11049 39.02413 -108.06225 39.03338 -108.06213 39.02876 -108.08619 39.04264 -108.05301 39.05189 -108.05289 39.0542 -108.06786 39.05189 -108.06789']],
    'collection_concept_id': 'C1655875737-NSIDC_ECS',
    'online_access_flag': True,
    'links': [{'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'type': 'text/plain',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/DP1/SNOWEX/SNEX17_GPR.002/2017.02.08/SnowEx17_GPR_Version2_Week1.csv'},
     {'rel': 'http://esipfed.org/ns/fedsearch/1.1/metadata#',
      'type': 'text/xml',
      'title': '(METADATA)',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/DP1/SNOWEX/SNEX17_GPR.002/2017.02.08/SnowEx17_GPR_Version2_Week1.csv.xml'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/SNOWEX/SNEX17_GPR.002/'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://search.earthdata.nasa.gov/search/granules?p=C1655875737-NSIDC_ECS&q=SNEX17_GPR&m=29.76382409255042!-108.823974609375!4!1!0!0%2C2&tl=1558474061!4!!'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://nsidc.org/data/data-access-tool/SNEX17_GPR/versions/2/'},
     {'inherited': True,
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/metadata#',
      'hreflang': 'en-US',
      'href': 'https://doi.org/10.5067/G21LGCNLFSC5'},
     {'inherited': True,
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/documentation#',
      'hreflang': 'en-US',
      'href': 'https://doi.org/10.5067/G21LGCNLFSC5'}]},
   {'producer_granule_id': 'SnowEx17_GPR_Version2_Week2.csv',
    'time_start': '2017-02-14T00:00:00.000Z',
    'updated': '2019-11-20T14:19:39.156Z',
    'dataset_id': 'SnowEx17 Ground Penetrating Radar V002',
    'data_center': 'NSIDC_ECS',
    'title': 'SC:SNEX17_GPR.002:167128520',
    'coordinate_system': 'GEODETIC',
    'day_night_flag': 'UNSPECIFIED',
    'time_end': '2017-02-17T23:59:59.000Z',
    'id': 'G1657541238-NSIDC_ECS',
    'original_format': 'ECHO10',
    'granule_size': '85.516',
    'browse_flag': False,
    'polygons': [['39.10738 -107.88943 39.10738 -107.89539 39.0912 -107.95508 39.07271 -108.02372 39.0542 -108.09234 39.04264 -108.16078 39.0357 -108.2113 39.03338 -108.2113 39.0195 -108.20533 39.00099 -108.18454 39.00099 -108.12811 39.00099 -108.08653 39.02644 -108.02094 39.0357 -107.94938 39.02413 -107.93155 39.04726 -107.89867 39.08195 -107.85677 39.10507 -107.86257 39.10969 -107.88644 39.10738 -107.88943']],
    'collection_concept_id': 'C1655875737-NSIDC_ECS',
    'online_access_flag': True,
    'links': [{'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'type': 'text/plain',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/DP1/SNOWEX/SNEX17_GPR.002/2017.02.14/SnowEx17_GPR_Version2_Week2.csv'},
     {'rel': 'http://esipfed.org/ns/fedsearch/1.1/metadata#',
      'type': 'text/xml',
      'title': '(METADATA)',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/DP1/SNOWEX/SNEX17_GPR.002/2017.02.14/SnowEx17_GPR_Version2_Week2.csv.xml'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/SNOWEX/SNEX17_GPR.002/'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://search.earthdata.nasa.gov/search/granules?p=C1655875737-NSIDC_ECS&q=SNEX17_GPR&m=29.76382409255042!-108.823974609375!4!1!0!0%2C2&tl=1558474061!4!!'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://nsidc.org/data/data-access-tool/SNEX17_GPR/versions/2/'},
     {'inherited': True,
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/metadata#',
      'hreflang': 'en-US',
      'href': 'https://doi.org/10.5067/G21LGCNLFSC5'},
     {'inherited': True,
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/documentation#',
      'hreflang': 'en-US',
      'href': 'https://doi.org/10.5067/G21LGCNLFSC5'}]},
   {'producer_granule_id': 'SnowEx17_GPR_Version2_Week3.csv',
    'time_start': '2017-02-21T00:00:00.000Z',
    'updated': '2020-02-18T12:25:16.785Z',
    'dataset_id': 'SnowEx17 Ground Penetrating Radar V002',
    'data_center': 'NSIDC_ECS',
    'title': 'SC:SNEX17_GPR.002:173252482',
    'coordinate_system': 'GEODETIC',
    'day_night_flag': 'UNSPECIFIED',
    'time_end': '2017-02-25T23:59:59.000Z',
    'id': 'G1694922459-NSIDC_ECS',
    'original_format': 'ECHO10',
    'granule_size': '66.3598',
    'browse_flag': False,
    'polygons': [['39.05189 -108.06789 39.04958 -108.06792 39.03107 -108.08616 39.0195 -108.15531 39.00331 -108.14352 39.00562 -108.11049 39.00562 -108.05349 39.01719 -108.05334 39.02876 -108.02919 39.0542 -108.05586 39.0542 -108.06786 39.05189 -108.06789']],
    'collection_concept_id': 'C1655875737-NSIDC_ECS',
    'online_access_flag': True,
    'links': [{'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'type': 'text/plain',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/DP1/SNOWEX/SNEX17_GPR.002/2017.02.21/SnowEx17_GPR_Version2_Week3.csv'},
     {'rel': 'http://esipfed.org/ns/fedsearch/1.1/metadata#',
      'type': 'text/xml',
      'title': '(METADATA)',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/DP1/SNOWEX/SNEX17_GPR.002/2017.02.21/SnowEx17_GPR_Version2_Week3.csv.xml'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://n5eil01u.ecs.nsidc.org/SNOWEX/SNEX17_GPR.002/'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://search.earthdata.nasa.gov/search/granules?p=C1655875737-NSIDC_ECS&q=SNEX17_GPR&m=29.76382409255042!-108.823974609375!4!1!0!0%2C2&tl=1558474061!4!!'},
     {'inherited': True,
      'length': '0.0KB',
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/data#',
      'hreflang': 'en-US',
      'href': 'https://nsidc.org/data/data-access-tool/SNEX17_GPR/versions/2/'},
     {'inherited': True,
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/metadata#',
      'hreflang': 'en-US',
      'href': 'https://doi.org/10.5067/G21LGCNLFSC5'},
     {'inherited': True,
      'rel': 'http://esipfed.org/ns/fedsearch/1.1/documentation#',
      'hreflang': 'en-US',
      'href': 'https://doi.org/10.5067/G21LGCNLFSC5'}]}]}}
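
The response above includes direct HTTPS download links for each granule. As a small follow-up sketch (this time capturing the return value of search_granules, which the call above did not), you can collect those links for later use, skipping the inherited collection-level links that repeat on every entry:

results = search_granules(param_dict)

# Gather the direct file URL for each granule from the CMR JSON structure shown above
urls = [link['href']
        for granule in results['feed']['entry']
        for link in granule['links']
        if link['rel'].endswith('/data#') and not link.get('inherited', False)]
print(urls)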

Data Access

Option 1: Python script from the data set landing page

The Depths and Snow Pit Data Package Contents tutorial demonstrates how to access data using the script provided under the “Download Data” tab of each NSIDC DAAC data set landing page.

Earthdata Login authentication setup notes

  1. All data within the NASA Earthdata system are freely available to the public but require an Earthdata Login profile for access. If you have not already done so, visit http://urs.earthdata.nasa.gov to register (it just takes a minute to set up).

  2. Create a .netrc file within the JupyterHub environment. This is a hidden file, typically stored in your home directory, that contains your Earthdata Login username and password. This is a secure and easy way to ensure that any data download requests are authenticated against your profile. You can set up your .netrc within the JupyterHub environment as described in the preliminary work article:

Run the following commands in a terminal on the JupyterHub, replacing EARTHDATA_LOGIN and EARTHDATA_PASSWORD with your Earthdata Login username and password:

echo "machine urs.earthdata.nasa.gov login EARTHDATA_LOGIN password EARTHDATA_PASSWORD" > ~/.netrc
chmod 0600 .netrc
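
If you prefer to stay in Python, here is a minimal sketch that writes the same file (assuming a Unix-style home directory, as on the JupyterHub):

import os
from getpass import getpass

netrc_path = os.path.join(os.path.expanduser('~'), '.netrc')
username = input('Earthdata Login username: ')
password = getpass('Earthdata Login password: ')
with open(netrc_path, 'w') as f:
    f.write(f'machine urs.earthdata.nasa.gov login {username} password {password}\n')
os.chmod(netrc_path, 0o600) # Restrict permissions so the credentials stay private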

Note that the script below will prompt you for your Earthdata Login username and password if a .netrc file does not exist.

os.chmod('/home/jovyan/.netrc', 0o600) #only necessary on snowex hackweek jupyterhub
%run './scripts/nsidc-download_SNEX20_SD.001.py' 
print('Grand Mesa 2020 Snow Depth data download complete') 
Querying for data:
	https://cmr.earthdata.nasa.gov/search/granules.json?provider=NSIDC_ECS&sort_key[]=start_date&sort_key[]=producer_granule_id&scroll=true&page_size=2000&short_name=SNEX20_SD&version=001&version=01&version=1&temporal[]=2020-01-28T00:00:00Z,2020-02-12T23:59:59Z
Found 1 matches.
Downloading 2 files...
1/2: SnowEx2020_SnowDepths_COGM_alldepths_v01.csv
  [===============                                             ]  25%  5.4MB/s   
  [==============================                              ]  50%  6.4MB/s   
  [=============================================               ]  75%  7.4MB/s   
  [============================================================] 100%  9.7MB/s   
2/2: SnowEx2020_SnowDepths_COGM_alldepths_v01.csv.xml
  [============================================================] 100%  2.6MB/s   
Grand Mesa 2020 Snow Depth data download complete

Option 2: Additional data access services: API Access

Data can be accessed directly from our HTTPS file system as described in the aforementioned tutorial, or through the NSIDC DAAC’s Application Programming Interface (API).

What is an API? You can think of an API as a middleman between an application or end user (in this case, us) and a data provider. Here, the data provider is both the Common Metadata Repository (CMR) housing the data information and NSIDC as the data distributor. These APIs are generally structured as a URL with a base plus individual key-value pairs separated by ‘&’. This option is beneficial for those of you who want to incorporate data access directly into your visualization and analysis coding workflow, without the need to use the NSIDC website. This method is also reproducible and documented to ensure data provenance.

This API offers you the ability to order data using specific temporal and spatial filters. These options can be requested in a single access command without the need to script against our data directory structure. See the programmatic access guide for more information on API options.

Create the data request API endpoint

Programmatic API requests are formatted as HTTPS URLs that contain key-value pairs specifying the search parameters and service options we defined above.

The cell below sets up the API request URL using our data search parameters as well as a few other configuration parameters. We will first create a string of key-value-pairs from our data dictionary and we’ll feed those into our API endpoint. This API endpoint can be executed via command line, a web browser, or in Python below.

#Set NSIDC data access base URL
base_url = 'https://n5eil02u.ecs.nsidc.org/egi/request'

# Set the request mode to asynchronous and the processing agent to "NO" (no subsetting or reformatting services are available for this data set); optionally remove metadata delivery

param_dict['request_mode'] = 'async'
param_dict['agent'] = 'NO'
param_dict['INCLUDE_META'] ='N' #optional if you do not wish to receive the associated metadata files with each science file. 

param_string = '&'.join("{!s}={!r}".format(k,v) for (k,v) in param_dict.items()) # Convert param_dict to string
param_string = param_string.replace("'","") # Remove quotes

api_request = f'{base_url}?{param_string}' # API endpoint: base URL + request parameters
print(api_request)
https://n5eil02u.ecs.nsidc.org/egi/request?short_name=SNEX17_GPR&version=2&polygon=-108.2352445938561,38.98556907427165,-107.85284607930835,38.978765032966244,-107.85494925720668,39.10596902171742,-108.22772795408136,39.11294532581687,-108.2352445938561,38.98556907427165&temporal=2017-01-01T00:00:00Z,2017-12-31T23:59:59Z&request_mode=async&agent=NO&INCLUDE_META=N

Input Earthdata Login credentials

For our API access option, an Earthdata Login account is required to access data from the NSIDC DAAC. If you do not already have an Earthdata Login account, visit http://urs.earthdata.nasa.gov to register.

# Start authenticated session with Earthdata Login to allow for data downloads:
def setup_earthdata_login_auth(endpoint: str='urs.earthdata.nasa.gov'):
    # Read credentials from ~/.netrc (or ~/_netrc on Windows); fall back to an interactive prompt
    netrc_name = "_netrc" if system()=="Windows" else ".netrc"
    try:
        username, _, password = netrc.netrc(file=join(expanduser('~'), netrc_name)).authenticators(endpoint)
    except (FileNotFoundError, TypeError):
        print('Please provide your Earthdata Login credentials for access.')
        print('Your info will only be passed to %s and will not be exposed in Jupyter.' % (endpoint))
        username = input('Username: ')
        password = getpass('Password: ')
    # Install a global urllib opener that handles Earthdata Login basic auth and session cookies
    manager = request.HTTPPasswordMgrWithDefaultRealm()
    manager.add_password(None, endpoint, username, password)
    auth = request.HTTPBasicAuthHandler(manager)
    jar = CookieJar()
    processor = request.HTTPCookieProcessor(jar)
    opener = request.build_opener(auth, processor)
    request.install_opener(opener)

setup_earthdata_login_auth(endpoint="urs.earthdata.nasa.gov")

Download data

Download data using the urllib library. The data will be downloaded directly to this directory in a new Outputs folder. The progress of each order will be reported.

def request_nsidc_data(API_request):
    """
    Performs a data customization and access request from NSIDC's API.
    Creates an output folder in the working directory if one does not already exist.
    
    :API_request: NSIDC API endpoint; see https://nsidc.org/support/how/how-do-i-programmatically-request-data-services for more info
    on how to configure the API request.
    
    """

    path = str(os.getcwd() + '/Outputs') # Create an output folder if the folder does not already exist.
    if not os.path.exists(path):
        os.mkdir(path)
        
    base_url = 'https://n5eil02u.ecs.nsidc.org/egi/request'
    
    r = request.urlopen(API_request)
    esir_root = ET.fromstring(r.read())
    orderlist = []   # Look up order ID
    for order in esir_root.findall("./order/"):
        orderlist.append(order.text)
    orderID = orderlist[0]
    statusURL = base_url + '/' + orderID # Create status URL
    print('Order status URL: ', statusURL)
    request_response = request.urlopen(statusURL) # Find order status  
    request_root = ET.fromstring(request_response.read())
    statuslist = []
    for status in request_root.findall("./requestStatus/"):
        statuslist.append(status.text)
    status = statuslist[0]
    while status == 'pending' or status == 'processing': # Continue loop while request is still processing
        print('Job status is ', status,'. Trying again.')
        time.sleep(10)
        loop_response = request.urlopen(statusURL)
        request_root = ET.fromstring(loop_response.read()) # Refresh the status document each pass
        statuslist = [] # Find updated status
        for status in request_root.findall("./requestStatus/"):
            statuslist.append(status.text)
        status = statuslist[0]
    if status == 'complete_with_errors' or status == 'failed': # Provide error messages
        messagelist = []
        for message in request_root.findall("./processInfo/"):
            messagelist.append(message.text)
        print('Job status is ', status)
        print('error messages:')
        pprint(messagelist)
    if status == 'complete' or status == 'complete_with_errors': # Download zipped order if status is complete or complete_with_errors
        downloadURL = 'https://n5eil02u.ecs.nsidc.org/esir/' + orderID + '.zip'
        print('Job status is ', status)
        print('Zip download URL: ', downloadURL)
        print('Beginning download of zipped output...')
        zip_response = request.urlopen(downloadURL)
        with zipfile.ZipFile(io.BytesIO(zip_response.read())) as z:
            z.extractall(path)
        print('Download is complete.')
    else: print('Request failed.')
    
    # Clean up Outputs folder by removing individual granule folders 
    for root, dirs, files in os.walk(path, topdown=False):
        for file in files:
            try:
                shutil.move(os.path.join(root, file), path)
            except OSError:
                pass
        for name in dirs:
            os.rmdir(os.path.join(root, name))
    return  


# NOTE: downloads ~ 200MB of CSV files
request_nsidc_data(api_request)
Order status URL:  https://n5eil02u.ecs.nsidc.org/egi/request/5000001617631
Job status is  processing . Trying again.
Job status is  complete
Zip download URL:  https://n5eil02u.ecs.nsidc.org/esir/5000001617631.zip
Beginning download of zipped output...
Download is complete.
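
As a quick check, you can list the extracted files in the Outputs folder (a minimal sketch; the file names will match the granules found earlier):

for file in sorted(os.listdir('./Outputs')):
    print(file)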

Read in SnowEx data

This SnowEx data set is provided in CSV format. The files are tab-delimited, so we pass sep='\t' when reading them into a pandas DataFrame.

snowex_path = './Outputs/SnowEx17_GPR_Version2_Week1.csv' # Define local filepath
df = pd.read_csv(snowex_path, sep='\t') 
df.head()
        collection  trace        long        lat    elev  twtt  Thickness  SWE              x             y UTM_Zone
0  GPR_0042_020817   2581 -108.066856  39.043146  3240.2  5.89      0.692  225  753854.880092  4.325659e+06     12 S
1  GPR_0042_020817   2582 -108.066856  39.043146  3240.2  5.89      0.692  225  753854.899385  4.325660e+06     12 S
2  GPR_0042_020817   2583 -108.066856  39.043146  3240.2  5.87      0.690  224  753854.918686  4.325660e+06     12 S
3  GPR_0042_020817   2584 -108.066855  39.043146  3240.2  5.86      0.689  224  753854.937987  4.325660e+06     12 S
4  GPR_0042_020817   2585 -108.066855  39.043147  3240.2  5.84      0.686  223  753854.957280  4.325660e+06     12 S

Extract date values

The collection date needs to be extracted from the collection column:

df['date'] = df.collection.str.rsplit('_').str[-1].astype(str)
df.date = pd.to_datetime(df.date, format="%m%d%y")
df = df.sort_values(['date'])
df.head()
             collection  trace        long        lat     elev   twtt  Thickness  SWE              x             y UTM_Zone       date
0       GPR_0042_020817   2581 -108.066856  39.043146  3240.20   5.89      0.692  225  753854.880092  4.325659e+06     12 S 2017-02-08
109172  GPR_0043_020817   6360 -108.063209  39.049202  3248.49  11.49      1.350  439  754148.853700  4.326342e+06     12 S 2017-02-08
109173  GPR_0043_020817   6361 -108.063209  39.049202  3248.50  11.56      1.358  441  754148.882549  4.326342e+06     12 S 2017-02-08
109174  GPR_0043_020817   6362 -108.063208  39.049202  3248.50  11.62      1.365  444  754148.911407  4.326342e+06     12 S 2017-02-08
109175  GPR_0043_020817   6363 -108.063208  39.049202  3248.50  11.64      1.368  445  754148.947466  4.326342e+06     12 S 2017-02-08
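
As a quick sanity check on the parsed dates, here is a short sketch summarizing the data by survey date (using the SWE column shown above):

print(df.groupby('date')['SWE'].agg(['count', 'mean'])) # Trace count and mean SWE per date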

Convert to a GeoPandas GeoDataFrame to provide point geometry

According to the SnowEx documentation, the data are provided in UTM Zone 12N, so we’ll set the GeoDataFrame to this projection (EPSG:32612) to allow for geospatial analysis:

gdf_utm = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df.x, df.y), crs='EPSG:32612')
gdf_utm.head()
             collection  trace        long        lat     elev   twtt  Thickness  SWE              x             y UTM_Zone       date                        geometry
0       GPR_0042_020817   2581 -108.066856  39.043146  3240.20   5.89      0.692  225  753854.880092  4.325659e+06     12 S 2017-02-08  POINT (753854.880 4325659.484)
109172  GPR_0043_020817   6360 -108.063209  39.049202  3248.49  11.49      1.350  439  754148.853700  4.326342e+06     12 S 2017-02-08  POINT (754148.854 4326341.915)
109173  GPR_0043_020817   6361 -108.063209  39.049202  3248.50  11.56      1.358  441  754148.882549  4.326342e+06     12 S 2017-02-08  POINT (754148.883 4326341.916)
109174  GPR_0043_020817   6362 -108.063208  39.049202  3248.50  11.62      1.365  444  754148.911407  4.326342e+06     12 S 2017-02-08  POINT (754148.911 4326341.917)
109175  GPR_0043_020817   6363 -108.063208  39.049202  3248.50  11.64      1.368  445  754148.947466  4.326342e+06     12 S 2017-02-08  POINT (754148.947 4326341.918)
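
With point geometry and a CRS in place, standard geospatial operations follow directly. For example, here is a minimal sketch that reprojects the points to WGS 84 and summarizes radar-derived snow thickness by date (column names as shown above):

gdf_wgs84 = gdf_utm.to_crs('EPSG:4326') # Reproject points to geographic coordinates
print(gdf_wgs84.groupby('date')['Thickness'].describe()) # Thickness statistics per survey date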

Additional data imagery services

NASA Worldview and the Global Browse Imagery Service

NASA’s EOSDIS Worldview mapping application provides the capability to interactively browse over 900 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks “right now.”

Several MODIS snow data products from NSIDC include high-resolution browse imagery available through NASA Worldview, including “MODIS/Terra Snow Cover Daily L3 Global 500m SIN Grid, Version 61”. This layer can be downloaded as various image files including GeoTIFF using the snapshot feature at the top right of the page. This link presents the MOD10A1 NDSI layer over our time and area of interest: https://go.nasa.gov/35CgYMd.

Additionally, the NASA Global Browse Imagery Service provides up-to-date, full-resolution imagery for select NSIDC DAAC data sets as web services, including WMTS, WMS, KML, and more. These layers can be accessed in GIS applications following guidance on the GIBS documentation pages.
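
For example, here is a minimal sketch of requesting a GIBS WMS snapshot with the OWSLib package (not used elsewhere in this tutorial). The endpoint and layer identifier below are assumptions drawn from the GIBS documentation, so verify them against the current layer list before use:

from owslib.wms import WebMapService

# GIBS WMS endpoint (assumed; see the GIBS documentation for current endpoints)
wms = WebMapService('https://gibs.earthdata.nasa.gov/wms/epsg4326/best/wms.cgi', version='1.1.1')

# Request the MODIS/Terra NDSI snow cover layer (assumed identifier) over Grand Mesa
img = wms.getmap(layers=['MODIS_Terra_NDSI_Snow_Cover'],
                 srs='EPSG:4326',
                 bbox=(-108.3, 38.9, -107.8, 39.2), # Roughly our Grand Mesa region of interest
                 size=(600, 400),
                 time='2017-02-08',
                 format='image/png',
                 transparent=True)

with open('grand_mesa_snow_cover.png', 'wb') as f:
    f.write(img.read())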