DaymetR, an official Daymet R package

I have described my DaymetR scripts in previous posts (here and here). However, some communication with Michele Thornton, a Daymet admin at Oak Ridge National Laboratory, brought to light some issues in the way I handled specifying a region of interest, mainly edge effects when reprojecting the netCDF ancillary files. This is a typical issue when reprojecting raster data. As a solution I requested a shapefile of the grid tiles instead of a raster, which Michele kindly provided.

After cleaning up my code I also decided to wrap both the single pixel location and the gridded data extraction tools in a proper R package! The package includes the tile map by default, so no additional data needs to be downloaded. However, I do provide the shapefile of the Daymet grid tiles separately.

You can download my first R package from my bitbucket page, linked on my software page, or by following this link. I validated the code after the recent changes in the server setup at DAYMET, and all seems to be fine. Please report any bugs if you find them.
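If you prefer installing directly from within R, something along these lines should work using the devtools package. This is a minimal sketch; the repository path below is a placeholder, so substitute the actual bitbucket user and repository name:

# Install DaymetR straight from bitbucket using devtools.
# "user/DaymetR" is a placeholder repository path.
install.packages("devtools")
devtools::install_bitbucket("user/DaymetR")
library(DaymetR)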

Basic GIS in R

Recently I had to help out a colleague with processing some spatial data in R. The overall goal was to find landscape parameters (topography, soil characteristics, etc.) within the neighbourhood of a flux tower.

The code below gives a short introduction on how to extract data from both rasters and shapefiles (vector data), two very common operations. I will add more tips and tricks as I come across them.

# Load libraries.
# For raster data
require(raster)

# For vector data
require(sp)

# For reading shapefiles
require(maptools)

# Set flux tower coordinates.
lat <- 45.2041
lon <- -68.7402

# Define lat / lon projection.
lat_lon <- CRS("+init=epsg:4326")

# Define UTM projection.
utm <- CRS("+proj=utm +zone=19 +ellps=WGS84 +datum=WGS84 +no_defs")

# Read in the coordinates and assign them a projection,
# otherwise they remain just 'numbers'.
location <- SpatialPoints(cbind(lon, lat), lat_lon)

# Convert to UTM using the above projection
# string and spTransform.
location_utm <- spTransform(location, utm)

# Load raster Digital Terrain Map (DTM) data.
# If the raster data has no projection,
# you will have to assign a projection.
# I assume the file has a projection.
dtm <- raster('DTM.tif')

# Extract the elevation at the tower location.
dtm_data <- extract(dtm,location_utm)

# Read in soil shapefile.
soils <- readShapeSpatial("soils.shp")

# readShapeSpatial does not read the .prj file,
# so assign the projection to soils.shp manually.
proj4string(soils) <- utm

# Extract soil polygon attributes.
soil_attributes <- over(location_utm, soils)
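Since the goal was landscape parameters within the neighbourhood of the tower rather than at a single point, it is worth noting that raster's extract() also accepts a buffer and a summary function. A minimal sketch, assuming a 500 m radius is appropriate:

# Mean elevation within a 500 m radius around the tower.
# The buffer is expressed in map units (metres for UTM);
# the 500 m radius is an arbitrary example.
dtm_buffer <- extract(dtm, location_utm,
                      buffer = 500, fun = mean, na.rm = TRUE)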


Ubuntu 14.04 upgrade

Yesterday I upgraded my laptop from Ubuntu 12.04 LTS to the latest release, 14.04 LTS (Trusty Tahr). Overall the experience is very pleasant. I noticed that the interface is snappier and boot-up is as fast as before (it's difficult to notice any differences on a fast SSD drive). What I did not like was the removal of some functionality in the file manager, mainly splitting the file manager window into two panes. This has been a sore point for many users, but there are workarounds if it really messes with your workflow. I'll give it a try and go without for a while. Also, some of my favourite themes don't work anymore; I now use the Numix theme as a substitute.

I also did some counting and figured out that I've spent most of my research career (approaching 10 years) using Ubuntu, starting with their first release, 4.10 (Warty Warthog). Moreover, aside from some graphics work for which I used a Mac, I've been using only Linux for research and day-to-day needs, and I have never regretted making this move. With Ubuntu moving into tablet territory, I foresee some exciting times in the Linux (open-source) community.

DaymetR – gridded data extension

Short recap: Daymet is "a collection of algorithms and computer software designed to interpolate and extrapolate from daily meteorological observations to produce gridded estimates of daily weather parameters", as concisely described on the Daymet website.

I make extensive use of daily meteorological data to drive a grassland growth model, so quick and easy access to these data is key. Although I previously coded a script for single pixel data extraction, I have now updated my DaymetR package to include gridded (tiled) data downloads through the specification of a region of interest (top left / bottom right) or a single point location.

The function has the following syntax:

download.Daymet.tiles(lat1 = 36.0133,
                      lon1 = -84.2625,
                      lat2 = NA,
                      lon2 = NA,
                      start_yr = 1980,
                      end_yr = 2012,
                      param = "ALL")

If only the first set of coordinates is provided (lat1 / lon1), the tile in which it resides is downloaded. If your location / region of interest [defined by a top left (lat1 / lon1) and bottom right (lat2 / lon2) coordinate pair] falls outside the scope of the DAYMET data coverage, a warning is issued. Otherwise, all tiles covering the region of interest are downloaded for their respective parameters, e.g. minimum and maximum temperature (see the bitbucket documentation for details).
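For example, a region of interest query would look like this; the coordinates and parameter choice below are arbitrary illustrations, not values from the documentation:

# Download all tiles covering a region of interest defined by
# a top left (lat1 / lon1) and bottom right (lat2 / lon2) corner.
# Coordinates, years and parameter are arbitrary examples.
download.Daymet.tiles(lat1 = 36.5,
                      lon1 = -85.0,
                      lat2 = 36.0,
                      lon2 = -84.0,
                      start_yr = 1980,
                      end_yr = 2012,
                      param = "tmin")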

This final addition is a spatial extension of the previous code, serving primarily the spatial scaling needs of my modelling exercises. All the code can be found through my software page or by following this direct link to my bitbucket page. I hope it might serve other people in the R community.


DAYMET ancillary data conversion

I recently posted tools to extract DAYMET data for point locations. However, the product as produced by the DAYMET team is gridded data. If you want to scale models driven by DAYMET data spatially, you need access to this gridded (spatial) data.

Sadly, access to the data is rather convoluted. If you want to download this gridded data, or tiles, you first have to figure out which tiles you need. This means going to the DAYMET website and clicking each tile you want to download to retrieve its individual number. After establishing all these tile numbers you can download them from the THREDDS server.

Obviously this routine is less than ideal if, for example, you want to do regional to state-level research and have to select a large number of tiles. I hope to automate this process. The first step is reprojecting the tile grid to latitude and longitude. Oddly enough, the format of the netCDF file was not standard and rather confusing, and it took me a while to figure out how to accomplish this. Below is a bash script to do so; these instructions work on all ancillary netCDF grids as provided by DAYMET. From the generated tile file you can determine the tile for any given point location (see the short R sketch after the script) or, alternatively, for a region of interest. The latter is my goal and in the works, as the former is covered by previous code.

#!/bin/bash

# get filename with no extension
no_extension=`basename $1 | cut -d'.' -f1`

# convert the netCDF file to an ascii file 
gdal_translate -of AAIGrid $1 original.asc

# strip the six header lines from the ascii file
tail -n +7 original.asc > ascii_data.asc

# paste everything together again with a correct header
echo "ncols        8011" 	>  final_ascii_data.asc
echo "nrows        8220"	>> final_ascii_data.asc
echo "xllcorner    -4659000.0" 	>> final_ascii_data.asc
echo "yllcorner    -3135000.0" 	>> final_ascii_data.asc
echo "cellsize     1000" 	>> final_ascii_data.asc
echo "NODATA_value 0"    	>> final_ascii_data.asc 

# append the data, flipped top to bottom,
# as the grid rows are stored bottom-up
tac ascii_data.asc >> final_ascii_data.asc

# translate the data into a Lambert Conformal Conic GTiff
gdal_translate -of GTiff -a_srs "+proj=lcc +datum=WGS84 +lat_1=25 +lat_2=60 +lat_0=42.5 +lon_0=-100" final_ascii_data.asc tmp.tif

# convert to latitude / longitude
gdalwarp -of GTiff -overwrite -t_srs "EPSG:4326" tmp.tif tmp_lat_lon.tif

# crop to reduce file size, only cover DAYMET data areas
gdal_translate -a_nodata -9999 -projwin -131.487784581 52.5568285568 -51.8801911189 13.9151864748 tmp_lat_lon.tif $no_extension.tif

# clean up
rm original.asc
rm ascii_data.asc
rm final_ascii_data.asc
rm tmp.tif
rm tmp_lat_lon.tif
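With the reprojected grid in hand, finding the tile number for any point location becomes trivial in R. A minimal sketch, assuming the script above was run on the tile grid file; the output filename below is a placeholder:

# For raster data
require(raster)
# For vector data
require(sp)

# Load the reprojected tile grid (placeholder filename).
tiles <- raster("Daymet_tile_grid.tif")

# A point of interest in longitude / latitude.
point <- SpatialPoints(cbind(-84.2625, 36.0133),
                       CRS("+init=epsg:4326"))

# The grid cell value at this location is the tile number.
tile_number <- extract(tiles, point)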

