I’m a geographer whose work focuses on GIS and spatial analysis. I work primarily in R and Python using open source tools. This blog shares tools, tips, and techniques I find interesting for open source spatial analysis in R and Python.

Pull in data via API for survey data. This script uses the NASS Quickstats API to query historic acres harvested for the primary crop groups in Willamette basin counties in Oregon from 1969 to 2013, pulling in and examining the crop data directly in R. library(dplyr) library(tibble) library(tidyr) library(rnassqs) library(lubridate) library(readr) years <- seq(as. [Read More]
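Since the excerpt cuts off above, here is a minimal sketch of the kind of rnassqs query the post describes; the API key, the parameter values, and the summarising step are illustrative assumptions, not the post's exact script.

```r
# Minimal sketch of querying NASS Quickstats with rnassqs.
# The API key and parameter values below are placeholders.
library(rnassqs)
library(dplyr)

nassqs_auth(key = "YOUR_NASS_API_KEY")  # request a key from the NASS Quickstats API site

params <- list(
  sector_desc       = "CROPS",
  statisticcat_desc = "AREA HARVESTED",
  unit_desc         = "ACRES",
  state_alpha       = "OR",
  county_name       = "BENTON"          # one example Willamette basin county
)

harvested <- nassqs(params)

# Summarise harvested acres by year and commodity
harvested %>%
  mutate(Value = as.numeric(gsub(",", "", Value))) %>%
  group_by(year, commodity_desc) %>%
  summarise(acres = sum(Value, na.rm = TRUE), .groups = "drop")
```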
Recently I wanted to be able to add a custom basemap of NHDPlus features to mapview leaflet maps. It’s not straightforward out of the box in mapview, but I found helpful tips for doing it here and here. Below I add the WMS service from EPA Waters along with USGS stream gage stations in Benton County, OR using the dataRetrieval package. library(dataRetrieval) library(mapview) library(sf) library(leaflet) Stations <- readNWISdata(stateCd="Oregon", countyCd="Benton") # dataRetrieval attaches metadata as 'attributes' - things like the url used, site metadata, site info, etc - just use attributes(Stations) to examine siteInfo <- attr(Stations, "siteInfo") stations_sf = st_as_sf(siteInfo, coords = c("dec_lon_va", "dec_lat_va"), crs = 4269, agr = "constant") m <- mapview(stations_sf) m@map = m@map %>% addWMSTiles(group = 'NHDPlus', "https://watersgeo. [Read More]
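For context, a minimal sketch of the general pattern (not the post's exact code): build the mapview object, then pipe its underlying leaflet map through addWMSTiles(). The WMS URL, layer name, and example point are placeholders, since the real EPA Waters endpoint is truncated in the excerpt above.

```r
# Minimal sketch: add a WMS basemap to a mapview map's leaflet slot.
# The WMS endpoint, layer id, and example point are placeholders.
library(mapview)
library(leaflet)
library(sf)

pts <- st_as_sf(data.frame(site = "example", lon = -123.26, lat = 44.56),
                coords = c("lon", "lat"), crs = 4269)

m <- mapview(pts)
m@map <- m@map %>%
  addWMSTiles(
    "https://example.com/geoserver/wms",   # placeholder WMS URL
    layers  = "0",                         # placeholder layer id
    group   = "NHDPlus",
    options = WMSTileOptions(format = "image/png", transparent = TRUE)
  ) %>%
  addLayersControl(overlayGroups = "NHDPlus")
m
```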

Just finished putting together and running a half-day R spatial workshop covering some of the newer R spatial tools, using packages such as sf, dataRetrieval, and mapview, and combining chained dplyr operations with sf, among other things. And I finally, after a year, got Disqus working on my blog! All I did was update blogdown and rebuild - if only everything were so simple…

This is code to get percent agriculture (PctAg) and percent urban (PctUrb) land cover for US counties using NLCD 2006. First we’ll import some libraries. import pandas as pd import numpy as np import geopandas as gp Bring in US counties using the US Census counties shapefile: counties = gp.GeoDataFrame.from_file('L:/Priv/CORFiles/Geospatial_Library/Data/RESOURCE/POLITICAL/BOUNDARIES/NATIONAL/Counties_Census_2010.shp') list(counties) counties.STATE_NAME.unique() counties = counties[(counties.STATE_NAME != 'Hawaii') & (counties.STATE_NAME != 'Alaska')] counties = counties[['FIPS','NAME','geometry']] counties.head() [Read More]
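As a rough illustration of the end goal (not the original workflow, which continues past this excerpt), here is one way to get percent agriculture and urban per county with categorical zonal statistics from rasterstats; the file paths, and the use of rasterstats itself, are assumptions.

```python
# Sketch: percent agriculture / urban per county from an NLCD raster
# via categorical zonal stats. Paths are placeholders; the counties and
# the raster are assumed to share a CRS.
import geopandas as gpd
from rasterstats import zonal_stats

counties = gpd.read_file("counties.shp")        # placeholder path
stats = zonal_stats(counties, "nlcd_2006.tif",  # placeholder path
                    categorical=True)

AG = {81, 82}           # NLCD pasture/hay, cultivated crops
URB = {21, 22, 23, 24}  # NLCD developed classes

def pct(counts, classes):
    total = sum(counts.values())
    return 100.0 * sum(v for k, v in counts.items() if k in classes) / total if total else None

counties["PctAg"] = [pct(s, AG) for s in stats]
counties["PctUrb"] = [pct(s, URB) for s in stats]
print(counties[["NAME", "PctAg", "PctUrb"]].head())
```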
This is an implementation of the excellent PostGIS / geopandas tutorial here, using NHDPlus WBD polygons for the PNW. All the ideas and methods are from that tutorial; I’m simply implementing them with a different dataset, in Oregon. %matplotlib inline import os import json import psycopg2 import matplotlib.pyplot as plt # The two statements below are used mainly to set up a plotting # default style that's better than the default from matplotlib import seaborn as sns plt. [Read More]
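The core move in that tutorial is pulling PostGIS geometries into a GeoDataFrame; a minimal sketch of that pattern is below, with the connection details, table, and column names all placeholders.

```python
# Sketch: read PostGIS geometries into a GeoDataFrame and plot them.
# Connection settings, table name, and columns are placeholders.
import psycopg2
import geopandas as gpd
import matplotlib.pyplot as plt

conn = psycopg2.connect(dbname="gis", user="gis_user",
                        password="secret", host="localhost")

sql = "SELECT huc8, name, geom FROM wbd_hu8"   # placeholder table/columns
wbd = gpd.read_postgis(sql, conn, geom_col="geom")

wbd.plot(figsize=(8, 8), edgecolor="grey")
plt.show()
```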
I’ve been toying on and off with building a word cloud of all my publications in both R and Python for some time. While there’s a really nice R library for doing this, wordcloud2, this example uses wordcloud, a Python word cloud generator. I pasted all my publications into a single text file, which is read into Python and used by wordcloud as shown in the code and results below. from os import path from wordcloud import WordCloud # Read the whole text. [Read More]
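A minimal sketch of that workflow, with the text file name as a placeholder for the pasted-together publication list:

```python
# Sketch: build and display a word cloud from a plain text file.
# The file name is a placeholder.
import matplotlib.pyplot as plt
from wordcloud import WordCloud

text = open("publications.txt").read()

wc = WordCloud(width=800, height=400, background_color="white").generate(text)

plt.imshow(wc, interpolation="bilinear")
plt.axis("off")
plt.show()
```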
I set up my personal website using GitHub Pages a little while back, using Jekyll and a Hyde template I thought was pretty nice. However, after listening to Yihui’s RStudio webinar on using blogdown, I decided it was time to give blogdown a try and switch from Jekyll to Hugo. I spent a good couple of weeks off and on reading up on Hugo, exploring different Hugo themes, and building test websites locally before feeling comfortable enough to take the plunge and make the switch. [Read More]
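For anyone curious, spinning up a test blogdown site locally boils down to a couple of calls; the theme below is just an example, not necessarily the one used for this site.

```r
# Sketch: create and preview a test blogdown/Hugo site locally.
# The theme shown is an example, not necessarily the one used here.
# install.packages("blogdown")
library(blogdown)

new_site(theme = "yihui/hugo-lithium")  # scaffold a new Hugo site with an example theme
serve_site()                            # live local preview while editing
```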

Tried out the georasters Python package for changing raster data types, recoding, merging, and writing rasters out - a very nice, handy package!
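A minimal sketch of that kind of georasters round trip, with the file names and the recode rule as placeholders:

```python
# Sketch: read a raster with georasters, recode values, and write the
# result back out. File names and the recode rule are placeholders.
import georasters as gr

data = gr.from_file("input.tif")        # placeholder raster

# Recode values in place; .raster is a numpy masked array
data.raster[data.raster == 1] = 100     # example recode rule

data.to_tiff("recoded")                 # write the recoded raster back out
```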

Load and look at the basics of the simple features (sf) package. library(devtools) # install_github("edzer/sfr") library(sf) ## Linking to GEOS 3.5.0, GDAL 2.1.1, proj.4 4.9.3 nc <- st_read(system.file("shape/nc.shp", package="sf")) ## Reading layer `nc' from data source `C:\Users\mweber\R\library\sf\shape\nc.shp' using driver `ESRI Shapefile' ## converted into: MULTIPOLYGON ## Simple feature collection with 100 features and 14 fields ## geometry type: MULTIPOLYGON ## dimension: XY ## bbox: xmin: -84.32385 ymin: 33.88199 xmax: -75. [Read More]
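For reference, a minimal sketch of the sort of first look the post walks through, using the nc dataset that ships with sf:

```r
# Sketch: a quick first look at an sf object using the bundled nc data.
library(sf)

nc <- st_read(system.file("shape/nc.shp", package = "sf"), quiet = TRUE)

st_crs(nc)              # coordinate reference system
st_bbox(nc)             # bounding box
head(st_geometry(nc))   # the geometry list-column
plot(nc["AREA"])        # quick thematic map of one attribute
```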

Here’s a snippet showing how to rasterize a shapefile in Python using rasterio and geopandas:
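Since the snippet itself isn't shown in this excerpt, here is a minimal sketch of the general rasterio + geopandas pattern; the shapefile path, the 'value' attribute, and the cell size are placeholders, not the original post's code.

```python
# Sketch: burn polygon attribute values into a new GeoTIFF grid.
# The shapefile path, 'value' column, and cell size are placeholders.
import geopandas as gpd
import numpy as np
import rasterio
from rasterio import features
from rasterio.transform import from_origin

gdf = gpd.read_file("polygons.shp")   # placeholder shapefile
res = 30                              # placeholder cell size, in map units

xmin, ymin, xmax, ymax = gdf.total_bounds
width = int(np.ceil((xmax - xmin) / res))
height = int(np.ceil((ymax - ymin) / res))
transform = from_origin(xmin, ymax, res, res)

# Pair each geometry with the attribute value to burn in
shapes = ((geom, val) for geom, val in zip(gdf.geometry, gdf["value"]))
burned = features.rasterize(shapes, out_shape=(height, width),
                            transform=transform, fill=0, dtype="int32")

with rasterio.open("rasterized.tif", "w", driver="GTiff",
                   height=height, width=width, count=1,
                   dtype="int32", crs=gdf.crs, transform=transform) as dst:
    dst.write(burned, 1)
```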