Surf Check - Automating Weather Forecast Emails in R

5 minute read


A non-R-related pursuit of mine is surf photography, which is highly weather dependent. Existing weather forecasts often lack the detail I need, or the details are too hard to track down.

A fun project to combine some interesting R tools would be an automated and customised surf weather report. Ideally it should do the following:

  • Get relevant data on surf and coastal weather for my city
  • Get a good quality synoptic weather image
  • Create a simple to read forecast with just this info
  • Automatically email it to myself at 6am every day

Breaking down the key steps required and the tools used:

  1. Obtain Forecast data - bomrang
  2. Make it human readable - glue
  3. Send it via email - gmailr
  4. Automate it - AWS cron job

Obtain Forecast data

Earlier this year I made some contributions to the bomrang R package.

bomrang provides functions to interface with Australian Government Bureau of Meteorology (BOM) data, returning tidy data frames of précis forecasts, current station observations, agriculture information bulletins and historical weather data, as well as downloading and importing radar or satellite imagery. bomrang is part of the rOpenSci project and has a fantastic team of authors and contributors.

Given my focus is on coastal and surf forecasts I used the get_coastal_forecast() function (which I developed).

library(dplyr)  # for %>% and filter()

forecast <- bomrang::get_coastal_forecast(state = "NSW") %>%
  filter(dist_name == "Illawarra")

# Keep the state, district, start time, swell and weather columns
forecast[, c(4, 5, 9, 14, 15)] %>% knitr::kable()
| state_code | dist_name | start_time_local    | forecast_swell1                                                  | forecast_weather                                                                          |
|------------|-----------|---------------------|------------------------------------------------------------------|-------------------------------------------------------------------------------------------|
| NSW        | Illawarra | 2018-11-13 10:00:00 | 1.5 to 2.5 metres.                                               | Partly cloudy.                                                                            |
| NSW        | Illawarra | 2018-11-14 00:00:00 | 1.5 to 2.5 metres, decreasing to 1 to 1.5 metres around midday.  | Cloudy. The chance of a thunderstorm inshore in the afternoon and early evening.          |
| NSW        | Illawarra | 2018-11-15 00:00:00 | Around 1 metre, increasing to 1 to 2 metres during the morning.  | Cloudy. 60% chance of showers.                                                            |

Make it human readable

The bomrang coastal forecast is returned as a data frame. To make it easier to digest, I wanted to pull out the key components and turn them into a short paragraph. To do this I used the glue package. One really useful feature of glue is its ability to vectorise the gluing of strings: since my forecast data typically spans multiple days, a single glue() call handles all of them nicely.

glue::glue(
  "{lubridate::wday(forecast$start_time_local, label = TRUE, abbr = FALSE)} {as.Date(forecast$start_time_local, tz = 'Australia/Sydney')} will be {forecast$forecast_weather}
  The swell is {forecast$forecast_swell1}"
)


## Tuesday 2018-11-13 will be Partly cloudy.
## The swell is Northeasterly below 1 metre. 
## Wednesday 2018-11-14 will be Cloudy. The chance of a thunderstorm inshore in the afternoon and early evening.
## The swell is Northeasterly around 1 metre, increasing to 1 to 1.5 metres during the morning. 
## Thursday 2018-11-15 will be Cloudy. 60% chance of showers.
## The swell is Northeasterly 1.5 metres, tending easterly 1 to 1.5 metres during the morning.
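The vectorised recycling that makes this work can be seen in isolation with a toy example (the vectors here are made up for illustration):

```r
library(glue)

days  <- c("Tuesday", "Wednesday")
swell <- c("below 1 metre", "around 1.5 metres")

# glue() interpolates element-wise across the vectors,
# producing one output line per element
glue("{days}: swell {swell}")
## Tuesday: swell below 1 metre
## Wednesday: swell around 1.5 metres
```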

Send via email

Once I could retrieve and format my custom forecast, emailing it to myself meant I could read it on my phone each morning. The gmailr package was an easy way to do this. I followed the steps to set up a Google Project, which are covered really well in the project README. From there it was straightforward.

One custom feature I wanted was the inclusion of a synoptic weather map so I could see cold fronts rolling their way up the coast from Antarctica. Downloading this as a tempfile and attaching it was a snap with the attach_file() function.

Given my aim was to automate this process, I used the gmailr::use_secret_file() function to retrieve my client ID and secret key from a local .json file and authenticate with the Gmail API.
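A minimal sketch of that authentication step, assuming the OAuth client file downloaded from the Google Project has been saved locally (the filename here is illustrative):

```r
# Point gmailr at the downloaded OAuth client credentials;
# subsequent API calls will authenticate using this file
gmailr::use_secret_file("gmailr-secret.json")
```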

# Download Weather Map ----------------------------------------------------

tf <- tempfile(fileext = ".gif")
download.file(url = synoptic_img, destfile = tf, mode = "wb")

# Send Mail ---------------------------------------------------------------
msg <- gmailr::mime() %>%
  gmailr::subject(subject) %>%
  gmailr::to(to) %>%
  gmailr::from(from) %>%
  gmailr::text_body(body) %>%
  gmailr::attach_file(tf)

gmailr::send_message(msg)


Automate it

This combination of tools works great locally; however, for convenience I wanted it to automatically email me at 6am every morning so I could see if it was worth getting out of bed.

One option was to set up an AWS EC2 virtual server.

I cover the basic setup of this in a previous post. The final step is to automate the R script using a cron job (given we are running Linux).

To do this I open the crontab in the terminal

crontab -e

and add a line to the bottom of the file with the particulars of the job scheduling:

# Use the hash sign to prefix a comment
# +---------------- minute (0 - 59)
# |  +------------- hour (0 - 23)
# |  |  +---------- day of month (1 - 31)
# |  |  |  +------- month (1 - 12)
# |  |  |  |  +---- day of week (0 - 7) (Sunday=0 or 7)
# |  |  |  |  |
# *  *  *  *  *  command to be executed

Below we can see a cronjob to run at minute 0, hour 6, every day, every month.

The command changes directory to where the R script is saved and then uses Rscript to run the code in batch mode.

0 6 * * * cd folder_containing_script && Rscript r_file.R

and we’re done.
