diff --git a/DESCRIPTION b/DESCRIPTION index d170c26..c28f68a 100644 --- a/DESCRIPTION +++ b/DESCRIPTION @@ -1,18 +1,14 @@ Package: ggOceanMaps Type: Package Title: Plot Data on Oceanographic Maps using 'ggplot2' -Version: 1.3.7 -Date: 2022-11-24 +Version: 2.0.0 +Date: 2023-07-04 Authors@R: c(person("Mikko", "Vihtakari", email = "mikko.vihtakari@hi.no", role = c("aut", "cre"), comment = c(affiliation = "Institute of Marine Research", ORCID = "0000-0003-0371-4319")), - person("Yves", "Reecht", role = "ctb", - comment = c(affiliation = "Institute of Marine Research", - ORCID = "0000-0003-3583-1843")), - person("Hadley", "Wickham", role = "ctb"), - person("Simon", "O'Hanlon", role = "ctb"), - person("Roger", "Bivand", role = "ctb") + person("Roger", "Bivand", role = "ctb"), + person("Hadley", "Wickham", role = "ctb") ) URL: https://mikkovihtakari.github.io/ggOceanMaps/ BugReports: https://github.com/MikkoVihtakari/ggOceanMaps/issues @@ -21,16 +17,17 @@ Description: Allows plotting data on bathymetric maps using 'ggplot2'. Plotting for custom modifications. Data that contain geographic information from anywhere around the globe can be plotted on maps generated by the basemap() or qmap() functions using 'ggplot2' layers separated by the '+' operator. The - package uses spatial shapefiles stored in the 'ggOceanMapsData' package, - geospatial packages for R to manipulate, and the 'ggspatial' package to help - to plot these shapefiles. High-resolution shapefiles for detailed maps are - stored on GitHub and downloaded automatically when needed. -Depends: R (>= 3.5.0), ggplot2, ggspatial -Imports: sp, raster, sf, rgeos, methods, utils, stars, smoothr, units, dplyr, parallel -Suggests: ggOceanMapsData, cowplot, knitr, rmarkdown, scales, rgdal, ggnewscale -Additional_repositories: https://mikkovihtakari.github.io/drat + package uses spatial shape- ('sf') and raster ('stars') files, geospatial + packages for R to manipulate, and the 'ggplot2' package to plot these + files. 
The package ships with low-resolution spatial data files and + higher resolution files for detailed maps are stored in the + 'ggOceanMapsLargeData' repository on GitHub and downloaded automatically + when needed. +Depends: R (>= 3.5.0), ggplot2 +Imports: sf, stars, methods, utils, smoothr, units +Suggests: ggspatial, cowplot, knitr, rmarkdown, scales, ggnewscale License: GPL-3 Encoding: UTF-8 -RoxygenNote: 7.2.1 +RoxygenNote: 7.2.3 LazyData: true - +LazyDataCompression: xz diff --git a/NAMESPACE b/NAMESPACE index ae332a5..c44f41d 100644 --- a/NAMESPACE +++ b/NAMESPACE @@ -6,13 +6,13 @@ export(auto_limits) export(basemap) export(basemap_data) export(clip_shapefile) -export(convert_crs) +export(dd_clip_boundary) export(dd_to_deg) +export(define_bathy_style) export(define_shapefiles) export(deg_to_dd) export(dist2land) export(geonorge_bathymetry) -export(getCores) export(guess_coordinate_columns) export(is_decimal_limit) export(load_map_data) @@ -21,6 +21,7 @@ export(qmap) export(quiet) export(raster_bathymetry) export(reorder_layers) +export(rotate_crs) export(round_any) export(select_element) export(shapefile_list) @@ -28,25 +29,10 @@ export(theme_map) export(transform_coord) export(vector_bathymetry) import(ggplot2) -import(ggspatial) -import(raster) -import(rgeos) -import(sf) -import(sp) importFrom(grDevices,chull) importFrom(methods,"slot<-") importFrom(methods,slot) -importFrom(parallel,detectCores) -importFrom(parallel,makeCluster) -importFrom(parallel,mclapply) -importFrom(parallel,parLapply) -importFrom(parallel,stopCluster) -importFrom(rgeos,createSPComment) -importFrom(rgeos,gDistance) -importFrom(rgeos,gIntersection) -importFrom(rgeos,gIntersects) -importFrom(rgeos,gSimplify) +importFrom(sf,st_sf) +importFrom(stars,read_stars) importFrom(utils,download.file) importFrom(utils,menu) -importFrom(utils,setTxtProgressBar) -importFrom(utils,txtProgressBar) diff --git a/NEWS.md b/NEWS.md index 0903939..34ec8f7 100644 --- a/NEWS.md +++ b/NEWS.md @@ -1,11 +1,17 @@ 
-# ggOceanMaps 1.4 (development version) - +# ggOceanMaps 2.0 + +* Full [sf](https://r-spatial.github.io/sf/) integration. Old GIS packages for R and ggspatial dependencies removed. Since this change required rewriting most of the functions, new bugs have almost certainly been introduced. +* Bathymetry system redesigned (see [this](https://mikkovihtakari.github.io/ggOceanMaps/articles/new-features.html)). +* Decimal degree maps can now be plotted across the antimeridian. +* Added spatial data to ggOceanMaps, removing the need for the ggOceanMapsData package. +* `dist2land()` now uses great circle distances on a spherical Earth ([s2](https://r-spatial.github.io/s2/)) by default and should be able to calculate distances to land anywhere around the globe. +* `qmap()` points turned red. Addressed a long-standing issue with the `shapefiles` and `shape` arguments getting mixed up. * Fixed a bug with shapefiles argument shortcut. * Fixed a bug in ices_data -* Add sf support for clip_shapefile -* Add sf support for shapefiles (converts to sp, needs to be refined) -* Fix a bug with expanded limits in decimal degree projection -* Fix a bug where shapefile_list("all") would retunr multiple rows per shapefile name. +* Added sf support for clip_shapefile +* Added sf support for shapefiles (converts to sp, needs to be refined) +* Fixed a bug with expanded limits in decimal degree projection +* Fixed a bug where shapefile_list("all") would return multiple rows per shapefile name. # ggOceanMaps 1.3.4 diff --git a/R/auto_limits.R b/R/auto_limits.R index 63277ee..413538a 100644 --- a/R/auto_limits.R +++ b/R/auto_limits.R @@ -1,53 +1,33 @@ #' @title Automatic limits for basemap #' @description Find limits for a \code{\link{basemap}} from a data frame. -#' @param data Data frame containing data for which the limits should be calculated. +#' @param data Data frame or a spatial object containing data for which the limits should be calculated.
#' @param proj.in Original \code{\link[sf:st_crs]{CRS}} projection. Must be defined as a character argument. #' @param proj.out Resulting map projection. See \code{\link{transform_coord}}. #' @param lon,lat Names of longitude and latitude columns in \code{data} as character or integer index. If \code{NULL}, the column names are \link[=guess_coordinate_columns]{guessed}. #' @param expand.factor Expansion factor for map limits. Set to \code{NULL} to ignore. #' @param rotate Logical indicating whether the limits should be rotated to point towards the pole relative to the mid-longitude limit. #' @param verbose Logical indicating whether information about the projection and guessed column names should be returned as a message. Set to \code{FALSE} to make the function silent. -#' @param output.sf Logical indicating whether an \code{\link[sf:st_polygon]{sf}} (\code{TRUE}) or \code{sp} (\code{FALSE}) polygon should be returned. #' @details This is an internal function, which is automatically run by the \code{\link{basemap}} function. #' @return A list of limits and projections in \code{proj.in} and \code{proj.out} formats.
#' @keywords internal #' @author Mikko Vihtakari #' @family customize shapefiles #' @examples -#' if(requireNamespace("ggOceanMapsData")) { #' auto_limits(data = expand.grid(lon = c(-120, 180, 120), #' lat = c(60, 60, 80))) -#' } #' @export -# data = expand.grid(lon = c(-120, 180, 120), lat = c(60, 60, 80)); lon = NULL; lat = NULL; proj.in = 4326; proj.out = NULL; expand.factor = NULL; verbose = FALSE; output.sf = FALSE -auto_limits <- function(data, lon = NULL, lat = NULL, proj.in = 4326, proj.out = NULL, expand.factor = NULL, verbose = FALSE, output.sf = FALSE) { +# lon = NULL; lat = NULL; proj.in = 4326; proj.out = NULL; expand.factor = NULL; verbose = FALSE +auto_limits <- function(data, lon = NULL, lat = NULL, proj.in = 4326, proj.out = NULL, expand.factor = NULL, verbose = FALSE) { - # Get limits from spatial polygons #### + # Get coordinates from spatial objects - if(any(inherits(data, c("SpatialPolygonsDataFrame", "SpatialPolygons")))) { - proj.in <- raster::crs(data) - - # if(!sf::st_is_longlat(proj.in)) { - # data <- sp::spTransform(data, sp::CRS(convert_crs(4326))) - # proj.in <- convert_crs(4326) - # message("The data argument is a spatial polygons object, which is not given as decimal degrees. 
Converted to decimal degrees.") - # } - - data <- suppressMessages(ggplot2::fortify(data)[c("long", "lat")]) - names(data) <- c("lon", "lat") - } - - # Get limits from sf objects - - if(any(inherits(data, "sf"))) { - proj.in <- sf::st_crs(data) - - tmp <- sf::st_bbox(data) + if(any(inherits(data, c("sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons")))) { + tmp <- sf::st_bbox(sf::st_transform(sf::st_as_sfc(sf::st_bbox(data)), proj.in)) data <- expand.grid(data.frame( - lon = tmp[c(1,3)], - lat = tmp[c(2,4)]) + lon = tmp[c("xmin", "xmax")], + lat = tmp[c("ymin", "ymax")]) ) } @@ -125,18 +105,15 @@ auto_limits <- function(data, lon = NULL, lat = NULL, proj.in = 4326, proj.out = } - # Projected boundaries + projLims <- stats::setNames(projLims, c("xmin", "xmax", "ymin", "ymax")) + # Projected boundaries - ## Old sp way, to be removed when the sf way is tested - # projBound <- sp::Polygon(matrix(c(projLims[1], projLims[3], projLims[1], projLims[4], projLims[2], projLims[4], projLims[2], projLims[3], projLims[1], projLims[3]), ncol = 2, byrow = TRUE)) - # projBound <- sp::SpatialPolygons(list(sp::Polygons(list(projBound), ID = "clip_boundary")), - # proj4string = if(class(proj.crs) == "CRS") {proj.crs} else {sp::CRS(proj.crs)}) - # tmp <- as.data.frame(t(sp::bbox(projBound))) - # projBoundNodes <- sp::SpatialPoints(expand.grid(lon = tmp$x, lat = tmp$y), - # proj4string = if(class(proj.crs) == "CRS") {proj.crs} else {sp::CRS(convert_crs(proj.crs))}) + projBound <- sf::st_polygon( + list(matrix(c(projLims[1], projLims[3], projLims[1], projLims[4], projLims[2], + projLims[4], projLims[2], projLims[3], projLims[1], projLims[3]), + ncol = 2, byrow = TRUE))) - projBound <- sf::st_polygon(list(matrix(c(projLims[1], projLims[3], projLims[1], projLims[4], projLims[2], projLims[4], projLims[2], projLims[3], projLims[1], projLims[3]), ncol = 2, byrow = TRUE))) projBound <- sf::st_sfc(projBound, crs = sf::st_crs(proj.crs)) tmp <- sf::st_bbox(projBound) @@ -148,20 +125,18 @@ 
auto_limits <- function(data, lon = NULL, lat = NULL, proj.in = 4326, proj.out = decBoundNodes <- sf::st_transform(projBoundNodes, 4326) - # decBoundNodes <- sp::spTransform(projBoundNodes, sp::CRS(convert_crs(4326))) # proj.in - if(!identical(sign(projLims[3]), sign(projLims[4]))) { # Spans across the pole - decLims <- unname(sf::st_bbox(decBoundNodes)[c(1,3,2,4)]) # old: c(raster::extent(decBoundNodes)[1:3], 90) + decLims <- c(unname(sf::st_bbox(decBoundNodes)[c(1,3,2)]), 90) # old: c(raster::extent(decBoundNodes)[1:3], 90) decLims <- c(decLims[1:3], sign(decLims[4]) * 90) } else if(sign(decLims[1]) != sign(decLims[2]) & decLims[1] < decLims[2]) { # Antimeridian correction decLims <- unname(sf::st_bbox(decBoundNodes)[c(1,3,2,4)]) # old: c(raster::extent(decBoundNodes)[1:4]) } + names(decLims) <- c("xmin", "xmax", "ymin", "ymax") + # Return - if(output.sf) { - list(ddLimits = decLims, projLimits = projLims, projBound = projBound, proj.in = attributes(x)$proj.in, proj.out = proj.out) - } else { - list(ddLimits = decLims, projLimits = projLims, projBound = sf::as_Spatial(projBound), proj.in = attributes(x)$proj.in, proj.out = proj.out) - } + list(ddLimits = decLims, projLimits = projLims, projBound = projBound, + proj.in = attributes(x)$proj.in, proj.out = proj.out) + } diff --git a/R/basemap.R b/R/basemap.R index 5399b99..5fa4b63 100644 --- a/R/basemap.R +++ b/R/basemap.R @@ -1,37 +1,37 @@ #' @title Create a ggplot2 basemap for plotting variables -#' @description Creates a ggplot2 basemap for further plotting of variables. +#' @description Creates a ggplot2 basemap for further plotting of data. #' @param x The limit type (\code{limits}, \code{data}, or \code{shapefiles}) is automatically recognized from the class of this argument. #' @param limits Map limits. 
One of the following: #' \itemize{ -#' \item \strong{numeric vector} of length 4: The first element defines the start longitude, the second element the end longitude (counter-clockwise), the third element the minimum latitude and the fourth element the maximum latitude of the bounding box. The coordinates can be given as decimal degrees or coordinate units for shapefiles used by a projected map. Produces a rectangular map. Latitude limits not given in min-max order are automatically ordered to respect this requirement. +#' \item \strong{numeric vector} of length 4: The first element defines the start longitude, the second element the end longitude (counter-clockwise), the third element the minimum latitude, and the fourth element the maximum latitude of the bounding box. Also accepts \code{\link[sf:st_bbox]{sf::st_bbox}} type named vectors with limits in any order. The coordinates can be given as decimal degrees or coordinate units for shapefiles used by a projected map. Produces a rectangular map. Latitude limits not given in min-max order are automatically ordered to respect this requirement. #' \item \strong{single integer} between 30 and 88 or -88 and -30 produces a polar map for the Arctic or Antarctic, respectively. #' } #' Can be omitted if \code{data} or \code{shapefiles} are defined. -#' @param data A data frame, \link[sp]{SpatialPolygons}, or \link[sf]{sf} shape containing longitude and latitude coordinates. If a data frame, the coordinates have to be given in decimal degrees. The limits are extracted from these coordinates and produces a rectangular map. Suited for situations where a certain dataset is plotted on a map. The function attempts to \link[=guess_coordinate_columns]{guess the correct columns} and it is advised to use intuitive column names for longitude (such as "lon", "long", or "longitude") and latitude ("lat", "latitude") columns. Can be omitted if \code{limits} or \code{shapefiles} are defined. 
-#' @param shapefiles Either a \link[=shapefile_list]{list containing shapefile information} or a character argument referring to a name of pre-made shapefiles in \code{\link{shapefile_list}}. This name is partially matched. Can be omitted if \code{limits} or \code{data} are defined as decimal degrees. -#' @param bathymetry Logical indicating whether bathymetry should be added to the map. -#' @param glaciers Logical indicating whether glaciers and ice-sheets should be added to the map. -#' @param rotate Logical indicating whether the projected maps should be rotated to point towards the pole relative to mid-longitude limit. Experimental. -#' @param bathy.style Character defining the style for bathymetry contours. Alternatives: -#' \itemize{ -#' \item \code{"poly_blues"} plots polygons filled with different shades of blue. -#' \item \code{"poly_greys"} plots polygons filled with different shades of gray. -#' \item \code{"contour_blues"} contour lines with different shades of blue. -#' \item \code{"contour_grey"} plots gray contour lines. -#' } +#' @param data A data frame, sp, or \link[sf]{sf} shape containing longitude and latitude coordinates. If a data frame, the coordinates have to be given in decimal degrees. The limits are extracted from these coordinates and produce a rectangular map. Suited for situations where a certain dataset is plotted on a map. The function attempts to \link[=guess_coordinate_columns]{guess the correct columns} and it is advised to use intuitive column names for longitude (such as "lon", "long", or "longitude") and latitude ("lat", "latitude") columns. Can be omitted if \code{limits} or \code{shapefiles} are defined. +#' @param shapefiles Either a \link[=shapefile_list]{list containing shapefile information} or a character argument referring to a name of pre-made shapefiles in \code{\link{shapefile_list}}. This name is partially matched. Can be omitted if \code{limits} or \code{data} is defined as decimal degrees. 
+#' @param crs \link[sf:st_crs]{Coordinate reference system} (CRS) for the map. If \code{NULL} (default), the CRS is selected automatically based on \code{limits}, \code{data}, or \code{shapefiles}. Passed to \code{\link[sf]{st_crs}}. Typically, integers giving the EPSG code are the easiest. Cannot be used simultaneously with \code{rotate}. +#' @param bathymetry Logical indicating whether bathymetry should be added to the map. Functions together with \code{bathy.style}. See Details. +#' @param glaciers Logical indicating whether glaciers and ice sheets should be added to the map. +#' @param rotate Logical indicating whether the projected maps should be rotated to point towards the pole relative to the mid-longitude limit. +#' @param bathy.style Character (plots bathymetry; list of alternatives in Details) or \code{NULL} ("raster_binned_blues" if \code{bathymetry = TRUE}) defining the bathymetry style. Partially matched, can be abbreviated, and used to control bathymetry plotting together with \code{bathymetry}. See Details. +#' @param downsample Integer defining the downsampling rate for raster bathymetries. A value of 0 (default) does not downsample, 1 skips every second row, and 2 skips every second and third row. See \code{\link[stars]{geom_stars}}. #' @param legends Logical indicating whether the legend for bathymetry should be shown. #' @param legend.position The position for the ggplot2 legend. See the argument with the same name in \link[ggplot2]{theme}. #' @param lon.interval,lat.interval Numeric value specifying the interval of longitude and latitude grids. \code{NULL} finds reasonable defaults depending on \code{limits}. -#' @param land.col,gla.col,grid.col Character code specifying the color of land, glaciers and grid lines, respectively. Use \code{NA} to remove the grid lines. +#' @param land.col,gla.col,grid.col Character code specifying the color of land, glaciers, and grid lines, respectively. Use \code{NA} to remove the grid lines.
#' @param land.border.col,gla.border.col,bathy.border.col Character code specifying the color of the border line for land, glacier, and bathymetry shapes. #' @param land.size,gla.size,bathy.size,grid.size Numeric value specifying the width of the border line for land, glacier, and bathymetry shapes, as well as the grid lines, respectively. Use the \code{\link{LS}} function for a specific width in pt. See Details. -#' @param bathy.alpha Transparency parameter for bathymetry fill color. See \link[ggplot2]{scale_alpha}. +#' @param bathy.alpha Transparency parameter for the bathymetry fill color. See \link[ggplot2]{scale_alpha}. #' @param base_size Base size parameter for ggplot. See \link[ggplot2]{ggtheme}. #' @param projection.grid Logical indicating whether the coordinate grid should show projected coordinates instead of decimal degree values. Useful to define limits for large maps in polar regions. #' @param expand.factor Expansion factor for map limits with the \code{data} argument. Can be used to zoom automatically limited maps in and out. Defaults to 1.1. Set to \code{NULL} to ignore. -#' @param verbose Logical indicating whether information about the projection and guessed column names should be returned as message. Set to \code{FALSE} to make the function silent. +#' @param verbose Logical indicating whether information about the projection and guessed column names should be returned as messages. Set to \code{FALSE} to make the function silent. #' @return Returns a \link[ggplot2]{ggplot} map, which can be assigned to an object and modified as any ggplot object. -#' @details The function uses \link[ggplot2:ggplot2-package]{ggplot2}, \link[ggspatial:ggspatial-package]{ggspatial}, GIS packages of R, and shapefiles to plot maps of the world's oceans. +#' @details The function uses \link[ggplot2:ggplot2-package]{ggplot2}, \link[sf:sf]{sf}, \link[stars:st_as_stars]{stars}, and spatial files to plot maps of the world's oceans.
+#' +#' \strong{Limits} +#' +#' If the limits are in decimal degrees, the longitude limits (\code{[1:2]}) specify the start and end segments of corresponding angular lines that should reside inside the map area. The longitude limits are defined \strong{counter-clockwise}. The latitude limits \code{[3:4]} define the parallels that should reside inside the limited region given the longitude segments. Note that the actual limited region becomes wider than the polygon defined by the coordinates (shown in Examples). Using \code{data} to limit the map expands the map all around the data points to make them fit into the map. If the limits are given as projected coordinates or as decimal degrees for maps with -60 < latitude < 60, limit elements represent lines encompassing the map area in cartesian space. #' #' \strong{Projections} #' @@ -42,38 +42,53 @@ #' \item \strong{4326} WGS 84 / World Geodetic System 1984, used in GPS. Called "DecimalDegree". For min latitude (\code{limits[3]}) < 30 or > -30, max latitude (\code{limits[4]}) < 60 or > -60, and single integer latitudes < 30 and > -30. #' } #' -#' \strong{Limits} -#' -#' If the limits are in decimal degrees, the longitude limits (\code{[1:2]}) specify the start and end segments of corresponding angular lines that should reside inside the map area. The longitude limits are defined \strong{counter-clockwise}. The latitude limits \code{[3:4]} define the parallels that should reside inside the limited region given the longitude segments. Note that the actual limited region becomes wider than the polygon defined by the coordinates (shown in Examples). Using \code{data} to limit the map expands the map all around the data points to make them fit into the map. If the limits are given as projected coordinates or as decimal degrees for maps with -60 < latitude < 60, limits elements represent lines encompassing the map area in cartesian space. 
-#' +#' The \code{rotate} argument changes the pre-defined projection such that mid-longitude point in the map points northward. +#' +#' The \code{crs} argument can be used to define the projection, which can be useful when plotting, for instance, model data that are difficult to transform into another projection. +#' +#' \strong{Bathymetry} +#' +#' Bathymetry can be plotted by simply specifying \code{bathymetry = TRUE} or \code{bathy.style} (you won't need to specify both any longer). The former uses a low-resolution raster file shipped with ggOceanMaps. The package contains an option to plot higher resolution bathymetries than the default binned blue alternative (\code{bathy.style = "raster_binned_blues"}). These bathymetries can be accessed by specifying the \code{bathy.style} argument and require a download from \href{https://github.com/MikkoVihtakari/ggOceanMapsLargeData}{ggOceanMapsLargeData} or other online repositories. The \code{bathy.style} character argument consists of three parts separated by a \code{_}. The first part gives the type: raster, poly(gon), or contour. The two latter ones use vector data. The second part gives the resolution: binned, continuous or user. The continuous and user options cannot be used for vector data. The user option accepts any raster file that can be opened using \link[stars]{read_stars}. The path to the file has to be stored in \code{ggOceanMaps.userpath} \link[base:options]{option} (e.g. \code{options(ggOceanMaps.userpath = "PATH_TO_THE_FILE")}) (you can set this in .Rprofile to avoid having to type it every time). The last part defines the color: blues or grays. These options can be abbreviated by specifying the first letter of each part. Gray contour lines are an exception to the rule above and can be plotted using \code{bathy.style = "contour_gray"}. Future versions may contain a combination of raster and gray contours, but these have not been implemented yet. 
Currently implemented \code{bathy.style} alternatives are: +#' \itemize{ +#' \item \code{NULL} (\strong{default}). Bathymetry style is searched from \code{getOption("ggOceanMaps.bathy.style")}. If not found, \code{"raster_binned_blues"} is used. +#' \item \code{"raster_binned_blues"} or \code{"rbb"} plots binned raster bathymetry filled with different shades of blue. Does not require a download. +#' \item \code{"raster_binned_grays"} or \code{"rbg"} the same than above but uses different shades of gray. +#' \item \code{"raster_continuous_blues"} or \code{"rcb"} plots continuous raster bathymetry filled with different shades of blue. More detailed and visually more appealing than the binned bathymetry. Recommended. Requires a download. +#' \item \code{"raster_continuous_grays"} or \code{"rcg"} the same than above but uses different shades of gray. +#' \item \code{"raster_user_blues"} or \code{"rub"} plots continuous raster bathymetry filled with different shades of blue from \code{getOption("ggOceanMaps.user.path")}. Any file supported by \link[stars]{read_stars} should work. The file has to be placed into the location specified by the path. Experimental feature. Has been tested using \href{https://www.ncei.noaa.gov/products/etopo-global-relief-model}{ETOPO 60 arc-second} and \href{https://www.gebco.net/data_and_products/gridded_bathymetry_data/}{GEBCO 15 arc-second} grids. Please report any bugs you find. +#' \item \code{"raster_user_grays"} or \code{"rug"} the same than above but uses different shades of gray. +#' \item \code{"poly_binned_blues"}, \code{"poly_blues"}, \code{"pbb"} or \code{"pb"} plots polygon bathymetry filled with different shades of blue. Default in the versions older than 2.0 of ggOceanMaps. Requires a download. +#' \item \code{"poly_binned_grays"}, \code{"poly_grays"}, \code{"pbg"} or \code{"pg"} same than above but uses different shades of gray. 
+#' \item \code{"contour_binned_blues"}, \code{"contour_blues"}, \code{"cbb"} or \code{"cb"} plots contour lines with different shades of blue. Requires a download. +#' \item \code{"contour_gray"} or \code{"cg"} plots gray contour lines. Requires a download. +#' } +#' +#' The default can be changed by setting the \code{ggOceanMaps.bathy.style} option. \code{options(ggOceanMaps.bathy.style = "poly_blues")} would make the style similar to pre-2.0 versions of ggOceanMaps. +#' #' \strong{Pre-made shapefiles} #' -#' If the limits are not defined as decimal degrees (any longitude outside range [-180, 180] or latitude [-90, 90]), the function will ask to specify \code{shapefiles}. The \code{shapefiles} can be defined by partially matching the names of the pre-made shapefiles in \code{\link{shapefile_list}} (e.g. "Ar" would be enough for "ArcticStereographic") or by specifying custom shapefiles. +#' If the limits are not defined as decimal degrees (any longitude outside the range [-180, 180] or latitude [-90, 90]), the function will ask to specify \code{shapefiles}. The \code{shapefiles} can be defined by partially matching the names of the pre-made shapefiles in \code{\link{shapefile_list}} (e.g. "Ar" would be enough for "ArcticStereographic") or by specifying custom shapefiles. #' #' \strong{Custom shapefiles} #' -#' Custom shapefiles have to be a named list containing at least following elements: +#' Custom shapefiles have to be a named list containing at least the following elements: #' \itemize{ -#' \item \strong{land} Object name of the \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrame}} containing land. Required. -#' \item \strong{glacier} Object name of the \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrame}} containing glaciers. Not required if glaciers are not needed. -#' \item \strong{bathy} Object name of the \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrame}} containing bathymetry contours.
Not required if bathymetry is not needed. +#' \item \strong{land} Object name of the \link[sf:st_sf]{spatial polygon} containing land. Required. +#' \item \strong{glacier} Object name of the \link[sf:st_sf]{spatial polygon} containing glaciers. Not required if glaciers are not needed. +#' \item \strong{bathy} Object name of the \link[sf:st_sf]{spatial polygon} or \link[stars:st_as_stars]{raster} containing bathymetry data. Not required if bathymetry is not needed. #' } #' #' See Examples. #' #' \strong{Line width and font size} #' -#' The line size aesthetics in \link[ggplot2:ggplot2-package]{ggplot2} generates approximately 2.13 wider lines measured in pt than the given values. If you want a specific line width in pt, use the internal function \code{\link{LS}} to convert the desired line width to ggplot2 equivalent. A similar function is also available for font sizes (\code{\link{FS}}). -#' -#' \strong{CRS warnings} +#' The line size aesthetics in \link[ggplot2:ggplot2-package]{ggplot2} generates approximately 2.13 wider lines measured in pt than the given values. If you want a specific line width in pt, use the internal function \code{\link{LS}} to convert the desired line width to the ggplot2 equivalent. A similar function is also available for font sizes (\code{\link{FS}}). #' -#' Open-source GIS systems are rolling over to a new \href{https://rgdal.r-forge.r-project.org/articles/CRS_projections_transformations.html}{to a new projection definition system}. The changes to underlying systems appear to sometimes trigger warnings the user can ignore as long as the resulting map looks OK. Bug reports regarding these warnings are appreciated. -#' -#' @references Note that if you use this function to generate maps for a publication, it is advised to cite the underlying data. 
The spatial data used by this function have been acquired from following sources: +#' @references Note that if you use this function to generate maps for a publication, it is advised to cite the underlying data. The spatial data used by this function have been acquired from the following sources: #' \itemize{ -#' \item \strong{Land polygons.} \href{http://www.naturalearthdata.com/downloads/10m-physical-vectors/}{Natural Earth Data} 1:10m Physical Vectors with the Land and Minor Island datasets combined. Distributed under the \href{https://creativecommons.org/publicdomain/}{CC Public Domain license} (\href{http://www.naturalearthdata.com/about/terms-of-use/}{terms of use}). -#' \item \strong{Glacier polygons.} \href{http://www.naturalearthdata.com/downloads/10m-physical-vectors/}{Natural Earth Data} 1:10m Physical Vectors with the Glaciated Areas and Antarctic Ice Shelves datasets combined. Distributed under the \href{https://creativecommons.org/publicdomain/}{CC Public Domain license} (\href{http://www.naturalearthdata.com/about/terms-of-use/}{terms of use}) -#' \item \strong{Bathymetry.} \href{https://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/docs/ETOPO1.pdf}{Amante, C. and B.W. Eakins, 2009. ETOPO1 1 Arc-Minute Global Relief Model: Procedures, Data Sources and Analysis. NOAA Technical Memorandum NESDIS NGDC-24. National Geophysical Data Center, NOAA}. Distributed under the \href{https://www.usa.gov/government-works}{U.S. Government Work license}. +#' \item \strong{Land polygons.} \href{https://www.naturalearthdata.com/downloads/10m-physical-vectors/}{Natural Earth Data} 1:10m Physical Vectors with the Land and Minor Island datasets combined. Distributed under the \href{https://creativecommons.org/publicdomain/}{CC Public Domain license} (\href{https://www.naturalearthdata.com/about/terms-of-use/}{terms of use}). 
+#' \item \strong{Glacier polygons.} \href{https://www.naturalearthdata.com/downloads/10m-physical-vectors/}{Natural Earth Data} 1:10m Physical Vectors with the Glaciated Areas and Antarctic Ice Shelves datasets combined. Distributed under the \href{https://creativecommons.org/publicdomain/}{CC Public Domain license} (\href{https://www.naturalearthdata.com/about/terms-of-use/}{terms of use}) +#' \item \strong{Bathymetry.} \href{https://www.ncei.noaa.gov/products/etopo-global-relief-model}{NOAA National Centers for Environmental Information. 2022: ETOPO 2022 15 Arc-Second Global Relief Model. NOAA National Centers for Environmental Information.} \doi{10.25921/fd45-gt74}. Distributed under the \href{https://www.usa.gov/government-works}{U.S. Government Work license}. #' } #' #' @family basemap functions @@ -84,71 +99,77 @@ #' # The easiest way to produce a map is to use the limits #' # argument and decimal degrees: #' +#' basemap(limits = 60) # synonym to basemap(60) #' \donttest{ -#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) { -#' basemap(limits = 60) -#' -#' # Bathymetry and glaciers can be added using the respective arguments: +#' # Bathymetry can be added using the respective argument: +#' basemap(limits = -60, bathymetry = TRUE) #' -#' basemap(limits = -60, bathymetry = TRUE, glaciers = TRUE) +#' \dontrun{ +#' # Glaciers require a download in the new version: +#' basemap(limits = -60, glaciers = TRUE, shapefiles = "Arctic") +#' } #' #' # The easiest way to add data on the maps is to use the ggspatial functions: -#' #' dt <- data.frame(lon = c(-150, 150), lat = c(60, 90)) -#' +#' if(requireNamespace("ggspatial", quietly = TRUE)) { #' basemap(data = dt, bathymetry = TRUE) + -#' geom_spatial_point(data = dt, aes(x = lon, y = lat), color = "red") -#' +#' ggspatial::geom_spatial_point(data = dt, aes(x = lon, y = lat), +#' color = "red") +#' } #' \dontrun{ #' # Note that writing out data = dt is required because there are multiple #' # underlying ggplot 
layers plotted already: #' basemap(data = dt) + -#' geom_spatial_point(dt, aes(x = lon, y = lat), color = "red") +#' ggspatial::geom_spatial_point(dt, aes(x = lon, y = lat), color = "red") #' #> Error: `mapping` must be created by `aes()` #' } #' #' # If you want to use native ggplot commands, you need to transform your data #' # to the projection used by the map: -#' #' dt <- transform_coord(dt, bind = TRUE) #' -#' basemap(data = dt) + geom_point(data = dt, aes(x = lon.proj, y = lat.proj)) +#' basemap(data = dt) + +#' geom_point(data = dt, aes(x = lon.proj, y = lat.proj), color = "red") #' #' # The limits argument of length 4 plots a map anywhere in the world: -#' #' basemap(limits = c(100, 160, -20, 30), bathymetry = TRUE) #' -#' # The argument leads to expanded maps towards poles: +#' # The limits are further expanded when using the data argument: #' #' dt <- data.frame(lon = c(-160, 160, 160, -160), lat = c(80, 80, 60, 60)) #' -#' basemap(limits = c(160, -160, 60, 80)) + -#' geom_spatial_polygon(data = dt, aes(x = lon, y = lat), -#' fill = NA, color = "red") -#' -#' # The limits are further expanded when using the data argument: -#' +#' if(requireNamespace("ggspatial", quietly = TRUE)) { #' basemap(data = dt) + -#' geom_spatial_polygon(data = dt, aes(x = lon, y = lat), -#' fill = NA, color = "red") +#' ggspatial::geom_spatial_polygon(data = dt, aes(x = lon, y = lat), +#' fill = NA, color = "red") #' #' # Rotate: +#' basemap(data = dt, rotate = TRUE) + +#' ggspatial::geom_spatial_polygon(data = dt, aes(x = lon, y = lat), +#' fill = NA, color = "red") +#' } #' +#' # Alternative: #' basemap(data = dt, rotate = TRUE) + -#' geom_spatial_polygon(data = dt, aes(x = lon, y = lat), -#' fill = NA, color = "red") +#' geom_polygon(data = transform_coord(dt, rotate = TRUE), +#' aes(x = lon, y = lat), fill = NA, color = "red") #' #' ## To find UTM coordinates to limit a polar map: #' basemap(limits = 60, projection.grid = TRUE) +#' +#' \dontrun{ +#' # (Arctic shapes require 
a download in 2.0) #' basemap(limits = c(2.5e4, -2.5e6, 2e6, -2.5e5), shapefiles = "Arctic") #' -#' # Using custom shapefiles +#' # Using custom shapefiles (requires download): #' data(bs_shapes, package = "ggOceanMapsData") -#' basemap(shapefiles = list(land = bs_land), -#' bathymetry = TRUE) +#' basemap(shapefiles = list(land = bs_land)) #' -#' # grid.col = NA removes grid lines, rotate = TRUE rotates northwards +#' # Premade shapefiles from ggOceanMapsLargeData (requires download): +#' basemap("BarentsSea", bathymetry = TRUE) +#' } #' +#' # grid.col = NA removes grid lines, rotate = TRUE rotates northwards: #' basemap(limits = c(-180, -140, 50, 70), grid.col = NA, rotate = TRUE) #' #' # Rename axis labels @@ -166,41 +187,74 @@ #' axis.ticks.y = element_blank() #' ) #' } -#' } -#' @import ggplot2 ggspatial sp sf +#' @import ggplot2 #' @export ## Test parameters -# x = NULL; limits = NULL; data = NULL; shapefiles = NULL; bathymetry = FALSE; glaciers = FALSE; rotate = FALSE; legends = TRUE; legend.position = "right"; lon.interval = NULL; lat.interval = NULL; bathy.style = "poly_blues"; bathy.border.col = NA; bathy.size = 0.1; land.col = "grey60"; land.border.col = "black"; land.size = 0.1; gla.col = "grey95"; gla.border.col = "black"; gla.size = 0.1; grid.col = "grey70"; grid.size = 0.1; base_size = 11; projection.grid = FALSE; verbose = TRUE basemap <- function(x = NULL, limits = NULL, data = NULL, shapefiles = NULL, bathymetry = FALSE, glaciers = FALSE, rotate = FALSE, legends = TRUE, legend.position = "right", lon.interval = NULL, lat.interval = NULL, bathy.style = "poly_blues", bathy.border.col = NA, bathy.size = 0.1, bathy.alpha = 1, land.col = "grey60", land.border.col = "black", land.size = 0.1, gla.col = "grey95", gla.border.col = "black", gla.size = 0.1, grid.col = "grey70", grid.size = 0.1, base_size = 11, projection.grid = FALSE, expand.factor = 1.1, verbose = FALSE) { - - # Install ggOceanMapsData if not installed - # if 
(!requireNamespace("ggOceanMapsData", quietly = TRUE)) { - # stop('The ggOceanMapsData package needs to be installed for ggOceanMaps to function.\nInstall the data package by running\ninstall.packages("ggOceanMapsData", repos = c("https://mikkovihtakari.github.io/drat", "https://cloud.r-project.org"))\nOR\ndevtools::install_github("MikkoVihtakari/ggOceanMapsData")') - # } +# limits = c(160, -160, 60, 80); bathymetry = TRUE +# x = NULL; limits = NULL; data = NULL; shapefiles = NULL; crs = NULL; bathymetry = FALSE; glaciers = FALSE; rotate = FALSE; legends = TRUE; legend.position = "right"; lon.interval = NULL; lat.interval = NULL; bathy.style = NULL; bathy.border.col = NA; bathy.size = 0.1; bathy.alpha = 1; land.col = "grey60"; land.border.col = "black"; land.size = 0.1; gla.col = "grey95"; gla.border.col = "black"; gla.size = 0.1; grid.col = "grey70"; grid.size = 0.1; base_size = 11; projection.grid = FALSE; verbose = TRUE + +basemap <- function(x = NULL, limits = NULL, data = NULL, shapefiles = NULL, crs = NULL, bathymetry = FALSE, glaciers = FALSE, rotate = FALSE, legends = TRUE, legend.position = "right", lon.interval = NULL, lat.interval = NULL, bathy.style = NULL, downsample = 0, bathy.border.col = NA, bathy.size = 0.1, bathy.alpha = 1, land.col = "grey60", land.border.col = "black", land.size = 0.1, gla.col = "grey95", gla.border.col = "black", gla.size = 0.1, grid.col = "grey70", grid.size = 0.1, base_size = 11, projection.grid = FALSE, expand.factor = 1.1, verbose = FALSE) { # The x argument to limits or data if(!is.null(x)) { - if(any(class(x) %in% c("integer", "numeric", "bbox")) & is.null(limits)) { + if(inherits(x, c("integer", "numeric", "bbox")) & is.null(limits)) { limits <- x - } else if(any(class(x) %in% c("data.frame", "data.table", "sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons")) & is.null(limits) & is.null(data)) { + } else if(inherits(x, c("data.frame", "data.table", "sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons", 
"SpatialPoints", "SpatialPointsDataFrame")) & is.null(limits) & is.null(data)) { data <- x - } else if(inherits(x, "character")) { + } else if(inherits(x, c("character", "list"))) { shapefiles <- x } } + # Bathymetry style + + if(is.null(bathy.style)) { + bathy.style <- ifelse(!is.null(getOption("ggOceanMaps.bathy.style")), getOption("ggOceanMaps.bathy.style"), "raster_binned_blues") + } else { + bathymetry <- TRUE + } + + bathy_cmd <- define_bathy_style(bathy.style) + bathy.type <- gsub("_blues$|_grays$", "", names(bathy_cmd)) + bathy.type <- ifelse(grepl("raster", bathy.type), bathy.type, "vector") + bathy_color <- utils::tail(unlist(strsplit(names(bathy_cmd), "_")), n = 1) + + if(bathymetry & !is.null(shapefiles) & inherits(shapefiles, "list")) { + if(!is.null(shapefiles$bathy)) { + if(grepl("raster", bathy.type) & !inherits(shapefiles$bathy$raster, c("stars", "stars_proxy"))) { + if(inherits(shapefiles$bathy$raster, c("sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + msg <- paste0("Detecting vector bathymetry. Code written for <2.0 perhaps? Changing bathy.style to 'poly_blues'.") + + message(paste(strwrap(msg), collapse= "\n")) + bathy.style <- "poly_blues" + + bathy_cmd <- define_bathy_style(bathy.style) + bathy.type <- gsub("_blues$|_grays$", "", names(bathy_cmd)) + bathy.type <- ifelse(grepl("raster_binned|raster_continuous", bathy.type), bathy.type, "vector") + } + } + } else { + stop("shapefiles = list(..., bathy) is required when using custom shapefiles with bathymetry = TRUE") + } + } + + # Checks #### if(is.null(data) & is.null(limits) & is.null(shapefiles)) stop("One or several of the arguments limits, data and shapefiles is required.") if(!is.logical(legends) | !length(legends) %in% 1:2) stop("'legends' argument has to be a logical vector of length 1 or 2. Read the explantion for the argument in ?basemap") + if(!is.null(crs) & rotate) { + rotate <- FALSE + message("The rotate argument cannot be used with custom crs. 
Turning rotate to FALSE.") + } ########### # Data #### - X <- basemap_data(limits = limits, data = data, shapefiles = shapefiles, bathymetry = bathymetry, glaciers = glaciers, lon.interval = lon.interval, lat.interval = lat.interval, rotate = rotate, expand.factor = expand.factor, verbose = verbose) + X <- basemap_data(limits = limits, data = data, shapefiles = shapefiles, crs = crs, bathymetry = bathymetry, bathy.type = bathy.type, downsample = downsample, glaciers = glaciers, lon.interval = lon.interval, lat.interval = lat.interval, rotate = rotate, expand.factor = expand.factor, verbose = verbose) ########### # Plot #### @@ -209,19 +263,23 @@ basemap <- function(x = NULL, limits = NULL, data = NULL, shapefiles = NULL, bat if(bathymetry & !is.null(X$shapefiles$bathy)) { - bathy_cmd <- switch(bathy.style, - poly_blues = "bathy_pb", - poly_greys = "bathy_pg", - contour_blues = "bathy_cb", - contour_grey = "bathy_cg", - stop(paste("bathy.style not found")) - ) - bathy.legend <- ifelse(length(legends) == 1, legends, legends[1]) if(bathy_cmd == "bathy_cg" & is.na(bathy.border.col)) bathy.border.col <- "grey" - layers <- paste(map_cmd("base"), map_cmd(bathy_cmd), sep = " + ") + if(bathy.type == "raster_binned") { + layers <- paste( + map_cmd("base"), map_cmd(bathy_cmd), + ifelse(bathy_color == "blues", map_cmd("bathy_rbb_scale"), map_cmd("bathy_rbg_scale")), + sep = " + ") + } else if(bathy.type %in% c("raster_user", "raster_continuous")) { + layers <- paste( + map_cmd("base"), map_cmd(bathy_cmd), + ifelse(bathy_color == "blues", map_cmd("bathy_rcb_scale"), map_cmd("bathy_rcg_scale")), + sep = " + ") + } else { + layers <- paste(map_cmd("base"), map_cmd(bathy_cmd), sep = " + ") + } } else { layers <- map_cmd("base") @@ -257,6 +315,10 @@ basemap <- function(x = NULL, limits = NULL, data = NULL, shapefiles = NULL, bat } } + ## An expand bug, try with basemap(limits = c(-120, -0, -60, -90), rotate = TRUE) + + if(diff(X$map.limits[3:4]) > 1e6 && X$map.limits[3] == 0) 
X$map.limits[3] <- 1 + ## Final plotting out <- eval(parse(text=layers)) @@ -268,8 +330,7 @@ basemap <- function(x = NULL, limits = NULL, data = NULL, shapefiles = NULL, bat attributes(out)$limits <- X$map.limits attributes(out)$polarmap <- X$polar.map attributes(out)$map.grid <- X$map.grid - attributes(out)$crs <- X$shapefiles$crs - attributes(out)$proj <- X$proj + attributes(out)$crs <- sf::st_crs(X$proj) out diff --git a/R/basemap_data.R b/R/basemap_data.R index cf13b7b..d9de490 100644 --- a/R/basemap_data.R +++ b/R/basemap_data.R @@ -5,20 +5,138 @@ #' @return A list of class \code{basemapData} containing information required for plotting a \code{\link{basemap}}. #' @keywords internal #' @export -#' @import sp sf raster rgeos #' @author Mikko Vihtakari #' @seealso \code{\link{basemap}} # Test paramters # limits = NULL; data = NULL; shapefiles = NULL; bathymetry = FALSE; glaciers = FALSE; lon.interval = NULL; lat.interval = NULL; expand.factor = 1.1; rotate = FALSE; verbose = TRUE -basemap_data <- function(limits = NULL, data = NULL, shapefiles = NULL, bathymetry = FALSE, glaciers = FALSE, lon.interval = NULL, lat.interval = NULL, expand.factor = 1.1, rotate = FALSE, verbose = FALSE) { +basemap_data <- function(limits = NULL, data = NULL, shapefiles = NULL, crs = NULL, bathymetry = FALSE, bathy.type = NULL, downsample = 0, glaciers = FALSE, lon.interval = NULL, lat.interval = NULL, expand.factor = 1.1, rotate = FALSE, verbose = FALSE) { + + ## For code-readability and debugging, the function has been cut to compartments. + ## The internal functions can be found at the end of this script. 
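[Editor's note: as a quick sketch of the limit normalisation that `basemap_data()` performs before defining shapefiles. This mirrors the package code on a hypothetical input rather than calling the package; the numeric values are made up for illustration.]

```r
# Unnamed numeric limits are interpreted as c(lon_min, lon_max, lat, lat);
# the latitudes are sorted south-to-north and bbox-style names are attached.
limits <- c(-20, 30, 80, 60)  # hypothetical input: latitudes given north-first

if (length(limits) == 4 && is.null(names(limits))) {
  limits <- c(limits[1:2], sort(limits[3:4]))
  names(limits) <- c("xmin", "xmax", "ymin", "ymax")
}

print(limits)
#> xmin xmax ymin ymax
#>  -20   30   60   80
```

[Named limits, e.g. from `sf::st_bbox()` (which orders them xmin, ymin, xmax, ymax), are instead reordered by name to the same xmin, xmax, ymin, ymax convention.]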
+ + ## Turn sf limits ggOceanMaps compatible and sort ggOceanMaps latitude limits + + if(length(limits) == 4 & all(c("xmin", "xmax", "ymin", "ymax") %in% names(limits))) { + limits <- limits[c("xmin", "xmax", "ymin", "ymax")] + } + + if(length(limits) == 4 & is.null(names(limits))) { + limits <- c(limits[1:2], sort(limits[3:4])) + names(limits) <- c("xmin", "xmax", "ymin", "ymax") + } + + # 1. Define the shapefiles #### + + x <- basemap_data_define_shapefiles( + limits = limits, data = data, shapefiles = shapefiles, crs = crs, + bathymetry = bathymetry, bathy.type = bathy.type, downsample = downsample, glaciers = glaciers, + rotate = rotate, expand.factor = expand.factor, verbose = FALSE + ) + + # 2. Crop and rotate shapefiles if needed #### + + x <- basemap_data_crop(x = x, bathymetry = bathymetry, glaciers = glaciers, crs = crs) + + # 3. Define the grid lines #### + + x <- basemap_define_grid_lines(x = x, lon.interval = lon.interval, lat.interval = lat.interval) + + # Return #### + + out <- list(shapefiles = x$shapefiles, map.limits = x$map_limits, + polar.map = x$polarMap, map.grid = x$mapGrid, proj = x$crs) + + class(out) <- "basemapData" + + out +} + + + +# Internal functions to make the code easier to read and debug #### + +## Detect case from provided arguments #### + +basemap_data_detect_case <- function(limits = NULL, data = NULL, shapefiles = NULL) { + + if(is.null(limits) & is.null(data) & is.null(shapefiles)) { + "none" + } else if(!is.null(limits)) { + + if(!is.numeric(limits)) stop("Limits have to be given as numeric vectors of length 1 or 4. See Details.") + if(!length(limits) %in% c(1, 4)) stop("Limits has to be either length of 1 or 4. 
See Details.") + + decLimits <- is_decimal_limit(limits) + + if(length(limits) == 1) { + if(!decLimits) stop("Limits of length 1 have to be given as decimal degrees.") + + if(abs(limits) <= 89 & abs(limits) >= 10) { + "limits_polar" + } else { + stop("The limits argument has to be between 10 and 89 (or negative) for polar stereographic maps.") + } + } else if( + (all(abs(limits[1:2]) == 180) & diff(limits[3:4]) <= 60) | + all(abs(limits[1:2]) == 0) + ) { ## Fix the problem with limits ending up on a single line in polar maps + "limits_polar_adjust" + } else if(decLimits) { + if(identical(limits[1], limits[2])) stop("Longitude limits[1:2] equal. Cannot plot a map") + if(identical(limits[3], limits[4])) stop("Latitude limits[3:4] equal. Cannot plot a map") + "limits_dec" + } else { + if(identical(limits[1], limits[2])) stop("Longitude limits[1:2] equal. Cannot plot a map") + if(identical(limits[3], limits[4])) stop("Latitude limits[3:4] equal. Cannot plot a map") + "limits_proj" + } + + + } else if(!is.null(data)) { + if(inherits(data, c("sf", "sfc"))) { + "data_sf" + } else if(inherits(data, c("SpatialPolygons", "SpatialPolygonsDataFrame","SpatialPoints", "SpatialPointsDataFrame"))) { + "data_sp" + } else if(inherits(data, c("data.frame", "tibble", "data.table"))) { + clipLimits <- try(auto_limits(data, verbose = FALSE), silent = TRUE) + + if(inherits(clipLimits, "try-error")) { + decLimits <- FALSE + } else { + decLimits <- is_decimal_limit(clipLimits$ddLimits) + } + + if(decLimits) { + "data_dec" + } else { + "data_proj" + } + } else { + stop("The data argument has to be either a data frame, tibble, data.table, sf or sp object.") + } + + } else { + "shapefiles" + } +} + + +## Define shapefiles #### +basemap_data_define_shapefiles <- function(limits = NULL, data = NULL, shapefiles = NULL, crs = NULL, bathymetry = FALSE, bathy.type = NULL, downsample = 0, glaciers = FALSE, rotate = FALSE, expand.factor = 1.1, verbose = FALSE) { # Switches and checks #### 
shapefilesDefined <- FALSE polarMap <- FALSE - # 1. shapefiles argument dictates the used shapefile. If NULL, shapefiles are obtained from limits ### + # 1. Detect case #### + + case <- basemap_data_detect_case(limits = limits, data = data, shapefiles = shapefiles) + + # 2. Define shapefiles when specified #### + + if(case == "none") shapefiles <- "DecimalDegree" if(!is.null(shapefiles)) { @@ -37,47 +155,64 @@ basemap_data <- function(limits = NULL, data = NULL, shapefiles = NULL, bathymet } if(!glaciers) shapefiles$glacier <- NULL - if(!bathymetry) shapefiles$bathy <- NULL + if(!bathymetry) { + shapefiles$bathy <- NULL + } else { + shapefiles$bathy <- shapefiles$bathy[bathy.type] + } ## Load the shapefiles if download required - if(!is.na(shapefiles$path) & - any(sapply( - shapefiles[names(shapefiles) %in% c("land", "glacier", "bathy")], - function(k) !grepl("::", k) & !is.null(k))) - ) { - - ## Predifined shapefile case - - tmp <- load_map_data(x = shapefiles) - - shapefiles <- stats::setNames(lapply(seq_along(shapefiles), function(i) { - test <- which(names(tmp) %in% shapefiles[[i]]) - - if(length(test) != 1) { - shapefiles[[i]] + shapefiles <- load_map_data(shapefiles) + + # if(!is.na(shapefiles$path) & + # any(sapply( + # shapefiles[names(shapefiles) %in% c("land", "glacier", "bathy")], + # function(k) !grepl("::", k) & !is.null(k))) + # ) { + # + # ## Predifined shapefile case + # + # tmp <- load_map_data(x = shapefiles) + # + # shapefiles <- stats::setNames(lapply(seq_along(shapefiles), function(i) { + # test <- which(names(tmp) %in% shapefiles[[i]]) + # + # if(length(test) != 1) { + # shapefiles[[i]] + # } else { + # if(names(shapefiles)[i] == "glacier" & !glaciers) { + # NULL + # } else if(names(shapefiles)[i] == "bathy" & !bathymetry) { + # NULL + # } else { + # tmp[[test]] + # } + # } + # }), names(shapefiles)) + # } + + # if(is.character(shapefiles$land)) { + # shapefiles$land <- eval(parse(text = shapefiles$land)) + # } + # + # 
if(is.character(shapefiles$glacier)) { + # shapefiles$glacier <- eval(parse(text = shapefiles$glacier)) + # } + # + # if(is.character(shapefiles$bathy)) { + # shapefiles$bathy <- eval(parse(text = shapefiles$bathy)) + # } + + if(any(sapply(shapefiles, function(k) + inherits(k, c("SpatialPolygons", "SpatialPolygonsDataFrame"))))) { + shapefiles <- lapply(shapefiles, function(k) { + if(inherits(k, c("SpatialPolygons", "SpatialPolygonsDataFrame"))) { + sf::st_as_sf(k) } else { - if(names(shapefiles)[i] == "glacier" & !glaciers) { - NULL - } else if(names(shapefiles)[i] == "bathy" & !bathymetry) { - NULL - } else { - tmp[[test]] - } + k } - }), names(shapefiles)) - } - - if(is.character(shapefiles$land)) { - shapefiles$land <- eval(parse(text = shapefiles$land)) - } - - if(is.character(shapefiles$glacier)) { - shapefiles$glacier <- eval(parse(text = shapefiles$glacier)) - } - - if(is.character(shapefiles$bathy)) { - shapefiles$bathy <- eval(parse(text = shapefiles$bathy)) + }) } shapefilesDefined <- TRUE @@ -86,436 +221,559 @@ basemap_data <- function(limits = NULL, data = NULL, shapefiles = NULL, bathymet ##Custom shapefile case - # if(any(!c("land", "glacier", "bathy") %in% names(shapefiles))) stop("Shapefiles object must be a list and contain named elements: 'land', 'glacier', 'bathy'. 
See Details.") # Delete this once you confirm that the changes do not cause bugs -      shapefiles <- shapefiles[c("land", "glacier", "bathy")] shapefiles <- shapefiles[!is.na(names(shapefiles))] - customShapefiles <- sapply(shapefiles, function(k) class(k)) - - if(any(sapply(customShapefiles, - function(k) any(k %in% c("sf", "sfc"))))) { - shapefiles <- lapply(seq_along(shapefiles), function(i) { - if(any(customShapefiles[[i]] %in% c("sf", "sfc"))) { - sf::as_Spatial(shapefiles[[i]]) + if(any(sapply(shapefiles, function(k) + inherits(k, c("SpatialPolygons", "SpatialPolygonsDataFrame"))))) { + shapefiles <- lapply(shapefiles, function(k) { + if(inherits(k, c("SpatialPolygons", "SpatialPolygonsDataFrame"))) { + sf::st_as_sf(k) } else { - shapefiles[[i]] + k } }) - - names(shapefiles) <- names(customShapefiles) - customShapefiles <- sapply(shapefiles, function(k) class(k)) } - if(any(sapply(customShapefiles, function(k) any(!k %in% c("NULL", "SpatialPolygonsDataFrame", "SpatialPolygons"))))) stop("Shapefiles elements 'land', 'glacier', and 'bathy' must either be a SpatialPolygonsDataFrame, SpatialPolygons, or NULL. See Details.") - if(all(sapply(customShapefiles, function(k) k == "NULL"))) stop("One of following shapefiles elements 'land', 'glacier', and 'bathy' must be a SpatialPolygonsDataFrame. See Details.") + customShapefiles <- sapply(shapefiles, function(k) class(k)[1]) - shapefilesDefined <- TRUE + if(all(sapply(customShapefiles, function(k) k == "NULL"))) stop("One of the following shapefiles elements 'land', 'glacier', and 'bathy' must be an sf or stars object. See Details.") } + # if(any(sapply(shapefiles, function(k) !inherits(k, c("NULL", "sf", "SpatialPolygonsDataFrame", "SpatialPolygons"))))) stop("Shapefiles elements 'land', 'glacier', and 'bathy' must either be a SpatialPolygonsDataFrame, SpatialPolygons, or NULL. See Details.") + + shapefilesDefined <- TRUE } - # 2. limits argument dictates the limits and defines shapefile if not specified ### + # 3. 
Define clip limits for shapefiles #### - if(!is.null(limits)) { - - # Checks - - if(!is.numeric(limits)) stop("Limits have to be given as numeric vectors of length 1 or 4. See Details.") - if(!length(limits) %in% c(1, 4)) stop("Limits has to be either length of 1 or 4. See Details.") - - decLimits <- is_decimal_limit(limits) + ## shapefiles #### + + if(case %in% c("shapefiles", "none")) { - # Polar maps + clip_shape <- sf::st_as_sfc(sf::st_bbox(shapefiles$land)) - if(length(limits) == 1) { - if(!decLimits) stop("Limits of length 1 have to be given as decimal degrees.") + if(!is.null(shapefiles$name) && shapefiles$name %in% shapefile_list("all")$name) { + clip_shape <- shapefile_list(shapefiles$name)$limits + crs <- sf::st_crs(shapefile_list(shapefiles$name)$crs) - if(abs(limits) <= 89 & abs(limits) >= 10) { - polarMap <- TRUE + + if(length(clip_shape) == 4) { + limits <- stats::setNames(clip_shape, c("xmin", "xmax", "ymin", "ymax")) + + if(!sf::st_is_longlat(crs)){ + clip_shape <- sf::st_as_sfc( + sf::st_bbox(limits, crs = crs) + ) + + limits <- sf::st_bbox( + sf::st_transform( + clip_shape, crs = 4326))[c("xmin", "xmax", "ymin", "ymax")] + } + } else { - stop("The limits argument has to be between 10 and 89 (or negative) for polar stereographic maps.") + limits <- clip_shape + polarMap <- TRUE + rotate <- FALSE } - } else if( - (all(abs(limits[1:2]) == 180) & diff(limits[3:4]) <= 60) | - all(abs(limits[1:2]) == 0) - ) { ## Fix the problem with limits ending up on a single line in polar maps - - limits <- c(sort(limits[1:2]), sort(limits[3:4])) - + } else if(sf::st_is_longlat(clip_shape)) { + limits <- sf::st_bbox(clip_shape)[c("xmin", "xmax", "ymin", "ymax")] + crs <- sf::st_crs(shapefiles$land) + } else { + limits <- sf::st_bbox(sf::st_transform(clip_shape, crs = 4326))[c("xmin", "xmax", "ymin", "ymax")] + crs <- sf::st_crs(shapefiles$land) + } + + if(rotate & is.numeric(limits) & length(limits) == 4) { + crs <- rotate_crs(crs, limits[1:2]) + clip_shape <- 
dd_clip_boundary(limits, crs, expand.factor = 1.1) + } + + ## polar #### + } else if(case %in% c("limits_polar", "limits_polar_adjust")) { + + if(case %in% c("limits_polar_adjust")) { tmp <- define_shapefiles(limits) if(grepl("antarcticstereographic", tmp$shapefile.name, ignore.case = TRUE) & tmp$decimal.degree.limits) { limits <- max(limits[3:4]) - polarMap <- TRUE message("All decimal degree limits along a single line. You wanted a polar map with latitude limit furthest from the South Pole, right?") } else if (grepl("arcticstereographic", tmp$shapefile.name, ignore.case = TRUE) & tmp$decimal.degree.limits) { limits <- min(limits[3:4]) - polarMap <- TRUE message("All decimal degree limits along a single line. You wanted a polar map with latitude limit furthest from the North Pole, right?") + } else { + stop("limits_polar_adjust definition did not work as expected") } + } + + polarMap <- TRUE + clip_shape <- limits + + if(is.null(shapefiles)) { + shapefile.def <- define_shapefiles(limits, force_dd = TRUE) + if(!is.null(crs)) message("Polar maps have a defined crs. Cannot provide a custom one.") + crs <- sf::st_crs(shapefile.def$crs) + shapefile.name <- shapefile.def$shapefile.name + shapefiles <- shapefile_list(shapefile.name) - } else { # Rectangular maps, length(limits) == 4 - - if(identical(limits[1], limits[2])) stop("Longitude limits[1:2] equal. Cannot plot a map") - - if(!decLimits){ # Limits given as UTM coordinates - - if(rotate) { - message("The map rotation does not work for projected coordinates yet. Changed rotate to FALSE") - rotate <- FALSE - } - - if(identical(limits[3], limits[4])) stop("Latitude limits[3:4] equal. 
Cannot plot a map") - - limits <- c(sort(limits[1:2]), sort(limits[3:4])) - - if(is.null(shapefiles) & !is.null(data)) { # Define shapefiles with help of data - tmp <- data[unname(guess_coordinate_columns(data))] - ddLimits <- c(range(data[[1]], na.rm = TRUE), range(data[[2]], na.rm = TRUE)) - shapefile.def <- define_shapefiles(ddLimits) - shapefile.name <- shapefile.def$shapefile.name - shapefiles <- shapefile_list(shapefile.name) - - } else if(is.null(shapefiles)) { - stop("Cannot detect the required shapefiles automatically from projected limits coordinates. Change the limits to decimal degrees, provide data with decimal degree information or specify the shapefiles argument.") - } - - clipLimits <- - auto_limits(data = - expand.grid(data.frame( - lon = limits[1:2], - lat = limits[3:4]) - ), - lon = "lon", lat = "lat", - proj.in = - if(inherits(shapefiles$land, "SpatialPolygonsDataFrame")) { - raster::crs(shapefiles$land) - } else { - convert_crs(shapefiles$crs) - }, - proj.out = 4326, - verbose = verbose) - - } else { # Limits given as decimal degrees - - if(all(c("xmin", "ymin", "xmax", "ymax") %in% names(limits))) { # sf::st_bbox limits - limits <- unname(limits[c("xmin", "xmax", "ymin", "ymax")]) - } else { - limits <- c(limits[1:2], sort(limits[3:4])) # ggOceanMaps/raster/sp limits - } - - tmp <- dd_to_deg(limits[1:2]) - - if(tmp[1] > tmp[2]) { - lonDiff <- 360 - tmp[1] + tmp[2] - } else { - lonDiff <- tmp[2] - tmp[1] - } - - midLon <- tmp[1] + lonDiff/2 - midLon <- deg_to_dd(midLon) - - if(is.null(shapefiles)) { - clipLimits <- - auto_limits(data = expand.grid( - lon = sort(c(limits[1:2], midLon)), - lat = limits[3:4]), - lon = "lon", lat = "lat", - verbose = verbose) - } else { - clipLimits <- - auto_limits(data = expand.grid( - lon = sort(c(limits[1:2], midLon)), - lat = limits[3:4]), - lon = "lon", lat = "lat", - proj.out = - ifelse(grepl("SpatialPolygons", class(shapefiles$land)), - suppressWarnings(sp::proj4string(shapefiles$land)), - 
suppressWarnings(sp::proj4string(eval(parse(text = shapefiles$land)))) - ), - verbose = verbose) + } else { + if(inherits(shapefiles$land, c("sf", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + if(inherits(shapefiles$land, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + shapefiles$land <- sf::st_as_sf(shapefiles$land) } - + crs <- suppressWarnings(sf::st_crs(shapefiles$land)) + } else { + crs <- suppressWarnings(sf::st_crs(eval(parse(text = shapefiles$land)))) } } + } else if(case %in% c("limits_dec")) { + ### limits_dec #### + # Shapefile definitions if(is.null(shapefiles)) { - shapefile.def <- define_shapefiles(limits) + shapefile.def <- define_shapefiles(limits, force_dd = TRUE) + if(is.null(crs)) {crs <- sf::st_crs(shapefile.def$crs)} shapefile.name <- shapefile.def$shapefile.name + shapefiles <- shapefile_list(shapefile.name) + + } else { + if(inherits(shapefiles$land, c("sf", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + if(inherits(shapefiles$land, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + shapefiles$land <- sf::st_as_sf(shapefiles$land) + } + crs <- suppressWarnings(sf::st_crs(shapefiles$land)) + } else { + crs <- suppressWarnings(sf::st_crs(eval(parse(text = shapefiles$land)))) + } } - } - - # 3. data argument defines the limits and shapefiles if not specified. Also helps the limits argument to find shapefile if it was given as projected coordinates ### - - if(!is.null(data) & is.null(limits)) { + if(sf::st_is_longlat(crs) && sign(limits[1]) != sign(limits[2]) & !rotate) { + msg <- paste0("Detecting antimeridian crossing on decimal degree map. Plotting only + works with rotate = TRUE. Turning rotate on. 
Adjust limits if this + is not desired.") + + message(paste(strwrap(msg), collapse= "\n")) + rotate <- TRUE + } + + if(rotate) { + crs <- rotate_crs(crs, limits[1:2]) + } - if(inherits(data, c("sf", "sfc"))) { # sf data + clip_shape <- dd_clip_boundary(limits, crs) + + + # } else if(case %in% c("data_dec")) { ### data frames ### + # + # if(is.null(shapefiles)) { + # + # + # + # + # + # if(rotate){ + # tmp <- auto_limits(data, verbose = verbose) + # + # tmp2 <- guess_coordinate_columns(data) + # tmp$ddLimits <- stats::setNames( + # c(deg_to_dd(range(dd_to_deg(data[[tmp2[names(tmp2) == "lon"]]]))), + # range(data[[tmp2[names(tmp2) == "lat"]]]) + # ), + # c("xmin", "xmax", "ymin", "ymax") + # ) + # } else { + # tmp <- auto_limits(data, expand.factor = 1.1, verbose = verbose) + # } + # + # shapefile.def <- define_shapefiles(tmp$ddLimits, force_dd = TRUE) + # if(is.null(crs)) {crs <- sf::st_crs(shapefile.def$crs)} + # shapefile.name <- shapefile.def$shapefile.name + # shapefiles <- shapefile_list(shapefile.name) + # # clip_shape <- tmp$projBound + # # + # } else { + # if(inherits(shapefiles$land, c("sf", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + # if(inherits(shapefiles$land, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + # shapefiles$land <- sf::st_as_sf(shapefiles$land) + # } + # crs <- suppressWarnings(sf::st_crs(shapefiles$land)) + # } else { + # crs <- suppressWarnings(sf::st_crs(eval(parse(text = shapefiles$land)))) + # } + # + # tmp <- auto_limits(data, proj.out = crs, expand.factor = 1.1, verbose = verbose) + # clip_shape <- sf::st_transform(tmp$projBound, crs) + # } + # + # + # # if(rotate) { + # # limits <- tmp$ddLimits + # # } else { + # limits <- sf::st_bbox(sf::st_transform(tmp$projBound, crs = 4326))[c("xmin", "xmax", "ymin", "ymax")] + # + # if(rotate) { + # crs <- rotate_crs(crs, limits[1:2]) + # } + # + # clip_shape <- dd_clip_boundary(limits, crs) + # # } + # + # #limits <- tmp$ddLimits + # # limits <- 
sf::st_bbox(sf::st_transform(sf::st_as_sfc(sf::st_bbox(clip_shape)), crs = 4326))[c("xmin", "xmax", "ymin", "ymax")] + # + # if(sf::st_is_longlat(crs) && sign(limits[1]) != sign(limits[2]) && diff(limits[1:2]) < 180 & !rotate) { + # msg <- paste0("Detecting antimeridian crossing on decimal degree map. Plotting only + # works with rotate = TRUE. Turning rotate on. Adjust limits if this + # is not desired.") + # + # message(paste(strwrap(msg), collapse= "\n")) + # rotate <- TRUE + # } + # + # if(rotate) { + # crs <- rotate_crs(crs, limits[1:2]) + # clip_shape <- dd_clip_boundary(limits, crs, expand.factor = 1.1) + # } else { + # clip_shape <- dd_clip_boundary(limits, crs) + # } + # + } else if(case %in% c("data_sf", "data_sp", "data_dec")) { ### data #### + + if(case == "data_sp") data <- sf::st_as_sf(data) + + if(case == "data_dec") { + tmp <- guess_coordinate_columns(data) + data <- sf::st_as_sf(data, coords = tmp, crs = 4326) + } + + if(is.null(shapefiles)) { - if(sf::st_is_longlat(data)) { - limits <- sf::st_bbox(data) + if(rotate) { + tmp <- sf::st_coordinates(sf::st_transform(data, 4326)) + + limits <- stats::setNames( + c(deg_to_dd(range(dd_to_deg(tmp[,1]))), range(tmp[,2])), + c("xmin", "xmax", "ymin", "ymax")) + + } else if(!sf::st_is_longlat(data)) { + limits <- sf::st_bbox(sf::st_transform(data, 4326))[c("xmin", "xmax", "ymin", "ymax")] } else { - limits <- sf::st_bbox(sf::st_transform(data, 4326)) + limits <- sf::st_bbox(data)[c("xmin", "xmax", "ymin", "ymax")] } - limits <- unname(limits[c("xmin", "xmax", "ymin", "ymax")]) - decLimits <- is_decimal_limit(limits) - - tmp <- dd_to_deg(limits[1:2]) + shapefile.def <- define_shapefiles(limits, force_dd = TRUE) + if(is.null(crs)) crs <- sf::st_crs(shapefile.def$crs) + shapefile.name <- shapefile.def$shapefile.name + shapefiles <- shapefile_list(shapefile.name) - if(tmp[1] > tmp[2]) { - lonDiff <- 360 - tmp[1] + tmp[2] + } else { + if(inherits(shapefiles$land, c("sf", "SpatialPolygonsDataFrame", 
"SpatialPolygons"))) { + if(inherits(shapefiles$land, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + shapefiles$land <- sf::st_as_sf(shapefiles$land) + } + crs <- suppressWarnings(sf::st_crs(shapefiles$land)) } else { - lonDiff <- tmp[2] - tmp[1] + crs <- suppressWarnings(sf::st_crs(eval(parse(text = shapefiles$land)))) } - midLon <- tmp[1] + lonDiff/2 - midLon <- deg_to_dd(midLon) - + # clip_shape <- sf::st_as_sfc(sf::st_bbox(sf::st_transform(data, crs))) + limits <- sf::st_bbox(sf::st_transform(data, 4326))[c("xmin", "xmax", "ymin", "ymax")] + } + + if(rotate) { + crs <- rotate_crs(crs, limits[1:2]) + clip_shape <- dd_clip_boundary(limits, crs, expand.factor = 1.1) + } else { + if(sf::st_crs(data) == crs) { + clip_shape <- sf::st_as_sfc(sf::st_bbox(data)) + clip_shape <- sf::st_as_sfc(sf::st_bbox(sf::st_buffer(clip_shape, dist = 0.01*sqrt(sf::st_area(clip_shape))))) + } else { + clip_shape <- sf::st_as_sfc(sf::st_bbox(sf::st_transform(data, crs))) + clip_shape <- sf::st_as_sfc(sf::st_bbox(sf::st_buffer(clip_shape, dist = 0.01*sqrt(sf::st_area(clip_shape))))) + } + } + + } else if(case %in% c("limits_proj", "data_proj")) { ## Projected limits/data #### + + if(case %in% c("data_proj")) { ### Data frames in decimal degrees ### if(is.null(shapefiles)) { - clipLimits <- - auto_limits(data = expand.grid( - lon = sort(c(limits[1:2], midLon)), - lat = limits[3:4]), - lon = "lon", lat = "lat", - verbose = verbose) + stop("Cannot detect the required shapefiles automatically from projected coordinates. 
Change the limits to decimal degrees or specify the shapefiles argument.") + } + if(inherits(shapefiles$land, c("sf", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + if(inherits(shapefiles$land, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + shapefiles$land <- sf::st_as_sf(shapefiles$land) + } + crs <- suppressWarnings(sf::st_crs(shapefiles$land)) } else { - clipLimits <- - auto_limits(data = expand.grid( - lon = sort(c(limits[1:2], midLon)), - lat = limits[3:4]), - lon = "lon", lat = "lat", - proj.out = - ifelse(grepl("SpatialPolygons", class(shapefiles$land)), - suppressWarnings(sp::proj4string(shapefiles$land)), - suppressWarnings(sp::proj4string(eval(parse(text = shapefiles$land)))) - ), - verbose = verbose) + crs <- suppressWarnings(sf::st_crs(eval(parse(text = shapefiles$land)))) } - } else { # data.frame etc. + limits <- auto_limits(data, proj.in = crs, proj.out = 4326, verbose = verbose)$ddLimits + names(limits) <- c("xmin", "xmax", "ymin", "ymax") + } + + + if(is.null(shapefiles) & !is.null(data)) { # Define shapefiles with help of data + tmp <- data[unname(guess_coordinate_columns(data))] + ddLimits <- c(range(tmp[[1]], na.rm = TRUE), range(tmp[[2]], na.rm = TRUE)) - if(rotate) { - - clipLimits <- auto_limits(data, verbose = verbose) - limits <- clipLimits$ddLimits - decLimits <- is_decimal_limit(limits) - - tmp <- dd_to_deg(limits[1:2]) - - if(tmp[1] > tmp[2]) { - lonDiff <- 360 - tmp[1] + tmp[2] - } else { - lonDiff <- tmp[2] - tmp[1] + shapefile.def <- define_shapefiles(ddLimits, force_dd = TRUE) + if(is.null(crs)) {crs <- sf::st_crs(shapefile.def$crs)} + shapefile.name <- shapefile.def$shapefile.name + shapefiles <- shapefile_list(shapefile.name) + + } else if(!is.null(shapefiles)) { + + if(inherits(shapefiles$land, c("sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + if(inherits(shapefiles$land, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + shapefiles$land <- sf::st_as_sf(shapefiles$land) } - - midLon <- tmp[1] 
+ lonDiff/2 - midLon <- deg_to_dd(midLon) - + crs <- suppressWarnings(sf::st_crs(shapefiles$land)) } else { - - if(!is.null(shapefiles)) { - clipLimits <- - auto_limits(data, verbose = verbose, - proj.out = raster::crs(shapefiles$land), - expand.factor = expand.factor - ) - } else { - clipLimits <- auto_limits(data, expand.factor = expand.factor, verbose = verbose) - } - - limits <- clipLimits$ddLimits - decLimits <- is_decimal_limit(limits) - + crs <- suppressWarnings(sf::st_crs(eval(parse(text = shapefiles$land)))) } + + } else { + stop("Cannot detect the required shapefiles automatically from projected limits coordinates. Change the limits to decimal degrees, provide data with decimal degree information or specify the shapefiles argument.") } - if(is.null(shapefiles)) { - if(!exists("shapefile.name")) { - if(!decLimits) stop("Cannot detect the required shapefiles automatically from projected coordinates. Change the limits to decimal degrees or specify the shapefiles argument.") - - shapefile.def <- define_shapefiles(limits) - shapefile.name <- shapefile.def$shapefile.name - } + clipLimits <- + auto_limits(data = + expand.grid(data.frame( + lon = limits[1:2], + lat = limits[3:4]) + ), + lon = "lon", lat = "lat", + proj.in = crs, + proj.out = 4326, + verbose = verbose) + + limits <- clipLimits$ddLimits + + if(rotate) { + crs <- rotate_crs(crs, limits[1:2]) + clip_shape <- dd_clip_boundary(limits, crs, expand.factor = 1.1) + } else { + clip_shape <- clipLimits$projBound } + } else { + stop("Unrecognized case") } - # 4. Define the shapefiles to be used in the map #### + # 4. 
Define shapefiles #### if(!shapefilesDefined) { if(exists("shapefile.name") & is.null(shapefiles)) shapefiles <- shapefile_list(shapefile.name) - shapefiles$land <- eval(parse(text = shapefiles$land)) - - if(glaciers) { - shapefiles$glacier <- eval(parse(text = shapefiles$glacier)) - } else { - shapefiles$glacier <- NULL - } - - if(bathymetry) { - shapefiles$bathy <- eval(parse(text = shapefiles$bathy)) - } else { + if(!glaciers) shapefiles$glacier <- NULL + if(!bathymetry) { shapefiles$bathy <- NULL + } else { + if(bathy.type == "raster_user" && is.na(shapefiles$bathy[bathy.type])) { + msg <- "Define the path to the user-defined bathymetry raster using options(ggOceanMaps.userpath = 'PATH_TO_THE_FILE')" + stop(paste(strwrap(msg), collapse = "\n")) + } + + shapefiles$bathy <- shapefiles$bathy[bathy.type] } - } else { - - if(!glaciers) shapefiles$glacier <- NULL - if(!bathymetry) shapefiles$bathy <- NULL + shapefiles <- load_map_data(shapefiles, downsample = downsample) } - ## Define the CRS for the underlying data ### + # 5. Return #### + + list(shapefiles = shapefiles, limits = limits, polarMap = polarMap, clip_limits = clip_shape, crs = crs, rotate = rotate, case = case) + +} + + + + + +##################### # +## Crop shapefiles #### + +basemap_data_crop <- function(x, bathymetry = FALSE, glaciers = FALSE, crs = NULL) { + + # 1. Clip shapefiles #### + + if(x$polarMap) { + landBoundary <- clip_shapefile( + sf::st_transform(x$shapefiles$land, crs = x$crs), + limits = x$clip_limits, + return.boundary = TRUE + ) + } else { + if(!is.null(crs)) { # this hack is required for custom crs. 
Couldn't come up with a better solution + landBoundary <- clip_shapefile( + x$shapefiles$land, + limits = smoothr::densify(x$clip_limits), + return.boundary = TRUE + ) + + landBoundary <- lapply(landBoundary, function(k) { + sf::st_transform(k, crs) + }) + + } else { + landBoundary <- clip_shapefile( + sf::st_transform(x$shapefiles$land, crs = x$crs), + limits = sf::st_transform(x$clip_limits, crs = x$crs), + return.boundary = TRUE + ) + } + } - LandCRS <- sf::st_crs(shapefiles$land) # old: raster::crs(shapefiles$land) + x$shapefiles$land <- landBoundary$shapefile + x$clip_limits <- landBoundary$boundary - # 5. Crop and rotate shapefiles if needed ### + # if(x$rotate) { + # landBoundary <- clip_shapefile( + # sf::st_transform(x$shapefiles$land, crs = x$crs), + # limits = sf::st_transform(x$clip_limits, crs = x$crs), + # return.boundary = TRUE + # ) + # + # x$shapefiles$land <- landBoundary$shapefile + # x$clip_limits <- landBoundary$boundary + # + # } else { + # + # landBoundary <- clip_shapefile( + # x$shapefiles$land, + # limits = x$clip_limits, + # return.boundary = TRUE + # ) + # + # x$shapefiles$land <- landBoundary$shapefile + # + # } + # + if(glaciers) { + if(!is.null(crs)) { # this hack is required for custom crs. Couldn't come up with a better solution + x$shapefiles$glacier <- + sf::st_transform( + clip_shapefile( + x$shapefiles$glacier, + limits = sf::st_transform(x$clip_limits, crs = 4326) + ), + crs = crs + ) + } else { + x$shapefiles$glacier <- clip_shapefile( + sf::st_transform(x$shapefiles$glacier, crs = x$crs), + limits = x$clip_limits + ) + } + } - if(is.null(limits)) { # For cases when limits and data are not defined - - if(rotate) message("Rotating maps defined using only the shapefiles argument have not been implemented. 
Changed rotate to FALSE") - - # Define map limits + if(bathymetry) { - tmp <- sapply(shapefiles[c("land", "glacier", "bathy")], function(k) { - if(class(k) %in% c("SpatialPolygonsDataFrame", "SpatialPolygons")) { - length(k) > 0 - } else { - FALSE + if(inherits(x$shapefiles$bathy, "bathyRaster")) { + # raster bathymetries + newgrid <- stars::st_as_stars(sf::st_bbox(x$clip_limits)) + x$shapefiles$bathy$raster <- stars::st_warp(x$shapefiles$bathy$raster, newgrid) + + if(x$polarMap) { + x$shapefiles$bathy$raster <- x$shapefiles$bathy$raster[x$clip_limits] } - }) - - if(all(!tmp)) stop("No shapefile layers to plot.") - - - tmp <- lapply(shapefiles[names(tmp[tmp])], function(k) { - # out <- raster::extent(k) - matrix(sf::st_bbox(k), nrow = 1) - }) - - tmp <- as.data.frame(do.call(rbind, tmp)) - map.limits <- c(min(tmp[1]), max(tmp[3]), min(tmp[2]), max(tmp[4])) - - clipLimits <- - auto_limits(data = expand.grid( - lon = c(tmp[[1]], tmp[[3]]), - lat = c(tmp[[2]], tmp[[4]])), - lon = "lon", - lat = "lat", - proj.in = LandCRS, - proj.out = 4326 + + if(inherits(x$shapefiles$bathy$raster[[1]], "factor")) { + x$shapefiles$bathy$raster <- droplevels(x$shapefiles$bathy$raster) + } + } else if(inherits(x$shapefiles$bathy, c("stars", "stars_proxy"))) { + + x$shapefiles$bathy <- raster_bathymetry( + x$shapefiles$bathy, + depths = NULL, + boundary = sf::st_transform(smoothr::densify(x$clip_limits, n = 10), crs = 4326), + verbose = FALSE ) - - } else if(polarMap) { # Polar maps - - if(rotate) message("Rotating polar maps has not been implemented. 
Changed rotate to FALSE") - - # Clip the shapefiles - - landBoundary <- clip_shapefile(shapefiles$land, - limits = limits, - proj.limits = convert_crs(4326), - return.boundary = TRUE) - - shapefiles$land <- landBoundary$shapefile - if(glaciers) shapefiles$glacier <- clip_shapefile(shapefiles$glacier, limits = limits, proj.limits = ifelse(decLimits, convert_crs(4326), LandCRS)) - if(bathymetry) shapefiles$bathy <- clip_shapefile(shapefiles$bathy, limits = limits, proj.limits = ifelse(decLimits, convert_crs(4326), LandCRS)) - - # Define map limits - - map.limits <- raster::extent(landBoundary$boundary)[1:4] - - } else { # Square maps - - if(rotate) { - if(!shapefiles$name %in% c("ArcticStereographic", "AntarcticStereographic")) { - message("The map rotation currently works only for stereographic maps. Changed rotate to FALSE") - rotate <- FALSE + + newgrid <- stars::st_as_stars(sf::st_bbox(x$clip_limits)) + x$shapefiles$bathy$raster <- stars::st_warp(x$shapefiles$bathy$raster, newgrid) + + if(x$polarMap) { + x$shapefiles$bathy$raster <- x$shapefiles$bathy$raster[x$clip_limits] } - } - - ## Rotate the shapefiles - - if(rotate) { - LandCRS <- sf::st_crs(gsub("lon_0=0", paste0("lon_0=", midLon), LandCRS$proj4string)) - clipLimits <- auto_limits(data = - expand.grid( - lon = sort(c(limits[1:2], midLon)), - lat = limits[3:4]), - lon = "lon", lat = "lat", - proj.out = LandCRS) - - # (These conversions will disappear once the package is entirely sf based) - shapefiles$land <- - sf::as_Spatial( - sf::st_transform(sf::st_as_sf(shapefiles$land), LandCRS) - ) - if(glaciers) shapefiles$glacier <- sf::as_Spatial(sf::st_transform(sf::st_as_sf(shapefiles$glacier), LandCRS)) - if(bathymetry) shapefiles$bathy <- sf::as_Spatial(sf::st_transform(sf::st_as_sf(shapefiles$bathy), LandCRS)) + } else { + # vector bathymetries + x$shapefiles$bathy <- clip_shapefile( + sf::st_transform(x$shapefiles$bathy, crs = x$crs), + limits = x$clip_limits + ) + x$shapefiles$bathy$depth <- 
droplevels(x$shapefiles$bathy$depth) } - - # Clip the shapefiles - - shapefiles$land <- clip_shapefile(shapefiles$land, - limits = clipLimits$projBound, - proj.limits = clipLimits$proj.out - ) - - if(glaciers) shapefiles$glacier <- clip_shapefile(shapefiles$glacier, limits = clipLimits$projBound, proj.limits = clipLimits$proj.out) - if(bathymetry) { - shapefiles$bathy <- clip_shapefile(shapefiles$bathy, limits = clipLimits$projBound, proj.limits = clipLimits$proj.out) - shapefiles$bathy@data <- droplevels(shapefiles$bathy@data) - } - - # Define map limits - - map.limits <- clipLimits$projLimits - } - # 6. Define the grid lines #### + if(!x$polarMap) { + x$limits <- + sf::st_bbox(sf::st_transform(sf::st_as_sf(x$clip_limits), 4326))[c("xmin", "xmax", "ymin", "ymax")] + } + + map_limits <- sf::st_bbox(x$clip_limits)[c("xmin", "xmax", "ymin", "ymax")] + + # Return #### + + list(shapefiles = x$shapefiles, polarMap = x$polarMap, decLimits = x$limits, limit_shape = x$clip_limits, map_limits = map_limits, crs = x$crs) + +} + +####################### # +## Define grid lines #### + +basemap_define_grid_lines <- function(x, lon.interval = NULL, lat.interval = NULL) { ## A quick fix. 
Improve later - if(exists("clipLimits")) { - if(abs(clipLimits$ddLimits[4]) != 90) { - tmp <- suppressWarnings(sp::spTransform(clipLimits$projBound, sp::CRS(convert_crs(4326)))@bbox) - clipLimits$ddLimits <- unname(c(sort(tmp[1,]), sort(tmp[2,]))) - } - } + # if(!is.null(x$clipLimits)) { + # if(abs(x$clipLimits$ddLimits[4]) != 90) { + # tmp <- sf::st_bbox(sf::st_transform(x$clipLimits$projBound, 4326)) + # x$clipLimits$ddLimits <- tmp[c("xmin", "xmax", "ymin", "ymax")] + # } + # } ## Define intervals if not specified if(is.null(lat.interval)) { - if(polarMap) { - latDist <- 90 - abs(limits) + if(x$polarMap) { + latDist <- 90 - abs(x$decLimits) } else { - latDist <- abs(diff(round(clipLimits$ddLimits)[3:4])) + limits <- sf::st_bbox(sf::st_transform(x$limit_shape, 4326))[c("xmin", "xmax", "ymin", "ymax")] + + latDist <- abs(diff(round(limits)[3:4])) } - lat.interval <- ifelse(latDist >= 30, 10, ifelse(latDist >= 15, 5, ifelse(latDist >= 10, 4, ifelse(latDist >= 6, 3, ifelse(latDist > 4, 2, 1))))) + lat.interval <- + ifelse(latDist >= 30, 10, + ifelse(latDist >= 15, 5, + ifelse(latDist >= 10, 4, + ifelse(latDist >= 6, 3, + ifelse(latDist > 4, 2, 1) + )))) } if(is.null(lon.interval)) { - if(polarMap) { + if(x$polarMap) { lon.interval <- 45 } else { - if(diff(clipLimits$ddLimits[1:2]) == 360) { + limits <- sf::st_bbox(sf::st_transform(x$limit_shape, 4326))[c("xmin", "xmax", "ymin", "ymax")] + + if(diff(limits[1:2]) == 360) { lonDist <- 360 } else { - tmp <- dd_to_deg(round(clipLimits$ddLimits)[1:2]) + tmp <- dd_to_deg(round(x$decLimits)[1:2]) if(tmp[1] > tmp[2]) { lonDist <- 360 - tmp[1] + tmp[2] @@ -524,29 +782,64 @@ basemap_data <- function(limits = NULL, data = NULL, shapefiles = NULL, bathymet } } - lon.interval <- ifelse(lonDist > 180, 45, ifelse(lonDist > 90, 30, ifelse(lonDist >= 40, 10, ifelse(lonDist > 10, 5, ifelse(lonDist > 4, 2, 1))))) + lon.interval <- + ifelse(lonDist > 180, 45, + ifelse(lonDist > 90, 30, + ifelse(lonDist >= 40, 10, + ifelse(lonDist > 10, 
5, + ifelse(lonDist > 4, 2, 1) + )))) } } ## Define the grid lines based on intervals - if(polarMap) { + if(x$polarMap) { - poleLat <- ifelse(limits > 0, 90, -90) + poleLat <- ifelse(x$decLimits > 0, 90, -90) - LonGridLines <- data.frame(lon = rep(seq(-135, 180, lon.interval), each = 2), lat = rep(c(poleLat, limits), 360/lon.interval)) - LatLimitLine <- data.frame(lon = seq(-180, 180, 1), lat = limits) + LonGridLines <- data.frame( + id = rep(1:(360/lon.interval), each = 2), + lon = rep(seq(-135, 180, lon.interval), each = 2), + lat = rep(c(poleLat, x$decLimits), 360/lon.interval)) - LatGridLines <- sign(limits) * seq(from = round(abs(limits)) + lat.interval, to = abs(poleLat) - lat.interval, by = lat.interval) - LatGridLines <- LatGridLines[LatGridLines != limits] - LatGridLines <- data.frame(lon = rep(seq(-180, 180, 1), length(LatGridLines)), lat = rep(LatGridLines, each = nrow(LatLimitLine))) + LonGridLines <- + sf::st_sfc(sf::st_multilinestring( + x = lapply(unique(LonGridLines$id), function(i) { + sf::st_linestring(as.matrix(LonGridLines[LonGridLines$id == i, 2:3])) + }) + ), crs = 4326) + + LatLimitLine <- data.frame(lon = seq(-180, 180, 1), lat = x$decLimits) + + LatGridLines <- + sign(x$decLimits) * seq(from = round(abs(x$decLimits)) + lat.interval, + to = abs(poleLat) - lat.interval, by = lat.interval) + LatGridLines <- LatGridLines[LatGridLines != x$decLimits] + LatGridLines <- + data.frame(lon = rep(seq(-180, 180, 1), length(LatGridLines)), + lat = rep(LatGridLines, each = nrow(LatLimitLine))) + + LatGridLines <- sf::st_sfc(sf::st_multilinestring( + lapply(unique(LatGridLines$lat), function(k) { + sf::st_linestring(as.matrix(LatGridLines[LatGridLines$lat == k,])) + }) + ), crs = 4326) + + LatLimitLine <- + sf::st_sfc( + sf::st_linestring( + as.matrix(LatLimitLine) + ), crs = 4326) mapGrid <- list(lon.grid.lines = LonGridLines, lat.grid.lines = LatGridLines, lat.limit.line = LatLimitLine) } else { - minLat <- min(clipLimits$ddLimits[3:4]) - maxLat <- 
max(clipLimits$ddLimits[3:4]) + limits <- sf::st_bbox(sf::st_transform(x$limit_shape, 4326))[c("xmin", "xmax", "ymin", "ymax")] + + minLat <- min(limits[3:4]) + maxLat <- max(limits[3:4]) minLat <- ifelse(minLat < 0, -90, round_any(minLat, 10, floor)) maxLat <- ifelse(maxLat > 0, 90, round_any(maxLat, 10, ceiling)) @@ -558,9 +851,8 @@ basemap_data <- function(limits = NULL, data = NULL, shapefiles = NULL, bathymet # Return #### - out <- list(shapefiles = shapefiles, map.limits = map.limits, polar.map = polarMap, map.grid = mapGrid, proj = LandCRS) - - class(out) <- "basemapData" + list(shapefiles = x$shapefiles, polarMap = x$polarMap, decLimits = x$decLimits, + limit_shape = x$limit_shape, map_limits = x$map_limits, + crs = x$crs, mapGrid = mapGrid) - out } diff --git a/R/clip_shapefile.R b/R/clip_shapefile.R index 198814e..7f2095a 100644 --- a/R/clip_shapefile.R +++ b/R/clip_shapefile.R @@ -1,182 +1,140 @@ -#' @title Clip a shapefile (SpatialPolygon) using a bounding area -#' @description Clips an area from a larger shape file (\link[sp]{SpatialPolygons}). -#' @param x Original shape file to be clipped as a an \link[sp]{sp} or \link[sf]{sf} polygons object. Required. Must contain \code{\link[sp:is.projected]{CRS}} information. -#' @param limits The constraining area used to clip \code{x}. Required. Either a numeric vector of length 4 or a \link[sp]{SpatialPolygons} object. The first element of the numeric vector defines the minimum longitude, second element the maximum longitude, third element the minimum latitude and fourth element the maximum latitude of the bounding box. The \link[sp]{SpatialPolygons} object must contain \code{\link[sp:is.projected]{CRS}} information. See details. -#' @param proj.limits The \code{\link[sp:is.projected]{CRS}} projection attributes for \code{limits} as character string (will be passed to \code{\link[sp]{CRS}}). Use the PROJ6 format. Defaults to decimal degrees (see Usage). 
-#' @param simplify Should the \code{x} geometry be simplified before clipping? Useful to make the function faster for large shape files. Uses \code{rgeos::gSimplify} function. -#' @param tol Numerical tolerance value to be used for simplification. See \code{?rgeos::gSimplify}. -#' @param return.boundary logical. If \code{TRUE} returns the clip boundary together with the shapefile. -#' @param sf.mode Logical indicating whether the function should be executed using the sf package. Developmental switch during the transfer from sp to sf. -#' @details The function uses the \code{rgeos::gIntersection} function to clip smaller \link[sp]{SpatialPolygons} from larger ones. The clip area is constrained by either a numeric vector or \link[sp]{SpatialPolygons} object in the \code{limits} argument. One of these arguments must be given. Defining \code{limits} by a \link[sp]{SpatialPolygons} object gives greater freedom for the clip area as the area does not have to be rectangular. -#' @return Clipped \code{\link[sp]{SpatialPolygons}} object. If \code{return.boundary = TRUE}, a list containing the shapefile together with the clip boundary. +#' @title Clip a shapefile using a bounding area +#' @description Clips an area from a larger shapefile provided in sf or sp formats. +#' @param x Original shapefile to be clipped as an \link[sf]{sf} or \code{sp} polygons object. Required. Must contain \code{\link[sf:st_crs]{CRS}} information. +#' @param limits The constraining area used to clip \code{x}. Required. Either a numeric vector of length 4 or a spatial object from the sf or sp packages. The first element of the numeric vector defines the minimum longitude, second element the maximum longitude, third element the minimum latitude and fourth element the maximum latitude of the bounding box. If a spatial object, it must contain \code{\link[sf:st_crs]{CRS}} information. See details. +#' @param proj.limits The \code{\link[sf:st_crs]{CRS}} projection attributes for \code{limits}. 
Any format accepted by \code{\link[sf]{st_crs}} will work, but integer EPSG codes are the easiest. Defaults to decimal degrees. +#' @param simplify Should the \code{x} geometry be simplified before clipping? Useful to make the function faster for large shape files. Uses the \code{\link[sf]{st_simplify}} function. +#' @param tol Numerical tolerance value to be used for simplification. See \code{?sf::st_simplify}. +#' @param return.boundary Logical. If \code{TRUE} returns the clip boundary together with the shapefile. +#' @param extra.validate Logical indicating whether \code{x} should be run through extra validation. Slows down the function but is necessary in some cases involving the anti-meridian. +#' @details The function uses the \code{\link[sf]{st_intersection}} function to clip smaller polygons from larger ones. The clip area is constrained by either a numeric vector or a spatial object in the \code{limits} argument. Defining \code{limits} by a \code{\link[sf:st_sf]{sf}} object gives greater freedom for the clip area as the area does not have to be rectangular. +#' @return Clipped spatial object. If \code{return.boundary = TRUE}, a list containing the shapefile together with the clip boundary. 
#' @keywords internal #' @family create shapefiles -#' @import sp -#' @importFrom rgeos gIntersection gIntersects gSimplify #' @importFrom methods slot slot<- #' @importFrom grDevices chull -#' @author Mikko Vihtakari with a solution from \href{https://stackoverflow.com/questions/15881455/how-to-clip-worldmap-with-polygon-in-r}{Simon O'Hanlon, Roger Bivand/SO community} +#' @author Mikko Vihtakari #' @export # Test parameters -# x = shapefiles$land; proj.limits = convert_crs(4326); simplify = FALSE; tol = 60; return.boundary = FALSE -clip_shapefile <- function(x, limits, proj.limits = convert_crs(4326), simplify = FALSE, tol = 60, return.boundary = FALSE, sf.mode = FALSE) { +# proj.limits = 4326; simplify = FALSE; tol = 60; return.boundary = FALSE; extra.validate = FALSE +clip_shapefile <- function(x, limits, proj.limits = 4326, simplify = FALSE, tol = 60, return.boundary = FALSE, extra.validate = FALSE) { ## Checks - if("sf" %in% class(x) & !sf.mode) { - x <- sf::as_Spatial(x) + if(!inherits(x, c("sf"))) { + x <- sf::st_as_sf(x) } ## Projection - if(sf.mode) { - x_proj <- sf::st_crs(x) - } else { - x_proj <- suppressWarnings(sp::proj4string(x)) - } + x_proj <- sf::st_crs(x) if(is.na(x_proj)) stop("crs for x is missing. 
Define the projection attributes and try again.") ## Clip boundary - if(sf.mode) { - - ## sf mode ## - - if(inherits(limits, "sf")) { - proj.limits <- sf::st_crs(limits) - clip_boundary <- limits - } else { - if(!is.numeric(limits)) stop("limits have to be numeric, SpatialPolygonsDataFrame or SpatialPolygons object") - if(length(limits) == 1) { - - stop("Limits with length of 1 has not been implemented properly yet") - bd <- data.frame(lon = seq(-180, 180, by = 0.5), lat = limits) - bd <- transform_coord(bd, proj.out = x_proj) - ch <- grDevices::chull(bd$lat, bd$lon) - coords <- as.matrix(bd[c(ch, ch[1]), 1:2]) - clip_boundary <- sf::st_sfc(sf::st_polygon(list(coords)), crs = x_proj) - - } else if(length(limits) != 4) { - stop("the length of limits vector has to be 4. See limits argument") - } else { - - coords <- as.matrix(data.frame( - lon = c(limits[[1]], limits[[2]], limits[[2]], limits[[1]], limits[[1]]), - lat = c(limits[[3]], limits[[3]], limits[[4]], limits[[4]], limits[[3]]))) - clip_boundary <- sf::st_sfc(sf::st_polygon(list(coords)), crs = sf::st_crs(proj.limits)) - # clip_boundary <- sf::st_segmentize(clip_boundary, units::set_units(0.1, degree)) - } + if(inherits(limits, c("sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons"))) { + if(inherits(limits, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + limits <- sf::st_as_sf(limits) } - ## sf mode end ## + proj.limits <- sf::st_crs(limits) + clip_boundary <- limits } else { - ## sp mode ## + if(!is.numeric(limits)) stop("limits have to be numeric, sf, SpatialPolygonsDataFrame or SpatialPolygons object") - if(inherits(limits, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { - proj.limits <- suppressWarnings(sp::proj4string(limits)) - clip_boundary <- limits + if(length(limits) == 1) { + + # stop("Limits with length of 1 has not been implemented properly yet") + bd <- data.frame(lon = seq(-180, 180, by = 0.5), lat = limits) + bd <- transform_coord(bd, proj.out = x_proj) + ch <- 
grDevices::chull(bd$lat, bd$lon) + coords <- as.matrix(bd[c(ch, ch[1]), 1:2]) + clip_boundary <- sf::st_sfc(sf::st_polygon(list(coords)), crs = x_proj) + + } else if(length(limits) != 4) { + stop("the length of limits vector has to be 4. See limits argument") } else { - if(!is.numeric(limits)) stop("limits have to be numeric, SpatialPolygonsDataFrame or SpatialPolygons object") - if(length(limits) == 1) { - bd <- data.frame(lon = seq(-180, 180, by = 0.5), lat = limits) - bd <- transform_coord(bd, proj.out = x_proj) - ch <- grDevices::chull(bd[[2]], bd[[1]]) - coords <- as.matrix(bd[c(ch, ch[1]), 1:2]) - clip_boundary <- sp::SpatialPolygons(list(sp::Polygons(list(sp::Polygon(coords)), ID = 1))) - - if(!rgeos::gIsValid(clip_boundary)) stop("Invalid geomethry due to clip_shapefile. Add the buffering script.") - - suppressWarnings(raster::crs(clip_boundary) <- x_proj) - - } else if(length(limits) != 4) { - stop("the length of limits vector has to be 4. See limits argument") - } else { - clip_boundary <- sp::Polygon(matrix(c(limits[1], limits[3], limits[1], limits[4], limits[2], limits[4], limits[2], limits[3], limits[1], limits[3]), ncol = 2, byrow = TRUE)) - clip_boundary <- sp::SpatialPolygons(list(sp::Polygons(list(clip_boundary), ID = "clip_boundary")), proj4string = suppressWarnings(sp::CRS(proj.limits))) - } + + coords <- as.matrix(data.frame( + lon = c(limits[[1]], limits[[2]], limits[[2]], limits[[1]], limits[[1]]), + lat = c(limits[[3]], limits[[3]], limits[[4]], limits[[4]], limits[[3]]))) + clip_boundary <- sf::st_sfc(sf::st_polygon(list(coords)), crs = sf::st_crs(proj.limits)) + # clip_boundary <- sf::st_segmentize(clip_boundary, units::set_units(0.1, degree)) } - - ## sp mode end ## - } ### Validate the clip boundary - if(sf.mode) { - if(!all(sf::st_is_valid(clip_boundary))) { - clip_boundary <- sf::st_make_valid(clip_boundary) + if(!all(sf::st_is_valid(clip_boundary))) { + clip_boundary <- sf::st_make_valid(clip_boundary) + } + + ### Validate x + + 
if(extra.validate) { + if(!all(sf::st_is_valid(x))) { + x <- sf::st_make_valid(x) } } + ## Densify (i.e. add vertices to) the clip boundary + + # clip_boundary <- smoothr::densify(clip_boundary) + ## Check that the projections match - if(sf.mode) { - if(sf::st_crs(proj.limits) != sf::st_crs(x_proj)) { - clip_boundary <- sf::st_transform(clip_boundary, x_proj) - } - } else { - if(proj.limits != x_proj) { - clip_boundary <- suppressWarnings(sp::spTransform(clip_boundary, suppressWarnings(sp::CRS(x_proj)))) - } + if(sf::st_crs(clip_boundary) != sf::st_crs(x_proj)) { + clip_boundary <- sf::st_transform(clip_boundary, x_proj) + } + + ## Temporarily turn off s2 while handling decimal degree shapes (needed for anti-meridian crossings and rotated maps; this is somewhat dodgy) + + if(suppressWarnings(sf::st_is_longlat(x))) { + + tmp <- sf::st_bbox(clip_boundary)[c("xmin", "xmax")] + + if((#sign(tmp[1]) != sign(tmp[2]) && + sf::st_crs(x)$proj4string == sf::st_crs(4326)$proj4string)) { + s2_mode <- sf::sf_use_s2() + suppressMessages(sf::sf_use_s2(FALSE)) + on.exit({suppressMessages(sf::sf_use_s2(s2_mode))}) + } + #else { + # if(!all(sf::st_is_valid(x))) { + # x <- sf::st_make_valid(x) + # } + # } } ## Simplify bathymetry if(simplify) {
- x <- sf::st_make_valid(x, oriented = TRUE) - # x <- sf::st_buffer(x, dist = 1) - x <- sf::st_simplify(x, dTolerance = tol) - } else { - x <- rgeos::gSimplify(x, tol = tol) - } + x <- sf::st_make_valid(x, oriented = TRUE) + x <- sf::st_simplify(x, dTolerance = tol) } - ## Cropping + ## Cropping (st_intersection for round shapes) - if(sf.mode) { - ## May need to turn off s2 here - shapefile <- suppressWarnings(sf::st_intersection(x, clip_boundary)) - + if(nrow(sf::st_coordinates(clip_boundary)) > 100) { + shapefile <- suppressMessages(suppressWarnings(sf::st_intersection(x, clip_boundary))) } else { - ## Clipping the bathymetry (using a bypass scavenged from here: https://stackoverflow.com/questions/15881455/how-to-clip-worldmap-with-polygon-in-r) - ## Sometimes rgeos::gIntersection gets confused when the resulting clipped SpatialPolygon contains other shapes than polygons. The bypass fixes this problem, but takes a longer time to complete than the regular method. Therefore two methods - ## Alternatives here are: - ## system.time(sf::as_Spatial(st_crop(sf::st_as_sf(x), sf::st_as_sf(clip_boundary)))) # Requires tuning to make circular crop work - ## system.time(raster::crop(x, clip_boundary)) - ## Both seem slower than the chosen method - - error_test <- suppressWarnings(quiet(try(rgeos::gIntersection(x, clip_boundary, byid = TRUE), silent = TRUE))) - - if(inherits(error_test, "try-error")) { - shapefile <- suppressWarnings(rgeos::gIntersection(x, clip_boundary, byid = TRUE, drop_lower_td = TRUE, checkValidity = 0L)) - } else if(is.null(error_test)) { - shapefile <- x[-1:-length(x),] - } else { - shapefile <- error_test - } - - if(inherits(x, "SpatialPolygonsDataFrame") & length(shapefile) > 0) { - ids <- sapply(slot(shapefile, "polygons"), function(x) slot(x, "ID")) - ids <- select_element(strsplit(ids, " "), 1) - - if(ncol(x@data) == 1) { - tmp.df <- data.frame(x@data[ids,]) - names(tmp.df) <- names(x@data) - } else { - tmp.df <- x@data[ids,] - } - - shapefile <- 
sp::SpatialPolygonsDataFrame(shapefile, tmp.df, match.ID = FALSE) - + shapefile <- suppressWarnings(suppressMessages(sf::st_crop(x, clip_boundary))) + } + + ### Validate shapefile + + if(extra.validate) { + if(!all(sf::st_is_valid(shapefile))) { + shapefile <- sf::st_make_valid(shapefile) } } + ## Return + if(return.boundary) { list(shapefile = shapefile, boundary = clip_boundary) } else { diff --git a/R/convert_crs.R b/R/convert_crs.R deleted file mode 100644 index 4589ce7..0000000 --- a/R/convert_crs.R +++ /dev/null @@ -1,36 +0,0 @@ -##' @description Temporary workaround for the specification of projargs in CRS(), required by migration to -##' GDAL >= 3 and PROJ >= 6. -##' @title convert_crs, system specific epsg code string formating for CRS. -##' @keywords internal -##' @param epsg EPSG code of the projection (coercible to a character string). -##' @return A character string that can be used as projargs in the CRS (sp) function: either a SRS_string -##' (e.g. \code{"EPSG:4326"}) or a "+init" PROJ string (e.g. \code{"+init=epsg:4326"}), depending on the libraries and -##' packages versions used. -##' @author Yves Reecht -##' @details Test the version of GDAL, PROJ and sp, and return the appropriate format. 
-##' @examples -##' convert_crs() -##' sp::CRS(projargs = convert_crs()) -##' sp::CRS(projargs = convert_crs(epsg = 27700)) -##' @export -convert_crs <- function(epsg = 4326) -{ - ## Minimal versions for CRS() to support SRS_string: - minVersions <- c("GDAL" = "3.0", "PROJ" = "6.0", "sp" = "1.4-4") - - installed <- c(sf::sf_extSoftVersion(), - sp = as.character(utils::packageVersion("sp")))[names(minVersions)] - - res <- sapply(names(minVersions), - function(i) - { - utils::compareVersion(installed[i], - minVersions[i]) - }) - - return(paste0(ifelse(any(res < 0), "+init=epsg:", "EPSG:"), - epsg)) -} - - - diff --git a/R/datasets-docs.R b/R/datasets-docs.R index 0fe3c03..70b1e53 100644 --- a/R/datasets-docs.R +++ b/R/datasets-docs.R @@ -3,14 +3,14 @@ #' @keywords datasets shapefiles fishery #' @family datasets #' @name fdir_main_areas -#' @format \code{\link[sf:st_sf]{sf object}} in decimal degrees (EPSG:4326) containing major fishing zones defined by the Norwegian Directorate of Fisheries. Contains also Northwest Atlantic Fisheries Organization's divisions where Norwegian vessels tend to fish. +#' @format \code{\link[sf:st_sf]{sf object}} containing major fishing zones defined by the Norwegian Directorate of Fisheries. Contains also Northwest Atlantic Fisheries Organization's divisions where Norwegian vessels tend to fish. 
#' @source \href{https://open-data-fiskeridirektoratet-fiskeridir.hub.arcgis.com/}{Norwegian Directorate of Fisheries} and \href{https://www.nafo.int/About-us/Maps}{Northwest Atlantic Fisheries Organization} -#' @import sf +#' @importFrom sf st_sf #' @examples -#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) { +#' if(requireNamespace("ggspatial")) { #' \donttest{ #' basemap(fdir_main_areas) + -#' annotation_spatial(fdir_main_areas, fill = NA) +#' ggspatial::annotation_spatial(fdir_main_areas, fill = NA) #' } #' } "fdir_main_areas" @@ -20,14 +20,14 @@ #' @keywords datasets shapefiles fishery #' @family datasets #' @name fdir_sub_areas -#' @format \code{\link[sf:st_sf]{sf object}} in decimal degrees (EPSG:4326) containing major fishing zones defined by the Norwegian Directorate of Fisheries. +#' @format \code{\link[sf:st_sf]{sf object}} containing major fishing zones defined by the Norwegian Directorate of Fisheries. #' @source \href{https://open-data-fiskeridirektoratet-fiskeridir.hub.arcgis.com/}{Norwegian Directorate of Fisheries} -#' @import sf +#' @importFrom sf st_sf #' @examples -#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) { +#' if(requireNamespace("ggspatial")) { #' \donttest{ #' basemap(fdir_sub_areas) + -#' annotation_spatial(fdir_sub_areas, fill = NA) +#' ggspatial::annotation_spatial(fdir_sub_areas, fill = NA) #' } #' } "fdir_sub_areas" @@ -37,14 +37,14 @@ #' @keywords datasets shapefiles fishery #' @family datasets #' @name ices_areas -#' @format \code{\link[sf:st_sf]{sf object}} in decimal degrees (EPSG:4326) containing ICES Advisory Areas. +#' @format \code{\link[sf:st_sf]{sf object}} containing ICES Advisory Areas. 
#' @source \href{https://www.ices.dk/}{International Council for the Exploration of the Sea}
-#' @import sf
+#' @importFrom sf st_sf
 #' @examples
-#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) {
+#' if(requireNamespace("ggspatial")) {
 #'   \donttest{
 #'   basemap(ices_areas) +
-#'     annotation_spatial(ices_areas, fill = NA)
+#'     ggspatial::annotation_spatial(ices_areas, fill = NA)
 #'   }
 #' }
 "ices_areas"
diff --git a/R/define_shapefiles.R b/R/define_shapefiles.R
index 40d5e92..d9f7032 100644
--- a/R/define_shapefiles.R
+++ b/R/define_shapefiles.R
@@ -1,14 +1,15 @@
 #' @title Define a shapefile to use in plotting from the limits argument
 #' @description An internal function to make \code{\link{basemap}} code more readable
 #' @param limits A numeric vector of length 4: The first element defines the minimum longitude, the second element the maximum longitude, the third element the minimum latitude and the fourth element the maximum latitude of the bounding box.
+#' @param force_dd Logical indicating whether the shapefile should be forced to DecimalDegree. Required for transforming dd_* shapefile objects to other projections.
 #' @details This is an internal function, which is automatically run by the \code{\link{basemap}} function.
-#' @return A list containing the correct shapefile and a logical statement whether the limits were supplied as decimal degrees.
+#' @return A list containing the correct shapefile, a logical statement indicating whether the limits were supplied as decimal degrees, and the coordinate reference system.
#' @keywords internal #' @author Mikko Vihtakari #' @seealso \code{\link{basemap}} #' @export -define_shapefiles <- function(limits) { +define_shapefiles <- function(limits, force_dd = FALSE) { if(length(limits) == 1) { @@ -21,6 +22,10 @@ define_shapefiles <- function(limits) { } else { stop("limits argument has to be between 30 and 89 or -30 and -89 when numeric and length == 1 (polar stereographic maps).") } + + crs <- shapefile_list(shapefiles)$crs + if(force_dd) shapefiles <- "DecimalDegree" + } else if(length(limits) == 4) { if(is_decimal_limit(limits)) { @@ -49,12 +54,16 @@ define_shapefiles <- function(limits) { shapefiles <- "DecimalDegree" } + crs <- shapefile_list(shapefiles)$crs + if(force_dd) shapefiles <- "DecimalDegree" + } else { decLimits <- FALSE shapefiles <- NA + crs <- NA } } - list(shapefile.name = shapefiles, decimal.degree.limits = decLimits) + list(shapefile.name = shapefiles, decimal.degree.limits = decLimits, crs = crs) } diff --git a/R/dist2land.R b/R/dist2land.R index 02e0aaf..3ea8647 100644 --- a/R/dist2land.R +++ b/R/dist2land.R @@ -1,181 +1,175 @@ #' @title Calculate distance to the closest land for coordinates in a data frame #' @description Calculates the closest distance to land for coordinates in a data frame -#' @param data Data.frame containing geographic coordinates +#' @param data Data frame containing geographic coordinates. #' @param lon,lat Either the names of the longitude and latitude columns in \code{data} or \code{NULL} to \link[=guess_coordinate_columns]{guess the longitude and/or latitude columns} in \code{data}. -#' @param proj.in \code{\link[sp:is.projected]{proj4string}} projection argument for the coordinates in \code{data}. -#' @param shapefile Land shape to which distances should be calculated. 
Either a character argument referring to a name of pre-made shapefiles in \code{\link{shapefile_list}}, a single \link[sp]{SpatialPolygons} object or \code{NULL} to enable automatic definition of the land shapes based on \code{data}.
+#' @param proj.in \code{\link[sf:st_crs]{coordinate reference system}} of \code{data}.
+#' @param shapefile Land shape to which distances should be calculated. Either a character argument referring to a name of pre-made shapefiles in \code{\link{shapefile_list}}, a single \link[sf]{sf} or \code{sp} polygons object, or \code{NULL} to enable automatic definition of the land shapes based on \code{data}. Set to \code{"DecimalDegree"} by default, which enables great circle distances using \link[sf]{s2} features assuming a spherical Earth (in contrast to earlier versions of the function, which assumed a flat Earth).
 #' @param bind Logical indicating whether \code{x} should be returned with the distances (\code{TRUE}, default) or should the distances be returned as vector (\code{FALSE}).
-#' @param dist.col The name of the distance column, if \code{bind = TRUE}. Defaults to "dist".
+#' @param dist.col The name of the distance column, if \code{bind = TRUE}. Defaults to "ldist".
 #' @param binary Logical indicating whether binary (TRUE = the position is in the ocean, FALSE = the position is on land) should be returned instead of distances. Speeds up the function considerably.
-#' @param cores Integer value defining how many cores should be used in the distance calculations. Parallelization speeds up the function (see \code{parallel::mclapply}), but naturally eats up computer resources during the calculation. Set to 1 to remove parallelization.
 #' @param verbose Logical indicating whether information about the process should be returned as messages. Set to \code{FALSE} to make the function silent.
-#' @details The function calculates distances using projected coordinates and the \code{rgeos::gDistance} function.
These distances do not consider the curvature of the Earth unless the projection of the used land shape does so (check out geosphere::dist2Line and \href{https://stackoverflow.com/a/51842756/1082004}{this SO solution if you want exact distances}). The function is fairly slow for large datasets. If you only want to use the function to remove (wrong) observations reported on land, set the \code{binary} argument to \code{TRUE}. This speeds up the calculations considerably. -#' -#' The \code{dist2land} function offers parallel processing, which speeds up the calculations for large datasets. Parallel processing has not been tested under Windows yet and may not work. +#' @details The function calculates great circle spherical distances using the \code{\link[sf]{st_distance}} function by default. The function can be slow for large datasets. If you only want to use the function to remove (wrong) observations reported on land, set the \code{binary} argument to \code{TRUE}. This speeds up the calculations by a factor of ten. #' @return Returns a vector if \code{bind = FALSE}, otherwise a data frame. The distances are given in a new column defined by the \code{dist.col} argument. The distances are \strong{kilometers} if \code{binary = FALSE}, otherwise logical (TRUE = the position is in the ocean, FALSE = the position is on land). 
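[Editor's note] The rewritten `dist2land()` logic described above reduces to a handful of `sf` calls. A minimal sketch outside the package, assuming only that the `sf` package is installed; the box polygon standing in for land is illustrative only, not the shapefiles shipped with ggOceanMaps:

```r
library(sf)

# Two points in decimal degrees (EPSG:4326)
pts <- st_as_sf(
  data.frame(lon = c(5, -20), lat = c(75, 60)),
  coords = c("lon", "lat"), crs = 4326
)

# Crude stand-in for a land polygon: a lon/lat box roughly over Scandinavia
land <- st_as_sfc(
  st_bbox(c(xmin = 4, xmax = 30, ymin = 55, ymax = 71), crs = st_crs(4326))
)

# Great circle distances in kilometers (s2 spherical geometry by default)
apply(st_distance(pts, land), 1, min) / 1e3

# Binary in-ocean test mirroring binary = TRUE:
# TRUE when a point intersects no land polygon
lengths(st_intersects(pts, land)) == 0
```

The binary branch avoids the distance matrix entirely, which is why it is roughly an order of magnitude faster in the timings shown in the examples.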
-#' @import sp -#' @importFrom rgeos createSPComment gDistance -#' @importFrom utils txtProgressBar setTxtProgressBar -#' @importFrom parallel makeCluster parLapply stopCluster mclapply #' @author Mikko Vihtakari #' @examples -#' -#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) { -#' \donttest{ #' # Simple example: #' dt <- data.frame(lon = seq(-20, 80, length.out = 41), lat = 50:90) -#' dt <- dist2land(dt, cores = 1, verbose = FALSE) +#' dt <- dist2land(dt, verbose = FALSE) +#' \donttest{ #' qmap(dt, color = ldist) + scale_color_viridis_c() +#' +#' # Datasets covering the entire Earth seem to work now, except 0,0 lon/lat point +#' lon = deg_to_dd(seq(0,360,30)); lat = c(80,50,20,0,-20,-50,-80) +#' +#' dt <- data.frame( +#' lon = rep(lon, length(lat)), lat = rep(lat, each = length(lon))) #' -#' # No premade shapefiles for datasets covering the entire globe -#' dt <- data.frame(lon = -20:20, lat = seq(-90, 90, length.out = 41)) -#' qmap(dist2land(dt, cores = 1, verbose = FALSE), color = ldist) + -#' scale_color_viridis_c() # wrong! 
+#' qmap(dist2land(dt, verbose = FALSE), color = ldist) + +#' scale_color_viridis_c() #' } #' \dontrun{ -#' dt <- data.frame(lon = seq(-179, 179, length.out = 1000), lat = rep(60, 1000)) +#' dt <- data.frame( +#' lon = deg_to_dd(seq(0,360,length.out = 1e3)), +#' lat = rep(60, 1000)) +#' #' # The distance calculation is slow for large datasets #' system.time(dist2land(dt)) -#' #> user system elapsed -#' #> 0.071 0.036 9.632 +#' # user system elapsed +#' # 12.677 0.146 12.849 #' -#' # The parallel processing speeds it up -#' system.time(dist2land(dt, cores = 1)) -#' #> user system elapsed -#' #> 29.019 1.954 30.958 -#' -#' # binary = TRUE further speeds the function up +#' # binary = TRUE speeds the function up #' system.time(dist2land(dt, binary = TRUE)) -#' #> user system elapsed -#' #> 2.342 0.052 2.403 -#' } +#' # user system elapsed +#' # 1.239 0.120 1.369 #' } #' @export ## Test parameters -# lon = NULL; lat = NULL; shapefile = NULL; proj.in = convert_crs(4326); bind = TRUE; dist.col = "ldist"; binary = FALSE; geodesic.distances = FALSE; verbose = TRUE; cores = getCores() - -dist2land <- function(data, lon = NULL, lat = NULL, shapefile = NULL, proj.in = convert_crs(4326), bind = TRUE, dist.col = "ldist", binary = FALSE, cores = getCores(), verbose = TRUE) { +# lon = NULL; lat = NULL; shapefile = NULL; proj.in = 4326; bind = TRUE; dist.col = "ldist"; binary = FALSE; geodesic.distances = FALSE; verbose = TRUE; cores = getCores() +dist2land <- function(data, lon = NULL, lat = NULL, shapefile = "DecimalDegree", proj.in = 4326, bind = TRUE, dist.col = "ldist", binary = FALSE, verbose = TRUE) { + ## Case for defined x and undefined lon or/and lat #### - + if(is.null(lon) | is.null(lat)) { if(all(!is.data.frame(data))) stop("x argument has to be a data.frame") - + tmp <- guess_coordinate_columns(data) - + lon <- unname(tmp[names(tmp) == "lon"]) lat <- unname(tmp[names(tmp) == "lat"]) - + if(verbose) { message(paste0("Used ", lon, " and ", lat, " as input coordinate 
column names in data")) } - + if(length(lon) != 1 | length(lat) != 1) { stop("lon or lat columns were not found. Define manually.") } } - + ## Data - + ### Remove NA coordinates (and add later) - + na.rows <- is.na(data[[lon]]) | is.na(data[[lat]]) contains.nas <- any(na.rows) - + x <- as.data.frame(data[!na.rows, c(lon, lat)]) colnames(x) <- c("lon", "lat") - + ## Land shape ### - + if(!is.null(shapefile)) { - + error_test <- quiet(try(match.arg(shapefile, shapefile_list("all")$name), silent = TRUE)) - + if(!inherits(error_test, "try-error")) { shapefile <- shapefile_list(shapefile) if(verbose) message(paste("Using", shapefile$name, "as land shapes.")) land <- eval(parse(text = shapefile$land)) } else { - if(!inherits(shapefile, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) stop("The shapefile must either be matchable string to shapefile_list or a SpatialPolygons object.") + if(!inherits(shapefile, c("sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons"))) stop("The shapefile must either be matchable string to shapefile_list or a sf or sp polygons object.") if(verbose) message("Using custom land shapes.") - land <- shapefile + if(inherits(shapefile, c("SpatialPolygonsDataFrame", "SpatialPolygons"))) { + land <- sf::st_as_sf(shapefile) + } else { + land <- shapefile + } } } else { - + ddLimits <- auto_limits(data = x, lon = "lon", lat = "lat", proj.in = proj.in, verbose = FALSE)$ddLimits - + shapefile.def <- define_shapefiles(ddLimits) shapefile <- shapefile_list(shapefile.def$shapefile.name) if(verbose) message(paste("Using", shapefile$name, "as land shapes.")) - + land <- eval(parse(text = shapefile$land)) } - - land <- rgeos::createSPComment(land) - + ## Coordinate points ### - - sp::coordinates(x) <- c("lon", "lat") - sp::proj4string(x) <- sp::CRS(proj.in) - x <- suppressWarnings(sp::spTransform(x, sp::CRS(suppressWarnings(sp::proj4string(land))))) - if(!suppressWarnings(sp::identicalCRS(land, x))) stop("Cannot convert projections correctly.") - + + x 
<- sf::st_as_sf(x, coords = c("lon", "lat"), crs = sf::st_crs(proj.in)) + x <- sf::st_transform(x, sf::st_crs(land)) + ############################ ## Distance calculation #### - - if(binary) {## Binary positions - + + if(binary) { ## Binary positions + if(verbose) message("Calculating binary positions...") - tmp <- suppressWarnings(unname(is.na(sp::over(x, land)[1]))) - - } else { ## gDistance - if(cores == 1) { - if(verbose) message("Calculating distances without parallel processing...") - pb <- utils::txtProgressBar(min = 0, max = length(x), style = 3) - tmp <- sapply(1:length(x), function(i) { - utils::setTxtProgressBar(pb, i) - return(suppressWarnings(rgeos::gDistance(x[i], land)/1000)) - }) - } - # Run in parallel on Windows and other platforms: - else { - # Do not use more cores than the number of files: - cores <- min(length(x), cores) - - if(verbose) message("Calculating distances with parallel processing...") + tmp <- is.na(as.integer(suppressMessages(sf::st_intersects(x, land)))) + if(verbose) message("Returning binary positions: TRUE in the ocean, FALSE on land.") + + } else { ## Distance + + if(verbose) message("Calculating distances...") + tmp <- apply(sf::st_distance(x, land), 1, min)/1e3 # km + + if(sf::st_is_longlat(x)) { + if(verbose) message("Returning great circle spherical distances from land as kilometers.") + } else { + if(verbose) message("Returning Euclidean distances from land as kilometers.") - # On Windows run special args to speed up: - if(.Platform$OS.type == "windows") { - cl <- parallel::makeCluster(cores, rscript_args = c("--no-init-file", "--no-site-file", "--no-environ")) - out <- parallel::parLapply(cl, 1:length(x), function(i) suppressWarnings(rgeos::gDistance(x[i], land)/1000)) - parallel::stopCluster(cl) - tmp <- unlist(out) - } - else { - tmp <- unlist(parallel::mclapply(1:length(x), function(i) { - suppressWarnings(rgeos::gDistance(x[i], land)/1000) - }, mc.cores = cores)) - } } } + ## Splitting x to number of available 
cores and running the distance calculation in parallel would be possible but was not implemented due to focus on other tasks in 2.0
+  ## Saving the parallelization code from the 1.3.7 ggOceanMaps version so that it can be implemented later when there's time.
+  # cores <- min(length(x), cores)
+  # 
+  # if(verbose) message("Calculating distances with parallel processing...")
+  # 
+  # # On Windows run special args to speed up:
+  # if(.Platform$OS.type == "windows") {
+  #   cl <- parallel::makeCluster(cores, rscript_args = c("--no-init-file", "--no-site-file", "--no-environ"))
+  #   out <- parallel::parLapply(cl, 1:length(x), function(i) suppressWarnings(gDistance(x[i], land)/1000))
+  #   parallel::stopCluster(cl)
+  #   tmp <- unlist(out)
+  # }
+  # else {
+  #   tmp <- unlist(parallel::mclapply(1:length(x), function(i) {
+  #     suppressWarnings(gDistance(x[i], land)/1000)
+  #   }, mc.cores = cores))
+  # }
+  
   ## Return
-  
+  
   if(contains.nas) {
     na.rows[!na.rows] <- tmp
     na.rows[na.rows == 1] <- NA
     tmp <- na.rows
   }
-  
+  
   if(bind) {
     data[[dist.col]] <- tmp
     data
   } else {
     tmp
   }
-  
+  
 }
diff --git a/R/geonorge_bathymetry.R b/R/geonorge_bathymetry.R
index 7812a67..b527b25 100644
--- a/R/geonorge_bathymetry.R
+++ b/R/geonorge_bathymetry.R
@@ -3,7 +3,6 @@
 #' @param filepath Character string defining the path to the \code{.gml} file. Must contain the file extension.
 #' @param layer Character string defining the layer containing depth information. If \code{NULL} assumed to be "dybdeareal".
 #' @param verbose Logical indicating whether information about the reading process should be returned.
-#' @param output.sf Logical indicating whether an \code{\link[sf:st_polygon]{sf}} (\code{TRUE}) or \code{sp} polygon should be returned.
 #' @details You can download the bathymetry polygon shapefiles from \href{https://kartkatalog.geonorge.no/metadata/kartverket/dybdedata/2751aacf-5472-4850-a208-3532a51c529a}{Geonorge}. Download the file in \code{GML} format.
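[Editor's note] The GML import that `geonorge_bathymetry()` wraps can be reproduced directly with `sf`. A sketch assuming a locally downloaded Geonorge file; the file name is taken from the test comments in `geonorge_bathymetry.R`, and `"dybdeareal"` is the default layer mentioned above:

```r
library(sf)

# Path to a downloaded Geonorge depth-data file in GML format (example
# name from the test comments in the source file)
filepath <- "~/Downloads/Basisdata_21_Svalbard_25833_Dybdedata_GML.gml"

st_layers(filepath)  # inspect the layers contained in the GML file

# Read the depth polygons; "dybdeareal" is the layer geonorge_bathymetry()
# assumes when layer = NULL
x <- st_read(filepath, layer = "dybdeareal", quiet = TRUE)
```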
#' @return An \link[sf:st_polygon]{sf} or \code{sp} object containing the depth polygons. Uses same projection than \code{bathy} (see \code{\link[sf:st_crs]{CRS}}). #' @author Mikko Vihtakari @@ -12,8 +11,8 @@ # "https://kartkatalog.geonorge.no/metadata/kartverket/dybdedata/2751aacf-5472-4850-a208-3532a51c529a" # add proj.out -# filepath <- "~/Downloads/Basisdata_21_Svalbard_25833_Dybdedata_GML.gml"; layer = NULL; verbose = TRUE; output.sf = TRUE -geonorge_bathymetry <- function(filepath, layer = NULL, verbose = FALSE, output.sf = FALSE) { +# filepath <- "~/Downloads/Basisdata_21_Svalbard_25833_Dybdedata_GML.gml"; layer = NULL; verbose = TRUE +geonorge_bathymetry <- function(filepath, layer = NULL, verbose = FALSE) { ## Define the layer @@ -37,16 +36,10 @@ geonorge_bathymetry <- function(filepath, layer = NULL, verbose = FALSE, output. x$depth <- factor(x$depth, levels = sort(unique(x$depth))) ## Return - - if(output.sf) { - return(x) - } else { - sf::as_Spatial(x) - } - + return(x) } -# x <- geonorge_bathymetry("~/Downloads/Basisdata_21_Svalbard_25833_Dybdedata_GML.gml", output.sf = FALSE) +# x <- geonorge_bathymetry("~/Downloads/Basisdata_21_Svalbard_25833_Dybdedata_GML.gml") # bla <- load_map_data(shapefile_list("Svalbard")) # # basemap(limits = c(10.9, 12.65, 78.83, 79.12), shapefiles = list(land = bla$svalbard_land, glacier = NULL, bathy = x), bathymetry = TRUE) diff --git a/R/internal_functions.R b/R/internal_functions.R index 03428dc..7e72907 100644 --- a/R/internal_functions.R +++ b/R/internal_functions.R @@ -79,31 +79,184 @@ dd_to_deg <- function(x) { deg_to_dd <- function(x) { x <- x - ifelse(x/360 > 1, floor(x/360), 0)*360 - ifelse(x > 0 & x <= 180, x, ifelse(x > 180, -1*(360 - x), NA)) + ifelse(x >= 0 & x <= 180, x, ifelse(x > 180, -1*(360 - x), NA)) } -#' @title Pick a suitable number of cores -#' @description Picks maximum four cores for parallel processing -#' @return Integer of suitable number of cores +# #' @title Pick a suitable number of cores +# 
#' @description Picks maximum four cores for parallel processing +# #' @return Integer of suitable number of cores +# #' @keywords internal +# #' @importFrom parallel detectCores +# #' @author The \href{https://github.com/StoXProject/RstoxData/blob/master/R/Utilities.R}{StoXProject} +# #' @export +# +# getCores <- function() { +# cores <- as.integer(getOption("mc.cores")) +# if (length(cores) == 0 || is.na(cores)) { +# cores <- parallel::detectCores() +# if (is.na(cores)) { +# return(1) +# } else { +# # Don't use too many cores in autodetect +# if (cores > 4) +# return(4) +# else +# return(cores) +# } +# } else { +# return(cores) +# } +# } + +#' @title Rotate CRS +#' @description Rotates a CRS such that North is in the middle of two meridians +#' @param crs The CRS to be rotated in \code{\link[sf]{st_crs}} format +#' @param meridians A numeric vector giving the two meridians which should be used in the rotation +#' @return Rotated CRS in \code{\link[sf]{st_crs}} format +#' @keywords internal +#' @author Mikko Vihtakari +#' @export + +rotate_crs <- function(crs, meridians) { + + ## Mid-longitude + + tmp <- dd_to_deg(meridians) + + if(tmp[1] > tmp[2]) { + lonDiff <- 360 - tmp[1] + tmp[2] + } else { + lonDiff <- tmp[2] - tmp[1] + } + + midLon <- tmp[1] + lonDiff/2 + midLon <- deg_to_dd(midLon) + + ## CRS rotation + + tmp <- unlist(strsplit(sf::st_crs(crs)$proj4string, " ")) + + if(any(grepl("lon_0=", tmp))) { + tmp[grepl("lon_0=", tmp)] <- paste0("lon_0=", midLon) + } else { + tmp <- c(tmp[-length(tmp)], paste0("lon_0=", midLon), tmp[length(tmp)]) + } + + sf::st_crs(paste(tmp, collapse = " ")) + + + +} + +#' @title Create clip boundary from decimal degree limits +#' @description Creates a clip boundary for map cropping from decimal degree limits +#' counter-clockwise following parallels +#' @inheritParams auto_limits +#' @param limits Numeric vector of length 4 giving the map limits in decimal degrees. See \link{basemap}. 
+#' @param crs Coordinate reference system in \code{\link[sf]{st_crs}} format. #' @keywords internal -#' @importFrom parallel detectCores -#' @author The \href{https://github.com/StoXProject/RstoxData/blob/master/R/Utilities.R}{StoXProject} +#' @author Mikko Vihtakari +#' @export + +dd_clip_boundary <- function(limits, crs, expand.factor = NULL) { + + tmp <- dd_to_deg(limits[1:2]) + + if(!is.null(expand.factor)) { + + lon.rdiff <- diff(tmp[1:2]) + lon.shift <- ((lon.rdiff*expand.factor) - lon.rdiff)/2 + tmp[1] <- tmp[1] - lon.shift + tmp[2] <- tmp[2] + lon.shift + + lat.rdiff <- diff(limits[3:4]) + lat.shift <- ((lat.rdiff*expand.factor) - lat.rdiff)/2 + limits[3] <- limits[3] - lat.shift + limits[4] <- limits[4] + lat.shift + + # Correct >= 180/90 limits for expanded decimal degree coordinates + + if(any(tmp < 0)) tmp[tmp < 0] <- 0 + if(any(tmp > 360)) tmp[tmp > 360] <- 360 + if(any(limits[3:4] < -90)) limits[3:4][limits[3:4] < -90] <- -90 + if(any(limits[3:4] > 90)) limits[3:4][limits[3:4] > 90] <- 90 + } + + if(tmp[1] > tmp[2]) { + lonDiff <- 360 - tmp[1] + tmp[2] + } else { + lonDiff <- tmp[2] - tmp[1] + } + + midLon <- tmp[1] + lonDiff/2 + + tmp <- deg_to_dd(c(tmp[1], midLon, tmp[2])) + + coords <- data.frame( + lon = c(tmp, rev(tmp), tmp[1]), + lat = c(rep(limits[3], 3), rep(limits[4], 3), limits[3]) + ) + + sf::st_as_sfc( + sf::st_bbox( + sf::st_transform( + sf::st_sfc(sf::st_polygon(list(as.matrix(coords))), crs = 4326), + sf::st_crs(crs) + ) + ) + ) +} + +#' @title Define bathymetry style for \code{\link{basemap}} +#' @description Defines bathymetry style to be used in \code{\link{basemap}} +#' @param x Character argument giving the input bathymetry style. Partially matched and can be abbreviated. See \code{bathy.style} in \code{\link{basemap}}. +#' @return Returns a named character vector with the name pointing to the bathymetry style and value to internal map element command for \code{\link{basemap}} (see \code{\link{map_cmd}}). 
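[Editor's note] The style resolution in `define_bathy_style()` is a two-step lookup: partially match the user input to a canonical style name, then index a named vector mapping style names to internal `map_cmd` elements. The pattern in miniature, with three styles copied from the function (the rest follow the same shape):

```r
# Named vector mapping canonical style names to internal map elements
alternatives <- c(
  "poly_blues"    = "bathy_pb",
  "poly_grays"    = "bathy_pg",
  "contour_blues" = "bathy_cb"
)

# match.arg() resolves unambiguous partial input, e.g. "poly_b"
style <- match.arg("poly_b", names(alternatives))

alternatives[names(alternatives) == style]  # named value "bathy_pb"
```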
+#' @keywords internal +#' @author Mikko Vihtakari #' @export -getCores <- function() { - cores <- as.integer(getOption("mc.cores")) - if (length(cores) == 0 || is.na(cores)) { - cores <- parallel::detectCores() - if (is.na(cores)) { - return(1) - } else { - # Don't use too many cores in autodetect - if (cores > 4) - return(4) - else - return(cores) - } +define_bathy_style <- function(x) { + + if(tolower(x) == "rbb") { + out <- "raster_binned_blues" + } else if(tolower(x) %in% c("rbg", "raster_binned_greys")) { + out <- "raster_binned_grays" + } else if(tolower(x) == "rcb") { + out <- "raster_continuous_blues" + } else if(tolower(x) %in% c("rcg", "raster_continuous_greys")) { + out <- "raster_continuous_grays" + } else if(tolower(x) %in% c("rub")) { + out <- "raster_user_blues" + } else if(tolower(x) %in% c("rug", "raster_user_greys")) { + out <- "raster_user_grays" + } else if(tolower(x) == "pb") { + out <- "poly_blues" + } else if(tolower(x) %in% c("pg", "poly_greys")) { + out <- "poly_grays" + } else if(tolower(x) == "cb") { + out <- "contour_blues" + } else if(tolower(x) %in% c("cg", "contour_grey", "contour_greys", "contour_grays")) { + out <- "contour_gray" } else { - return(cores) + out <- match.arg( + x, + c("raster_binned_blues", "raster_binned_grays", "raster_continuous_blues", + "raster_continuous_grays", "raster_user_blues", "raster_user_grays", + "poly_blues", "poly_grays", "contour_blues", "contour_gray") + ) } + + alternatives <- + c("raster_binned_blues" = "bathy_rb", + "raster_binned_grays" = "bathy_rb", + "raster_continuous_blues" = "bathy_rc", + "raster_continuous_grays" = "bathy_rc", + "raster_user_blues" = "bathy_rc", + "raster_user_grays" = "bathy_rc", + "poly_blues" = "bathy_pb", + "poly_grays" = "bathy_pg", + "contour_blues" = "bathy_cb", + "contour_gray" = "bathy_cg") + + alternatives[names(alternatives) == out] } diff --git a/R/load_map_data.R b/R/load_map_data.R index 35cb137..f26188e 100644 --- a/R/load_map_data.R +++ b/R/load_map_data.R 
@@ -1,16 +1,17 @@ #' @title Load large shapefile objects #' @description Internal function to load large shapefile objects. Downloads the files if they are not found \code{getOption("ggOceanMaps.datapath")} +#' @inheritParams basemap #' @param x An object from \code{\link{shapefile_list}}. #' @param force Logical indicating whether to download the file even though it exists. Useful when files in the \href{https://github.com/MikkoVihtakari/ggOceanMapsLargeData}{Github repository have been changed}. Overwrites the old file. #' @details This is an internal function, which is automatically run by the \code{\link{basemap}} function. Common users do not need to worry about these details. -#' @return A list of \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrames}} +#' @return A list of spatial objects #' @keywords internal #' @export #' @importFrom utils menu download.file #' @author Mikko Vihtakari #' @seealso \code{\link{basemap}} -load_map_data <- function(x, force = FALSE) { +load_map_data <- function(x, force = FALSE, downsample = 0) { # Create the data download folder if it does not exist @@ -31,29 +32,110 @@ load_map_data <- function(x, force = FALSE) { } } - # Check whether the data has already been downloaded + # Detect load-case for bathy - filepath <- paste(getOption("ggOceanMaps.datapath"), paste0(tolower(x$name), ".rda"), sep = "/") + bathy_user_defined <- FALSE - if(file.exists(filepath) & !force) { - return(mget(load(filepath))) - } + if(!is.null(x$bathy)) { + if(grepl("user", names(x$bathy))) { + bathy_user_defined <- TRUE + } + } - # Download the file, if it does not exist + # Create file paths - if(!file.exists(filepath) | force) { - - msg <- paste0("You are trying to create a ", x$name, " map.", " Cannot find the required shapefiles from ", getOption("ggOceanMaps.datapath"), ". 
Do you want download them now?")
+  x[c("land", "glacier", "bathy")] <- 
+    lapply(x[c("land", "glacier", "bathy")], function(k) {
+      if(is.null(k)) {
+        NULL
+      } else if(grepl("/", k)) {
+        if(!grepl("\\.", k)) {
+          normalizePath(paste0(k, ".rda"), mustWork = FALSE)
+        } else {
+          normalizePath(k, mustWork = TRUE)
+        }
+      } else {
+        unname(k)
+        #paste(getOption("ggOceanMaps.datapath"), paste0(tolower(x$name), ".rda"), sep = "/")
+      }
+    })
+  
+  # Check whether the data has already been downloaded
+  
+  exist_list <- lapply(x[c("land", "glacier", "bathy")], function(k) {
+    if(is.null(k)) {
+      NULL
+    } else if(grepl("/", k)) {
+      if(file.exists(k) & !force) {
+        TRUE
+      } else {
+        FALSE
+      }
+    } else {
+      TRUE
+    }
+  })
+  
+  exist_list[sapply(exist_list, is.null)] <- NULL
+  
+  # Download missing files, if they do not exist
+  
+  if(any(!unlist(exist_list))) {
+    missing_files <- paste(sapply(names(exist_list)[!unlist(exist_list)], function(k) x[[k]]), collapse = ", ")
+    msg <- paste0("Cannot find files: ", missing_files, ". Do you want to download them now?")
     message(paste(strwrap(msg), collapse= "\n"))
     ret.val <- utils::menu(c("Yes", "No"), "")
     if(ret.val != 1) {
-      msg <- paste0(x$name, " shapefiles required to plot the map. Download the file or change the shapefiles argument.")
+      msg <- paste0(missing_files, " shapefiles required to plot the map.
Download the file(s) or change the shapefiles argument.") stop(paste(strwrap(msg), collapse= "\n")) + } + } + + x[c("land", "glacier")] <- + lapply(x[c("land", "glacier")], function(k) { + if(is.null(k)) { + NULL + } else if(grepl("/", k)) { + if(file.exists(k) & !force) { + mget(load(k))[[1]] + } else { + tmp <- unlist(strsplit(k, "/")) + dest_path <- file.path(normalizePath(getOption("ggOceanMaps.datapath")),tmp[length(tmp)]) + + download.file(paste0(x$path, tmp[length(tmp)]), dest_path) + mget(load(dest_path))[[1]] + } } else { - download.file(x$path, filepath) - return(mget(load(filepath))) + eval(parse(text = k)) + } + }) + + if(!is.null(x$bathy)) { + k <- x[["bathy"]] + + if(bathy_user_defined) { + x[["bathy"]] <- stars::read_stars(k, downsample = downsample) + } else { + if(grepl("/", k)) { + if(file.exists(k) & !force) { + x[["bathy"]] <- mget(load(k))[[1]] + } else { + tmp <- unlist(strsplit(k, "/")) + dest_path <- file.path(normalizePath(getOption("ggOceanMaps.datapath")),tmp[length(tmp)]) + + download.file(paste0(x$path, tmp[length(tmp)]), dest_path) + x[["bathy"]] <- mget(load(dest_path))[[1]] + } + } else { + x[["bathy"]] <- eval(parse(text = k)) + } } } + + # Return + x[sapply(x, is.null)] <- NULL + + return(x) } \ No newline at end of file diff --git a/R/map_cmd.R b/R/map_cmd.R index db67244..f73d53c 100644 --- a/R/map_cmd.R +++ b/R/map_cmd.R @@ -9,7 +9,6 @@ #' map_cmd("grid_utm"), map_cmd("defs_utm"), sep = "+")))} #' @keywords internal #' @export -#' @import ggplot2 #' @author Mikko Vihtakari #' @seealso \code{\link{basemap}} @@ -18,9 +17,92 @@ map_cmd <- function(command, alternative = FALSE) { base = ' ggplot2::ggplot() ', + bathy_rb = ' + stars::geom_stars(data = X$shapefiles$bathy$raster, na.action = na.omit, + downsample = downsample, show.legend = bathy.legend, alpha = bathy.alpha) + ', + bathy_rc = ' + stars::geom_stars(data = X$shapefiles$bathy$raster, downsample = downsample, + show.legend = bathy.legend, alpha = bathy.alpha) + ', + 
bathy_rbb_scale = ' + ggplot2::scale_fill_manual( + name = "Depth (m)", + values = colorRampPalette(c("#F7FBFF", "#DEEBF7", "#9ECAE1", "#4292C6", "#08306B"))(nlevels(X$shapefiles$bathy$raster[[1]])), + guide = + if(bathy.legend) { + guide_legend(order = 1, override.aes = list(colour = NA)) + } else { + "none" + }) + ', + bathy_rbg_scale = ' + scale_fill_grey("Depth (m)", start = 1, end = 0.5, guide = + if(bathy.legend) { + guide_legend(order = 1, override.aes = list(colour = NA)) + } else { + "none" + }) + ', + bathy_rcb_scale = ' + ggplot2::scale_fill_gradientn( + name = "Depth (m)", + limits = c(0,NA), + values = c(0,0.01,0.05,0.25,0.75,1), + colors = colorRampPalette(c("#F7FBFF", "#DEEBF7", "#9ECAE1", "#4292C6", "#08306B"))(10), + na.value = "white" + ) + ', + bathy_rcg_scale = ' + scale_fill_distiller( + name = "Depth (m)", + limits = c(0,NA), + type = "seq", + direction = 1, + palette = "Greys", + na.value = "white" + ) + ', + # bathy_rbb = + # 'stars::geom_stars(data = X$shapefiles$bathy$raster, na.action = na.omit, + # show.legend = bathy.legend, alpha = bathy.alpha) + + # ggplot2::scale_fill_manual( + # name = "Depth (m)", + # values = colorRampPalette(c("#F7FBFF", "#DEEBF7", "#9ECAE1", "#4292C6", "#08306B"))(nlevels(X$shapefiles$bathy$raster[[1]])), + # guide = + # if(bathy.legend) { + # guide_legend(order = 1, override.aes = list(colour = NA)) + # } else { + # "none" + # }) + # ', + # bathy_rcb = + # 'stars::geom_stars(data = X$shapefiles$bathy$raster, + # show.legend = bathy.legend, alpha = bathy.alpha) + + # ggplot2::scale_fill_gradientn( + # name = "Depth (m)", limits = c(0,NA), + # values = c(0,0.01,0.05,0.25,0.75,1), + # colors = colorRampPalette(c("#F7FBFF", "#DEEBF7", "#9ECAE1", "#4292C6", "#08306B"))(10), + # na.value = "white" + # ) + # ', + # bathy_rbg = ' + # stars::geom_stars(data = X$shapefiles$bathy$raster, na.action = na.omit, + # show.legend = bathy.legend, alpha = bathy.alpha) + + # scale_fill_grey("Depth (m)", start = 1, end = 0.5, guide 
= + # if(bathy.legend) { + # guide_legend(order = 1, override.aes = list(colour = NA)) + # } else { + # "none" + # }) + # ', bathy_pb = ' - ggspatial::layer_spatial(data = X$shapefiles$bathy, aes(fill = depth), show.legend = bathy.legend, color = bathy.border.col, size = bathy.size, alpha = bathy.alpha) + - scale_fill_manual(name = "Depth (m)", values = colorRampPalette(c("#F7FBFF", "#DEEBF7", "#9ECAE1", "#4292C6", "#08306B"))(nlevels(X$shapefiles$bathy@data$depth)), guide = + ggplot2::geom_sf(data = X$shapefiles$bathy, ggplot2::aes(fill = depth), + show.legend = bathy.legend, color = bathy.border.col, + size = bathy.size, alpha = bathy.alpha) + + ggplot2::scale_fill_manual(name = "Depth (m)", + values = colorRampPalette(c("#F7FBFF", "#DEEBF7", "#9ECAE1", "#4292C6", "#08306B"))(nlevels(X$shapefiles$bathy$depth)), + guide = if(bathy.legend) { guide_legend(order = 1, override.aes = list(colour = NA)) } else { @@ -28,7 +110,9 @@ map_cmd <- function(command, alternative = FALSE) { }) ', bathy_pg = ' - ggspatial::layer_spatial(data = X$shapefiles$bathy, aes(fill = depth), show.legend = bathy.legend, color = bathy.border.col, size = bathy.size, alpha = bathy.alpha) + + ggplot2::geom_sf(data = X$shapefiles$bathy, ggplot2::aes(fill = depth), + show.legend = bathy.legend, color = bathy.border.col, size = bathy.size, + alpha = bathy.alpha) + scale_fill_grey("Depth (m)", start = 1, end = 0.5, guide = if(bathy.legend) { guide_legend(order = 1, override.aes = list(colour = NA)) @@ -37,59 +121,64 @@ map_cmd <- function(command, alternative = FALSE) { }) ', bathy_cg = ' - ggspatial::layer_spatial(data = X$shapefiles$bathy, fill = NA, color = bathy.border.col, size = bathy.size) + ggplot2::geom_sf(data = X$shapefiles$bathy, fill = NA, + color = bathy.border.col, size = bathy.size) ', bathy_cb = ' - ggspatial::layer_spatial(data = X$shapefiles$bathy, aes(color = depth), show.legend = bathy.legend, fill = NA, size = land.size) + - scale_color_manual(name = "Depth (m)", values = 
colorRampPalette(c("#DEEBF7", "#9ECAE1", "#4292C6", "#08306B", "#76536F"))(nlevels(X$shapefiles$bathy@data$depth))) + ggplot2::geom_sf(data = X$shapefiles$bathy, ggplot2::aes(color = depth), + show.legend = bathy.legend, fill = NA, size = land.size) + + ggplot2::scale_color_manual(name = "Depth (m)", + values = colorRampPalette(c("#DEEBF7", "#9ECAE1", "#4292C6", "#08306B", "#76536F"))(nlevels(X$shapefiles$bathy$depth))) ', land = ' - ggspatial::layer_spatial(data = X$shapefiles$land, fill = land.col, color = land.border.col, size = land.size) + ggplot2::geom_sf(data = X$shapefiles$land, fill = land.col, + color = land.border.col, size = land.size) ', glacier = ' - ggspatial::layer_spatial(data = X$shapefiles$glacier, fill = gla.col, color = gla.border.col, size = gla.size) + ggplot2::geom_sf(data = X$shapefiles$glacier, fill = gla.col, + color = gla.border.col, size = gla.size) ', defs_rect = ' - scale_y_continuous(breaks = X$map.grid$lat.breaks, expand = c(0,0.1)) + - scale_x_continuous(breaks = X$map.grid$lon.breaks, expand = c(0,0.1)) + - labs(y = "Latitude (decimal degrees)", x = "Longitude (decimal degrees)") + - coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, + ggplot2::scale_y_continuous(breaks = X$map.grid$lat.breaks, expand = c(0,0.1)) + + ggplot2::scale_x_continuous(breaks = X$map.grid$lon.breaks, expand = c(0,0.1)) + + ggplot2::labs(y = "Latitude (decimal degrees)", x = "Longitude (decimal degrees)") + + ggplot2::coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, crs = sf::st_crs(X$proj), default = TRUE) + theme_map(base_size = base_size, grid.col = grid.col, grid.size = grid.size) + - theme(legend.position = legend.position, + ggplot2::theme(legend.position = legend.position, legend.margin=margin(t = 0.2, b = 0, unit = "cm") ) ', defs_rect_proj = ' - labs(y = "Latitude (meters)", x = "Longitude (meters)") + - coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, + 
ggplot2::labs(y = "Latitude (meters)", x = "Longitude (meters)") + + ggplot2::coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, crs = sf::st_crs(X$proj), datum = sf::st_crs(X$proj), default = TRUE) + theme_map(base_size = base_size, grid.col = grid.col, grid.size = grid.size) + - theme(legend.position = legend.position, + ggplot2::theme(legend.position = legend.position, legend.margin=margin(t = 0.2, b = 0, unit = "cm") ) ', defs_polar = ' { - if(!is.na(grid.col)) ggspatial::geom_spatial_path(data = X$map.grid$lon.grid.lines, aes(x = lon, y = lat), crs = 4326, color = grid.col, size = grid.size) + if(!is.na(grid.col)) ggplot2::geom_sf(data = X$map.grid$lon.grid.lines, color = grid.col, linewidth = grid.size) } + { - if(!is.na(grid.col)) ggspatial::geom_spatial_path(data = X$map.grid$lat.grid.lines, aes(x = lon, y = lat), crs = 4326, color = grid.col, size = grid.size) + if(!is.na(grid.col)) ggplot2::geom_sf(data = X$map.grid$lat.grid.lines, color = grid.col, linewidth = grid.size) } + - ggspatial::geom_spatial_path(data = X$map.grid$lat.limit.line, aes(x = lon, y = lat), crs = 4326, color = land.border.col, size = land.size) + - coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, + ggplot2::geom_sf(data = X$map.grid$lat.limit.line, color = land.border.col, size = land.size) + + ggplot2::coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, crs = sf::st_crs(X$proj), default = TRUE) + - theme_void(base_size = base_size) + - theme(legend.position = legend.position, + ggplot2::theme_void(base_size = base_size) + + ggplot2::theme(legend.position = legend.position, legend.margin = margin(t = 0.2, b = 0, r = 0.2, unit = "cm") ) ', defs_polar_proj = ' - labs(y = "Latitude (meters)", x = "Longitude (meters)") + - ggspatial::geom_spatial_path(data = X$map.grid$lat.limit.line, aes(x = lon, y = lat), crs = 4326, color = land.border.col, size = land.size) + - coord_sf(xlim = X$map.limits[1:2], ylim = 
X$map.limits[3:4], expand = FALSE, + ggplot2::labs(y = "Latitude (meters)", x = "Longitude (meters)") + + ggplot2::geom_sf(data = X$map.grid$lat.limit.line, color = land.border.col, size = land.size) + + ggplot2::coord_sf(xlim = X$map.limits[1:2], ylim = X$map.limits[3:4], expand = FALSE, crs = sf::st_crs(X$proj), datum = sf::st_crs(X$proj), default = TRUE) + theme_map(base_size = base_size, grid.col = grid.col, grid.size = grid.size) + - theme(legend.position = legend.position, + ggplot2::theme(legend.position = legend.position, legend.margin = margin(t = 0.2, b = 0, r = 0.2, unit = "cm") ) ', diff --git a/R/qmap.R b/R/qmap.R index 2bae1fe..f9251b1 100644 --- a/R/qmap.R +++ b/R/qmap.R @@ -4,43 +4,38 @@ #' @param x,y,... Aesthetics passed into each layer. Longitude and latitude columns are automatically recognized using the \code{\link{guess_coordinate_columns}} function. #' @param geom Character argument specifying geom(s) to draw. Defaults to "point". Other alternatives are "text" and "label". The "text" option can also be triggered by simply mapping a variable to \code{label} (see Examples). #' @inheritParams basemap -#' @import ggplot2 ggspatial +#' @import ggplot2 #' @return Returns a \link[ggplot2]{ggplot} map, which can be assigned to an object and modified as any ggplot object. 
#' @family basemap functions #' @author Mikko Vihtakari #' @examples -#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) { #' dt <- data.frame(lon = c(-100, -80, -60), lat = c(10, 25, 40), var = c("a", "a", "b")) #' -#' # Set color -#' -#' qmap(dt, color = I("red")) +#' # Quickly see position of data +#' qmap(dt) #' -#' # Map color #' \donttest{ -#' qmap(dt, color = var) -#' # -#' -#' # Map text +#' # Set color +#' qmap(dt, color = I("blue")) #' -#' qmap(dt, label = var) +#' # Map color to a variable +#' qmap(dt, color = var) #' +#' # Map text to a variable +#' qmap(dt, label = var) #' #' # All basemap arguments work in qmap() -#' #' dt <- data.frame(lon = c(-80, -80, -50, -50), lat = c(65, 80, 80, 65)) -#' #' qmap(dt, rotate = TRUE) #' } -#' } #' @export -# data = dt; x = NULL; y = NULL; geom = "point"; limits = NULL; bathymetry = FALSE; glaciers = FALSE; rotate = TRUE; legends = TRUE; legend.position = "right"; lon.interval = NULL; lat.interval = NULL; bathy.style = "poly_blues"; bathy.border.col = NA; bathy.size = 0.1; land.col = "grey60"; land.border.col = "black"; land.size = 0.1; gla.col = "grey95"; gla.border.col = "black"; gla.size = 0.1; grid.col = "grey70"; grid.size = 0.1; base_size = 11; projection.grid = FALSE; verbose = FALSE -qmap <- function(data, x = NULL, y = NULL, geom = "point", limits = NULL, shapefiles = NULL, bathymetry = FALSE, glaciers = FALSE, rotate = FALSE, legends = TRUE, legend.position = "right", lon.interval = NULL, lat.interval = NULL, bathy.style = "poly_blues", bathy.border.col = NA, bathy.size = 0.1, land.col = "grey60", land.border.col = "black", land.size = 0.1, gla.col = "grey95", gla.border.col = "black", gla.size = 0.1, grid.col = "grey70", grid.size = 0.1, base_size = 11, projection.grid = FALSE, expand.factor = 1.1, verbose = FALSE, ...) 
{ +# x = NULL; y = NULL; geom = "point"; limits = NULL; shapefiles = NULL; crs = NULL; bathymetry = FALSE; glaciers = FALSE; rotate = FALSE; legends = TRUE; legend.position = "right"; lon.interval = NULL; lat.interval = NULL; bathy.style = ifelse(!is.null(getOption("ggOceanMaps.bathy.style")), getOption("ggOceanMaps.bathy.style"), "raster_binned_blues"); bathy.border.col = NA; bathy.size = 0.1; land.col = "grey60"; land.border.col = "black"; land.size = 0.1; gla.col = "grey95"; gla.border.col = "black"; gla.size = 0.1; grid.col = "grey70"; grid.size = 0.1; base_size = 11; projection.grid = FALSE; expand.factor = 1.1; verbose = FALSE +qmap <- function(data, ..., x = NULL, y = NULL, geom = "point", limits = NULL, shapefiles = NULL, crs = NULL, bathymetry = FALSE, glaciers = FALSE, rotate = FALSE, legends = TRUE, legend.position = "right", lon.interval = NULL, lat.interval = NULL, bathy.style = NULL, bathy.border.col = NA, bathy.size = 0.1, land.col = "grey60", land.border.col = "black", land.size = 0.1, gla.col = "grey95", gla.border.col = "black", gla.size = 0.1, grid.col = "grey70", grid.size = 0.1, base_size = 11, projection.grid = FALSE, expand.factor = 1.1, verbose = FALSE) { ## Coordinate columns - if(!"sf" %in% class(data) & (is.null(x) | is.null(y))) { + if(!inherits(data, "sf") & (is.null(x) | is.null(y))) { coordCols <- guess_coordinate_columns(data) if(is.null(x)) { @@ -52,12 +47,16 @@ qmap <- function(data, x = NULL, y = NULL, geom = "point", limits = NULL, shapef } } + if(inherits(data, "sf")) { + if(is.na(sf::st_crs(data))) stop("data does not have a coordinate reference system.
Use sf::st_set_crs() to define it.") + } + ## Base map pb <- basemap( limits = limits, data = if("sf" %in% class(data)) {data} else {data[c(x, y)]}, - shapefiles = shapefiles, + shapefiles = shapefiles, crs = crs, bathymetry = bathymetry, glaciers = glaciers, rotate = rotate, legends = legends, legend.position = legend.position, lon.interval = lon.interval, lat.interval = lat.interval, @@ -70,26 +69,96 @@ qmap <- function(data, x = NULL, y = NULL, geom = "point", limits = NULL, shapef expand.factor = expand.factor, verbose = verbose ) + ## Transform data + + if(!inherits(data, "sf")) { + if(!is.null(crs)) { + data <- transform_coord(data, lon = x, lat = y, proj.out = sf::st_crs(crs), bind = TRUE) + } else if(rotate) { + data <- transform_coord(data, lon = x, lat = y, rotate = TRUE, bind = TRUE) + } else { + data <- transform_coord(data, lon = x, lat = y, proj.out = attributes(pb)$crs, bind = TRUE) + } + + x_proj <- "lon.proj" + y_proj <- "lat.proj" + } + ## Geoms - # geom_arguments <- list( - # ..., - # color = "red", - # shape = 21 - # ) - # - # geom_arguments <- geom_arguments[!duplicated(names(geom_arguments))] + mf <- match.call() if("sf" %in% class(data)) { - pb + geom_sf(data = data, aes(...)) + # pb + geom_sf(data = data, aes(...), color = "red") + + if(any(grepl("colour|color", names(mf)))) { + pb + ggplot2::geom_sf(data = data, ggplot2::aes(...)) + } else { + pb + ggplot2::geom_sf(data = data, ggplot2::aes(...), color = "red") + } + } else if(geom == "point" && !methods::hasArg(label)) { - pb + ggspatial::geom_spatial_point(data = data, aes(x = get(x), y = get(y), ...), crs = 4326) - } else if(geom == "text" | methods::hasArg("label")) { - pb + ggspatial::geom_spatial_text(data = data, aes(x = get(x), y = get(y), ...), crs = 4326) + + if(any(grepl("colour|color", names(mf)))) { + pb + + ggplot2::geom_point( + data = data, + ggplot2::aes(x = .data[[x_proj]], y = .data[[y_proj]], ...)) + } else { + pb + + ggplot2::geom_point( + data = data, + ggplot2::aes(x 
= .data[[x_proj]], y = .data[[y_proj]], ...), color = "red") + } + + # if(any(grepl("colour|color", names(mf))) & any(grepl("shape", names(mf)))) { + # pb + + # ggplot2::geom_point( + # data = data, + # ggplot2::aes(x = .data[[x_proj]], y = .data[[y_proj]], ...)) + # } else if(any(grepl("colour|color", names(mf)))) { + # pb + + # ggplot2::geom_point( + # data = data, + # ggplot2::aes(x = .data[[x_proj]], y = .data[[y_proj]], + # shape = I(21), ...)) + # } else if(any(grepl("shape", names(mf)))) { + # pb + + # ggplot2::geom_point( + # data = data, + # ggplot2::aes(x = .data[[x_proj]], y = .data[[y_proj]], + # color = I("red"), ...)) + # } else { + # pb + + # ggplot2::geom_point( + # data = data, + # ggplot2::aes(x = .data[[x_proj]], y = .data[[y_proj]], + # color = I("red"), shape = I(21), ...)) + # } + + # geom_arguments <- list(...) + # + # geom_arguments <- list( + # ..., + # x = ggplot2::sym(x_proj), + # y = ggplot2::sym(y_proj), + # color = I("red"), + # shape = I(21) + # ) + # + # geom_arguments <- geom_arguments[!duplicated(names(geom_arguments))] + # + # pb + ggplot2::geom_point(data = data, do.call("aes", geom_arguments)) + + # pb + ggspatial::geom_spatial_point(data = data, aes(x = get(x), y = get(y), ...), crs = 4326) } else if(geom == "label") { - pb + ggspatial::geom_spatial_label(data = data, aes(x = get(x), y = get(y), ...), crs = 4326) + pb + ggplot2::geom_label(data = data, ggplot2::aes(x = get(x_proj), y = get(y_proj), ...)) + # pb + ggspatial::geom_spatial_label(data = data, aes(x = get(x), y = get(y), ...), crs = 4326) + } else if(geom == "text" | methods::hasArg("label")) { + pb + ggplot2::geom_text(data = data, ggplot2::aes(x = get(x_proj), y = get(y_proj), ...)) + #pb + ggspatial::geom_spatial_text(data = data, aes(x = get(x), y = get(y), ...), crs = 4326) } else { - stop("Other geom than point have not been implemented yet.") + stop("Other geom than point, label and text have not been implemented yet.") } } diff --git a/R/raster_bathymetry.R 
b/R/raster_bathymetry.R index 944fcde..b3963d2 100644 --- a/R/raster_bathymetry.R +++ b/R/raster_bathymetry.R @@ -1,43 +1,50 @@ #' @title Simplify a bathymetry raster ready for vectorization #' @description Simplifies bathymetry raster ready for the \code{\link{vector_bathymetry}} function. Warning: processing may take a long time if the bathymetry raster is large. -#' @param bathy A \link[raster]{raster} object or a string giving the path to a bathymetry NetCDF or grd file -#' @param depths Numeric vector giving the cut points for depth contours (see \code{\link[base]{cut}}). -#' @param proj.out A character string specifying the PROJ6 projection argument for the output. See \code{\link[sf]{st_crs}} and \href{https://proj.org/}{proj.org}. If \code{NULL}, the projection is retrieved from \code{bathy}. If \code{proj.out == proj.bathy}, the output will not be reprojected. -#' @param proj.bathy A character string specifying the \code{\link[sf:st_crs]{CRS}} projection arguments for the input (\code{bathy}). Only required if \code{bathy} lacks CRS information. If missing, \code{"EPSG:4326"} is assumed. -#' @param boundary A \link[sf]{st_polygon} object, text string defining the file path to a spatial polygon, or a numeric vector of length 4 giving the boundaries for which \code{bathy} should be cut to. Should be given as \strong{decimal degrees}. If numeric vector, the first element defines the minimum longitude, the second element the maximum longitude, the third element the minimum latitude and the fourth element the maximum latitude of the bounding box. Use \code{NULL} not to cut \code{bathy}. -#' @param file.name A character string specifying the file path \strong{without extension} where the output should be saved. If \code{NULL} a temporary file will be used. See \code{\link[raster]{writeRaster}}. -#' @param aggregation.factor An integer defining the \code{fact} argument from the \code{\link[raster]{aggregate}} function. Set to \code{NA} to ignore aggregation. 
-#' @param verbose Logical indicating whether information about guessed projection should be returned as message. Set to \code{FALSE} to make the function silent. -#' @details You can use \href{https://www.gebco.net/data_and_products/gridded_bathymetry_data/}{GEBCO}, \href{https://www.gebco.net/data_and_products/gridded_bathymetry_data/arctic_ocean/}{IBCAO}, \href{https://www.ngdc.noaa.gov/mgg/global/}{ETOPO1} bathymetry grids downloaded from respective sources as the \code{bathy} argument. The bathymetry grids read from files must be in NetCDF/grd format. Alternatively use the \code{marmap::getNOAA.bathy} function to download ETOPO1 bathymetry and convert it to a raster object using the \code{marmap::as.raster} function. +#' @param bathy A \link[stars:read_stars]{stars} object or a string giving the path to a bathymetry NetCDF or grd file +#' @param depths Numeric vector giving the cut points for depth contours (see \code{\link[base]{cut}}). If \code{NULL}, no depth aggregation will be made. This option is suitable for raster bathymetries passed directly to \code{basemap}. +#' @param proj.out A character string specifying the \link[sf:st_crs]{coordinate reference system} (CRS) argument for the output. See \code{\link[sf]{st_crs}} and \href{https://proj.org/}{proj.org}. If \code{NULL}, the projection is retrieved from \code{bathy} and the output will not be reprojected, saving processing time (since \code{proj.out} and \code{proj.bathy} would match). +#' @param proj.bathy A character string specifying the \code{\link[sf:st_crs]{CRS}} for the input (\code{bathy}). Only required if \code{bathy} lacks CRS information. If \code{NULL}, \code{"EPSG:4326"} is assumed. +#' @param boundary A \link[sf]{st_polygon} object, text string defining the file path to a spatial polygon, \link[sf:st_bbox]{bounding box}, or a numeric vector of length 4 giving the boundaries to which \code{bathy} should be cut. Should be given as \strong{decimal degrees}.
If unnamed numeric vector, the first element defines the minimum longitude, the second element the maximum longitude, the third element the minimum latitude and the fourth element the maximum latitude of the bounding box. You can also use the sf bounding box format as a named vector. Use \code{NULL} not to cut \code{bathy}. +#' @param warp Logical indicating whether the resulting grid should be resampled to a new CRS if \code{proj.out} != \code{proj.bathy} using the \code{\link[stars]{st_warp}} function. A time-consuming operation, but necessary when the CRS changes in raster bathymetries. Not required if the next step is to vectorise the bathymetry. +#' @param estimate.land Logical indicating whether to include land in the output. Can be used in the following \code{\link{vector_bathymetry}} step to estimate land polygons. +#' @param downsample An integer defining how many rows in \code{bathy} should be skipped to reduce the size (and resolution). 1 skips every second row, 2 every second and third. See \code{\link[stars]{st_downsample}}. Set to \code{NULL} (default) to skip downsampling. +#' @param verbose Logical indicating whether information about progress and guessed projection should be returned. Set to \code{FALSE} to make the function silent. +#' @details You can use \href{https://www.gebco.net/data_and_products/gridded_bathymetry_data/}{GEBCO}, \href{https://www.gebco.net/data_and_products/gridded_bathymetry_data/arctic_ocean/}{IBCAO}, \href{https://www.ncei.noaa.gov/products/etopo-global-relief-model}{ETOPO} bathymetry grids downloaded from respective sources as the \code{bathy} argument. The bathymetry grids read from files can be in any format read by \code{\link[stars]{read_stars}}. Alternatively use the \code{marmap::getNOAA.bathy} function to download ETOPO1 bathymetry and convert it to a raster object using the \code{marmap::as.raster} function.
#' -#' Note that the size of the output is heavily influenced by the number of depth contours (\code{depths}) as well as the resolution of \code{bathy} and choice of \code{aggregation.factor}. To make the \code{\link{vector_bathymetry}} function and consequent plotting faster, limiting the details of the bathymetry raster may be desirable. -#' @return A list with a \link[raster]{raster} object the containing projected bathymetry defined by the \code{proj.out} argument and a data frame of depth intervals. +#' Note that the size of the output is heavily influenced by the number of depth contours (\code{depths}) as well as the resolution of \code{bathy} and choice of \code{downsample}. To make the \code{\link{vector_bathymetry}} function and consequent plotting faster, limiting the details of the bathymetry raster may be desirable. +#' @return A list with a \link[stars:read_stars]{stars} object containing the projected bathymetry defined by the \code{proj.out} argument and a data frame of depth intervals. #' @references GEBCO Compilation Group (2019) GEBCO 2019 15-arcsecond grid (doi:10.5285/836f016a-33be-6ddc-e053-6c86abc0788e). URL: \url{https://www.gebco.net/data_and_products/gridded_bathymetry_data/gebco_2019/gebco_2019_info.html}. -#' ETOPO1 1 Arc-Minute Global Relief Model. URL: \url{https://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/docs/ETOPO1.pdf}. +#' NOAA National Centers for Environmental Information. 2022: ETOPO 2022 15 Arc-Second Global Relief Model. NOAA National Centers for Environmental Information. \doi{10.25921/fd45-gt74}.
#' @author Mikko Vihtakari #' @family create shapefiles #' @export -# bathy = file.path(etopoPath, "ETOPO1_Ice_g_gmt4.grd"); depths = c(50, 300, 500, 1000, 1500, 2000, 4000, 6000, 10000); proj.out = convert_crs("3996"); proj.bathy = convert_crs("3996"), file.name = NULL; boundary = c(-180.0083, 180.0083, -90, 90); aggregation.factor = 6 +# bathy = "/Users/a22357/Downloads/ETOPO_2022_v1_60s_N90W180_surface.nc" +# depths = c(0, 50, 300, 500, 1000, 1500, 2000, 4000, 6000, 10000); proj.out = 4326; file.name = NULL; boundary = c(-180, 180, -90, 90); aggregation.factor = 6 +# proj.out = shapefile_list("Barents")$crs +# bathy = file.path(etopoPath, "ETOPO1_Ice_g_gmt4.grd"); depths = c(0, 50, 300, 500, 1000, 1500, 2000, 4000, 6000, 10000); proj.out = convert_crs("3996"); proj.bathy = convert_crs("3996"), file.name = NULL; boundary = c(-180.0083, 180.0083, -90, 90); aggregation.factor = 6 # bathy = file.path(etopoPath, "ETOPO1_Ice_g_gmt4.grd"); depths = c(50, 300, 500, 1000, 1500, 2000, 4000, 6000, 10000); proj.out = arcticCRS; boundary = c(-180.0083, 180.0083, 30, 90); aggregation.factor = 2; file.name = NULL; verbose = TRUE -raster_bathymetry <- function(bathy, depths, proj.out = NULL, proj.bathy, boundary = NULL, file.name = NULL, aggregation.factor = NA, verbose = TRUE) { +# proj.out = NULL; proj.bathy = NULL; boundary = NULL; estimate.land = FALSE; downsample = NULL; verbose = FALSE +raster_bathymetry <- function(bathy, depths, proj.out = NULL, proj.bathy = NULL, boundary = NULL, warp = FALSE, estimate.land = FALSE, downsample = NULL, verbose = TRUE) { # Progress bar #### - pb <- utils::txtProgressBar(min = 0, max = 6, initial = 0, style = 3) + if(verbose) pb <- utils::txtProgressBar(min = 0, max = 8, initial = 0, style = 3) ## General checks #### ### Bathy argument - if(!inherits(bathy, "RasterLayer")) { + if(!inherits(bathy, c("RasterLayer", "stars", "stars_proxy"))) { if(!file.exists(bathy)) stop("Bathy raster file not found. 
Check the path in the bathy argument.") } ### The depths argument - if(!(is.vector(depths) & inherits(depths, c("numeric", "integer")))) { - stop("The depths parameter has to be a numeric or integer vector.") + if(!is.null(depths)) { + if(!(is.vector(depths) & inherits(depths, c("numeric", "integer")))) { + stop("The depths parameter has to be a numeric or integer vector.") + } } ### The boundary argument @@ -45,7 +52,11 @@ raster_bathymetry <- function(bathy, depths, proj.out = NULL, proj.bathy, bounda if(!is.null(boundary)) { #### If spatialpolygons - if(any(grepl("spatialpolygons|sf", class(boundary), ignore.case = TRUE))) { + if(any(grepl("spatialpolygons|sf|sfc", class(boundary), ignore.case = TRUE))) { + + if(any(grepl("spatialpolygons", class(boundary), ignore.case = TRUE))) { + boundary <- sf::st_as_sf(boundary) + } if(is.na(sf::st_crs(boundary))) { stop("boundary does not contain argument.") @@ -57,7 +68,7 @@ raster_bathymetry <- function(bathy, depths, proj.out = NULL, proj.bathy, bounda } else if(is.character(boundary) & length(boundary) == 1) { if(!file.exists(boundary)) stop("Boundary shapefile not found. 
Check your path") - boundary <- sf::st_read(boundary, quiet = TRUE) + boundary <- sf::st_read(boundary, quiet = !verbose) if(is.na(sf::st_crs(boundary))) { stop("boundary does not have a defined CRS.") @@ -65,95 +76,187 @@ raster_bathymetry <- function(bathy, depths, proj.out = NULL, proj.bathy, bounda stop("boundary has to be defined as decimal degrees") } - } else if(!(is.vector(boundary) & class(boundary) %in% c("numeric", "integer") & length(boundary) == 4)) { + #### If numeric vector + } else if(!(is.vector(boundary) & inherits(boundary, c("numeric", "integer")) & + length(boundary) == 4)) { stop("The boundary parameter has to be a numeric/integer vector of length 4 giving the decimal degree longitude and latitude limits for the strata region OR a character argument giving the location of the shapefile polygon.") + } else if(length(boundary) == 4 & inherits(boundary, c("numeric", "integer"))) { + if(is.null(names(boundary))) names(boundary) <- c("xmin", "xmax", "ymin", "ymax") + + boundary <- dd_clip_boundary(boundary, 4326) + + if(!is.null(proj.out)) { + boundary <- sf::st_transform(sf::st_as_sfc(sf::st_bbox(sf::st_transform(boundary, sf::st_crs(proj.out)))), 4326) + } } } - utils::setTxtProgressBar(pb, 1) + if(verbose) utils::setTxtProgressBar(pb, 1) ## Open raster ### - if(inherits(bathy, "RasterLayer")) { + if(inherits(bathy, "stars")) { ras <- bathy } else { - if(verbose) { - ras <- raster::raster(bathy) - } else { - ras <- suppressMessages(raster::raster(bathy)) - } + ras <- stars::read_stars(bathy, quiet = !verbose) + # ras <- terra::rast(bathy) + # ras <- raster::raster(bathy) } - utils::setTxtProgressBar(pb, 2) + if(verbose) utils::setTxtProgressBar(pb, 2) - if(missing(proj.bathy)) { - proj.bathy <- convert_crs(4326) + if(is.null(proj.bathy)) { + if(is.na(sf::st_crs(ras))) { + proj.bathy <- 4326 + } else { + proj.bathy <- sf::st_crs(ras) + } } if(is.na(sf::st_crs(ras))) { if(verbose) message(paste0("bathy does not contain coordinate reference 
information. Using ", proj.bathy, ". Adjust this setting by changing the proj.bathy argument")) - raster::crs(ras) <- raster::crs(proj.bathy) + ras <- sf::st_set_crs(ras, proj.bathy) } if(!is.null(boundary)) { - ras <- raster::crop(ras, raster::extent(boundary)) + if(sf::st_crs(ras) != sf::st_crs(boundary)) { + boundary <- sf::st_bbox( + sf::st_transform( + sf::st_as_sfc( + sf::st_bbox(boundary) + ), + sf::st_crs(ras) + ) + ) + } - if(grepl("spatialpolygons|sf", class(boundary), ignore.case = TRUE)) { - ras <- raster::mask(ras, boundary) + if(inherits(boundary, c("sf", "sfc", "bbox"))) { + ras <- ras[sf::st_bbox(boundary)] + # ras <- raster::crop(ras, raster::extent(boundary)) + # ras <- terra::crop(ras, boundary) + } else { + stop("boundary must be an sf object at this stage") } } - utils::setTxtProgressBar(pb, 3) + if(verbose) utils::setTxtProgressBar(pb, 3) ## Set proj.out (if not set) - if(is.null(proj.out) & !is.na(raster::crs(ras))) { - proj.out <- raster::crs(ras) + if(is.null(proj.out) & !is.na(sf::st_crs(ras))) { + proj.out <- sf::st_crs(ras) } - ## Aggregate (reduce size) #### + ## Downsample (reduce size) #### - if(!is.na(aggregation.factor)) ras <- raster::aggregate(ras, fact = aggregation.factor) + if(!is.null(downsample)) { + if(verbose) { + ras <- stars::st_downsample(ras, n = downsample) + } else { + ras <- suppressMessages(stars::st_downsample(ras, n = downsample)) + } + } - utils::setTxtProgressBar(pb, 4) + if(verbose) utils::setTxtProgressBar(pb, 4) ## Project the raster #### if(sf::st_crs(ras) != sf::st_crs(proj.out)) { - ras <- raster::projectRaster(from = ras, crs = raster::crs(proj.out)) + + ras <- sf::st_transform(ras, sf::st_crs(proj.out)) + + # ras <- sf::st_transform(ras, sf::st_crs(proj.out)) + # ras <- raster::projectRaster(from = ras, crs = raster::crs(proj.out)) + # ras <- terra::project(ras, sf::st_crs(proj.out)$input) + } else { + warp <- FALSE } - utils::setTxtProgressBar(pb, 5) + if(verbose) utils::setTxtProgressBar(pb, 5) - ## 
Reclassify raster #### + ## Read to memory if not done already - if(all(depths >= 0)) depths <- sort(-1 * depths) + if(inherits(ras, "stars_proxy")){ + ras <- stars::st_as_stars(ras) + } + + if(verbose) utils::setTxtProgressBar(pb, 6) - depths <- sort(unique(c(-Inf, depths, 0, Inf))) + ## Reclassify raster #### - cut_int <- paste(abs(depths[-1]), abs(depths[-length(depths)]), sep = "-") + if(!is.null(depths)) { + ## Raster for vector_bathymetry + + if(all(depths >= 0)) depths <- sort(-1 * depths) + + depths <- sort(unique(c(-Inf, depths, 0, Inf))) + + cut_int <- paste(abs(depths[-1]), abs(depths[-length(depths)]), sep = "-") + + cut_df <- data.frame( + from = depths[-length(depths)], + to = depths[-1], + average = sapply(strsplit(cut_int, "-"), function(k) mean(as.numeric(k))), + interval = cut_int, + stringsAsFactors = FALSE + ) + + ## Remove land + + if(estimate.land) { + cut_df$interval[grepl("Inf-0", cut_df$interval)] <- "land" + } else { + cut_df$interval[grepl("Inf-0", cut_df$interval)] <- NA + # bathy$raster <- stars::st_as_stars(bathy$raster) + # levels(bathy$raster[[1]])[levels(bathy$raster[[1]]) == "land"] <- NA + } + + r <- cut(ras, c(cut_df$from, Inf), labels = cut_df$interval) + + r[[1]] <- factor(r[[1]], levels = rev(levels(r[[1]]))) # reverse the order + + } else { + ## Raster for basemap + + if(inherits(ras[[1]], "units")) { + r <- units::drop_units(ras) + } else { + r <- ras + } + + # Remove land and turn depths to positive values + r[[1]][r[[1]] > 0] <- NA + r[[1]] <- -1*r[[1]] + + # Depth intervals + + cut_df <- range(r[[1]], na.rm = TRUE) + } - cut_df <- data.frame( - from = depths[-length(depths)], - to = depths[-1], - average = sapply(strsplit(cut_int, "-"), function(k) mean(as.numeric(k))), - interval = cut_int, - stringsAsFactors = FALSE - ) + ## Warp to new grid - cut_df[nrow(cut_df), "average"] <- NA + if(verbose) utils::setTxtProgressBar(pb, 7) - cut_matrix <- as.matrix(cut_df[-ncol(cut_df)]) + if(warp) { + + newgrid <- + 
stars::st_as_stars( + sf::st_bbox( + sf::st_transform( + sf::st_as_sfc( + sf::st_bbox(boundary) + ), + sf::st_crs(proj.out) + ) + ) + ) + + r <- stars::st_warp(r, newgrid) + } - r <- raster::reclassify( - ras, - rcl = cut_matrix, - right = NA, - filename = ifelse(is.null(file.name), '', paste0(file.name, ".grd")) - ) - utils::setTxtProgressBar(pb, 6) + if(verbose) utils::setTxtProgressBar(pb, 8) ## Return diff --git a/R/reorder_layers.R b/R/reorder_layers.R index 77e654f..45c2242 100644 --- a/R/reorder_layers.R +++ b/R/reorder_layers.R @@ -7,11 +7,11 @@ #' @import ggplot2 #' @author Mikko Vihtakari #' @examples -#' if(requireNamespace("ggOceanMapsData", quietly = TRUE)) { +#' if(requireNamespace("ggspatial", quietly = TRUE)) { #' \donttest{ #' data("ices_areas") #' p <- basemap(c(-20, 15, 50, 70)) + -#' annotation_spatial(ices_areas, aes(fill = Area_Full), show.legend = FALSE) +#' ggspatial::annotation_spatial(ices_areas, aes(fill = Area_Full), show.legend = FALSE) #' #' # Polygons on top of land #' p diff --git a/R/shapefile_list.R b/R/shapefile_list.R index fc5b081..4aea479 100644 --- a/R/shapefile_list.R +++ b/R/shapefile_list.R @@ -4,12 +4,12 @@ #' @param get.data Logical indicating whether spatial data should be returned instead of names of spatial data objects. #' @details Custom shapefiles for \code{\link{basemap}} should be defined as lists with (at least) following names (everything should be provided as characters): #' \itemize{ -#' \item \strong{land} Object name of the \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrame}} containing land. Required. -#' \item \strong{glacier} Object name of the \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrame}} containing glaciers. Use \code{NULL} if glaciers are not needed. -#' \item \strong{bathy} Object name of the \code{\link[sp:SpatialPolygons]{SpatialPolygonsDataFrame}} containing bathymetry contours. Use \code{NULL} if bathymetry is not needed. 
+#' \item \strong{land} Name of the object containing land polygons. Required. +#' \item \strong{glacier} Name of the object containing glacier polygons. Use \code{NULL} if glaciers are not needed. +#' \item \strong{bathy} Name of the object containing bathymetry. Use \code{NULL} if bathymetry is not needed. #' } #' -#' All linked spatial data objects must be in same projection. Pre-made shapefiles contain additional elements that are used in the \code{\link{basemap}} function, but not required for custom shapefile datasets. +#' All linked spatial data objects must be in the same projection. High-resolution pre-made data are still under development and may not be available. Pre-made shapefiles contain additional elements that are used in the \code{\link{basemap}} function, but not required for custom shapefile datasets. #' #' @return Returns a data frame of provided pre-made shapefiles, if \code{name = "all"}. Returns a shapefile list containing the information for a particular map otherwise.
#' @family basemap functions @@ -26,61 +26,105 @@ shapefile_list <- function(name, get.data = FALSE) { alternatives <- list( list(name = "ArcticStereographic", - land = "ggOceanMapsData::arctic_land", - glacier = "ggOceanMapsData::arctic_glacier", - bathy = "ggOceanMapsData::arctic_bathy", + land = file.path(options("ggOceanMaps.datapath"), "arctic_land"), + glacier = file.path(options("ggOceanMaps.datapath"), "arctic_glacier"), + bathy = c( + "raster_binned" = "dd_rbathy", + "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "dd_rbathy_cont"), + "raster_user" = getOption("ggOceanMaps.userpath"), + "vector" = file.path(options("ggOceanMaps.datapath"), "arctic_bathy")), crs = 3995, limits = 30, - path = NA), + path = c("ggOceanMapsLargeData" = + "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/") + ), list(name = "AntarcticStereographic", - land = "ggOceanMapsData::antarctic_land", - glacier = "ggOceanMapsData::antarctic_glacier", - bathy = "ggOceanMapsData::antarctic_bathy", + land = file.path(options("ggOceanMaps.datapath"), "antarctic_land"), + glacier = file.path(options("ggOceanMaps.datapath"), "antarctic_glacier"), + bathy = c( + "raster_binned" = "dd_rbathy", + "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "dd_rbathy_cont"), + "raster_user" = getOption("ggOceanMaps.userpath"), + "vector" = file.path(options("ggOceanMaps.datapath"), "antarctic_bathy")), crs = 3031, limits = -35, - path = NA), + path = c("ggOceanMapsLargeData" = + "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/") + ), list(name = "DecimalDegree", - land = "ggOceanMapsData::dd_land", - glacier = "ggOceanMapsData::dd_glacier", - bathy = "ggOceanMapsData::dd_bathy", + land = "dd_land", + glacier = file.path(options("ggOceanMaps.datapath"), "dd_glacier"), + bathy = c( + "raster_binned" = "dd_rbathy", + "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "dd_rbathy_cont"), + "raster_user" = 
getOption("ggOceanMaps.userpath"), + "vector" = file.path(options("ggOceanMaps.datapath"), "dd_bathy")), crs = 4326, limits = c(-180, 180, -90, 90), - path = NA), + path = c("ggOceanMapsLargeData" = + "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/") + ), list(name = "Svalbard", - land = "svalbard_land", - glacier = "svalbard_glacier", - bathy = "svalbard_bathy", + land = file.path(options("ggOceanMaps.datapath"), "svalbard_land"), + glacier = file.path(options("ggOceanMaps.datapath"), "svalbard_glacier"), + bathy = c( + "raster_binned" = "dd_rbathy", + "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "dd_rbathy_cont"), + "raster_user" = getOption("ggOceanMaps.userpath"), + "vector" = file.path(options("ggOceanMaps.datapath"), "svalbard_bathy")), crs = 32633, limits = c(402204.7, 845943.9, 8253526.1, 8978517.5), - path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/svalbard.rda"), + path = c("ggOceanMapsLargeData" = + "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/") + ), list(name = "BarentsSea", - land = "barentssea_land", - glacier = "barentssea_glacier", - bathy = "barentssea_bathy", + land = file.path(options("ggOceanMaps.datapath"), "barentssea_land"), + glacier = file.path(options("ggOceanMaps.datapath"), "barentssea_glacier"), + bathy = c( + "raster_binned" = "barentssea_rbathy", + "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "dd_rbathy_cont"), + "raster_user" = getOption("ggOceanMaps.userpath"), + "vector" = file.path(options("ggOceanMaps.datapath"), "barentssea_bathy")), crs = 32636, limits = c(-400000, 1300000, 7400000, 9350000), - path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/barentssea.rda"), - list(name = "IBCAO", - land = "ggOceanMapsData::arctic_land", - glacier = "ggOceanMapsData::arctic_glacier", - bathy = "ibcao_bathy", - crs = 3995, - limits = c(-2879810, 2700490, -2879810, 2879810), - path = 
"https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/ibcao_bathy.rda"), - list(name = "GEBCO", - land = "ggOceanMapsData::arctic_land", - glacier = "ggOceanMapsData::arctic_glacier", - bathy = "gebco_bathy", - crs = 3995, - limits = 30, - path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/gebco_bathy.rda"), - list(name = "EMODnet", - land = "emodnet_land", - glacier = NA, - bathy = "emodnet_bathy", - crs = 3575, - limits = c(-3e5, -1e5, -3.1e6, -2.9e6), - path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/emodnet.rda") + path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/barentssea.rda" + )#, + # list(name = "IBCAO", + # land = file.path(options("ggOceanMaps.datapath"), "ibcao_land"), + # glacier = file.path(options("ggOceanMaps.datapath"), "ibcao_glacier"), + # bathy = c( + # "raster_binned" = "ibcao_rbathy", + # "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "ibcao_rbathy_cont"), + # "raster_user" = getOption("ggOceanMaps.userpath"), + # "vector" = file.path(options("ggOceanMaps.datapath"), "ibcao_bathy")), + # crs = 3995, + # limits = c(-2879810, 2700490, -2879810, 2879810), + # path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/ibcao_bathy.rda" + # ), + # list(name = "GEBCO", + # land = file.path(options("ggOceanMaps.datapath"), "gebco_land"), + # glacier = file.path(options("ggOceanMaps.datapath"), "gebco_glacier"), + # bathy = c( + # "raster_binned" = "ibcao_rbathy", + # "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "gebco_rbathy_cont"), + # "raster_user" = getOption("ggOceanMaps.userpath"), + # "vector" = file.path(options("ggOceanMaps.datapath"), "gebco_bathy")), + # crs = 3995, + # limits = 30, + # path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/gebco_bathy.rda" + # ), + # list(name = "EMODnet", + # land = file.path(options("ggOceanMaps.datapath"), 
"emodnet_land"), + # glacier = NA, + # bathy = c( + # "raster_binned" = "emodnet_rbathy", + # "raster_continuous" = file.path(options("ggOceanMaps.datapath"), "emodnet_rbathy_cont"), + # "raster_user" = getOption("ggOceanMaps.userpath"), + # "vector" = file.path(options("ggOceanMaps.datapath"), "emodnet_bathy")), + # crs = 3575, + # limits = c(-3e5, -1e5, -3.1e6, -2.9e6), + # path = "https://github.com/MikkoVihtakari/ggOceanMapsLargeData/raw/master/data/emodnet.rda" + # ) ) names(alternatives) <- sapply(alternatives, function(k) k$name) @@ -91,6 +135,7 @@ shapefile_list <- function(name, get.data = FALSE) { out <- do.call(rbind, lapply(alternatives, function(k) { + k$bathy <- paste(k$bathy, collapse = "|") k$limits <- paste0("c(",paste(k$limits, collapse = ", "), ")") data.frame(k, stringsAsFactors = FALSE) }) diff --git a/R/shapefiles-docs.R b/R/shapefiles-docs.R new file mode 100644 index 0000000..bb441bc --- /dev/null +++ b/R/shapefiles-docs.R @@ -0,0 +1,29 @@ +#' @title Decimal degree bathymetry +#' @docType data +#' @keywords maps shapefiles +#' @family mapfiles +#' @name dd_rbathy +#' @format \link[stars:read_stars]{Raster} bathymetry in decimal degrees (EPSG:4326). Downsampled from ETOPO 60 arc-second grid. +#' @source NOAA National Centers for Environmental Information. 2022: ETOPO 2022 15 Arc-Second Global Relief Model. NOAA National Centers for Environmental Information. \doi{10.25921/fd45-gt74} +#' @importFrom stars read_stars +"dd_rbathy" + +#' @title Decimal degree land shapes +#' @docType data +#' @keywords maps shapefiles +#' @family mapfiles +#' @name dd_land +#' @format \link[sf:st_sf]{Simple feature collection} land shapes in decimal degrees (EPSG:4326). Obtained from Natural Earth Data (10m vectors). Includes the islands dataset. 
+#' @source \href{https://www.naturalearthdata.com/}{Natural Earth Data} +#' @importFrom sf st_sf +"dd_land" + +# @title Decimal degree glacier shapes +# @docType data +# @keywords maps shapefiles +# @family mapfiles +# @name dd_glacier +# @format \link[sf:st_sf]{Simple feature collection} glacier shapes in decimal degrees (EPSG:4326). Obtained from Natural Earth Data (10m vectors). Includes the ice-sheets dataset. +# @source \href{https://www.naturalearthdata.com/}{Natural Earth Data} +# @importFrom sf st_sf +# "dd_glacier" diff --git a/R/theme_map.R b/R/theme_map.R index 78dd9e6..07a76f1 100644 --- a/R/theme_map.R +++ b/R/theme_map.R @@ -12,7 +12,7 @@ theme_map <- function(..., grid.col, grid.size) { theme_bw(...) %+replace% theme(panel.background = element_blank(), panel.border = element_rect(fill = NA, colour = "black", size = 0.2), - panel.grid = element_line(colour = grid.col, size = grid.size), + panel.grid = element_line(colour = grid.col, linewidth = grid.size), plot.background = element_blank(), axis.ticks.x = element_line(colour = "black", size = 0.2), axis.ticks.y = element_line(colour = "black", size = 0.2), diff --git a/R/transform_coord.R b/R/transform_coord.R index 583f1f0..3b1145a 100644 --- a/R/transform_coord.R +++ b/R/transform_coord.R @@ -1,5 +1,6 @@ #' @title Transform spatial coordinates to another projection #' @description Transforms spatial coordinates from original projection (decimal degrees assumed) to another projection. +#' @inheritParams basemap #' @param x Data frame to be transformed. Can be omitted if numeric vectors are assigned to \code{lon} and \code{lat}. #' @param lon,lat Either a name of the longitude and latitude columns in \code{x} or a numeric vector containing longitude and latitude coordinates. Use \code{NULL} to \link[=guess_coordinate_columns]{guess the longitude and/or latitude columns} in \code{x}. 
#' @param new.names Character vector of length 2 specifying the names of transformed longitude and latitude columns, respectively. Alternatively \code{NULL}, which returns column names from \code{x} or "auto", which uses \code{NULL} if \code{bind = FALSE} and \code{c("lon.proj", "lat.proj")} if \code{bind = TRUE}. @@ -24,10 +25,9 @@ #' @export ## Debug parameters -# x = NULL; lon = NULL; lat = NULL; new.names = "auto"; proj.in = 4326; proj.out = NULL; verbose = FALSE; bind = FALSE; na = "ignore" -# x = data; bind = TRUE; new.names = "auto"; na = "ignore" +# lon = NULL; lat = NULL; new.names = "auto"; rotate = FALSE; proj.in = 4326; proj.out = NULL; verbose = FALSE; bind = FALSE; na = "ignore" -transform_coord <- function(x = NULL, lon = NULL, lat = NULL, new.names = "auto", proj.in = 4326, proj.out = NULL, verbose = FALSE, bind = FALSE, na = "ignore") { +transform_coord <- function(x = NULL, lon = NULL, lat = NULL, new.names = "auto", rotate = FALSE, proj.in = 4326, proj.out = NULL, verbose = FALSE, bind = FALSE, na = "ignore") { # Checks ---- @@ -59,10 +59,8 @@ transform_coord <- function(x = NULL, lon = NULL, lat = NULL, new.names = "auto" if(is.null(proj.in)) { if(is.null(x)) stop("a spatial object as x is required when proj.in = NULL") - if("sf" %in% class(x)) { + if(inherits(x, c("data.frame", "data.table", "sf", "sfc", "SpatialPolygonsDataFrame", "SpatialPolygons", "SpatialPoints", "SpatialPointsDataFrame"))) { proj.in <- sf::st_crs(x) - } else if("sp" %in% class(x)) { - proj.in <- raster::crs(x) } else stop("a spatial object of class sf or sp as x is required when proj.in = NULL") } @@ -134,32 +132,56 @@ transform_coord <- function(x = NULL, lon = NULL, lat = NULL, new.names = "auto" ## Output projection if not defined --- if(is.null(proj.out)) { - limits <- c(range(y[[lon]]), range(y[[lat]])) - shapefile.def <- define_shapefiles(limits) + # limits <- sf::st_bbox(sf::st_as_sf(y, coords = c(lon, lat), crs = proj.in))[c("xmin", "xmax", "ymin", "ymax")] # does 
not work for antimeridian + # limits <- c(range(y[[lon]]), range(y[[lat]])) # can't do this because lon coordinates won't get ordered correctly + # limits <- auto_limits(y)$ddLimits # note: can't do this because auto_limits uses transform_coord() + + shapefile.def <- define_shapefiles(c(range(y[[lon]]), range(y[[lat]]))) proj.out <- sf::st_crs(shapefile_list(shapefile.def$shapefile.name)$crs) + + if(sf::st_is_longlat(proj.in)) { + limits <- c(deg_to_dd(range(dd_to_deg(y[[lon]]))), range(y[[lat]])) + names(limits) <- c("xmin", "xmax", "ymin", "ymax") + } else { + limits <- sf::st_bbox( + sf::st_transform( + sf::st_as_sfc( + sf::st_bbox( + sf::st_as_sf(y, coords = c(lon, lat), crs = proj.in) + ) + ), + 4326) + )[c("xmin", "xmax", "ymin", "ymax")] + } } ## Fix the CRS if(!inherits(proj.in, "crs")) { error_test <- quiet(try(sf::st_crs(proj.in), silent = TRUE)) - + if(inherits(error_test, "try-error")) { stop("Failed to convert the argument proj.in to sf::st_crs object in the transform_coord function. This is likely a bug. If so, please file a bug report on GitHub.") } else { proj.in <- error_test } } - + if(!inherits(proj.out, "crs")) { error_test <- quiet(try(sf::st_crs(proj.out), silent = TRUE)) - + if(inherits(error_test, "try-error")) { stop("Failed to convert the argument proj.out to sf::st_crs object in the transform_coord function. This is likely a bug. 
If so, please file a bug report on GitHub.") } else { proj.out <- error_test } } + + ## Rotate + + if(rotate) { + proj.out <- rotate_crs(proj.out, limits[1:2]) + } ## Coordinate transformation --- @@ -169,23 +191,7 @@ transform_coord <- function(x = NULL, lon = NULL, lat = NULL, new.names = "auto" c(lon, lat)), id = y$id) - # lon = sample(-30:60, 1e2, replace = TRUE); lat = sample(45:80, 1e2, replace = TRUE); y = data.frame(lon, lat); lon = "lon"; lat = "lat" # <- Debugging code - - ## sf version - # y <- sf::st_as_sf(y, coords = c(lon, lat), - # crs = as.integer(gsub("\\D", "", proj.in))) - # - # y <- sf::st_coordinates(sf::st_transform(y, proj.out)) - # - # colnames(y)[colnames(y) == "X"] <- lon - # colnames(y)[colnames(y) == "Y"] <- lat - - ## sp version - # sp::coordinates(y) <- c(lon, lat) - # sp::proj4string(y) <- if(class(proj.in) == "CRS") {proj.in} else {sp::CRS(proj.in)} - # - # y <- sp::spTransform(y, if(class(proj.out) == "CRS") {proj.out} else {sp::CRS(proj.out)}) - # y <- data.frame(sp::coordinates(y)) + # sf::st_transform(sf::st_as_sf(y, coords = c(lon, lat), crs = proj.in), proj.out) ## ---- diff --git a/R/vector_bathymetry.R b/R/vector_bathymetry.R index 173bb16..68c389e 100644 --- a/R/vector_bathymetry.R +++ b/R/vector_bathymetry.R @@ -2,163 +2,122 @@ #' @description Vectorizes bathymetry rasters. Designed to be used for the output of \code{\link{raster_bathymetry}} function. Warning: processing may take a long time if the bathymetry raster is large. #' @param bathy bathyRaster object from the \code{\link{raster_bathymetry}} function. #' @param drop.crumbs Single numeric value specifying a threshold (area in km2) for disconnected polygons which should be removed. Set to \code{NULL} to bypass the removal. Uses the \link[smoothr]{drop_crumbs} function. -#' @param remove.holes Single numeric value specifying a threshold (area in km2) for holes which should be removed. Set to \code{NULL} to bypass the removal. 
Uses the \link[smoothr]{fill_holes} function. +#' @param remove.holes Single numeric value specifying a threshold (area in km2) for holes which should be removed. Set to \code{NULL} to bypass the removal. Uses the \link[smoothr]{fill_holes} function. Currently VERY slow. #' @param smooth Logical indicating whether the pixelated contours should be smoothed. Uses the \link[smoothr]{smooth_ksmooth} function. -#' @param output.sf Logical indicating whether an \code{\link[sf:st_polygon]{sf}} (\code{TRUE}) or \code{sp} (\code{FALSE}) polygon should be returned. #' @details The \code{drop.crumbs} and \code{remove.holes} arguments can be used to make the resulting object smaller in file size. The \code{smooth} argument can be used to remove the pixelated contours, but often increases file size. Note also that using this option will bias the contours with respect to real world. -#' @return An \link[sf:st_polygon]{sf} or \code{sp} object containing the depth polygons. Uses same projection than \code{bathy} (see \code{\link[sf:st_crs]{CRS}}). +#' @return An \link[sf:st_polygon]{sf} object containing the depth polygons. Uses the same projection as \code{bathy} (see \code{\link[sf:st_crs]{CRS}}).
#' @author Mikko Vihtakari #' @family create shapefiles #' @export -# bathy = rb; drop.crumbs = NULL; remove.holes = NULL; smooth = FALSE; output.sf = FALSE -vector_bathymetry <- function(bathy, drop.crumbs = NULL, remove.holes = NULL, smooth = FALSE, output.sf = FALSE) { - +# bathy = rb; drop.crumbs = NULL; remove.holes = NULL; smooth = FALSE +vector_bathymetry <- function(bathy, drop.crumbs = NULL, remove.holes = NULL, smooth = FALSE) { + + ## Turn off s2 + # s2_mode <- sf::sf_use_s2() + # suppressMessages(sf::sf_use_s2(FALSE)) + # on.exit({suppressMessages(sf::sf_use_s2(s2_mode))}) + # Progress bar #### - - pb <- utils::txtProgressBar(min = 0, max = 6, initial = 0, style = 3) - + + pb <- utils::txtProgressBar(min = 0, max = 7, initial = 0, style = 3) + ## General checks #### - + ### Bathy argument - + if(!inherits(bathy, "bathyRaster")) stop("bathy has to be output from the raster_bathymetry function.") if(is.na(sf::st_crs(bathy$raster))) stop("bathy does not contain coordinate reference information") - + if(!inherits(bathy$raster, c("stars", "stars_proxy"))) stop("bathy$raster has to be a stars object") + ### The drop.crumbs argument - + if(!is.null(drop.crumbs)) { if(!(is.vector(drop.crumbs) & class(drop.crumbs) %in% c("numeric", "integer") & length(drop.crumbs) == 1)) { stop("The drop.crumbs parameter has to be a single value.") } } - + utils::setTxtProgressBar(pb, 1) - + ## Polygonization #### - - pol <- sf::st_as_sf(stars::st_as_stars(bathy$raster), as_points = FALSE, merge = TRUE) - + + pol <- sf::st_as_sf(bathy$raster, as_points = FALSE, merge = TRUE) + utils::setTxtProgressBar(pb, 2) - + ### Validate the polygon - - if(!all(sf::st_is_valid(pol))) { - pol <- sf::st_make_valid(pol) - - if(!all(sf::st_is_valid(pol))) stop("The initial geometry validation did not work. 
You are skrewed...or try the buffer +/- trick.") - } - + + # if(!all(sf::st_is_valid(pol))) { + pol <- sf::st_make_valid(pol) + + if(!all(sf::st_is_valid(pol))) stop("The initial geometry validation did not work. You are screwed...or try the buffer +/- trick.") + # } + utils::setTxtProgressBar(pb, 3) - + ## Drop crumbs and holes - + if(!is.null(drop.crumbs)) { pol <- pol[sf::st_area(pol) > units::set_units(drop.crumbs, "km^2", mode = "standard"),] } - + utils::setTxtProgressBar(pb, 4) - + if(!is.null(remove.holes)) { + # This operation is painfully slow. nngeo::st_remove_holes(pol, max_area = remove.holes * 1e6) would + # potentially do the same, but is also slow. pol <- smoothr::fill_holes(pol, units::set_units(remove.holes, "km^2", mode = "standard")) } - + utils::setTxtProgressBar(pb, 5) - + ## Smooth and simplify - + if(smooth) { pol <- smoothr::smooth(pol, method = "ksmooth") - # smoothr::smooth_ksmooth(pol, smoothness = 2, wrap = TRUE, n = 2L) - # pol <- ksmooth_polys(x = pol, k = 2, N = 2L) if(!all(sf::st_is_valid(pol))) { pol <- sf::st_make_valid(pol) } pol <- sf::st_simplify(pol, preserveTopology = TRUE) - # x <- rgeos::gSimplify(pol, tol = 500, topologyPreserve = TRUE) - # pol <- SpatialPolygonsDataFrame(x, pol@data) + } - + utils::setTxtProgressBar(pb, 6) - + ## Manipulate depth data - + names(pol)[1] <- "depth" - tmp <- pol$depth - - # if(!any(grepl("depth", names(tmp), ignore.case = TRUE))) { - # names(tmp)[1] <- "depth" - # } else { - # names(tmp)[grepl("depth", names(tmp), ignore.case = TRUE)] <- "depth" - # } - - tmp <- factor(tmp, levels = 
sort(unique(tmp))) + # + # level_key <- bathy$depth.invervals$interval[c(-nrow(bathy$depth.invervals))] + # names(level_key) <- bathy$depth.invervals$average[c(-nrow(bathy$depth.invervals))] + # level_key <- rev(level_key) + # + # tmp <- dplyr::recode_factor(tmp, !!!level_key) + # + # pol$depth <- tmp + ## Final validation - - # if(!suppressMessages(suppressWarnings(rgeos::gIsValid(pol)))) { - # pol <- suppressMessages(suppressWarnings(rgeos::gBuffer(pol, byid = TRUE, width = 0))) - # - # if(!rgeos::gIsValid(pol)) stop("The final geometry validation did not work. Adjust something.") - # } - - # setTxtProgressBar(pb, 7) - - ## Return - - if(output.sf) { - return(pol) - } else { - return(sf::as_Spatial(pol)) + + if(!all(sf::st_is_valid(pol))) { + pol <- sf::st_make_valid(pol) } + if(!all(sf::st_is_valid(pol))) stop("The final geometry validation did not work. Adjust something.") + + utils::setTxtProgressBar(pb, 7) + + ## Return + + return(pol) + } - -# @title Wrapper to \code{\link[smoothr]{smooth_ksmooth}} SpatialPolygonsDataFrames -# @param x SpatialPolygonsDataFrame -# @param k The \code{smoothness} parameter in \code{\link[smoothr]{smooth_ksmooth}} -# @param N The \code{n} parameter in \code{\link[smoothr]{smooth_ksmooth}} -# @return Smoothed \code{SpatialPolygonsDataFrame} -# @keywords internal -# @import smoothr -# @export - -# ksmooth_polys <- function(x, k, N) { -# -# total <- length(st_geometry(x)) -# # pb <- txtProgressBar(min = 1, max = total, style = 3) -# -# tp <- lapply(1:total, function(i) { -# # setTxtProgressBar(pb, i) -# -# for(j in 1:length(x@polygons[[i]]@Polygons)) { -# x@polygons[[i]]@Polygons[[j]]@coords <- -# st_polygon(stats::na.omit( -# list(smoothr::smooth_ksmooth(st_coordinates(x[1,])[,1:2], -# smoothness = k, wrap = TRUE, n = N) -# ))) -# } -# -# x@polygons[[i]] -# -# }) -# -# out <- sp::SpatialPolygons(tp) -# suppressWarnings(sp::proj4string(out) <- sp::proj4string(x)) -# -# sp::SpatialPolygonsDataFrame(out, x@data) -# } -# diff --git 
a/R/zzz.R b/R/zzz.R index 3b02069..516ca5b 100644 --- a/R/zzz.R +++ b/R/zzz.R @@ -33,7 +33,7 @@ ".ggOceanMapsenv <- new.env(); ", ".ggOceanMapsenv$datapath <- 'YourCustomPath'}", ". You can use usethis::edit_r_profile() to edit the file.", - "'~/Documents/ggOceanMapsLargeData'", + " normalizePath('~/Documents/ggOceanMapsLargeData') ", "would make it in a writable folder on most operating systems.") } else { msg <- paste0("ggOceanMaps: Using ", getOption("ggOceanMaps.datapath"), diff --git a/README.Rmd b/README.Rmd index 03b8e3d..188bfb7 100644 --- a/README.Rmd +++ b/README.Rmd @@ -31,6 +31,8 @@ knitr::opts_chunk$set( +**Breaking news (pun intended): ggOceanMaps goes [sf](https://r-spatial.github.io/sf/)!** Most of ggOceanMaps code has been rewritten. There are plenty of new features in 2.0 (see [this](https://mikkovihtakari.github.io/ggOceanMaps/articles/new-features.html)), but likely also many new bugs. Please [report them here](https://github.com/MikkoVihtakari/ggOceanMaps/issues). + ## Overview The ggOceanMaps package for [R](https://www.r-project.org/) allows plotting data on bathymetric maps using [ggplot2](https://ggplot2.tidyverse.org/reference). The package is designed for ocean sciences and greatly simplifies bathymetric map plotting anywhere around the globe. ggOceanMaps uses openly available geographic data. Citing the particular data sources is advised by the CC-BY licenses whenever maps from the package are published (see the [*Citations and data sources*](#citations-and-data-sources) section). @@ -41,36 +43,23 @@ The ggOceanMaps package has been developed by the [Institute of Marine Research] The package is available on [CRAN](https://CRAN.R-project.org/package=ggOceanMaps) and as a [GitHub version](https://github.com/MikkoVihtakari/ggOceanMaps), which is updated more frequently than the CRAN version. 
-Installation of the CRAN version: +Installation of the GitHub version: ```{r eval = FALSE} -install.packages("ggOceanMaps") +remotes::install_github("MikkoVihtakari/ggOceanMaps") ``` -Due to the package size limitations, ggOceanMaps requires the [ggOceanMapsData](https://github.com/MikkoVihtakari/ggOceanMapsData) package which stores shapefiles used in low-resolution maps. If the data package does not install automatically, you can install it from GitHub using devtools or remotes packages, or a drat repository on Windows and Mac: +Installation of the CRAN version: ```{r eval = FALSE} -remotes::install_github("MikkoVihtakari/ggOceanMapsData") - -# OR: - -install.packages( - "ggOceanMapsData", - repos = c("https://mikkovihtakari.github.io/drat", - "https://cloud.r-project.org") -) +install.packages("ggOceanMaps") ``` -The GitHub version of ggOceanMaps can be installed using the [**devtools**](https://cran.r-project.org/web/packages/devtools/index.html) package. - -```{r eval = FALSE} -remotes::install_github("MikkoVihtakari/ggOceanMapsData") # required by ggOceanMaps -remotes::install_github("MikkoVihtakari/ggOceanMaps") -``` +The new 2.0 version of ggOceanMaps does not require the ggOceanMapsData package any longer. Detailed map data are downloaded when needed from the [ggOceanMapsLargeData](#data-path) repository. ## Usage -**ggOceanMaps** extends on [**ggplot2**](http://ggplot2.tidyverse.org/reference/). The package uses spatial shapefiles, [GIS packages for R](https://cran.r-project.org/web/views/Spatial.html) to manipulate, and the [**ggspatial**](https://cran.r-project.org/web/packages/ggspatial/index.html) package to help to plot these shapefiles. The shapefile plotting is conducted internally in the `basemap` function and uses [ggplot's sf object plotting capabilities](https://ggplot2.tidyverse.org/reference/ggsf.html). 
Maps are plotted using the `basemap()` or `qmap()` functions that work almost similarly to [`ggplot()` as a base](https://ggplot2.tidyverse.org/reference/index.html) for adding further layers to the plot using the `+` operator. The maps generated this way already contain multiple ggplot layers. Consequently, the [`data` argument](https://ggplot2.tidyverse.org/reference/ggplot.html) needs to be explicitly specified inside `geom_*` functions when adding `ggplot2` layers. Depending on the location of the map, the underlying coordinates may be projected. Decimal degree coordinates need to be transformed to the projected coordinates using the `transform_coord`, [ggspatial](https://paleolimbot.github.io/ggspatial/), or [`geom_sf` functions.](https://ggplot2.tidyverse.org/reference/ggsf.html) +**ggOceanMaps** extends on [**ggplot2**](http://ggplot2.tidyverse.org/reference/). The package uses vector ([**sf**](https://r-spatial.github.io/sf/)) and raster ([**stars**](https://r-spatial.github.io/stars/)) spatial data, [geospatial packages for R](https://cran.r-project.org/web/views/Spatial.html) to manipulate these data, and ggplot2 to plot them. The vector and raster plotting is conducted internally in the `basemap` function, which uses [ggplot's sf object plotting capabilities](https://ggplot2.tidyverse.org/reference/ggsf.html). Maps are plotted using the `basemap()` or `qmap()` functions that work almost similarly to [`ggplot()` as a base](https://ggplot2.tidyverse.org/reference/index.html) for adding further layers to the plot using the `+` operator. The maps generated this way already contain multiple ggplot layers. Consequently, the [`data` argument](https://ggplot2.tidyverse.org/reference/ggplot.html) needs to be explicitly specified inside `geom_*` functions when adding `ggplot2` layers. Depending on the location of the map, the underlying coordinates may be projected. 
Decimal degree coordinates need to be transformed to the projected coordinates using the `transform_coord`, [ggspatial](https://paleolimbot.github.io/ggspatial/), or [`geom_sf` functions.](https://ggplot2.tidyverse.org/reference/ggsf.html) ```{r} library(ggOceanMaps) @@ -86,14 +75,14 @@ See the [ggOceanMaps website](https://mikkovihtakari.github.io/ggOceanMaps/index ## Data path -While ggOceanMaps allows plotting any custom-made shapefiles, the package contains a shortcut to plot higher resolution maps for [certain areas needed by the author](https://github.com/MikkoVihtakari/ggOceanMapsLargeData/tree/master/data) without the need of generating the shapefiles manually. These high-resolution shapefiles are downloaded from the [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData) repository. As a default, the shapefiles are downloaded into a temporary directory meaning that the user would need to download the large shapefiles every time they restart R. This limitation is set by [CRAN policies](https://cran.r-project.org/web/packages/policies.html). You can define a custom folder for high-resolution shapefiles on your computer by modifying your .Rprofile file (e.g. `usethis::edit_r_profile()`). Add the following lines to the file: +While ggOceanMaps allows plotting any custom-made shapefiles, the package contains a shortcut to plot higher resolution maps for [certain areas needed by the author](https://mikkovihtakari.github.io/ggOceanMaps/articles/premade-maps.html) without the need of generating the shapefiles manually. These high-resolution shapefiles are downloaded from the [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData) repository. As a default, the shapefiles are downloaded into a temporary directory meaning that the user would need to download the large shapefiles every time they restart R. This limitation is set by [CRAN policies](https://cran.r-project.org/web/packages/policies.html). 
You can define a custom folder for high-resolution shapefiles on your computer by modifying your .Rprofile file (e.g. `usethis::edit_r_profile()`). Add the following lines to the file: ```{r eval = FALSE} .ggOceanMapsenv <- new.env() .ggOceanMapsenv$datapath <- 'YourCustomPath' ``` -It is smart to use a directory R has writing access to. For example `"~/Documents/ggOceanMapsLargeData"` would work for most operating systems. +It is smart to use a directory R has writing access to. For example `normalizePath("~/Documents/ggOceanMapsLargeData")` would work for most operating systems. You will need to set up the data path to your .Rprofile file only once and ggOceanMaps will find the path even though you updated your R or packages. ggOceanMaps will inform you about your data path when you load the package. @@ -103,9 +92,9 @@ The data used by the package are not the property of the Institute of Marine Res - **ggOceanMapsData land polygons.** [Natural Earth Data](https://www.naturalearthdata.com/downloads/10m-physical-vectors/) 1:10m Physical Vectors with the Land and Minor Island datasets combined. Distributed under the [CC Public Domain license](https://creativecommons.org/publicdomain/) ([terms of use](https://www.naturalearthdata.com/about/terms-of-use/)). - **ggOceanMapsData glacier polygons.** [Natural Earth Data](https://www.naturalearthdata.com/downloads/10m-physical-vectors/) 1:10m Physical Vectors with the Glaciated Areas and Antarctic Ice Shelves datasets combined. Distributed under the [CC Public Domain license](https://creativecommons.org/publicdomain/) ([terms of use](https://www.naturalearthdata.com/about/terms-of-use/)). -- **ggOceanMapsData bathymetry.** [Amante, C. and B.W. Eakins, 2009. ETOPO1 1 Arc-Minute Global Relief Model: Procedures, Data Sources and Analysis. NOAA Technical Memorandum NESDIS NGDC-24. National Geophysical Data Center, NOAA](https://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/docs/ETOPO1.pdf). Distributed under the [U.S. 
Government Work license](https://www.usa.gov/government-works). +- **ggOceanMapsData bathymetry.** [NOAA National Centers for Environmental Information. 2022: ETOPO 2022 15 Arc-Second Global Relief Model. NOAA National Centers for Environmental Information. DOI: 10.25921/fd45-gt74](https://www.ncei.noaa.gov/products/etopo-global-relief-model). Distributed under the [U.S. Government Work license](https://www.usa.gov/government-works). - **Detailed shapefiles of Svalbard and the Norwegian coast in [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData)** are from [Geonorge.no](https://www.geonorge.no/). Distributed under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/). -- **Detailed bathymetry of the Arctic (IBCAO), Northern Hemisphere (GEBCO) and the Barents Sea (BarentsSea) in [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData)** are vectorized from the [General Bathymetric Chart of the Oceans](https://www.gebco.net/data_and_products/gridded_bathymetry_data/) 15-arcsecond 2021 grid. [Terms of use](https://www.gebco.net/data_and_products/gridded_bathymetry_data/gebco_2019/grid_terms_of_use.html) +- **Detailed bathymetry of the Arctic (IBCAO), Northern Hemisphere (GEBCO) and the Barents Sea (BarentsSea) in [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData)** are vectorized from the [General Bathymetric Chart of the Oceans](https://www.gebco.net/data_and_products/gridded_bathymetry_data/) 15-arcsecond 2023 grid. [Terms of use](https://www.gebco.net/data_and_products/gridded_bathymetry_data/gebco_2019/grid_terms_of_use.html) - **Detailed bathymetry of the Northeast Atlantic (EMODned) in [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData)** is vectorized from the [European Marine Observation and Data Network](https://www.emodnet-bathymetry.eu/data-products) 3.75-arcsecond grid. 
[Terms of use](https://www.emodnet-bathymetry.eu/home/terms-of-use) Further, please cite the package whenever maps generated by the package are published. For up-to-date citation information, please use: @@ -116,26 +105,8 @@ citation("ggOceanMaps") ## Getting help -If your problem does not involve bugs in ggOceanMaps, the quickest way of getting help is posting your problem to [Stack Overflow](https://stackoverflow.com/questions/tagged/r). Please remember to include a reproducible example that illustrates your problem. +If your problem does not involve bugs in ggOceanMaps, the quickest way of getting help can be posting your problem to [Stack Overflow](https://stackoverflow.com/search?q=ggoceanmaps). Alternatively, you are welcome to use the [issues section](https://github.com/MikkoVihtakari/ggOceanMaps/issues) on GitHub. Please remember to include a reproducible example that illustrates your problem and to add links to potential cross-posts. ## Contributions Any contributions to the package are more than welcome. Please contact the package maintainer Mikko Vihtakari () to discuss your ideas on improving the package. Bug reports and corrections should be submitted directly to [the GitHub site](https://github.com/MikkoVihtakari/ggOceanMaps/issues). Please include a [minimal reproducible example](https://en.wikipedia.org/wiki/Minimal_working_example). Considerable contributions to the package development will be credited with authorship. - -## Debugging installation - -After a successful installation, the following code should return a plot shown under - -```{r} -library(ggOceanMaps) -basemap(60) -``` - -If the `basemap()` function complains about ggOceanMapsData package not being available, the drat repository may have issues (assuming you followed the installation instructions above). Try installing the ggOceanMapsData package using the devtools/remotes package. The data package does not contain any C++ code and should compile easily. 
- -If you encounter problems during the devtools installation, you may set the `upgrade` argument to `"never"` and try the following steps: - -1. Manually update all R packages you have installed (Packages -> Update -> Select all -> Install updates in R Studio). If an update of a package fails, try installing that package again using the `install.packages` function or the R Studio menu. -1. Run `devtools::install_github("MikkoVihtakari/ggOceanMaps", upgrade = "never")`. -1. If the installation of a dependency fails, try installing that package manually and repeat step 2. -1. Since R has lately been updated to 4.0, you may have to update your R to the latest major version for all dependencies to work (`stars`, `rgdal` and `sf` have been reported to cause trouble during the installation). diff --git a/README.md b/README.md index 6ecfabb..de9bd56 100644 --- a/README.md +++ b/README.md @@ -2,7 +2,7 @@ # ggOceanMaps **Plot data on oceanographic maps using ggplot2. R package version -1.3.7** +2.0.0** @@ -13,6 +13,13 @@ +**Breaking news (pun intended): ggOceanMaps goes +[sf](https://r-spatial.github.io/sf/)!** Most of ggOceanMaps code has +been rewritten. There are plenty of new features in 2.0 (see +[this](https://mikkovihtakari.github.io/ggOceanMaps/articles/new-features.html)), +but likely also many new bugs. Please [report them +here](https://github.com/MikkoVihtakari/ggOceanMaps/issues). + ## Overview The ggOceanMaps package for [R](https://www.r-project.org/) allows @@ -39,51 +46,33 @@ The package is available on version](https://github.com/MikkoVihtakari/ggOceanMaps), which is updated more frequently than the CRAN version. 
-Installation of the CRAN version: +Installation of the GitHub version: ``` r -install.packages("ggOceanMaps") +remotes::install_github("MikkoVihtakari/ggOceanMaps") ``` -Due to the package size limitations, ggOceanMaps requires the -[ggOceanMapsData](https://github.com/MikkoVihtakari/ggOceanMapsData) -package which stores shapefiles used in low-resolution maps. If the data -package does not install automatically, you can install it from GitHub -using devtools or remotes packages, or a drat repository on Windows and -Mac: +Installation of the CRAN version: ``` r -remotes::install_github("MikkoVihtakari/ggOceanMapsData") - -# OR: - -install.packages( - "ggOceanMapsData", - repos = c("https://mikkovihtakari.github.io/drat", - "https://cloud.r-project.org") -) +install.packages("ggOceanMaps") ``` -The GitHub version of ggOceanMaps can be installed using the -[**devtools**](https://cran.r-project.org/web/packages/devtools/index.html) -package. - -``` r -remotes::install_github("MikkoVihtakari/ggOceanMapsData") # required by ggOceanMaps -remotes::install_github("MikkoVihtakari/ggOceanMaps") -``` +The new 2.0 version of ggOceanMaps does not require the ggOceanMapsData +package any longer. Detailed map data are downloaded when needed from +the [ggOceanMapsLargeData](#data-path) repository. ## Usage **ggOceanMaps** extends on [**ggplot2**](http://ggplot2.tidyverse.org/reference/). The package uses -spatial shapefiles, [GIS packages for +spatial ([**sf**](https://r-spatial.github.io/sf/)) shape- (e.g. vector) +and ([**stars**](https://r-spatial.github.io/stars/)) raster files, +[geospatial packages for R](https://cran.r-project.org/web/views/Spatial.html) to manipulate, and -the -[**ggspatial**](https://cran.r-project.org/web/packages/ggspatial/index.html) -package to help to plot these shapefiles. The shapefile plotting is -conducted internally in the `basemap` function and uses [ggplot’s sf -object plotting +ggplot2 to plot these data. 
The vector and raster plotting is conducted +internally in the `basemap` function, which uses [ggplot’s sf object +plotting capabilities](https://ggplot2.tidyverse.org/reference/ggsf.html). Maps are plotted using the `basemap()` or `qmap()` functions that work almost similarly to [`ggplot()` as a @@ -108,7 +97,7 @@ basemap(data = dt, bathymetry = TRUE) + color = "red", fill = NA) ``` -![](man/figures/README-unnamed-chunk-6-1.png) +![](man/figures/README-unnamed-chunk-5-1.png) See the [ggOceanMaps website](https://mikkovihtakari.github.io/ggOceanMaps/index.html), @@ -126,7 +115,7 @@ useful. While ggOceanMaps allows plotting any custom-made shapefiles, the package contains a shortcut to plot higher resolution maps for [certain areas needed by the -author](https://github.com/MikkoVihtakari/ggOceanMapsLargeData/tree/master/data) +author](https://mikkovihtakari.github.io/ggOceanMaps/articles/premade-maps.html) without the need of generating the shapefiles manually. These high-resolution shapefiles are downloaded from the [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData) @@ -144,8 +133,8 @@ computer by modifying your .Rprofile file ``` It is smart to use a directory R has writing access to. For example -`"~/Documents/ggOceanMapsLargeData"` would work for most operating -systems. +`normalizePath("~/Documents/ggOceanMapsLargeData")` would work for most +operating systems. You will need to set up the data path to your .Rprofile file only once and ggOceanMaps will find the path even though you updated your R or @@ -172,11 +161,11 @@ from the following sources: Shelves datasets combined. Distributed under the [CC Public Domain license](https://creativecommons.org/publicdomain/) ([terms of use](https://www.naturalearthdata.com/about/terms-of-use/)). -- **ggOceanMapsData bathymetry.** [Amante, C. and B.W. Eakins, 2009. - ETOPO1 1 Arc-Minute Global Relief Model: Procedures, Data Sources and - Analysis. NOAA Technical Memorandum NESDIS NGDC-24. 
National - Geophysical Data Center, - NOAA](https://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/docs/ETOPO1.pdf). +- **ggOceanMapsData bathymetry.** [NOAA National Centers for + Environmental Information. 2022: ETOPO 2022 15 Arc-Second Global + Relief Model. NOAA National Centers for Environmental Information. + DOI: + 10.25921/fd45-gt74](https://www.ncei.noaa.gov/products/etopo-global-relief-model). Distributed under the [U.S. Government Work license](https://www.usa.gov/government-works). - **Detailed shapefiles of Svalbard and the Norwegian coast in @@ -188,7 +177,7 @@ from the following sources: [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData)** are vectorized from the [General Bathymetric Chart of the Oceans](https://www.gebco.net/data_and_products/gridded_bathymetry_data/) - 15-arcsecond 2021 grid. [Terms of + 15-arcsecond 2023 grid. [Terms of use](https://www.gebco.net/data_and_products/gridded_bathymetry_data/gebco_2019/grid_terms_of_use.html) - **Detailed bathymetry of the Northeast Atlantic (EMODned) in [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData)** @@ -202,11 +191,10 @@ are published. For up-to-date citation information, please use: ``` r citation("ggOceanMaps") -#> #> To cite package 'ggOceanMaps' in publications use: #> -#> Vihtakari M (2022). _ggOceanMaps: Plot Data on Oceanographic Maps -#> using 'ggplot2'_. R package version 1.3.7, +#> Vihtakari M (2023). _ggOceanMaps: Plot Data on Oceanographic Maps +#> using 'ggplot2'_. R package version 2.0.0, #> . 
#> #> A BibTeX entry for LaTeX users is @@ -214,8 +202,8 @@ citation("ggOceanMaps") #> @Manual{, #> title = {ggOceanMaps: Plot Data on Oceanographic Maps using 'ggplot2'}, #> author = {Mikko Vihtakari}, -#> year = {2022}, -#> note = {R package version 1.3.7}, +#> year = {2023}, +#> note = {R package version 2.0.0}, #> url = {https://mikkovihtakari.github.io/ggOceanMaps/}, #> } ``` @@ -223,9 +211,12 @@ citation("ggOceanMaps") ## Getting help If your problem does not involve bugs in ggOceanMaps, the quickest way -of getting help is posting your problem to [Stack -Overflow](https://stackoverflow.com/questions/tagged/r). Please remember -to include a reproducible example that illustrates your problem. +of getting help can be posting your problem to [Stack +Overflow](https://stackoverflow.com/search?q=ggoceanmaps). +Alternatively, you are welcome to use the [issues +section](https://github.com/MikkoVihtakari/ggOceanMaps/issues) on +GitHub. Please remember to include a reproducible example that +illustrates your problem and to add links to potential cross-posts. ## Contributions @@ -238,37 +229,3 @@ include a [minimal reproducible example](https://en.wikipedia.org/wiki/Minimal_working_example). Considerable contributions to the package development will be credited with authorship. - -## Debugging installation - -After a successful installation, the following code should return a plot -shown under - -``` r -library(ggOceanMaps) -basemap(60) -``` - -![](man/figures/README-unnamed-chunk-9-1.png) - -If the `basemap()` function complains about ggOceanMapsData package not -being available, the drat repository may have issues (assuming you -followed the installation instructions above). Try installing the -ggOceanMapsData package using the devtools/remotes package. The data -package does not contain any C++ code and should compile easily. 
- -If you encounter problems during the devtools installation, you may set -the `upgrade` argument to `"never"` and try the following steps: - -1. Manually update all R packages you have installed (Packages -\> - Update -\> Select all -\> Install updates in R Studio). If an update - of a package fails, try installing that package again using the - `install.packages` function or the R Studio menu. -2. Run - `devtools::install_github("MikkoVihtakari/ggOceanMaps", upgrade = "never")`. -3. If the installation of a dependency fails, try installing that - package manually and repeat step 2. -4. Since R has lately been updated to 4.0, you may have to update your - R to the latest major version for all dependencies to work (`stars`, - `rgdal` and `sf` have been reported to cause trouble during the - installation). diff --git a/_pkgdown.yml b/_pkgdown.yml index 5355913..298ef5a 100644 --- a/_pkgdown.yml +++ b/_pkgdown.yml @@ -55,6 +55,7 @@ reference: - vector_bathymetry - geonorge_bathymetry - clip_shapefile + - rotate_crs - title: Size adjustment functions desc: > @@ -64,7 +65,18 @@ reference: contents: - LS - FS - + +- reference: +- title: Map data + desc: > + Shape- and raster files used by the package. Higher resolution files are + stored in the [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData) + repository. These files are automatically used by the basemap function. The + `basemap` function also supports custom sf and stars objects. + contents: + - dd_land + - dd_rbathy + - reference: - title: Datasets desc: > @@ -92,16 +104,7 @@ reference: - dd_to_deg - deg_to_dd - select_element - -- title: Shapefiles - desc: > - Shapefiles used by the package are stored in the - [ggOceanMapsData](https://github.com/MikkoVihtakari/ggOceanMapsData) package - and the [ggOceanMapsLargeData](https://github.com/MikkoVihtakari/ggOceanMapsLargeData) - repository. - These shapefiles are automatically used by the basemap function. 
The - `basemap` function also supports custom shapefiles. - + - dd_clip_boundary navbar: components: @@ -111,11 +114,15 @@ navbar: articles: text: Articles menu: - - text: Customising shapefiles - - text: Premade maps - href: articles/premade-maps.html - - text: Premade shapefiles - href: articles/premade-shapefiles.html + - text: News + - text: New features in version 2.0 + href: articles/new-features.html + # - text: ------- + # - text: Customising shapefiles + # - text: Premade maps + # href: articles/premade-maps.html + # - text: Premade shapefiles + # href: articles/premade-shapefiles.html - text: ------- - text: Presentations - text: SI2022 poster diff --git a/data/dd_land.rda b/data/dd_land.rda new file mode 100644 index 0000000..134ddb6 Binary files /dev/null and b/data/dd_land.rda differ diff --git a/data/dd_rbathy.rda b/data/dd_rbathy.rda new file mode 100644 index 0000000..16c8c12 Binary files /dev/null and b/data/dd_rbathy.rda differ diff --git a/data/fdir_areas.rda b/data/fdir_areas.rda index 22e073c..c8eccbf 100644 Binary files a/data/fdir_areas.rda and b/data/fdir_areas.rda differ diff --git a/data/ices_areas.rda b/data/ices_areas.rda index 4a51a8d..143572d 100644 Binary files a/data/ices_areas.rda and b/data/ices_areas.rda differ diff --git a/docs/404.html b/docs/404.html index 286d996..78992ff 100644 --- a/docs/404.html +++ b/docs/404.html @@ -7,10 +7,10 @@ Page not found (404) • ggOceanMaps - - + + - + License • ggOceanMapsLicense • ggOceanMaps @@ -10,7 +10,7 @@ ggOceanMaps - 1.3.7 + 2.0.0 + + + + + +
+ + + + +
+
+ + + +

The old GIS packages for R (sp, raster and rgeos) that served as the engine for ggOceanMaps have been replaced by newer ones: sf handling vector data and stars handling rasters. This allowed more efficient use of geospatial data while plotting maps and led to multiple improvements.

+
+

Dependencies +

+

The number of dependencies has been reduced. Since plotting bathymetries in raster format is now quicker than in vector format, low-resolution raster data could be shipped with ggOceanMaps, and the ggOceanMapsData package is no longer needed. Further, the strong dependency on the ggspatial package has been removed, meaning that you'll now need to load ggspatial into the workspace before you can use its functions to plot data on basemaps. The dependencies on the old GIS packages for R have also been removed.
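Since ggspatial is no longer attached automatically, its functions must be loaded explicitly before use. A minimal sketch (the map limits and annotation placement are illustrative assumptions):

```r
library(ggOceanMaps)
library(ggspatial) # no longer attached by ggOceanMaps; load it explicitly

# Plot a basemap and add a ggspatial scale bar on top
basemap(limits = c(0, 30, 68, 72)) +
  annotation_scale(location = "br")
```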

+
+
+

New bathymetry system +

+

The shift to the stars package allowed more efficient plotting of raster data, which made it possible to add support for raster bathymetries. Consequently, the bathymetry system, which stemmed from PlotSvalbard, the predecessor of ggOceanMaps, has been redesigned. You can always roll back to the old system by setting options(ggOceanMaps.bathy.style = "poly_blues") in your Rprofile, but please read on because the new system is better.
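For example, a sketch of rolling back for a whole session by setting the option mentioned above before plotting:

```r
library(ggOceanMaps)

# Revert to the pre-2.0 polygon-based bathymetry style for this session
options(ggOceanMaps.bathy.style = "poly_blues")
basemap(c(100, 160, -20, 30), bathymetry = TRUE)
```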

+

ggOceanMaps now ships with a low-resolution raster bathymetry dataset in the decimal degree coordinate reference system (EPSG:4326). This dataset is projected on the fly depending on the basemap limits. The projection is faster, and the quality is not much worse than in pre-2.0 ggOceanMaps:

+
+basemap(c(100, 160, -20, 30), bathymetry = TRUE)
+

+Processing time: 3.5 sec

+

The low-resolution default bathymetry is optimized for processing time. Higher-resolution datasets are available, but you'll need to download them. Some of the downloads happen automatically, while for others you'll need to do a bit of setup work. As for the automatic download, ggOceanMapsLargeData contains a compressed version of the ETOPO 60 arc-second global relief model (the "60 Arc-Second Resolution / Ice surface elevation" netCDF on the webpage). Download and use this dataset by simply specifying the bathy.style argument:

+
+basemap(c(100, 160, -20, 30), bathy.style = "rcb")
+

+Processing time: 18.7 sec

+

Then the best bit: if this resolution still is not enough, you can use any bathymetry grid you want, as long as stars::read_stars can open it. First, we'll download the entire GEBCO 15 arc-second grid (click the netCDF link for "GEBCO_2023 Grid (ice surface elevation)" in the table on the website). This dataset is 7.5 GB in size, so you'll need a good internet connection. Unzip the dataset to your downloads folder, and define this option either at the beginning of your R session or in your Rprofile: options(ggOceanMaps.userpath = "~/Downloads/gebco_2023/GEBCO_2023.nc"). Now we can make maps using the 15 arc-second GEBCO data:

+
+basemap(c(100, 160, -20, 30), bathy.style = "rub")
+

+Processing time: 45.9 sec

+

basemap() now has a downsample argument, which can be used to reduce the resolution of excessively large custom raster datasets for a given region (the lower the resolution, the smaller the file size):

+
+basemap(c(100, 160, -20, 30), bathy.style = "rub", downsample = 10)
+

+Processing time: 41.4 sec

+

Note that the processing time does not change much in this case, but downsampling seems to shorten it for smaller maps.

+

You can also use custom bathymetries from other packages, such as marmap, but you'll have to specify the correct bathy.style for the data (this should be improved in future versions of the package):

+
+library(marmap)
+
+dt <- data.frame(lon = c(-125, -125, -111, -111), lat = c(28, 37, 37, 28))
+limits <- auto_limits(dt, expand.factor = 1.1)$projLimits
+
+mar_bathy <- marmap::getNOAA.bathy(
+  lon1 = limits["xmin"], lon2 = limits["xmax"], 
+  lat1 = limits["ymin"], lat2 = limits["ymax"])
+bathy <- raster_bathymetry(stars::st_as_stars(marmap::as.raster(mar_bathy)), 
+                           depths = NULL, verbose = FALSE)
+
+basemap(dt, shapefiles = list(land = dd_land, bathy = bathy), bathy.style = "rcb")
+

+

Note that the shapefiles providing better polygon bathymetry resolution in ggOceanMapsLargeData have been deprecated for now. Remaking all shapefiles and quality-controlling them was too time-consuming, since most of the code had to be rewritten and the changes quality-controlled. More shapefile options may appear in future releases of ggOceanMaps.

+

Since the new bathymetry system is now mostly raster based, it is advisable to save maps made by ggOceanMaps in a raster format (most journals accept jpegs with the highest quality setting and >= 600 ppi resolution). To learn more about the new bathymetry system, please check out the User Manual.
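As a sketch, a raster export with ggplot2's ggsave() might look like this (the file name and dimensions are illustrative assumptions):

```r
library(ggOceanMaps)

p <- basemap(c(100, 160, -20, 30), bathymetry = TRUE)

# Save as a high-quality jpeg; >= 600 ppi satisfies most journals
ggsave("bathymetry_map.jpg", plot = p, width = 170, height = 120,
       units = "mm", dpi = 600, quality = 100)
```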

+
+
+

The new world is round - sometimes +

+

The sf package uses a round-Earth model (the s2 package) instead of the flat-Earth model used by GEOS. While thinking round can be challenging when making polygons, ggOceanMaps attempts to use the s2 capabilities as much as possible, and you might encounter unexpected "features" as a consequence. The benefit of going round-Earth is that we can now use a single land polygon shapefile in decimal degrees, which gets projected to flat-Earth on the fly while plotting maps:

+
+basemap(c(-20, 15, 50, 70), bathy.style = "rub")
+

+

The map above is entirely produced from decimal degree data, which are transformed during plotting. Doing such a transformation within a reasonable time (and at reasonable quality) was not possible with the old geospatial packages for R.

+

We can now also plot maps crossing the +anti-meridian:

+
+basemap(c(110, -110, 10, 55), bathymetry = TRUE)
+

+

Round-Earth is not only a blessing, and there are still some unresolved issues, such as lines from decimal degree shapes that could not be dissolved in the new projection. Most of these issues are related to glaciers:

+
+basemap(limits = -60, glaciers = TRUE)
+

+

A simple fix for now is to use projected shapefiles from +ggOceanMapsLargeData:

+
+basemap(limits = -60, glaciers = TRUE, shapefiles = "Antarctic")
+

+

If someone has an idea how to fix this dissolve issue, please contact +the developer.

+
+
+

The crs argument - bring the world where your data are +

+

A limitation of pre-2.0 ggOceanMaps was that you could not change the coordinate reference system (CRS) of the map; you had to change the CRS of your data instead. This was problematic when plotting spatial model results that were computed using a certain CRS. Thanks to the round-Earth model, sf and stars, you can now change the CRS of maps produced by ggOceanMaps on the fly:

+
+cowplot::plot_grid(
+  basemap(c(11,16,67.3,68.6), bathy.style = "rub", crs = 4326, legends = FALSE),
+  basemap(c(11,16,67.3,68.6), bathy.style = "rub", legends = FALSE),
+  basemap(c(11,16,67.3,68.6), bathy.style = "rub", crs = 32633, legends = FALSE),
+  ncol = 3, labels = "auto"
+)
+
+Lofoten plotted using three CRS: a) 4326, i.e. decimal degrees, b) 3995, i.e. Arctic stereographic and c) 32633, i.e. UTM zone 33N.

+Lofoten plotted using three CRS: a) 4326, i.e. decimal degrees, b) 3995, +i.e. Arctic stereographic and c) 32633, i.e. UTM zone 33N. +

+
+
+
+

Tweaks to qmap +

+

Don’t forget qmap(), people. It’s meant to be your every-day tool for checking where your spatial data are located before you move on to analyzing them. The default qmap data point color has been changed to red to make the points more visible on the default grey land. In addition, the long-standing issue of qmap confusing the shape argument with the shapefiles argument of basemap has finally been fixed:

+
+qmap(data.frame(lat = c(34, 38, 41, 32), lon = c(138, 137, 144, 129)), 
+     shape = I(0), size = I(4))
+

+
+
+

dist2land goes round-Earth +

+

The dist2land() function was bugged for large-scale datasets in pre-2.0 ggOceanMaps because it used a flat-Earth model and different shapefiles depending on the location. This should no longer be the case, as dist2land now uses the s2 package and returns great circle distances in kilometers. The only known bug is that the function does not manage to calculate the 0,0 lon/lat point correctly:

+
+lon = deg_to_dd(seq(0,360,30)); lat = c(80,50,20,0,-20,-50,-80)
+dt <- data.frame(
+  lon = rep(lon, length(lat)), lat = rep(lat, each = length(lon)))
+dt <- dist2land(dt, verbose = FALSE)
+

Processing time: 2.1 sec

+
+qmap(dt, color = ldist) +
+  scale_color_viridis_c()
+

+

The function is still somewhat slow for large datasets and there currently is no parallelization option. If you have a large dataset, split it up and try running the parts on multiple cores using an mclapply loop. Such an option could be added to dist2land in a future release, but it is easy enough to do by hand too.
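A rough sketch of such a by-hand approach (mclapply forks, so it parallelizes on Linux/macOS only; the data and chunk count are illustrative assumptions):

```r
library(parallel)
library(ggOceanMaps)

# Illustrative data: random points scattered over the globe
dt <- data.frame(lon = runif(1e4, -180, 180), lat = runif(1e4, -85, 85))

# Split the rows into one chunk per core and run dist2land on each in parallel
n_cores <- 4
chunks <- split(dt, cut(seq_len(nrow(dt)), n_cores, labels = FALSE))
res <- do.call(rbind, mclapply(chunks, dist2land, verbose = FALSE,
                               mc.cores = n_cores))
```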

+
+
+

The future +

+

Obviously, with so many new features and so much rewriting of the underlying code, there will be unexpected bugs. These will be corrected in future releases of ggOceanMaps as reports come in. In addition to bug fixing, I intend to revise the spatial data file collection offered in the ggOceanMapsLargeData repository, improving land-shape quality. The aim for the future is to find a way to plot oceans at a detailed enough level to show Norwegian fjords sufficiently.

+
+
+
+ + + +
+ + + +
+
+ + + + + + + diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-10-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-10-1.png new file mode 100644 index 0000000..b891afa Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-10-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-11-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-11-1.png new file mode 100644 index 0000000..75247df Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-11-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-12-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-12-1.png new file mode 100644 index 0000000..dfd270d Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-12-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-14-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-14-1.png new file mode 100644 index 0000000..9bf8053 Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-14-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-2-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-2-1.png new file mode 100644 index 0000000..e555bb1 Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-2-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-3-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-3-1.png new file mode 100644 index 0000000..b5cfa6a Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-3-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-4-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-4-1.png new file mode 100644 index 0000000..91dca15 
Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-4-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-5-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-5-1.png new file mode 100644 index 0000000..2572f16 Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-5-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-6-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-6-1.png new file mode 100644 index 0000000..9f60f03 Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-6-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-7-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-7-1.png new file mode 100644 index 0000000..c73eff3 Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-7-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-8-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-8-1.png new file mode 100644 index 0000000..38c9735 Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-8-1.png differ diff --git a/docs/articles/new-features_files/figure-html/unnamed-chunk-9-1.png b/docs/articles/new-features_files/figure-html/unnamed-chunk-9-1.png new file mode 100644 index 0000000..de8ea3f Binary files /dev/null and b/docs/articles/new-features_files/figure-html/unnamed-chunk-9-1.png differ diff --git a/docs/articles/poster.html b/docs/articles/poster.html index c7b56d2..e3f659a 100644 --- a/docs/articles/poster.html +++ b/docs/articles/poster.html @@ -8,10 +8,10 @@ SI2022 poster • ggOceanMaps - - + + - + + + - + + + - + Authors and Citation • ggOceanMapsAuthors and Citation • ggOceanMaps @@ -10,7 +10,7 @@ ggOceanMaps - 1.3.7 + 2.0.0 - - - - - -
-
-
- -
-

Temporary workaround for the specification of projargs in CRS(), required by migration to -GDAL >= 3 and PROJ >= 6.

-
- -
-

Usage

-
convert_crs(epsg = 4326)
-
- -
-

Arguments

-
epsg
-

EPSG code of the projection (coercible to a character string).

- -
-
-

Value

- - -

A character string that can be used as projargs in the CRS (sp) function: either a SRS_string -(e.g. "EPSG:4326") or a "+init" PROJ string (e.g. "+init=epsg:4326"), depending on the libraries and -packages versions used.

-
-
-

Details

-

Test the version of GDAL, PROJ and sp, and return the appropriate format.

-
-
-

Author

-

Yves Reecht

-
- -
-

Examples

convert_crs()
#> [1] "EPSG:4326"
sp::CRS(projargs = convert_crs())
#> Coordinate Reference System:
#> Deprecated Proj.4 representation: +proj=longlat +datum=WGS84 +no_defs
#> WKT2 2019 representation:
#> GEOGCRS["WGS 84 (with axis order normalized for visualization)",
#>     ENSEMBLE["World Geodetic System 1984 ensemble",
#>         MEMBER["World Geodetic System 1984 (Transit)",
#>             ID["EPSG",1166]],
#>         MEMBER["World Geodetic System 1984 (G730)",
#>             ID["EPSG",1152]],
#>         MEMBER["World Geodetic System 1984 (G873)",
#>             ID["EPSG",1153]],
#>         MEMBER["World Geodetic System 1984 (G1150)",
#>             ID["EPSG",1154]],
#>         MEMBER["World Geodetic System 1984 (G1674)",
#>             ID["EPSG",1155]],
#>         MEMBER["World Geodetic System 1984 (G1762)",
#>             ID["EPSG",1156]],
#>         MEMBER["World Geodetic System 1984 (G2139)",
#>             ID["EPSG",1309]],
#>         ELLIPSOID["WGS 84",6378137,298.257223563,
#>             LENGTHUNIT["metre",1],
#>             ID["EPSG",7030]],
#>         ENSEMBLEACCURACY[2.0],
#>         ID["EPSG",6326]],
#>     PRIMEM["Greenwich",0,
#>         ANGLEUNIT["degree",0.0174532925199433],
#>         ID["EPSG",8901]],
#>     CS[ellipsoidal,2],
#>         AXIS["geodetic longitude (Lon)",east,
#>             ORDER[1],
#>             ANGLEUNIT["degree",0.0174532925199433,
#>                 ID["EPSG",9122]]],
#>         AXIS["geodetic latitude (Lat)",north,
#>             ORDER[2],
#>             ANGLEUNIT["degree",0.0174532925199433,
#>                 ID["EPSG",9122]]],
#>     USAGE[
#>         SCOPE["Horizontal component of 3D system."],
#>         AREA["World."],
#>         BBOX[-90,-180,90,180]],
#>     REMARK["Axis order reversed compared to EPSG:4326"]]
sp::CRS(projargs = convert_crs(epsg = 27700))
#> Coordinate Reference System:
#> Deprecated Proj.4 representation:
#>  +proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000
#> +y_0=-100000 +ellps=airy +units=m +no_defs
#> WKT2 2019 representation:
#> PROJCRS["OSGB36 / British National Grid",
#>     BASEGEOGCRS["OSGB36",
#>         DATUM["Ordnance Survey of Great Britain 1936",
#>             ELLIPSOID["Airy 1830",6377563.396,299.3249646,
#>                 LENGTHUNIT["metre",1]]],
#>         PRIMEM["Greenwich",0,
#>             ANGLEUNIT["degree",0.0174532925199433]],
#>         ID["EPSG",4277]],
#>     CONVERSION["British National Grid",
#>         METHOD["Transverse Mercator",
#>             ID["EPSG",9807]],
#>         PARAMETER["Latitude of natural origin",49,
#>             ANGLEUNIT["degree",0.0174532925199433],
#>             ID["EPSG",8801]],
#>         PARAMETER["Longitude of natural origin",-2,
#>             ANGLEUNIT["degree",0.0174532925199433],
#>             ID["EPSG",8802]],
#>         PARAMETER["Scale factor at natural origin",0.9996012717,
#>             SCALEUNIT["unity",1],
#>             ID["EPSG",8805]],
#>         PARAMETER["False easting",400000,
#>             LENGTHUNIT["metre",1],
#>             ID["EPSG",8806]],
#>         PARAMETER["False northing",-100000,
#>             LENGTHUNIT["metre",1],
#>             ID["EPSG",8807]]],
#>     CS[Cartesian,2],
#>         AXIS["(E)",east,
#>             ORDER[1],
#>             LENGTHUNIT["metre",1]],
#>         AXIS["(N)",north,
#>             ORDER[2],
#>             LENGTHUNIT["metre",1]],
#>     USAGE[
#>         SCOPE["Engineering survey, topographic mapping."],
#>         AREA["United Kingdom (UK) - offshore to boundary of UKCS within 49°45'N to 61°N and 9°W to 2°E; onshore Great Britain (England, Wales and Scotland). Isle of Man onshore."],
#>         BBOX[49.75,-9,61.01,2.01]],
#>     ID["EPSG",27700]]
diff --git a/docs/reference/dd_clip_boundary.html b/docs/reference/dd_clip_boundary.html (new file)

Create clip boundary from decimal degree limits — dd_clip_boundary • ggOceanMaps
Creates a clip boundary for map cropping from decimal degree limits, counter-clockwise following parallels.

Usage

dd_clip_boundary(limits, crs, expand.factor = NULL)

Arguments

limits

Numeric vector of length 4 giving the map limits in decimal degrees. See basemap.

crs

Coordinate reference system in st_crs format.

expand.factor

Expansion factor for map limits. Set to NULL to ignore.

Author

Mikko Vihtakari
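A minimal usage sketch (not from the original documentation): the limit order of lon min, lon max, lat min, lat max and the numeric EPSG input are assumptions based on the argument descriptions above.

```r
# Hypothetical example: a clip boundary for a North Sea view in WGS84.
# The limit order (lon min, lon max, lat min, lat max) follows basemap().
library(ggOceanMaps)

bound <- dd_clip_boundary(c(-5, 10, 50, 62), crs = 4326)
```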

diff --git a/docs/reference/dd_land.html b/docs/reference/dd_land.html (new file)

Decimal degree land shapes — dd_land • ggOceanMaps
Decimal degree land shapes

Usage

dd_land

Format

Simple feature collection land shapes in decimal degrees (EPSG:4326). Obtained from Natural Earth Data (10m vectors). Includes the islands dataset.

Source

Natural Earth Data

See also

Other mapfiles: dd_rbathy
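Since dd_land is a simple feature collection, it can be drawn with standard ggplot2 layers. A minimal sketch (not from the original documentation; assumes the dataset is lazy-loaded with ggOceanMaps 2.0.0, per LazyData in DESCRIPTION):

```r
# Hypothetical example: draw the low-resolution land shapes directly.
library(ggplot2)
library(ggOceanMaps)

ggplot() +
  geom_sf(data = dd_land, fill = "grey70", color = NA) +
  theme_map()
```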

diff --git a/docs/reference/dd_rbathy.html b/docs/reference/dd_rbathy.html (new file)

Decimal degree bathymetry — dd_rbathy • ggOceanMaps
Decimal degree bathymetry

Usage

dd_rbathy

Format

Raster bathymetry in decimal degrees (EPSG:4326). Downsampled from ETOPO 60 arc-second grid.

Source

NOAA National Centers for Environmental Information. 2022: ETOPO 2022 15 Arc-Second Global Relief Model. NOAA National Centers for Environmental Information. doi:10.25921/fd45-gt74

See also

Other mapfiles: dd_land
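In practice this raster is consumed through basemap() rather than used directly. A hedged sketch (the bathy.style value here is an assumption; see bathy.style in ?basemap for the accepted styles):

```r
# Hypothetical example: a raster bathymetry basemap drawn from the
# bundled decimal degree bathymetry grid.
library(ggOceanMaps)

basemap(limits = c(-20, 20, 40, 60), bathy.style = "raster_continuous_blues")
```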

diff --git a/docs/reference/dd_to_deg.html b/docs/reference/dd_to_deg.html

Convert decimal degree values to angular degrees — dd_to_deg • ggOceanMaps (only the version badge changes: 1.3.7 → 2.0.0)
Defines bathymetry style to be used in basemap

Usage

define_bathy_style(x)

Arguments

x

Character argument giving the input bathymetry style. Partially matched and can be abbreviated. See bathy.style in basemap.

Value

Returns a named character vector with the name pointing to the bathymetry style and the value to the internal map element command for basemap (see map_cmd).

Author

Mikko Vihtakari
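A hedged sketch of the lookup described above (the style string is an assumption; see bathy.style in ?basemap for the accepted values):

```r
# Hypothetical example: the input style is matched and returned as a
# named vector pointing at an internal map element command.
library(ggOceanMaps)

define_bathy_style("raster_binned_blues")
```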

diff --git a/docs/reference/define_shapefiles.html b/docs/reference/define_shapefiles.html

Define a shapefile to use in plotting from the limits argument — define_shapefiles • ggOceanMaps (only the version badge changes: 1.3.7 → 2.0.0)
Rotates a CRS such that North is in the middle of two meridians

Usage

rotate_crs(crs, meridians)

Arguments

crs

The CRS to be rotated in st_crs format

meridians

A numeric vector giving the two meridians which should be used in the rotation

Value

Rotated CRS in st_crs format

Author

Mikko Vihtakari
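A hedged sketch (not from the original documentation; the EPSG code and meridian pair are illustrative assumptions):

```r
# Hypothetical example: rotate the Arctic Polar Stereographic CRS
# (EPSG:3995) so that North lies midway between 0 and 60 degrees East.
library(ggOceanMaps)

rotate_crs(sf::st_crs(3995), meridians = c(0, 60))
```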

diff --git a/docs/reference/round_any.html b/docs/reference/round_any.html

Round to multiple of any number — round_any • ggOceanMaps (only the version badge changes: 1.3.7 → 2.0.0)