Hi,

first and foremost, many thanks for your very useful package, which I'm using to map the horizontal distribution of categories of archaeological finds on a site we excavated using a grid system. As the individual quantities have a rather high variance, I get quite big differences between the biggest and smallest pies.
A very short reproducible example of what I'm doing:
library(ggplot2)
library(scatterpie)
set.seed(1)  # fix the seed so the sampled example data are actually reproducible
data <- data.frame(x = c(1, 1, 2, 2, 3, 3),
                   y = c(1, 2, 1, 2, 1, 2),
                   A = sample(1:20, 6),
                   B = sample(1:1000, 6))
data$SUM <- rowSums(data[, c("A", "B")])
ggplot() +
  geom_scatterpie(data = data,
                  aes(x = x,
                      y = y,
                      r = SUM / 1000),
                  cols = c("A", "B"),
                  color = NA)
It should give results like the attached example plots (images omitted here).
At the moment I'm using the logarithm of the SUM column (r = log(SUM)/50) in order to 'equalize' those differences a bit, but this approach isn't very robust. I haven't found any solution that matches e.g. scale_size_continuous(range = c(2, 4)), which I use to control the minimum and maximum of point sizes (I use geom_point for mapping the distribution of a single category of finds).
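In the meantime, the closest I've come to that behaviour is to rescale SUM linearly onto a fixed radius interval before plotting, and map the precomputed column to r. This is only a sketch: it assumes the scales package (its rescale() function) and picks the radius bounds c(0.2, 0.5) arbitrarily; they would need tuning to the grid spacing.

```r
library(ggplot2)
library(scatterpie)
library(scales)  # for rescale()

set.seed(1)  # reproducible sample data
data <- data.frame(x = c(1, 1, 2, 2, 3, 3),
                   y = c(1, 2, 1, 2, 1, 2),
                   A = sample(1:20, 6),
                   B = sample(1:1000, 6))
data$SUM <- rowSums(data[, c("A", "B")])

# Map SUM linearly onto a chosen radius interval, mimicking
# scale_size_continuous(range = ...). The bounds are an assumption.
data$r <- rescale(data$SUM, to = c(0.2, 0.5))

ggplot() +
  geom_scatterpie(data = data,
                  aes(x = x, y = y, r = r),
                  cols = c("A", "B"),
                  color = NA) +
  coord_equal()
```

Note that this makes the radius, not the area, proportional to SUM; rescaling sqrt(SUM) instead would give area-proportional pies, which may be less misleading visually. But a built-in radius scale would of course be much nicer.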
Best and many thanks for any help,
Dirk