The scroll speed is currently computed to be linear in the position of the mouse along the axis on which you're scrolling, meaning that the closer the mouse is to the edge of the screen, the faster you scroll. But this only happens over a fairly narrowly constrained range, so it's not very noticeable; to the extent that it is noticeable, I think it just looks weird.
You can also use the arrow keys to scroll, but they use a different speed, so if you hold the left arrow key and put the mouse on the right edge, the camera scrolls right but much more slowly than if you weren't holding the arrow key, which seems totally wrong.
I think there should be a single flat speed at which the camera scrolls, and it should scroll in a direction if and only if (the mouse is at that edge or the corresponding arrow key is held) and (neither the mouse nor an arrow key is also requesting the opposite direction). A sketch of what I mean is below.
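Something like the following is what I have in mind. This is just a sketch of the rule, not the actual code: the names (`SCROLL_SPEED`, `EDGE_MARGIN`, the `keys` set, etc.) are placeholders, not whatever the codebase really calls these.

```python
# Sketch of the proposed scrolling rule. All names here are placeholders,
# not the project's real constants or input API.

SCROLL_SPEED = 500.0  # one flat speed (e.g. pixels per second) for all scrolling
EDGE_MARGIN = 4       # how close (in pixels) the cursor must be to count as "at the edge"

def axis_direction(neg_wanted: bool, pos_wanted: bool) -> int:
    """Return -1, 0, or +1: only scroll if exactly one direction is requested."""
    if neg_wanted and not pos_wanted:
        return -1
    if pos_wanted and not neg_wanted:
        return +1
    return 0  # both or neither requested -> no scroll on this axis

def camera_velocity(mouse_x, mouse_y, screen_w, screen_h, keys):
    """`keys` is the set of currently held arrow keys, e.g. {"left", "down"}."""
    left_wanted = mouse_x <= EDGE_MARGIN or "left" in keys
    right_wanted = mouse_x >= screen_w - 1 - EDGE_MARGIN or "right" in keys
    up_wanted = mouse_y <= EDGE_MARGIN or "up" in keys
    down_wanted = mouse_y >= screen_h - 1 - EDGE_MARGIN or "down" in keys

    dx = axis_direction(left_wanted, right_wanted)
    dy = axis_direction(up_wanted, down_wanted)
    return (dx * SCROLL_SPEED, dy * SCROLL_SPEED)
```

So the mouse position only ever gates whether scrolling happens, never how fast, and the mouse and arrow keys feed into the same on/off decision per axis, with opposing requests cancelling out.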