My input, as an electrical engineer but not a power engineer:
Frequency-wise, both 50 and 60 Hz were essentially arbitrary, but within certain constraints. First, there's a relationship between the speed at which your turbines spin and the frequency of the electricity you make: they needn't be the same, but they'll be related by an integer factor, the generator's number of pole pairs. The simplest systems have them equal, and turbines have particular RPMs they like to run at. Second, if you want to run incandescent lights off your power, it needs to oscillate quickly enough that you can't see the flicker, i.e. above roughly 30 Hz. The old 25 Hz standard got away with less by exploiting the fact that the human eye doesn't consciously register flicker that slow, though you'd still notice certain effects (it would make everything look "movie-like"), plus the fact that bulb filaments have thermal inertia (it takes them time to cool down). It's also worth noting that the 25 Hz standard almost didn't happen because of the lighting issue, but manufacturing constraints kept it alive: the turbines involved in setting that standard spun very slowly, and building generators with many, many poles was considered too expensive and unreliable. So 30-ish Hz is the lower bound, and there's also an upper bound set by the limitations of induction motors (Tesla's design didn't work much above 100 Hz).
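As a quick sanity check on that turbine-speed relationship, here's the standard synchronous-machine formula (the specific machines below are illustrative, not any particular historical generator):

```python
# Electrical frequency of a synchronous generator:
#   f = (poles / 2) * (RPM / 60)  =  poles * RPM / 120
def frequency_hz(poles: int, rpm: float) -> float:
    """Electrical output frequency for a synchronous machine."""
    return poles * rpm / 120

# A simple 2-pole machine at 3600 RPM gives 60 Hz; at 3000 RPM, 50 Hz.
print(frequency_hz(2, 3600))  # 60.0
print(frequency_hz(2, 3000))  # 50.0

# A slow 250 RPM turbine needs 12 poles just to reach 25 Hz, which is
# why very low target frequencies pushed pole counts (and costs) up.
print(frequency_hz(12, 250))  # 25.0
```

This is why the turbine's preferred RPM and the chosen line frequency constrain each other: once you fix one, the other is only adjustable in steps of whole pole pairs.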
Someone pointed out that lower frequencies have lower losses (the skin effect pushes current toward a conductor's surface as frequency rises), but at the frequencies we're talking about that effect is completely negligible.
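To put a number on "negligible", here's a rough skin-depth calculation for copper, using standard textbook constants (a sketch, not a design tool):

```python
import math

RHO_COPPER = 1.68e-8       # resistivity of copper, ohm-metres
MU_0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth_m(freq_hz: float, rho: float = RHO_COPPER, mu: float = MU_0) -> float:
    """Depth at which AC current density falls to 1/e of its surface value."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * rho / (omega * mu))

# At 50 Hz the skin depth in copper is about 9.2 mm; at 60 Hz, about 8.4 mm.
# Both are larger than the radius of ordinary distribution conductors, so
# the 50-vs-60 Hz difference in conductor losses really is tiny.
print(skin_depth_m(50))
print(skin_depth_m(60))
```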
Voltage-wise, we balance safety and efficiency...but the real reason for the range used today has to do with the materials used in the first generation of electrical equipment. In fact, I was taught that Europe runs higher because, in the few years between the first American and first European power grids, insulating materials (principally rubber) had advanced enough to safely support higher voltages. Plus, Americans were very wary because the War of the Currents had recently shown them all how "dangerous" AC was, even though they ended up using it.
Safety-wise, there's a major step in damage at about 200 V: below it, electricity won't usually cause severe burns, and above it, it will (that's roughly the level needed to break down the outer layer of your skin). So 110 V is measurably safer than 220 V (anecdotally, we were pretty blasé about mains power in circuits lab, which I understand is not the case in most of Europe). Interestingly, safety-wise 60 Hz is somewhat more dangerous than 50 Hz because it sits closer to the frequency range where current most easily throws the heart into fibrillation, but that wouldn't be realized OTL until long, long after the standard was adopted.
With regards to 3-phase, which someone mentioned: in the US, at least, power is always three-phase on the high-tension wires, and individual phases are then split off at the neighborhood or building level. In apartment buildings especially (between different apartments or floors), but sometimes even within individual houses, you can actually measure the voltage of different outlets against each other and see the 120-degree offset. The main reason 3-phase isn't used more widely at the outlet is the cost of handling it and the relatively small number of appliances with motors powerful enough to make a 3-phase motor cost-effective.
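That 120-degree offset is easy to check with phasors (a minimal sketch, assuming the nominal 120 V US phase-to-neutral voltage):

```python
import cmath
import math

V_PHASE = 120.0  # nominal US phase-to-neutral voltage

# Two phases of a three-phase supply, 120 degrees apart.
v_a = cmath.rect(V_PHASE, 0.0)
v_b = cmath.rect(V_PHASE, math.radians(-120.0))

# Measuring between outlets on different phases gives the phase-to-phase
# voltage: |V_a - V_b| = sqrt(3) * 120 V, about 208 V (not the 240 V you
# get from split-phase, where the two legs are 180 degrees apart).
v_ab = abs(v_a - v_b)
print(round(v_ab, 1))  # 207.8
```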