WW2 diverted the focus of research from nuclear power to nuclear bombs.
It's hard to say, because the discovery of fission came almost simultaneously with the beginning of World War II. In any case, a great many people immediately noted the prospect of a bomb and started looking into it right away, so I'm not sure there was ever really a moment when "the focus" shifted to nuclear bombs. If people were looking at nuclear reactors, it was as much because they thought reactors might be useful for building bombs, or at least have military applications, as because of an interest in generating power.
In any case, though, any effects of "shifting focus" were absolutely dwarfed by the fact that the war made the United States willing to spend unprecedented amounts of cash on what amounted to a basic research project in physics and chemistry. Even a cursory look at the other nuclear research projects of the time, which were generally afforded a low priority more in line with what they would have had in a scenario with no war, shows the decisive effect that wartime priority had on the pace of the program. You would never have seen the first reactor go critical in 1942 without the Manhattan Project, nor the construction of multiple high-power (for the time) reactors as early as 1944 and 1945. No government or private funder in the pre-war era would have spent the amount of cash needed to make that happen. Even after the war, large-scale government funding, motivated largely by military and diplomatic concerns, was needed to develop nuclear reactors as fast as they actually were developed.
Actually, that brings up one other effect that World War II especially, but also World War I to a lesser extent, had on research: greatly increasing the role of government in funding scientific development. Before those wars there was only a limited amount of government-funded research and development, mostly for public health and military purposes; other fields of science depended on other funding sources such as private donors, corporate research and development labs, and university budgets. Without the wars, this is likely to change much more slowly than IOTL. That will have broadly negative effects on research, but especially on certain esoteric fields, such as particle physics or space exploration, where projects are costly and relatively difficult to market compared to, say, medicine. It's hard to predict the long-term effects, but you may see a serious negative impact on computing without agencies like the NSF and ARPA/DARPA around to fund computing research. Of course, AT&T and IBM and so on will still fund research, but they have their own corporate incentives that may lead them to approaches that are good for them and not as good for everyone else.
More development going into reactors could thus have led to the engineering being more mature by the time people were building dozens of reactors (meaning safer, more economical power), and later development of the bomb would mean a later rush to build them by the thousands. So was the OTL path really an advancement over the alternative?
I did specifically note this in my post:
Though this may not have necessarily been a good thing--a slower and less precipitous development of nuclear technology would likely have led to a better appreciation of its features and foibles, and probably more exploration of alternative paths to using nuclear phenomena than were actually explored, owing to war or perceived war needs.
However, you only asked about technological "progress," and it's inarguable that nuclear technology advanced much more rapidly thanks to the war than it would have otherwise.