OK maybe it's not normally the opposite, but it's often quite different from their original position.
And it happens quite often...
Am I right that you were thinking of this in terms of climate change -- basically, the argument being that if scientists have to revise their theories pretty substantially on a fairly regular basis, then we shouldn't be setting policy based on mainstream climate science opinion, because they could be wrong, and then we would have wasted the effort?
If so, consider two arguments (each of which I can expand on if you'd like, since I'm abbreviating greatly here):
(1) It would be disastrous if we took as a general principle that public policy should not consider mainstream science, merely because mainstream science is sometimes wrong. Think of all the things we invest in because scientists tell us they're pretty sure it'll work. We never would have dug the Erie Canal or the Panama Canal if we hadn't operated on the assumption that the scientific experts were probably right when they predicted we could tackle the technical challenges of those projects. We never would have sent satellites into space, or split the atom. We never would have developed the polio vaccine, or addressed epidemics of tuberculosis and pellagra. We never would have de-leaded gasoline or gotten HIV under control or brought down the rates of smoking-related cancers. And so on. Our normal practice, when there isn't a strong vested interest in denying the science, is to assume that if a large enough proportion of scientists are expressing a high degree of confidence after a significant period of intense study, then public policy should be set with the rebuttable assumption that they are right. That assumption may not always be correct. But it USUALLY will be. So those bets will pay off most of the time, and we'll be better off than if we hadn't made them. It's about playing the odds.
The problem comes when there's a vested interest in denying the science for non-scientific reasons. Take one item from my list: pellagra. It's a disease caused by a niacin (vitamin B3) deficiency, and it used to be hugely common in the American South. Researchers studied it and concluded, based on plenty of evidence, that it was a nutritional problem, not a communicable disease. There was political resistance to this, because if it was a nutritional deficiency, then it was so common among southerners because they were too poorly paid to afford a well-rounded diet, and the cure would need to be paying them more or otherwise addressing their dietary needs. That would be really expensive. Since the solution was unpalatable to the southern economic elite, the science faced an uphill climb. Only after researchers found a dirt-cheap way to address the dietary deficiency (using brewer's yeast) did the conservative politicians and business leaders stop denying the science. I think that's where things are going with climate change science. For as long as the available solutions look expensive, the conservative politicians and business leaders will pretend they don't believe the science. The moment a cheap solution comes along, they'll drop the pretense.
(2) The second thing to consider is the asymmetry of risk. In other words: what are the consequences if we bet wrong in one direction versus the other? If we bet that the very large majority of scientists are right and they turn out to be fundamentally wrong, what will we have lost? Well, we will have somewhat slowed our global economy for the period of time between when we start acting and when we come to realize they were wrong. However, that negative effect will be partly offset by the relief rally when we take our foot off the brakes, so to speak (e.g., eliminating the carbon taxes and other limits). It will be a short-term harm. And it will be further partly offset by positive side effects of our efforts (e.g., less particulate and chemical pollution, more energy efficiency allowing us to get more miles out of non-renewable resources, etc.). So, it'll be a short-term harm with long-term positive side effects.
By comparison, what if we bet that the large majority of scientists are wrong and they turn out to be right -- what have we lost? Well, then we will have put ourselves in a position where we have to pay for the pound of cure where we might once have paid for the ounce of prevention. We will find ourselves fighting uphill against all sorts of vicious cycles (self-reinforcing warming phenomena), such that we will need to make MUCH bigger sacrifices in the future to stabilize the climate at warmer levels than we would have to make today to stabilize it at cooler levels. Plus, in many cases we will have triggered effectively irreversible changes -- losses of species and ecosystems that no practical investment down the road will bring back. We'll wind up with millennia of less productive environments for our descendants. We'll be talking about immeasurable numbers of premature deaths and diminished lives.
So, we're looking at a situation where we can either make an apparently high-probability bet with fairly low costs if we're wrong, or an apparently low-probability bet with catastrophic costs if we're wrong. Which is the wiser bet?
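If it helps to see the structure of that last argument laid bare, it's essentially a minimax (worst-case) comparison over a two-by-two decision matrix. Here's a minimal sketch in Python -- the cost numbers are purely hypothetical placeholders I've made up to mirror the qualitative claims above (modest, partly recoverable costs if we act; catastrophic, irreversible costs if we don't act and the scientists are right), not real estimates:

```python
# Illustrative decision matrix for the asymmetry-of-risk argument.
# All cost figures are hypothetical, chosen only to reflect the
# qualitative shape of the argument -- NOT actual economic estimates.
#
# "act"    = set policy assuming mainstream climate science is right.
# "ignore" = set policy assuming it is wrong.
outcomes = {
    "act":    {"science_right": 1,  "science_wrong": 2},   # modest cost either way
    "ignore": {"science_right": 50, "science_wrong": 0},   # catastrophic if they're right
}

# Minimax rule: choose the bet whose worst-case cost is smallest.
worst_case = {bet: max(costs.values()) for bet, costs in outcomes.items()}
best_bet = min(worst_case, key=worst_case.get)

print(worst_case)  # {'act': 2, 'ignore': 50}
print(best_bet)    # act
```

The point of the sketch is that the conclusion doesn't depend on the exact numbers: as long as the "ignore and they were right" cell dwarfs every other cell, the worst-case-minimizing choice is to act.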