.:`=-~rANdOm~`-=:. Game Servers (Read Only) > Discussion

Fukushima Reactor Decommissioning


Xrain:

--- Quote from: ·UηİŦ··© on November 22, 2013, 01:21:31 PM ---Is Nuclear Power, as-is in this age, even as efficient or practical as it could be (energy and convenience vs waste and containment during dangerous events/human error)?
Should we just continue to do theory work, and back away from the gamma rays until we figure out just how to operate fission plants without incident or neglect?
Should we spend 50 years worth of research possibly boosted by a singularity, in order to maximize the practical uses of solar power/battery life? Should we do with natural fuels? Nuclear power plus?
Is anything even wrong with Nuclear power (was this just another fluke in the wake of progress)?

--- End quote ---

Suggestions like this give the engineering side of my brain cancer.

You can't step back to purely theoretical work and expect to optimize a technology for efficiency and safety. As great as that would be, it just doesn't work like that in real life.

Let me give you an example to explain why.

Let's go back in time to 2004, when Intel had just released the Prescott Pentium 4 processor. The P4s were power-hungry, inefficient beasts, more effective at heating your home than running your computer. Let's say at this point people get tired of having to buy a new processor every 1.5 years that is twice as fast as its predecessor.

So in response, Intel pulls away from actively manufacturing new processors and goes strictly into R&D. The next processor they make will be the one that gets pretty much everything we can out of silicon (i.e., we hit the limits of physics and quantum mechanics for feature-size reduction). Say this happens in 2025, which seems a reasonable timeline.

So that's 21 years of processor R&D to get the smallest transistors we can out of silicon. Intel spent around $8.5 billion on R&D in 2011, and they seem to have been increasing R&D expenditures by about $1 billion every year for the past couple of years. Halfway between 2004 and 2025 is ~2015, so let's say Intel's average yearly R&D expenditure over those years is $13 billion.
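The averaging above can be sketched as a quick back-of-the-envelope check; the spending figures are the ones quoted in the post, linearly extrapolated:

```python
# Rough check of the R&D total: Intel spent ~$8.5B on R&D in 2011 and
# had been adding ~$1B per year, so the midpoint-year (~2015) figure
# is taken as the average over the whole 2004-2025 span.

years = 2025 - 2004                     # 21 years of pure R&D
spend_2011 = 8.5e9
growth_per_year = 1e9
avg_spend = spend_2011 + growth_per_year * (2015 - 2011)  # ~$12.5B, rounded up to $13B

total = years * 13e9
print(f"total R&D bill: ${total/1e9:.0f} billion")  # 273 billion
```

That $273 billion is just the processor line item; the rest of the platform is estimated separately below.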

So to do all the development necessary, they will have to spend $273 billion to get the best processor possible in 2025. However, the processor is only one small part of the computer: you still have graphics cards, north bridges, communication protocols, software, and all the other technological developments that make your computer what it is. These all have to be developed as well, since I don't think USB 2.0, IDE hard drives, DDR1, and Windows XP will be acceptable for our supercomputer. A conservative estimate of these development costs would be ~$5 trillion, given the massive range of technology and development involved (there was about $240 billion spent on R&D globally in 2012).

So that's ~$5 trillion for the R&D necessary to get our supercomputer. But wait, there is another problem: we haven't made any new computer hardware for 21 years. All of the engineers who last made hardware are, for the most part, retired. So we have to completely retool the entire computer industry, with engineers who have never done anything more than theoretical development. I'd be pretty safe adding another trillion or two just to get all of the manufacturing capability and experience back up to speed.

So that brings us to around $7 trillion just for the R&D and manufacturing capability development. Now we can finally make our computers. Let's say every single person in the world who uses the internet, all 2.4 billion of us, buys one, and everyone buys the mid-range $1,000 computer. There is usually a 200-300% profit markup on these computers, so the actual hardware costs are around $400. However, we now have to fold that massive R&D expenditure into the pricing, otherwise the company goes bankrupt: $7 trillion / 2.4 billion = ~$3,000 per computer.

So now ($400 + $3,000) with a 200% profit markup means your new computer costs ~$10,000 for all intents and purposes. The problem is that not everyone in the world can afford a $10,000 computer, and I promise you very few people would save money for 21 years for a nebulous computer that may or may not come out. I'd say only about 300 million people in the world would be able to afford it. And now that the volume of computers has dropped substantially, the economies of scale drop significantly as well.

Now it's $800 for the components, and the R&D cost per unit is $23,000 ($7 trillion / 300 million). With that same markup, your computer now costs ~$50,000. Only 1% of the population can afford a $50,000 computer, so your volume reduces further... etc., etc.
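The feedback loop above can be sketched as a tiny pricing model. The demand numbers (2.4 billion buyers, then 300 million) and the hardware costs are the post's own figures; treating "200% profit markup" as a 3x multiplier on cost is an assumption that happens to reproduce the first-pass ~$10,000 price:

```python
# Sketch of the pricing death spiral: amortized R&D per unit rises as
# the buyer pool shrinks, which raises the price, which shrinks the
# pool further. Buyer counts and hardware costs are taken from the
# post; the markup factor is an assumed interpretation.

RND_TOTAL = 7e12   # total R&D bill (~$7 trillion)
MARKUP = 3.0       # assumed: 200% profit on cost, i.e. price = 3x cost

def unit_price(buyers, hardware_cost):
    """Retail price once the R&D bill is amortized over the buyer pool."""
    return (hardware_cost + RND_TOTAL / buyers) * MARKUP

# First pass: every internet user (2.4 billion) buys one; hardware
# costs $400 at full economies of scale.
p1 = unit_price(2.4e9, 400)    # roughly $10,000

# Second pass: only ~300 million can afford that, and the lost volume
# pushes hardware costs up to $800. The price lands well into the
# tens of thousands.
p2 = unit_price(300e6, 800)

print(f"price at 2.4B buyers: ${p1:,.0f}")
print(f"price at 300M buyers: ${p2:,.0f}")
```

Iterating further (fewer buyers at each new price) is what drives the spiral toward the house-priced computer described below.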

Basically, the end result of all this is that the computer would eventually cost as much as a house.


In real life, the R&D cost I quoted would be four times as high or more, because the technology you would have in 2025 would be as dissimilar to 2004's as 1992's technology is to today's. There is almost no comparison; you are basically introducing a brand-new technology, similar to the Wright brothers attempting to introduce the world to a 737 instead of a powered glider. To top it off, humanity as a whole has a serious problem sustaining significant investments.

For example, the last nuclear plant in the US was brought online in 1996. Our investment into nuclear R&D in 1990? $737 million. By 1995 that had dropped to $103 million.

Our investment in 2000? $37 million. That's right: $37 million was the total US expenditure on nuclear fission R&D. Walmart spends more building a new branch store.

So R&D spending is pretty much directly correlated with whether you are actually building new nuclear plants.

So I am sorry, but if we want clean, safe nuclear power, the only road there is actual development. You don't optimize all of your technology in theoretical land; it just doesn't work, as much as I wish it did. If it did, the three-stage amplifier I am building would have the performance factor of 700,000 that I calculated, not the 20,000 I am actually getting.
