None of these posts are mine; this is an attempt to replace the sharing function that Google Reader stripped away from us. Annoyingly, it doesn't seem to attribute the blog each item came from... I'll try to figure out a way of making it do that.
Avoid an Untouchably Hot Steering Wheel by Rotating it 180 Degrees When You Park Your Car [Cars]
We all know that parking in the sun results in a hot car, and we already know a great way to get the humid air out, but that doesn't do much for your sizzling steering wheel. Fortunately, redditor IHaveNoSwag has a great solution: turn your wheel 180 degrees when you park. Doing so puts the side you normally touch in the shade and exposes the other side to the sun. Part of the wheel will still be pretty hot, but once you rotate it back to its normal position, the sun-baked side faces away and you no longer have to worry about burning your hands. This is pretty smart, and it has me looking back on 12 years of hand-searing driving I could've avoided.



"Reset" trailer raises bar for indie game graphics ... by a mile
"Reset" trailer raises bar for indie game graphics ... by a mile:

How does a two-man indie development studio create a game with the sort of gameplay and visuals one would normally associate with seven- or eight-figure budgets? That must be the question on the lips of at least some of the 400,000 or so people who've watched the trailer for Finnish studio Theory Interactive's Reset, a first-person puzzle game said to be very much in the mold of Portal...
Section: Games
Tags: Graphics, Video Games
Related Articles:
- Artificial Life announces First Massive Multi Player 3G Game
- Atari asks developers to reimagine Pong
- V2.0 of 'V-girl - your virtual girlfriend' is even more realistic
- TheO Ball lets you get physical with your smartphone
- GameChanger merges traditional board games with the iPad
- Exertris Interactive exercise-bike
Is Reprap UP! to the Chinese challenge?
It appears that Delta Micro's UP! and UP! Mini are aiming to be a serious threat to Reprap and other personal 3D printer offerings in very short order.
The UP! Mini, a complete out-of-the-box 3D printing solution for under $1,000
Pizza-making vending machines on their way to the U.S.
This is technology at its very best :)

Remember how people reacted when McDonald's announced that it was going to start selling pizzas? Well, if buying pies from a chain best known for cheap hamburgers might have been difficult for some folks to get their heads around, they will likely find this even stranger – buying them from a vending machine. Nonetheless, that’s exactly what Dutch company A1 Concepts is hoping Americans will do when its Let’s Pizza machines arrive in the U.S...
Section: Electronics
Tags: Food technology, Pizza, Vending Machine
Related Articles:
- Bluetooth-enabled magnet orders pizza at the push of a button
- The supreme sports sofa – just US$30,000
- Green Box - the eco-friendly pizza box with built-in plates
- Free Wi-Fi coming to Japanese vending machines in 2012
- Cupcake vending machine dispenses baked treats 24/7
- A vending machine for ... golf lessons??
Apple Introduces New MacBook Pro with Retina Display
I need this

Apple has unveiled an all new 15-inch MacBook Pro featuring a “Retina” display with a resolution of 220ppi.
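For the curious, that pixel density follows directly from resolution and screen size. Here is a quick check, assuming the commonly reported 2880×1800 panel and 15.4-inch diagonal (neither figure is stated in this post):

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
# The 2880x1800 panel and 15.4-inch diagonal are the commonly reported
# specs for this machine -- they are not stated in the post itself.
width_px, height_px = 2880, 1800
diagonal_in = 15.4

ppi = math.hypot(width_px, height_px) / diagonal_in
print(f"{ppi:.0f} ppi")  # -> 221 ppi, in line with Apple's quoted 220ppi
```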
Real-time tech demos that showcase the future of console and PC gaming
For me, and I suspect for many of you, upgrading a graphics card used to be one of the most exciting things you could do to a desktop PC. With a dreamy look on my face, I still remember plugging in my first 1MB PCI graphics card, which let me bump my screen resolution up from 800×600 to 1024×768 — and then, later, slotting in a 4MB card, which took the number of colors from a dithered 256 to a jaw-dropping 16 million.
A few years later, in the mid-90s, primarily thanks to 3Dfx, 3D accelerators emerged. As far as gaming is concerned, the next decade was a blur. ATI debuted its Rage graphics card in 1995, with both 2D and 3D acceleration. Nvidia outed the GeForce 256 in 1999, introducing hardware transform and lighting (T&L). In 2000, with the release of the GeForce 2 (and the MX range), 3Dfx had finally met its match; by 2002, the company would file for bankruptcy, and Nvidia would scoop up its intellectual property and employees. By the end of 2000 (GeForce 2 Ultra, my first real graphics card!), graphics cards maxed out at 250MHz core speeds and up to 64MB of 128-bit 7GB/sec DDR RAM.
By 2005, with the release of the ATI Radeon X1000 series (R520) and Nvidia 7000 series (G70), we were up to 512MB of RAM per GPU (and 1GB on dual-GPU cards), effective memory clocks in the 1GHz range, 256-bit memory buses with 50GB/sec bandwidth, and pretty DirectX 9 games such as Battlefield 2. And then the seventh-generation consoles happened.
Designed and conceived in 2003-2004, and released in 2005-2006, the PlayStation 3 (which has a G70-like GPU) and Xbox 360 (R520) had bleeding-edge capabilities at the time. When Sony showed the Final Fantasy 7 tech demo, the office desks and pillows of gamer geeks the world over raised a few inches. Seven years on, though, their graphics capabilities are looking a little long in the tooth. Over the same period of time, video game consoles have replaced PCs as the de facto gaming platform. As a result, as I’m sure you’re aware, almost every A-list video game is first developed for the antiquated consoles, and then ported to the PC, usually with scant few changes to the interface or engine. Never mind the fact that the latest AMD and Nvidia GPUs, with over a thousand shader cores, are now considerably more powerful (and complex) than CPUs, and push 200GB/sec of bandwidth to as much as 4GB of RAM — developers simply aren’t interested in spending the time and money to develop games for such hardware when the bulk of their customers are still using an Xbox or PS3.
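As an aside, all of those bandwidth figures fall out of one simple formula: bus width in bytes multiplied by the effective (double data rate) transfer rate. Here is a quick sanity check; the bus widths and memory clocks are the commonly quoted specs for one card from each era mentioned above, filled in by me rather than taken from this article:

```python
# Peak memory bandwidth = bus width in bytes * effective (DDR) transfer rate.
# The bus widths and effective clocks below are the commonly quoted specs
# for one card from each era -- assumptions, not figures from this article.
cards = [
    ("GeForce 2 Ultra (2000)", 128, 460e6),   # 128-bit bus, 460 MT/s DDR
    ("Radeon X1800 XT (2005)", 256, 1.5e9),   # 256-bit bus, 1.5 GT/s GDDR3
    ("Radeon HD 7970 (2012)",  384, 5.5e9),   # 384-bit bus, 5.5 GT/s GDDR5
]

for name, bus_bits, transfers_per_sec in cards:
    gb_per_sec = bus_bits / 8 * transfers_per_sec / 1e9
    print(f"{name}: {gb_per_sec:.1f} GB/s")   # ~7.4, ~48.0, ~264.0
```

The results line up with the ~7GB/sec, ~50GB/sec, and 200+GB/sec figures quoted in the last few paragraphs.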
All this is about to change, though. Over the next couple of years, the eighth generation of consoles will be released, and nestled within them will be DirectX 11 GPUs. The current rumors are that both the PS4 and Xbox 720 (or whatever they end up being called) will have one or two Southern Islands AMD HD 7000-series GPUs. At long last, consoles will again be comparable to PCs. After 7 years of waiting, we should finally see PC games that fully capitalize on the awesome GPUs that AMD and Nvidia keep pumping out.
And now we’ve finally reached the point of this story: To show you what these next-generation games will look like. Over the last couple of months, a handful of beautiful, rendered-in-real-time tech demos have emerged from Square Enix, Epic Games, and Nvidia. While the Nvidia demo is obviously more of a benchmark, the demos from Square Enix and Epic Games are what we can expect over the next couple of years.
The first is A New Dawn, the follow-up to Nvidia’s rather famous decade-old Dawn tech demo. This video was rendered in real time on two GTX 670 cards in SLI.
The next is the Unreal Engine 3 Samaritan demo, which was recently shown running on a single Nvidia GTX 680 at the Game Developers Conference. It’s worth noting that Unreal Engine 4 has apparently been in development since 2003, and is expected to target eighth-generation console hardware; i.e. the current crop of Nvidia and AMD graphics cards.
Finally, we have Agni’s Philosophy, a real-time tech demo from Square Enix that uses off-the-shelf hardware — but the company hasn’t said specifically which graphics card(s) are being used. If this doesn’t elicit a huge grin and various exultant grunts, I don’t know what will.
Unreal engine 4 shown off on videos
Every year at E3, we see lots of technology demonstrations and new video games that will eventually (we hope) come to market. Looking back at previous years, some of the coolest games I played have been based on the Unreal Engine. The latest version, Unreal Engine 4, has been shown off in a pair of videos outlining different aspects of the game engine.

The first video shows Epic senior technical artist Alan Willard walking through some aspects of the engine, demonstrating things such as dynamic particle lighting and the way lighting in rooms can change based on objects thrown into the environment. It’s very cool to see how all these different objects interact with each other to change the game environment dramatically. The action in the video is real-time, in-engine demonstration footage, not CGI.
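For a rough sense of what "dynamic particle lighting" means, here is a toy sketch that treats each emissive particle as a small point light and accumulates its diffuse contribution at a surface point. This illustrates only the general idea; it is in no way how Unreal Engine 4 actually implements the feature:

```python
import math

def particle_light(surface_pos, surface_normal, particles):
    """Sum diffuse lighting at a surface point from emissive particles,
    each treated as a point light with inverse-square falloff."""
    total = 0.0
    for p_pos, intensity in particles:
        to_light = [p - s for p, s in zip(p_pos, surface_pos)]
        dist = math.sqrt(sum(c * c for c in to_light))
        if dist == 0.0:
            continue  # particle sits exactly on the surface point; skip it
        # Lambertian term: only light arriving at the front face contributes
        n_dot_l = sum(n * (c / dist) for n, c in zip(surface_normal, to_light))
        total += intensity * max(0.0, n_dot_l) / (dist * dist)
    return total

# Three sparks hovering above a floor whose normal points straight up
sparks = [((0.0, 1.0, 0.0), 5.0),
          ((0.5, 0.8, 0.2), 3.0),
          ((-0.3, 1.2, 0.1), 4.0)]
print(particle_light((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), sparks))  # ~10.2
```

The point is that the light sources move with the particles every frame, so a shower of sparks genuinely illuminates the room around it.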
The second video is more cinematic and shows some of these qualities put together into a fully rendered scene. I think the flowing lava looks very good, and I especially like the way the snow swirls and behaves realistically, as do the little sparks coming from the hammer. I can’t wait to see a video game based on this engine; check out the videos and see what you think.
Unreal engine 4 shown off on videos is written by Shane McGlaun & originally posted on SlashGear.
© 2005 - 2012, SlashGear. All rights reserved.
Zeolite thermal storage retains heat indefinitely, absorbs four times more heat than water
Hold onto your hat/life partner/gonads: Scientists in Germany have created small zeolite pellets that can store up to four times more heat than water, loss-free for “lengthy periods of time.” In theory, you can store heat in these pellets, and then extract exactly the same amount of heat after an indeterminate amount of time.
Zeolites (literally “boil stones”) aren’t exactly new: The term was coined in 1756 by Axel Cronstedt, a Swedish mineralogist who noted that some minerals, upon being heated, release large amounts of steam from water that had been previously adsorbed. For the last 250 years, scientists have tried to shoehorn this process into a heat storage system — and now, the Fraunhofer Institute, working with industrial partners, has worked out how to do it.
I will try to explain how this works, but the science is fairly complicated: When Fraunhofer’s zeolite comes into contact with water, a chemical reaction adsorbs the water and emits heat. When heat is applied to the zeolite, the process is reversed and the water is released. Because the heat is locked up in the chemical structure of the zeolite, the material never actually feels warm — which is why this is a “loss-free” storage method.
These two processes can be kept separate — so first you charge the balls up with heat, and then later you can just add water (!) to release the heat. This reaction occurs all along the surface of the zeolite — and because zeolites are porous, a single gram of the material has a surface area of 1,000 square meters (10,700 sq ft). It is for this reason that Fraunhofer’s zeolite can store up to four times more heat than water.
While the hydration/dehydration process is well understood, the main technical challenge was building an actual heat storage system. “First we developed the process engineering, then we looked around to see how we could physically implement the thermal storage principle — i.e. how a storage device has to be constructed, and at which locations heat exchangers, pumps and valves are needed,” says Mike Blicker, the group manager. The resulting setup is fairly complicated. The team has now successfully built a transportable 750-liter storage tank, which is currently being wheeled around Germany to test the storage system in real-world situations.
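To get a feel for what that factor of four means in practice, here is a rough back-of-envelope comparison for a 750-liter tank like Fraunhofer's demo unit. The 60 K temperature swing is my own assumption for a typical hot-water store, and the arithmetic simply takes the "four times more than water" claim at face value:

```python
# Rough energy comparison for a 750-liter thermal store. The 60 K swing
# (20 C to 80 C) is an assumed figure for a household hot-water tank, and
# the factor of four simply takes the article's claim at face value.
c_water = 4.18        # kJ/(kg*K), specific heat of liquid water
delta_t = 60.0        # K
tank_liters = 750     # the Fraunhofer demo tank from the article

water_kj = c_water * delta_t * tank_liters     # 1 L of water is ~1 kg
zeolite_kj = 4 * water_kj                      # the claimed 4x factor
print(f"water:   {water_kj / 3600:.0f} kWh")   # ~52 kWh
print(f"zeolite: {zeolite_kj / 3600:.0f} kWh") # ~209 kWh
```

On these crude assumptions, the same tank holds roughly 52 kWh as hot water versus around 209 kWh as charged zeolite, and the zeolite version doesn't leak heat while it waits.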
Moving forward, this could be huge news for almost every technological and industrial sphere. Currently, there are very few options for storing heat other than water, which can’t store much heat for a given volume and loses it relatively rapidly. Power plants, biogas plants, steel mills, factories — these all produce vast amounts of heat that could (and should) be reused. The heat wouldn’t even have to be used on-site, either: charged-up zeolite balls could be distributed to nearby homes and offices. In the future, Blicker suggests that we could eventually replace household water tanks with zeolite systems, too. “It would be ideal if we were able to devise a modular system that would allow us to construct each storage device to suit the individual requirement,” says Blicker.
Personally, I’m hoping for a module small enough to put inside each of my seven computers. I wonder if that’ll be enough to heat my shower in the morning…
Read more at Fraunhofer, or check out Microsoft’s solution to waste heat: Data furnaces
