“You cannot have food, water, or energy security without climate security. They are interconnected and inseparable. They form four resource pillars on which global security, prosperity and equity stand. Each depends on the others. Plentiful, affordable food requires reliable and affordable access to water and energy. Increasing dependence on coal, oil, and gas threatens climate security, increasing the severity of floods and droughts, damaging food production, exacerbating the loss of biodiversity and, in countries that rely on hydropower, undermining energy security through the impact on water availability. As the world becomes more networked, the impacts of climate change in one country or region will affect the prosperity and security of others around the world.”—
A section of UK Foreign Secretary William Hague’s remarkable speech, ‘The Diplomacy of Climate Change’, presented to the Council on Foreign Relations in New York in 2010.
Those solar panels on top of your roof aren’t just providing clean power; they’re cooling your house, too, according to a team of researchers. Using thermal imaging, the researchers determined that during the day, a building’s ceiling was 5 degrees Fahrenheit cooler under solar panels than under an exposed roof. At night, the panels help hold heat in, reducing heating costs in the winter.
“Talk about positive side-effects,” said Kleissl.
As solar panels sprout on an increasing number of residential and commercial roofs, it becomes more important to consider their impact on buildings’ total energy costs, Kleissl said. His team determined that the amount saved on cooling the building amounted to a 5 percent discount on the solar panels’ price over the panels’ lifetime. To put it another way, for the building the researchers studied, the savings in cooling costs amounted to selling 5 percent more solar energy to the grid than the panels actually produce. Also, the more efficient the solar panels, the bigger the cooling effect, said Kleissl. For the building the researchers analyzed, the panels reduced the amount of heat reaching the roof by about 38 percent.
“There are more efficient ways to passively cool buildings, such as reflective roof membranes,” said Kleissl. “But, if you are considering installing solar photovoltaic, depending on your roof thermal properties, you can expect a large reduction in the amount of energy you use to cool your residence or business.”
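The “5 percent discount” framing is easy to sanity-check with back-of-envelope arithmetic: total cooling savings over the panels’ life, divided by what the panels cost. The dollar figures below are hypothetical, chosen only to make the math land on 5 percent; they are not from the study.

```python
# Back-of-envelope: express lifetime cooling savings as an effective
# discount on the panels' purchase price. All inputs are hypothetical.
def effective_discount(annual_cooling_savings, panel_cost, lifetime_years):
    """Total cooling savings over the panels' life, as a fraction of cost."""
    return annual_cooling_savings * lifetime_years / panel_cost

# e.g. $50/year saved on cooling, a $25,000 system, 25-year lifetime
print(f"{effective_discount(50, 25_000, 25):.0%}")  # -> 5%
```

The same ratio can be read the other way, as Kleissl does: it is equivalent to the panels selling 5 percent more energy to the grid than they actually generate.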
In a recent interview, Schlossnagle said that not only does current technology allow less-qualified people to analyze data, but also that most of the analysis being done is strictly for technical benefit. The real benefit will be realized when the technology is capable of powering real-time business decisions.
Our interview follows.
How has data analysis evolved over the last few years?
Theo Schlossnagle: The general field of data analysis has actually devolved over the last few years because the barrier to entry is dramatically lower. You now have a lot of people attempting to analyze data with no sound mathematics background. I personally see a lot of “analysis” happening that is less mature than your run-of-the-mill graduate-level statistics course or even undergraduate-level signal analysis course.
But where does it need to evolve? Storage is cheaper and more readily available than ever before. This leads organizations to store data like it’s going out of style. That isn’t a bad thing in itself, but it causes a significantly lower signal-to-noise ratio. Data analysis techniques going forward will need to evolve much better noise reduction capabilities.
What does real-time data allow that wasn’t available before?
Theo Schlossnagle: Real-time data has been around for a long time, so in a lot of ways, it isn’t offering anything new. But the tools to process data in real time have evolved quite a bit. CEP systems now provide a much more accessible approach to dealing with data in real time and building millisecond-granularity real-time systems. In a web application, imagine being able to observe something about a user and make an intelligent decision on that data combined with a larger aggregate data stream — all before you’ve delivered the headers back to the user.
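Schlossnagle’s web-application example can be sketched in a few lines. This is a toy illustration of the idea, not a real CEP engine: a rolling aggregate is maintained across all traffic, and each incoming request is compared against it to make a per-request decision before any response headers go out. The metric, window size, and header name are all invented for the sketch.

```python
# Toy sketch of a per-request decision against a live aggregate
# (not a real CEP engine; names and thresholds are hypothetical).
from collections import deque

class RollingAggregate:
    """Rolling mean over the most recent `window` observations."""
    def __init__(self, window=1000):
        self.values = deque(maxlen=window)

    def add(self, value):
        self.values.append(value)

    def mean(self):
        return sum(self.values) / len(self.values) if self.values else 0.0

aggregate = RollingAggregate()

def handle_request(user_page_load_ms):
    """Decide, per request, whether to serve a lighter page variant.
    The decision is made before the response (and its headers) is built."""
    aggregate.add(user_page_load_ms)
    # Users loading much slower than the current aggregate get a lean page.
    if user_page_load_ms > 2 * aggregate.mean():
        return {"X-Variant": "lightweight"}
    return {"X-Variant": "full"}
```

A production system would do this inside a stream processor rather than in-process, but the shape of the decision — one user’s observation joined against a larger aggregate stream, inside the request’s latency budget — is the same.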
“Education is, today at least, a black box. Society invests significantly in primary, secondary, and higher education. Unfortunately, we don’t really know how our inputs influence or produce outputs. We don’t know, precisely, which academic practices need to be curbed and which need to be encouraged. We are essentially swatting flies with a sledgehammer and doing a fair amount of peripheral damage. Learning analytics are a foundational tool for informed change in education. Over the past decade, calls for educational reform have increased, but very little is understood about how the system of education will be impacted by the proposed reforms. I sometimes fear that the solution being proposed to what ails education will be worse than the current problem. We need a means, a foundation, on which to base reform activities. In the corporate sector, business intelligence serves this “decision foundation” role. In education, I believe learning analytics will serve this role. Once we better understand the learning process — the inputs, the outputs, the factors that contribute to learner success — then we can start to make informed decisions that are supported by evidence.”—How data and analytics can improve education - O’Reilly Radar (via infoneer-pulse)
After 10 years of battling incumbent utilities, Marin Clean Energy became California’s first operational community choice aggregation (CCA) authority in 2010. Already, local ratepayers can opt to get 100 percent of their electricity from renewable resources.
CCA offers an option for cities, counties, and collaborations to opt out of the traditional role of energy consumers. Instead, they can become the local retail utility, buying electricity in bulk and selecting their power providers on behalf of their citizens in order to find lower prices or cleaner energy (or even reduce energy demand). Marin Clean Energy started operations last year:
"When it launched last fall, Marin Energy Authority’s goal was to offer 20 percent renewable energy to its customers,” said interim director Dawn Weisz. “We were able to offer 27.5 percent compared to the state-mandated 20 percent.” The state recently increased the mandate to one third. PG&E has about 17 percent under contract, according to Ms. Weisz.
Customers can also opt for the “deep green,” 100 percent renewable service for a 10 percent premium.
Marin Clean Energy not only contracts for a higher portion of renewable energy than PG&E, it’s trying to increase its share of local, distributed generation:
"We are filling a niche market for mid-sized renewable energy generation in the 20 to 60 megawatt range," said Ms. Weisz … "When we went out to solicit renewable power offers, Pacific Gas & Electric told us we would not get any bids. We were looking for 40 megawatts. We were offered over 600. Almost all was solar."
A bit choppy, but here are some highlights from a recent paper on improving quantum computers — i.e., the mind-blowing computers of the future. Just thought I’d post to remind everyone that this hasn’t gone away just because it hasn’t been in the headlines lately. Research is ongoing, steadily improving both our understanding and the technology.
…Using high magnetic fields [they] managed to suppress decoherence, which is one of the key stumbling blocks in quantum computing. “High magnetic fields reduce the level of the noises in the surroundings, so they can constrain the decoherence very efficiently,” Takahashi said. Decoherence has been described as a “quantum bug” that destroys fundamental properties that quantum computers would rely on…
Though the concepts underpinning quantum computing are not new, problems such as decoherence have hindered the construction of a fully functioning quantum computer. Think of decoherence as a form of noise or interference, knocking a quantum particle out of superposition — robbing it of that special property that makes it so useful. If a quantum computer relies on a quantum particle’s ability to be both here and there, then decoherence is the frustrating phenomenon that causes a quantum particle to be either here or there.
The researchers calculated all sources of decoherence in their experiment as a function of temperature, magnetic field, and nuclear isotopic concentration, and suggested the optimal conditions for operating qubits, reducing decoherence by approximately 1,000 times.
I wanted to make a business out of collecting e-waste (electronics). How would I go about it, as far as finding where to dispose of it and who would pay for the disposal of the material? Thank you for your time.
Thanks for asking. I don’t have an immediate good answer on where to start in the e-waste business. Do others have suggestions?
How has one industrialized country created one of the world’s most successful education systems in a way that is completely hostile to testing? That’s the question asked — and answered — in a new documentary called “The Finland Phenomenon: Inside the World’s Most Surprising School System.” Examining the nation with one of the most comparatively successful education systems on the planet, the film contradicts the test-obsessed, teacher-demonizing orthodoxy of education “reform” that now dominates America’s political debate.
Manhattan driving may become somewhat less maddening this week, as the city brings online a new traffic-monitoring system aimed at reducing Midtown congestion.
Over the past year, crews have been outfitting Midtown streets with new hardware that detects traffic flow. Microwave sensors installed in the middle of blocks determine whether a line of cars is waiting at a stoplight, and E-ZPass readers at intersections measure how long it takes a vehicle to get from one street corner to the next.
Those sensors feed information back to the city’s traffic-command center in Queens, where computers ingest the real-time traffic data.
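The reader-based travel-time measurement described above boils down to matching consecutive sightings of the same tag at two locations and averaging the time differences. Here is a minimal sketch of that computation; the data shapes, field names, and intersections are hypothetical, not the city’s actual feed format.

```python
# Illustrative only: deriving per-segment travel times from anonymized
# toll-tag reader timestamps (data shapes are hypothetical).
def segment_travel_times(sightings):
    """sightings: list of (tag_id, reader_location, timestamp_seconds),
    assumed sorted by timestamp. Returns the average seconds between
    each (from_reader, to_reader) pair, based on consecutive sightings
    of the same tag."""
    last_seen = {}   # tag_id -> (location, timestamp)
    totals = {}      # (from_reader, to_reader) -> [sum_seconds, count]
    for tag, loc, ts in sightings:
        if tag in last_seen:
            prev_loc, prev_ts = last_seen[tag]
            entry = totals.setdefault((prev_loc, loc), [0.0, 0])
            entry[0] += ts - prev_ts
            entry[1] += 1
        last_seen[tag] = (loc, ts)
    return {pair: total / count for pair, (total, count) in totals.items()}

times = segment_travel_times([
    ("car1", "5th&34th", 0), ("car2", "5th&34th", 10),
    ("car1", "5th&33rd", 45), ("car2", "5th&33rd", 65),
])
# car1 took 45 s and car2 took 55 s between the two readers
print(times[("5th&34th", "5th&33rd")])  # -> 50.0
```

With averages like these per block, the command center can spot segments where travel times are climbing and retime signals accordingly.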
IBM scientists have perfected a memory technology that is faster, cheaper, and more durable than flash memory, and which could be commercially available within five years.
The report, in PhysOrg, says the breakthrough in a technology called “phase-change memory” will allow computers and servers to boot instantaneously and significantly enhance the overall performance of IT systems.
A promising contender is PCM, which can write and retrieve data 100 times faster than flash, enables high storage capacities, and does not lose data when the power is turned off. Unlike flash, PCM is also very durable and can endure at least 10 million write cycles, compared to current enterprise-class flash at 30,000 cycles or consumer-class flash at 3,000 cycles. While 3,000 cycles will outlive many consumer devices, 30,000 cycles is orders of magnitude too low to be suitable for enterprise applications.
So how does it work?
PCM leverages the resistance change that occurs in the material — an alloy of various elements — when it changes its phase from crystalline — featuring low resistance — to amorphous — featuring high resistance — to store data bits. In a PCM cell, where a phase-change material is deposited between a top and a bottom electrode, phase change can controllably be induced by applying voltage or current pulses of different strengths. These heat up the material and when distinct temperature thresholds are reached cause the material to change from crystalline to amorphous or vice versa.
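The write/read mechanism described above can be captured in a toy model: a strong pulse melts and quenches the alloy into the high-resistance amorphous phase, a moderate pulse anneals it back to the low-resistance crystalline phase, and reading just senses which resistance state the cell is in. The temperature thresholds and bit mapping below are invented for illustration; real devices are far more complex.

```python
# Toy model of a single PCM cell (thresholds are made up for illustration).
CRYSTALLIZE_TEMP = 300   # moderate heat anneals -> crystalline (low R)
MELT_TEMP = 600          # strong pulse melts & quenches -> amorphous (high R)

class PCMCell:
    def __init__(self):
        self.phase = "crystalline"

    def apply_pulse(self, temperature):
        """Phase change is induced by pulses of different strengths."""
        if temperature >= MELT_TEMP:
            self.phase = "amorphous"       # melt-quench: high resistance
        elif temperature >= CRYSTALLIZE_TEMP:
            self.phase = "crystalline"     # anneal: low resistance
        # below both thresholds: a weak read pulse leaves the phase alone

    def read(self):
        """Sense resistance: low (crystalline) -> 0, high (amorphous) -> 1."""
        return 0 if self.phase == "crystalline" else 1

cell = PCMCell()
cell.apply_pulse(700)   # strong write pulse -> amorphous
print(cell.read())      # -> 1
cell.apply_pulse(400)   # moderate write pulse -> crystalline
print(cell.read())      # -> 0
```

Because the phase persists with no power applied, the stored bit survives power-off, which is what makes PCM attractive for instant-boot systems.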