smartercities:

MIT maps PV potential for Cambridge, MA | Green Futures Magazine

A new 3D map covering 17,000 rooftops in Cambridge, Massachusetts, means communities can estimate the benefits of installing photovoltaic panels on a particular building at a glance. The Mapdwell Project, developed by MIT’s Sustainable Design Lab, combines Google satellite imagery with light detection and ranging (lidar) data. It improves on previous models by taking account of roof shapes, physical obstructions and weather conditions, offering a more accurate calculation of potential hourly solar energy production.
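
The article doesn’t spell out Mapdwell’s model, but the calculation it gestures at (hourly sunlight on a roof segment, discounted for shading from nearby obstructions and for typical weather, then multiplied by panel area and efficiency) can be sketched in a few lines. Everything below, from the names to the coefficients and sample data, is an illustrative assumption rather than the project’s actual code:

    # Illustrative sketch only: estimate hourly PV output for one roof segment.
    # Coefficients and sample data are assumptions, not Mapdwell's model.
    def hourly_pv_output_kwh(irradiance_kw_m2, shading_factor, weather_factor,
                             panel_area_m2, panel_efficiency=0.18):
        # irradiance_kw_m2: clear-sky irradiance on the tilted roof plane
        # shading_factor:   0..1 fraction left unshaded (from the 3D/lidar model)
        # weather_factor:   0..1 discount for typical cloud cover that hour
        return (irradiance_kw_m2 * shading_factor * weather_factor
                * panel_area_m2 * panel_efficiency)

    # Toy example: three daylight hours on a 25 m^2 south-facing roof segment.
    hours = [(0.4, 0.9, 0.7), (0.8, 1.0, 0.7), (0.6, 0.8, 0.7)]
    total_kwh = sum(hourly_pv_output_kwh(irr, shade, wx, panel_area_m2=25.0)
                    for irr, shade, wx in hours)
    print(round(total_kwh, 2))  # -> 5.17 kWh across those three hours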

One of the latest artificial intelligence systems from MIT is as smart as a 4-year-old

When kids eat glue, they’re exhibiting a lack of common sense. Computers equipped with artificial intelligence, it turns out, suffer from a similar problem.

While computers can tell you the chemical composition of glue, most can’t tell you if it is a gross choice for a snack. They lack the common sense that is ingrained in adult humans. 

For the last decade, MIT researchers have been building a system called ConceptNet that can equip computers with common-sense associations. It can process that a person may desire a dessert such as cake, which has the quality of being sweet. The system is structured as a graph, with connections between related concepts and terms.
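
ConceptNet’s real schema isn’t shown here, but the structure described, concept nodes joined by labeled common-sense relations such as Desires and HasProperty, is easy to picture with a toy graph. This is an illustrative sketch, not ConceptNet’s API:

    # Toy common-sense graph in the spirit of ConceptNet: nodes are concepts,
    # edges carry relation labels. Illustrative only, not ConceptNet's real API.
    from collections import defaultdict

    edges = defaultdict(list)

    def add_edge(start, relation, end):
        edges[start].append((relation, end))

    add_edge("person", "Desires", "dessert")
    add_edge("cake", "IsA", "dessert")
    add_edge("cake", "HasProperty", "sweet")
    add_edge("glue", "HasProperty", "sticky")

    def related(concept):
        # One hop out from a concept: what the graph "knows" about it.
        return [f"{concept} --{rel}--> {end}" for rel, end in edges[concept]]

    print(related("cake"))  # ['cake --IsA--> dessert', 'cake --HasProperty--> sweet']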

The University of Illinois at Chicago announced today that its researchers put ConceptNet to the test with an IQ assessment developed for young children. ConceptNet 4, the second-most recent iteration from MIT, earned a score equivalent to that of an average 4-year-old. It did well at vocabulary and at recognizing similarities, but poorly at answering “why” questions. Children of that age would normally score similarly across the categories.

MIT Builds An Open-Source Platform For Your Body | Fast Company

MIT Media Lab’s 11-day health care hackathon pulled students and big companies together with a common goal: Healing a broken industry.

Siberian temperatures. Eleven grueling days, navigating rough terrain. Six teams, matched for talent, competing for glory at the end. The Iditarod? Nah, just the annual MIT Health and Wellness Hackathon.

This isn’t your average social app-fest. The goal is to jump-start an open source platform where apps that track all different aspects of your bodily health can exchange information. It’s a Sisyphean task, since most digital health solutions today are trapped in silos, but the organizers believe they can change that by enfranchising big companies instead of trying to disrupt them.
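
The article doesn’t describe the platform’s data model, so the following is only a hedged illustration of the interoperability goal: separate health apps writing observations to a shared, open record format that any other app can read, instead of keeping them in their own silos. All field names and structure below are assumptions:

    # Purely illustrative: health apps exchanging observations through a shared,
    # open record format. None of this reflects the hackathon teams' designs.
    import json
    from datetime import datetime, timezone

    def make_observation(source_app, metric, value, unit):
        # A minimal, self-describing record any participating app could parse.
        return {
            "source": source_app,
            "metric": metric,        # e.g. "heart_rate", "sleep_hours"
            "value": value,
            "unit": unit,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        }

    # App A (a fitness tracker) publishes; App B (a sleep coach) can read it back.
    shared_store = [
        make_observation("fit_tracker", "heart_rate", 62, "bpm"),
        make_observation("sleep_coach", "sleep_hours", 7.5, "h"),
    ]
    print(json.dumps(shared_store, indent=2))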

poptech:

Autonomous robotic plane flies indoors at MIT

For decades, academic and industry researchers have been working on control algorithms for autonomous helicopters — robotic helicopters that pilot themselves, rather than requiring remote human guidance. Dozens of research teams have competed in a series of autonomous-helicopter challenges posed by the Association for Unmanned Vehicle Systems International (AUVSI); progress has been so rapid that the last two challenges have involved indoor navigation without the use of GPS.

But MIT’s Robust Robotics Group — which fielded the team that won the last AUVSI contest — has set itself an even tougher challenge: developing autonomous-control algorithms for the indoor flight of GPS-denied airplanes. At the 2011 International Conference on Robotics and Automation (ICRA), a team of researchers from the group described an algorithm for calculating a plane’s trajectory; in 2012, at the same conference, they presented an algorithm for determining its “state” — its location, physical orientation, velocity and acceleration. Now, the MIT researchers have completed a series of flight tests in which an autonomous robotic plane running their state-estimation algorithm successfully threaded its way among pillars in the parking garage under MIT’s Stata Center.
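
The group’s papers aren’t reproduced here, but the “state” being estimated (position, orientation, velocity, acceleration) is the kind of vector a filter propagates between sensor readings. The sketch below is a generic, textbook constant-velocity prediction step, not the Robust Robotics Group’s actual algorithm:

    # Generic constant-velocity prediction of a vehicle state between sensor
    # updates. A textbook sketch, not the group's published algorithm.
    from dataclasses import dataclass

    @dataclass
    class State:
        x: float      # position east (m)
        y: float      # position north (m)
        z: float      # altitude (m)
        yaw: float    # heading (rad); full orientation is simplified away here
        vx: float     # velocity components (m/s)
        vy: float
        vz: float

    def predict(s: State, dt: float) -> State:
        # Assume velocity is constant over a short dt; a real estimator would also
        # integrate IMU accelerations and correct against laser or vision data.
        return State(s.x + s.vx * dt, s.y + s.vy * dt, s.z + s.vz * dt,
                     s.yaw, s.vx, s.vy, s.vz)

    s = State(0.0, 0.0, 1.5, 0.0, 8.0, 0.0, 0.0)  # cruising at 8 m/s
    print(predict(s, 0.02))                        # estimated state 20 ms later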

DARPA and NIH to fund ‘human body on a chip’ research | KurzweilAI

MIT-led team to receive up to $32 million from DARPA and NIH to develop technology that could accelerate pace and efficiency of pharmaceutical testing 

Researchers in the Department of Biological Engineering at MIT plan to develop a technology platform that will mimic human physiological systems in the laboratory, using an array of integrated, interchangeable engineered human tissue constructs, with up to $32 million in funding over the next five years from the Defense Advanced Research Projects Agency (DARPA) and the National Institutes of Health (NIH).

A cooperative agreement between MIT and DARPA worth up to $26.3 million will be used to establish a new program titled “Barrier-Immune-Organ: MIcrophysiology, Microenvironment Engineered TIssue Construct Systems” (BIO-MIMETICS) at MIT, in collaboration with researchers at the Charles Stark Draper Laboratory, MatTek Corp. and Zyoxel Ltd.

MIT’s Semi-Autonomous Car Balances Human, Computer Control | Autopia | Wired.com

There are autonomous cars, and there are drivers’ cars. Now we have something in the middle. Sterling Anderson and Karl Iagnemma of MIT have created a semi-autonomous driving system that gives drivers full control of the vehicle, but kicks in when the car gets too close to another object. This sounds like the adaptive cruise control found in expensive Mercedes-Benzes, but this software is much more nuanced and ambitious than anything on the road.
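
Wired doesn’t detail the control law, but the behavior it describes (leave the driver alone until an obstacle gets too close, then take over as much as needed) can be illustrated with a toy threshold-and-blend rule. This is a sketch of the general idea, not Anderson and Iagnemma’s actual controller:

    # Toy threshold-and-blend shared control: the driver's steering command passes
    # through untouched until an obstacle is too close, then an avoidance command
    # is blended in. Not the MIT system's actual control law.
    def shared_steering(driver_cmd, avoid_cmd, obstacle_distance_m,
                        intervene_below_m=10.0):
        if obstacle_distance_m >= intervene_below_m:
            return driver_cmd  # human keeps full control
        # Ramp authority toward the computer as the obstacle gets closer.
        authority = 1.0 - obstacle_distance_m / intervene_below_m
        return (1.0 - authority) * driver_cmd + authority * avoid_cmd

    print(shared_steering(0.0, 0.4, obstacle_distance_m=25.0))  # 0.0, driver only
    print(shared_steering(0.0, 0.4, obstacle_distance_m=2.5))   # 0.3, mostly avoidance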

What My 11 Year Old’s Stanford Course Taught Me About Online Education - Forbes

My 11-year-old son just took a course at Stanford. That has a nice ring to it, but it is actually meaningless because these days anyone can take a course at Stanford. You don’t even have to pay. All you need is access to a computer and a reasonable Internet connection. So what we can say is that my 11-year-old son just watched a bunch of videos on the Internet.

That doesn’t make for an interesting post except that this ‘bunch of videos’ is currently being heralded as the future of higher education. In the New York Times, David Brooks saw courses like the one my son took as a tsunami about to hit campuses all over the world. And he isn’t alone. Harvard’s Clay Christensen sees it as a transformative technology that will change education forever. And along with Stanford, many other institutions, most notably Harvard and MIT, are leaping into the online mix. This is attracting attention and investment dollars. It has people nervous and excited. So I wondered: what happens when someone who has grown up online encounters one of these new ventures?

The course my son just completed was ‘Game Theory’ taught by Matthew Jackson and Yoav Shoham.

EyeRing finger-mounted connected cam captures signs and dollar bills, identifies them with OCR (hands-on) — Engadget

Ready to swap that diamond for a finger-mounted camera with a built-in trigger and Bluetooth connectivity? If it could help identify otherwise indistinguishable objects, you might just consider it. The MIT Media Lab’s EyeRing project was designed with an assistive focus in mind, helping visually disabled persons read signs or identify currency, for example, while also serving to assist children during the tedious process of learning to read. Instead of hunting for a grownup to translate text into speech, a young student could direct EyeRing at words on a page, hit the shutter release, and receive a verbal response from a Bluetooth-connected device, such as a smartphone or tablet. EyeRing could be useful for other individuals as well, serving as an ever-ready imaging device that enables you to capture pictures or documents with ease, transmitting them automatically to a smartphone, then on to a media sharing site or a server.
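
The workflow described is essentially a capture-recognize-speak loop: snap an image with the ring, send it over Bluetooth to the phone, run OCR on it, and read the result aloud. Here is a rough sketch of that flow, with placeholder back ends rather than EyeRing’s real software:

    # Sketch of the capture -> OCR -> speech loop the hands-on describes. The
    # capture, OCR, and text-to-speech back ends are placeholders; the real demo
    # paired the ring with an Android phone over Bluetooth.
    def capture_from_ring():
        return b"<jpeg bytes>"  # placeholder: image received from the ring

    def run_ocr(image_bytes):
        return "ring"  # placeholder: a real app would call an OCR engine here

    def speak(text):
        print(f"speaking: {text}")  # placeholder for a text-to-speech engine

    def on_shutter_release():
        image = capture_from_ring()
        text = run_ocr(image).strip()
        speak(text if text else "no text found")

    on_shutter_release()  # -> speaking: ring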

We peeked at EyeRing during our visit to the MIT Media Lab this week, and while the device is buggy at best in its current state, we can definitely see how it could fit into the lives of people unable to read posted signs, text on a page or the monetary value of a currency note. We had an opportunity to see several iterations of the device, which has come quite a long way in recent months, as you’ll notice in the gallery below. The demo, which like many at the Lab includes a Samsung Epic 4G, transmits images from the ring to the smartphone, where text is highlighted and read aloud using a custom app. Snapping the text “ring,” it took a dozen or so attempts before the rig correctly read the word aloud, but considering that we’ve seen much more accurate OCR implementations, it’s reasonable to expect a more advanced version of the software to make its way out once the hardware is a bit more polished — at this stage, EyeRing is more about the device itself, which had some issues of its own maintaining a link to the phone. You can get a feel for how the whole package works in the video after the break, which required quite a few takes before we were able to capture an accurate reading.

First wirelessly controlled drug-delivery chip successfully tested | KurzweilAI

Researchers from MIT and MicroCHIPS Inc. have developed and tested a programmable, wirelessly controlled chip to administer daily doses of an osteoporosis drug normally given by injection.

This is the first successful test of such a device and could help usher in a new era of telemedicine — delivering health care over a distance, say MIT professors Robert Langer and Michael Cima, who had the idea 15 years ago.

Pharmacy on the chip

“You could literally have a pharmacy on a chip,” says Langer. “You can do remote control delivery, you can do pulsatile drug delivery, and you can deliver multiple drugs.”
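
The chip’s command protocol isn’t described in the article, but the capabilities Langer lists (remotely triggered, pulsatile, potentially multi-drug dosing) boil down to a schedule of timed release commands aimed at individually sealed reservoirs. The sketch below is purely illustrative; the reservoir layout and timings are assumptions, not MicroCHIPS’ design:

    # Purely illustrative dosing scheduler in the spirit of "pharmacy on a chip";
    # the reservoir layout and timings are assumptions, not MicroCHIPS' design.
    schedule = [
        # (day, reservoir_id): one sealed reservoir opened per scheduled day
        (1, 0),
        (2, 1),
        (3, 2),
    ]

    opened = set()

    def release_due_doses(day_today):
        # Open each reservoir exactly once when its scheduled day arrives.
        for day, reservoir in schedule:
            if day == day_today and reservoir not in opened:
                opened.add(reservoir)
                print(f"day {day}: open reservoir {reservoir}, release daily dose")

    for day in (1, 2, 3):
        release_due_doses(day)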

Implantable Tumor Tracker Is a Tiny Lab That Lives Inside Your Body And Reports Back | Popular Science

Rather than bringing people into the lab, researchers at MIT are putting tiny labs into people via an implantable capsule that can track the growth of a tumor or detect heart deterioration, or even silent heart attacks, from inside the body. The miniature lab is small enough to implant via a needle during a normal biopsy, and can remain inside the body, vigilantly watching for increased tumor growth. The inside of the device is filled with nanoparticles, each sporting an antibody specially designed to bind to specific molecules like those produced by certain kinds of tumors or by damaged heart muscle cells.

via joshbyard:

Twine, “the simplest possible way to get the objects in your life texting, tweeting or emailing.” This 2.5” square, created by a duo of MIT Media Lab graduates, offers wifi connectivity, internal and external sensors, and two AAA batteries that keep it running for two months. John Pavlus has more.
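
The blurb doesn’t say how Twine decides when to text, tweet or email, so the following is only a hedged illustration of the sensor-event-to-message idea it implies; the sensor names and rules are made-up assumptions:

    # Hedged illustration of the sensor-event-to-message idea; Twine's actual rule
    # system and sensor names are not described in the blurb, so these are made up.
    def notify(channel, message):
        print(f"[{channel}] {message}")  # stand-in for sending a text/tweet/email

    def on_sensor_reading(sensor, value):
        # Example rule: alert when an external moisture sensor gets wet.
        if sensor == "moisture" and value > 0.5:
            notify("sms", "Twine: the basement floor is wet")
        # Example rule: alert when the internal temperature sensor runs hot.
        if sensor == "temperature_c" and value > 30:
            notify("email", f"Twine: it is {value} C in here")

    on_sensor_reading("moisture", 0.8)       # triggers the SMS rule
    on_sensor_reading("temperature_c", 22)   # no rule fires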

via curiositycounts: