MIT Media Lab
Latest
Change up your space with robotic furniture from Ori
Need a way to dramatically improve your living space? How about robotic furniture? No, it's not some far-off dream for the future. It's a collaboration between MIT Media Lab spin-off Ori and designer Yves Béhar.
Brittany Vincent, 07.11.2016

This web game shows that landing a Falcon 9 rocket is pretty much impossible
You thought the carrier landing stage in Top Gun was a nightmare to pull off? Then get ready to scream obscenities you didn't know you knew at MIT Media Lab's SpaceX Falcon 9 Lander. This 8-bit web game combines all of the pulse-pounding excitement of landing a multi-million-dollar prototype spacecraft with the rage-inducing control scheme of a 1983 Yugo. The goal of the game is simple: get the rocket to set down gently on an ocean-going platform, using only the WASD keys, before its limited fuel supply runs out. Now try it without giving yourself a coronary.
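For the curious, the whole challenge boils down to a tiny physics loop: gravity pulls the rocket down, the engine burns fuel to fight it, and you only "land" if you touch down gently. Here's a rough Python sketch of that loop; the numbers and the touchdown limit are our own assumptions, not anything pulled from the game's code.

```python
# A minimal sketch of the landing loop a game like this implies; the gravity,
# thrust, fuel and touchdown numbers are assumptions, not the game's actual values.
GRAVITY = 9.8          # m/s^2, always pulling down
THRUST_ACCEL = 20.0    # m/s^2 while the engine fires (assumed)
BURN_RATE = 1.0        # fuel units burned per second (assumed)
DT = 0.1               # simulation step, seconds

def simulate(control, altitude=1000.0, velocity=0.0, fuel=60.0):
    """control(altitude, velocity, fuel) -> True to fire the engine this step."""
    while altitude > 0.0:
        burning = fuel > 0.0 and control(altitude, velocity, fuel)
        velocity += (-GRAVITY + (THRUST_ACCEL if burning else 0.0)) * DT
        altitude += velocity * DT
        if burning:
            fuel -= BURN_RATE * DT
    return "landed" if abs(velocity) < 5.0 else "crashed"

# A crude late "suicide burn": fall freely, then fire once you're low and fast.
# Spoiler: it prints "crashed", which is rather the point of the game.
print(simulate(lambda alt, vel, fuel: alt < 300 and vel < -10))
```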
Andrew Tarantola, 09.17.2015

Formlabs FORM 1 high-resolution 3D printer spotted in the wild, we go eyes on (video)
Last time we checked in with the 3D printing upstarts over at Formlabs, their Kickstarter was doing splendidly, having more than doubled its initial funding target. Well, less than a month later, and with the money still rolling in, the total stands (at the time of writing) at a somewhat impressive $2,182,031 -- over 20 times its initial goal. When we heard that the team behind it, along with some all-important working printers, had rolled into town, how could we resist the opportunity to catch up? The venue? London's 3D print show, where, amongst all the printed bracelets and figurines, the FORM 1 stood out like a sore thumb -- a wonderfully orange, geometrically formed one at that. We elbowed our way through the permanent four-deep crowd at the booth to take a closer look, and as the show is running for another two days, you can too if you're in town. Or you could just click past the break for more.
James Trew, 10.19.2012

FORM 1 delivers high-end 3D printing for an affordable price, meets Kickstarter goal in 1 day
A $2,300 3D printer isn't really anything special anymore. We've seen them as cheap as $350, in fact. But all those affordable units are of the extrusion variety -- meaning they lay out molten plastic in layers. The FORM 1 opts for a method called stereolithography, which blasts liquid resin with a laser, causing it to cure. This is one of the most accurate methods of additive manufacturing, but also one of the most expensive thanks to the need for high-end optics, with units typically costing tens of thousands of dollars. A group of recent grads from the MIT Media Lab has managed to replicate the process for a fraction of the cost and founded a company called Formlabs to deliver their innovations to the public. Like many other startups, the group turned to Kickstarter to get off the ground, and it easily passed its $100,000 goal within the first day. As of this writing, over $250,000 has been pledged and the first 25 printers have already been claimed. The FORM 1 is capable of creating objects with layers as thin as 25 microns -- that's 75 percent thinner than even the new Replicator 2's. The company didn't scrimp on design and polish to meet its affordability goals, either. The base is stylish brushed metal, with the small build platform protected by an orange plastic shell. There's even a companion software tool for simple model creation. You can still get one, though the price of entry is now $2,500, at the Kickstarter page. Or you can simply get a sneak peek in the gallery and video below.
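A quick bit of arithmetic puts that layer-height claim in perspective; we're assuming the Replicator 2's published 100-micron layer height here.

```python
# Back-of-the-envelope check on the layer-height comparison; the Replicator 2's
# 100-micron layer height is assumed from its published specs at the time.
form1_layer_um = 25
replicator2_layer_um = 100

print(f"{1 - form1_layer_um / replicator2_layer_um:.0%} thinner")   # 75% thinner

# A 50 mm tall part sliced at 25 microns needs four times as many layers.
print(50 * 1000 // form1_layer_um, "layers at 25 microns")          # 2000 layers
```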
Terrence O'Brien, 09.26.2012

MIT Media Lab's Tensor Displays stack LCDs for low-cost glasses-free 3D (hands-on video)
Glasses-free 3D may be the next logical step in TV's evolution, but we have yet to see a convincing device make it to market that doesn't come along with a five-figure price tag. The sets that do come within range of tickling our home theater budgets won't blow you away, and it's not unreasonable to expect that trend to continue through the next few product cycles. A dramatic adjustment in our approach to glasses-free 3D may be just what the industry needs, so you'll want to pay close attention to the MIT Media Lab's latest brew. Tensor Displays combine layered low-cost panels with some clever software that assigns and alternates the image at a rapid pace, creating depth that actually looks fairly realistic. Gordon Wetzstein, one of the project's creators, explained that the solution essentially "[takes] the complexity away from the optics and [puts] it in the computation," and since software solutions are far more easily scaled than their hardware equivalents, the Tensor Display concept could result in less expensive, yet superior, 3D products. We caught up with the project at SIGGRAPH, where the first demonstration included four fixed images, which employed a similar concept to the LCD version, but with backlit inkjet prints instead of motion-capable panels. Each displaying a slightly different static image, the transparencies were stacked to give the appearance of depth without the typical cost. The version that shows the most potential, however, consists of three stacked LCD panels, each displaying a slightly different pattern that flashes back and forth four times per frame of video, creating a three-dimensional effect that appears smooth and natural. The result was certainly more tolerable than the glasses-free 3D we're used to seeing, though it's surely a long way from being a viable replacement for active-glasses sets -- Wetzstein said the solution could make its way to consumers within the next five years. Currently, the technology works best in a dark room, where it's able to present a consistent image. Unfortunately, this meant the light levels around the booth were a bit dimmer than what our camera required, resulting in the underexposed, yet very informative, hands-on video you'll see after the break.
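The "put it in the computation" part is the interesting bit: software has to solve for the pattern on each stacked panel so that every viewing angle sees the right image through the stack. Here's a deliberately tiny, one-dimensional toy of that idea in Python -- two transmissive layers, a handful of views, plain gradient descent -- which is only meant to illustrate the principle, not reproduce the Lab's actual algorithm.

```python
import numpy as np

# Toy 1D model of a two-layer light-field display (an illustration of the idea,
# not the Media Lab's algorithm): a viewer at angle v sees, at pixel x, the
# product of back-layer transmittance back[x] and front-layer transmittance
# front[x + v]. We fit layer patterns that approximate several target views.
rng = np.random.default_rng(0)
N, views = 64, [-1, 0, 1]
targets = {v: rng.uniform(0.2, 0.8, N) for v in views}   # made-up target views

back = np.full(N + 2, 0.5)    # one pixel of padding on each side for the shifts
front = np.full(N + 2, 0.5)
lr = 0.05

for _ in range(2000):
    g_back = np.zeros_like(back)
    g_front = np.zeros_like(front)
    for v in views:
        idx = np.arange(N)
        pred = back[idx + 1] * front[idx + 1 + v]
        err = pred - targets[v]
        g_back[idx + 1] += err * front[idx + 1 + v]
        g_front[idx + 1 + v] += err * back[idx + 1]
    back = np.clip(back - lr * g_back, 0.0, 1.0)     # physical transmittance is 0..1
    front = np.clip(front - lr * g_front, 0.0, 1.0)

# Random targets aren't exactly representable by two layers, so expect a residual.
for v in views:
    err = back[1:N + 1] * front[1 + v:N + 1 + v] - targets[v]
    print(f"view {v:+d}: RMS error {np.sqrt((err ** 2).mean()):.3f}")
```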
Zach Honig, 08.09.2012

MIT projection system extends video to peripheral vision, samples footage in real-time
Researchers at the MIT Media Lab have developed an ambient lighting system for video that would make Philips' Ambilight tech jealous. Dubbed Infinity-by-Nine, the rig analyzes frames of footage in real-time -- with consumer-grade hardware no less -- and projects rough representations of the video's edges onto a room's walls or ceiling. Synchronized with camera motion, the effect aims to extend the picture into a viewer's peripheral vision. MIT guinea pigs have reported a greater feeling of involvement with video content when Infinity-by-Nine was in action, and some even claimed to feel the heat from on-screen explosions. A five-screen multimedia powerhouse it isn't, but the team suggests that the technology could be used for gaming, security systems, user interface design and other applications. Head past the jump to catch the setup in action.
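The per-frame work is conceptually simple: sample the borders of each video frame and blow them up into coarse blobs of color for the projectors. A rough sketch of that sampling step follows; it's our own approximation, not the Infinity-by-Nine pipeline.

```python
import numpy as np

def peripheral_extension(frame, strip=32, blocks=8):
    """Rough sketch of edge sampling for wall projection (assumed approach):
    take thin strips from the left and right of a video frame and average-pool
    them into coarse colored blocks suitable for projecting onto side walls."""
    h, w, _ = frame.shape
    def pool(region):
        rows = np.array_split(region, blocks, axis=0)
        return np.stack([r.reshape(-1, 3).mean(axis=0) for r in rows])  # blocks x RGB
    return pool(frame[:, :strip]), pool(frame[:, w - strip:])

# Example on a synthetic 720p frame; a real setup would do this for every frame.
frame = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
left_wall, right_wall = peripheral_extension(frame)
print(left_wall.shape, right_wall.shape)   # (8, 3) coarse color blocks per side
```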
Alexis Santos, 06.25.2012

MIT researchers teach computers to recognize your smile, frustration
Wipe that insincere, two-faced grin off your face -- your computer knows you're full of it. Or at least it will once it gets a load of MIT's research on classifying facial expressions of frustration and delight. By teaching a computer how to differentiate between involuntary smiles of frustration and genuine grins of joy, researchers hope to be able to deconstruct the expression into low-level features. What's the use of a disassembled smile? In addition to helping computers suss out your mood, the team hopes the data can be used to help people with autism learn to more accurately decipher expressions. Find out how MIT is making your computer a better people person than you after the break. [Thanks, Kaustubh]
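The classification step itself is standard machine learning: boil each smile down to a handful of low-level features and train a classifier to separate frustration from delight. Here's a generic sketch with scikit-learn on synthetic data -- the features (smile onset speed, duration, brow furrow) are hypothetical stand-ins, not the study's actual feature set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Generic sketch of the classification step on synthetic data; the feature
# names are hypothetical stand-ins, not the MIT study's actual feature set.
rng = np.random.default_rng(1)
n = 200
# Columns: smile onset speed, smile duration (s), brow-furrow amount
delighted = np.column_stack([rng.normal(0.3, 0.1, n),
                             rng.normal(4.0, 1.0, n),
                             rng.normal(0.1, 0.05, n)])
frustrated = np.column_stack([rng.normal(0.8, 0.1, n),
                              rng.normal(1.5, 0.5, n),
                              rng.normal(0.5, 0.1, n)])

X = np.vstack([delighted, frustrated])
y = np.array([0] * n + [1] * n)          # 0 = delight, 1 = frustration

clf = LogisticRegression(max_iter=1000).fit(X, y)
sample = [[0.75, 1.2, 0.45]]             # a fast, short smile with a furrowed brow
print("frustrated" if clf.predict(sample)[0] else "delighted")
```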
Sean Buckley, 05.28.2012

ZeroN slips surly bonds, re-runs your 3D gestures in mid-air
Playback of 3D motion capture with a computer is nothing new, but how about with a solid levitating object? MIT's Media Lab has developed ZeroN, a large magnet and 3D actuator, which can fly an "interaction element" (aka ball bearing) and control its position in space. You can also bump it to and fro yourself, with everything scanned and recorded, and then have real-life, gravity-defying playback showing planetary motion or virtual cameras, for example. It might be impractical right now as a Minority Report-type object-based input device, but check the video after the break to see its awesome potential for 3D visualization.
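Strip away the levitation hardware and the record-and-replay part is just timestamped motion capture played back as actuator setpoints. A bare-bones sketch, where read_tracked_position and command_magnet are hypothetical stand-ins for ZeroN's tracking and magnetic control:

```python
import time

def record(read_tracked_position, duration=5.0, rate_hz=30):
    """Sample the ball's tracked (x, y, z) position for `duration` seconds."""
    samples, dt = [], 1.0 / rate_hz
    start = time.time()
    while time.time() - start < duration:
        samples.append((time.time() - start, read_tracked_position()))
        time.sleep(dt)
    return samples

def replay(samples, command_magnet):
    """Feed the recorded positions back to the actuator as setpoints."""
    start = time.time()
    for t, pos in samples:
        while time.time() - start < t:
            time.sleep(0.001)
        command_magnet(pos)   # hypothetical: move the magnetic trap to `pos`
```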
Steve Dent, 05.14.2012

EyeRing finger-mounted connected cam captures signs and dollar bills, identifies them with OCR (hands-on)
Ready to swap that diamond for a finger-mounted camera with a built-in trigger and Bluetooth connectivity? If it could help identify otherwise indistinguishable objects, you might just consider it. The MIT Media Lab's EyeRing project was designed with an assistive focus in mind, helping visually impaired people read signs or identify currency, for example, while also assisting children through the tedious process of learning to read. Instead of hunting for a grownup to translate text into speech, a young student could point EyeRing at words on a page, hit the shutter release, and receive a verbal response from a Bluetooth-connected device, such as a smartphone or tablet. EyeRing could be useful for other people as well, serving as an ever-ready imaging device that lets you capture pictures or documents with ease, transmitting them automatically to a smartphone and on to a media sharing site or a server. We peeked at EyeRing during our visit to the MIT Media Lab this week, and while the device is buggy at best in its current state, we can definitely see how it could fit into the lives of people unable to read posted signs, text on a page or the value of a currency note. We had an opportunity to see several iterations of the device, which has come quite a long way in recent months, as you'll notice in the gallery below. The demo, which like many at the Lab includes a Samsung Epic 4G, transmits images from the ring to the smartphone, where text is highlighted and read aloud using a custom app. When we snapped the word "ring," it took a dozen or so attempts before the rig correctly read it aloud, but considering that we've seen much more accurate OCR implementations, it's reasonable to expect a more advanced version of the software once the hardware is a bit more polished -- at this stage, EyeRing is more about the device itself, which had some issues of its own maintaining a link to the phone. You can get a feel for how the whole package works in the video after the break, which required quite a few takes before we were able to capture an accurate reading.
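The pipeline being demoed -- snap a photo, run OCR, speak the result -- can be approximated with off-the-shelf libraries. A rough Python sketch follows, assuming the Tesseract engine, pytesseract, Pillow and pyttsx3 are installed; this is not EyeRing's actual software.

```python
# Rough approximation of the capture-to-speech pipeline the blurb describes,
# using off-the-shelf libraries (not EyeRing's actual code). Assumes the
# Tesseract OCR engine, pytesseract, Pillow and pyttsx3 are installed.
from PIL import Image
import pytesseract
import pyttsx3

def read_aloud(image_path):
    text = pytesseract.image_to_string(Image.open(image_path)).strip()
    if not text:
        return None
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()
    return text

# e.g. a photo snapped by the ring camera and sent over Bluetooth to the phone:
# read_aloud("sign.jpg")
```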
Zach Honig, 04.25.2012

Perifoveal Display tracks head positioning, highlights changing data on secondary LCDs (hands-on)
If there's a large display as part of your workstation, you know how difficult it can be to keep track of all of your windows simultaneously without missing a single update. Now imagine surrounding yourself with three, or four, or five jumbo LCDs, each littered with dozens of windows tracking real-time data -- be it RSS feeds, an inbox or chat. Financial analysts, security guards and transit dispatchers are but a few of the professionals tasked with monitoring such arrays, constantly scanning each monitor to keep abreast of updates. One project from the MIT Media Lab offers a solution, pairing Microsoft Kinect cameras with detection software, then highlighting changes with a new graphical user interface. Perifoveal Display presents data at normal brightness on the monitor that you're facing directly. Then, as you move your head to a different LCD, that panel becomes brighter, while changes on any of the displays you're not facing directly (but that still remain within your peripheral vision) -- a rising stock price, or motion on a security camera -- are highlighted with a white square, which slowly fades once you turn to face the new information. During our hands-on demo, everything worked as described, albeit without the instant response times you might expect from such a platform. As with most Media Lab projects, there's no release date in sight, but you can gawk at the prototype in our video just after the break.
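The behavior described above amounts to a small per-frame state machine: brighten whichever display the head is facing, queue a highlight for any change on the others, and fade highlights once they're looked at. A sketch of that logic, assuming you already get a faced-display index from the Kinect and a list of changed regions per display:

```python
from dataclasses import dataclass, field

FADE_SECONDS = 2.0   # assumed fade-out time for a highlight once it's been seen

@dataclass
class Highlight:
    region: tuple            # (x, y, w, h) of the changed area
    age: float = 0.0         # seconds since the user turned to face it

@dataclass
class Display:
    brightness: float = 0.5
    highlights: list = field(default_factory=list)

def update(displays, faced_index, changed_regions, dt):
    """One frame of the (assumed) Perifoveal logic: `faced_index` comes from
    Kinect head tracking, `changed_regions[i]` lists new changes on display i."""
    for i, d in enumerate(displays):
        if i == faced_index:
            d.brightness = 1.0
            for h in d.highlights:                 # fade once the user faces them
                h.age += dt
            d.highlights = [h for h in d.highlights if h.age < FADE_SECONDS]
        else:
            d.brightness = 0.5
            d.highlights += [Highlight(r) for r in changed_regions[i]]
    return displays

displays = [Display() for _ in range(3)]
update(displays, faced_index=0, changed_regions=[[], [(100, 40, 60, 20)], []], dt=0.033)
print(displays[1].highlights)   # white square queued on the unfaced display
```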
Zach Honig, 04.25.2012

DIY Cellphone has the footprint of an ice cream sandwich, definitely doesn't run ICS (hands-on)
Building your own wireless communications device isn't for the faint of heart, or the law-abiding -- the FCC tends to prefer placing its own stamp of approval on devices that use US airwaves, making a homegrown mobile phone an unlikely proposition. That didn't stop a team at the MIT Media Lab from creating such a DIY kit, however. Meet the Do-It-Yourself Cellphone. This wood-based mobile rig, while currently in the prototype phase (where it may indefinitely remain), would eventually ship with a circuit board, control pad, a fairly beefy antenna and a monochrome LCD. Sounds like it'd be right at home in some kid's garage workshop in the early '80s, not showcased at an MIT open house. The argument here is that people spend more time with their phone than with any other device, so naturally they'd want to build one to their liking. Nowadays, folks expect their pocketable handset not only to place and receive phone calls, but also to store phone numbers, offer a rechargeable battery and, in some cases, even send and receive email and surf the web -- little of which is available with such a kit. The prototype we saw was fully functional. It could place calls. It could receive calls. There was even Caller ID! The phone does indeed feel homemade, with its laser-cut plywood case and a design that lacks some of the most basic gadget essentials, like a rechargeable battery (or at the very least some provision for replacing the 9-volt inside without unscrewing the case). Audio quality sounded fine, and calls went out and came in without a hitch -- there's a SIM card slot inside, letting you bring the nondescript phone to the carrier of your choice. Does it work? Yes. Is it worth dropping $100-150 in parts to build a jumbo-sized phone with a microscopic feature set? No, there's definitely nothing smart about the DIY Cellphone. If you want to throw together your own handset, however, and not risk anyone questioning the legitimacy of your homemade claim, you might want to keep an eye out for this to come to market. The rest of you will find everything you need in the video just past the break. We're just happy to have walked away without any splinters.
Zach Honig, 04.25.2012

OLED Display Blocks pack six 128 x 128 panels, we go hands-on at MIT (video)
How do you develop an OLED display that gives a 360-degree perspective? Toss six 1.25-inch panels into a plastic cube, then turn it as you see fit. That's an overly simplistic explanation for the six-sided display on hand at the MIT Media Lab today, which is quite limited in its current form but could eventually serve an enormous variety of applications. Fluid Interfaces Group Research Assistant Pol Pla i Conesa presented several such scenarios for his Display Blocks, which consist of 128 x 128-pixel OLED panels. Take, for example, the 2004 film Crash, which tells interweaving stories that could be presented simultaneously with such a display -- simply rotate the cube until you land on a narrative you'd like to follow, and the soundtrack will adjust to match. The cubes could also go a long way when it comes to visualizing data, especially when used in groups -- instead of virtually constructing profiles of individuals who applied for a slot at MIT, for example, or segments of a business that need to be organized based on different parameters, you could have each assigned to a cube, which can be tossed into an accepted or rejected pile and repositioned as necessary. Imagine having a group of display cubes when it comes time to plan the seating chart for a reception -- each cube could represent one individual, with a color-coded background and a name or photo up top, and different descriptive elements on each side. The same could apply to products at monstrous companies like Samsung or Sony, where executives need to make planning decisions based on product performance and could benefit greatly from having all of the necessary information for a single gadget listed around each cube. On a larger scale, the cubes could be used to replace walls and floors in a building -- want to change the color of your wallpaper? Just push a new image to the display, and dedicate a portion of the wall for watching television or displaying artwork. You could accomplish this with networked single-sided panels as well, but that wouldn't be nearly as much fun. The Media Lab had a working prototype on display today, which demonstrated the size and basic functionality but didn't have an adjustable picture. Still, it's easy to imagine the potential of such a device, if, of course, it ever becomes a reality. As always, you'll find our hands-on demo just past the break.
Zach Honig, 04.24.2012

Droplet and StackAR bring physical interface to virtual experiences, communicate through light (hands-on)
Light-based communication seems to wind throughout the MIT Media Lab -- it is a universal language, after all, since many devices output light, be it with a dedicated LED or a standard LCD, and have the capacity to view and interpret it. One such device, coined Droplet, essentially redirects light from one source to another, while also serving as a physical interface for tablet-based tasks. Rob Hemsley, a research assistant at the Media Lab, was on hand to demonstrate two of his projects. Droplet is a compact, self-contained module with an integrated RGB LED, a photodiode and a CR1216 lithium coin battery -- which provides roughly one day of power in the gadget's current early-prototype state. Today's demo used a computer-connected HDTV and a capacitive-touch-enabled tablet. Using the TV to pull up a custom Google Calendar module, Hemsley held the Droplet up to a defined area on the display, which then output a series of colors, transmitting data to the module. That data was then pushed to a tablet by placing the Droplet on its screen, pulling up the same calendar appointment and providing a physical interface for adjusting the date and time. The appointment is retained both in the cloud and on the module itself, which outputs pulsing light as it counts down to the appointment time. StackAR, the second project, functions in much the same way, but instead of outputting a countdown indicator, it displays schematics for a LilyPad Arduino when placed on the tablet, identifying connectors based on a pre-selected program. The capacitive display can recognize orientation, letting you drop the controller in any position on the surface and then outputting a map to match. Like the Droplet, StackAR can also recognize light input, even letting you program the Arduino directly from the tablet by outputting light, effectively simplifying the interface creation process even further. You can also add software control to the board, which will work in conjunction with the hardware, bringing universal control interfaces to the otherwise space-limited Arduino. Both projects appear to have incredible potential, but they're clearly not ready for production just yet. For now, you can get a better feel for Droplet and StackAR in our hands-on video just past the break.
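The screen-to-module data transfer is, at heart, colors flashed in sequence and read back by a photodiode. Here's a toy codec that shows the general idea -- the four-color palette and two-bits-per-frame rate are ours for illustration, not Droplet's actual protocol.

```python
# Toy screen-to-device color codec illustrating the general idea; the palette,
# symbol rate and framing here are made up, not Droplet's actual protocol.
PALETTE = {0b00: (255, 0, 0), 0b01: (0, 255, 0), 0b10: (0, 0, 255), 0b11: (255, 255, 255)}
REVERSE = {v: k for k, v in PALETTE.items()}

def encode(data: bytes):
    """Turn bytes into a list of RGB frames, two bits per frame, MSB first."""
    frames = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            frames.append(PALETTE[(byte >> shift) & 0b11])
    return frames

def decode(frames):
    """Reassemble bytes from the colors seen by the photodiode."""
    symbols = [REVERSE[f] for f in frames]
    out = bytearray()
    for i in range(0, len(symbols), 4):
        b = 0
        for symbol in symbols[i:i + 4]:
            b = (b << 2) | symbol
        out.append(b)
    return bytes(out)

frames = encode(b"2012-04-24 15:00")          # e.g. a calendar slot pushed to the module
assert decode(frames) == b"2012-04-24 15:00"
print(len(frames), "color frames")
```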
Zach Honig, 04.24.2012

MIT gets musical with Arduino-powered DrumTop, uses household objects as a source of sound
Everyone's favorite microcontroller has been a boon for hobbyists and advanced amateurs, but it's also found a home among the brilliant projects at MIT's Media Lab, including a groovy instrument called DrumTop. This modern take on the drum pad delivers Arduino-powered interactivity in its simplest form -- hands-on time with ordinary household objects. Simply place a cup, a plastic ball or even a business card on the DrumTop to make your own original music. The prototype on display today includes eight pads, which are effectively repurposed speakers that tap the objects placed on top, with a force-sensing resistor (FSR) recognizing physical pressure and turning it into a synchronized beat. There's also a dial in the center that lets you speed up or slow down the taps, providing an adjustable tempo. DrumTop is more education tool than DJ beat machine, serving to teach youngsters about the physical properties of household objects, be it a coffee mug, a CD jewel case or a camera battery. But frankly, it's a lot of fun for folks of every age. There's no word on when you might be able to take one home, so for now you'll need to join us on our MIT visit for a closer look. We make music with all of these objects and more in the video after the break.
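The real thing runs on an Arduino, but the control flow is easy to sketch in Python: step through the eight pads, tap only where the force sensor says an object is sitting, and let the dial scale the tempo. read_pressure, read_tempo_dial and fire_tap below are hypothetical hardware hooks, and the numbers are assumptions.

```python
import time

# Python stand-in for the (Arduino-based) DrumTop loop described above; the
# threshold, tempo range and note grid are assumptions, and the three callbacks
# are hypothetical hardware hooks.
NUM_PADS = 8
PRESSURE_THRESHOLD = 0.2     # assumed normalized FSR reading meaning "object present"

def run(read_pressure, read_tempo_dial, fire_tap):
    step = 0
    while True:
        bpm = 60 + read_tempo_dial() * 120         # dial (0..1) maps to 60-180 BPM (assumed)
        pad = step % NUM_PADS
        if read_pressure(pad) > PRESSURE_THRESHOLD:
            fire_tap(pad)                          # pulse the repurposed speaker under the object
        step += 1
        time.sleep(60.0 / bpm / 2)                 # eighth-note grid (assumed)

# run(read_pressure=..., read_tempo_dial=..., fire_tap=...)  # wire up real hardware here
```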
Zach Honig, 04.24.2012

Newsflash uses high-frequency light to transmit data from iPad to smartphone, we go hands-on (video)
MIT's Media Lab is chock-full of cutting-edge tech projects that researchers create, then often license to manufacturers and developers. One such project is called Newsflash, and it uses high-frequency red and green light to transmit data to the built-in camera on a receiving device -- in this case Samsung's Epic 4G. The concept is certainly familiar, functioning in much the same way as a QR code, but generating flashing light that's invisible to the human eye instead of a cumbersome 2D square. In the Media Lab's implementation, an iPad displays a static news page with flashing colored bands at the top, representing just a few vertical pixels on the LCD. As the device presents the standard touch experience you're already familiar with, it also broadcasts data that can be read by any camera, yet flashes too quickly to be distracting or even noticeable to the naked eye. A Newsflash app then interprets those flashes and displays a webpage as instructed -- either a mobile version with the same content, or a translation of a foreign-language site. As with most Media Lab projects, Newsflash is simply a concept at this point, but it could one day make its way to your devices. Jump past the break to see it in action.
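Squint and it's a one-way optical serial link: red and green frames as binary symbols, a short preamble so the camera knows where a message starts, and eight flashes per byte. The sketch below is purely illustrative -- the symbol choice and framing aren't the actual Newsflash format.

```python
# Toy red/green flash codec in the spirit of Newsflash; the symbol choice,
# preamble and rate are illustrative, not the actual Newsflash format.
PREAMBLE = "RGRG"                      # lets the camera lock onto a message start

def encode(url: str):
    bits = "".join(f"{b:08b}" for b in url.encode())
    return PREAMBLE + "".join("G" if bit == "1" else "R" for bit in bits)

def classify_frame(rgb):
    """Camera side: decide whether a sampled band is mostly red or mostly green."""
    r, g, _ = rgb
    return "G" if g > r else "R"

def decode(symbols: str):
    payload = symbols[len(PREAMBLE):]
    data = bytearray()
    for i in range(0, len(payload) - len(payload) % 8, 8):
        data.append(int("".join("1" if s == "G" else "0" for s in payload[i:i + 8]), 2))
    return data.decode()

print(classify_frame((220, 40, 30)))                 # 'R'
symbols = encode("http://www.media.mit.edu")
assert decode(symbols) == "http://www.media.mit.edu"
print(len(symbols), "flashes")
```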
Zach Honig, 04.24.2012

MIT builds camera that can capture at the speed of light (video)
A team from the MIT Media Lab has created a camera with a "shutter speed" of one trillion exposures per second -- enabling it to record light itself traveling from one point to another. Using a heavily modified streak tube (which converts incoming photons into an electron stream and sweeps it so that arrival time maps to position), the team could snap a single image of a laser pulse as it passed through a soda bottle. In order to create the slow-motion film in the video we've got after the break, the team had to replicate the experiment hundreds of times. The stop-motion footage shows how light bounces through the bottle, collecting inside the opaque cap before dispersing. The revolutionary snapper may have a fast shutter, but the long time it takes to process the images has earned it the nickname of "the world's slowest fastest camera." [Image courtesy of MIT / M. Scott Brauer]
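Some quick arithmetic shows why that frame rate is enough to watch light crawl: between exposures, light covers only a fraction of a millimeter.

```python
# How far light travels between exposures at one trillion frames per second.
c = 299_792_458        # speed of light, m/s
fps = 1e12
print(f"{c / fps * 1000:.2f} mm per frame")   # ~0.30 mm between exposures
```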
Daniel Cooper, 12.13.2011

MIT's folding CityCar takes a spin on video, still no room for humans
The MIT Media Lab has been working on a folding, stackable electric vehicle for quite a few years now, but it seems those have at least been fairly productive years, as the so-called CityCar has finally progressed to something resembling a finished prototype. The only problem for those eager to hop into one is that it's a half-sized prototype, which makes accommodating a driver just a tad difficult. It does do a fairly good job of folding itself up, though, and MIT expects a full-size version to go into production in 2013. Interestingly, MIT doesn't necessarily see people owning the vehicles themselves; rather, it would like to see them made available throughout cities -- letting you rent one for a short trip across town, for instance, without having to worry about returning it. Head on past the break to see it on video, courtesy of The Next Web.
Donald Melanson, 08.26.2011

Minecraft designs tossed into 3D printer, we're pretty sure a wizard did it
We've seen the cuboid tools afforded to Minecraft builders used for some pretty impressive in-game creations -- but never before have we seen those creations manifested out-game. Two students at the MIT Media Lab have whipped up a piece of software called Minecraft.Print(), which is capable of feeding a megastructure built in the game's world to a 3D printer, which turns it into a real-life thing. Check out the video above to see this impressive tool in action. Keep an eye out for the few frames where a wizard steps in and performs the series of magic spells required to make something like this work -- we missed them our first time through, but we know they're there.
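The unglamorous half of a tool like this is turning the game's block grid into geometry a printing toolchain will accept. Here's a deliberately simplified sketch that writes one cube per filled block to a Wavefront OBJ file -- the real Minecraft.Print() targets printer-ready output and would merge shared faces, so treat this as an illustration of the idea only.

```python
# Minimal voxel-to-mesh export in the spirit of Minecraft.Print() (a
# simplification, not the students' code): write one cube per filled block to a
# Wavefront OBJ file; a real pipeline would merge shared faces and emit STL.
CUBE_VERTS = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
              (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
CUBE_FACES = [(1, 2, 3, 4), (5, 8, 7, 6), (1, 5, 6, 2),
              (2, 6, 7, 3), (3, 7, 8, 4), (4, 8, 5, 1)]

def export_obj(blocks, path="build.obj"):
    """blocks: iterable of (x, y, z) coordinates of filled voxels."""
    with open(path, "w") as f:
        for i, (x, y, z) in enumerate(blocks):
            for vx, vy, vz in CUBE_VERTS:
                f.write(f"v {x + vx} {y + vy} {z + vz}\n")
            base = i * 8                       # OBJ indices are 1-based and global
            for face in CUBE_FACES:
                f.write("f " + " ".join(str(base + idx) for idx in face) + "\n")

# A 3x3 tower of blocks, five high, with a hollow middle column:
export_obj([(x, y, z) for x in range(3) for y in range(3) for z in range(5)
            if not (x == 1 and y == 1)])
```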
Griffin McElroy, 07.08.2011

15 Minutes of Fame: Joi Ito on player relationships and connecting
From Hollywood celebrities to the guy next door, millions of people have made World of Warcraft a part of their lives. How do you play WoW? We're giving each approach its own 15 Minutes of Fame.

Last week, we set the stage with internet superman Joichi Ito in a conversation that meandered through the old days of gaming, from getting his feet wet in MUDs to why he misses 40-man raiding in WoW. This week, we're back to discuss why some people with MBAs make crappy raid leaders, how WoW builds stronger bonds between people who work together, and his plans to bring WoW along for the ride at the MIT Media Lab. Catch up on last week's part 1 of our interview with Joi first.

15 Minutes of Fame: So what's your own guild focused on now?

Joi Ito: The Horde side had kind of wound down a little bit. It still exists, but it's mostly the Alliance side now. When we were both going strong, it was really fun because we did a lot of joint stuff. [laughs] What we would do is do sort of sister guild PVP -- but it would always get messy because you'd find people from other guilds noticing and then jumping in. Right now, we're definitely not first in the realm, but we just hit level 25. I'm pretty delinquent; I need to level myself up, so as not to embarrass everyone too badly. [laughs]

Every expansion, we go through several iterations of discussing the governance and stuff like that, and a lot of the old-time guild leadership aren't active. I got grandfathered in because it's hard not to have someone in the background, I guess, being the custodian of things to do when no one else can decide.

There was a really interesting paper written by Dmitri Williams, who's an academic, and they did a study on the relationship between guild rules and stability of the guild. It said that in guilds that called themselves "casual" but didn't have any rules, the players tended to have more anxiety than those guilds where there were rules, and that "casual" didn't mean no rules, that rules help people feel comfortable.

Our guild rules are pretty anal, a pretty extensive set of rules, and there's a lot of participation by the guild members in working on these rules. I don't know how formal rules are in other guilds -- I haven't been in too many other guilds -- but discussing the rules and the governance of the guild seems to be a thing that a lot of our guild likes to spend time on. That's primarily where my focus is these days, making sure that the leadership and the guild rules aren't too out of sync with what's going on in the game.
Lisa Poisso, 06.09.2011

15 Minutes of Fame: MIT Media Lab director Joichi Ito on WoW
From Hollywood celebrities to the guy next door, millions of people have made World of Warcraft a part of their lives. How do you play WoW? We're giving each approach its own 15 Minutes of Fame.

Let's get Joichi Ito's professional credentials out of the way first. The 44-year-old Japanese venture capitalist is the incoming director of the avant-garde MIT Media Laboratory. A self-professed "informal learner" (he dropped out of college twice and never finished a degree), he now shines as one of the stars of the digital age, serving on the board of directors for Creative Commons, Technorati, ICANN, and Mozilla, and catching the wave as an early-bird investor in Last.fm, Flickr, and Twitter. Currently a resident of Dubai (he moved there so he could get a better feel for the people and the region), he circumnavigates the globe a full two times every month in the course of his international pursuits. According to his Twitter stream, he's been scuba diving in Japan this week taking underwater radioactivity samples; after catching the bug to learn how to dive, he promptly became a master diver and now is a PADI open water instructor. He's the godson of psychedelic explorer Timothy Leary ...

... and a guild leader in World of Warcraft.

"My feeling is that what we are doing in WoW represents in many ways the future of real time collaborative teams and leadership in an increasingly ad hoc, always-on, diversity intense and real-time environment," he wrote in his blog back in 2006. In fact, one of his presentations on WoW made it into an early incarnation of our Moviewatch feature in 2007. So yeah, we're going to talk about WoW ...

Need anything else to cement his gaming cred? Two more tidbits: Ito's GMed a WoW raiding guild since the original days of Molten Core, and he owns an actual handwritten map drawn by Richard Bartle, creator of the first MUD -- it's like the Magna Carta of gaming.
Lisa Poisso, 06.02.2011