A camera that can see unlike any imager before it
by Staff Writers
Washington DC (SPX) Sep 21, 2016
Picture a sensor pixel about the size of a red blood cell. Now envision a million of these pixels, a megapixel's worth, in an array that covers a thumbnail. Take one more mental trip: dive down onto the surface of the semiconductor hosting all of these pixels and marvel at each pixel's associated tech, a mesh of more than 1,000 integrated transistors that gives each and every pixel a tiny reprogrammable brain of its own. That is the vision for DARPA's new Reconfigurable Imaging (ReImagine) program.
"What we are aiming for," said Jay Lewis, program manager for ReImagine, "is a single, multi-talented camera sensor that can detect visual scenes as familiar still and video imagers do, but that also can adapt and change their personality and effectively morph into the type of imager that provides the most useful information for a given situation." This could mean selecting between different thermal (infrared) emissions, different resolutions or frame rates, or even collecting 3-D LIDAR data for mapping and other jobs that increase situational awareness.
The camera ultimately would rely on machine learning to autonomously notice what is happening in its field of view and reconfigure the imaging sensor based on the context of the situation. The future sensor Lewis has in mind would even be able to perform many of these functions simultaneously, because different patches of the sensor's carpet of pixels could be reconfigured in software to work in different imaging modes. That same reconfigurability should let the sensor toggle between modes from one lightning-quick frame to the next. No single camera can do that today.
A primary driver here, according to Lewis, who works in DARPA's Microsystems Technology Office (MTO), is the shrinking size and cost of militarily important platforms that are finding roles in locations spanning from orbit to the seas. With multi-functional sensors like the ones that would come out of a successful ReImagine program, these smaller and cheaper platforms would provide a degree of situational awareness that today can only come from suites of single-purpose sensors fitted onto larger airborne, ground, space-based, and naval vehicles and platforms. And with that more extensive situational awareness, Lewis said, would come the most important payoff: more informed decision-making.
Today, DARPA posted a Special Notice (DARPA-SN-16-68) on FedBizOpps.gov with instructions for those who might want to attend a Proposers Day on September 30 in Arlington, VA, as a step toward possibly participating in the ReImagine program. In the coming days, DARPA expects to also post a Broad Agency Announcement that specifies the new program's technical objectives, milestones, schedule, and deliverables, along with instructions for researchers seeking to submit proposals.
One key feature of the ReImagine program is that teams will be asked to develop software-configurable applications based on a common digital circuit and software platform. During the four-year program, MIT Lincoln Laboratory, a federally funded research and development center (FFRDC) whose roots date back to the WWII mission to develop radar technology, will be tasked with providing the common reconfigurable digital layer of what will be the system's three-layer sensor hardware.
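To make that patchwork-of-modes idea concrete, here is a minimal, purely notional sketch in Python. ReImagine defines no such interface; the SensorTile class, the mode names, and the configure step below are assumptions invented only to illustrate how a software-reconfigurable pixel array might be addressed patch by patch and retasked from one frame to the next.

# Notional sketch only: DARPA's ReImagine program does not define this API.
# It illustrates assigning different imaging modes to different patches of
# a single pixel array and re-programming them between frames.

from dataclasses import dataclass

# Hypothetical imaging modes a reconfigurable pixel patch might support.
MODES = {"visible", "thermal_ir", "lidar", "high_frame_rate"}

@dataclass
class SensorTile:
    """One patch of the pixel array, with its own 'reprogrammable brain'."""
    row: int
    col: int
    mode: str = "visible"
    frame_rate_hz: int = 30

    def configure(self, mode: str, frame_rate_hz: int = 30) -> None:
        if mode not in MODES:
            raise ValueError(f"unsupported mode: {mode}")
        self.mode = mode
        self.frame_rate_hz = frame_rate_hz

# A 4x4 grid of tiles standing in for the full megapixel array.
tiles = [SensorTile(r, c) for r in range(4) for c in range(4)]

# Frame N: part of the array stays in visible mode, one corner collects
# LIDAR returns for mapping, and another watches thermal emissions.
for tile in tiles:
    if tile.row < 2 and tile.col < 2:
        tile.configure("lidar")
    elif tile.row >= 2 and tile.col >= 2:
        tile.configure("thermal_ir")

# Frame N+1: the whole array toggles to high-frame-rate video.
for tile in tiles:
    tile.configure("high_frame_rate", frame_rate_hz=1000)

In the actual hardware, that retasking would be carried out by the common reconfigurable digital layer described above rather than by host software.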
The challenge for successful proposers ("performers" in DARPA-speak) will be to design and fabricate various megapixel detector layers and "analog interface" layers, as well as the associated software and algorithms for converting a diversity of relevant signals (LIDAR signals for mapping, for example) into digital data. That digital data, in turn, should be suitable for processing and for machine learning procedures through which the sensors could become autonomously aware of specific objects, information, happenings, and other features within their field of view.
One reason for using a common digital layer, according to Lewis, is the hope that it will enable a community of developers writing "apps" in software, accelerating the innovation process and unlocking new applications for software-reconfigurable imagers.
In follow-on phases of the program, performers will need to demonstrate portability of the developing technology in outdoor testing and, in Lewis's words, "develop learning algorithms that guide the sensor, through real-time adaptation of sensor control parameters, to collecting the data with the highest content of useful information." That adaptation might translate, in response to visual cues, into toggling into a thermal detection mode to characterize a swarm of UAVs, or into hyper-slow-motion (high-frame-rate) video to help tease out how a mechanical device is working.
"Even as fast as machine learning and artificial intelligence are moving today, the software still generally does not have control over the sensors that give these tools access to the physical world," Lewis said. "With ReImagine, we would be giving machine-learning and image processing algorithms the ability to change or decide what type of sensor data to collect." Importantly, he added, as with eyes and brains, the information would flow both ways: the sensors would inform the algorithms and the algorithms would affect the sensors.
Although defense applications are foremost on his mind, Lewis also envisions commercial spinoffs. Smartphones of the future could have camera sensors that do far more than merely take pictures and video, their functions limited only by the imaginations of a new generation of app developers, he suggested.
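The two-way loop Lewis describes, in which algorithms both consume sensor data and decide what data to collect next, can be caricatured in a few lines of Python. This is only a hedged sketch: classify_scene, choose_next_config, and capture_frame are hypothetical stand-ins for the learning algorithms and sensor interfaces performers would actually build, and the scene labels are invented for illustration.

# Illustrative sketch of the sensing-and-control loop described above:
# the algorithm not only processes each frame but also chooses the sensor
# configuration for the next one. All names here are hypothetical.

import random

def classify_scene(frame: dict) -> str:
    """Stand-in for a learned model; returns a coarse label for the scene."""
    return frame["label"]  # a real system would infer this from pixel data

def choose_next_config(scene_label: str) -> dict:
    """Map what was just seen to the most informative next configuration."""
    if scene_label == "uav_swarm":
        return {"mode": "thermal_ir", "frame_rate_hz": 60}
    if scene_label == "fast_mechanism":
        return {"mode": "high_frame_rate", "frame_rate_hz": 2000}
    return {"mode": "visible", "frame_rate_hz": 30}

def capture_frame(config: dict) -> dict:
    """Stand-in for the sensor readout; here it just fakes a labeled frame."""
    return {"config": config,
            "label": random.choice(["uav_swarm", "fast_mechanism", "static"])}

# Information flows both ways: frames inform the algorithm,
# and the algorithm reconfigures the sensor for the next frame.
config = {"mode": "visible", "frame_rate_hz": 30}
for _ in range(5):
    frame = capture_frame(config)
    config = choose_next_config(classify_scene(frame))
    print(frame["label"], "->", config)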