August 24, 2009
By Kris Osborn
In 2008, U.S. military forces collected 400,000 hours of airborne surveillance video, up from several thousand hours 10 years ago. So the Pentagon is turning to computers to help save, sort and search it all.
“The proliferation of unmanned systems across the battlefield is not going to lessen in the future. We saw it happen in the first Gulf War. Once commanders have it, there is an insatiable appetite for FMV,” or full-motion video, said Maj. Gen. John Custer III, who commands the U.S. Army Intelligence Center, Fort Huachuca, Ariz.
“You not only need the tools to exploit that, you need storage because commanders don’t only want to see a building now but what it looked like yesterday, six weeks ago and six months ago,” Custer said. “When you have 18 systems up for 18 hours a day, you get into terabytes in a week. We are going to be in large data-storage warehousing for the rest of time.”
As a result, the Pentagon, the U.S. government, academia and industry are collaborating to create industrywide standards for the many emerging technologies aimed at addressing this problem.
“We are collecting millions of minutes of FMV on the battlefield. Our ability to disseminate that information to commanders and decision-makers is directly proportional to the implementation of industry standards,” said U.S. Navy Cmdr. Joseph Smith, military deputy in the sensor assimilation division at the National Geospatial-Intelligence Agency.
“Motion imagery comes not only from standard assets such as the Predator and Reaper but from handheld video from soldiers on the ground, traffic cameras, etc.,” Smith said.
One system that tries to make sense of all that video is the Full-Motion Video Asset Management Engine (FAME), a Harris-Lockheed Martin product that attaches related data, such as imagery, audio and sensor feeds, to the video. Demand for FAME and systems like it is skyrocketing across the military services, said Susan Meisner, spokeswoman for the National Geospatial-Intelligence Agency.
“One of the challenges we have seen is they [the Pentagon] have really stovepiped things. There is one thing that handles video, one that handles imagery, sigint, etc., and they are siloed. We will bring in chat information and align it with the video so everything is synched in space and time. Imagine watching a football game with just video?” said Lucius Stone, director of government solutions for Harris’ broadcast communications division.
“With people watching stuff live, there is so much irrelevant stuff coming in. We want to put automation into the system so you are bringing in humans when there is something for them to look for,” Stone said.
The first FAME systems were developed five years ago at Fort Polk, La., and Fort Irwin, Calif. The system provides access to archival video feeds along with geospatial maps and additional intelligence information, such as nearby audio.
Multiple Formats, Alerts
FAME can synchronize dozens of video formats and resolutions, including feeds from UAVs, night-vision cameras and other kinds of sensors. FAME can send e-mail alerts when something of combat relevance occurs in a sensor’s field of view, such as insurgent movement or the planting of IEDs.
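The alerting behavior described above can be sketched as a simple rule over a detection feed. The event labels, sensor names and grid references below are illustrative assumptions, not FAME's actual interface:

```python
# Minimal sketch of rule-based alerting like the article describes for FAME:
# flag only detections whose event type is combat-relevant, so analysts are
# not pulled in for routine activity. Labels here are hypothetical.
COMBAT_RELEVANT = {"insurgent_movement", "ied_emplacement"}

def alerts_for(detections):
    """Yield e-mail-style alert strings for combat-relevant detections."""
    for d in detections:
        if d["event"] in COMBAT_RELEVANT:
            yield f"ALERT {d['sensor']}: {d['event']} at {d['grid']}"

feed = [
    {"sensor": "uav-3", "event": "ied_emplacement", "grid": "38S MB 12345 67890"},
    {"sensor": "uav-3", "event": "civilian_traffic", "grid": "38S MB 12000 67000"},
]
print(list(alerts_for(feed)))  # only the IED emplacement triggers an alert
```

The point of the filter is the one Stone makes above: automation screens the feed so humans are brought in only when there is something to look for.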
Smith said these kinds of technologies can access relevant images from previous intelligence, surveillance and reconnaissance (ISR) efforts.
“You put in a certain number of search terms and the machine returns to you relevant video clips. This can only be done if we do a prior work-up and tag video to associate new data with data we already have,” Smith said.
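Smith's point is that search only works against video that has already been tagged in a prior work-up. A minimal sketch of that tag-then-search pattern, with hypothetical clip IDs and tag names:

```python
from dataclasses import dataclass, field

@dataclass
class VideoClip:
    """Hypothetical archived FMV clip with analyst-applied metadata tags."""
    clip_id: str
    tags: set = field(default_factory=set)

def search_clips(archive, search_terms):
    """Return clips whose tags include every supplied search term.

    This only returns results if the prior work-up tagged the video,
    as Smith describes; untagged clips are invisible to the search."""
    terms = {t.lower() for t in search_terms}
    return [c for c in archive if terms <= {t.lower() for t in c.tags}]

archive = [
    VideoClip("uav-0417", {"vehicle", "checkpoint", "route-tampa"}),
    VideoClip("uav-0418", {"crowd", "market"}),
]
print([c.clip_id for c in search_clips(archive, ["vehicle", "Route-Tampa"])])
# → ['uav-0417']
```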
Modus Operandi’s Wave Exploitation Framework (Wave-EP) can look for words in context to reduce false-positive cues, said George Eanes, the firm’s vice president of business development. The program can access multiple databases and data streams, Eanes said.
“It looks for cross-reference data from multiple sources, combining data with imagery. It converts data to an XML format so that it is normalized and accessible for the computer application,” Eanes said. “We say we are looking for needles in a needle stack. We have this massive amount of data. How do you make sense of it all and figure out what patterns are important?”
For instance, Wave-EP can combine cues from surveillance aircraft with audio from intercepted radio or cell phone conversations. It can access databases that contain information such as stolen cars in a particular area as a way to look for potential vehicle-borne IEDs.
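The normalization step Eanes describes, converting source-specific records into a common XML form so one application can cross-reference them, can be sketched as follows. The element and attribute names are illustrative assumptions, not Wave-EP's actual schema:

```python
import xml.etree.ElementTree as ET

def normalize(source, record):
    """Convert a source-specific record into a common XML element.

    Once UAV cues, sigint intercepts and database rows share one shape,
    a single application can cross-reference them, as Eanes describes."""
    el = ET.Element("observation", source=source)
    for key, value in record.items():
        ET.SubElement(el, key).text = str(value)
    return el

# Two very different feeds end up in the same normalized shape.
uav = normalize("uav", {"lat": 33.31, "lon": 44.36, "cue": "parked vehicle"})
sigint = normalize("sigint", {"lat": 33.31, "lon": 44.37, "cue": "cell intercept"})
print(ET.tostring(uav, encoding="unicode"))
```

With both observations carrying the same fields, combining an aircraft cue with a nearby intercept becomes a simple comparison rather than a format-translation problem.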
Modus Operandi has Wave-EP contracts with the Air Force Research Laboratory, the Army, the Defense Advanced Research Projects Agency and the Missile Defense Agency.
One of the key challenges with these kinds of technologies is ensuring there is enough bandwidth to move increasingly large amounts of information in real time, so the U.S. military is also working to boost bandwidth.
“Something I said in 2000 was, ‘In each future war the U.S. is engaged in, we will use one-tenth the force of the previous conflict and 100 times the bandwidth.’ We had 10 divisions in the first Gulf War, and one or two divisions in the Iraq war,” said Custer.
Dense Wavelength Division Multiplexing (DWDM) shoots different wavelengths of laser light through fiber-optic cables, Custer said. DWDM is currently fielded at Fort Huachuca.
“DWDM takes the fiber-optic cable that gave us 1 gigabit and shoots different colors of light to get 40 gigs to the desktop,” said Custer.
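Custer's figures work out as straightforward multiplication: each wavelength ("color") carries roughly the old single-channel rate, and DWDM stacks many of them on one fiber. The 40-channel count below is an assumption chosen to match his numbers, not a quoted system specification:

```python
# Back-of-the-envelope DWDM capacity using Custer's figures:
# one wavelength per fiber at ~1 Gbit/s, versus many wavelengths
# multiplexed onto the same fiber.
per_wavelength_gbps = 1   # legacy single-channel rate, per Custer
wavelengths = 40          # assumed channel count matching "40 gigs"

total_gbps = per_wavelength_gbps * wavelengths
print(total_gbps)  # → 40
```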
The hybrid, asymmetrical wars in Iraq and Afghanistan greatly increase the need for these kinds of technologies, Pentagon and industry officials said.
“We are not fighting against troops organized with tanks and airplanes. We are fighting against terrorists who are always trying to find a new way to inflict damage on us. To have technology that is trying to make sense of information and predict what the terrorists are doing is really important. Our troops can’t beat anything if they don’t know who to go after,” said Eanes.
Phi Beta Iota Editorial Comment:
1) We told Keith Hall and Marty Hurwitz at the General Defense Intelligence Program (GDIP) Conference at USSOCOM in Tampa in 1988 (21 years ago today) that adding geospatial attributes to every datum in every collection discipline was going to be essential if they ever wanted to get to machine-speed all-source fusion. Of course, no one wanted to listen to the Marine Corps; what could the Marines possibly know….
2) The above article is very revealing, not only for how screwed up the DoD Global Information Grid (GIG) is (practically non-existent, something GAO confirmed in 2004 and 2006), but for how screwed up the industry is. For anyone to propose pre-work with meta-tagging tells us all we need to know about how stone cold dead their understanding is of the state of the art in this area. Automatically embedded time, date and place stamps, combined with geometric data (altitude, orientation, angle of look, resolution), are all that is needed to reach the Marine Corps’ stated objective from 1988, when the above comment was made: “Show me everything we have in chronological order about this specific point on the Earth.”
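The 1988 objective quoted above reduces, once every datum carries a time and place stamp, to a filter-and-sort query. A minimal sketch under that assumption, with hypothetical record IDs and a rough haversine proximity test:

```python
import math
from datetime import datetime

def near(lat1, lon1, lat2, lon2, km=1.0):
    """Rough great-circle distance test (haversine); illustrative helper."""
    r = 6371.0  # Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) <= km

def history_of_point(datums, lat, lon):
    """Everything we hold near a point, oldest first: the 1988 ask."""
    hits = [d for d in datums if near(d["lat"], d["lon"], lat, lon)]
    return sorted(hits, key=lambda d: d["time"])

datums = [
    {"id": "img-2", "lat": 31.55, "lon": 65.71, "time": datetime(2009, 6, 1)},
    {"id": "img-1", "lat": 31.55, "lon": 65.71, "time": datetime(2009, 1, 5)},
    {"id": "sig-9", "lat": 34.00, "lon": 69.00, "time": datetime(2009, 3, 2)},
]
print([d["id"] for d in history_of_point(datums, 31.55, 65.71)])
# → ['img-1', 'img-2']
```

The hard part is not the query; it is ensuring the stamps are embedded automatically at collection time, across every discipline, which is exactly the point of the comment above.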
3) A year later, Col Bruce Brun, the third Director of the Marine Corps Intelligence Center (MCIC) after Col Walt Breede III and Col Forest Lucy, told the Defense Mapping Agency (DMA) annual conference, “I don’t care how much data you give me; if I cannot plot it on a map, it is useless to me.” We still do not have 1:50,000 combat charts with contour lines for most of the countries in the Expeditionary Environment. DMA’s successor, the National Geospatial-Intelligence Agency (NGA), keeps making promises it cannot keep, avoids mentioning that the vaunted shuttle mission came back with data that looks like Swiss cheese, and still does not “compute” the need for 175 hard copies of any given 1:50 sheet per battalion, and many more if you are also supplying coalition forces and non-governmental forces.
IN A NUTSHELL: there is a correlation of forces that now suggests a new foundation is needed for Defense Information, one that is exclusively unclassified and focused on M4IS2 at machine speed, adding Real-Time Processing and Near-Real-Time Processing (RTP, NRTP) as well as very large aggregate pattern analysis that is repeated over and over. The classified systems cannot do that and in all likelihood never will be able to. Instead, we should start with the 80% that is not classified, create the end-to-end system (handhelds, desktops, cloud processing, analytics, visualization on the fly) at the unclassified level, and then gradually begin adding the 20% from the classified world, one discipline at a time.