The Terrifying Technology Inside Drone Cameras

Published 2023-03-11
▶ Visit brilliant.org/NewMind to get a 30-day free trial + the first 200 people will get 20% off their annual subscription

UAVs operate in the world of tactical intelligence, surveillance, and reconnaissance, or ISR, generally providing immediate support for military operations, often with constantly evolving mission objectives. Traditionally, airborne ISR imaging systems were designed around one of two objectives: either looking at a large area without the ability to provide detailed resolution of a particular object, or providing a high-resolution view of specific targets with a greatly diminished capability to see the larger context. Up until the 1990s, wet film systems were used on both the U-2 and SR-71. Employing a roll of film 12.7 cm or 5 inches wide and almost 3.2 km or 2 miles long, this system would capture one frame every 6.8 seconds, with a limit of around 1,600 frame captures per roll.
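A quick back-of-the-envelope check on those film figures (my arithmetic, not stated in the video): at one frame every 6.8 seconds, a 1,600-frame roll covers roughly three hours of continuous capture.

```python
# Illustrative arithmetic based on the figures quoted above.
SECONDS_PER_FRAME = 6.8
FRAMES_PER_ROLL = 1_600

coverage_hours = SECONDS_PER_FRAME * FRAMES_PER_ROLL / 3600
print(f"Continuous capture per roll: about {coverage_hours:.1f} hours")
# -> Continuous capture per roll: about 3.0 hours
```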

BIRTH OF DIGITAL
The first digital imaging system to be used for reconnaissance was the optical component of the Advanced Synthetic Aperture Radar System, or ASARS. Installed on the U-2 reconnaissance aircraft in the late 1970s, ASARS used a large, phased-array antenna to create high-resolution images of the ground below using radar. Complementing the radar was an imaging system that used a charge-coupled device, or CCD, camera to capture visible light images of the terrain being surveyed. This CCD camera operated in synchronization with the radar system and had a resolution of around 1 meter, or 3.3 feet, per pixel.

A CCD sensor consists of a grid of tiny, light-sensitive cells whose collected charge is shifted out, cell by cell, through a shared output amplifier. Combined with the limitations of the computing hardware of the time, this design kept early sensors to less than a megapixel, with resolutions as low as 100,000 pixels found in some systems.
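A toy sketch of that readout pattern (illustrative only, not a hardware-accurate model): every cell's charge is funneled through one shared output amplifier, which is part of why the whole array must be read sequentially.

```python
import numpy as np

def ccd_readout(frame: np.ndarray) -> list[float]:
    """Toy model of CCD readout: each row is shifted into a serial register,
    then clocked out pixel by pixel through one shared output amplifier,
    so the whole array is read strictly in sequence."""
    samples = []
    for row in frame:                      # parallel shift: one row into the serial register
        serial_register = row.copy()
        for charge in serial_register:     # serial shift: one pixel at a time to the output
            samples.append(float(charge))
    return samples

# A tiny 4x4 "sensor" of photo-generated charge
frame = np.arange(16, dtype=float).reshape(4, 4)
print(len(ccd_readout(frame)))  # -> 16 samples, in row-by-row order
```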

CMOS
By the early 1990s, a new class of imaging sensor called the active-pixel sensor, primarily based on the CMOS fabrication process, began to permeate the commercial market. Active-pixel sensors employ several transistors at each photosite to both amplify and move the charge over a conventional signal path, and this pixel independence makes the sensor far more flexible for different applications. CMOS sensors also use more conventional and less costly manufacturing techniques already established on semiconductor fabrication production lines.
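A minimal sketch of why that pixel independence matters (illustrative, not a model of any real sensor): because each photosite has its own amplifier and is addressed like memory, a CMOS sensor can read out just a window of interest instead of shifting the entire array through one output node.

```python
import numpy as np

def read_window(frame: np.ndarray, row: int, col: int, height: int, width: int) -> np.ndarray:
    """Toy model of active-pixel (CMOS) readout: any subregion can be
    addressed and read directly, unlike the sequential CCD shift above."""
    return frame[row:row + height, col:col + width].copy()

sensor = np.random.rand(480, 640)               # full frame
roi = read_window(sensor, 100, 200, 64, 64)     # read only a 64x64 region of interest
print(roi.shape)  # -> (64, 64)
```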

FIRST WAMI
Wide Area Motion Imagery takes a completely different approach from traditional ISR technologies by pairing panoramic optics with an extremely dense imaging sensor. The first iteration of Constant Hawk’s optical sensor was created by combining six 11-megapixel CMOS image sensors that captured only visible and some infrared light intensity, with no color information.

At an altitude of 20,000 feet, Constant Hawk was designed to survey a circular area on the ground with a radius of approximately 96 kilometers or 60 miles, covering a total area of over 28,500 square kilometers, or about 11,000 square miles. Whenever an event on the ground produced a change in the imagery of a region, the system would store a timeline of the imagery captured from that region. This made it possible to access any event that occurred within the system’s range at any point during the mission’s flight duration. The real-time investigation of a chain of events over a large area was now possible in an ISR mission.
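Two quick checks on the numbers above (my arithmetic, derived from the figures in the video): the six 11-megapixel imagers give a mosaic of roughly 66 megapixels, and a 96-kilometer-radius circle works out to just over 28,500 square kilometers.

```python
import math

# Back-of-the-envelope checks using the figures quoted above.
total_megapixels = 6 * 11                 # six 11 MP CMOS imagers -> ~66 MP mosaic
radius_km = 96
area_km2 = math.pi * radius_km ** 2       # surveyed circle on the ground

print(f"Sensor mosaic: ~{total_megapixels} megapixels")
print(f"Surveyed area: {area_km2:,.0f} km^2")  # -> ~28,953 km^2, i.e. "over 28,500"
```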

In 2006, Constant Hawk became the first Wide Area Motion Imagery platform to be deployed as part of the Army’s Quick Reaction Capability to help combat enemy ambushes and improvised explosive devices in Iraq. In 2009, BAE Systems would add night vision capabilities and increase the sensor density to 96 megapixels. In 2013, full-color imagery processing capability would be added.

The system was so successful that the Marine Corps would adopt elements of the program to create its own system called Angel Fire and a derivative system called Kestrel.

ARGUS-IS
As Constant Hawk was seeing its first deployment, several other similar systems were being developed that targeted more niche ISR roles; however, one system in particular would create a new class of aerial surveillance previously thought to be impossible. Called the ARGUS-IS, this DARPA project, contracted to BAE Systems, aimed to image an area at such high detail and frame rate that it could collect "pattern-of-life" data that specifically tracks individuals within the sensor field. The system generates almost 21 TB of color imagery every second. Because ARGUS-IS is specifically designed for tracking, a processing system derived from the Constant Hawk project, called Persistics, was developed.
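To make the idea behind that kind of processing concrete, here is a heavily simplified frame-differencing sketch (not the actual Persistics pipeline, which also performs image registration and stabilization): once consecutive frames are aligned, only the pixels that change need to be flagged and archived, a tiny fraction of the raw data rate.

```python
import numpy as np

def changed_pixels(prev: np.ndarray, curr: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Minimal change-detection sketch: return (row, col) coordinates of
    pixels whose brightness changed beyond a threshold between two
    already-registered frames. Moving objects show up as small clusters."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    return np.argwhere(diff > threshold)

prev = np.zeros((8, 8))
curr = prev.copy()
curr[3, 4] = 1.0                          # a "vehicle" appears at one pixel
print(changed_pixels(prev, curr))         # -> [[3 4]]
```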

Because this tracking can even be done backwards in time, the system becomes a powerful tool for forensic investigation and for intelligence analysis of patterned human behavior.


--
SUPPORT NEW MIND ON PATREON
www.patreon.com/newmind

All Comments (21)
  • @PapaWheelie1
    This is how they make every traffic light red whenever I go anywhere 😂
  • @kizzjd9578
    As a commercial drone mapping pilot, the standard mapping sensor is 20mp from surface up to about 300ft agl. The top range sensor is 45mp which depending on which lens used, can double the gsd as the 20mp from the same height. With a 20mp sensor @ 300ft, the gsd is about 20mm/pixel. If this is the case, then I believe the military currently can achieve 10 times this resolution at a higher altitude. Can’t wait to see what the future holds.
  • @AliHSyed
    This technology is so impressive, so scary
  • @davidgray6999
    What you can see from orbit is of course limited by physics, but you'd be surprised at what you can do with clever techniques. For example, you can slew your field of view so it stays on one location for a few seconds, then register the different views into one, significantly improving the resolution.
  • @alexwang007
    I can't wait to see you cover modern thermal imaging (mostly commercially available uncooled micro bolometer LWIR cameras)
  • @chrisw1462
    Would love to see something on the data handling system for something like this - How does a drone handle storing 21 TB per second???
  • @maxmyzer9172
    Sats are still used for surveillance, just not for things like tracking cars. They have very high resolution but also have to have frames of reference due to the atmospheric distortion
  • Excellent video, incredible to know the capabilities of these systems in the early 2010s was already this good. It is terrifying to think about the capabilities that are in use today, and those that are worked on. Person of Interest was a documentary it seems...
  • Love that last image of a floating balloon 🎈 People should open their eyes to technology
  • The density and size of the image sensors is nowhere near as important as the ability to optically zoom into a subject under surveillance. Sensors may offer similar resolution to photographic film but it is the development of smaller, lightweight optical systems that makes the difference.
  • You cover such a wide range of topics in detail, yet they're easy to listen to. As always, good one....
  • The real magic is what happens when you integrate the low-altitude synthetic aperture radar sensors with the visual sensors! Especially when you got a library of ML targets and some impressive computing power.
  • AI or manual pilots decide what information to be stored. Not all 21TB per second is stored. Even if it is stored, immediately it is uploaded to military satellite link capable of uploading terabytes per second. They don't have to have server running on drone.
  • @dawnlightening
    @New Mind - Thanks for uploading this very informative and educational video! One last thing which you for some reason avoided mentioning is the excellent complementarity between data-intensive surveillance systems and data-hungry AI. This combination would lead to exactly the terrifying dystopian scenes portrayed in the Stargate and Terminator movie series, in which intelligent machines track down and destroy humans.
  • I have mixed feelings about these capabilities. The raw technology is fascinating, and there are plenty of scientific and public safety applications. Tracking emerging volcanoes, their eruptions, and subsequent evacuations would be of interest in the near future. After that, law enforcement could do a lot with this tech, and that's where my gut instinct is troubled. I love the idea of faster and safer resolutions to Amber Alerts, but I've seen enough body cam footage to know that there are plenty of officers who overstep their authority - and could easily bring George Orwell's 1984 to life here. Also chilling would be less-than-noble civilian uses; harassment of politicians, skewed narratives in divorce court, stalking a battered spouse, etc. I hope the legislative and judicial forces of the free world find some responsible limits for the tracking of average citizens.
  • Great video. Having done a lot of work in commercial aerial imaging and satellite imagery, this tech is definitely next level. eg. a 2mm res is just insanely good. The issues with processing the volumes of real-time data are absolutely huge, so image subtraction and feature extraction are a clever way to make this manageable and usable for analysis. I'm guessing the recent advances in AI such as GPT-4 applied to this ISR use-case will enable another huge leap forward with smarter processing at the edge.
  • @AJ-ln4sm
    I feel for whoever is watching me, they are probably bored to death.
  • @garivera15
    The secondary soundtrack is not only distracting but also annoying. Why was it necessary to include it?