
A DDDAS-based Autonomous Situational Awareness System for 3-D Border Surveillance

Sponsor: Air Force Office of Scientific Research, FA9550-12-1-0238 (DDDAS); 15RT1016 (New)
Program Manager: Dr. Frederica Darema

Sara Minaeian¹, Seunghan Lee¹, Yifei Yuan¹, Dr. Young-Jun Son¹, Dr. Jian Liu¹, Dr. Jyh-Ming Lien²
¹ Systems and Industrial Engineering, University of Arizona
² Computer Science, George Mason University

DDDAS Conference 2016 - August 11, 2016

Agenda
• Background
• New Project Overview
• DDDAMS-based Planning and Control Framework
• Proposed Approach and Emerging Challenges
• Experiments and Analysis


Overview of Previous Project
Motivation: TUS-1 Project (a 23-mile-long stretch of the US/Mexico border near Sasabe, AZ)

Problem: A highly complex, uncertain, dynamically changing border environment

Goal: Develop a simulation-based planning and control system for surveillance and crowd control via collaborative UAVs/UGVs

Proposed approach: Hardware-in-the-Loop Dynamic Data Driven Adaptive Multi-scale Simulation (DDDAMS)
• Incorporates real UAVs/UGVs in addition to simulated ones
• Adopts the Dynamic Data Driven Application Systems (DDDAS) paradigm (Darema, 2004)
• Utilizes different fidelities within the simulation


DDDAMS-based Planning and Control Framework


Multi-Resolution Data

Challenge: aggregating multi-resolution data
Opportunity: UAVs' global perception and UGVs' detailed perception


Detection Module: Modeling

Cameras:
• GoPro HERO 3+ on a Tarot gimbal: HD (16:9) 1280x720 @ 120 ~ 25 fps; FOV(x): 64.4°, FOV(y): 37.2°
• Carl Zeiss Tessar HD 1080p: HD (16:9) 1280x720 @ 30 fps; FOV: 90°

Onboard computer: ODROID-U3, 1.7 GHz quad-core ARM Cortex-A9, Linux-based operating system

[Figure: camera geometry relating the altitude h and the fields of view FOV(x), FOV(y) to the ground detection ranges DR(x), DR(y)]
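The geometry in the figure relates the UAV's altitude h and the camera's fields of view to the ground detection ranges DR(x) and DR(y). The slide does not spell out the formula, so the sketch below assumes the standard flat-ground, nadir-pointing relation DR = 2·h·tan(FOV/2); the altitude value is illustrative only.

```python
import math

def ground_footprint(h, fov_x_deg, fov_y_deg):
    """Ground detection range of a downward-facing camera at altitude h.

    Assumes the standard flat-ground, nadir-pointing relation
    DR = 2 * h * tan(FOV / 2); the slide only labels h, FOV(x/y), DR(x/y).
    """
    dr_x = 2.0 * h * math.tan(math.radians(fov_x_deg) / 2.0)
    dr_y = 2.0 * h * math.tan(math.radians(fov_y_deg) / 2.0)
    return dr_x, dr_y

# Example with the GoPro FOV values from the slide at an assumed 30 m altitude.
print(ground_footprint(30.0, 64.4, 37.2))   # roughly (37.8, 20.2) meters
```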


Detection Module: Results
• Optical-flow-based motion detection
• HOG-based human classification
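As a rough illustration of these two steps, the sketch below uses OpenCV's Farneback dense optical flow and its built-in HOG people detector as stand-ins for the module's detectors; the video file name and the motion threshold are assumptions, not taken from the project.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("uav_video.mp4")        # hypothetical input file
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Dense optical flow between consecutive frames (motion detection).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    motion_mask = (mag > 1.0).astype(np.uint8)   # threshold is an assumption
    # motion_mask could gate which regions are passed to the classifier.

    # HOG-based human classification on the frame.
    rects, weights = hog.detectMultiScale(frame, winStride=(8, 8))

    prev_gray = gray
```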


Detection Module: Localization [1]

Perspective transformation:
• Requires at least 4 coplanar, non-collinear landmarks
• UGVs with known positions serve as landmarks, providing:
  - real-world location (GPS)
  - image location (labels)
• Landmarks with (x, y): known GPS location; (u, v): detected image location

Weak perspective approximation: [U_i(t), V_i(t), W_i(t)]^T = M · [x_i(t), y_i(t), 1]^T

[Figure: UAV viewing UGVs as colored landmarks; UAV's detection range, UGVs' detection ranges, and crowd individuals]

1. Minaeian, S., Liu, J., and Son, Y.-J. (2015). Vision-based Target Detection and Localization via a Team of Cooperative UAV and UGVs. IEEE Transactions on Systems, Man, and Cybernetics, 46(7): 1005-1016.
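A minimal sketch of this localization step, assuming OpenCV is used to estimate the perspective transformation from four coplanar landmark correspondences; all coordinates below are placeholders, and the actual system recovers the mapping as described in [1].

```python
import numpy as np
import cv2

# Image locations (u, v) of four detected landmarks (e.g., color-marked UGVs)
# and their known ground positions (x, y) in a local metric frame.
# All coordinates below are illustrative placeholders.
img_pts = np.array([[320, 410], [610, 395], [590, 180], [340, 175]], dtype=np.float32)
gps_pts = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 12.0], [0.0, 12.0]], dtype=np.float32)

H, _ = cv2.findHomography(img_pts, gps_pts)     # perspective transform: image -> ground

# Map a detected target's image location to ground coordinates.
target_uv = np.array([[[455.0, 300.0]]], dtype=np.float32)
target_xy = cv2.perspectiveTransform(target_uv, H)[0, 0]
print("estimated ground position:", target_xy)
```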


Tracking Module: Modeling [1]

• Auto-regression-based motion modeling: prediction P̂^A_{l,m}(t+τ)
• Grid-based crowd dynamics modeling: prediction P̂^G_{l,m}(t+τ)

1. Yuan, Y., Li, M., Son, Y.-J., and Liu, J. (2015). DDDAS-based Information-Aggregation for Crowd Dynamics Modeling with UAVs and UGVs. Frontiers in Robotics and AI, 2(8).


Tracking Module: Results

Aggregated prediction: P̂_{l,m}(t+τ) = w_{l,m} · P̂^A_{l,m}(t+τ) + (1 - w_{l,m}) · P̂^G_{l,m}(t+τ)

Bayesian estimation: 75% less computation time, with comparable or better prediction performance
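The aggregation rule above combines the two model predictions cell by cell. A direct sketch of that formula, with placeholder grids and an assumed constant weight:

```python
import numpy as np

# Per-cell crowd-density predictions for time t + tau on an L x M grid.
P_A = np.random.rand(20, 20)        # auto-regression-based prediction (placeholder values)
P_G = np.random.rand(20, 20)        # grid-based crowd-dynamics prediction (placeholder values)
w = np.full((20, 20), 0.6)          # cell-wise aggregation weights (assumed constant here)

# Aggregated prediction: P_hat = w * P_A + (1 - w) * P_G, applied cell by cell.
P_hat = w * P_A + (1.0 - w) * P_G
```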


Tracking Module: Social Force

The direction/angle and walking speed of humans are modeled using two heuristics based on their visual input (Moussaïd et al., 2011); a code sketch follows below.
• Heuristic 1: choose the walking direction α that minimizes the distance to the destination:
  α* = argmin_α d(α), where d(α)² = d_max² + f²(α) - 2·d_max·f(α)·cos(α - α₀)
• Heuristic 2: adapt the walking speed to avoid collisions:
  v = min(v₀, d_h / τ)

Parameters:
• Human field of view: (-φ, +φ) (e.g., -90°, +90°)
• Maximum range of view: d_max (e.g., 10 m)
• Comfortable walking speed: v₀ (e.g., 1.5 m/s)
• Distance to obstacle: d_h
• Relaxation time (time required to adopt the new behavior): τ (e.g., 1 s)
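A minimal sketch of the two heuristics, assuming f(α), the distance to the first obstruction in direction α, is supplied by the perception side; the example obstacle layout is made up.

```python
import numpy as np

def choose_direction(alpha_0, d_max, f, phi=np.pi / 2, n_angles=91):
    """Heuristic 1: pick the direction alpha in (-phi, +phi) minimizing
    d(alpha)^2 = d_max^2 + f(alpha)^2 - 2 * d_max * f(alpha) * cos(alpha - alpha_0),
    where f(alpha) is the distance to the first obstruction in direction alpha."""
    angles = np.linspace(-phi, phi, n_angles)
    f_vals = np.array([f(a) for a in angles])
    d2 = d_max**2 + f_vals**2 - 2.0 * d_max * f_vals * np.cos(angles - alpha_0)
    return angles[np.argmin(d2)]

def choose_speed(v0, d_h, tau):
    """Heuristic 2: walk at the comfortable speed v0 unless an obstacle at
    distance d_h would be reached within the relaxation time tau."""
    return min(v0, d_h / tau)

# Example: free space ahead (f = d_max), except a wall 3 m away beyond +30 degrees.
f = lambda a: 3.0 if a > np.deg2rad(30) else 10.0
alpha = choose_direction(alpha_0=0.0, d_max=10.0, f=f)
v = choose_speed(v0=1.5, d_h=3.0, tau=1.0)
```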


Motion Planning Module: Modeling

• Given a selected destination for the UAV/UGV, find the path that optimizes a combination of criteria:
  - f1: vehicle travelling distance (Euclidean)
  - f2: composite function involving vehicle altitude/elevation change
• Path options: (a) minimize travel distance; (b) minimize energy consumption; (c) minimize the weighted average of (a) and (b)
• Objective: weighted average of the multiple objectives (a planning sketch follows below)
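A sketch of how such a weighted multi-objective path could be computed, assuming a simple Dijkstra search over an elevation grid with edge cost w1·(Euclidean step length) + w2·|elevation change| as stand-ins for f1 and f2; the terrain and weights are placeholders, not the project's actual planner.

```python
import heapq
import math
import numpy as np

def plan_path(elev, start, goal, w1=0.5, w2=0.5):
    """Dijkstra over a 2-D elevation grid with edge cost
    w1 * Euclidean step length + w2 * |elevation change|."""
    rows, cols = elev.shape
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, math.inf):
            continue
        for du, dv in [(-1, 0), (1, 0), (0, -1), (0, 1),
                       (-1, -1), (-1, 1), (1, -1), (1, 1)]:
            v = (u[0] + du, u[1] + dv)
            if not (0 <= v[0] < rows and 0 <= v[1] < cols):
                continue
            step = w1 * math.hypot(du, dv) + w2 * abs(elev[v] - elev[u])
            if d + step < dist.get(v, math.inf):
                dist[v] = d + step
                prev[v] = u
                heapq.heappush(pq, (d + step, v))
    # Reconstruct the path from goal back to start.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

elev = np.abs(np.random.randn(30, 30))          # synthetic terrain (placeholder)
path = plan_path(elev, (0, 0), (29, 29), w1=0.7, w2=0.3)
```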


Agent-based Hardware-in-the-Loop Simulation

• Vehicles: UAV (APM:Copter / ArduCopter) and UGV (APM:Rover / ArduRover)
• Hardware interface: MAVProxy; sensory data (e.g., GPS) flow from the vehicles to the simulation, and control commands (MAVLink messages) flow back to the vehicles
• Agent-based simulation: Repast Simphony with 3-D GIS
• Vision processing: optical-flow-based motion detection and HOG-based human classification
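A minimal sketch of the hardware-interface side, assuming pymavlink (the library underlying MAVProxy) and a local UDP endpoint; the connection string and the arm command are illustrative, not the project's actual message set.

```python
from pymavlink import mavutil

# Connect to the autopilot stream forwarded by MAVProxy (endpoint is an assumption).
link = mavutil.mavlink_connection("udp:127.0.0.1:14550")
link.wait_heartbeat()
print("Heartbeat from system", link.target_system)

# Sensory data: read a GPS/position message published by the vehicle.
msg = link.recv_match(type="GLOBAL_POSITION_INT", blocking=True, timeout=5)
if msg is not None:
    print("lat, lon, alt:", msg.lat / 1e7, msg.lon / 1e7, msg.relative_alt / 1000.0)

# Control command: arm the vehicle via a MAVLink COMMAND_LONG message.
link.mav.command_long_send(
    link.target_system, link.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM, 0,
    1, 0, 0, 0, 0, 0, 0)
```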


Agenda: New Project Overview

UAVs-UGVs Surveillance Framework
• Aerial targets as well as land-based targets
• Not all targets are enemies


[Figure: the three surveillance levels]
• High altitude level (HAL)
• Low altitude level (LAL)
• Surface level (SL)


3-Level Measurement System in Border Surveillance

Level | Type | Sensors | Measurement Data
High altitude level (HAL) | Remote sensing | Electro-Optical/Infrared (EO/IR), Synthetic Aperture Radar (SAR) | EO/IR image, SAR image, spectral image
Low altitude level (LAL) | Mobile sensors | Surveillance camera | Lidar image, thermal images
Surface level (SL) | Fixed sensors | | Magnetic data


Agenda: DDDAMS-based Planning and Control Framework

Revised DDDAMS-based Framework - Level 0


Agenda: Proposed Approach and Emerging Challenges

1.0 - Target DRI
Detection, Recognition, and Identification of aerial and land-based foe targets (e.g., traffickers), distinguishing them from regular targets.

Detection: discovery by any means of the presence of a person, object, or phenomenon*
• Sensing technologies: thermal technologies, radar, etc.
• Motion detection, optical flow, etc.

* Military, U.S. (2005). Dictionary of military and associated terms. US Department of Defense.


1.0 - Target DRI (continued)

Recognition: determination of the nature of a detected person, object, or phenomenon, and its class or type*
• Image feature extraction
• Multivariate classification, HOG, etc.
• Information-aggregation method for multi-sensor data

* Military, U.S. (2005). Dictionary of military and associated terms. US Department of Defense.


1.0 - Target DRI (continued)

Identification: discrimination between recognizable objects as being friendly or enemy*
• Gaussian Mixture Model for objective identification
• BDI (Belief-Desire-Intention) framework

* Military, U.S. (2005). Dictionary of military and associated terms. US Department of Defense.
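As an illustration of the GMM-based idea, the sketch below fits a two-component Gaussian mixture to placeholder behavior features with scikit-learn and scores a new observation; the features, component count, and their interpretation are assumptions, not the project's model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Feature vectors extracted from tracked targets (placeholder random data);
# e.g., speed, heading change, distance to the border fence.
X_train = np.random.rand(200, 3)

# Fit a two-component mixture (e.g., "regular" vs. "suspicious" behavior modes).
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X_train)

x_new = np.random.rand(1, 3)                 # a newly observed target's features
component = gmm.predict(x_new)[0]            # most likely mixture component
posteriors = gmm.predict_proba(x_new)[0]     # responsibility of each component
```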


Emerging Challenges - Target DRI

[Diagram: training data from multiple sensor types (image, thermal sensor, magnetic sensor, underground sensors, radar, etc.) for classes such as vehicles, armed people, and animals feed an information-aggregated classifier; each new observation receives a predicted class, and the classifier is updated once the real class is known]

[Scatter plot: sample image data in a two-dimensional feature space (Feature 1 vs. Feature 2) with clusters of vehicles (V), animals (A), and armed people (AP)]

Challenge: feature extraction. Potential method: discriminant analysis.
Challenge: classification under the DDDAS framework. Potential methods: SVM, QDA, KNN (see the sketch below).
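A sketch of the classification step with the three candidate methods, using scikit-learn on placeholder aggregated features; the feature dimension, labels, and the refit-on-real-class loop are illustrative assumptions about how the DDDAS updating could look.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

# Aggregated feature vectors (e.g., image + thermal + magnetic features) and
# class labels 0 = vehicle, 1 = animal, 2 = armed people (placeholder data).
X_train = np.random.rand(300, 8)
y_train = np.random.randint(0, 3, size=300)

classifiers = {
    "SVM": SVC(kernel="rbf", probability=True),
    "QDA": QuadraticDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)

x_new = np.random.rand(1, 8)                    # a new multi-sensor observation
predictions = {name: clf.predict(x_new)[0] for name, clf in classifiers.items()}

# DDDAS-style updating: once the real class is known, append it and refit.
X_train = np.vstack([X_train, x_new])
y_train = np.append(y_train, 2)                 # assumed ground-truth label
for clf in classifiers.values():
    clf.fit(X_train, y_train)
```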

2.0 - Pattern Processing
Recognition and prediction of the objective, behavior, and route patterns of foe targets
• Spatial optimization method


Emerging Challenges - Pattern Processing

[Figure: a slice of the 3-D grid matrix with class values, e.g.
  … 0 0 1 0 …
  … 0 1 0 0 …
  … 2 0 0 0 …
  … 0 0 3 0 … ]

3-D grid matrix with class values: 0 = unoccupied, 1 = neutral, 2 = friend, 3 = foe.

Challenge: the computational load increases from n² to n³ · K, where K is the number of classes (see the sketch below).
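A small sketch of the 3-D class grid and the bookkeeping that drives the n³·K growth, using NumPy; the grid size and the marked cell are placeholders.

```python
import numpy as np

n, K = 100, 4                      # grid resolution per axis and number of classes
UNOCCUPIED, NEUTRAL, FRIEND, FOE = 0, 1, 2, 3

# 3-D grid of class values covering the surveillance volume: n^3 cells
# instead of the n^2 cells of a 2-D ground grid.
grid = np.zeros((n, n, n), dtype=np.int8)
grid[10, 20, 3] = FOE              # mark an example cell (indices are placeholders)

# Per-class occupancy counts illustrate the n^3 * K work performed per update.
counts = {c: int(np.count_nonzero(grid == c)) for c in range(K)}
```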


3.0 - Mission Control
Resource allocation and motion planning for controlling foe targets and terminating/mitigating their activities
• Targets' predicted objectives
• Optimized sensor allocation
• Persistent surveillance


Emerging Challenges - Mission Control

Challenges: risk assessment and uncertainty quantification

[Diagram: target pattern prediction for Targets 1-3 feeds target risk assessment; the risk estimates are updated with new observations and passed to allocation optimization]

Target | Risk
1 | 0.75
2 | 0.97
3 | 0.23


Agenda: Experiments and Analysis

Experiment 1: Visual Sensors - Tracking
Template tracking: particle sampling
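A minimal sketch of particle-sampling-based tracking: one bootstrap particle-filter cycle (predict, weight, resample) on a template's image position. The motion and measurement noise levels and the particle count are assumptions, not the experiment's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_std=3.0, meas_std=5.0):
    """One predict/update/resample cycle of a bootstrap particle filter
    tracking a template's (x, y) image position (noise levels are assumptions)."""
    # Predict: random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight by Gaussian likelihood of the detected template position.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_std**2)
    weights /= weights.sum()
    # Resample: multinomial resampling via np.random.choice.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Initialize 500 particles around an initial template detection at (320, 240).
particles = np.tile(np.array([320.0, 240.0]), (500, 1)) + rng.normal(0, 10, (500, 2))
weights = np.full(500, 1.0 / 500)
particles, weights = particle_filter_step(particles, weights,
                                          measurement=np.array([325.0, 238.0]))
estimate = particles.mean(axis=0)
```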


Experiment 1: Visual Sensors - Tracking (continued)
Handling occlusion: adaptive modeling


Experiment 2: Seismic Sensors - DRI
• Real-time seismic data are collected by geophones on the ground.
[Photos: geophones, sensor configuration, and real-time data collection]


Experiment 2: Seismic Sensors - DRI (continued)
• Detection stage: [plots comparing a normal situation with a target appearance; a target's appearance produces increased signal magnitude]
• Identification stage: [plots comparing a target walking with a target running; running produces higher signal frequency]
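A rough sketch of how such seismic DRI could be scripted: windowed RMS magnitude for the detection stage and the dominant spectral frequency for walking-vs-running identification. The sampling rate, thresholds, and the 2.5 Hz cutoff are invented for illustration and are not from the experiment.

```python
import numpy as np

fs = 100.0                                   # sampling rate in Hz (an assumption)
t = np.arange(0, 3.0, 1.0 / fs)
signal = 0.05 * np.random.randn(t.size)      # placeholder geophone trace (noise)
signal += 0.4 * np.sin(2 * np.pi * 2.0 * t)  # synthetic footstep component

# Detection stage: flag a target when the windowed RMS magnitude exceeds
# a baseline threshold (threshold value is an assumption).
window = 100
rms = np.sqrt(np.convolve(signal**2, np.ones(window) / window, mode="same"))
target_detected = bool(np.any(rms > 0.2))

# Identification stage: compare the dominant frequency against a walking/running
# boundary (2.5 Hz here is an illustrative cutoff, not from the slides).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
label = "running" if dominant > 2.5 else "walking"
```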


Experiment 3: Various Sensors - Simulated


Ongoing Work
• Feature extraction and classification of aerial and ground targets
• Identification of foe targets based on various sensor data
• Data aggregation from the three levels of heterogeneous sensors
• Efficient prediction of foe targets' behaviors
• Introducing uncertainty into resource allocation
• Data collection for realistic scenarios and model validation
• Modeling sensors in a physics-based simulation


Thank You

Sponsor: Air Force Office of Scientific Research, FA9550-12-1-0238 (DDDAS); 15RT1016 (New)
Program Manager: Dr. Frederica Darema
PIs: Young-Jun Son¹, Jian Liu¹, Jyh-Ming Lien²
Students: S. Minaeian¹, S. Lee¹, Y. Yuan¹
¹ Systems and Industrial Engineering, University of Arizona
² Computer Science, George Mason University
PI Contact: [email protected]; (520) 626-9530
http://www.sie.arizona.edu/faculty/son/index.html