Mobile ISR: Intelligent ISR Management and Exploitation for the Expeditionary Warfighter

D. Madden, T. Choe, H. Deng, K. Gunda, H. Gupta, N. Ramanathan, Z. Rasheed, E. Shayne
ObjectVideo, Inc., Reston, VA, USA
{dmadden, tchoe, hdeng, kgunda, hgupta, nramanathan, zrasheed, eshayne}@objectvideo.com

A. Hakeem
Decisive Analytics Corp., Arlington, VA, USA
[email protected]

Abstract - Modern warfighters are informed by an expanding variety of Intelligence, Surveillance and Reconnaissance (ISR) sources, but the timely exploitation of this data poses a significant challenge. ObjectVideo (“OV”) presents a system, Mobile ISR, to facilitate ISR knowledge discovery for expeditionary warfighters. The aim is to collect, manage, and deliver time-critical information when and where it is needed most. The Mobile ISR system consumes video, still imagery, and target metadata from airborne, ground-based, and hand-held sensors, and indexes that data based on content using state-of-the-art video analytics and user tagging. The data is stored in a geospatial database and disseminated to warfighters according to their mission context and current activity. The warfighters use an Android mobile application to view this data in the context of an interactive map or augmented reality display, and to capture their own imagery and video. A complex event processing engine enables powerful queries to the knowledge base. The system leverages the extended DoD Discovery Metadata Specification (DDMS) card format, with extensions to include representation of entities, activities, and relationships.

I. INTRODUCTION

The modern Intelligence, Surveillance, and Reconnaissance (ISR) enterprise takes inputs from a wide variety of sensors, including narrow- and wide-area aerial video payloads, Ground Moving Target Indicator (GMTI) radars, and a multitude of cameras on the ground, both stationary and carried by warfighters. Tremendous effort has been invested in processing, managing, and visualizing this data to the benefit of ISR analysts. However, for the expeditionary warfighter in the field, operating without large screens and desktop interfaces, this data is difficult or impossible to access.

There are a number of challenges in bringing the power of the ISR enterprise to the expeditionary warfighter. Mobile network reliability, bandwidth, and portable power constraints limit the amount of real-time data and processing available at the tactical edge. Even without these limitations, there exists far more data than a single warfighter has time to look at, much less make sense of. And, of course, the warfighter is already quite busy with other tasks, his or her hands and attention occupied. It is the goal of this project to overcome these challenges and leverage the vast amount of data in the ISR enterprise to improve the situational awareness of the expeditionary warfighter.

We present a Mobile ISR system which takes advantage of low-cost, commercial off-the-shelf (COTS) mobile hardware to deliver the most important pieces of information to the warfighters who need it the most, and to allow them to easily contribute their own ISR reports to the enterprise. The system uses video analytics and a complex event processing engine to automatically prioritize key ISR data. Detailed blue-force tracking allows the system to disseminate background information and alerts to each warfighter based upon their location and mission context. Each warfighter can access this data via streamlined map and augmented reality interfaces running on compact Android tablets, with the ability to quickly capture and share geo-tagged photos, videos, and text reports. An optional Android Wear smartwatch allows the user to easily receive, examine, and act upon new alerts and mission goals with a minimum of interaction.

This paper is structured as follows: In Section II, some key technical challenges for the Mobile ISR system are detailed. In Section III, the architecture of the developed system is presented and its different modules and capabilities are briefly described. In Section IV, details of the system components and their implementation are described. Section V offers details of system deployment at a field exercise, followed by a discussion of future directions in Section VI.

This work was funded through contracts N00014-12-C-0423, N00014-13-C-0275, and N00014-11-C-0514 and performed under the supervision of the Office of Naval Research.

978-1-4799-5921-1/14/$31.00 ©2014 IEEE

II. TECHNICAL CHALLENGES

This section addresses the technical challenges in designing and implementing the system and sustaining its operation in realistic scenarios.

A. Network and Power Constraints

Warfighters in austere environments do not have the luxuries of stable electrical grids and broadband connections. Power is often limited to batteries carried on-person. Network connectivity can come from a variety of sources (mobile ad-hoc networks (MANET), tactical cellular, satellite), often requiring hops through multiple heterogeneous links and subject to frequent outages. A mobile solution must be resilient to non-ideal network bandwidth and reliability, as expressed in Figure 1, while taking care to use the mobile device’s radios efficiently to conserve battery power.

B. Data Overload

Sensors in the battlefield produce far more data than any one person could attend to, even in ideal conditions. Even a fraction of this data could easily overwhelm the attention of a warfighter carrying out other tasks. Given the network limitations previously discussed, the data must be filtered at a point proximal to the tactical edge. It is equally vital that information be organized and properly presented so that each warfighter receives the data that is useful to their particular mission and current activity while de-emphasizing data that may be distracting. Naïve attempts at filtering this data along single axes such as location, time, or severity often exclude useful data without sufficiently reducing the overall volume.

C. Warfighter Situational Awareness

Warfighters operate in challenging environments. We can expect that their bodies, senses, and cognition are all productively occupied doing their jobs, leaving little room for interaction with a mobile computer. For a mobile ISR application to succeed, it must quickly provide enhanced situational awareness to the warfighter with a minimum of distraction and interaction. In the consumer market, eager adoption of new mobile devices occurs only when the real benefits of the device outweigh the ever-present drawbacks: additional weight and bulk, charging batteries, cost, etc. The same is certainly true in the military; however, the stakes are higher.

We explain how these challenges are addressed by the Mobile ISR system in the System Architecture and Implementation sections which follow.

III. SYSTEM ARCHITECTURE

The system is composed of distributed applications and services, using RESTful web services [1] for communication. Figure 2 shows a high-level diagram of the system architecture. Deployment of each component is flexible and allows services to be placed according to the particular network topology. Individual services can be deployed to a cloud infrastructure, as virtual machines at a forward operating base, or running on small form factor hardware at the tactical edge.

Figure 1: Network challenges at the tactical edge.

Figure 2: Mobile ISR Architecture. Live or recorded FMV (H.264, MPEG-4, WMV) with metadata (if available) and target tracks (STANAG 4607, ObjectVideo Primitives) feed the Video Analytics engine. The DDMS Adapter stores extended DoD Discovery Metadata Specification (DDMS) cards to an Accumulo database; the Complex Event Processing engine outputs event detections and query results as extended DDMS cards, GHub datasets, or Accumulo records; and the Mobile ISR Service exchanges knowledge objects (JSON mission, warfighter, and point of interest data, JPEG imagery, and MP4 video) with the mobile and web clients.

A. Video Analytics

The system leverages ObjectVideo’s existing video analytics to ingest full motion video from the ISR enterprise or independent sources. The video analytics engine georegisters video from calibrated ground cameras or instrumented aerial sensors to a DEM or LiDAR model, and performs tracking, detection, and classification to produce geo-indexed metadata on moving targets. Metadata is linked to the source video for fast retrieval and review of evidential footage.

B. Department of Defense Discovery Metadata Specification (DDMS) Adapter

DDMS is an XML metadata standard within the DCGS Integration Backbone (DIB) [2]. The core layer of the DDMS schema contains valuable catalog information, such as author, timestamps, and geolocation, which can be used to index and retrieve full motion video assets. ObjectVideo has extended this format to describe the contents of the data as well, including entities (movers, structures, and landmarks), events, and the relationships between them.

Target metadata from the video analytics engine or from third party sources (using the NATO Standardization Agreement (STANAG) 4607 format [3]) are fed to the DDMS Adapter, which translates the metadata into properly formatted DDMS “cards”. The extended DDMS cards are then stored in a geo-indexed Accumulo/GeoMesa database [4,5] for rapid retrieval. They can also be shared with the ISR enterprise via NRL’s GeoSpatial Hub (GHub) [6], where they can be syndicated to the Tactical Service Oriented Architecture (T-SOA) or parsed by the Active-Wiki knowledge repository [7]. This data is also made available to the Complex Event Processing engine.

C. Complex Event Processing

ObjectVideo has developed a complex event processing framework that uses hierarchical inference to detect specified activities and conditions in large datasets. Using a drag-and-drop graphical user interface (GUI), users can build complex event specifications from individual elements such as agents, actions, scene elements, and relationships. In Figure 3, the complex event processing framework has detected a predefined event: a person (labeled “242”) crossing a tripwire (shown in red) and then meeting with another person.

These event specifications can be used to cue in analysts and warfighters about important events happening all over the battlefield, such as movements of multiple vehicles or distinctive activities by actors. Detected events from one or more specifications are routed to the Mobile ISR Service, where they can be appropriately disseminated to expeditionary warfighters.

Figure 3: Results of a complex event detection.

D. Mobile ISR Service

The Mobile ISR Service hosts a RESTful HTTP API which is used by the mobile clients and web clients alike to share relevant ISR data and alerts. The service receives periodic status updates from each mobile client, and maintains blue force tracking information on each, including the current mission and activity. The service updates each mobile client according to this information, providing the most relevant data according to the needs of each warfighter. The service also tracks receipt of this data to avoid unnecessary retransmission over mobile networks. The mobile service also receives geo-tagged ISR imagery, video, and text reports from the mobile clients, which can be forwarded on to the ISR enterprise according to configurable rules.

E. Mobile Client

The mobile client software runs on an Android tablet or smartphone and provides two interfaces for geographical situational awareness: an overhead map view and an augmented reality view, which show the locations of nearby targets and points of interest in relation to the warfighter. A background process periodically updates the Mobile ISR Service with the warfighter’s location, activity, and other status information and receives updates on relevant ISR data. The mobile client also allows the user to quickly capture and submit geo-tagged imagery, video, and text reports. The Android tablet application optionally supports the Android Wear API, sending certain alerts to a compatible smartwatch, where the warfighter can be notified of them and can view and act upon them in a compact interface without taking the tablet out of a pocket.

F. Web Client

The web client serves as a command and control interface as well as providing a useful debugging view into the system. Hosted by the Mobile ISR Service, it displays all targets, warfighters, and points of interest on an interactive map in any modern web browser. The web client allows the user to explore the ISR data, follow warfighters’ progress and view their previous tracks and activity, and send new missions and mission information to warfighters.
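As an illustration of the hierarchical composition idea behind the complex event processing framework, the following minimal Java sketch combines two simple events (a tripwire crossing and a meeting) into a sequence constrained to the same tracked agent. All class, field, and method names here are hypothetical illustrations, not the framework's actual API.

```java
import java.util.*;

// Hypothetical, simplified sketch of hierarchical event composition:
// simple events (agent + action) are combined into a complex event
// requiring the same agent and temporal ordering.
public class ComplexEventSketch {
    // One detected simple event from the metadata store (illustrative).
    public static final class SimpleEvent {
        final String agentId;  // tracked target label, e.g. "242"
        final String action;   // e.g. "CROSS_TRIPWIRE", "MEET"
        final long timeMillis; // detection time
        public SimpleEvent(String agentId, String action, long timeMillis) {
            this.agentId = agentId;
            this.action = action;
            this.timeMillis = timeMillis;
        }
    }

    // Detect "agent performs actionA, then the SAME agent performs actionB".
    public static List<String> detectSequence(List<SimpleEvent> events,
                                              String actionA, String actionB) {
        List<String> matches = new ArrayList<>();
        for (SimpleEvent a : events) {
            if (!a.action.equals(actionA)) continue;
            for (SimpleEvent b : events) {
                if (b.action.equals(actionB) && b.agentId.equals(a.agentId)
                        && b.timeMillis > a.timeMillis) {
                    matches.add(a.agentId);
                }
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<SimpleEvent> events = Arrays.asList(
            new SimpleEvent("242", "CROSS_TRIPWIRE", 1000),
            new SimpleEvent("242", "MEET", 5000),
            new SimpleEvent("307", "MEET", 2000));
        // Only target 242 crossed the tripwire and later met someone.
        System.out.println(detectSequence(events, "CROSS_TRIPWIRE", "MEET")); // prints [242]
    }
}
```

In the deployed system, such constraints are evaluated as queries against the spatially-indexed Accumulo/GeoMesa store rather than over in-memory lists as shown here.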

IV. IMPLEMENTATION

The following sections describe the implementation of the system outlined above and the methods employed to overcome the challenges outlined in Section II. Please note that core issues of automated video analytics, such as robust target and activity detection, classification, geolocation of video, and tracking in single and multiple camera views, are not discussed in depth in this paper; interested readers are referred to [8,9,10] for detailed discussions on these topics.

A. Complex Event Processing and Specification with Extended DDMS Cards

As discussed in the Architecture section, the Video Analytics engine automatically extracts metadata from the videos by leveraging existing algorithms that classify tracked targets as vehicles or pedestrians based on shape and appearance in a linear discriminant analysis framework. Vehicles are further classified according to class (passenger car, SUV, or pickup truck) using a 3D model-based fitting method and annotated with a color feature descriptor. This allows the system to detect events such as “a blue SUV entered an area of interest”. Approaches exist to fit and classify vehicles; however, few of them run in real time. The novel real-time vehicle fitting algorithm that OV developed under a separate effort uses simplified 3D models learned from detailed Computer-Aided Design (CAD) vehicle models representing the abovementioned vehicle classes of interest. Each simple model contains 16 vertices and 28 facets that best approximate the mean shape of multiple CAD models in each class, as shown in Figure 4.

Figure 4: Simplified 3D vehicle models (passenger car, SUV, and pickup truck) learned from various vehicle types.

The output of the Video Analytics engine is detailed, georegistered metadata for each target, which the DDMS Adapter processes to create representative DDMS cards. ObjectVideo’s additions to the Extensible Layer of the DDMS specification allow the metadata to describe the content of the video, including the geospatial extent of coverage over time, the entities in the video and their movements, simple events such as loitering or tripwire crossing, and spatio-temporal or identity relationships between elements. The cards are stored in a geospatial database (Accumulo/GeoMesa) for later retrieval. This provides a means of sharing rich content information across systems and across the DoD enterprise.

At this point, a huge amount of data has been extracted from the video, registered to the map, cataloged, and indexed. It is now the job of the Complex Event Processing engine to identify pertinent events which might be of interest to the warfighter. The system allows an operator to specify and then automatically detect complex activities undertaken by targets over wide areas of the battlefield, given adequate tracking data.

The Complex Event Specification UI enables hierarchical complex event specification using intuitive drag-and-drop icons and drawing tools. The system incorporates Geographic Information Systems (GIS) data and a geo-browser interface (NASA World Wind [11]) to allow geospatial elements in the event specifications, and alerts to be displayed in context on the map.

In Figure 5, the user has created a specification to find persons loitering inside an area of interest prior to meeting another person. To create this, the user placed the “Person” agent on the screen, then selected the “Area of Interest” scene element, and defined that area on a map of the site. The user then selected the “Loiter” action and dragged it between the “Person” and “Area of Interest” to connect them and create a simple event. The user followed a similar process to create a “Person meets Person” event, and then dragged arrows between the first agent in each event to indicate that the person involved should be the same (this is dependent on accurate tracking of this person earlier in the pipeline). This graphical representation is then translated into a series of queries against the spatially-indexed metadata in the Accumulo/GeoMesa database. Target specifications such as size, classification, and color, combined with geospatial and temporal constraints, can be used to narrow queries to return very specific results. Execution of these queries yields detected occurrences of the specified events, which are pushed to the Mobile ISR Service via a RESTful HTTP API.

Figure 5: The Complex Event Specification UI.

B. Mobile Data Conditioning

It is the role of the Mobile ISR Service to condition and filter data between the Mobile ISR Clients and the rest of the system. The Mobile ISR Service receives data from the Complex Event Processing engine as explained above, but can also be populated with data from GHub, map servers, or KML files. This can include alerts about detected events, target sightings, mission data, or simply informational landmarks and points of interest. All interaction between the Mobile ISR Client application and the Mobile ISR Service takes place over a RESTful web service API. This has a number of benefits, including ease of testing both sides of the interface and tremendous expandability; additional clients (such as our browser-based Web Client interface) can easily reuse the API.

The API is defined using the JAX-RS web service standards [12], using annotated Java classes to define method stubs. JavaScript Object Notation (JSON) [13] was chosen for object serialization over SOAP/XML due to its compactness, readability, and compatibility with many web and mobile frameworks. The Mobile ISR Service includes a module to translate from JSON to DDMS and back.

The system makes extensive use of cross-platform, open-source components and frameworks. The Mobile ISR Service and API are implemented in Java using the Jersey framework [14], a JAX-RS implementation, and use the Jackson library [15] to serialize Plain Old Java Objects (POJOs) to JSON and back. Jackson serialization is also used on the Android client, which means the same POJOs can be shared by the server and client. This vastly simplifies development and testing. The service is backed by a PostgreSQL database [16] instance running the PostGIS geospatial extension [17]. Each device update and ISR report is stored and indexed in the PostGIS database for retrieval by time, location, tag, or ID. This enables powerful yet simple queries into the stored data. The Jersey framework produces a Web Application Description Language (WADL) file on demand, which describes the implemented API for would-be integrators.

The service is cross-platform, having been tested on Windows, OS X, and Linux (CentOS 6.2); it has a relatively low memory overhead and runs well on a server with only 1 GB of RAM. It is easily deployable as a Virtual Machine (VM), and requires only a single port to be open for HTTP/HTTPS connections, making it easy to configure on secure networks. Maven [18] is used for dependency management and the build process. HTTPS security and authentication between the service and client provide password-based authentication and Transport Layer Security (TLS) encryption. This is accomplished on the server side by using the Apache web server [19] as a secure proxy to our servlet.

One of the principal functions of the Mobile ISR Service is to disseminate new data to each Mobile ISR Client based on need and context. The service tracks the position, status, and mission of each warfighter and responds with ISR updates which are custom tailored for that user. The service logs the data sent to each client and bundles incremental updates accordingly; each piece of data will be sent only once to each client, saving processing and bandwidth. The service tracks which knowledge objects have been distributed to each client and handles this automatically. If the client should somehow become out of sync, it can request a full refresh.

To determine which data should be included in each update, the service incorporates warfighter location, status, and mission specifications, as well as the content of the data point. To that end, the server supports a representation of each warfighter’s current mission, including:
• Mission name, ID, and description
• Waypoints and other defined geography
• Participants
• Related tracked entities
• Mission objective
• Related points of interest and intelligence
For instance, activity which might normally be deemed unimportant might become important if it is at a related point of interest or en route to a waypoint. By understanding the mission, the Mobile ISR Client can also provide assistance to the user in locating the next waypoint.

The service also calculates a “notification level” for each data point in the updates sent to each warfighter’s device. These notification levels are determined based on the individual warfighter’s position and mission, and the timeliness, severity, affiliation, and recipients of that piece of data, and are specific to each warfighter. The reason or reasons for the assigned notification level are included in the object sent to the client. This notification level is used by the client application to determine whether the knowledge object is simply displayed on the map, is added to a notification list and displayed on the map, or whether an audible/vibration/LED alert is raised in addition. It also determines whether or not the optional smartwatch receives the notification. This ensures that the warfighter is not unduly distracted by low-priority information.

To illustrate the system, take as an example a warfighter on patrol with a Mobile ISR Client device. As the warfighter moves, the client makes an HTTP call to the Mobile ISR Service, POSTing a JSON object containing the warfighter’s status, including geolocation, sensed activity, and an identifier for the warfighter. The service updates its database with this new data, and responds to this POST with a JSON object containing any new data for the warfighter. If this were the first time the client had connected to the service, this update might contain:
• The warfighter’s current mission objective and waypoints,
• ISR data related to the mission objective,
• Points of interest and ISR data near the warfighter and the mission route,
• Positions of hostiles that might be further away, and
• Any ISR data specifically routed to the warfighter (i.e., results of a particular complex event specification).
Since this warfighter’s device has been in contact with the service previously, however, this data is already present on the client, and the service is aware of this. The service includes only a new point of interest that the warfighter has approached and a report of a hostile just submitted by another warfighter. The point of interest is marked with a low notification level; the Mobile ISR Client simply adds it to the map and AR displays. However, the report of the hostile is marked with a high notification level, and causes an alert to appear on the screen and the warfighter’s smartwatch to vibrate.

C. Mobile ISR Client

The Mobile ISR Client is an Android application, deployed on a smartphone or tablet running the Android operating system, version 4.4 [20]. The client interface was designed and tested with the Nexus 7 (2013) tablet [21], as its seven-inch screen provides ample real estate for map navigation (1920 x 1200 pixels) yet is still easily stowed in large pockets. The application was developed using Android standards for interaction, and should be quite easy for anyone familiar with smartphones or tablets to learn and use. The application installs a service to manage communication and background status updates. Notifications use the standard Android mechanisms, and will appear even if the app is in the background.

The client features extensive diagnostics and performance reporting in order to better tune the system and take full advantage of field testing. The status object reported back to the Mobile ISR Service contains the type of network (WiFi/cellular) the device is connected to, the access point if on WiFi, and the signal strength, as well as device battery levels. It also contains a summary of any communications failures since the previous update, including their locations and timestamps. This allows the service not only to reconstruct the warfighter’s path during the outage, but also to begin to build a map of network performance throughout each warfighter’s travels. There is also detailed client-side logging of communication performance. A status indicator alerts the mobile user to any problems with connectivity or background processing. A comma-separated values (CSV) report is also available from the Mobile ISR Service via a web services call. The system has several parameters which are modified for poor network conditions, including adjusting the frequency of updates and reducing imagery sizes through downsampling and higher compression rates.

The system is designed to conserve device battery as much as possible without impacting performance. Cellular radios typically power down slowly and in stages after a data transfer is complete, due to the latency associated with switching the radio “on” from the “off” state and subsequently acquiring a connection. Since even sending a single byte will turn the radio “on” and consume a fixed minimum amount of energy, it is best to batch together data transfers on cellular networks, and also to pre-fetch data that will be used soon while the radio is turned on. Automatic data transfers on the mobile application are designed with this principle in mind, posting data to the service and retrieving updates in a single call.

1) Mobile Client Map View

The Mobile ISR Client’s map interface is one of the two main ways the application presents geospatial data to the warfighter, as shown in Figure 6. The map displays the warfighter’s location and relevant ISR data overlaid over map tiles with labeled road and place names. The overlaid ISR data uses NATO military symbol color coding for affiliation and, where specified, shapes for classification. Users may zoom in and out on the map with a pinch gesture and scroll the map by dragging. The warfighter can also select filters which de-emphasize certain data for clarity.
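The once-only dissemination bookkeeping described in the Mobile Data Conditioning discussion (log what each client has already received, send only the delta, allow a full refresh on loss of sync) can be sketched in a few lines of Java. The class and method names below are illustrative only, not the service's actual implementation.

```java
import java.util.*;

// Illustrative sketch of per-client delta tracking for incremental
// updates, as described for the Mobile ISR Service. Names are invented.
public class DeltaTrackerSketch {
    // clientId -> IDs of knowledge objects already delivered to that client
    private final Map<String, Set<String>> delivered = new HashMap<>();

    // Return only the objects this client has not yet received,
    // and record them as delivered so they are never re-sent.
    public List<String> updateFor(String clientId, Collection<String> available) {
        Set<String> seen = delivered.computeIfAbsent(clientId, k -> new HashSet<>());
        List<String> fresh = new ArrayList<>();
        for (String id : available) {
            if (seen.add(id)) fresh.add(id);
        }
        return fresh;
    }

    // Client lost sync: forget its history so everything is resent.
    public void fullRefresh(String clientId) { delivered.remove(clientId); }

    public static void main(String[] args) {
        DeltaTrackerSketch tracker = new DeltaTrackerSketch();
        List<String> objects = Arrays.asList("poi-1", "poi-2");
        System.out.println(tracker.updateFor("wf1", objects)); // prints [poi-1, poi-2]
        System.out.println(tracker.updateFor("wf1", objects)); // prints []
    }
}
```

In the real service the delivery log is persisted alongside the PostGIS-backed update store rather than held in memory, and the delta is computed as part of the single batched response to each status POST.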

Figure 6: The Mobile ISR Client's map interface.
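As a rough illustration of how a per-warfighter "notification level" might combine a few of the factors described in the Mobile Data Conditioning discussion (the fielded logic also weighs timeliness, affiliation, and recipients), here is a minimal Java sketch. All thresholds, names, and the three-level policy are invented for illustration.

```java
// Illustrative sketch only: combine distance, severity, and mission
// relevance into a notification level for one warfighter. The actual
// service uses additional factors and different, tuned thresholds.
public class NotificationLevelSketch {
    public enum Level {
        MAP_ONLY, // displayed on the map/AR view only
        LIST,     // also added to the notification list
        ALERT     // audible/vibration/LED alert, forwarded to smartwatch
    }

    // severity: 0 (informational) .. 2 (hostile/critical)
    public static Level levelFor(double distanceMeters, int severity,
                                 boolean missionRelated) {
        if (severity >= 2 && (missionRelated || distanceMeters < 2000)) {
            return Level.ALERT;
        }
        if (missionRelated || distanceMeters < 500) {
            return Level.LIST;
        }
        return Level.MAP_ONLY;
    }

    public static void main(String[] args) {
        System.out.println(levelFor(300, 2, false));  // nearby hostile -> ALERT
        System.out.println(levelFor(5000, 0, true));  // mission-related POI -> LIST
        System.out.println(levelFor(5000, 0, false)); // distant landmark -> MAP_ONLY
    }
}
```

As in the paper's patrol example, the same hostile report would alert a nearby warfighter while appearing only on the map for a distant one, and the reasons for the assigned level travel with the object sent to the client.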


The application uses the osmdroid library [22] for its map control. This avoids the dependency on connectivity to Google’s servers which using Google’s map control imposes. Map tiles can be served to the Mobile ISR Client application by public map tile servers provided by OpenStreetMap. However, for field operations, the team required a mechanism for offline tile access. One option is to use the tile server hosted locally with the Mobile ISR Service (for use by the Web Client), but to reduce network traffic, the tiles needed for a given site are instead cached on each mobile device.

A drawer control slides out from the right side of the display and presents a list of ISR data, as shown in Figure 7. The user can select an item by tapping its icon on the map or its entry in the list, which reveals a thumbnail image (if available) and a description. The user can then tap “Details” to see more information, including larger imagery, or tap “Pursue” to focus on this particular target.

Figure 7: The drawer interface on the right side of the screen shows details for each item on the map.

2) Mobile Client Augmented Reality View

Using the action bar at the top of the display, the user can toggle between the Map view and the Augmented Reality (AR) view, as shown in Figure 8. The latter displays the live view from the device’s rear-facing camera. Using the estimated location of the target and the device, combined with the device’s compass and accelerometer readings, the AR view overlays bounding boxes to mark the approximate location of targets in the view. When in the “Pursue” mode, this can be very helpful in locating a landmark: imagery from the last sighting is shown in the drawer at right, and large arrows tell the warfighter which way to turn if the target location would be off-screen.

Figure 8: Augmented reality view showing the approximate location of the selected target.

These features rely on accurate position and bearing information. In 2013, Google released its Fused Location Provider as part of an update to the Google Play Services API [24]. The Fused Location Provider handles the problem of fusing data between different location sources such as GPS, WiFi, and local movement sensors. The application’s default behavior is to rely on the Fused Location Provider, as testing has shown this to be more effective than using the individual sources.

Determining the compass bearing of the device - used to determine which direction the back-facing camera is pointing - is slightly more complicated. The Android OS does not provide a simple compass reading API. Instead, applications can access raw data from both the magnetic field sensor and the accelerometer. So the first task is to translate that raw data into a simple compass bearing value, projecting the magnetic field sensor reading onto the plane of the earth’s surface based on the accelerometer’s determination of that plane’s normal. This is complicated by the fact that Android provides magnetic sensor data in terms of the three axes of the device, which are not necessarily oriented the same way as the user. If the device is held vertically (parallel to the user's face), that produces data in a different matrix than if the device is held horizontally (parallel to the ground); and the data also differs depending on whether the device is being held in portrait or

landscape mode. The data needs to be mapped appropriately to a compass reading for all device orientations, including all angles in between these extremes.

The next problem is that these sensors are extraordinarily sensitive, and many locations present significant magnetic interference (particularly near electronic equipment). This means that displaying the raw compass bearing value to the user results in a very “jumpy” indicator in the UI. The values therefore must be filtered and smoothed. However, too much smoothing results in a UI indicator that responds sluggishly as the device moves. This required fine-tuning to find an appropriate smoothing algorithm and to determine an appropriate balance. Additionally, the desirable balance point is different for a good map-based UI experience than for a good AR-based UI experience.

3) Mobile Client ISR Capture

Capturing and submitting ISR reports from the field was also a key goal for this application. At any time, the warfighter can tap the plus symbol in the lower left-hand corner to reveal a radial menu for ISR capture activities, as seen in Figure 9. Tapping the appropriate buttons allows the warfighter to quickly capture an image, video, or simple text report, annotate it with text tags from a drop-down menu, and submit it to the Mobile ISR Service (with automatic geo-tagging). This uses a custom camera activity which allows the user to quickly take a picture by tapping and holding on the object of interest in the frame; this directs the camera to focus on that point, while also recording the frame coordinates of the tap and the telemetry of the camera at the time of capture. This allows the object of interest and its actual location to be inferred later from the ISR report with very little input from the warfighter.

Figure 9: The radial menu, accessible from the lower left corner of the screen, shows options for creating ISR reports.
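The bearing-smoothing balance described in the AR view discussion above can be illustrated with a simple exponential smoother that follows the shortest arc around the compass circle, so that a jump from 359° to 1° is treated as a 2° change rather than a 358° swing; the smoothing factor alpha plays the role of the balance point that had to be tuned differently for the map and AR views. This is a simplified sketch with invented names, not the tuned filter used in the application.

```java
// Illustrative exponential smoothing of a compass bearing with
// wraparound handling. Alpha near 1 tracks quickly but stays "jumpy";
// alpha near 0 is smooth but sluggish.
public class CompassSmootherSketch {
    private Double smoothed = null; // smoothed bearing in degrees, [0, 360)
    private final double alpha;     // smoothing factor in (0, 1]

    public CompassSmootherSketch(double alpha) { this.alpha = alpha; }

    public double update(double bearingDeg) {
        if (smoothed == null) {
            smoothed = norm(bearingDeg); // first reading: no history yet
            return smoothed;
        }
        // Move toward the new reading along the SHORTEST arc, so a jump
        // from 359 to 1 degrees is a delta of +2, not -358.
        double delta = norm(bearingDeg - smoothed + 180) - 180;
        smoothed = norm(smoothed + alpha * delta);
        return smoothed;
    }

    // Normalize any angle into [0, 360).
    private static double norm(double deg) { return ((deg % 360) + 360) % 360; }

    public static void main(String[] args) {
        CompassSmootherSketch smoother = new CompassSmootherSketch(0.5);
        System.out.println(smoother.update(359)); // prints 359.0
        System.out.println(smoother.update(1));   // prints 0.0 (halfway along the short arc)
    }
}
```

In practice the raw bearing fed into such a filter would come from projecting the magnetic field vector onto the ground plane, as described above, and two differently tuned alpha values could serve the map and AR views.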

4) Smartwatch Integration

The Mobile ISR Client also includes integration with the Android Wear smartwatch platform [25]. Android Wear is an effort from Google to extend notifications and interaction from Android phones and tablets to a series of touchscreen watches from third parties. This effort fits well with our goals, as it is designed to be "glanceable" and to require "low interaction" from the user [26]. The watches connect via Bluetooth Low Energy and include additional sensors of their own, including a gyroscope, accelerometer, and compass. The aim is to give the warfighter many of the benefits of the mobile application while keeping their hands free for other tasks and limiting the amount of attention and interaction required of the warfighter.
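The watch's point-of-interest and waypoint guidance, direction and distance from the wearer's GPS fix, reduces to two standard great-circle computations. The sketch below uses the well-known haversine and forward-azimuth formulas; the class and method names are our own, not the system's.

```java
// Illustrative sketch of the computations behind an arrow-style POI indicator:
// great-circle distance and initial bearing from the wearer to the target.
public class PoiGuidance {
    private static final double EARTH_RADIUS_M = 6_371_000.0;  // mean Earth radius

    /** Haversine distance in meters between two lat/lon points (degrees). */
    public static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dp = Math.toRadians(lat2 - lat1), dl = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dp / 2) * Math.sin(dp / 2)
                 + Math.cos(p1) * Math.cos(p2) * Math.sin(dl / 2) * Math.sin(dl / 2);
        return 2 * EARTH_RADIUS_M * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    /** Initial bearing in degrees [0, 360) from point 1 toward point 2. */
    public static double bearingDegrees(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dl = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dl) * Math.cos(p2);
        double x = Math.cos(p1) * Math.sin(p2) - Math.sin(p1) * Math.cos(p2) * Math.cos(dl);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        // Coordinates are illustrative: wearer and a POI directly to the north
        double d = distanceMeters(21.4505, -157.7680, 21.4600, -157.7680);
        double b = bearingDegrees(21.4505, -157.7680, 21.4600, -157.7680);
        System.out.printf("%.0f m at %.0f deg%n", d, b);
    }
}
```

On the watch, the arrow's screen rotation would then be the computed bearing minus the smoothed compass heading, so the arrow points at the POI regardless of which way the wearer faces.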

New POI notifications appear on both the tablet and the watch at the same time, as shown in Figure 10. Notifications on the watch include the description, the thumbnail images, and an arrow showing the direction and distance to the POI, oriented based on the reading of the watch's internal compass. The location information is provided by the tablet's GPS sensor. Mission notifications and waypoint-based navigation are integrated into the Wear app, allowing warfighters to receive and act on new orders from command without touching the tablet device, as shown in Figure 11.

Figure 10: The Mobile ISR Client tablet interface synchronized with the Android Wear smartwatch.

Figure 11: The Android Wear smartwatch guides the user to the next mission waypoint.

D. Web Client

The system includes a Web Client, which provides an HTML5 command and control interface to view ISR reports and entities, to assign mission objectives to warfighters, and to monitor warfighter status. The Web Client also serves as a useful development and debugging tool. It is implemented in HTML and JavaScript, using the Leaflet.js library for the map UI functionality, and runs in any modern browser. The map UI fetches raster map tiles as needed from a GeoServer instance [27], typically (but not necessarily) co-resident with the Mobile ISR Service. These tiles are rasterized by GeoServer as needed from OpenStreetMap vector data stored in the PostGIS database instance, and cached for future reuse. Styling is applied during the rasterization process using a Styled Layer Descriptor (SLD) [28] file for each layer.

The Web Client allows a commander to see the positions of each Mobile ISR Client, as well as each ISR report submitted, with live updates. Details of each report, including imagery and video, can be reviewed from the web page. A context menu is available on each entity, which allows a commander to "Follow" a warfighter: the map will automatically pan to stay centered on the warfighter's location as it updates. The context menu also has an option to "Show Trail". Selecting this option displays the full historical path or "snail trail" of the selected warfighter or target, as shown in Figure 12. Each marked point in the path represents a recorded location of the warfighter's mobile device, while the line segments between points are color-coded based on detected activity (running, walking, riding). A dotted line indicates that the device had communication problems (failures to connect to the server) during that period. The timeline on the lower edge shows the same data as the path, organized chronologically instead of spatially. Each location update is shown as a dot along the timeline, while a color-coded bar shows activity. Grey shaded areas indicate connectivity problems. The timeline and map trail are linked: hovering the mouse over a point on the map trail highlights the corresponding time slice in the timeline, and vice versa. It is hoped that this interface will help commanders better understand warfighter activity and network conditions.

Figure 12: The Web Client, showing the path of a selected warfighter.

V. TACTICAL NETWORK DEPLOYMENT AND TESTING

The system described in this paper was deployed and operated during the Agile Bloodhound 2013 exercise at Marine Corps Base Hawaii during September and October of 2013 [29,30]. Agile Bloodhound is an integration and demonstration event conducted by the ONR Code 30 Thrust Managers for C4 and ISR with support from the Space and Naval Warfare Systems Center, Pacific (SSC Pacific), Code 53603, and the Marine Forces Pacific Experimentation Center (MEC). The goal of the event was to showcase research-and-development technologies that benefit expeditionary warfighters. The exercise consisted of three weeks of integration, network, and operational testing culminating in two days of demonstrations to an audience of U.S. Marines. The team received valuable feedback from expeditionary Marines, which was incorporated into the prototype to streamline the interface and facilitate operation by the warfighter.

Networking for the exercise consisted of two fixed nodes, one at Space and Naval Warfare Systems Center Pacific (SSC Pacific) and one at the MEC, connected by a satellite link. A satellite link also connected the MEC to two mobile nodes, Humvees carrying TrellisWare MANET [31] radios. After initial testing, virtual machines running the Complex Event Processing service and the Video Processing service were installed on a VM server in a MEC building alongside an instance of GHub. Data posted to the GHub instance was syndicated to the TSOA and made available to subscribers at the MEC and at SSC Pacific.

Figure 13: Exercise Data-flow Diagram

The Mobile ISR Service was deployed on a laptop in one of the Humvees, where it could communicate with the Complex Event Processing service and the GHub instance at the MEC via the satellite connection, and with the TrellisWare MANET installed on that Humvee. The Mockets proxy middleware from the Florida Institute for Human and Machine Cognition (IHMC) [32] was used on each end of the satellite link to improve the reliability of the web services over that link. Each Mobile ISR Client tablet was connected via WiFi to a TrellisWare radio and carried with those radios in vehicles that stayed within range of the Humvee (approximately 1000 feet or more). This configuration is shown in Figure 13.

The Mobile ISR Client app was used to track position and submit ISR Reports back to GHub and the ISR enterprise, syndicated over Tactical SOA to SSC Pacific, as shown in the Mobile ISR Client map interface in Figure 14. Pre-recorded video was fed into the Video Analytics service, which fed the Complex Event Processing service, which in turn posted the resulting events to GHub, the ISR enterprise, and the Mobile ISR Clients.

VI. DISCUSSION AND FUTURE WORK

The system performed well during the Agile Bloodhound exercise and has been further improved based on feedback from the participants. The system provides an interface into the ISR enterprise for the expeditionary warfighter, along with tools to disseminate data according to mission relevance and user intent.

Figure 14: The Mobile ISR Client showing submitted reports on the map of MCBH.

ObjectVideo is continuing to develop this project to enhance usability and to integrate more deeply with the ISR enterprise. The complex event specification user interface is being reimplemented as an HTML and JavaScript thin client for easier integration with existing systems. The system will also benefit from aligning the ontology of the ISR enterprise with the tags submitted by users in the field. Integrating with information-requirements and mission-planning software will further facilitate system deployment.

ACKNOWLEDGEMENTS

This work was funded through contracts N00014-12-C-0423, N00014-13-C-0275, and N00014-11-C-0514 and performed under the supervision of the Office of Naval Research. Thanks to the ONR Code 30 Thrust Managers for C4 and ISR; the Space and Naval Warfare Systems Center, Pacific (SSC Pacific), Code 53603; the Marine Forces Pacific Experimentation Center (MEC); the Office of Naval Research and the Office of the Secretary of Defense; and to TrellisWare and IHMC for their support during Agile Bloodhound 2013.


REFERENCES

[1] R. Fielding, "Architectural Styles and the Design of Network-based Software Architectures," Ch. 5, "Representational State Transfer (REST)," Ph.D. dissertation, UC Irvine, 2000.
[2] Department of Defense Discovery Metadata Specification (DDMS), http://metadata.ces.mil/dse/irs/DDMS/
[3] NATO Standardization Agreement (STANAG), http://nso.nato.int/nso/
[4] Apache Accumulo, https://accumulo.apache.org/
[5] GeoMesa, http://www.geomesa.org/
[6] Naval Research Laboratory, Stennis Space Center, GeoSpatial Hub, http://dmap.nrlssc.navy.mil/ghub.php
[7] S. C. McGirr, R. J. Wroblewski, and E. L. Dorman, "Active Wiki knowledge repository," Space and Naval Warfare Systems Center Pacific, San Diego, CA, 2012.
[8] P. Remagnino, S. A. Velastin, G. Foresti, and M. Trivedi, "Novel concepts and challenges for the next generation of video surveillance," Machine Vision and Applications, no. 3-4, 2007.
[9] W. Hu, T. Tan, L. Wang, and S. Maybank, "A survey on visual surveillance of object motion and behaviors," IEEE Transactions on Systems, Man, and Cybernetics, August 2004.
[10] Z. Rasheed et al., "Automated visual analysis in large scale sensor networks," ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC), September 2008.
[11] NASA World Wind, http://worldwind.arc.nasa.gov/
[12] Java API for RESTful Services (JAX-RS), https://jax-rs-spec.java.net/
[13] JavaScript Object Notation (JSON), http://json.org/
[14] Jersey, https://jersey.java.net/
[15] Jackson, http://jackson.codehaus.org/
[16] PostgreSQL, http://www.postgresql.org/
[17] PostGIS, http://postgis.net/
[18] Apache Maven, http://maven.apache.org/
[19] Apache HTTP Server Project, http://httpd.apache.org/
[20] Android KitKat 4.4, http://www.android.com/versions/kit-kat-4-4/
[21] Nexus 7 Android Tablet, http://www.google.com/nexus/7/
[22] osmdroid, https://github.com/osmdroid/osmdroid
[23] OpenStreetMap, http://www.openstreetmap.org/about
[24] Android Location APIs, https://developer.android.com/google/play-services/location.html
[25] Android Wear, https://developer.android.com/wear/index.html
[26] Creative Vision for Android Wear, https://developer.android.com/design/wear/creative-vision.html
[27] GeoServer, http://geoserver.org/
[28] Styled Layer Descriptor, http://www.opengeospatial.org/standards/sld
[29] E. Beidel, "Marines Tap Real-Time Information Like Never Before," Office of Naval Research Media Release, 2013, http://www.onr.navy.mil/Media-Center/Press-Releases/2013/Agile-Bloodhound-2013-ISR-C4.aspx
[30] H. Donnelly, "Marines Seek Agile Comms," Military Information Technology, vol. 18, no. 2, 2014.
[31] TrellisWare MANET, http://www.trellisware.com/tactical-scalable-manet-tsm/
[32] Mockets, Florida Institute for Human and Machine Cognition, http://nomads.ihmc.us/index.php?id=mockets
