Streaming Real-time Data in Distributed Dispatcher and Teleinformation Systems for Visualization of Multimedia Data of the Border Guard

M. Blok, B. Czaplewski, S. Kaczmarek, J. Litka, M. Narloch & M. Sac
Gdańsk University of Technology, Gdańsk, Poland

the International Journal on Marine Navigation and Safety of Sea Transportation, Volume 12, Number 2, June 2018, DOI: 10.12716/1001.12.02.01, http://www.transnav.eu

ABSTRACT: Surveillance of the country's maritime border is a very important task of the Border Guard. It can be accomplished using a technology that allows collecting information from distributed sensors of different types, unifying the collected information, and presenting the synchronized data. The system presented in the paper is an extension and enhancement of the previously developed distributed map data exchange system. The added functionality allows supplementing the map data with multimedia (telephone and radio calls, video (cameras), photos, files, SMS/SDS) and presenting the current and archival situation on a multi-display screen in the Events Visualization Post. In the paper, the system architecture, functionality and main system elements are described and supported with preliminary analysis and test results.

1 INTRODUCTION

The Border Guard has to be equipped with a technology enabling communication, acquisition, exchange and visualization of data in different operational situations. Currently, the maritime border of the Republic of Poland is monitored and supported by the Automatic National System of Radar Control for Maritime Areas of Poland (Zautomatyzowany System Radarowego Nadzoru Polskich Obszarów Morskich, ZSRN), which is an integrated security system (Gałęziowski 2005, Fiorini & Maciejewski 2013). The main limitation of this system is that BG mobile units are not connected with the central system, which has been addressed in the KONSOLA project (Kaczmarek 2013-2015). However, even the extended system collects only basic tactical data about observed objects, such as position, speed and name, and provides communication means. At the same time, the assessment of an ongoing or archival situation requires integration not only of tactical map data but also of telephone and radio calls, videos (cameras), photos, files, SMS/SDS, AIS and radar data. The range of information gathered by the Border Guard requires a specially designed system architecture, particularly when multiple, potentially multicomponent (multimedia) operational tasks have to be presented on a multi-display. The requirement for visualizing archival events besides the current ones further increases the design difficulty.

The paper presents the realization of the STRADAR project, which is dedicated to streaming real-time data in distributed dispatcher and teleinformation systems of the Border Guard. The project is implemented for security and defense purposes and is funded by the National Centre for Research and Development. The project is an extension of the KONSOLA project (Kaczmarek 2013-2015), in which the Border Guard distributed map data exchange
system has been developed (Blok et al. 2016a). In the new system, tactical information is collected from mobile units (vehicles, airplanes and vessels), Observation Points (OP) and a Web Service. The information types include AIS and radar data, telephone and radio calls, video (cameras), photos, files, and SMS/SDS messages. The STRADAR project allows for visualization of a current or archival operational situation composed of any combination of synchronized data of the above types.

The STRADAR system consists of multiple stationary and mobile elements, of which the most significant are the Central Server (CS), the Archive Servers (AS) and the Events Visualization Post (EVP). The CS runs the custom Map Server (MS) application, manages the list of visualization tasks, and performs various management operations. The MS provides access to data on the naval situation for visualization on digital maps. The Archive Servers (AS) are based on an approach utilized in cloud technology. The AS provide access to archived data of various types: files, images, SMS, SDS, video, or audio. Finally, the EVP enables interactive visualization of data, generation of new visualization tasks, and some minor control functions. The visualization of events is performed in the Events Visualization Post (EVP) on the multi-screen display, and the need for visualization can be signaled by a mobile unit or by the EVP itself. The functionality, the concept and the realization of the system, its hardware and software implementation, as well as the initial test results are presented in this paper.

The structure of the paper is as follows. Section 2 describes the system architecture. Section 3 describes the functionality of system elements, specifically the Central Server (CS) and Map Server (MS), the Observation Point (OP), the Mobile Unit (MU), the Events Visualization Post (EVP), and the Archive Servers (AS). In Section 4, initial test results are presented. The summary is contained in Section 5.
2 SYSTEM ARCHITECTURE

The distributed character of the system architecture (Fig. 1) results from the requirements of the Border Guard (mainly strong support for mobile units, including bidirectional data exchange as well as voice and video communication), the geographically distributed sources of data (GPS, AIS, ARPA, as well as data provided by external Web Services and surveillance cameras), and the general rule of building complex IT solutions as an interconnected set of sophisticated components performing specialized tasks of data acquisition, processing, storage and visualization with the aid of advanced technology.

Another distinguishing feature of the developed system is its "all IP paradigm", which results in a unified form of communication through a TCP/IP network (including mobile IP radio links) for data, video and even interactive voice, the latter realized with Voice over IP (VoIP) technology for real-time communication between BG personnel.

Figure 1. Diagram of the multimedia distributed system (STRADAR).
The system architecture consists of stationary and mobile parts. The main functionalities of the stationary part are gathered in the Central Server (CS), where the major elements are located. One of the most important elements in the CS is the CENTRE server with the custom MapServer application, developed for storing in an SQL database and processing the data used for constant visualization of digital maps on stationary consoles and, on demand, on consoles in mobile units. Data gathered in the CENTRE for map visualization originate from all system sensors (including remote and mobile ones) and can be accessed by any mobile MapServer and its associated mobile console on Mobile Units (MU). The CENTRE server with the MapServer application is described in detail in Section 3.1. Another key element in the CS is the Events Visualization Post (EVP), which is a dedicated PC server with a custom application and specialized hardware allowing (with the aid of the CENTRE and the Archive Servers) presentation of a complex tactical situation with data of several types (including digital maps, still images, audio/video, SMS/SDS) on multiple displays joined into a large logical screen. The EVP is described in detail in Section 3.4. A vital role in the designed system is played by a specialized entity named Archive Servers, a sophisticated solution based on the cloud computing concept with a NoSQL database, a distributed file system, and other state-of-the-art technologies for Big Data storage and processing in a cluster of application and archive servers, accessible through a WebService (HTTP/JSON requests) and Message Oriented Middleware communication channels. The Archive Servers are used as an advanced "database" for storage and retrieval of files, still images, audio, video and SMS/SDS data for the purpose of presentation in the EVP. The Archive Servers are presented in detail in Section 3.5. The CS is equipped with multiple Stationary Consoles, which are used for presentation of tactical data on digital maps. Moreover, Stationary Consoles are also telecommunication terminals for interactive, real-time voice communication among BG personnel accomplished with the aid of VoIP technology. That form of communication was a fundamental design demand for the proposed architecture as an advanced form of a modern dispatcher system. Thus, in the proposed system, an infrastructure for VoIP communication is provided, including Telephone and Radio Servers and a GSM gateway for communication with a public mobile telecommunication network, including the possibility of sending and receiving SMS messages. It is worth noting that, for investigation and inspection purposes, every call and message in the VoIP communication of the proposed system is recorded in the Recorder and stored in the Archive Servers for future retrieval on demand. That feature was imposed by another important property of the proposed system, which is the possibility of simultaneous presentation of not only current but also all archival data from multiple sources, including multimedia content (voice, video, images) and tactical data from maritime sensors and external Web Services, both public and confidential.

All elements in the CS are attached to a high-speed LAN and to the dedicated IP-based BG Wide Area Network (WAN). The BG WAN is used to communicate with multiple, geographically distributed autonomous OPs (Observation Points). The OPs belong to the fixed part of the system architecture and are used as remote posts for mounting system sensors (AIS, radar, surveillance cameras) and radio antennas for communication with Mobile Units. Each OP is equipped with a specially customized version of the MapServer application (described in detail in Section 3.2) providing data from the set of sensors which cannot directly communicate through a TCP/IP network.

In the system, Mobile Units represent BG patrol units, which can be land vehicles, airplanes and sea vessels. The system architecture is designed to provide advanced support for mobile units, including bidirectional communication over a broadband IP-based (packet) radio link which is developed under another project (Stefański 2014-2017). Mobile Units consist of a specially designed URC server enclosed in a rugged case, an appropriate set of sensors, a radio modem for broadband IP communication and a military-class laptop (mobile console) connected to a high-speed LAN. From the architectural point of view, consoles on mobile units are offered the same set of services (including VoIP) as stationary consoles, with small limitations imposed by the IP radio link. Operators of mobile consoles focus their attention mainly on the situation around the patrol unit and on information gathered from local sensors (GPS, AIS, ARPA, camera), so the URC is equipped with a special version of the MapServer supporting them in their duties.

Figure 2. From the top: URC for OP, switch, Archive Servers (AS), voice recorder, telephone server.
Usage of a universal IP network in the proposed system allows application of the respective protocols from the TCP/IP family for communication between particular elements. For data communication between the consoles, the EVP and the MapServers, a concept of communication based on Message Oriented Middleware is employed; particularly, the ISO/ECMA standard protocol named Advanced Message Queuing Protocol (AMQP) is applied. Moreover, AMQP and HTTP are used for communication with the Archive Servers. For interactive VoIP communication, SIP is used as the signaling (control) protocol between terminals and telecommunication servers. In the proposed system, media (voice and video) are carried by the Real-time Transport Protocol/Real-time Transport Control Protocol (RTP/RTCP). This concept allows application of the Real Time Streaming Protocol (RTSP) for control of multimedia sessions, particularly video from surveillance cameras and video recordings stored in the Archive Servers. The elements of the whole system are synchronized with the aid of the Network Time Protocol (NTP) and a central time server. Application of GPS for time synchronization is possible as a backup in the case of isolation of particular, mainly mobile, units. The realization of some crucial elements is shown in Fig. 2.
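The message-oriented data path described above can be pictured with a short sketch. The fragment below publishes a JSON-encoded task notification to a queue; it is only an illustration under assumptions, since the paper specifies AMQP but neither a broker nor a client library, so the RabbitMQ .NET client, the host name center.bg.local and the queue name evp.tasks are hypothetical.

```csharp
using System;
using System.Text;
using RabbitMQ.Client; // assumed AMQP client library, not named in the paper

class MapNotificationPublisher
{
    static void Main()
    {
        // Hypothetical broker address and queue name.
        var factory = new ConnectionFactory { HostName = "center.bg.local" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            channel.QueueDeclare(queue: "evp.tasks", durable: true,
                                 exclusive: false, autoDelete: false, arguments: null);

            // Messages between system elements are JSON documents (illustrative shape).
            string json = "{\"messageType\":\"taskNotification\",\"taskId\":1,\"priority\":2}";
            var body = Encoding.UTF8.GetBytes(json);
            channel.BasicPublish(exchange: "", routingKey: "evp.tasks",
                                 basicProperties: null, body: body);
            Console.WriteLine("Task notification published.");
        }
    }
}
```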
3 FUNCTIONALITY OF SYSTEM ELEMENTS

In this section, the main elements of the system described in the previous section are presented in more detail.

3.1 Central Server

The main functionalities provided by the CENTER are (a) map data gathering and provisioning, (b) task notification processing and (c) Events Visualization Post (EVP) support. All these tasks are implemented in the software called the MapServer (Fig. 3).

The map data gathered in the CENTER come from locally attached sources (WebService), which are processed by the map data processing module, and from sources attached to mobile MapServers operating on mobile units (MU) (ARPA radar, AIS, GPS). The data collected by the MUs are sent to the CENTER during database synchronization sessions through the radio channel served by communication module 2 and are processed by the database synchronization module. Map data duplicates, which are inevitable when data are gathered from different sources operating in the same area, are removed by the deduplication modules. After the deduplication, map data are inserted into the central database, from where they can be accessed on demand. Requests for map data, processed in the local queries processing module, can be received from the EVP and stationary consoles through the fixed network (communication module 1) or from mobile consoles through the radio channel and network (communication module 2).
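The deduplication rules themselves are not given in the paper; a minimal sketch of one plausible approach is shown below, in which two observations are merged when they share an identifier (e.g. an AIS MMSI) or lie within assumed distance and time thresholds. The class names and the threshold values are illustrative assumptions only.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative record of a single observation of a tracked object.
class Observation
{
    public string Source;     // e.g. "AIS", "ARPA"
    public string Mmsi;       // may be null for radar-only tracks
    public double Lat, Lon;   // degrees
    public DateTime Time;
}

static class Deduplicator
{
    // Two observations are treated as the same object when they share an MMSI
    // or lie within assumed distance/time thresholds (values are illustrative).
    static bool SameObject(Observation a, Observation b) =>
        (a.Mmsi != null && a.Mmsi == b.Mmsi) ||
        (DistanceMeters(a, b) < 200 && Math.Abs((a.Time - b.Time).TotalSeconds) < 30);

    static double DistanceMeters(Observation a, Observation b)
    {
        // Equirectangular approximation, sufficient for short distances.
        double dLat = (b.Lat - a.Lat) * Math.PI / 180;
        double dLon = (b.Lon - a.Lon) * Math.PI / 180 * Math.Cos(a.Lat * Math.PI / 180);
        return 6371000 * Math.Sqrt(dLat * dLat + dLon * dLon);
    }

    // Keeps one observation per detected object (the most recent one).
    public static List<Observation> Deduplicate(IEnumerable<Observation> input)
    {
        var kept = new List<Observation>();
        foreach (var obs in input.OrderByDescending(o => o.Time))
            if (!kept.Any(k => SameObject(k, obs)))
                kept.Add(obs);
        return kept;
    }
}
```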
In this regard, the discussed fragment of the system can be considered a Vessel Traffic Service (VTS) system utilizing combined AIS, ARPA, and GPS data. Studies on applications of AIS in VTS systems can be found in (Chang 2004, Tetreault 2005, Eide et al. 2006), and a study on operator situation awareness in VTS systems can be found in (Wiersma 2010).

The other two main functionalities of the MapServer (task notification processing and EVP support) relate to system features enabling presentation of map and multimedia data in the EVP. The task notification processing module receives notifications containing information about tasks consisting of map and multimedia elements which are proposed for presentation in the EVP. These propositions can be created by the operators of stationary or mobile consoles and by the EVP operator. Notifications received in the CENTER are stored in the central database, and the updated list of tasks is sent to the EVP, where the operator can either accept each task for presentation or discard it, notifying the CENTER about the decision.

Figure 3. Functional diagram of the central MapServer.
The EVP support module processes the EVP requests concerning visualization of task elements (map data, files/photos, SMS/SDS, video from cameras, recorded audio calls). For each visualization start request, the module retrieves the details of the requested task element from the central database, where they have been stored by the task notification processing module. When the task element refers to current or archival map data, the request is processed by the MapServer. For multimedia (file/photo, SMS/SDS, video or audio) task elements, the type of the requested data is determined (current or archival) and a request is sent to the proper service (URL) of the multimedia archive servers (AS). For requests regarding visualization of archival multimedia data, the AS respond with metadata, which are gathered by the EVP support module and sent to the EVP. This allows the EVP to fetch the demanded information directly from the multimedia archive servers in order to visualize archival events.
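The exchange with the AS for archival data can be sketched as a plain HTTP request carrying a JSON query. The endpoint URL, the payload fields and the use of HttpClient below are assumptions for illustration; the actual AS interface is not defined in the paper.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ArchiveClient
{
    static readonly HttpClient Http = new HttpClient();

    // Queries a (hypothetical) archive service URL for metadata of archival
    // multimedia matching the task element parameters; the endpoint path and
    // JSON field names are illustrative assumptions.
    public static async Task<string> QueryArchiveAsync(DateTime from, DateTime to, string source)
    {
        string query = "{\"type\":\"video\"," +
                       "\"from\":\"" + from.ToString("o") + "\"," +
                       "\"to\":\"" + to.ToString("o") + "\"," +
                       "\"source\":\"" + source + "\"}";
        var content = new StringContent(query, Encoding.UTF8, "application/json");
        HttpResponseMessage response =
            await Http.PostAsync("http://as.stradar.local/archive/search", content);
        response.EnsureSuccessStatusCode();
        // The returned JSON metadata is forwarded to the EVP, which then fetches
        // the media itself directly from the Archive Servers.
        return await response.Content.ReadAsStringAsync();
    }
}
```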
When the request concerns visualization of current multimedia data, the AS create a subscription and send the subscription identifier as well as the queue address to the EVP support module. Subsequently, these parameters are stored in the central database and passed to the EVP subscription module by invoking its global function. In the EVP subscription module, a new thread is created which is responsible for connecting to the subscription queue and processing messages arriving in this queue. They include heartbeat messages (which have to be answered within a specified time to confirm that the subscription processing thread is "alive") and metadata associated with the requested type of current multimedia information. The role of the EVP subscription module is to send these metadata directly to the EVP so that they can be used to visualize current events (in a similar way to archival events).
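A possible shape of such a subscription handling thread is sketched below. It assumes the subscription queue is consumed with the RabbitMQ .NET client and that heartbeat acknowledgements are published to the reply-to queue of the heartbeat message; the broker address, queue naming and message fields are not taken from the paper and are purely illustrative.

```csharp
using System.Linq;
using System.Text;
using System.Threading;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class SubscriptionWorker
{
    // Connects to a subscription queue and answers heartbeats; the heartbeat
    // reply convention and all names used here are illustrative assumptions.
    public static Thread Start(string queueName, string subscriptionId)
    {
        var thread = new Thread(() =>
        {
            var factory = new ConnectionFactory { HostName = "as.stradar.local" };
            using (var connection = factory.CreateConnection())
            using (var channel = connection.CreateModel())
            {
                var consumer = new EventingBasicConsumer(channel);
                consumer.Received += (sender, ea) =>
                {
                    string text = Encoding.UTF8.GetString(ea.Body.ToArray());
                    if (text.Contains("\"heartbeat\""))
                    {
                        // Confirm that the subscription processing thread is alive.
                        var reply = Encoding.UTF8.GetBytes(
                            "{\"heartbeatAck\":\"" + subscriptionId + "\"}");
                        channel.BasicPublish("", ea.BasicProperties.ReplyTo, null, reply);
                    }
                    else
                    {
                        // Metadata of current multimedia: pass it on to the EVP.
                        ForwardToEvp(text);
                    }
                };
                channel.BasicConsume(queue: queueName, autoAck: true, consumer: consumer);
                Thread.Sleep(Timeout.Infinite); // block so the connection stays open
            }
        });
        thread.IsBackground = true;
        thread.Start();
        return thread; // the thread is deleted when the subscription is closed
    }

    static void ForwardToEvp(string metadataJson) { /* send to the EVP (omitted) */ }
}
```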
When the EVP finishes visualization of a specific task element, a visualization end request is sent to the EVP support module, which updates the task data stored in the central database accordingly. When visualization of current multimedia data ends, additional operations are required from the EVP support module in order to close the associated subscription. Firstly, the subscription identifier as well as the queue address are obtained from the central database. Secondly, these parameters are passed to the EVP subscription module by invoking its global function, which results in disconnecting from the subscription queue and deleting the subscription handling thread. Finally, a proper request is sent to the AS to remove the subscription.
Apart from requests regarding visualization of task elements, the EVP support module also handles requests for adding notes to multimedia data stored in the AS, which can be generated either by the EVP operator or by console (both mobile and stationary) operators. All received add note requests are processed in the EVP support module and forwarded to the AS. The result of the performed operation is retrieved from the AS and passed back to the originator of the add note request.
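The add note handling is essentially a relay: the request is forwarded to the AS as an HTTP/JSON message and the AS result is returned to the originator. The sketch below assumes a hypothetical endpoint and payload format, since neither is given in the paper.

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class AddNoteRelay
{
    static readonly HttpClient Http = new HttpClient();

    // Forwards an add-note request to the AS and returns the AS result to the
    // originator (EVP or console); the URL and JSON fields are illustrative.
    public static async Task<string> ForwardAsync(string documentId, string note, string operatorLogin)
    {
        string payload = "{\"documentId\":\"" + documentId + "\"," +
                         "\"note\":\"" + note + "\"," +
                         "\"operator\":\"" + operatorLogin + "\"}";
        HttpResponseMessage response = await Http.PostAsync(
            "http://as.stradar.local/archive/notes",
            new StringContent(payload, Encoding.UTF8, "application/json"));
        // The operation result from the AS is passed back unchanged.
        return await response.Content.ReadAsStringAsync();
    }
}
```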
3.2 Observation Point

The central MapServer gathers map data from local sources, MUs as well as OPs, and stores them in the central database. The MapServer in the MU (Blok et al. 2016a) collects data from sources available at this unit and stores them in the local database, at the same time synchronizing it with the central database using a best effort approach. Additionally, the mobile MapServer provides mobile consoles with map data stored in the local database or retrieved from the CENTER through the radio channel.

In the current project, map data are also collected in OPs; however, since there are no mobile consoles in OPs, the MapServer in an OP can be simplified, with its functionality limited only to map data gathering and synchronization (Fig. 4).

Figure 4. Functional diagram of the MapServer in the OP.
3.3 Mobile Unit

The functionality for mobile units derives from the KONSOLA project (Blok et al. 2016, Czaplewski et al. 2016), whose aim was to complement existing systems, currently used by the Border Guard, with the transmission of data on supervised objects, i.e. marine vessels, land vehicles, and aircraft, from the mobile unit of the Border Guard to the stationary network, as well as from the network to the mobile unit. In the current STRADAR project, the functionality for mobile units was extended by new tools for reporting events for visualization, i.e. for sending tasks for visualization of events in the Events Visualization Post (EVP).
A mobile unit, regardless of its type, is equipped with a Universal Radio Controller (URC), a mobile console, a radio link, an ARPA radar or an AIS receiver, and a GPS receiver. Mobile consoles are military-class laptops, which are connected to URCs via a local IP network. URCs are equipped with a local database, which holds the information about supervised objects present within the sensor range of a mobile unit, and the local MapServer, which is responsible for providing appropriate map data, processing incoming tasks for the EVP from mobile consoles, and providing communication with the CENTER via the radio link. All the URCs automatically upload their data to the central server in order to synchronize databases. Each URC is able to download data from the central server, which collects data about the naval situation from the network of AIS receivers and radars of the Maritime Office provided by the web service and from other URCs. In this way, mobile consoles can visualize data on supervised objects located beyond the reach of the sensors of the BG mobile unit. The realization of a mobile console is shown in Fig. 5. The structure of the equipment of a mobile unit is presented in Fig. 6.

Figure 5. A mobile console (on the left) and a stationary console (on the right).

Figure 6. The structure of the equipment of a mobile unit.
Mobile consoles are equipped with the dispatcher application, which is a C# .NET application for 32-bit or 64-bit MS Windows 7, 8.1 or 10. The dispatcher application for the mobile units offers the console operator the following functionality: data visualization on maps, radiocommunication, telephony, recording of conversations, cross-network conference, intercom, SMS, file transfer, and reporting events (task generation) for the EVP. In the context of data visualization on maps and event reporting, the dispatcher application consists of three tools: the MapControl, the BrowserControl and the TaskGeneratorControl. All three controls cooperate with the local map server in the URC. Details about the structure and functionality of the MapControl and the BrowserControl can be found in (Czaplewski et al. 2016). Details about the structure and functionality of the MapServer can be found in (Blok et al. 2016).

The TaskGeneratorControl is a tool for reporting events for visualization in the EVP. These reports are called tasks, and each task can include several (at least one) elements. In the project, there are 6 types of elements, which correspond to the types of data that can be visualized in the EVP: map (visualization on the map), browser (of map objects), file/image, SMS/SDS, video, and audio.
The first step in generating a new visualization task is to declare a priority and optionally add a text description of the task. Then, the operator has to add elements to the task, which can be of various types: map, object browser, file/image, SMS/SDS, video, or audio. Every task must consist of at least one element, while there is no upper limit to the number of elements in a task. The operator can choose to add an element from a macro, which is a predefined and saved form of an element, or to define a custom element. Macros allow for quick design of tasks consisting of frequently used elements, while the advanced form for unique elements is still available to the operator. In case of a custom element, after selecting the desired type of element, the operator has to define its parameters. The defined parameters will be used as search filters for the Archive Servers. Every element must contain at least one defined parameter, although the operator can define all the parameters. In the case of multiple defined parameters, the search filter will be a logical conjunction of them all.
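How the defined parameters could be turned into such a conjunctive filter is sketched below: only the parameters the operator actually filled in become filter terms, and all terms must match. The class name, field names and the dictionary-based filter format are hypothetical; the real filter representation used towards the Archive Servers is not given in the paper.

```csharp
using System;
using System.Collections.Generic;

// Parameters of an "SMS/SDS" task element used as search filters;
// property names are illustrative, not the project's actual schema.
class SmsElementParams
{
    public string SenderNumber;
    public string ReceiverNumber;
    public string Text;
    public DateTime? CreatedAfter;
    public DateTime? CreatedBefore;
}

static class FilterBuilder
{
    // Only the parameters the operator actually defined become filter terms;
    // the resulting terms are combined as a logical conjunction (AND).
    public static IDictionary<string, string> Build(SmsElementParams p)
    {
        var filter = new Dictionary<string, string>();
        if (p.SenderNumber != null) filter["sender"] = p.SenderNumber;
        if (p.ReceiverNumber != null) filter["receiver"] = p.ReceiverNumber;
        if (p.Text != null) filter["text"] = p.Text;
        if (p.CreatedAfter.HasValue) filter["from"] = p.CreatedAfter.Value.ToString("o");
        if (p.CreatedBefore.HasValue) filter["to"] = p.CreatedBefore.Value.ToString("o");
        if (filter.Count == 0)
            throw new ArgumentException("At least one parameter must be defined.");
        return filter; // every key-value pair must match (AND semantics)
    }
}
```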
The set of parameters for the "map" element is as follows:
- time of situation,
- geographic coordinates,
- indexes of objects for detailed presentation,
- indexes of objects for presentation of trails,
- filtration of symbols on the map,
- filtration of labels on the map.
The set of parameters for the "file/image" element is as follows:
- type of data (file or image),
- identifier of the file/image,
- login of the operator who created the file/image,
- source of the file/image,
- indexes of objects corresponding to the file/image,
- geographic coordinates,
- time of creation of the file/image,
- text note about the file/image.
The set of parameters for the "SMS/SDS" element is as follows:
- sender number,
- receiver number,
- identifier of the SMS/SDS,
- text of the SMS/SDS,
- login of the operator who created the SMS/SDS,
- source of the SMS/SDS,
- indexes of objects corresponding to the SMS/SDS,
- geographic coordinates,
- time of creation of the SMS/SDS,
- text note about the SMS/SDS.
Figure 7. A view of the TaskGeneratorControl during the generation of a visualization task.
The set of parameters for the "video" element is as follows:
- identifier of the video,
- login of the operator who created the video,
- source of the video,
- indexes of objects corresponding to the video,
- geographic coordinates,
- time of creation of the video,
- text note about the video.
The set of parameters for the "audio" element is as follows:
- sender number, or IP address in case of VoIP,
- receiver number, or IP address in case of VoIP,
- forwarded number, or IP address in case of VoIP,
- identifier of the audio,
- login of the operator who created the audio,
- source of the audio,
- indexes of objects corresponding to the audio,
- geographic coordinates,
- time of creation of the audio,
- text note about the audio.
There is another way to quickly generate a visualization task. The operator has access to a history of previously sent tasks, which can be browsed in order to select tasks and resend them, or to modify and then resend them.

When the task and its elements are ready, it can be sent to the Universal Radio Controller (URC), which redirects it to the CS. The TaskGeneratorControl is also available on stationary consoles and in the EVP; if the task was defined there, it is sent directly to the CS. At the CS, the list of all the tasks is created and managed. After each update of the list, a notification is sent to the EVP, so the list of all the tasks is visible in the EVP.

A view of the TaskGeneratorControl during the generation of a visualization task is presented in Fig. 7.
3.4 Events Visualization Post

The Events Visualization Post operates on a PC with a high-resolution multi-screen display and software designed for simultaneous visualization of data of different types, synchronized in time. The PC is configured to allow an operator to visualize tasks coming from the consoles and to define new tasks for visualization. The PC has been built on an Intel 64-bit processor (in this case an Intel Core i7-4790K, 4 GHz), 32 GB of RAM and two graphics cards, one being the processor-integrated Intel HD Graphics 4600 and the second being a dedicated graphics card with an AMD Radeon chipset (Club3D Radeon 7850 2 GB Eyefinity 6). Windows 8.1 Pro (64-bit version) has been chosen as the EVP's operating system. The realization of the EVP is shown in Fig. 8.

Figure 8. EVP, consisting of a multidisplay for presentation and a standard display for management, during operation.
The basic concept of the EVP is to allow an operator to visualize elements of given tasks. For that purpose, a solution using two screens has been implemented. The EVP is connected to a Management Screen (whose purpose is to allow the operator to manage the visualization process and to receive and generate tasks) and a Multidisplay (which is tasked with displaying multimedia elements of the tasks the operator works on). A single display does not provide enough space for visualization, therefore a number of display devices are configured into a single logical display (hence the name "Multidisplay"). The Windows operating system allows plugging in as many displays as the number of graphics outputs available in the machine. To achieve a unified multidisplay, a solution for creation of a logical screen was adopted. AMD's Eyefinity 6 technology is a feature of recent graphics cards from the Radeon line that allows creating a single logical display from up to six physical displays connected to a single Radeon graphics card via the miniDisplayPort interface. For the EVP, six monitors have been connected to a single Eyefinity 6 compatible graphics card to create a logical display. Since a single physical monitor suffices for the Management Screen, a second logical screen (from the operating system's point of view) has been attained by connecting a display to a DVI interface on the motherboard.
The Events Visualization Post (EVP) application is programmed in C# .NET for 32-bit or 64-bit MS Windows 7, 8.1, or 10. The EVP application allows for receiving event reports (tasks) from mobile consoles, generating new event reports (tasks) in the EVP, managing the list of tasks, visualizing data of various types (data on the map, browser of naval objects, files, images, SMS/SDS, video, and audio), synchronizing the data in time, adding text notes to the data, and much more. The general structure of the software is presented in Fig. 9.

Figure 9. The general structure of the software of the Events Visualization Post (EVP) application.
In general, three main modules can be distinguished in the EVP application, namely:
- the Management module, which is for the management of data acquisition, data visualization, communication, and the flow of the application,
- the Visualization module, which is responsible for the presentation of various types of data on the multi-screen display,
- the Communication module, which is responsible for the communication with the central server and the proper formatting of messages.
The Management module consists of:
- the Login module, which is used for logging the operator in the system and sets all the starting parameters,
- the Task generation module, which provides task generation, task history and element macros, analogous to the TaskGeneratorControl functionality,
- the Task list module, which is responsible for managing the list of tasks, including acceptance, preview, and rejection of tasks and elements,
- the Task preview module, which is used for a more detailed preview of elements, including browsing the metadata found in the Archive Servers,
- the ManagementControls, which are used for controlling the presentation of different types of data in the visualization module,
- the Time synchronization module, which is responsible for synchronizing ManagementControls and their corresponding visualization controls,
- the Text notes module, which is used for creating custom text notes to the existing documents in the Archive Servers.
The Communication module consists of:
- the Transmitter module, which is responsible for sending outgoing messages to the CENTER,
- the Receiver module, which is responsible for receiving incoming messages from the CENTER,
- the Message interpreter, which is responsible for interpreting messages in JSON format and creating corresponding C# objects,
- the Message formatter, which is responsible for formatting messages in JSON format on the basis of appropriate C# objects (a sketch of this pair follows the list).
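A minimal sketch of the message formatter and interpreter pair is shown below. It assumes the Newtonsoft.Json library and an illustrative message shape; the paper states only that messages are exchanged in JSON format and mapped to C# objects.

```csharp
using Newtonsoft.Json; // assumed JSON library; the paper does not name one

// Illustrative shape of a task notification message; the real field names
// and message types used in the project are not given in the paper.
class TaskNotification
{
    public string MessageType { get; set; }
    public int TaskId { get; set; }
    public int Priority { get; set; }
    public string Description { get; set; }
}

static class MessageCodec
{
    // Message formatter: C# object -> JSON text sent to the CENTER.
    public static string Format(TaskNotification n) =>
        JsonConvert.SerializeObject(n);

    // Message interpreter: JSON text received from the CENTER -> C# object.
    public static TaskNotification Interpret(string json) =>
        JsonConvert.DeserializeObject<TaskNotification>(json);
}
```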
The Visualization module consists of:
- MapControls, which are designed for visualization of marine objects on digital maps (Czaplewski et al. 2016),
- BrowserControls, which are designed for visualization of marine objects in the browser (Czaplewski et al. 2016),
- FileUserControls, which are designed for visualization of images in raster formats and of files in the system file browser,
- SMSUserControls, which are designed for visualization of SMS and SDS messages from multiple sources in the form of a chat,
- VideoUserControls, which are designed for visualization of current and archival video streams received from the AS,
- AudioUserControls, which are designed for visualization of current and archival audio streams received from the AS.
The EVP consists of the following eleven class libraries:
- SWZ contains classes for the management of data acquisition, data visualization, and the flow of the application, in particular classes for logging, task generation, managing the list of tasks, task history and macros, detailed preview of tasks including browsing the metadata, controlling the presentation on the multi-screen display, synchronizing data, and creating text notes to the existing documents.
- CommunicationLibrary contains classes for the communication with the CS and for interpreting or formatting messages in JSON format.
- SwzTaskLibrary contains classes with definitions of a task, an element of a task, metadata of an element, and controls for the list of tasks (a sketch of such definitions follows this list).
- TaskGeneratorLibrary contains a control for generating new tasks, called TaskGeneratorControl, and controls for the history and macros.
- MapClassLibrary contains a user control for visualization of data on digital maps, called MapControl, and some subordinate controls.
- BrowserClassLibrary contains a user control for browsing the data on naval objects, called BrowserControl, and some subordinate controls.
- FileClassLibrary contains a user control for visualization of files and images, called FileUserControl, and some subordinate controls.
- SMSClassLibrary contains a user control for visualization of SMS and SDS, called SMSUserControl, and some subordinate controls.
- VideoClassLibrary contains a user control for visualization of video recordings, called VideoUserControl, and some subordinate controls.
- AudioClassLibrary contains a user control for visualization of audio recordings, called AudioUserControl, and some subordinate controls.
- MyToolsLibrary contains a few auxiliary classes.
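As an illustration of the task and element definitions kept in SwzTaskLibrary, a minimal sketch is given below; the actual class members are not specified in the paper, so all names and fields here are assumptions.

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch only: the actual members of the SwzTaskLibrary
// classes are not specified in the paper.
enum ElementType { Map, Browser, FileImage, SmsSds, Video, Audio }

class TaskElement
{
    public ElementType Type { get; set; }
    // Operator-defined parameters acting as search filters for the AS.
    public Dictionary<string, string> Parameters { get; } =
        new Dictionary<string, string>();
}

class VisualizationTask
{
    public int Id { get; set; }
    public int Priority { get; set; }
    public string Description { get; set; }
    public string OperatorLogin { get; set; }
    public DateTime Created { get; set; } = DateTime.UtcNow;
    // Every task contains at least one element; there is no upper limit.
    public List<TaskElement> Elements { get; } = new List<TaskElement>();
}
```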
As mentioned in the previous paragraphs, the EVP is capable of visualizing files, images and SMS messages. Such data are downloaded from the AS. An image is visualized in the FileUserControl. If the file is not an image, the operator is offered the option of either opening the folder containing the file or opening it in the default application for that file type. An SMS is visualized in the SMSUserControl. An example of the visualization of an image is presented in Figure 10.

One of the features of the EVP is the ability to visualize video data related to reported tasks. Both archived and live video can be watched on the Multidisplay. The recordings are accessed from the AS by connecting to appropriate RTSP video streams. For playing video recordings, the VideoUserControl uses the VLC client.
As mentioned in the previous paragraphs, the EVP is also capable of visualizing audio data. Similarly to video, such multimedia are accessed from the AS in the form of RTSP streams created after sending a specified request. For playing audio recordings, the AudioUserControl also uses the VLC client. An example of the visualization of audio recordings is presented in Figure 11.
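Opening such a stream might look like the sketch below. The paper states only that the VLC client is used for playback; the LibVLCSharp wrapper and the RTSP URL shown here are assumptions made for illustration.

```csharp
using System;
using LibVLCSharp.Shared; // assumed wrapper; the paper only states that the VLC client is used

class RtspPlayback
{
    static void Main()
    {
        Core.Initialize();                 // load the native libvlc libraries
        using (var libvlc = new LibVLC())
        using (var player = new MediaPlayer(libvlc))
        {
            // Hypothetical RTSP URL of a recording exposed by the Archive Servers.
            var media = new Media(libvlc, new Uri("rtsp://as.stradar.local:554/recordings/1234"));
            player.Play(media);            // decode and render the stream
            Console.ReadLine();            // keep playing until the operator presses Enter
        }
    }
}
```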
Visualization of the naval situation on digital maps is realized by the MapControl, which is a custom user control contained in the MapClassLibrary. For the rendering of digital maps, the MapControl uses the TatukGIS map engine. The data for visualization in the MapControl are provided by the MapServer in the CS. The data may come from AIS receivers and radars of the Maritime Office delivered via the web service, and from AIS receivers, ARPA radars and GPS devices on mobile units of the Border Guard. Importantly, automatic deduplication of supervised objects is performed in the MapServer. This means that, e.g., if a single object was detected by an ARPA radar and an AIS receiver, the data from both sources will be identified and merged into one object. Examples of visualization of the naval situation are presented in Figures 12 and 13.

Figure 10. A view of the FileUserControl during visualization of an image.

Figure 11. A view of the AudioUserControl during visualization of audio recordings.

Figure 12. A view of the MapControl during the visualization of the naval situation on digital maps.

Figure 13. A view of the BrowserControl during the visualization of naval objects in the browser.
3.5 Archive Servers

The Archive Servers (AS) receive, interpret and collect the data coming from external systems, and provide these data on demand. The external systems, which are the CENTER, the URCs, the OPs, the EVP, the Recorder, the telephone and radio servers, and the stationary consoles, need fast access to current and archival data. In order to meet these requirements, we have selected an architecture in which the AS form an independent system based on Big Data technology. In our case, this system is composed of data gathering, processing, storing, searching, and sharing modules operating in a distributed infrastructure called a cloud. In the context of the AS, a cloud is an environment providing safe storage of and fast access to huge amounts of data. It guarantees organization of data coming from different sources and delivers means for analysis, characterization and enrichment of these data and their descriptions, as well as tools for discovering correlations within the gathered data.
The architecture of the AS is characterized by the following features:
- distributed technology ensuring continuity of operations,
- uniform utilization of resources of all devices forming the system,
- copies of data stored in several places,
- data storage using a format facilitating access to data,
- access control for all gathered data.
The AS functions are aggregated into modules based on the data types for which a given module is intended. Each module is an independent program which implements the following actions:
- message exchange: means for connecting to the outside world, accepting requests and distributing tasks, based on knowledge of the cloud state, to the places that promise the fastest execution,
- data processing and aggregation: linking incoming data with already present data, as well as business logic, i.e. data processing based on rules applying to a particular data type and on knowledge of how the data could be used in the future,
- persistence: data in the form of formatted documents are stored in a NoSQL database; instead of being stored in a single shared central place, these data are distributed among the local storage media of the servers, and for large non-text data (>>1 MB) a decentralized and distributed file system is used (see the sketch after this list).
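As a rough illustration of the persistence step, the sketch below stores one formatted document in a document-oriented NoSQL database. The paper does not name the database product, so a MongoDB-like store, its connection string and the collection layout are assumed here purely for illustration.

```csharp
using MongoDB.Bson;
using MongoDB.Driver; // assumed driver; the paper does not name the NoSQL database

class ArchivePersistence
{
    // Stores an SMS/SDS message as a formatted document; the collection layout,
    // field names and the choice of a MongoDB-like store are illustrative.
    public static void StoreSms(string sender, string receiver, string text)
    {
        var client = new MongoClient("mongodb://as-node-1:27017");
        IMongoDatabase db = client.GetDatabase("stradar_archive");
        IMongoCollection<BsonDocument> messages = db.GetCollection<BsonDocument>("sms");

        var document = new BsonDocument
        {
            { "sender", sender },
            { "receiver", receiver },
            { "text", text },
            { "stored", System.DateTime.UtcNow }
        };
        messages.InsertOne(document); // replication across cluster nodes is handled by the database
    }
}
```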
The system demonstrator consists of four hardware servers, a gigabit network switch, and a 19" rack-mount case. Three of the servers are dedicated to archive management, while the remaining server is dedicated to visualization management, resource and subsystem monitoring, and serves as a disk resources server.

Within each of the three archive management servers, virtual environments have been set up, which are used for creating functional clusters. The virtualization utilizes a method called LXC (Linux Containers). An LXC container is not an independent operating system running on the machine, but a separate space within the operating system. It allows allocation of CPU, memory, disk space and network interfaces. For programs running in LXC containers, processes, network access, user privileges and access to files are separated. From the point of view of resource commitment, LXC virtualizes only the application and not the entire operating system, therefore it does not need a separate virtual machine with its own operating system. This means that LXC generates little overhead, because all the applications use standard libraries for system calls, I/O and networking.

The AS architecture does not impose restrictions on the number of processed messages, the size of the retained data or the number and complexity of the external requests. The performance is limited only by the available processing power, memory and disk space. Horizontal scaling of the AS is recommended, as NoSQL systems are designed and built to allow an almost linear increase in performance by adding more machines to the cluster.
4 INITIAL ANALYSIS AND TESTS

Models of the main elements of the discussed system have been presented in the paper (Blok et al. 2016b). The models of the following elements were analyzed: the CENTER, MUs, OPs, the EVP and the AS. The obtained partial results were used to analyze the overall system performance under several scenarios selected based on the system functionality requirements. In Section 4.1, conclusions from the above-mentioned analysis are presented, and in Section 4.2, functional test results of the system under development are presented and briefly discussed.
4.1 Results of performance analysis

The performance analysis has been focused on the following parameters: average system reaction time, bitrate on the interfaces and load of the processing units (processors), since these three values are critical to the operation of the analyzed system.

The results of the analysis show several critical points of the system. In the CENTER, the main problem is a high processing unit load, close to 90%, resulting from the large volume of collected map data, which are stored in and retrieved from the central database. This problem additionally results in an increased delay introduced by the CENTER. Similarly, the processing unit load in the EVP is also close to 90%, which results from simultaneous processing of several multi-element multimedia tasks. As a result, the maximum reaction time for control messages is 3.208 seconds. For media, the maximum reaction time is equal to 3.479 seconds when media are uploaded to the AS, while for media presentation in the EVP it is equal to 1.15 seconds for presentation of an ongoing situation and 3.727 seconds for presentation of an archival event.

The aforementioned problems can be solved with the use of more efficient multicore processors, which can be readily adopted since the developed models and software assume a multithreaded implementation with processing threads designated to specific subsets of functionalities. As a result, we can adjust the load of different functional components of the system.
Other critical points are the radio modems, because their limited bandwidth allows only a single video stream, and the Archive Servers, because of the large volume of uploaded and downloaded multimedia information. The problem related to the Archive Servers can be simply solved by adding additional servers to the setup, with software automatically distributing tasks between particular machines. The only problem indicated here which cannot be solved within the current project is related to the maximum radio modem bitrate, since it has been imposed by the specification of the concurrent project in which this modem is being developed (Stefański 2014-2017). However, because of the modular structure of the developed system, this radio modem can be readily upgraded to one with an increased bitrate, or the video coder can be changed for one offering a smaller output bitrate.
4.2 Functional tests

At the current stage of system development, the MapServer functionality tests are based on analysis of the system response to locally generated requests. In the debugging mode, the EVP or console requests addressed to the task processing or the EVP support module are injected into communication module 1. Since all modules log all crucial actions along with precise times into separate files, it is easy to analyze how a given request is handled. Additionally, since many requests result in an update of the data stored in the local database, the correctness of request processing can also be assessed based on the analysis of the changes of the database content.
In case of the EVP support module, all the expected requests, with the exception of add note requests, come directly from the EVP. In contrast, the requests to the task processing module also come from mobile consoles, in which case they are handled first by the mobile MapServer which, if necessary, forwards the request to the central MapServer and returns the obtained response to the console. The functionality of such requests is tested with the mobile and central MapServers connected using an IP network and the test request generated locally in the mobile MapServer. This requires analysis of logs and databases in both the mobile URC and the CENTER. A radio link failure in the connection between the mobile and central MapServer can be simply simulated by switching the CENTER off, which results in mobile requests being processed in the mobile MapServer without communication with the CENTER.
The tests of the task processing module covered:
- new task requests from the EVP, processed only at the CENTER,
- new task requests from stationary consoles, which communicate directly with the CENTER; in this case the updated list of tasks is sent to the EVP,
- new task requests from mobile consoles, which are passed by the mobile MapServer to the central MapServer; the updated list of tasks is sent to the EVP,
- operator id requests: local from the EVP and stationary consoles, and remote from mobile consoles; if possible, the global operator id is retrieved from the CENTER, but in the case of a radio connection failure the mobile console operator receives a temporary local id; additionally, when the EVP operator requests an id, the current list of tasks stored in the central database is sent to the EVP.
The performed tests of the EVP support module included the following scenarios:
- visualization start requests for map data generated by the EVP (the address of the central MapServer is returned in the response),
- visualization start requests for archival files/photos, SMS/SDS, video, and audio generated by the EVP (the EVP support module retrieves the details of the requested task element from the central database, sends an HTTP/JSON request to the AS, processes the response with metadata from the AS and forwards it to the EVP; if multipage results are retrieved from the AS, subsequent HTTP/JSON requests are generated to the AS in order to get all pages and pass them to the EVP),
- visualization end requests for the above-mentioned types of task elements generated by the EVP (the EVP support module updates the task list in the central database and sends a response to the EVP),
- add note requests generated by the EVP or consoles (the EVP support module forwards the request as an HTTP/JSON message to the AS and returns the result of the operation).
It is worth mentioning that, in order to perform functional tests of the EVP support module, a simulator of the AS was additionally implemented based on the Linux xinetd daemon and a set of bash shell scripts. This allowed us to verify the implementation of the communication procedures between the EVP support module and the AS, which are based on HTTP protocol messages with JSON content.
5 SUMMARY

The main goal of the system presented in the paper is to supplement the basic map data gathered by the Border Guard with multimedia information (telephone and radio calls, video, photos, files, SMS/SDS) in order to provide a complete presentation of ongoing events or a reconstruction of archival ones. To achieve this goal, besides updating the previously developed system elements, two new crucial elements have been added. These are the Archive Servers (AS), based on an approach utilized in cloud technology, and the multidisplay visualization post (EVP) for visualizing multimedia events.

This paper presents the architecture of the discussed system and the functionality of its elements, i.e. the Central Server (CS), the Map Server (MS), Observation Points (OP), Mobile Units (MU), the Events Visualization Post (EVP), and the Archive Servers (AS). The concept and the realization of the system, its hardware and software implementation, as well as the initial tests have been presented. The STRADAR project provides a wide variety of functionality that allows for precise monitoring of events related to the work of the Border Guard. Mutual cooperation of the main system components, that is the CS, the EVP, and the AS, allows for visualization of a current or archival operational situation composed of any combination of synchronized AIS and radar data, telephone and radio calls, video recordings, images, files, and SMS/SDS messages. The functionality presented in Section 3 is just a fraction of the whole set of the system's capabilities.

This work has been co-financed by NCBiR (the National Centre for Research and Development), project DOB-BIO6/10/62/2014.
REFERENCES

Gałęziowski, A. 2005. Automatic National System of Radar Control for Maritime Areas of Poland (in Polish), Przegląd Morski, 5, pp. 50-70.
Fiorini, M. & Maciejewski, S. 2013. Lesson Learned During the Realization of the Automated Radar Control System for Polish Seawaters (ZSRN), in: Marine Navigation and Safety of Sea Transportation: Advances in Marine Navigation, CRC Press, pp. 217-221.
Kaczmarek, S. (project manager) 2013-2015. Koncepcja oraz implementacja integracji informacji w rozproszonych elementach systemu wymiany danych Straży Granicznej (Concept and implementation of information integration in distributed elements of the Border Guard data exchange system), research project, National Centre for Research and Development (NCBiR), DOBR/0022/R/ID1/2013/03.
Stefański, J. (project manager) 2014-2017. System szybkiej transmisji danych multimedialnych dla potrzeb ochrony morskiej granicy państwowej (System for rapid multimedia data transmission for the needs of protection of the country's maritime border), research project, National Centre for Research and Development (NCBiR), DOBR-BIO6/09/5/2014.
Chang, S.J. 2004. Development and analysis of AIS applications as an efficient tool for vessel traffic service, Proceedings of MTS/IEEE OCEANS'04, doi: 10.1109/OCEANS.2004.1406499.
Tetreault, B.J. 2005. Use of the Automatic Identification System (AIS) for maritime domain awareness (MDA), Proceedings of MTS/IEEE OCEANS'05, doi: 10.1109/OCEANS.2005.1639983.
Eide, M.S., Endresen, Ø., Brett, P.O., Ervik, J.L. & Røang, K. 2006. Intelligent ship traffic monitoring for oil spill prevention: Risk based decision support building on AIS, Marine Pollution Bulletin, vol. 54, issue 2, pp. 145-148.
Wiersma, J.W.F. 2010. Assessing Vessel Traffic Service Operator Situation Awareness, doctoral thesis, TU Delft, Delft University of Technology, Boxpress, Oisterwijk.
Blok, M., Kaczmarek, S., Młynarczuk, M. & Narloch, M. 2016a. MapServer - information flow management software for the Border Guard distributed data exchange system, Polish Maritime Research, 91(3), pp. 13-19.
Blok, M., Czaplewski, B., Kaczmarek, S., Młynarczuk, M., Narloch, M. & Sac, M. 2016b. Multimedia distributed system for visualization of ongoing and archival events for BG, The International Tech-Science Conference on "Naval Technologies for Defence and Security" NATCON 2016, pp. 61-76.