Fastlane solves these problems by combining the results of two successful projects, Xflow and shade.js, to provide a compiler-driven, adaptive data-flow programming framework for parallel data processing on the web.
In the EU project CIMPLEX, the research department Agents and Simulated Reality (ASR) developed solutions for visualizing large amounts of diverse data, enabling analysis in the field of epidemic disease spreading.
The data on disease spreading comes from two sources: recorded data, such as travel information, or simulated data.
This data can be visualized using different visualization techniques, e.g. network graphs, to allow for easy detection of relationships and of disease spreading over time. All views of the data can be synchronized.
Moreover, users can work with the data collaboratively across different devices, from classical desktop computers running web applications to tablets and VR or AR headsets.
Data-flow programming and novel web technologies enable both data-parallel and thread-parallel processing, while hardware-accelerated computing and rendering further increase performance.
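The data-flow idea can be illustrated with a minimal sketch. This is not the actual Xflow API; the node and operator names are illustrative. The key property is that each node's operator is a pure function of its inputs, so the runtime may cache results and is free to execute element-wise operators data-parallel (e.g. on the GPU) or thread-parallel (e.g. in Web Workers):

```javascript
// Minimal data-flow sketch (illustrative names, not the actual Xflow API).
// Operators are pure, so results can be cached and element-wise work can
// be parallelized transparently by the runtime.
class FlowNode {
  constructor(op, inputs = []) {
    this.op = op;          // pure function over the input values
    this.inputs = inputs;  // upstream FlowNodes
    this.dirty = true;
    this.value = null;
  }
  evaluate() {
    if (!this.dirty) return this.value; // reuse cached result
    const args = this.inputs.map((n) => n.evaluate());
    this.value = this.op(...args);
    this.dirty = false;
    return this.value;
  }
}

// Source node holding raw case counts; derived nodes normalize them.
const cases = new FlowNode(() => [10, 40, 50]);
const total = new FlowNode((xs) => xs.reduce((a, b) => a + b, 0), [cases]);
const normalized = new FlowNode(
  (xs, sum) => xs.map((x) => x / sum), // element-wise → parallelizable
  [cases, total]
);

console.log(normalized.evaluate()); // [0.1, 0.4, 0.5]
```

Because the graph, not the programmer, decides when and where operators run, the same program can be mapped to sequential, multi-threaded, or GPU execution.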
Epidemics are an international problem and more likely than ever before. Their course is complex and difficult to foresee. The past has shown how important a quick reaction is for the effectiveness of countermeasures.
One example from the project is the spreading of influenza. This disease usually occurs in winter, when dry air and the use of air conditioning foster its growth. International travel then increases its spread across boundaries.
It is essential to gain a comprehensive and up-to-date picture of the crisis situation from the outset, to analyze the situation and to communicate the necessary measures quickly and purposefully.
One aim of the project was to improve the communication with the affected persons while delivering an effective and purposeful disaster management at the same time.
New communication technologies, which are based on social networks and smartphones, can help to deliver and link the necessary information.
The tools and computer models that were developed were intended to inform decision-makers and citizens in real-time and support them in combating disease spread.
The technologies employed were threefold: first, large-scale, realistic, data-driven models to predict disease spread; second, participatory data collection to obtain information about disease outbreaks early on; and third, advanced methods for Big Data analysis and visualization.
In the European research project CIMPLEX (Bringing Citizens, Models and Data together in Participatory, Interactive Social Exploratories) such a new system was developed.
CIMPLEX combined information from a variety of sources such as social networks, cell phone positions, the social and economic environment, and the experiences and opinions of eyewitnesses.
New models for explaining, visualizing, and interacting with data and models, both on the individual and the collective level, were to be developed. Theoretical, methodological, and technological advances were pursued in order to better foresee, explain, and handle disease spreading.
All of this was to be molded into a broadly usable ICT platform.
The proposed solution should be usable by a wide range of users, from policy makers trying to curb disease spread and researchers developing new models for disease-spread prediction, to citizens.
The visualization should be able to yield different views on the underlying data and models and allow for collaborative analysis.
The visualization must be able to handle vast amounts of geo-referenced, time-dependent data.
Moreover, a custom deployment for domain-specific use cases must be possible in order to maintain the flexibility of the system.
To maximize the range of possible users, the web was chosen as the target platform. It is widespread, most users are familiar with it, it is available on many devices, and it supports classical 2D visualization as well as 3D and VR.
It also supports a wide range of user interactions, from the usual keyboard and mouse to touch devices and more elaborate VR controllers.
Since the planned system would build on a plethora of possible data sources, a service-oriented architecture (SOA) was proposed, easing the reuse of components and the independent development of services.
The architecture of the final system consisted of three layers.
The first layer was data acquisition; it delivered data from social sensing components as well as participatory data collection.
The second layer was models and simulation; it used epidemic simulation web services as well as integrated computational models.
The third layer was the exploratory layer, which allowed for the visual exploration of the combined data sources from the first two layers.
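The three-layer pipeline can be sketched as follows. This is a hedged illustration, not the project's actual services: all function names, data shapes, and the growth-rate model are assumptions chosen to show how data flows from acquisition through simulation into the exploratory layer:

```javascript
// Sketch of the three-layer pipeline (hypothetical names and data shapes):
// data acquisition → models & simulation → visual exploration.
const acquisitionLayer = {
  // e.g. social sensing + participatory data collection
  fetchObservations: () => [
    { region: "DE", cases: 120 },
    { region: "IT", cases: 300 },
  ],
};

const simulationLayer = {
  // e.g. an epidemic simulation web service extrapolating observations
  // with an assumed constant growth rate
  simulate: (observations, growthRate) =>
    observations.map((o) => ({ ...o, projected: Math.round(o.cases * growthRate) })),
};

const exploratoryLayer = {
  // e.g. render the combined results in a linked view (here: plain text)
  describe: (results) =>
    results.map((r) => `${r.region}: ${r.cases} -> ${r.projected}`).join("; "),
};

const report = exploratoryLayer.describe(
  simulationLayer.simulate(acquisitionLayer.fetchObservations(), 1.5)
);
console.log(report); // "DE: 120 -> 180; IT: 300 -> 450"
```

Keeping each layer behind a small interface is what allows the SOA-style independent development and reuse of services described above.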
The Visualization Framework was developed in a cooperation between the University of Stuttgart and the DFKI. The highly customizable, web-based open-source framework allows users to connect to different web services and to analyze the data with a large variety of interactive visualizations connected via brushing and linking. The web-based applications created with the framework run on multiple devices such as smartphones, tablets, desktop computers, and large display walls.
The GLEAMViz web service hosts simulation data in various formats. All of them contain compressed binary data for fast data exchange, which needs to be decoded on the client before it can be processed and visualized. By using parallel data processing, the decoding time can be greatly reduced. The library therefore offers decoders for GLEAMViz datasets from ISI and agent-based model datasets from ISI and FBK, including movement data from the DFKI.
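The parallelization opportunity in the decoding step can be sketched as follows. The real GLEAMViz wire format is not reproduced here; the sketch assumes a flat array of little-endian float32 records. The point is that the buffer splits into independent chunks, so each chunk could be handed to its own Web Worker and decoded in parallel:

```javascript
// Hedged sketch of client-side binary decoding (assumed wire format:
// a flat array of little-endian float32 values, NOT the actual GLEAMViz
// format). Independent chunks are what make Web Worker parallelism possible.
function decodeChunk(buffer, byteOffset, count) {
  const view = new DataView(buffer, byteOffset, count * 4);
  const out = new Array(count);
  for (let i = 0; i < count; i++) out[i] = view.getFloat32(i * 4, true);
  return out;
}

// Build a small example payload: 6 float32 values.
const values = [1, 2, 3, 4, 5, 6];
const payload = new ArrayBuffer(values.length * 4);
new Float32Array(payload).set(values);

// Decode in two independent halves (sequentially here; each half could
// run in its own worker, since neither reads the other's bytes).
const half = values.length / 2;
const decoded = [
  ...decodeChunk(payload, 0, half),
  ...decodeChunk(payload, half * 4, half),
];
console.log(decoded); // [1, 2, 3, 4, 5, 6]
```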
We have also developed a configurator application that greatly reduces the time to configure and deploy a custom version of the visualization framework, including all data and simulation services. Based on a web page URL, users are now able to select which views, data, and simulation services they want to include in their custom deployment. The server backend then creates a unique executable file depending on the selections and offers it for download. The downloaded executable, based on Node.js and Docker, then fully automatically installs all dependencies, including data and simulation services, locally on the client machine. By using Docker, the installation is isolated and does not affect or modify the client host system. The configurator creates executables for Windows, macOS, and Linux.
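The selection-to-deployment step can be sketched as a simple mapping. The service names, catalog entries, and image identifiers below are hypothetical, invented for illustration; the actual configurator's catalog and output format are not shown in this document:

```javascript
// Hypothetical sketch: map a user's view/service selection (as parsed
// from the configurator URL) to a deployment description. All names and
// images here are illustrative, not the project's actual ones.
function buildDeployment(selection) {
  const catalog = {
    gleamviz: { image: "example/gleamviz-service" },
    movement: { image: "example/movement-data" },
  };
  const services = {};
  for (const name of selection.services) {
    if (!(name in catalog)) throw new Error(`unknown service: ${name}`);
    services[name] = catalog[name];
  }
  return { views: selection.views, services };
}

const deployment = buildDeployment({
  views: ["map", "network-graph"],
  services: ["gleamviz"],
});
console.log(deployment.services.gleamviz.image); // "example/gleamviz-service"
```

A deployment description of this shape could then be turned into container definitions, which is what makes the installation isolated from the client host system.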
Title: Bringing Citizens, Models and Data together in Participatory, Interactive Social Exploratories
Run Time: 01.01.2015 – 31.12.2017
FET Proactive Global Systems Science (GSS)
Grant agreement no: 641191
Deutsches Forschungszentrum für Künstliche Intelligenz GmbH, Germany