Modeling and Design - M&C Tool

To support the design of a system for a project


Safety and Security for big data Sensor Networks



MODELING AND COMPUTATION TOOL

Appropriate modeling and design support tools are thus essential to deal with such system complexity, and one way to provide such support is to integrate existing modeling tools.


The M&C Tool consists of two main components:


(a) a modeling component, which is used to carry out the system design activity by building a model of the system;

(b) a computational engine, which is used to process the system model and carry out the system analysis activity by either solving the system model (analytical approach) or executing the model (simulation approach).


The modeling component makes use of SysML (Systems Modeling Language), a UML-based modeling language that supports the specification, analysis, design, verification, and validation of complex systems. SysML, now considered the de facto modeling standard in the systems engineering field, makes it possible to represent a system from different viewpoints (structural, behavioral, dynamic) and to explicitly account for system complexity through hierarchical, multi-level approaches. According to such an approach, a given system is first modeled in terms of high-level blocks (e.g., by use of a block definition diagram) that are then refined into more detailed blocks and eventually into operational blocks. A behavior can be associated with blocks, and their internal structure can be described by use of internal block diagrams. An additional useful feature of SysML is the availability of parametric diagrams, which can be used to express constraints (equations) between value properties and to specify constraints in an analysis context.

In the design of a system, SysML can thus be effectively used to deal with system complexity by first modeling the various system constituents or submodels, i.e., by specifying the Plasma Generation and Charge Confinement model, the Electromagnetic model, the Heat Generation and Transmission model, etc., and then integrating such models to evaluate their combined effects on the overall system design. The various submodels are then processed by a computational engine, which takes as input the SysML specification and applies the appropriate analysis tool (e.g., Matlab for numerical analysis or Simulink for simulation-based analysis).
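The block hierarchy and parametric-constraint ideas above can be sketched in plain Python. This is a minimal illustration only, not the SysML notation itself; the block names, value properties, and the power-budget constraint are all invented for the example.

```python
# Hypothetical sketch: SysML-style blocks with value properties and a
# parametric constraint evaluated in an analysis context.
# All names and numbers below are illustrative assumptions.

class Block:
    """A SysML-like block with value properties and internal parts."""
    def __init__(self, name, **values):
        self.name = name
        self.values = dict(values)
        self.parts = []  # internal structure (cf. internal block diagram)

    def add_part(self, block):
        self.parts.append(block)
        return block

def power_constraint(heat_model, plasma_model):
    """Parametric constraint: generated heat must stay within the
    power budget of the plasma generation submodel."""
    return heat_model.values["heat_kw"] <= plasma_model.values["power_budget_kw"]

# Build a two-level model: system block refined into submodel blocks.
system = Block("SensorNetworkSystem")
plasma = system.add_part(Block("PlasmaGeneration", power_budget_kw=12.0))
heat = system.add_part(Block("HeatGenerationAndTransmission", heat_kw=9.5))

print(power_constraint(heat, plasma))  # True: budget respected
```

In a real SysML model the constraint would live in a parametric diagram and bind to the blocks' value properties; the computational engine would then evaluate it with the chosen analysis tool.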

The proposed M&C tool also introduces innovative approaches to automatically derive the analysis model (e.g., the Matlab/Simulink model) from the system model (i.e., the SysML model). Such transformations are specified in model transformation languages (e.g., QVT or ATL) and then executed by a transformation engine that takes the system model as input and yields the corresponding analysis model as output.
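To make the transformation idea concrete, the sketch below uses a Python function as a stand-in for the transformation engine; the real transformations would be written in QVT or ATL. The simplified system model, its parameters, and the MATLAB-style output format are assumptions for illustration.

```python
# Illustrative model-to-text transformation: a simplified system model
# (dict) is mapped to a textual analysis model. In the actual tool this
# mapping would be a QVT/ATL transformation run by an engine.

system_model = {
    "name": "HeatTransmission",                       # hypothetical submodel
    "parameters": {"conductivity": 0.6, "thickness_m": 0.02},
    "equation": "flux = conductivity * dT / thickness_m",
}

def transform(model):
    """Derive an analysis script (MATLAB-style) from the system model."""
    lines = [f"% Analysis model for {model['name']}"]
    for param, value in model["parameters"].items():
        lines.append(f"{param} = {value};")
    lines.append(model["equation"] + ";")
    return "\n".join(lines)

print(transform(system_model))
```

The essential point is the separation of concerns: the system model stays the single source of truth, and the analysis artifact is regenerated from it on demand.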

The use of SysML, combined with the appropriate analysis tools and with model transformations, makes it possible to take into account the relationships among the various submodels, so that a structural or parameter change in one part of the system is easily propagated to the other parts that may be affected by the change and to the corresponding analysis model.
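The change-propagation mechanism can be sketched as a traversal of an explicit dependency graph between submodels. The submodel names and dependencies below are invented for the example.

```python
# Hedged sketch: which submodels are (transitively) affected when one
# submodel changes? Dependency edges are assumptions for illustration.

dependencies = {
    "PlasmaGeneration": ["HeatGeneration"],    # heat depends on plasma power
    "HeatGeneration": ["HeatTransmission"],
}

def affected_by(changed, deps):
    """Return all submodels transitively affected by a change."""
    seen, stack = set(), [changed]
    while stack:
        node = stack.pop()
        for dep in deps.get(node, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(sorted(affected_by("PlasmaGeneration", dependencies)))
# ['HeatGeneration', 'HeatTransmission']
```

In the tool, each affected submodel's analysis artifact would then be regenerated by re-running the corresponding model transformation.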

Important to note: Access rights are granted royalty-free to the present project partners, limited to the project contractual terms execution time. The burden of proof to show that Access Rights are needed for carrying out the present Project activities is borne by the requesting Party.


From the implementation point of view, the tool architecture and its design will be defined and implemented in the first phase of the project.

The main objectives of the proposed approach are:

•           standardization of interfaces

•           utilization of model based design and implementation

•           multi-level integrated system modeling

In order to guarantee a significant level of remote interoperability and ease-of-use, the M&C tool will be designed according to innovative approaches based on the SaaS (Software as a Service) paradigm.


It will exploit the benefits of computer clustering in the Cloud, allowing the user to process data remotely and locally, independently of each other.


Its design will make it possible to deal with different types of computers and terminals, so as to take into account the heterogeneity of the application domain. Existing interoperability standards (e.g., SOAP, XML, etc.) will be augmented with remote computing capabilities and improved security provisions.
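As a rough illustration of building on such standards, the sketch below assembles a SOAP-style request envelope for a remote computation call using only the Python standard library. The operation name `RunAnalysis` and the element names are assumptions; the actual service contract will be defined during the design phase.

```python
# Minimal sketch of a SOAP-style request envelope for a remote
# computation call. Element names are illustrative assumptions.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(model_id, parameters):
    """Build a SOAP-like XML envelope invoking a hypothetical
    RunAnalysis operation with the given model parameters."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, "RunAnalysis")        # hypothetical operation
    ET.SubElement(call, "ModelId").text = model_id
    for name, value in parameters.items():
        param = ET.SubElement(call, "Parameter", name=name)
        param.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

xml_request = build_request("HeatTransmission", {"conductivity": 0.6})
print("RunAnalysis" in xml_request and "ModelId" in xml_request)  # True
```

The security augmentations mentioned above (e.g., protecting the transferred payload) would wrap such an envelope and are out of scope for this sketch.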


The figure depicts the service scenario, showing its different elements:


•           Service provider

•           Remote processing provided by cloud

•           User with or without in-house processing capabilities


The activities carried out within the proposed task aim to:

•           Model and design the system and its submodels;

•           Define computation rules and analysis tools for the submodels;

•           Specify and execute model transformations;

•           Configure the tool for system modelling based on commercial tools;

•           Define and develop a Web service for user utilization.

The Web service application, outlined in the figure, will allow users to:

•           Enter application and register (User Interface)

•           Get informed about the field of application and the expected results (Tutorial)

•           Run the application according to the user input data (Algorithms)

•           Subscribe and be billed for the cost of utilization according to the user profile (pay-per-use, subscription, etc.) (Bill & Accounting); this feature is not shown in the figure.
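The four entry points above can be sketched as plain functions; a dispatch of this kind would sit behind whatever HTTP framework is eventually chosen, which is not decided here. User names, rates, and the placeholder computation are all invented for the example.

```python
# Hypothetical sketch of the Web service entry points: User Interface,
# Tutorial, Algorithms, Bill & Accounting. No real HTTP layer is shown.

users = {}

def register(username, profile):
    """User Interface: enter the application and register."""
    users[username] = {"profile": profile, "runs": 0}
    return f"registered {username}"

def tutorial():
    """Tutorial: field of application and expected results."""
    return "M&C Tool: modeling and analysis for sensor network systems"

def run(username, input_data):
    """Algorithms: run the application on the user's input data."""
    users[username]["runs"] += 1
    return {"result": sum(input_data)}   # placeholder computation

def bill(username):
    """Bill & Accounting: charge according to the user profile."""
    rate = {"pay-per-use": 10, "subscription": 0}[users[username]["profile"]]
    return users[username]["runs"] * rate

register("alice", "pay-per-use")
run("alice", [1, 2, 3])
print(bill("alice"))  # 10: one pay-per-use run at the assumed rate
```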



A characteristic of the tool will be its flexibility in terms of utilization. It will thus be possible to:


•           Run the application on the user's in-house computer, so that no data transfer takes place, improving the security and privacy of the results;

•           Run the application remotely, with the data transfer and the results further protected by the application's algorithms.
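This local-versus-remote flexibility can be sketched as a simple dispatcher. The remote path below is a stub with the same semantics as the local one; in the real tool it would issue a protected request to the cloud back end, and the analysis itself (here a trivial average) is a placeholder.

```python
# Sketch of the execution flexibility: the same analysis dispatched to
# an in-house solver or to the cloud back end. The remote call is a stub.

def solve_locally(data):
    """In-house processing: data never leaves the user's machine."""
    return sum(data) / len(data)

def solve_remotely(data):
    """Cloud processing: in the real tool the data and results would be
    transferred under the protection described above."""
    return solve_locally(data)  # stub with identical semantics

def run_analysis(data, in_house=True):
    """Dispatch according to the user's processing capabilities."""
    return solve_locally(data) if in_house else solve_remotely(data)

print(run_analysis([2.0, 4.0, 6.0], in_house=True))   # 4.0
```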