
III. International Conference on Electrical, Computer and Energy Technologies (ICECET 2023)

16-17 November 2023, Cape Town, South Africa

Methodology for measuring and improving inter-organizational interoperability

Mamour Fall, Faculté des Sciences et Techniques, Cheikh Anta Diop University, Dakar, Senegal (mamourfall@gmail.com)
Mamadou Samba Camara, Ecole Superieure Polytechnique, Cheikh Anta Diop University, Dakar, Senegal (mamadou.samba.camara@esp.sn)
Mouhamed Diop, Ecole Superieure Polytechnique, Cheikh Anta Diop University, Dakar, Senegal (mouhamed.diop@esp.sn)

Abstract—Interoperability is a constantly evolving field, particularly with the development of connected and intelligent devices. Numerous methods, based on maturity models, have been developed to assess interoperability. However, with the evolution of technology and the development of the Internet, connectivity is no longer a real problem and isolated systems are becoming increasingly rare. Improving interoperability is therefore no longer oriented towards interconnection, but rather towards optimizing business processes. The activities generated by interoperability are a priori considered to add no value compared with business activities; they contribute to putting data into the right format for transmission or use. This study proposes a methodology for assessing interoperability based on the data exchanged between systems. The proposed Data Circuit, based on generation and integration processes, makes it possible to identify points of improvement in the business process so that data is produced in a format that requires the least processing before use, especially in the context of the execution of a process instance.

Keywords—interoperability improvement, system engineering, assessment, business process, non-value-added activities.

I. INTRODUCTION

Research on interoperability has led to the development of technical solutions that now allow applications to communicate with each other. However, this communication only supports the transmission of data, often already standardized, via data channels that require processing by the sending and receiving systems. Interoperability is therefore limited to the software level. Technical solutions solve interoperability problems only partially because they ignore the overall analysis of the business process and its requirements. Determining the right requirements for the system is critical in the system development life cycle; hence, this phase must be considered carefully to build a high-quality and reliable system [1].

In the literature, most interoperability assessment methods are based on maturity models that assess individual systems and do not consider the overall business process, which requires the collaboration of multiple systems. In addition, these maturity models do not integrate technological developments such as the arrival of Web 2.0, which facilitates data sharing as well as collaborative work and teleworking. Web applications are getting closer and closer to traditional desktop applications, with processing often transferred to browsers. However, this processing can be expensive.

The objective of this study is to propose a methodology for assessing interoperability based on the processes of generation and integration of the data exchanged between systems, in order to identify areas of improvement in the business process so that the data to be exchanged is generated in a given format. This approach has the advantage of taking into account the complete functional coverage of the business process, and it reduces the processing carried out on the data received by a system before a business process instance runs.

Section II provides a state of the art of interoperability assessment. Since the proposed methodology is based on an analysis of the systems involved in interoperability, Section III addresses interoperability analysis from a system engineering perspective to characterize requirements, while Section IV presents the proposed methodology. Finally, a use case and a discussion close the study.

II. STATE OF THE ART

Interoperability measurement aims at defining metrics to qualify the degree of interoperability. The implementation of metrics rests on two principles: (1) the identification of the parameters relating to interoperability, and (2) the characterization of these parameters by metrics [2]. Interoperability problems usually arise when two systems are not interoperable. The solution to this problem, mostly related to technical aspects, consists mainly in upgrading the systems or in creating a gateway allowing them to interact. However, other types of problems may exist outside this assumption, such as optimization, reliability, or process failures, among others.

A. Measures

The fact that interoperability can be improved means that metrics for measuring the degree of interoperability exist. An assessment determines the As-is state of a system or system of systems and provides a roadmap toward the To-be state [3]. Assessments are done using qualitative and/or quantitative measurements.

1) Qualitative measures: Qualitative measures assess interoperability in relation to the approach adopted and define technical performance objectives. They are mainly subjective. In most cases, this kind of measure uses a rating scale composed of linguistic variables (e.g. "Good", "Optimized" and "Adaptive") to qualify a system [4]. It is mostly used by maturity models, which are approaches designed to assess the quality (i.e. competency, capability, level of sophistication) of a selected domain based on a more or less comprehensive set of criteria. Among the most popular maturity models, one can mention:

a) LISI: LISI proposes five levels of maturity: Isolated, Connected, Distributed, Domain and Enterprise. To establish these levels of maturity, it defines four areas of interest named PAID, which stands for Procedures, Applications, Infrastructure and Data. From 1998 to 2006, LISI was the template for numerous maturity-model and maturity-model-like (leveling) interoperability measurement approaches designed to measure both information and non-information system interoperability [4]. The aim of LISI's measurement is to capture the essence of the potential interactions available between systems, as recorded by the implementation choices made by developers. Three types of measure are proposed: "generic", "expected" and "specific". The generic type relates to the comparison of one system against the capability model, while the other two (expected and specific) relate to the comparison of two or more systems against each other.

b) Organizational Interoperability Model (OIM): The Organizational Interoperability Model [5, 6] was developed to assess interoperability at the level of human or organizational activity. Originally proposed for the assessment of military organizations, the authors argued that such a model could be applied to different contexts (i.e., for general use) [6, 7]. It was specifically designed as an extension of the LISI 2.10 maturity model, which is technology-intensive and focuses on the compatibility of systems and technologies. It aims to assess the compatibility of at least two entities relative to organizational and conceptual barriers, with an emphasis on business concerns [5]. The OIM defines five levels of interoperability (Independent, Cooperative or Ad hoc, Collaborative, Integrated or Combined, Unified) and four interoperability attributes (Readiness, Understanding, Command Style and Ethos).

c) Levels of Conceptual Interoperability Model (LCIM): LCIM is a maturity model that assesses the compatibility of two entities by targeting conceptual barriers in the field of data interoperability [8]. It is a framework describing a spectrum from what is conceptually modeled to how it is technically implemented. The data (entities, own concepts), the way it is used (process concepts, methods) and the constraints are described by artifacts following the ideas of systems architecture and modeling as known in the field of systems engineering [9].

d) Enterprise Interoperability Maturity Model (EIMM): The ATHENA project elaborated, for manufacturing enterprises, the EIMM to address interoperability issues at all levels of the company (ATHENA, 2005) [6]. The six areas of concern covered by the assessment are: business strategy and processes; organization and skills; products and services; systems and technologies; legal environment, security, and trust; and business modeling. The five maturity levels of the EIMM are: Realized, Modeled, Integrated, Interoperable and Optimized [11].

2) Quantitative measures: Quantitative models are approaches that make it possible to measure the characteristics of interoperability numerically. Quantitative measures are concerned with performance: they define numeric values to characterize interoperability [4]. In general, the rating scale goes from 0 to 100%.
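As an illustration of such a quantitative measure, the sketch below computes a simple 0-100% degree of interoperability at an interconnection node, taken as the share of exchanged data fields that arrive in a ready-to-use format. The field names and the scoring rule are illustrative assumptions, not metrics prescribed by the models cited above.

```python
# Hypothetical quantitative measure: percentage of exchanged data fields
# that the receiving system can use without any transformation (0-100%).
# The field list and the compliance rule are illustrative assumptions.

def interoperability_degree(exchanged_fields, ready_to_use_fields):
    """Return the share (in %) of exchanged fields received ready to use."""
    if not exchanged_fields:
        return 100.0  # nothing exchanged, nothing to transform
    usable = sum(1 for f in exchanged_fields if f in ready_to_use_fields)
    return 100.0 * usable / len(exchanged_fields)

# Example: 3 of the 4 exchanged fields arrive in the expected format.
exchanged = ["course_id", "room", "start_time", "teacher"]
ready = {"course_id", "room", "teacher"}
print(f"Degree of interoperability: {interoperability_degree(exchanged, ready):.0f}%")
# -> Degree of interoperability: 75%
```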
B. Limits

Many approaches to interoperability assessment identified in the literature are based on the definition of maturity models, which describe, for a specific domain of interest, several levels of sophistication at which activities in that domain can be conducted [10]. They are focused on maturity measures and address potentiality, compatibility and/or performance assessments. They are mostly oriented towards architecture, system engineering and/or data, with different approaches and metrics. LISI is focused on technical interoperability, while LCIM focuses on the data to be exchanged and the documentation of the available interfaces. OIM describes the ability of organizations to interact, and EIMM, defined in an a priori interoperability context, allows companies to assess their interoperability potential, to know how likely they are to support effective integration, and to detect precisely the weaknesses that cause interoperability problems.

Technical solutions for achieving interoperability between systems are based on the development of APIs (SOAP, WSDL, ebXML, among others) [12, 13] and on standardization initiatives concerning the format of the exchanged data (IEC 62264-1) [14]. Even if they allow communication between systems, technical solutions do not intervene as part of an overall analysis of the business process and ignore certain aspects of interoperability, mainly the approach, the concerns, and the barriers. To fully achieve interoperability, it is therefore necessary to consider all activities (business and non-value-added) of the business process, as presented in [16] by Camara et al. This approach makes it possible to identify functional requirements. With communication between systems established, improving interoperability comes down to the efforts made by systems to provide usable data to third parties with minimal processing.

III. CONCEPTS

A. System engineering process

A process is a system of activities that uses resources to transform inputs into outputs. It describes the "what to do" in a predefined order. It transforms data by creating added value. Systems engineering is an iterative process of top-down synthesis, development, and operation of a real-world system that satisfies, in a near optimal manner, the full range of requirements for the system.
Systems engineering methods include Stakeholder Analysis, Interface Specification, Design Tradeoffs, Configuration Management, Systematic Verification and Validation, and Requirements Engineering, among others. They ensure correct external interfaces as well as correct interfaces among subsystems and software. The conventional view of requirements definition is that this phase of a system's development begins with an informal description of "what" the system is expected to do.

Systems engineering makes it possible to:

• represent the services that each system must offer to achieve interoperability, corresponding to the Functional Requirements (FR), and

• determine how these services are rendered or expected (Non-Functional Requirements, NFR).

B. Interoperability efforts

The systems' interoperability must be organized so as to minimize communication between components, to ensure that performance requirements are met, and to avoid process failures. Each interconnection node is a critical point insofar as the data exchanged, the I/O between systems, the number of target systems and, especially, the possible process failures must be considered there. The effort made by each system to make the data usable, before and/or after the exchange, is decisive for assessing interoperability.

Two types of effort should be considered, as contrasted in the sketch after this list:

• A one-time effort: an organizational and applicative upgrade so that data is directly generated in the appropriate output format, making it directly usable by other systems, or usable as input by the requesting systems. In this case, systems must be upgraded to record the exchanged data in the correct format at the time of its creation or generation.

• A continuous effort: an application-level data transformation that must be carried out by the system providing or receiving the data for each process instance.
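A minimal sketch of the difference, under assumed data: with a continuous effort, a converter must run on every process instance, whereas with a one-time effort the sender is upgraded to emit the agreed format at creation time and the per-instance conversion disappears. The record layout, the legacy date format and the ISO 8601 target are illustrative assumptions only.

```python
# Illustrative contrast between the two kinds of interoperability effort.
# The record layout and the ISO 8601 target format are assumptions.

from datetime import date, datetime

def to_iso_date(record):
    """Continuous effort: convert a legacy date on EVERY exchanged record."""
    record = dict(record)
    record["date"] = datetime.strptime(record["date"], "%d/%m/%Y").date().isoformat()
    return record

def create_record_upgraded(course, day):
    """One-time effort: after upgrading the sender, records are created
    directly in the agreed format, so no per-instance conversion remains."""
    return {"course": course, "date": day.isoformat()}

# Continuous effort: the conversion cost recurs for each process instance.
legacy = {"course": "Databases", "date": "17/11/2023"}
print(to_iso_date(legacy))  # {'course': 'Databases', 'date': '2023-11-17'}

# One-time effort: the data is born in the right format.
print(create_record_upgraded("Databases", date(2023, 11, 17)))
```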
C. NVA activities

The problems of interoperability can be of two kinds:

• Multiple and complementary systems are not interoperable.

• The data received are not in the correct format.

The need for interoperability, at the origin of the project, is classified as an NFR. Similarly, the activities generated by interoperability are mostly NVA activities. They are defined as the components of business processes that represent efforts between partners to achieve interoperability in information exchange [16]. However, NVA activities can generate both FR (without adequate data formatting, a business process instance is bound to fail) and NFR (process failure, optimization, etc.). As a result, interoperability problems can concern the activities of a process supported by interacting systems or by an individual system.

IV. PROPOSITION

In this study, we assume that interoperability is effective between systems. Inter-organizational interoperability is materialized by the existence of functional links between several entities within a business process. The collaboration system is analyzed from a requirements engineering perspective, with the objective of identifying the interconnection nodes, evaluating the interoperability efforts, and optimizing the data exchange and processing. The system engineering process enables the analyst to develop, reorganize, and specify the functionality of a system for specific purposes; these specific goals always depend on the user. The methodology is based on optimizing interoperability efforts through two processes: the Data Generation Process (DGP) for the sender and the Data Integration Process (DIP) for the receiver. The advantage of this approach is that it provides a conceptual analysis of the interoperability scheme and allows for an assessment of the compliance of the "As-is" situation.

Analyzing an activity independently of the involved actors and of the way it is handled at the application level does not allow us to accurately measure its impact on the collaboration process. The proposed methodology is therefore based on:

• the identification of NVA activities,

• the identification of the involved actors, and

• the processing of the exchanged data.

The NVA activities representing interoperability efforts are identified through the analysis of:

• the process of generating the data to be exchanged from a business process,

• the transformation process of the received data before its use in a business process.

These elements are illustrated by the sketch below, applied to the case study of Section V.
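A minimal model of what the methodology collects at an interconnection node is sketched here; the class fields and the tagging of activities are assumptions made for illustration, using the actors of the case study.

```python
# Illustrative model of the elements collected at an interconnection node:
# the exchanged data, the involved actors, and the activities tagged as
# business or NVA. The field names are assumptions for this sketch.

from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    actor: str
    adds_business_value: bool  # False -> NVA (interoperability effort)

@dataclass
class InterconnectionNode:
    sender: str
    receiver: str
    exchanged_data: list
    activities: list = field(default_factory=list)

    def nva_activities(self):
        return [a for a in self.activities if not a.adds_business_value]

node = InterconnectionNode(
    sender="Pedagogical Manager", receiver="Focal Point",
    exchanged_data=["course schedule"],
    activities=[
        Activity("Enter schedule in an Excel file", "Pedagogical Manager", False),
        Activity("Re-enter schedule in the VT System", "Focal Point", False),
        Activity("Assign classrooms", "Focal Point", True),
    ],
)
print([a.name for a in node.nva_activities()])  # the two re-formatting steps
```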
Fig. 1. Data Generation Process (DGP)

Figure 1, which shows the DGP, also highlights the areas of improvement; the criticisms of the latter are shown in red and orange. From the data collected at the interconnection node, it is possible to reverse-engineer the creation, generation and/or transformation process.

Handling NVA activities requires identifying the actors involved in the process. The Data Integration Process (DIP) differs depending on the case selected. Considering the first case, for example, integrating data from another medium involves sorting, entering, and validating the data before it is used by a third-party system in a business process instance. The DIP shown in Figure 2 proposes a representation of the options for the data processing circuit in an interoperability project. The optimal circuit is the receipt of the data by the receiving system, in a ready-to-use format, through an automated process directly from the sending system. The transformation process depends on the level of compliance of the data with what is expected as input for a business activity.

Fig. 2. Data Integration Process (DIP)

Integration must also consider the situation where all exchanges are automated, and those where they are only partially automated or not automated at all. For the second situation, four subcases have been identified:

• An actor uses an isolated system that requires manual integration of the received data.

• An actor identified in the process does not have the features or software interfaces that allow him to perform what is expected of him.

• An actor identified in the process sees the features relevant to his work made available to another actor.

• The functionalities are implemented, but the user does not have access to the system due to network constraints (e.g., geographical position).

In the case where all exchanges are automated, NVA activities are only identified in the data integration process. Otherwise, additional NVA activities may appear; these include, for example, the integration of an actor using an isolated system and the development of new functionalities for an actor.

However, before opting for formatting by the sending system, we will have to consider the number of systems requesting a specific piece of information and the data format required by each system. In integrated and unified interoperability approaches, a "neutral" format can be established: the data that composes the information is presented in a format that allows each "receiving" system to adapt it to its needs. In this case, two treatments are to be foreseen before the data is used, one carried out by the "sender" and another by the "receiver". They are the interoperability efforts, as sketched below.
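A minimal sketch of the neutral-format idea, under assumed names: the sender serializes its internal record into a shared neutral structure (the first treatment), and each receiving system adapts that structure to the input its own business activity expects (the second treatment). The JSON layout and the adapter function are illustrative assumptions, not a format prescribed by the paper or by the cited standards.

```python
import json

# Sender-side treatment: serialize the internal record into an agreed,
# system-independent "neutral" structure (field names are assumptions).
def to_neutral(internal):
    return json.dumps({
        "course": internal["course_name"],
        "room": internal["room_code"],
        "start": internal["start_iso"],  # ISO 8601 assumed as the neutral convention
    })

# Receiver-side treatment: each receiving system adapts the neutral
# structure to the input expected by its own business activities.
def adapt_for_receiver(neutral_msg):  # hypothetical receiving system
    data = json.loads(neutral_msg)
    return {"COURSE": data["course"].upper(), "ROOM": data["room"], "START": data["start"]}

msg = to_neutral({"course_name": "Databases", "room_code": "B12",
                  "start_iso": "2023-11-17T08:00"})
print(adapt_for_receiver(msg))
```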
V. APPLICATION

The case illustrated in Figure 5 represents the course planning process at the "Ecole Supérieure Polytechnique de Dakar", which has an application called "VT System". In the current mode of operation, there are several entities that can be considered as organizational units using autonomous systems. Each training course has a pedagogical manager who schedules the courses and who requests the allocation of a room from an actor named the focal point, in charge of managing the occupation of the rooms. The starting point of the planning is the entry of the schedules in an Excel file by the pedagogical manager, who transmits it to the focal point by email or on an electronic medium.

The information contained in the file is entered into the VT System by the focal point, which at the same time assigns the classrooms. The output information is sent by the focal point to the class supervisor and the teacher. It may also happen that a pedagogical manager assigns rooms instead of the focal point. The class supervisor receives the printed form of the enriched schedule, shares it with the students, and fills in the textbook that will be signed by the teacher once the course is done. As the person in charge of training is not connected to the VT System, the focal point enters the schedule, transmitted in Excel format, into the application.
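To illustrate the kind of one-time effort that would remove this manual re-entry, the sketch below ingests the pedagogical manager's schedule, assumed to be exported from Excel as CSV, and turns each row into a record ready for an import interface. The column names and the import_into_vt hook are hypothetical: the paper does not describe the VT System's actual interface.

```python
import csv

def load_schedule(csv_path):
    """Read a schedule exported from Excel as CSV and validate each row.
    The column names ('course', 'day', 'start', 'end') are assumptions."""
    required = {"course", "day", "start", "end"}
    records = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if not required.issubset(row) or not all(row[k] for k in required):
                raise ValueError(f"Incomplete schedule row: {row}")
            records.append(row)
    return records

def import_into_vt(records):
    """Hypothetical import hook standing in for the VT System interface."""
    for r in records:
        print(f"Imported: {r['course']} on {r['day']} {r['start']}-{r['end']}")

# Usage (assuming the manager's file was saved as schedule.csv):
# import_into_vt(load_schedule("schedule.csv"))
```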
From the analysis of the first interconnection node, the paths shown in bold draw the data processing circuit in the "As-is" situation. It appears that the circuit is not optimized, since no data channel exists between the two systems. Furthermore, the usage diagram in Figure 7 shows that there is a confusion of actions between the pedagogical manager and the focal point: the pedagogical manager issues a programming request to the focal point, which registers it in the system before processing it.

The case presented in Figure 3 corresponds to a situation where not all exchanges are automated:

• the class supervisor uses an isolated system,

• the focal point prints the timetable and transmits it to the class supervisor,

• the class supervisor notifies the students of the new course scheduling,

• the teachers and the class supervisor do not have access to the VT System.

From the combined analysis of Figures 4 and 5, it appears that, in addition to improvements in the data processing circuit, the system engineering process allows for an inventory of the functionalities available to each user, in order to determine the activities performed redundantly, as shown in Figure 5. Each time an actor intervenes, NVA activities are identified around the business ones. Most NVA activities aim at putting the data in the right format.

Fig. 3. Sequence diagram of the planning and execution of the courses

Fig. 4. Data circuit between the pedagogical manager and the focal point

The execution of a business activity is often surrounded by NVA activities. For example, the filling in of the textbook by the class supervisor is preceded by its retrieval from the focal point at the beginning of the day. The activity diagram presented in Figure 5 is obtained after processing the NVA activities carried out by the different actors. The objective of the solution is to process the NVA activities in such a way that, each time a piece of information is available, no other actor has to process it manually in order to perform a business activity.

The transfer of certain activities to the software, such as the notifications, the filling in of the textbook by the class supervisor or its signature by the teacher, can be accompanied by checks and reminders to ensure that all actors have received the information intended for them or have correctly performed the activities expected of them. In the same way, the follow-up of the courses will be done directly in the system, with the integration of the class supervisors and the teachers. The interest of this approach is to eliminate the manual rework that can lead to inconsistencies and errors in the execution of a business process instance. It also limits the responsibility for the completeness of the information to the actors involved in the treatment process.
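A minimal sketch of such a check-and-remind step, under assumed data structures: after a course ends, the system verifies that the textbook entry exists and is signed, and issues a reminder otherwise. The record fields and the notify function are hypothetical, since the paper does not specify the VT System's data model.

```python
# Hypothetical check-and-remind step run by the system after each course.
# The textbook record layout and notify() are illustrative assumptions.

def notify(actor, message):
    print(f"Reminder to {actor}: {message}")

def check_textbook(course, textbook_entries):
    entry = textbook_entries.get(course["id"])
    if entry is None:
        notify(course["class_supervisor"], f"Fill in the textbook for {course['title']}.")
    elif not entry.get("signed_by_teacher"):
        notify(course["teacher"], f"Sign the textbook entry for {course['title']}.")

course = {"id": 42, "title": "Databases",
          "class_supervisor": "Class Supervisor A", "teacher": "Teacher B"}
check_textbook(course, {})                                  # reminds the supervisor
check_textbook(course, {42: {"signed_by_teacher": False}})  # reminds the teacher
```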
CONCLUSION

Functional interoperability does not imply interoperability at the software level. Similarly, technical solutions that promote collaborative working, such as Web 2.0, web services and APIs, only create communication gateways based on pre-established formats. Each system is responsible for sending data in the required format, and for receiving data using an ETL or ELT tool before its use. These operations are interoperability efforts. The data circuit proposed in this study aims to improve interoperability through a data-based evaluation, from the production of the data to its transfer to a third-party system. Data presentation thus becomes an interoperability concern. This approach breaks with the logic of evaluating systems separately and of subjecting data to repetitive processing at each interoperation, by producing the data, if possible, in a ready-to-use format.

In perspective, a global analysis of the business process according to the indications of the data circuit will enable a precise distinction to be made between business activities and NVA activities. The processing carried out on these NVA activities will make it possible, among other things, to limit the number of participants to those performing business activities, and to translate programmable tasks into functionalities managed directly at the software level.

Fig. 5. Activity diagram for scheduling and monitoring courses

REFERENCES

[1] D. Chen, G. Doumeingts, and F. Vernadat, "Architectures for enterprise integration and interoperability: Past, present and future", Computers in Industry, vol. 59, no. 7, pp. 647-659, Sept. 2008, doi: 10.1016/j.compind.2007.12.016.
[2] N. Daclin, D. Chen, and B. Vallespir, "Enterprise interoperability measurement - Basic concepts", EMOI-INTEROP, vol. 6, 2006.
[3] M. S. Camara, R. Dupas, Y. Ducq, and B. Mané, "Interoperability Improvement in Inter-Enterprises Collaboration: A Software Engineering Approach", in Enterprise Interoperability VI, K. Mertins, F. Bénaben, R. Poler, and J.-P. Bourrières, Eds., Cham: Springer International Publishing, 2014, pp. 201-211, doi: 10.1007/978-3-319-04948-9_17.
[4] G. da Silva Serapião Leal, W. Guédria, and H. Panetto, "Interoperability assessment: A systematic literature review", Computers in Industry, vol. 106, pp. 111-132, Apr. 2019, doi: 10.1016/j.compind.2019.01.002.
[5] T. Clark and R. Jones, "Organisational interoperability maturity model for C2", in Proceedings of the 1999 Command and Control Research and Technology Symposium, 1999.
[6] S. Fewell and T. Clark, "Organisational interoperability: evaluation and further development of the OIM model", Defence Science and Technology Organisation, Edinburgh, Australia, 2003.
[7] G. da Silva Serapião Leal, W. Guédria, and H. Panetto, "Interoperability assessment: A systematic literature review", Computers in Industry, vol. 106, pp. 111-132, Apr. 2019, doi: 10.1016/j.compind.2019.01.002.
[8] T. Ford, J. Colombi, S. Graham, and D. Jacques, "Survey on Interoperability Measurement", p. 67, June 2007.
[9] W. Wang, A. Tolk, and W. Wang, "The levels of conceptual interoperability model: applying systems engineering principles to M&S", arXiv preprint arXiv:0908.0191, 2009.
[10] W. Guédria, Y. Naudet, and D. Chen, "Interoperability Maturity Models - Survey and Comparison", in On the Move to Meaningful Internet Systems: OTM 2008 Workshops, R. Meersman, Z. Tari, and P. Herrero, Eds., Lecture Notes in Computer Science, vol. 5333, Berlin, Heidelberg: Springer, 2008, pp. 273-282, doi: 10.1007/978-3-540-88875-8_48.
[11] C. Campos, R. Chalmeta, R. Grangel, and R. Poler, "Maturity Model for Interoperability Potential Measurement", Information Systems Management, vol. 30, no. 3, pp. 218-234, July 2013, doi: 10.1080/10580530.2013.794630.
[12] G. Paschina, A. Clematis, G. Zereik, and D. D'Agostino, "Implementation of an interoperable interface to exchange B2B messages between heterogeneous computer platforms", 2014.
[13] H. R. Motahari Nezhad, B. Benatallah, F. Casati, and F. Toumani, "Web Services Interoperability Specifications", Computer, vol. 39, no. 5, pp. 24-32, May 2006, doi: 10.1109/MC.2006.181.
[14] IEC 62264-1: Enterprise-Control System Integration - Part 1: Models and Terminology, IEC, Geneva, Switzerland, 2013.
[15] ISO 19440:2020, Enterprise modelling and architecture - Constructs for enterprise modelling.
[16] M. S. Camara, Y. Ducq, and R. Dupas, "A methodology for the evaluation of interoperability improvements in inter-enterprises collaboration based on causal performance measurement models", International Journal of Computer Integrated Manufacturing, vol. 27, no. 2, pp. 103-119, Feb. 2014, doi: 10.1080/0951192X.2013.800235.
