
Big Data and Intelligence Analysis in the Knowledge Age

Raul J. Perez Rodriguez


October 10, 2022

The Intelligence Cycle, whose core (neuralgic) phase is analysis (elaboration), is a methodology that dates back to the beginnings of intelligence as a discipline, and it remains in full use because of the excellent results it has delivered over time. However, given the exponential development of the information and communication technologies that mark the Knowledge Age, it must be adapted and digitized to deal efficiently with the processing and analysis of the large volumes of data now generated (Big Data).

In his 1969 book "The Age of Discontinuity", Peter Drucker argued for an economic theory that places knowledge at the center of wealth production, stressing that what mattered most was not the amount of knowledge but its productivity. From this arises the Knowledge Age, characterized by the conversion of knowledge into a factor of production and by the growing value of organizations' intangible assets. All of this transforms the industrial society into a knowledge society, with its corresponding labor force: the knowledge worker.
The rise, development, and exponential growth of information and communication technologies has driven a comparable increase in the information produced and exchanged, rendering it practically unmanageable by human capacity alone. This has forced the development of data-processing software and information-analysis technologies to manage it and to produce knowledge that adds and generates value.
Big Data refers to the massive analysis of data of such magnitude that traditional software cannot manage it. The term likewise covers the new technologies that enable the capture, storage, and processing of these data, and the use made of the information and knowledge obtained through them. The concept of Big Data is relatively new, although its origins date back to the 1960s and 1970s with the first data centers and the development of relational databases. It was not until 2005-2006 that the amount of data generated by social networks and other online services became evident, prompting the development of Hadoop and later Spark: open-source frameworks created specifically to store and analyze large data sets, easy to use and cheap to run. Data production continues to grow; with 5G technology and the Internet of Things, the vast number of devices, artifacts, and objects connected to the network generate even more data, not to mention the arrival of Artificial Intelligence as a catalyst both for producing data and for processing it.
The characteristics of Big Data are generally defined in terms of the "Vs", the letter with which each of these properties begins:
 Velocity: the immediacy required in data processing.
 Variety: the diversity of techniques for processing data from numerous sources and in different formats.
 Volume: the ever-increasing amount of data, which drives the growth of applications and of the architecture built to collect and store increasingly varied data.
 Veracity: the reliability of the information collected, so as to obtain quality data.
 Value: efficiency in obtaining data that represents valuable information.
 Variability: the adaptability of the models or treatments applied to the data, given its constant change.
 Visualization: synthesizing the knowledge produced clearly and simply, in friendly, easy-to-understand graphics.
The 2007 Intelligence Glossary of the Spanish Ministry of Defense defines the Intelligence Cycle as the "process of generating and communicating new knowledge, truthful and adjusted to the needs and requirements of a user, from the obtaining and transformation of appropriate information; a sequence of activities through which information is obtained that, duly processed, becomes knowledge (intelligence) made available to a user".
The National Intelligence Center (CNI) of Spain uses four steps or phases: direction, obtaining, elaboration, and diffusion, each with its respective subphases.

[Figure: phases and subphases of the Intelligence Cycle. Source: National Intelligence Center (CNI)]

However, this traditional model has been questioned by many professionals in the field, who consider its essential principles somewhat obsolete and ill-suited to the new realities of the Knowledge Age. In recent years there have been frequent attempts to modify, improve, and transform its very nature; even so, it remains the indicative scheme and the generally accepted methodology for generating useful knowledge adjusted to the requirements of its final recipient, the decision maker.
Some authors specializing in Big Data propose an Intelligence or Information Management Cycle for handling huge amounts of data, based on four steps:
1. Information capture: aimed at knowing where the needed information is and how it can be captured. Information is everywhere; the point is knowing how to obtain it. Several methods exist, among others: web scraping, a technique that uses software programs to extract information from websites; information management through APIs created for this purpose, which facilitate communication between software components; and services such as Apache Flume, designed to collect and aggregate large volumes of data.
2. Storage: once the data is captured, it must be saved. Depending on the intended use and the type of information involved, this may mean spreadsheets or relational tables for traditional structured information, or NoSQL systems for storing unstructured information quickly and flexibly.
3. Treatment: once capture and storage are complete, the data is processed according to its type and intended use, extracting knowledge and searching for repetitive patterns in the data through statistics and artificial intelligence (machine learning).
4. Value enhancement: based on the adequate treatment and analysis of data and information, relationships are established between them that reveal patterns, producing useful knowledge aligned with the information requirements and needs of decision makers. The intelligence produced can then be disseminated in graphic reports (dashboards) that are easy to read and understand.
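The four steps above can be sketched in miniature. The following Python example, using only the standard library, is an illustrative toy, not any author's actual pipeline: the HTML snippet and word-counting heuristic are invented for the demonstration, the in-memory sqlite3 table merely stands in for a NoSQL document store, and a simple word frequency count stands in for real statistical or machine-learning treatment.

```python
import json
import sqlite3
from collections import Counter
from html.parser import HTMLParser

# --- 1. Capture: a toy web-scraping stand-in that pulls headline text
#     out of an HTML page (a real pipeline would fetch live pages or APIs).
class HeadlineParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_h2 = False

    def handle_data(self, data):
        if self.in_h2 and data.strip():
            self.headlines.append(data.strip())

# Hypothetical page content, invented for the example.
PAGE = """<html><body>
<h2>Port strike disrupts grain exports</h2>
<h2>New port terminal opens</h2>
<h2>Strike talks resume at port</h2>
</body></html>"""

parser = HeadlineParser()
parser.feed(PAGE)

# --- 2. Storage: keep each capture as a JSON document (sqlite3 in memory
#     stands in here for a flexible document/NoSQL store).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE docs (body TEXT)")
db.executemany("INSERT INTO docs VALUES (?)",
               [(json.dumps({"headline": h}),) for h in parser.headlines])

# --- 3. Treatment: search for repetitive patterns (here, recurring words;
#     real treatment would apply statistics or machine learning).
words = Counter()
for (body,) in db.execute("SELECT body FROM docs"):
    for w in json.loads(body)["headline"].lower().split():
        if len(w) > 4:                 # crude filter for short function words
            words[w] += 1

# --- 4. Value enhancement: condense the patterns into a short summary a
#     decision maker could read at a glance (a dashboard stand-in).
n_docs = db.execute("SELECT COUNT(*) FROM docs").fetchone()[0]
top = words.most_common(2)
print(f"{n_docs} documents captured")
print("recurring topics:", top)
```

Each numbered comment maps to one step of the cycle; in practice each stage would be a separate, scalable component (scrapers or Flume agents, a document database, an analytics cluster, a dashboard tool) rather than one script.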
Within the organizational environment, specifically the corporate or business world, Big Data underpins Business Intelligence: the large volume of internal data and information is processed quantitatively with analytical software designed for the purpose. Big Data is also useful for Competitive Intelligence, which deals with data and information from the organization's external environment, but there it alternates with traditional methods that give the resulting analysis its qualitative stamp.
