
Big Data

(Data security and integrity)


Atherv Saxena (1)
Student of National PG College,
Lucknow, India.
email – athervsaxena99@gmail.com

Aniket Singh (2)
Student of National PG College,
Lucknow, India.

Abstract – Nowadays, data is the most important asset any company or organization has for its survival in the market of this changing world. A new problem is created by the continuous growth in the importance and volume of data: it cannot be handled by traditional analysis techniques. This problem was therefore addressed through the creation of a new paradigm: Big Data. Big Data gave rise to new issues related not only to the volume or the variety of the data, but also to data security and privacy. To obtain a full perspective of the present-day scenario, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security and integrity. We explain the process of systematically arranging data, the measures used to secure it, and how to maintain its integrity. It is almost impossible to carry out detailed research into the entire topic of security, because there are numerous ways to attack data; at present no organization can secure its data with one hundred percent efficiency, there is always a loophole, and there is a wide variety of organizations and massive data types.

Keywords – Big Data, security, integrity, Fourier transform, Hadoop

Introduction – We cannot imagine a world without data storage, whether big or small. If we did, this world would be a place where every detail about a person, organization or company, every transaction performed, no matter how small or large, and every piece of information that could be documented for future use would be lost directly after use. That would make organizations lose the ability to extract valuable information and perform analyses, which would lead them to fail in this fast-moving world. Anything ranging from people's names and contact information, to products sold, profits made and employees hired, has become essential to day-to-day continuity. To fulfil all these needs, data and its storage have to be there. Data is the building block upon which any organization thrives; nowadays, organizations rely completely on data for their improvements in every field.

BIG DATA is a term used to describe a massive amount of data and its storage in the most efficient form. In simple words, big organizations have their data spread across massive volumes, and they deal with its storage in an efficient manner that makes it easy to work on that data. All of this falls under big data. The efficiency of big data includes its security and integrity: the larger the amount of data, the bigger the threat of losing it or having it stolen. Today we have countless ways to protect our data and maintain its integrity, but very few are truly efficient. Organizations are continuously working on this aspect of big data, and advancements appear day by day. Traditional database systems are not able to capture, store and analyze such large amounts of data, and the amount of big data continues to grow with the increasing reach of the internet. Big data analytics provides new ways for businesses and governments to analyze unstructured data.

Big data is a term that describes any large volume of structured, semi-structured and unstructured data that has the potential to be mined for information. The three Vs that define Big Data are Variety, Velocity and Volume.

Volume: Not just text data, but also videos, music and large image files. Data is stored in terms of Terabytes and even Petabytes in different companies. We need to re-evaluate the architectures and applications built to handle the data provided by big data analytics.

Velocity: Data is streaming in at unimaginable speed and must be dealt with in a timely manner. RFID tags, sensors and smart metering are driving the need to deal with torrents of data in near-real time. Reacting quickly enough to cope with data velocity is a challenge for most organizations.

Variety: This refers to the ever-increasing forms data can take: texts, images, voice, geospatial data and computer-generated simulations. The variety of data can be characterized along several dimensions. Some of these are:

 Structural variety: the difference in the representation of the data. Satellite images of wildfires from NASA are very different from tweets sent out by people watching the fire spread.
 Media variety: the medium in which the data gets delivered. For example, the audio of a speech and the transcript of the speech represent the same information in different media.
 Semantic variety: this comes from different assumptions about the conditions on the data, such as conducting two income surveys on two different groups of people and not being able to compare or combine the results without knowing more about the populations themselves.

There are two other parameters that help define Big Data:

Variability: In addition to the increasing speed and variety of data, data flows can be highly inconsistent, with periodic peaks. Managing daily, seasonal and event-triggered peak data loads can be challenging, even more so when unstructured data is involved.

Complexity: Today's data comes from multiple sources, and it is still an undertaking to link, match and transform data across systems. It is necessary to correlate relationships, hierarchies and multiple data linkages, or your data can quickly spiral out of control. A data environment can fall along the extremes of any one of these parameters, or a combination of them, or even all of them together.

Big Data gave rise to new issues related not only to the volume or variety of the data, but also to data security and privacy. It is recorded that, of all the data in human history, 90 percent has been created in the last few years. In 2003, humans created 5 exabytes of data; that amount of information is now created within two days, and no statistic suggests this growth is slowing down. Furthermore, this data is mostly unstructured, meaning that traditional systems are not capable of analyzing it. We are living in the era of Big Data, which not only increases the scale of the privacy and security challenges addressed by traditional security management, but also creates new ones that need to be approached in a new way. The International Organization for Standardization has divided Big Data security into four principal topics in order to create a security standard for Big Data.

Objectives for data security –

Although there are attempts to come up with a security solution or definition that can address many different scenarios, we believe there is no one-size-fits-all solution for data security. Instead, multiple dimensions need to be tailored to different application aspects to achieve practical solutions. We also need the right definitions of security risk: in data-sharing scenarios, for example, the probability of re-identification given certain background knowledge could be considered the right measure of security risk. Given these three dimensions (utility, risk and cost), one can imagine a multi-objective framework in which different dimensions are emphasized:

 Maximize utility, given risk and cost constraints: this would be suited to scenarios where limiting certain security risks is paramount.
 Minimize security risks, given the utility and cost constraints: in some fields (e.g., medical care), significant degradation of the utility may not be allowed. In this setting, the parameter values of the protocol are chosen in such a way that we do our best in terms of risk, given our utility constraints. Note that in some cases there may not be any parameter settings that satisfy all the constraints.

 Cost minimization, given the utility and risk constraints: in some cases (e.g., cryptographic protocols), you may want to find the protocol parameter settings that allow for the least expensive protocol that still satisfies all the utility and cost constraints.

Data Integrity – Several data confidentiality techniques and mechanisms exist, the most notable being access control systems and encryption. We need approaches for access control systems for big data:

Merging large numbers of access control policies. In many cases, big data involves integrating data originating from multiple sources; these data may be associated with their own access control policies (referred to as "sticky policies"), and these policies must be enforced even when the data is integrated with other data. The policies therefore need to be integrated and their conflicts resolved.

Administering authorizations for big data, and in particular granting permissions. If fine-grained access control is required, manual administration of large data sets is not feasible. We need technologies by which authorization can be granted automatically, possibly based on the user's digital identity, profile and context, and on the data contents and metadata.

Enforcing access control policies on heterogeneous multimedia data. Content-based access control, by which authorizations are granted or denied based on the content of the data, is critical when dealing with video surveillance applications, which are important for security; to preserve privacy, such videos have to be protected. Supporting this kind of access control requires understanding the contents of the protected data, which is very challenging when dealing with large multimedia data sources.

Enforcing access control policies in big data stores. Some big data systems allow their users to submit arbitrary jobs written in programming languages such as Java. For example, in Hadoop, users can submit arbitrary MapReduce jobs written in Java. This gives rise to significant challenges in efficiently enforcing fine-grained access control for different users. Although there is some existing work that tries to inject access control policies into submitted jobs, more research needs to be done on how to enforce such policies efficiently in recently developed big data stores.

Automatically designing, evolving and managing access control policies. When dealing with dynamic environments where sources, users, applications and data usage are continuously changing, the ability to automatically design and evolve policies is critical: we must make sure that data is readily available for use while at the same time assuring data confidentiality. Environments and tools for managing policies are also crucial.

Integrity preserving techniques – A major issue arising from big data is that, by correlating many (big) data sets, one can extract any information, whether one is entitled to it or not. Relevant issues and research directions that need to be investigated include:

Techniques to control what is extracted, and to check that what is extracted can be used and/or shared.

Defining personal as well as population boundaries. It is important to understand what is extracted from the data, as this may lead to discrimination and affect others' privacy. Also, when dealing with security together with privacy, it is important to understand the privacy policies and copyrights defined by each party in their particular schemes.

Privacy enhancement techniques. Several such techniques exist, including oblivious RAM, secure multiparty computation, multi-input encryption and various other types of encryption. However, they are not yet practically applicable to large data sets. We need to engineer these techniques, for example using parallelization, to fine-tune their implementation, and perhaps combine them with other techniques such as differential privacy. One approach that can be used is to first process the clean or sanitized data (data that falls within a particular field) and then the non-sanitized data.

Defining data privacy policies. Policies must be designed in such a way that they are easily understandable by users. We need to understand user expectations with respect to privacy and not make the policies too hard to obey.

Data services and their terms. Data is sold with its privacy-preserving terms and policies, and data services must be listed at purchase. There should be no hidden terms and conditions that users cannot follow.

Data publication. Perhaps we should abandon the idea of publishing data, given the privacy implications, and instead require the data user to work in a controlled environment (perhaps located in a cloud). In this way, it would be much easier to control the proper use of the data.

Privacy implications on data quality. Recent studies have shown that people lie, especially on social networks, when they are not sure that their privacy is preserved. This results in a decrease in data quality that then affects the decisions and strategies based on these data.

Data Management – Data management focuses on what to do once the data is contained in the Big Data environment. It covers not only how to secure the data stored in the Big Data system, but also how to share that data.

Security at Collection or Storage – Big Data usually implies a huge amount of data. It is therefore important not only to find a means to protect data when it is stored in a Big Data environment, but also to know how to collect that data in the first place. To solve these problems, a mechanism that protects data owners' privacy by creating a parameter to measure the acceptable level of privacy should be used. Another approach suggests that storage can be secured by dividing the data stored in the Big Data system into sequenced parts and storing them with different cloud storage service providers.

Policies, Laws, or Government

Every disruptive technology brings new problems with it, and Big Data is no exception. The problems related to Big Data are mostly tied to the increasing use of the technique to obtain value from large amounts of data through its powerful analysis capabilities, which implies a risk to people's privacy. In order to reduce that risk, many authors propose the creation of new legislation and laws that will allow these new problems to be confronted in an effective manner, and some frameworks or initiatives attempt to establish robust governance of the security of the data in a Big Data system.

Sharing Algorithms

In order to obtain the maximum possible information from data, it is necessary to share that data among the cluster in which Big Data is running, or to share the results for collaboration. However, again, we have the problem of how to guarantee security and privacy while that sharing process is taking place. Some solve this problem by increasing the surveillance of the users taking part in data sharing, while others propose securing the transmission itself by creating a new technique based on nested sparse sampling and coprime sampling.

Techniques used to secure Big Data –

Data Provenance –
Provenance is a whole other category of data that needs significant protection. The CSA recommends first developing an infrastructure authentication protocol that controls access, while setting up periodic status updates and continually verifying data integrity by using mechanisms such as checksums.

Maintaining Transaction Logs –
Storage management is a key part of the Big Data security equation. Use signed message digests to provide a digital identifier for each digital file or document, and use a technique called secure untrusted data repository (SUNDR) to detect unauthorized file modifications by malicious server agents. The handbook lists a number of other techniques as well, including lazy revocation and key rotation, broadcast and policy-based encryption schemes, and digital rights management (DRM). However, there is no substitute for simply building your own secure cloud storage on top of existing infrastructure.

Distributed Programming Frameworks must be safe –
Distributed programming frameworks such as Hadoop make up a huge part of modern Big Data distributions, but they come with a serious risk of data leakage. They also come with what are called "untrusted mappers": data from multiple sources that may produce error-ridden aggregated results. The distributed modules of data must be kept safe with distinct encryption techniques, and integrity must be maintained by defining access levels and authorization. This approach may introduce complications, but from a security point of view it provides better protection than the alternatives.

Security of Non-Relational Data –
Non-relational databases such as NoSQL are common, but they are vulnerable to attacks such as NoSQL injection. Beyond the core measures, plus layers such as data tagging and object-level security, you can also secure non-relational data by using what are called pluggable authentication modules (PAM); this is a flexible method for authenticating users while making sure to log transactions using a tool such as the NIST log. Finally, there are fuzzing methods, which expose cross-site scripting and injection vulnerabilities between NoSQL and the HTTP protocol by using automated data input at the protocol, data node and application levels of the distribution.
Filtering at Endpoint and Validation –
Endpoint security is paramount, and your organization can start by using trusted certificates, doing resource testing, and connecting only trusted devices to your network with a mobile device management (MDM) solution (on top of antivirus and malware protection software). From there, you can use statistical similarity detection and outlier detection techniques to filter malicious inputs while guarding against cyber attacks.

Real-Time Complaint Service and Security Monitoring –
Although complaints are a headache for organizations, treating them as feedback means some problems or discrepancies can be solved in real time. It is best to tackle security head-on with real-time analytics at every level of the stack, rather than after the fact. We can mine logging events, deploy front-end security systems such as routers and application-level firewalls, and implement security controls throughout the stack at the cloud, cluster and application levels. Data positioning attacks can be reduced by this technique.

Encryption of data, or Cryptography of data –
Mathematical cryptography has not gone out of style; in fact, it has become far more advanced. By constructing a system to search and filter encrypted data, such as the searchable symmetric encryption (SSE) protocol, enterprises can actually run Boolean queries on encrypted data. Relational encryption allows you to compare encrypted data without sharing encryption keys, by matching identifiers and attribute values. Identity-based encryption (IBE) makes key management easier in public-key systems by allowing plaintext to be encrypted for a given identity. Attribute-based encryption (ABE) can integrate access controls into an encryption scheme. Finally, there is convergent encryption, which uses encryption keys to help cloud providers identify duplicate data.

Access control at each level –
Access control is about two core things: restricting user access and granting user access. For setting up granular access controls, we can consider some points:

 Normalize mutable elements and de-normalize immutable elements.
 Track secrecy requirements and ensure proper implementation.
 Maintain access labels.
 Track admin data.
 Use single sign-on (SSO).
 Use a labeling scheme to maintain proper data federation.

Continuous Error Checking or Auditing of Data –
Auditing at each access level is a must in Big Data security, particularly after an attack on your system. Organizations should create a cohesive audit view following any attack, and be sure to provide a full audit trail while ensuring easy access to that data in order to cut down incident response time. Audit information integrity and confidentiality are also essential: audit information should be stored separately, protected with its own user access controls, and monitored regularly. Make sure to keep your Big Data and audit data separate, and enable all required logging when setting up auditing (in order to collect and process the most detailed information possible). An open-source audit layer or query orchestrator tool such as ElasticSearch can make all of this easier to do.

Fourier masking encryption algorithm –

There are several encryption algorithms that are the outcome of extensive research in recent years. Some of them indeed provide good security, but others are vulnerable to brute-force or cryptanalytic attack. Some are easy to implement on hardware with high processing power and storage capacity, while others suit limited devices like mobile phones, PDAs etc. Most of the available cryptographic algorithms are based on number theory and use finite fields such as GF(p) or GF(2^n). Most of these number-theoretic algorithms are more secure when they use large prime numbers or large binary words [1, 2]. But as the precision and bit-width increase, the hardware on which the algorithm is implemented must become more sophisticated in processing power and storage capacity, and hence the cost rises. We have developed an encryption/decryption algorithm that exploits the versatility of the Fourier series [3, 4].

Fourier Series (FS): A series proposed by the French mathematician Fourier about the year 1807. The series involves the sines and cosines of whole multiples of a varying angle and is usually written in the form:

y = H0 + A1 sin x + A2 sin 2x + A3 sin 3x + … + B1 cos x + B2 cos 2x + B3 cos 3x + …   (1)

By taking a sufficient number of terms, the series may be made to represent any periodic function of x [4]. The original definition of the FS restricts the angles to be harmonically related, but we extend the original definition according to our needs, for compatibility with our proposed algorithm. The form of the FS that we developed is presented in the next section.

There are several encryption algorithms for symmetric-key encryption, including block cipher algorithms that operate on a single block at a time. Block ciphers can also be used for stream encryption in several modes; for example, DES (Data Encryption Standard) is a block cipher technique but can be used for stream encryption in ECB (Electronic Code-Book), CFB (Cipher Feed-Back) and Counter modes. Other encryption algorithms are AES (Advanced Encryption Standard), Blowfish, RC4 etc. All these algorithms are based on number theory (finite fields are used) and require additional storage such as S-boxes (Substitution Boxes), permutation tables, initial values etc. These algorithms are also complex in nature, as the encryption process itself is very involved; and as complexity increases, the difficulty of hardware and software implementation also rises. Computational complexity does not matter if the algorithm provides a strong defense against fraud. But if we consider developing an encryption algorithm for limited devices like mobile phones, PDAs, Bluetooth hardware etc., the storage capacity and processing power of the device are also a matter of concern. In that case, we must look for an algorithm that is simpler in hardware implementation but still holds a very strong position against intruders.

Proposed Algorithm – We have developed a function, like a Fourier series [3, 4] or trigonometric polynomial, that is used to generate an unpredictable key sequence. The function takes a password of any length chosen by the encrypting party and produces a key sequence. We found that by evaluating the Fourier series, good randomness can be achieved. The sequence is then XORed with the plaintext to get the ciphertext. The function is the heart of this algorithm and is named the 'Masking Function'.

Fourier_Masking_Encryption_Algorithm (FMEA)

Part – 1 (Construction of the Masking Function)

 Choose any password in the form of ASCII text.
 Convert the characters of the chosen password to their equivalent extended ASCII values and put the values in the vector P = {P1, P2, P3 … PM}. [Here M = number of characters in the password.]
 If M is odd, then pad a '1' at the end of the vector P. [After this padding operation PM becomes equal to '1' and the value of M is increased by one.]
 Using definition (2), construct the masking function. Once the masking function is ready, the encryption/decryption procedure is followed.

Part – 2 (Encryption/Decryption)

{Encryption}
 Take the plaintext sequence in the form of ASCII text.
 Convert the characters of the plaintext to their equivalent extended ASCII values and put the values in the vector X = {X1, X2, X3 … XN}. [Here N = number of characters in the plaintext.]
 Generate the ciphertext sequence Y = {Y1, Y2, Y3 … YN} by Yi = Xi ⊕ f(i, Xi-1), where X0 = P1.

{Decryption}
 Take the ciphertext sequence in the form of ASCII text.
 Convert the characters of the ciphertext to their equivalent extended ASCII values and put the values in the vector Y = {Y1, Y2, Y3 … YN}. [Here N = number of characters in the ciphertext.]
 Retrieve the plaintext sequence X = {X1, X2, X3 … XN} by Xi = Yi ⊕ f(i, Xi-1), where X0 = P1.


Cryptography terms

Encryption: The process of encoding data using a cryptographic technique.

Decryption: The process of unlocking encrypted data using a cryptographic technique.

Key: A secret password used to encrypt and decrypt information and data.

Symmetrical Encryption: This is the simplest kind of encryption, involving only one secret key to cipher and decipher information. The most commonly used symmetric encryption techniques are AES-192, AES-256, DES and RC4.

Asymmetrical Encryption: This is a relatively new method. It consists of two keys: a public key that is made freely available to anyone who wants to send you a message, and a private key that only you know. It is better at ensuring the security of information transmitted during communication. Commonly used algorithms are RSA, DSA and PKCS.

RSA Algorithm

RSA is an algorithm used by modern computers to encrypt and decrypt messages. It is an asymmetric cryptographic algorithm. The process involves:

1. A client (for example, a browser) sends its public key to the server and requests some data.
2. The server encrypts the data using the client's public key and sends the encrypted data.
3. The client receives this data and decrypts it.

RSA relies on the fact that it is difficult to factorize a large number.

Example:

Select two prime numbers, p = 5 and q = 13; now the public modulus is n = p*q = 65.

We need an exponent e such that e is:
 an integer,
 coprime to φ(n),
 1 < e < φ(n); let e = 5.
 The public key is made of n and e.
 We calculate φ(n) = (p-1)(q-1) = 48.
 Now calculate the private key d, d = (k*φ(n) + 1)/e; for the integer k = 3, the value of d = 29.

Convert "DE" into the number 45.

Encrypted data: c = 45^5 mod 65, thus our encrypted data becomes 15.

Decrypted data: 15^29 mod 65 = 45, which is equivalent to "DE".

// C program for the RSA asymmetric cryptographic
// algorithm. For demonstration, the values are
// relatively small compared to practical application.
#include <stdio.h>
#include <math.h>

// Returns gcd of a and h
int gcd(int a, int h)
{
    int temp;
    while (1) {
        temp = a % h;
        if (temp == 0)
            return h;
        a = h;
        h = temp;
    }
}

// Code to demonstrate the RSA algorithm
int main()
{
    // Two small prime numbers
    double p = 3;
    double q = 7;

    // First part of the public key:
    double n = p * q;

    // Finding the other part of the public key.
    // e stands for encrypt
    double e = 2;
    double phi = (p - 1) * (q - 1);
    while (e < phi) {
        // e must be co-prime to phi and smaller than phi.
        if (gcd(e, phi) == 1)
            break;
        else
            e++;
    }

    // Private key (d stands for decrypt),
    // chosen so that it satisfies d*e = 1 + k * totient
    int k = 2;  // A constant value
    double d = (1 + (k * phi)) / e;

    // Message to be encrypted
    // (12 is chosen because 20 would map to itself: 20 = -1 mod 21)
    double msg = 12;

    printf("Message data = %lf", msg);

    // Encryption: c = (msg ^ e) % n
    double c = pow(msg, e);
    c = fmod(c, n);
    printf("\nEncrypted data = %lf", c);

    // Decryption: m = (c ^ d) % n
    double m = pow(c, d);
    m = fmod(m, n);
    printf("\nOriginal Message Sent = %lf", m);

    return 0;
}

OUR NEW ENCRYPTION TECHNIQUE

The following encryption technique is an arithmetic-formula-based technique which is useful for effectively encrypting large-digit passwords, and characters too. It can be understood through a simple example: suppose, for ease, that we want to encrypt the number 91.

Step 1: For encryption we consider the unit digit, which is 1 here. Let the unit digit be q and the remaining digits be p; here q = 1 and p = 9.

Step 2: Now we make q a two-digit number by adding a 0 in the tens place, so q = 01.

Step 3: Now we use the formula p^2 + p. Here this gives 81 + 9 = 90.

Step 4: Now we square q and append the result in the units and tens places, so our encrypted data becomes 9001. Let the encrypted data be s, so s = 9001. Note that if q^2 is itself a two-digit integer (suppose we take q = 04, then q^2 = 16, not 016), we use only the units and tens places.

Step 5: To decrypt the data, we first compare the unit digit against the integers 0-9 and perform the operation stated in the following table.

DIGIT   MULTIPLIER   OPERATION USED
0       100          s - 100p
1       80           s - 80p
2       60           s - 60p
3       40           s - 40p
4       20           s - 20p
5       00           s
6       20           s + 20p
7       40           s + 40p
8       60           s + 60p
9       80           s + 80p

After performing these operations on the encrypted data s, we get a number, say r.

Step 6: To get the decrypted data, we take the square root of r. In the given case, since the unit digit is 1, we subtract 80p = 720 from s: r = 9001 - 720 = 8281. So r = 8281, whose square root gives 91.

Advantages

Since p is indistinguishable by looking at the encrypted number, the scheme is safe for encrypting valuable information.

Information with more digits is more secure; for instance, an Aadhaar card number has 12 digits, so the square of this 12-digit number is larger still.

Also, a particular digit other than the units digit could be used as the public key.

It could be combined with RSA to make decryption tougher: using the term k there, we could first encrypt the data according to this formula, then use it in the formula for the encrypted data; after decrypting and finding c, it would be further simplified accordingly.

It could be used in a multi-encryption technique with independent keys.
REFERENCES

1. Security Issues Associated with Big Data in Cloud Computing, by Venkata Narasimha Inukollu, Sailaja Arsi and Srinivasa Rao Ravuri.

2. Information Security in Big Data, by Lei Xu, Chunxiao Jiang, Jian Wang, Jian Yuan and Yong Ren.

3. Use of Digital Signature with Diffie-Hellman Key Exchange and AES Encryption Algorithm to Enhance Data Security in Cloud Computing, by Prashant Rewagad and Yogita Pawar.

4. The Rise of Big Data on Cloud Computing, by Ibrahim Abaker Targio Hashem, Ibrar Yaqoob, Sameer Ullah Khan and Abdullah Gani.