Image Compression


Muhammad Ammar Abbas
Electrical Department, Air University
Islamabad Sector E-9
210203@students.au.edu.pk

Aal e Fatima
Electrical Department, Air University
Islamabad Sector E-9
210752@students.au.edu.pk

Shahzaib Anwer
Electrical Department, Air University
Islamabad Sector E-9
210720@students.au.edu.pk

Abstract—This project investigates image compression using low-rank approximation via alternating least squares (ALS) optimization. It involves developing a QR factorization routine, implementing the ALS algorithm, and evaluating its performance on compressing grayscale images. Through experimentation and analysis, it aims to understand the efficiency and effectiveness of ALS-based compression compared to SVD methods.

I. INTRODUCTION

In response to the escalating demand for efficient visual data management, particularly in the face of storage and bandwidth constraints, this project explores low-rank approximation techniques for image compression, with a particular focus on alternating least squares (ALS) optimization. By decomposing the Frobenius norm minimization problem into manageable sub-problems, ALS iteratively optimizes the matrices representing an image to find a low-rank approximation that minimizes reconstruction error. This approach represents images with fewer parameters, making it suitable for resource-constrained environments. The project investigates ALS-based image compression comprehensively, from the development of QR factorization routines for efficient computation to the evaluation of compression quality against traditional singular value decomposition (SVD) methods. Through experimentation and analysis, it seeks to uncover the computational trade-offs and performance characteristics of ALS optimization compared to established compression techniques, paving the way for more effective and resource-efficient image compression solutions for diverse real-world applications.

II. PROCEDURE

1. Implementation of QR Factorization Routine:
The QR factorization routine takes an n x r matrix B with r <= n as input and returns an n x r matrix Q with orthonormal columns and an r x r upper triangular matrix R. We implemented the Householder QR factorization algorithm, which is numerically stable and efficient.

2. Validation of Algorithm:
To ensure our algorithm behaves as expected, we tested it with various input matrices and verified that it produces valid QR factorizations. We tested the algorithm's computational scaling by analyzing its runtime for matrices of increasing size.

3. Choice of QR Factorization Variant:
We opted for the Householder QR factorization method due to its numerical stability and efficiency. For an n x r matrix the Householder algorithm exhibits O(nr^2) computational complexity, making it suitable for large-scale applications.

4. Testing and Validation:
We tested our implementation by comparing its outputs against known QR factorizations. To demonstrate the expected scaling, we measured the algorithm's runtime for increasing matrix sizes and analyzed its computational complexity.

III. APPROACH

1. QR Factorization Implementation:
Implement a function that computes the QR factorization of a given matrix. Test your implementation to ensure it produces correct outputs for various matrices, and verify its computational scaling.

2. Algorithm 1 Implementation:
Implement Algorithm 1, the alternating least squares method, using the QR factorization routine. Break the Frobenius norm objective down into solvable sub-problems and explain this process in your report. Optimize the implementation by considering efficient block operations.

3. Image Compression:
Load the provided image file, "einstein.jpg". Compute the best rank-75 approximation of the image using your ALS implementation. Compare the result with the best rank-75 image from the SVD, using built-in routines such as those available in Matlab or Julia. Analyze and compare the qualitative and quantitative differences between the ALS and SVD compression methods.

4. Compression Rate and Runtimes:
Report the compression rate and run times of your ALS function for fitting least squares models on the image data at ranks r = 45, 55, 65, 75, 85.
5. Different Starting Points:
Choose an r value from the provided set and start your algorithm from different initial points W and Z. Compare the objective value J(W, Z) and the resulting matrices W and Z. Explain any differences or similarities observed in the objective value and the resulting matrices.

6. Method of QR Factorization:
The QR factorization is done using Householder reflections. The function takes an input matrix B of size n x r and returns the orthogonal matrix Q and the upper triangular matrix R. The Householder reflections are computed iteratively for each column of B.

IV. IMPLEMENTATION METHOD

Householder reflections are a powerful method for computing the QR factorization of a matrix. This method systematically transforms a given matrix into upper triangular form using a series of orthogonal transformations called Householder reflections. Here's an explanation of the Householder reflections method for QR factorization:

1. Householder Transformation:
A Householder transformation, also known as a Householder reflection or a Householder matrix, is a unitary matrix that reflects vectors through a hyperplane. Given a vector v in R^n, the Householder transformation H is defined as:

H = I - 2 v v^T / ||v||^2

where I is the identity matrix, v v^T is the outer product of v with itself, and ||v|| is the norm of v.

2. Householder QR Factorization:
The Householder QR factorization algorithm starts with the original matrix A and iteratively transforms it into an upper triangular matrix R by applying a series of Householder transformations. These transformations are chosen to zero out specific entries below the diagonal of A, thereby gradually transforming it into R.

3. Algorithm Steps:
Initialization: Start with A = B, the original matrix.
Iterative Transformation: For each column j of A, compute the Householder reflection H_j that zeros out the entries of the jth column below the diagonal.
Update Matrix A: Apply the Householder transformation to A from the left: A := H_j A. This step eliminates the sub-diagonal entries of the jth column.
Extract R: The upper triangular matrix R is obtained from the upper-left corner of the transformed matrix A.
Build Q: The orthogonal matrix Q is constructed as the product of all Householder transformations: Q = H_1 H_2 ... H_r (each H_j is symmetric and orthogonal, so this product undoes the transformations applied to A).

4. Result:
At the end of the process, matrix A has been transformed into an upper triangular matrix R, and the product of all Householder reflections yields an orthogonal matrix Q such that the original matrix satisfies A = QR, its QR factorization.

5. Implementation Considerations:
The choice of the vectors v for the Householder transformations is crucial for numerical stability and efficiency. Care should be taken to optimize the computation of the Householder transformations to minimize numerical errors and maximize performance.

V. TEST THE RESULT

Testing the implementation of Householder reflections for QR factorization involves verifying that the algorithm produces correct results for a variety of input matrices while also assessing its performance in terms of numerical stability and efficiency. Here's how you can test the design:

1. Correctness Testing:
Test with Identity Matrix: Start by testing the implementation with a simple identity matrix to verify that Q is indeed orthogonal and R is upper triangular.
Test with Random Matrices: Generate random input matrices of various sizes and structures. Compute the QR factorization using your implementation and verify that the product QR equals the original matrix within a certain tolerance.
Test with Known Solutions: Use matrices with known QR factorizations to validate the results produced by your implementation, comparing the computed Q and R matrices to the expected values.
Test for Numerical Stability: Introduce matrices with varying condition numbers to assess the numerical stability of the algorithm. Check whether small changes in the input matrix result in small changes in the computed QR factorization.

2. Performance Testing:
Execution Time Measurement: Measure the execution time of the algorithm for matrices of different sizes and compare the performance with alternative QR factorization methods.
Memory Usage: Monitor the memory usage of the algorithm, especially for large matrices, to ensure efficient memory management.

3. Edge Cases Testing:
Empty Matrix Handling: Test the implementation with empty matrices to ensure proper handling of edge cases.
Square vs. Rectangular Matrices: Test the implementation with both square and rectangular matrices to ensure it handles different input shapes correctly.

4. Integration Testing:
Integration with Other Algorithms: Integrate the QR factorization algorithm into larger numerical algorithms, such as least squares regression or eigenvalue computation, and verify that the integrated algorithms produce correct results.

VI. DISCUSSION ON RESULTS

An important result of implementing the Householder reflections method for QR factorization is its ability to efficiently decompose a given matrix into its orthogonal and upper triangular components. This decomposition has several significant implications and applications in various areas of mathematics, engineering, and computational science. Let's discuss some of the key aspects and implications of this result:

1. Numerical Stability:
The Householder reflections method offers superior numerical stability compared to other QR factorization methods, such as Gram-Schmidt orthogonalization. By systematically
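The alternating least squares scheme outlined in the approach above can be sketched in a few lines. The following numpy illustration is a minimal sketch, not the report's Algorithm 1: the function name, the fixed iteration count, and the use of np.linalg.lstsq as the sub-problem solver are my assumptions (the report solves the sub-problems with its own QR routine).

```python
import numpy as np

def als_low_rank(A, r, iters=100, seed=0):
    """Rank-r approximation A ~= W @ Z.T by alternating least squares:
    minimize ||A - W Z^T||_F over W with Z fixed, then over Z with W
    fixed, and repeat."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((m, r))
    Z = rng.standard_normal((n, r))
    for _ in range(iters):
        # With Z fixed, each row of W is an ordinary least squares problem.
        W = np.linalg.lstsq(Z, A.T, rcond=None)[0].T
        # With W fixed, each row of Z is an ordinary least squares problem.
        Z = np.linalg.lstsq(W, A, rcond=None)[0].T
    return W, Z

# Compare against the best rank-r approximation given by the truncated SVD.
A = np.random.default_rng(1).standard_normal((40, 30))
W, Z = als_low_rank(A, r=5)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
svd_err = np.linalg.norm(A - (U[:, :5] * s[:5]) @ Vt[:5])
als_err = np.linalg.norm(A - W @ Z.T)
# als_err approaches svd_err from above; the truncated SVD is optimal
# by the Eckart-Young theorem, so ALS can match but not beat it.
```

Holding one factor fixed makes the objective quadratic in the other, which is why each half-step reduces to a plain least squares solve.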
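To make the definition concrete, here is a small numpy sketch that builds H from a given v and applies the standard pivot choice v = x + sign(x1)||x|| e1, which reflects x onto a multiple of e1 (the function name and the sign convention are my choices, not from the report):

```python
import numpy as np

def householder_matrix(v):
    """H = I - 2 v v^T / ||v||^2: reflection through the hyperplane
    orthogonal to v."""
    v = np.asarray(v, dtype=float)
    return np.eye(len(v)) - 2.0 * np.outer(v, v) / (v @ v)

# Standard choice of v: adding sign(x1)*||x|| to the first entry avoids
# cancellation and maps x onto the first coordinate axis.
x = np.array([3.0, 4.0])
v = x.copy()
v[0] += np.copysign(np.linalg.norm(x), x[0])
H = householder_matrix(v)
# H @ x is [-5, 0] up to round-off: the entry below the first is zeroed,
# and H is both symmetric and orthogonal.
```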
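The correctness checks described above translate directly into assertions. A minimal harness, shown here against numpy's built-in np.linalg.qr as a stand-in for the routine under test (the report's own function is not listed):

```python
import numpy as np

def check_qr(qr_func, B, tol=1e-10):
    """Verify a QR routine: orthonormal Q, upper triangular R, Q R == B."""
    Q, R = qr_func(B)
    assert np.allclose(Q.T @ Q, np.eye(Q.shape[1]), atol=tol)  # orthonormal columns
    assert np.allclose(R, np.triu(R), atol=tol)                # upper triangular
    assert np.allclose(Q @ R, B, atol=tol)                     # reconstruction

rng = np.random.default_rng(0)
check_qr(np.linalg.qr, np.eye(4))               # identity matrix case
for shape in [(5, 5), (8, 3), (10, 7)]:         # square and rectangular cases
    check_qr(np.linalg.qr, rng.standard_normal(shape))
```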
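The steps above can be collected into one routine. The following numpy sketch is an illustrative implementation under the report's conventions (B is n x r with r <= n); the names, the accumulation of Q as a dense matrix, and the skip for an already-zero column are my choices:

```python
import numpy as np

def householder_qr(B):
    """QR factorization of an n x r matrix B (r <= n) via Householder
    reflections: returns Q (n x r, orthonormal columns) and R (r x r,
    upper triangular)."""
    n, r = B.shape
    A = B.astype(float).copy()     # reduced in place to triangular form
    Q = np.eye(n)                  # accumulates H_1 H_2 ... H_r
    for j in range(r):
        x = A[j:, j]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])  # v = x + sign(x1)||x|| e1
        vv = v @ v
        if vv == 0.0:              # column already zero below the diagonal
            continue
        # A := H_j A, applied only to the affected trailing rows.
        A[j:, :] -= np.outer(2.0 * v / vv, v @ A[j:, :])
        # Q := Q H_j, applied only to the affected trailing columns.
        Q[:, j:] -= np.outer(Q[:, j:] @ v, 2.0 * v / vv)
    return Q[:, :r], np.triu(A[:r, :])

# Sanity check: Q R reproduces B and Q has orthonormal columns.
B = np.random.default_rng(0).standard_normal((6, 3))
Q, R = householder_qr(B)
assert np.allclose(Q @ R, B)
assert np.allclose(Q.T @ Q, np.eye(3))
```

Applying each reflection implicitly as a rank-one update, rather than forming the full matrix H_j, is what keeps the cost at O(nr^2).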
eliminating sub-diagonal entries using orthogonal transformations, Householder reflections minimize numerical errors and floating-point round-off effects, making the method suitable for solving ill-conditioned systems of equations and for numerical simulations.

2. Orthogonality and Orthonormality:
The orthogonal matrix Q produced by Householder reflections has orthonormal columns, meaning that Q^T Q = I. This property preserves the lengths of and angles between vectors, making it useful for preserving geometrical properties and performing orthogonal transformations in applications such as rotation matrices, coordinate transformations, and signal processing.

3. Applications in Least Squares and Regression:
QR factorization is a key component in solving least squares problems, where it is used to find the least squares solution to overdetermined systems of linear equations. By decomposing the design matrix into orthogonal and upper triangular components, Householder QR factorization enables efficient and numerically stable computation of least squares solutions, with applications in regression analysis, curve fitting, and optimization.

4. Eigenvalue Computation:
QR factorization plays a crucial role in computing eigenvalues and eigenvectors of a matrix. By first transforming the original matrix into Hessenberg form using Householder reflections, eigenvalues and eigenvectors can be computed accurately and efficiently. This has applications in spectral analysis, control theory, and structural mechanics.

5. Signal Processing and Compression:
In signal processing and data compression, Householder QR factorization can be used to orthogonalize signals and reduce redundancy. The orthogonal transformations performed by Householder reflections can decorrelate signals, extract principal components, and compress data efficiently while preserving important information.

VII. CONCLUSION

In conclusion, the classical Gram-Schmidt algorithm, the Householder reflections method, and the Givens rotations method each offer a distinct approach to QR factorization, with different implications for numerical stability, efficiency, and versatility across applications. The classical Gram-Schmidt algorithm, while straightforward, suffers from numerical instability due to the accumulation of rounding errors, limiting its applicability where numerical precision is crucial. Conversely, the Householder reflections method provides a robust and efficient solution by systematically transforming matrices into upper triangular form using orthogonal transformations. This method minimizes numerical errors and floating-point round-off effects, making it well suited for solving ill-conditioned systems of equations and numerical simulations. Furthermore, the resulting orthogonal matrix with orthonormal columns facilitates applications in least squares, eigenvalue computation, and signal processing. The Givens rotations method presents an alternative approach, particularly useful in scenarios involving sparse matrices or the selective modification of elements. Although it may require more operations than Householder reflections, especially for dense matrices, Givens rotations offer advantages in specific contexts. Overall, while each method has its strengths and weaknesses, the Householder reflections method emerges as the preferred choice for QR factorization in most practical scenarios due to its numerical stability, efficiency, and versatility.
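The least squares connection can be made concrete in a few lines: with A = QR, minimizing ||Ax - b|| reduces to the triangular system R x = Q^T b, a standard identity illustrated here with numpy's built-in QR (the example data is arbitrary):

```python
import numpy as np

# Overdetermined system A x ~= b, solved via QR instead of normal equations.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

Q, R = np.linalg.qr(A)            # A = Q R with Q^T Q = I
x = np.linalg.solve(R, Q.T @ b)   # solve the triangular system R x = Q^T b

# Matches the reference least squares solution.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
assert np.allclose(x, x_ref)
```

Avoiding the normal equations A^T A x = A^T b is precisely where the numerical stability benefit mentioned above comes from, since forming A^T A squares the condition number.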
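As a sketch of this connection, the basic unshifted QR iteration repeatedly factors and recombines a matrix; its diagonal converges to the eigenvalues. Practical implementations add the Hessenberg reduction and shifts mentioned above; the small symmetric example matrix here is arbitrary, chosen only for its distinct eigenvalues:

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
T = A.copy()
for _ in range(200):
    Q, R = np.linalg.qr(T)   # factor ...
    T = R @ Q                # ... and recombine: T stays similar to A
# The diagonal of T approaches the eigenvalues of A.
```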
