
INT 222

ADVANCED WEB
DEVELOPMENT
By: Divya Thakur
UID: 28300
Contents
• JSON
• BUFFER
• STREAM
• COMPRESS/DECOMPRESS USING ZLIB
JSON

• It stands for JavaScript Object Notation.


• It is a lightweight format for storing and transporting data.
• It is often used when data is sent from a server to a webpage.
• The JSON format is syntactically similar to the code for creating
JavaScript objects. Because of this, a JavaScript program can easily
convert JSON data into JavaScript objects.
• Since the format is text only, JSON data can easily be sent between
computers.
JSON Syntax Rules
• JSON syntax is derived from JavaScript object notation syntax:
• Data is in name/value pairs
• Data is separated by commas
• Curly braces hold objects
• Square brackets hold arrays
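A small JSON document illustrating all four rules (the field names are just an example):

{
  "course": "INT 222",
  "topics": ["JSON", "Buffer", "Stream"],
  "teacher": { "name": "Divya Thakur" }
}

Here each name/value pair is separated by commas, the curly braces hold objects, and the square brackets hold the topics array.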

JSON Data - A Name and a Value


• JSON data is written as name/value pairs (aka key/value pairs).
• A name/value pair consists of a field name (in double quotes), followed by a
colon, followed by a value:
• JavaScript has a built-in function for converting JSON strings into
JavaScript objects:
JSON.parse()
• JavaScript also has a built-in function for converting an object into a
JSON string:
JSON.stringify()
• JSON holds data in the form of key/value pairs, just like objects.
• The main difference between a JavaScript object and JSON is that in
JSON both the keys and the string values must be in "double quotes".
Example
• Create a simple object as given below:
const data = {
  name: "divya",
  age: 0,
  department: "cse"
};
console.log(data); // to access the object data

o/p : { name: 'divya', age: 0, department: 'cse' }


Now, to convert the above object to a JSON string:
• the JSON.stringify() method is used:
const jsonData = JSON.stringify(data);
console.log(jsonData);
o/p :
{"name":"divya","age":0,"department":"cse"}

• It is important to use double quotes for the property names as well as the
string values, as shown in the example above.
• If we try to access a property of the JSON string directly, we will not get
the data, not even with dot notation, because the string is plain text. There
are various ways to access the properties, but the string must first be
parsed back into an object.
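For instance, continuing the example above:

console.log(jsonData.name); // undefined - jsonData is just a string
console.log(JSON.parse(jsonData).name); // 'divya'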
To convert a JSON string back into a JavaScript object
• the JSON.parse() method is used:
const objdata = JSON.parse(jsonData);
console.log(objdata);
o/p:
{ name: 'divya', age: 0, department: 'cse' }
TASK
• Create an object.
• Convert it into JSON.
• Add this data to another file.
• Read the data from that file.
• Convert it back to a JS object.
• Show the data on the console.
• Show the JSON data on the browser via a server.
• Create a custom JSON package and serve its data on the browser.
A sketch of one possible solution follows.
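A minimal sketch of one way to do this, assuming the file name data.json and port 3000 (both are illustrative choices, not part of the task statement):

const fs = require('fs');
const http = require('http');

// Create an object and convert it into JSON
const data = { name: "divya", department: "cse" };
const jsonData = JSON.stringify(data);

// Add this data to another file
fs.writeFileSync('data.json', jsonData);

// Read the data from that file and convert it back to a JS object
const restored = JSON.parse(fs.readFileSync('data.json', 'utf8'));
console.log(restored); // show the data on the console

// Show the JSON data on the browser via a server
http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(fs.readFileSync('data.json'));
}).listen(3000);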
Introduction to Buffer Module
The Buffer module is commonly used to
work with binary data directly. It is
a part of Node.js and allows you to
work with raw binary data in a
variety of encodings.
Buffer Module
• The buffers module provides a way of handling streams of binary data.
• Buffer objects are used to represent a fixed-length sequence of bytes. Many
Node.js APIs support Buffers.
• The Buffer object is a global object in Node.js, and it is not necessary to
import it using the require keyword.
• The Buffer class in Node.js is used to perform operations on raw binary data.
• Generally, a Buffer refers to a particular location in memory.
• Buffers and arrays have some similarities, but the difference is that an array
can hold elements of any type and can be resized, while a buffer deals only
with binary data and has a fixed length.
• Each integer in a buffer represents a byte. The console.log() function is used
to print a Buffer instance.
But why work with binary data
directly?
• Performance: Operations on buffers are generally faster compared to
string manipulation when dealing with binary data, especially when
working with large datasets.
• Flexibility: Buffers can be sliced, copied, concatenated, and
manipulated in various ways to suit different use cases.
• Interoperability: Buffers can be easily converted to and from other
data types, such as strings, arrays, and TypedArrays, facilitating
interoperability with different parts of an application or external
systems.
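A quick sketch of that interoperability, converting a buffer to and from strings, plain arrays, and typed arrays:

const buf = Buffer.from('abc', 'utf-8');

// Buffer -> string
console.log(buf.toString('utf-8')); // 'abc'

// Buffer -> plain array of byte values
console.log([...buf]); // [ 97, 98, 99 ]

// Buffer <-> Uint8Array (a Buffer is itself a Uint8Array subclass)
const typed = new Uint8Array(buf);
console.log(Buffer.from(typed).toString('utf-8')); // 'abc'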
Creating Buffers
1. const buf1 = Buffer.alloc(10);
// Creates a zero-filled Buffer of length 10.
2. const buf2 = Buffer.alloc(10, 1);
// Creates a Buffer of length 10,
// filled with bytes which all have the value `1`.
3. const buf3 = Buffer.allocUnsafe(10);
buf3.fill(0)
console.log(buf3)
This method allocates a new uninitialized Buffer of the specified size; its
content is not initialized and may contain old data from memory, so it is
important to fill the buffer before using it.
// This is faster than calling Buffer.alloc(), but the returned buffer instance
// might contain old data that needs to be overwritten using fill(), write(), or
// other functions that fill the Buffer's contents.
Concatenating Buffers:
const buffer1 = Buffer.from('Hello', 'utf-8');
const buffer2 = Buffer.from(' World', 'utf-8');

// Concatenating buffers
const concatenatedBuffer = Buffer.concat([buffer1, buffer2]);
console.log(concatenatedBuffer.toString('utf-8')); // Output: Hello World
Writing and Reading Data:
• You can write data into the buffer and read it back:

// Allocate a buffer large enough for the data
const buffer = Buffer.alloc(5);

// Writing data to the buffer
buffer.write('Hello', 'utf-8');
// Reading data from the buffer
const data = buffer.toString('utf-8');
console.log(data); // Output: Hello
4. const buf4 = Buffer.from([1, 2, 3]);
// Creates a Buffer containing the bytes [1, 2, 3].
5. const buf6 = Buffer.from('tést');
// Creates a Buffer containing the UTF-8-encoded bytes for the string 'tést':
• In hexadecimal notation
const buf6 = Buffer.from('tést', 'utf-8');
const hexRepresentation = buf6.toString('hex');
console.log(hexRepresentation);
• In decimal notation:
const buf6 = Buffer.from('tést', 'utf-8');
for (const byte of buf6) {
  const decimalValue = byte.toString();
  console.log(decimalValue);
}
Example
• // Create a Buffer from a string
const dataString = 'Hello, World!';
const buffer = Buffer.from(dataString, 'utf-8');
• // Accessing the data in the buffer
for (const byte of buffer) {
  console.log(byte);
}
• // Modifying the buffer
buffer[7] = Buffer.from('z')[0];
• // Convert the modified buffer back to a string
const modifiedString = buffer.toString('utf-8');
console.log(modifiedString); // Output: Hello, zorld!
Buffers and character encodings
• When converting between Buffers and strings, a character encoding
may be specified. If no character encoding is specified, UTF-8 will be
used as the default.
const buf = Buffer.from('Divya', 'utf8');
1. console.log(buf.toString('hex'));
2. console.log(buf.toString('base64'));
The same can be done for the various other encodings.
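The conversion works in both directions; for example, round-tripping a string through base64:

const encoded = Buffer.from('Divya', 'utf8').toString('base64');
console.log(encoded); // 'RGl2eWE='
console.log(Buffer.from(encoded, 'base64').toString('utf8')); // 'Divya'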
Buffers and iteration
• Buffer instances can be iterated over using for...of syntax:

const mybuffer = Buffer.from([1, 2, 3]);


for (const b of mybuffer) {
  console.log(b); // prints 1, 2, 3
}
Processing URL,
QueryString and Paths
https://www.example.com/products?id=123&page=1#overview
What are URLs?
URL stands for Uniform Resource Locator. It is a
reference, or an address used to locate resources
on the internet, such as web pages, images,
documents, and other files.

Ex: https://www.google.com/search?q=Value+of+pi
A typical URL has the following
components:
• Protocol: This indicates the protocol used to access the resource, such as
HTTP, HTTPS, FTP, etc.
• Domain: This specifies the domain name or IP address of the server where
the resource is hosted.
• Path: This is the specific location of the resource on the server's file system.
• Query String: This is optional and is used to pass parameters to the resource.
It starts with a question mark (?) and consists of key-value pairs separated by
ampersands (&).
• Fragment: Also optional, this specifies a specific section within the resource,
typically used in HTML documents to navigate to a specific part of a webpage.
https://www.example.com/products?id=123&page=1#overview

• Protocol: https://
• Domain: www.example.com
• Path: /products
• Query String: id=123&page=1
• Fragment: overview
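These components can also be pulled apart programmatically; a small sketch using the url module covered on the next slide:

const url = require('url');
const parsed = url.parse('https://www.example.com/products?id=123&page=1#overview', true);
console.log(parsed.protocol); // 'https:'
console.log(parsed.host);     // 'www.example.com'
console.log(parsed.pathname); // '/products'
console.log(parsed.query);    // { id: '123', page: '1' }
console.log(parsed.hash);     // '#overview'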
Processing URLs:
const url = require('url');

const urlString = 'http://example.com/path?foo=bar&baz=qux';


const parsedUrl = url.parse(urlString, true); // parse the URL string

console.log(parsedUrl.pathname); // Outputs: '/path'

console.log(parsedUrl.query); // Outputs: { foo: 'bar', baz: 'qux' }
Processing QueryString
// require query string
const querystring = require('querystring');
//Query string to parse
const queryString = 'foo=bar&baz=qux';
// Parse the query string: the querystring.parse() method is called with
// the queryString as an argument.
// This method parses the query string into an object where each key-value
// pair corresponds to a parameter in the query string.
const parsedQuery = querystring.parse(queryString);

console.log(parsedQuery); // Outputs: { foo: 'bar', baz: 'qux' }
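The module also works in the opposite direction: querystring.stringify() turns an object back into a query string.

const qs = querystring.stringify({ foo: 'bar', baz: 'qux' });
console.log(qs); // Outputs: 'foo=bar&baz=qux'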
Create a Node.js application that utilizes the
stream and URL modules to handle data
submitted through a form on the '/submit'
route using the 'GET' method, ensuring that
the received data is saved into a file on the
backend?
const http = require('http');
const fs = require('fs');
const url = require('url');

http.createServer((req, res) => {
  let parsedURL = url.parse(req.url, true);
  console.log(parsedURL);
  console.log(parsedURL.query.name);
  if (parsedURL.pathname == '/') {
    let readableStream = fs.createReadStream('public/index.html');
    readableStream.pipe(res);
  } else if (parsedURL.pathname == '/submit' && req.method == "GET") {
    let writableStream = fs.createWriteStream('form_data.txt');
    let query = parsedURL.query;
    writableStream.write(query.name + '\n');
    writableStream.write(query.email);
    writableStream.on('finish', () => {
      console.log("Form has been saved Successfully");
    });
    writableStream.end();
    res.end("Data has been successfully saved");
  }
}).listen(4000);
Whaaat is this, Stream?
Streams are sequences of data made
available over time. Rather than reading or
writing all the data at once, streams allow
you to process data piece by piece, which is
particularly useful when dealing with large
datasets or when real-time processing is
required.
So, Stream Module….

In essence, the Stream module enables
efficient handling of data streams, allowing
developers to process data in smaller,
manageable chunks, leading to better
performance.
• The node:stream module provides an API for implementing the
stream interface.
• The Stream module provides a way of handling streaming data.
• Each type of stream is an EventEmitter instance and emits several
events at different points in time.
• Some of the most commonly used events are:
• data: fired when there is data available to read.
• end: fired when there is no data left to read.
• error: fired when there is an error receiving or writing data.
• finish: fired on a writable stream when all data has been flushed to the
underlying system.
• To access the node:stream module:
const stream = require('node:stream');
• Using streams can lead to more efficient and scalable applications,
especially when dealing with large amounts of data. They also provide
better memory management compared to traditional synchronous file
reading or writing.
Reading data

const fs = require('fs');

// Create a readable stream with a smaller chunk size (e.g., 64 bytes)
const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8', highWaterMark: 64 });

// Listen for the 'data' event, which indicates that a chunk of data is available
readableStream.on('data', (chunk) => {
  console.log('Received chunk of data:');
  console.log(chunk);
});

// Listen for the 'end' event, which indicates that all data has been read
readableStream.on('end', () => {
  console.log('Finished reading data from the file.');
});

// Listen for the 'error' event, in case of any errors during reading
readableStream.on('error', (err) => {
  console.error('Error reading data:', err);
});
Writing Data

const fs = require('fs');

// Create a writable stream to a file
const writableStream = fs.createWriteStream('output.txt');

const chunk = "hello world";

// Write data to the stream
writableStream.write(chunk);

// End the writable stream
writableStream.end();

// Listen for the 'finish' event to know when writing is complete
writableStream.on('finish', () => {
  console.log('Data has been written to the file.');
});

// Handle errors that may occur during writing
writableStream.on('error', (err) => {
  console.error('Error writing to the file:', err);
});
TASK

• Create two files: stream.js and stream.txt.
• Suppose stream.txt is on the server; we want to read stream.txt through
the stream.js file, then run the server and show the content of
stream.txt on the browser through stream.js.

• Note: Use the concept of events.


const fs = require("fs");
const http = require("http");
const server = http.createServer();
server.on('request', (req, res) => {
  fs.readFile('stream.txt', (err, data) => {
    if (err) throw err;
    res.end(data.toString());
  });
});
server.listen(3000, () => {
  console.log("listening on port 3000");
});
Do the same task with stream
• Create a readable stream.
• Handle stream events.
1. createReadStream():
const rstream = fs.createReadStream("filename");
The data will be read chunk by chunk in a continuous manner.
Now, to read the data, an event will be fired => 'data'
rstream.on('data', (chunkdata) => {
  res.write(chunkdata);
});
const fs = require("fs");
const http = require("http");
const server = http.createServer();
server.on('request', (req, res) => {
  const rstream = fs.createReadStream("stream.txt");
  rstream.on('data', (readdata) => {
    res.write(readdata);
  });
  rstream.on('end', () => { // end the response once all chunks are sent
    res.end();
  });
});
server.listen(3000, () => {
  console.log("listening on port 3000");
});
Task
• Using the 'error' event, check whether the file exists or not, and print the
result on the browser.
Trying to access a file that does not exist:

rstream.on('error', (err) => {
  console.log(err);
  res.end("file does not exist");
});
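Put together with the server from the previous slide, a minimal sketch (the file name missing.txt is only an illustrative stand-in for a file that does not exist):

const fs = require("fs");
const http = require("http");
http.createServer((req, res) => {
  const rstream = fs.createReadStream("missing.txt"); // assumed missing file
  rstream.on('data', (readdata) => { res.write(readdata); });
  rstream.on('end', () => { res.end(); });
  rstream.on('error', (err) => {
    console.log(err);
    res.end("file does not exist");
  });
}).listen(3000);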
Piping Streams
The process of connecting the output of
one stream to the input of another
stream. This allows you to easily transfer
data from one stream to another without
manually handling the data flow
const fs = require('fs');

// Create a readable stream to read data from a source file
const readableStream = fs.createReadStream('example.txt', 'utf8');

// Create a writable stream to write data to a destination file
const writableStream = fs.createWriteStream('destination.txt');

// Pipe the data from the readable stream to the writable stream
readableStream.pipe(writableStream);

// Listen for the 'finish' event on the writable stream
writableStream.on('finish', () => {
  console.log('Data piped successfully from source to destination.');
});
// Listen for the 'error' event on the readable stream, in case of any errors during reading
readableStream.on('error', (err) => {
  console.error('Error reading data:', err);
});

// Listen for the 'error' event on the writable stream, in case of any errors during writing
writableStream.on('error', (err) => {
  console.error('Error writing data:', err);
});
Test your Knowledge
Design a Node.js server using the HTTP and FS
modules to efficiently read the contents of a file
('example.txt') and stream it to another file
('example2.txt') when a client accesses the
server's root URL ('/')?
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  if (req.url == '/') {
    const readStream = fs.createReadStream('example.txt', { highWaterMark: 8 });
    const writeStream = fs.createWriteStream('example2.txt');
    readStream.pipe(writeStream);
    // Respond once the copy has finished
    writeStream.on('finish', () => res.end('Doneeeee'));
  }
}).listen(5000);
Compressing and
Decompressing Data using
Zlib
zlib is a built-in Node.js module that
provides compression and decompression
functionalities using the zlib library, which is
a general-purpose data compression library.
It supports various compression algorithms.
• Here are some key features and functionalities of the zlib module:

• Compression: zlib provides methods for compressing data using algorithms
like Gzip and DEFLATE. These compression methods reduce the size of the
data, making it suitable for transmission over networks or storage on disk.
• Decompression: It also supports decompressing compressed data using the
same algorithms. This allows you to restore the original data from
compressed representations.
• Stream Interface: zlib provides stream-based interfaces for both compression
and decompression. This means you can process data incrementally, which is
useful for large datasets or data streams.
• Error Handling: The module provides error handling mechanisms, allowing
you to handle errors that may occur during compression or decompression
operations.
Compression and Decompression using zlib
const zlib = require('zlib');

// Example data
const input = 'Hello, world!';

// Compress the data


zlib.gzip(input, (err, compressedData) => {
  if (err) {
    console.error('Error compressing data:', err);
    return;
  }

  // Decompress the data
  zlib.gunzip(compressedData, (err, decompressedData) => {
    if (err) {
      console.error('Error decompressing data:', err);
      return;
    }
    console.log('Decompressed data:', decompressedData.toString());
  });
});
Test your Knowledge

How can you create a Node.js server that
serves a specific text file, compresses it with
gzip encoding, and dynamically responds to
HTTP requests? Provide a detailed code
solution.
Import Necessary Files

const http = require('http');
const fs = require('fs');
const zlib = require('zlib');
Our Main Code

const server = http.createServer((req, res) => {
  const filePath = 'example.txt';
  const readStream = fs.createReadStream(filePath);

  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Content-Encoding': 'gzip' // Setting the content encoding to gzip
  });

  // Compressing the file and piping it to the response stream
  readStream.pipe(zlib.createGzip()).pipe(res);

  readStream.on('error', (err) => {
    console.error('Error reading file:', err);
    res.statusCode = 500;
    res.end('Internal Server Error');
  });
});
Finishing Touches….

const PORT = 3000;

server.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});
Node.js createGzip method: Compress File
• Let's see a simple example of Node.js ZLIB module to compress a file
"test.txt" into "test.txt.gz".
const zlib = require('zlib');
const gzip = zlib.createGzip();
const fs = require('fs');
const inp = fs.createReadStream('test.txt');
const out = fs.createWriteStream('test.txt.gz');
inp.pipe(gzip).pipe(out);
Node.js createUnzip method: Decompress File
const zlib = require('zlib');
const unzip = zlib.createUnzip();
const fs = require('fs');
const inp = fs.createReadStream('input.txt.gz');
const out = fs.createWriteStream('input2.txt');
inp.pipe(unzip).pipe(out);
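One caveat: pipe() does not forward errors from one stream to the next. A more robust sketch using stream.pipeline, which reports errors from any stage through a single callback:

const zlib = require('zlib');
const fs = require('fs');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('test.txt'),
  zlib.createGzip(),
  fs.createWriteStream('test.txt.gz'),
  (err) => {
    if (err) {
      console.error('Compression failed:', err);
    } else {
      console.log('Compression succeeded');
    }
  }
);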
