ICTNWK529 AT2 Install and Manage Complex Networks Noman Bandi
Elements
A network lets your computer connect to the Web so that you can check e-mail,
update a website, or teleconference.
It also lets you communicate locally with other computers on the same
local network.
And almost every network you are likely to build these days involves a combination
of wired and wireless connections.
The router acts as a bridge between the office network (local area network, or LAN) and
the Internet (the wide area network, or WAN), and also allows all computers connected
to it to share the connection.
A router also typically acts as an office network's DHCP server, giving each device
that connects its own private IP address.
Wireless routers also have embedded firewalls to protect a network from threats and
intrusion.
Use WPA or WPA2 security for protecting your Wi-Fi network, and never leave the
router's administrator password at its default setting.
Before setting up a business network, assess the business needs, as they will affect the network design:
o What kinds of data and files are you storing and sharing?
This section explains the basic concepts of network design and the business
requirements behind it.
To support our network-based economy, designers must work to create networks that
are available nearly 100 percent of the time. Information network security must be
designed to automatically fend off unexpected security incidents.
Using hierarchical network design principles and an organized design methodology,
designers create networks that are both manageable and supportable.
Computers and information networks are critical to the success of businesses, both
large and small.
They connect people, support applications and services, and provide access to the
resources that keep the businesses running.
To meet the daily requirements of businesses, networks themselves are becoming
quite complex.
Today, the Internet-based economy often demands around-the-clock customer service.
This means that business networks must be available nearly 100 percent of the time.
They must be smart enough to automatically protect against unexpected security
incidents.
These business networks must also be able to adjust to changing traffic loads to
maintain consistent application response times.
It is no longer practical to construct networks by connecting many standalone
components without careful planning and design.
LAN
MAN
Distinct Identity
Technical and economic feasibility
1.3 Plan network implementation to provide network services and
resources to meet business requirements
Schedule wide area connectivity following the introduction of the core switches
in the computer room, selecting a time that does not conflict with the access
layer installation.
Coordinate wide area connectivity with the telecommunication vendor providing
this portion of the network service.
Inform all employees of the scope of implementation for each phase, along with
dates and times.
Implementation of new equipment generally means systems and data will not be available
at the time of the change.
This gives employees the opportunity to plan their work around the
resulting downtime.
Security must also be planned as part of the implementation of a network in an organization.
Network assets can include network hosts (including the hosts' operating systems,
applications, and data), internetworking devices (such as routers and switches), and
network data that traverses the network.
Developing security strategies that can protect all parts of a complicated network while
having a limited effect on ease of use and performance is one of the most important and
difficult tasks related to network design.
Security design is challenged by the complexity and porous nature of modern networks
that include public servers for electronic commerce, extranet connections for business
partners, and remote-access services for users reaching the network from home, customer
sites, hotel rooms, Internet cafes, and so on.
A structured design approach helps you handle the difficulties inherent in designing
network security for complex networks.
Factors to consider in designing the network
The user should get the best response time and throughput.
Minimizing response time entails shortening delays between transmission and receipt of
data; this is especially important for interactive sessions between user applications.
Throughput means transmitting the maximum amount of data per unit of time.
The data should be transmitted within the network along the least-cost path, as long as
other factors, such as reliability, are not compromised.
The least-cost path is generally the shortest channel between devices with the fewest
intermediate components.
Low-priority data can be transmitted over relatively inexpensive telephone lines; high-priority
data can be transmitted over expensive high-speed satellite channels.
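The least-cost path described above is what shortest-path routing algorithms compute. A minimal Python sketch using Dijkstra's algorithm over a hypothetical topology (the node names and link costs are invented for illustration):

```python
import heapq

def least_cost_path(graph, src, dst):
    """Dijkstra's algorithm: returns (total cost, path) over weighted links."""
    queue = [(0, src, [src])]        # (cost so far, current node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, link_cost in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + link_cost, neighbour, path + [neighbour]))
    return float("inf"), []

# Hypothetical topology: costs could model delay, hop count, or line tariff.
net = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 1, "D": 5},
    "C": {"D": 1},
    "D": {},
}
print(least_cost_path(net, "A", "D"))  # (3, ['A', 'B', 'C', 'D'])
```

Note that the direct-looking link A-C (cost 4) loses to the three-hop route, which is exactly the "fewest intermediate components" caveat: least cost is not always fewest hops.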
Reliability should be maximized to assure proper receipt of all data.
Network reliability includes the ability not only to deliver error-free data, but also to
recover from errors or lost data.
The network's diagnostic system should be able to locate component problems and
perhaps even isolate the faulty component from the network.
2.3 Implement security strategy
A Security Strategy
A key issue in network security management is how to define a formal security policy.
A good policy specification should be easy to get right and relatively stable, even in a
dynamically changing network.
Much work has been done on automating network security management.
The architecture of a network includes hardware, software, information link controls,
standards, topologies, and protocols.
A protocol relates to how computers communicate and transfer information.
There must be security controls for each component within the architecture to assure
reliable and correct data exchanges.
Otherwise the integrity of the system may be compromised.
Managing computer and network security is easier than it may seem, especially when we
establish a clear process for management security forum duties.
2.5 Continually monitor internal and external network access for security
breaches
This section discusses the monitoring of internal and external threats and their effects.
The biggest threats to data are internal and external actors who want to steal it.
Your data is one of your biggest assets, and the right security is the only way to defend it.
You might get away with poor security and monitoring for a while, but poor defences can
lead to devastating results.
Most companies don’t even realize that internal threats are the biggest concern for
security.
Insider threats are a growing trend in the security industry, because they are the hardest to
identify and usually last the longest.
It takes months before the business determines that an insider is the root cause of a data
leak.
Even with internal threats dominating the cyber security industry, organizations must
monitor and defend against external threats.
External threats can also be coupled with internal threats.
For instance, a social engineering hacker could get an internal user to provide sensitive
credential information.
The hacker then uses this information to gain external access to the internal network.
Phishing and malware sites are external threats, but the hacker needs the employee to
open the website and provide details about their credentials.
One of the most common external threats that don't require any type of social engineering
is a distributed denial-of-service (DDoS) attack.
These threats can lead to damaging results from server downtime.
DDoS attacks can also originate from within the organization, but since it is much easier
to track the attacker, this is not as common as internal data theft.
The right router and monitoring service help prevent a successful DDoS attack.
Auditing and monitoring are the solutions for files and data that contain sensitive
information.
3. Install and configure a complex network to meet business requirements
Structured Cabling
The term Structured Cabling describes a standardised way of connecting wires, allowing
computers and electronics to communicate and network.
Structured Cabling is a type of infrastructure that supports the performance of an
organisation's network.
It acts like the glue that connects all the computers, phones and devices together.
A properly designed and installed Structured Cabling system provides a cabling
infrastructure that delivers predictable performance as well as the flexibility to
accommodate changes, maximises system availability, future-proofs your business, and
provides the capability to embrace the IoT (Internet of Things).
Normal cabling is defined as point-to-point, where a cable is run directly to and from
devices that need connectivity.
In a structured cabling system, a series of patch panels and trunks are used to create a
structure that enables devices to be connected, moved or removed without the need to pull
in new cables each time a change occurs.
The DNS protocol is used to resolve FQDN (Fully Qualified Domain Names) to IP
addresses around the world.
This allows us to successfully find and connect to Internet websites and services no matter
where they are.
In many cases, where a local DNS server is not available, we are forced to either use our
ISP's DNS servers or some public DNS server, however, this can sometimes prove
troublesome.
Today, small low-end routers have the ability to integrate DNS functionality.
In our example network, we enable the DNS service so workstations can properly resolve
Internet domains as well as local network names.
The first step is to enable the DNS service and domain lookup on the router.
Next, configure the router with a public name server; this will force the router to perform
recursive DNS lookups.
Next, configure your DNS server with the host names of your local network.
Then try to ping 'wayne' directly from your router's CLI prompt; you should receive an
answer.
At this point, you can configure your workstations to use your router's IP address as the
primary DNS server.
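Once the workstations point at the router for DNS, resolution can be sanity-checked from any host. A minimal Python sketch using the operating system's configured resolver (the hostname is an example only):

```python
import socket

def resolve(name):
    """Resolve a hostname to its IPv4 addresses using the host's configured DNS."""
    infos = socket.getaddrinfo(name, None, family=socket.AF_INET)
    return sorted({info[4][0] for info in infos})

# 'localhost' resolves via the local resolver without leaving the machine;
# substitute a local host name such as 'wayne' to test the router's DNS service.
print(resolve("localhost"))  # typically ['127.0.0.1']
```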
3.3 Install and configure servers, routers, switches or other devices to
provide network services
This section discusses how routers and servers influence network services in an
organization.
Using routing and switching technologies allows your staff, even those located in different
locations, to have equal access to all your business applications, information and tools.
The study of complex networks is a young and active area of scientific research.
Organizations operating in international markets must keep pace with rapid changes in
local market costs, capabilities and regulations.
The cost of changing a local carrier may be a major consideration due to the potential
disruption of day-to-day operations.
Examples are the Domain Name System (DNS) which translates domain names to Internet
protocol (IP) addresses and the Dynamic Host Configuration Protocol (DHCP) to assign
networking configuration information to network hosts.
Authentication servers identify and authenticate users, provide user account profiles, and
may log usage statistics.
4.1 Integrate multiple network services across network
Directory services
e-Mail
File sharing
Instant messaging
Online game
Printing
File server
Voice over IP
Video on demand
Video telephony
World Wide Web
Simple Network Management Protocol
Time service
Wireless sensor network
Many Internet Protocol-based services are associated with a particular well-known port
number, which is standardized by Internet technical governance bodies.
Different services use different packet transmission techniques.
In general, packets that must get through in the correct order, without loss, use TCP,
whereas real time services where later packets are more important than older packets use
UDP.
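The TCP/UDP distinction above can be illustrated with a minimal sketch. This Python example sends a single datagram over a local UDP socket; the loopback address and the message are illustrative only:

```python
import socket

# TCP: connection-oriented, ordered, reliable -- used when every packet must arrive.
# UDP: connectionless, no retransmission -- used for real-time voice/video, where
# waiting for a late packet is worse than losing it.

def udp_echo_once(message: bytes) -> bytes:
    """Send one datagram to a local UDP socket and read it straight back."""
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))            # the OS picks a free port
    addr = server.getsockname()
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.sendto(message, addr)             # no handshake, no delivery guarantee
    data, _ = server.recvfrom(1024)
    server.close()
    client.close()
    return data

print(udp_echo_once(b"probe"))  # b'probe'
```

On the loopback interface the datagram is effectively never lost, which makes the example deterministic; over a real network a UDP sender gets no such guarantee.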
4.2 Analyse and resolve interoperability issues
Interoperability
Interoperability is a characteristic of a product or system whose interfaces are completely
understood, allowing it to work with other products or systems, at present or in the future,
in either implementation or access, without any restrictions.
Network Interoperability is the continuous ability to send and receive data among the
interconnected networks, providing the quality level expected by the end user without any
negative impact to the sending and receiving networks.
Interoperability is the property that allows for the unrestricted sharing of resources
between different systems.
This can refer to the ability to share data between different components or machines, both
via software and hardware, or it can be defined as the exchange of information and
resources between different computers through local area networks (LANs) or wide area
networks (WANs).
Broadly speaking, interoperability is the ability of two or more components or systems to
exchange information and to use the information that has been exchanged.
Network interoperability becomes indispensable in order to achieve end-to-end
connectivity.
The more diverse networks exist, the greater becomes the need to ensure that they can
interoperate in order to make end-to-end communication possible.
There exist a host of reasons why implementing network interoperability successfully is
considered difficult.
Fundamental to all those problems is the correct balance between the telecom operator’s
liabilities and benefits associated with these activities.
From the cost perspective, designing the network architecture for interoperability implies
the willingness to accept a complex set of benefits and associated liabilities.
Telecom operators are acutely sensitive to five major liabilities:
Increased cost of acquisition associated with the addition of interoperable
network/application modes.
Added cost and complexity of adding features to achieve all network compatibility.
Increased time for acquiring a new system (time to accept interoperability features and
perform proper testing required to certify interoperability).
Increased complexity and cost associated with the management of the configuration of
interfaces.
Increased power and decreased speed to accommodate modes providing backward
compatibility.
Network interoperability, being the ability of two networks to communicate, can be
achieved in several ways: by having the two networks conform to a common protocol
standard, by defining a standard interface to which all networks need to adhere, or by
providing a gateway that translates between the two protocols.
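The gateway option above can be sketched as a translator between two hypothetical wire formats; the 'key=value' format and the field names below are invented for illustration:

```python
import json

def keyvalue_to_json(message: str) -> str:
    """Translate a legacy 'key=value;key=value' wire format into JSON."""
    fields = dict(pair.split("=", 1) for pair in message.split(";") if pair)
    return json.dumps(fields, sort_keys=True)

def json_to_keyvalue(message: str) -> str:
    """Translate JSON back into the 'key=value;...' wire format."""
    fields = json.loads(message)
    return ";".join(f"{k}={v}" for k, v in sorted(fields.items()))

# A gateway sits between the two networks and applies one function per direction.
legacy = "src=hostA;dst=hostB;type=ping"
modern = keyvalue_to_json(legacy)
print(modern)  # {"dst": "hostB", "src": "hostA", "type": "ping"}
```

Real protocol gateways must also reconcile timing, error handling, and features one side lacks, which is why the text lists them among the harder interoperability liabilities.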
Network security policies are essential elements in Internet security devices that
provide traffic filtering, integrity, confidentiality, and authentication.
Network security perimeter devices such as firewalls, IPSec, and IDS/IPS devices
operate based on locally configured policies.
Networks are subject to attacks from malicious sources.
Attacks can be from two categories: "Passive" when a network intruder intercepts data
traveling through the network, and "Active" in which an intruder initiates commands
to disrupt the network's normal operation or to conduct reconnaissance and lateral
movement to find and gain access to assets available via the network.
5. Plan, design and implement voice and video business communications
system
A Voice Communication System (VCS) is a state-of-the-art solution for ATC (air traffic
control) communication.
Based on voice-over-IP technology, it allows effective interconnection of multiple
communication systems, including UHF and VHF radios, telephones, and intercoms.
A codec
A codec is either a hardware device or a software-based process that compresses and
decompresses large amounts of data used in voice over IP, video conferencing and
streaming media.
A codec takes data in one form, encodes it into another form and decodes it at the
egress point in the communications session.
There are two types of codecs used in communications.
The first codec is typically hardware-based, and it performs analog-to-digital and
digital-to-analog conversion.
A common example is a modem used for sending data traffic over analog voice
circuits.
In this case, the term codec is an acronym for coder/decoder.
The second type of codec is now more commonly used to describe the process of
encoding source voice and video captured by a microphone or video camera in digital
form for transmission to other participants in calls, video conferences, and streams or
broadcasts.
In this example, the term codec stands for compression/decompression.
A codec's primary job is data transformation and encapsulation for transmission across
a network.
Voice and video codecs use a software algorithm running on a common processor or
in specialty hardware optimized for data encapsulation and decapsulation.
Video conferencing standards and protocols are necessary to define common means
for video encapsulation and session management.
Encapsulation standards define how video and audio are captured, converted to digital
format, transmitted between endpoints and decoded.
Session Initiation Protocol (SIP) is widely supported for video session management,
though many older systems rely on H.323.
Gateways and multipoint control units (MCUs) can make SIP and H.323 work together.
Encapsulation protocols vary in terms of vendor support and performance.
Popular encapsulation standards include the International Telecommunication Union's
(ITU) H.264, as well as VP8 for video, ITU G.711/G.722/G.729 for voice, and ITU
H.239/T.120 for data, such as screen sharing or web conferencing.
H.264 is widely supported by video conferencing vendors, while VP8 is widely used
in WebRTC-capable browsers, like Google Chrome and Mozilla Firefox.
6. Manage and support a complex network
Some of the open source tools for managing and troubleshooting networks:
o Nagios Core
o NIPAP
o Wireshark
o Ntopng Community
o pfSense
o Cacti
The biggest challenge with today's network management solutions is to proactively
identify faults before they impact end users.
OpManager detects, isolates and troubleshoots faults, and raises alarms so they can be
remediated quickly.
OpManager, an integrated network management software suite, allows you to:
Set multiple thresholds for performance metrics.
Get proactively notified for threshold violations and faults through email and SMS.
Process SNMP traps and syslogs and raise alerts.
Automatically log alarms as tickets into a service desk software.
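The multi-threshold idea can be sketched generically; the severity names and limits below are invented for illustration and are not OpManager's actual configuration:

```python
def classify(metric_value: float, thresholds: dict) -> str:
    """Return the highest severity whose threshold the value meets or exceeds."""
    severity = "clear"
    for level in ("attention", "trouble", "critical"):   # ascending severity
        limit = thresholds.get(level)
        if limit is not None and metric_value >= limit:
            severity = level
    return severity

# Hypothetical CPU-utilisation thresholds (percent).
cpu_thresholds = {"attention": 70, "trouble": 85, "critical": 95}
print(classify(60, cpu_thresholds))  # clear
print(classify(90, cpu_thresholds))  # trouble
```

A monitoring system would run such a check on each polled metric and escalate the notification channel (email, SMS, ticket) with the severity.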
In today's complex network of routers, switches, and servers, it can seem like a
daunting task to manage all the devices on your network and make sure they're not
only up and running but also performing optimally.
This is where the Simple Network Management Protocol (SNMP) can help.
SNMP was introduced in 1988 to meet the growing need for a standard for managing
Internet Protocol (IP) devices.
SNMP provides its users with a "simple" set of operations that allows these devices to
be managed remotely.
The core of SNMP is a simple set of operations (and the information these operations
gather) that gives administrators the ability to change the state of some SNMP-based
device.
SNMP usually is associated with managing routers, but it's important to understand
that it can be used to manage many types of devices.
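To make the "simple set of operations" concrete, the sketch below hand-builds a minimal SNMPv1 get-request for sysDescr.0 using basic BER encoding. This is for illustration only (short-form lengths, small integers); a real deployment would normally use an SNMP library rather than encode packets by hand:

```python
def ber(tag: int, payload: bytes) -> bytes:
    """BER TLV with short-form length (payloads under 128 bytes only)."""
    assert len(payload) < 128
    return bytes([tag, len(payload)]) + payload

def ber_int(value: int) -> bytes:
    return ber(0x02, bytes([value]))          # small non-negative integers only

def ber_oid(dotted: str) -> bytes:
    ids = [int(x) for x in dotted.split(".")]
    # The first two sub-identifiers pack into one byte; the rest here are < 128.
    return ber(0x06, bytes([ids[0] * 40 + ids[1]] + ids[2:]))

def snmpv1_get(community: str, oid: str, request_id: int = 1) -> bytes:
    varbind = ber(0x30, ber_oid(oid) + ber(0x05, b""))          # OID + NULL value
    pdu = ber(0xA0,                                             # GetRequest-PDU
              ber_int(request_id) + ber_int(0) + ber_int(0)     # id, error, index
              + ber(0x30, varbind))
    return ber(0x30, ber_int(0) + ber(0x04, community.encode()) + pdu)

packet = snmpv1_get("public", "1.3.6.1.2.1.1.1.0")  # sysDescr.0
print(packet.hex())
```

Sent as a UDP datagram to an agent's port 161, this packet would ask the device to return its system description, which is the essence of an SNMP "get" operation.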
Throughput Indicators:
This metric corresponds to the time required to transfer the request from the client to
the server or the response from the server to the client.
This value has a strong impact on the overall response time experienced by each user.
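As a worked example with illustrative figures, the transfer-time contribution to response time can be computed from payload size and link throughput:

```python
def transfer_time_ms(payload_bytes: int, throughput_bps: float) -> float:
    """Time to push a payload onto a link at the given throughput (latency excluded)."""
    return payload_bytes * 8 / throughput_bps * 1000

# A 500 KB response over a 10 Mbit/s link: serialization alone adds 400 ms,
# before any propagation delay or server processing time.
print(round(transfer_time_ms(500_000, 10_000_000), 1))  # 400.0
```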
Network automation
Network automation is a methodology in which software automatically configures,
provisions, manages and tests network devices.
It is used by enterprises and service providers to improve efficiency and reduce human
error and operating expenses.
Network automation tools support functions ranging from basic network mapping and
device discovery, to more complex workflows like network configuration management
and the provisioning of virtual network resources.
Network automation also plays a key role in software-defined networking, network
virtualization and network orchestration, enabling automated provisioning of virtual
network tenants and functions, such as virtual load balancing.
Automation can be employed in any type of network, including local area networks
(LANs), wide area networks (WANs), data centre networks, cloud networks and
wireless networks.
In short, any network resource controlled through the command-line interface (CLI)
or an application programming interface (API) can be automated.
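A minimal flavour of script-driven automation is rendering device configuration from data rather than typing it by hand; the interface names and VLAN scheme below are hypothetical:

```python
def interface_config(interfaces: dict) -> str:
    """Render access-port configuration lines from a {port: vlan} mapping."""
    lines = []
    for port, vlan in sorted(interfaces.items()):
        lines += [
            f"interface {port}",
            " switchport mode access",
            f" switchport access vlan {vlan}",
            "!",
        ]
    return "\n".join(lines)

# Hypothetical access-layer port plan; the generated text would be pushed
# to the device over its CLI or API by the automation tool.
ports = {"GigabitEthernet0/1": 10, "GigabitEthernet0/2": 20}
print(interface_config(ports))
```

Generating configuration from a single source of data is what eliminates the copy-paste errors the text attributes to manual CLI work.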
A network server is a computer system used as the central repository of data and the
various programs shared by users in a network.
There are several categories of interfaces, platforms and protocols used to execute
script-driven or software-based network automation.
The CLI is the most traditional vehicle for deploying network automation.
Though freely available, time-tested and highly customizable, it requires proficiency
in CLI syntax.
Remote management is one way through which people can easily manage their websites
from their own computer through a server, supported on Windows 7, Windows Vista or
Windows XP.
Today almost every device is connected to the internet.
Some of the best tools for remote management are listed below:
o TeamViewer
o Ammy admin
o ISL Light
o UltraVNC
o Remote Desktop
TeamViewer
It is the best remote management tool available.
It has established itself as a reliable service, giving home users the opportunity to support
friends for free.
Corporate licenses, while not free, are not expensive, and promotions from time to time
provide additional discounts.
We can instruct a user to download a small program.
Once it is run, the user is presented with a code, which you use to tap into the remote
support session.
We can also install TeamViewer on multiple computers/servers and connect whenever
needed.
Ammy admin
Ammy Admin is known for its ease of use. It works behind NAT, meaning that we can
easily connect to computers which are connected to the internet but behind a firewall.
No special configuration is required.
A connection is encrypted and secure.
It does not require installation to run. It also has a built-in chat and file
manager so that you can communicate and transfer files easily.
ISL Light
ISL Light does not provide a free version, but it offers a 15-day trial instead.
While we can use Ammy and TeamViewer to support friends for free, ISL Light is
oriented more towards enterprise clients.
Looking at their website, you will quickly find that it is used by some of the most elite
companies around the world.
They also feature integrations with various services including AVG Business
Managed Workplace.
UltraVNC
UltraVNC is an open source project.
Like RealVNC, it uses the VNC protocol.
It is actively developed, and while UltraVNC does not have a cloud component, the
developers have created a special version for cloud use, available in the downloads
section.
Remote Desktop
Remote Desktop is integrated into Microsoft Windows.
You can find the icon in the programs list, but many admins are used to launching the
program via the search function, by typing mstsc.
While Remote Desktop is great for connecting to servers and computers, it can be
improved with a Remote Desktop Connection Manager.
7. Test network functionality and obtain sign-off
Network performance refers to measures of service quality of a network as seen by the
customer.
There are many different ways to measure the performance of a network, as each
network is different in nature and design.
Performance can also be modelled and simulated instead of measured; one example
of this is using state transition diagrams to model queuing performance, or using a
network simulator.
Once the network performance is analysed, we need to obtain sign-off from
management.
This section discusses the tools for testing and the method for recording the results.
Network emulation is a technique for testing the performance of real applications over
a virtual network.
This is different from network simulation where purely mathematical models of traffic,
network models, channels and protocols are applied.
The aim is to assess performance, predict the impact of change, or otherwise optimize
technology decision-making.
Network emulation is the act of introducing a device to a test network (typically in a
lab environment) that alters packet flow in such a way as to mimic the behaviour of a
production, or live, network — such as a LAN or WAN.
This device may be either a general-purpose computer running software to perform the
network emulation or a dedicated emulation device which usually does link
emulation.
Two open source network emulators are Common Open Research Emulator (CORE)
and Extendable Mobile Ad-hoc Network Emulator (EMANE).
They both support operation as network black boxes, i.e. external machines/devices
can be hooked up to the emulated network with no knowledge of the emulation.
They also both support both wired and wireless network emulation with various
degrees of fidelity.
CORE is more useful for quick network layouts and single machine emulation.
EMANE is better suited for distributed high fidelity large scale network emulation.
The above results should be documented and recorded.
The document should contain:
o For Project: [project name]
o Function or Task: [name]
o Project Team:
o Core Team:
o Extended Team:
o Submitted for testing on: [date]
o Tested by: on: [date]
o Proof of Compliance:
o User Acceptance Checklist completed
o Test Results attached
o Outstanding issues with resolution plans attached