
Facebook Network Appliance

White Paper

Version 7.1
FNA: White Paper | 1

Copyrights and Trademarks


© 2017 Facebook, Inc. All rights reserved.

Contents
1 What is Facebook Network Appliance (FNA)?
1.1 Content Type and Transit
1.2 Latency Based Routing
2 Network Partner Portal
2.1 Overview Page
2.2 Caching Insights
2.3 Peering
2.4 Support Program
3 Key Benefits
4 Qualification, Ordering, and Equipment Notes
4.1 ISP Qualification
4.2 Ordering Requirements
4.3 Equipment Notes
5 Equipment Delivery

1 What is Facebook Network Appliance (FNA)?


Facebook Network Appliance (FNA) is Facebook’s content caching program. FNA provides Internet Service Providers
(ISPs) with an efficient means of delivering static Facebook content from within their network. Upon deployment, an ISP
will offload a significant amount of Facebook content from its backbone network and vastly improve the Facebook user
experience.
The FNA kit consists of a Top-of-Rack (ToR) switch and four to twenty servers. The hardware is suitable for
deployment in data centers, colocation facilities, and outside plant environments; the industry-standard 19-inch form factor
allows integration into most network environments.

A High-Level Facebook Network Appliance System Overview

1.1 Content Type and Transit


Most Facebook traffic consists of static content such as pictures and video. Without FNA, this content is delivered via
peering or transit routes connected to FB-CDN AS32934.
FNA deployments allow ISPs to directly deliver static content from within their data centers, head-ends, or Points of
Presence (PoPs). This approach reduces the overall utilization of the ISP’s backbone network and traffic Round-Trip Time
(RTT).
In a typical FNA deployment, approximately 80% of Facebook traffic will be offloaded from the ISP’s backbone network
with only a single FNA node.

A Data Flow Comparison for Networks with and without FNA

FNA interoperates with networks using the following methods, common to most content caching applications:
● The external Border Gateway Protocol (eBGP) is used to signal subscriber prefixes
● Cache-miss / cache-fill and data monitoring traffic are handled via the network operator’s peering sessions with
Facebook (where available)
● FNA prefers to cache-fill via IPv6, even when a client request is received via IPv4
As Facebook traffic volume grows, FNA can become essential to an ISP’s capacity management strategy.
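The relationship between the eBGP announcements and the serving path can be sketched in a few lines. This is an illustrative model only, not Facebook’s implementation; the function name and return strings are hypothetical:

```python
import ipaddress

# Sketch of the serving decision for an FNA node (illustrative only).
# Subscriber prefixes signalled to the node over eBGP are served from the
# local cache; clients outside those prefixes reach Facebook's CDN
# (AS32934) over the ISP's normal peering or transit routes.

def serving_path(client_ip, announced_prefixes):
    addr = ipaddress.ip_address(client_ip)
    for prefix in (ipaddress.ip_network(p) for p in announced_prefixes):
        # 'in' returns False automatically when the address families differ
        if addr in prefix:
            return "fna-cache"
    return "peering/transit to AS32934"

announced = ["203.0.113.0/24", "2001:db8::/32"]
print(serving_path("203.0.113.7", announced))   # -> fna-cache
print(serving_path("198.51.100.1", announced))  # -> peering/transit to AS32934
```

The sketch also shows why announcing all subscriber prefixes matters: any prefix missing from the eBGP session is served over the longer CDN path instead of the local cache.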

1.2 Latency Based Routing


Upon installation of an FNA kit, a latency based routing test is performed to optimize traffic routing. The test evaluates
the round-trip latency of the specific ISP network by serving a sample of static content to every FNA servable netblock
the ISP announces. The measurements allow Facebook to observe latency per netblock and construct optimal routing
paths. In this manner, each FNA deployment is custom fit to the ISP’s network.
The accuracy of the latency based routing service depends on the percentage of FNA servable netblocks announced.
Announcing a large percentage of the servable netblock population allows the service to construct more optimal
(low-latency) routing paths. Announcing only a small percentage (for example, a single test netblock) prevents the
test from functioning as intended, and traffic may be routed along seemingly arbitrary paths. For this reason, every
FNA servable netblock should be provided to Facebook to optimize routing paths for the FNA deployment.
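The effect can be illustrated with a small sketch: given per-netblock latency measurements, a mapping service can pick the lowest-RTT serving site for each netblock, while netblocks that were never announced (and therefore never measured) fall back to a default, non-optimized path. All names and figures below are hypothetical:

```python
# Illustrative sketch of latency based routing (not Facebook's actual
# system). Each announced netblock is probed from the candidate serving
# sites; the site with the lowest measured RTT wins. Unannounced
# netblocks have no measurements, so they receive a default route.

def build_routing_map(measurements):
    """measurements: {netblock: {site: rtt_ms}} -> {netblock: best site}"""
    return {
        netblock: min(site_rtts, key=site_rtts.get)
        for netblock, site_rtts in measurements.items()
    }

def route_for(netblock, routing_map, default_site="global-pop"):
    return routing_map.get(netblock, default_site)

measurements = {
    "203.0.113.0/24": {"fna-node-1": 12.0, "global-pop": 180.0},
    "198.51.100.0/24": {"fna-node-1": 9.5, "global-pop": 210.0},
}
routing = build_routing_map(measurements)
print(route_for("203.0.113.0/24", routing))  # announced -> fna-node-1
print(route_for("192.0.2.0/24", routing))    # never announced -> global-pop
```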

2 Network Partner Portal


FNA deployments include access to the Network Partner Portal, a one-stop solution for FNA management resources.
The Network Partner Portal provides operators with the ability to view various system metrics and obtain system
support. The portal incorporates the same automated fleet management tools that Facebook uses in its own data centers.
Some of the tools and metrics dashboards the portal offers include:
● Overview Panel
○ Traffic Overview with Mouse-Over Call-Out
○ Order Tracker
○ Cache Request
● Caching Insights
○ Traffic Snapshots for Backbone Offload
○ Throughput and Retransmission Rates
○ FNA Kit Management Panel
● Peering Insights
○ Autonomous System Information
○ Peering Session Status
● Support Program

2.1 Overview Page


The Overview panel allows network operators to view a traffic overview graph that shows a point-in-time readout when
the mouse hovers over it. The view can be toggled between day, week, and month. You can also view your orders or
request an FNA cache (an additional FNA kit).

FNA Overview Panel: Daily, Weekly



2.2 Caching Insights


The Caching Insights panel allows network operators to view various traffic information graphs such as client traffic, cache
fill traffic, and cache fill latency.

FNA Caching Insights Dashboards



2.3 Peering
The Network Partner Portal offers the ability to view peering details associated with your autonomous system(s). Peering
details include a summary, sessions, insights, policy, and contact information.

Peering Insights Dashboard



2.4 Support Program


Support for FNA is managed through the Network Partners Support Program. This support ticket platform allows you to
create new tickets with FNA Operations (noc@fb.com), interact with open tickets, and view resolved issues.

The Network Partners Support Program Interface

3 Key Benefits
The following are some immediate benefits an ISP can expect after deploying an FNA kit within its network facility:
● Reduced Facebook content delivery latency
○ Page load times are reduced
○ Access speed is increased
○ Overall RTT is reduced
● Extended backbone capacity life cycle in exchange for minimal infrastructure resources
○ FNA offloads a significant amount of Facebook traffic from an operator’s backbone network
○ The equipment footprint is relatively small with regard to space and power requirements
● Greater traffic management and control
○ Facebook actively monitors FNA nodes to mitigate most network failure scenarios
● System and network monitoring tools
○ The Network Partner Portal enables an ISP to view operational data such as caching insights and peering
capacity

4 Qualification, Ordering, and Equipment Notes

4.1 ISP Qualification


Facebook considers the following criteria when qualifying an ISP for the FNA program. If you have any questions,
contact the FNA team (fna@fb.com) for clarification.
● The ISP is considered an ‘eyeball’ ISP (not a transit ISP).
● The installation site has at least 5 Gbps of peak Facebook traffic in a region that does not currently have a
Facebook PoP.
● The ISP can deliver Facebook content efficiently from a single deployment location.
● The installation site currently has a Facebook content delivery latency greater than 200 ms.
● The ISP is able to allocate IPv6 address space for the FNA kit.
● The installation site is located in a developing market / high-growth region.
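As a rough summary, the measurable criteria above can be expressed as a checklist. The 5 Gbps and 200 ms thresholds come from this section; the function itself, its field names, and its structure are purely illustrative:

```python
# Illustrative checklist of the FNA qualification criteria listed above.
# Only the thresholds (5 Gbps peak traffic, 200 ms latency) come from the
# white paper; everything else here is a hypothetical sketch.

def fna_qualification_issues(site):
    """Return a list of blocking issues; an empty list means the
    measurable criteria are met (Facebook makes the final call)."""
    issues = []
    if site["isp_type"] != "eyeball":
        issues.append("ISP must be an 'eyeball' ISP, not a transit ISP")
    if site["peak_fb_traffic_gbps"] < 5:
        issues.append("Peak Facebook traffic must be at least 5 Gbps")
    if site["has_regional_fb_pop"]:
        issues.append("Region must not already have a Facebook PoP")
    if site["fb_delivery_latency_ms"] <= 200:
        issues.append("Current delivery latency must exceed 200 ms")
    if not site["can_allocate_ipv6"]:
        issues.append("ISP must be able to allocate IPv6 space for the kit")
    return issues

site = {
    "isp_type": "eyeball",
    "peak_fb_traffic_gbps": 7.5,
    "has_regional_fb_pop": False,
    "fb_delivery_latency_ms": 250,
    "can_allocate_ipv6": True,
}
print(fna_qualification_issues(site))  # -> [] (no blocking issues)
```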

4.2 Ordering Requirements


The installation site must meet the following minimum space, power, and network requirements for the FNA kit.
Include this information when placing an order in the Network Partner Portal: https://partners.facebook.com/network.

Ordering Requirements and Notes


Routing and Addressing
Requirements:
● Public Autonomous System Number (ASN)
● IPv4: Public routable /26
● IPv6: Public routable /64
Notes:
● To optimize the FNA deployment, announce all prefixes
● Prefixes may be advertised to more than one FNA node for the purpose of load balancing

Router to Switch Addressing
Requirements:
● IPv4: Public routable /29, /30, or /31
● IPv6: Public routable /125, /126, or /127
Notes:
● Each FNA requires access to AS32934 for cache-fill / cache-miss traffic
● AS32934 prefixes do not need to be advertised to FNA
● FNA will not advertise any prefixes

Space (Rack Unit, RU)
Requirements:
● 2RU required per server
● 1RU required per switch
Notes:
● A minimum deployment requires 9RU

Power
Requirements:
● Maximum 750W per server
● Maximum 400W per switch
Notes:
● Typical power draw for a minimum deployment is 1kW

Cables
Requirements:
● Specify either single-mode or multi-mode optics
Notes:
● Switch-to-server Twinax cables are provided with the Stock Keeping Unit (SKU) order

Contacts and Addresses
Requirements:
● Provide an engineering contact who can answer technical questions about the network
● Provide a shipping and installation contact with technical knowledge of the installation site (may be the same as the
engineering contact)
● Provide a network operations center (NOC) contact
● Provide installation site information
Notes:
● For each contact, provide a name, title, email address, and phone number
● Provide the facility name and complete location address
● Specify a shipping address if different from the installation address
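The space and power figures above compose simply. The following sketch computes the rack units and worst-case power draw for a given server count; the per-device figures are taken from the table, but the helper itself is illustrative and not part of any Facebook tooling:

```python
# Compute rack space and worst-case power for an FNA kit, using the
# per-device figures from the table above: 2RU / 750W max per server,
# 1RU / 400W max per switch, one ToR switch per node.

RU_PER_SERVER, RU_PER_SWITCH = 2, 1
MAX_W_PER_SERVER, MAX_W_PER_SWITCH = 750, 400

def kit_footprint(servers, switches=1):
    if not 4 <= servers <= 20:
        raise ValueError("an FNA kit has between 4 and 20 servers")
    rack_units = servers * RU_PER_SERVER + switches * RU_PER_SWITCH
    max_power_w = servers * MAX_W_PER_SERVER + switches * MAX_W_PER_SWITCH
    return rack_units, max_power_w

# Minimum deployment (4 servers + 1 switch): 9RU, 3.4kW worst case.
# Typical draw for that configuration is closer to 1kW per the table.
print(kit_footprint(4))   # -> (9, 3400)
print(kit_footprint(20))  # -> (41, 15400)
```

Note the 9RU figure in the table corresponds exactly to the minimum configuration: four servers at 2RU each plus one switch at 1RU.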

4.3 Equipment Notes


The following table provides summarized equipment information and facility requirements for the hardware included with
an FNA order.

Equipment Notes

Switch
● A minimal installation requires two 10GE uplinks (four 10GE recommended) connected to the ISP network routers as a
Link Aggregation Control Protocol (LACP) group.
● Each additional group of four servers requires two more 10GE uplinks (four recommended); for example, a 12-server
deployment requires a minimum of six 10GE uplinks.

Servers
● Each group of four servers can serve a minimum of approximately 12 Gb/s of traffic. Capacity is augmented by
deploying additional groups of four servers. A single node supports up to 20 servers connected through a single ToR
switch.
● Deployments are scaled for 12+ months of traffic growth.
● Redundancy is built into each FNA node.
● Full N+1 redundancy is achieved by deploying a second node of equal size at the same site.

Cables and Interconnects
● Power cables are provided with each SKU order.
● Network cables are provided for connectivity between each server and the switch.
● Uplink optical transceivers (long reach or short reach) are included.
● Fiber jumpers for uplink connectivity from the ToR switch to the ISP router are not included.
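The scaling rules above can be captured in a few lines. This is an illustrative calculation based on the figures in the table (two required / four recommended 10GE uplinks per group of four servers, roughly 12 Gb/s served per group); the function name and return keys are hypothetical:

```python
# Illustrative scaling math for an FNA node, using the figures from the
# table above: servers are added in groups of four, each group needs a
# minimum of two 10GE uplinks (four recommended) and serves ~12 Gb/s.

def node_scaling(servers):
    if not 4 <= servers <= 20 or servers % 4:
        raise ValueError("servers are deployed in groups of 4, up to 20")
    groups = servers // 4
    return {
        "min_10ge_uplinks": 2 * groups,
        "recommended_10ge_uplinks": 4 * groups,
        "min_capacity_gbps": 12 * groups,
    }

# A 12-server deployment needs a minimum of six 10GE uplinks and serves
# at least ~36 Gb/s, matching the example in the table.
print(node_scaling(12))
```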

5 Equipment Delivery
Facebook has partnered with installation vendors to deliver and maintain FNA equipment within the ISP’s network.
The following process outlines how equipment is delivered:
1. After the order has been confirmed, Facebook asks the ISP to accept a service agreement with an FNA installation
vendor. This agreement commits the installation vendor to deliver and maintain the equipment at no cost to the ISP.
2. The ISP commits to providing adequate facility resources, including power, space, cooling, and network connectivity.
In the case of equipment failure, the FNA installation vendor coordinates the delivery of replacement components and
initiates a return merchandise authorization (RMA) of faulty components from the host site.
