Facebook's network is growing at a rapid rate. Media consumption traffic on the site is handled by the 40-Gigabit Ethernet interfaces of the Wedge, Facebook's open-source network switch. The Wedge sits at the top of a rack of servers to connect them to Facebook's network. It was announced in June 2014, and thousands are already deployed in the company. The social network also open-sourced the Wedge's design so other manufacturers could build switches like it. However, the switch's 40-Gigabit Ethernet ports are proving no match for the fast-growing traffic on Facebook's network.
To handle this growth, the company announced in a blog post that it is developing a version of the switch with 100-Gigabit Ethernet interfaces. Bandwidth-intensive content on the social network includes large video files and virtual-reality content such as the 360-degree videos Facebook recently introduced.
"Whenever there is capacity, people will build stuff to consume it," said Jay Parikh, vice president of global engineering and infrastructure, at the Structure conference in San Francisco on Thursday. Facebook said its servers have grown more powerful, and as many as four servers can now be hooked up to each port of the Wedge.
The Wedge 100 will have 32 100-Gigabit ports, the same maximum port count as today's Wedge but all at the higher speed. Like the current switch, it has a non-blocking design, meaning it can feed all of its ports at full rate simultaneously if necessary. The total capacity of the Wedge 100 is thus 3.2Tbps.
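The capacity figure above follows directly from the port count and per-port speed. A minimal sketch of that arithmetic (the function name and units are illustrative, not from Facebook's design documents):

```python
def total_capacity_tbps(num_ports: int, port_speed_gbps: int) -> float:
    """Aggregate bandwidth of a non-blocking switch: every port
    running at full line rate simultaneously. Converts Gbps to Tbps."""
    return num_ports * port_speed_gbps / 1000

# Wedge 100: 32 ports at 100 Gbps each
print(total_capacity_tbps(num_ports=32, port_speed_gbps=100))  # 3.2
```

In a non-blocking design, the switching fabric must be provisioned for this full aggregate; an oversubscribed switch, by contrast, would have a fabric smaller than the sum of its port speeds.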
Notably, Facebook designs its own switches and software and makes them available to manufacturers as open source. The company runs an internally developed OS, called FBOSS, on its own switches, but it has also joined up with Big Switch Networks to offer Open Network Linux, an open-source network operating system, for others to use.