Facebook Takes Its Artificial Intelligence ‘Big Sur’ Server Open Source
Big Sur is Open Rack-compatible and holds eight high-performance GPUs, which Facebook says have improved its AI computing ability four-fold since the previous iteration of the hardware. Systems like it also serve other uses, including fraud detection and email spam filtering, and will be crucial to ventures such as self-driving cars; vendors ranging from Google and IBM to Microsoft, Apple and Baidu are growing their investments in the space, which creates more room for advances in machine learning and AI. In addition, Big Sur offers “flexibility to configure between multiple PCI-e topologies”, and Facebook notes that distributing training across the eight GPUs lets it scale the size and speed of its networks by another factor of two.
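As a rough, hypothetical sketch (not Facebook's own training code), this is what distributing training across the GPUs in a Big Sur-class box looks like in a modern framework such as TensorFlow, which is mentioned later in this article; the model and dataset here are placeholder examples.

```python
import tensorflow as tf

# Data-parallel training: replicate the model on every visible GPU and
# split each batch across them (illustrative sketch, not Facebook's code).
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)  # eight on a Big Sur-class machine

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(256, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# A larger global batch is sharded across the replicas automatically,
# which is how adding GPUs scales the size and speed of training.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
model.fit(x_train, y_train,
          batch_size=256 * strategy.num_replicas_in_sync,
          epochs=1)
```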
The social networking giant said that releasing the blueprints of Big Sur would help researchers and other tech firms benefit from the constant tweaking of Facebook’s engineers, and could also drive down the cost of this kind of innovation. High-performance computers generate a lot of heat, and keeping them cool can be costly, so Big Sur was designed for efficient operation and serviceability. All the handles and levers that technicians are supposed to touch are colored green so the machines can be serviced quickly, and even the motherboard can be removed within a minute. With the next OCP Summit taking place in March 2016, we’d wager Facebook will tell us more either before, or on, that date.
Facebook calls the unit “toolless”, as only one part of the design requires a screwdriver to get inside.
It’s not sharing the design to be altruistic: Facebook hopes others will try out the hardware and suggest improvements.
Facebook would not say how much it has invested in machine learning research or in building the GPU server. Several factors have driven recent progress in the field; one is that large data sets used to train the systems have become publicly available. The system isn’t just about raw compute power, though.
Over the past several years, Facebook has open-sourced infrastructure components and designs, and has developed software that can read stories, answer questions about scenes, play games, and even learn unspecified tasks. But the company says it realized that truly tackling these problems at scale would require it to design its own systems.
Facebook is being touted as the first company to adopt Nvidia’s Tesla M40 GPU accelerator, which debuted last month. Leveraging Nvidia’s Tesla Accelerated Computing Platform, the two companies claim Big Sur is twice as fast as Facebook’s previous generation of machines, which were built using off-the-shelf components and designs. The server can also be used to run Google’s TensorFlow.
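As a quick, hypothetical check (not taken from the article or from Facebook), this is how one would confirm that TensorFlow sees the GPUs in such a server and pin a computation to one of them:

```python
import tensorflow as tf

# List the accelerators TensorFlow can see; a Big Sur-class machine would report eight.
gpus = tf.config.list_physical_devices("GPU")
print(f"Visible GPUs: {len(gpus)}")

# Place a small computation on the first GPU, if one is present.
if gpus:
    with tf.device("/GPU:0"):
        a = tf.random.normal([1024, 1024])
        b = tf.random.normal([1024, 1024])
        c = tf.matmul(a, b)
    print("Computed on:", c.device)
```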