Bare Metal Blog: FPGAs: Reaping the Benefits

All the goodness FPGAs bring to hardware in general, and ADC hardware in particular.

In two previous installments, I talked at a high level about the uses of FPGAs, risk mitigation, and the potential benefits. Today I’d like to delve into the benefits that the industry in general, and F5 in particular, gain from using FPGAs, and why it matters to IT. If you’re a regular reader, you know that I try not to be a chorus line for F5 solutions, but don’t shy away from talking about them when it fits the topic. That will continue with this post. While I will use F5 for the specifics, the benefits can be generalized to the bulk of the industry.

Used to be, way back in the day, everyone walked everywhere. That worked for a long period of world history. The horse was adopted for longer trips, and it about doubled travel speed, but still, the bulk of the world populace walked nearly all of the time. Then along came cars, and they enabled a whole lot of things. One of the great benefits the automobile introduced was the ability to be more agile. By utilizing the machinery, you could move from one town to another relatively quickly. You could even work in a town 30 miles – a day’s walk for a physically fit person – from your home. At this point in human – or at least first-world – history, walking is a mode of transportation rarely used for important events. There are some cities so tightly packed that walking makes sense, but most of us take a car the vast majority of the time. When speed is not of the essence – say, when you take a walk with a loved one – the car is left behind, but for day-to-day transport, the car is the go-to tool.

There is a corollary to this phenomenon in the Application Delivery world. While in some scenarios a software ADC will do the trick, there are benefits to hardware that mean if you have it, you’ll use the hardware much more frequently. This is true of far more than ADCs, but bear with me, I do work for an ADC vendor ;-). There are some things that can just be done more efficiently in hardware, and some things that are best left (normally due to complexity) to software. In the case of FPGAs, low-level operations that perform a lot of repetitive actions are relatively easy to implement – to the point that FPGAs and their programming tools now come with certain pre-built layouts. As such, network processing that is latency-sensitive and requires little high-level logic is well suited to FPGA processing. When a packet can be processed in X microseconds in the FPGA, or in X^3 milliseconds by the time it passes through the hardware, DMA transfer, and firmware/network stack and finally lands in software that can manipulate it, definitely go with the FPGA option if possible.
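To make the “repetitive actions with little high-level logic” idea concrete, here is a minimal sketch in C of the kind of operation that maps naturally onto FPGA logic: summing the 16-bit words of an IPv4 header to produce its checksum. This is purely an illustration of the shape of the work, not F5’s implementation; in an ADC, a loop like this is exactly the sort of thing that gets baked into the hardware data path.

    /* One add per 16-bit word, then fold the carries: simple, repetitive,
     * latency-sensitive work. Assumes the header's checksum field is zeroed
     * before the sum is taken, per the usual IPv4 procedure. */
    #include <stdint.h>
    #include <stddef.h>

    uint16_t ipv4_header_checksum(const uint16_t *hdr, size_t words)
    {
        uint32_t sum = 0;
        for (size_t i = 0; i < words; i++)
            sum += hdr[i];                     /* accumulate 16-bit words */
        while (sum >> 16)
            sum = (sum & 0xFFFF) + (sum >> 16); /* fold carry bits back in */
        return (uint16_t)~sum;                 /* one's-complement result */
    }

In hardware, every one of those additions can happen in parallel in a single pass over the header; in software, it is a loop competing for the CPU with everything else on the box.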

And that’s where a lot of the benefits of FPGAs in the enterprise are being seen. Of course you don’t want to have your own FPGA shop and maintain your own installation program just to reap the benefits. But vendors have sets of hardware that are largely the same and are produced en masse. It makes sense that they would make use of FPGAs, and they do. Get that packet off the wire, and if it meets certain criteria, turn it around and get it back on the wire with minor modifications.
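As a hedged illustration of that “meets certain criteria, turn it around” fast path – a hypothetical C sketch, not any vendor’s actual logic – the decision and rewrite look roughly like this. It is precisely this kind of match-and-modify step that an FPGA data path performs at wire speed:

    #include <stdbool.h>
    #include <stdint.h>
    #include <string.h>
    #include <arpa/inet.h>

    struct eth_frame {
        uint8_t  dst_mac[6];
        uint8_t  src_mac[6];
        uint16_t ethertype;      /* network byte order, as it arrives */
        uint8_t  payload[1500];
    };

    /* Returns true if the frame was handled entirely in the fast path;
     * false means punt it up to software for the complicated cases. */
    bool fast_path(struct eth_frame *f, const uint8_t our_mac[6])
    {
        if (memcmp(f->dst_mac, our_mac, 6) != 0)   /* not addressed to us */
            return false;
        if (ntohs(f->ethertype) != 0x0800)         /* only plain IPv4 here */
            return false;

        uint8_t tmp[6];                            /* swap the MAC addresses */
        memcpy(tmp, f->dst_mac, 6);
        memcpy(f->dst_mac, f->src_mac, 6);
        memcpy(f->src_mac, tmp, 6);
        /* ...other minor header rewrites would go here... */
        return true;                               /* back onto the wire */
    }

The criteria check and the rewrite are both trivial; the win comes from never letting that packet touch the host CPU at all.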

But that’s not all. While it was a great step to be able to utilize FPGAs in this manner and not have to pay the huge up-front fees of getting an ASIC designed and a run of them completed, the use of FPGAs didn’t stop there – indeed, it is still growing and changing. The big area that has really grown the usage of ever-larger FPGAs is software assistance. Much like the BIOS provides discrete functionality that software can call to achieve a result, FPGAs can define functions with a register interface that are called directly from software – not as a solution, but as an incremental piece of the solution. This enables an increase in the utilization of FPGAs and, if the functions are chosen carefully, an improvement in the overall performance of the system the FPGAs are there to support. It is, essentially, offloading work from software. When that offload is of computationally intensive operations, the result can be a huge performance improvement. Where a software solution might have a function call, hardware can just do register writes and reads, leaving system resources less taxed. Of course, if the operation requires a lot of memory for data storage, it will still need it, which is why I said “computationally intensive.”
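Here is a minimal sketch of what “register writes and reads instead of a function call” can look like from the software side, assuming a memory-mapped FPGA exposed through a Linux UIO device. The device path and register offsets are invented for illustration; a real offload engine publishes its own register map.

    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    #define REG_OPERAND_A  0x00   /* hypothetical offsets into the FPGA window */
    #define REG_OPERAND_B  0x04
    #define REG_CONTROL    0x08   /* write 1 to start the operation */
    #define REG_STATUS     0x0C   /* bit 0 set when the result is ready */
    #define REG_RESULT     0x10

    int main(void)
    {
        int fd = open("/dev/uio0", O_RDWR);        /* hypothetical UIO device */
        if (fd < 0) { perror("open"); return 1; }

        volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                       MAP_SHARED, fd, 0);
        if (regs == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

        regs[REG_OPERAND_A / 4] = 40;              /* hand operands to hardware */
        regs[REG_OPERAND_B / 4] = 2;
        regs[REG_CONTROL / 4]   = 1;               /* kick off the operation */
        while ((regs[REG_STATUS / 4] & 1) == 0)    /* poll until the FPGA is done */
            ;
        printf("result: %u\n", regs[REG_RESULT / 4]);

        munmap((void *)regs, 4096);
        close(fd);
        return 0;
    }

Once the window is mapped, the hot path is nothing but loads and stores to the device’s address space – no system call, no context switch – which is the “less taxed” part of the argument above.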

The key thing is to ask your vendor (assuming they use FPGAs) what they’re doing with them, and what benefit you see from it. It is true that the vast majority of vendors go to FPGAs for their own benefit, but that is not mutually exclusive with making things better for customers. So ask them how you, as a customer, benefit.

And when you wonder why a VM can’t perform every bit as well as custom hardware, well, the answer is at least partially above. The hardware functionality of custom devices must be implemented in software for a VM, and that software then runs on not one but two operating systems, and eventually calls general-purpose hardware. While VMs, like feet, are definitely good for some uses, when you need your app to be the fastest it can possibly be, hardware – specifically FPGA-enhanced hardware – is the best answer, much as the car is the best answer for daily travel in most of the world. Each extra layer – generic hardware, the host operating system, the virtual network, and the guest operating system – adds cost to processing. The lack of an FPGA does too, because those low-level operations must be performed in software.

So know your needs, and use the right tool for the job. I would not drive a car to my neighbor’s house – 200 feet away – nor would I walk from Green Bay to Cincinnati (just over 500 miles). Know what your needs are and what your traffic is like, then ask about FPGA usage. And generalize this… to network switches, WAPs, you name it. You’re putting it into your network, so that IS your business.

(Image: Walking in Ust-Donetsk)

And yeah, you’ll hear more on this topic before I wrap up the Bare Metal Blog series, but for now, keep doing what you do so well, and I’ll be back with more on testing soon.

More Stories By Don MacVittie

Don MacVittie is currently a Senior Solutions Architect at StackIQ, Inc. He is also working with Mesamundi on D20PRO, and is a member of the Stacki Open Source project. He has experience in application development, architecture, infrastructure, technical writing, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
