IoT and Fog Computing Architecture | @ThingsExpo #IoT #DigitalTransformation

Fog computing represents an evolution from a centralized toward a decentralized cloud system

Cloud computing changed data analytics for good. It enabled companies to drastically reduce the resources and infrastructure previously dedicated to business intelligence departments, and it allowed non-specialists to run advanced business analytics. The cloud also became the architecture of choice for storing and processing big data.

Data accumulates continuously, and the volume is set to explode as the Internet of Things takes hold. Developers found an answer to this problem in a new concept called fog computing. Unlike the cloud, a fog computing architecture performs the required computations and analytics directly at the data source. This way, a single network administrator can oversee thousands (or even millions) of data-generating devices through real-time and predictive analytics, without overloading the network with huge volumes of data traveling back and forth.

This process continues as long as devices operate normally. The moment a problem occurs and a device requires repair or maintenance, the administrator receives a notice. Combined with BI software that continuously analyzes each device's behavior, this approach lets administrators oversee a huge number of devices while using very little network capacity and bandwidth.
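As a rough illustration of this pattern (a minimal sketch, not any vendor's actual implementation), the snippet below shows a hypothetical edge node that analyzes sensor readings locally and forwards a message only when a reading drifts outside its expected range. The device name, window size, tolerance, and notify_admin stub are all illustrative assumptions.

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import List

@dataclass
class EdgeNode:
    """Hypothetical fog node: analyzes readings locally, alerts only on anomalies."""
    device_id: str
    window: int = 60            # readings kept for the rolling baseline
    tolerance: float = 0.25     # allowed relative deviation from the baseline
    history: List[float] = field(default_factory=list)

    def ingest(self, reading: float) -> None:
        """Process one sensor reading entirely on the device."""
        if len(self.history) >= self.window:
            baseline = mean(self.history)
            if abs(reading - baseline) > self.tolerance * baseline:
                # Only this small alert message ever crosses the network.
                self.notify_admin(reading, baseline)
        self.history = (self.history + [reading])[-self.window:]

    def notify_admin(self, reading: float, baseline: float) -> None:
        # Stub: in practice this would be an MQTT or HTTP call to a dashboard.
        print(f"[ALERT] {self.device_id}: {reading:.1f} vs baseline {baseline:.1f}")

# Usage: 60 normal readings build the baseline; the final spike triggers one alert.
node = EdgeNode("pump-17")
for value in [50.0] * 60 + [51.0, 49.5, 80.0]:
    node.ingest(value)
```

Every reading is processed at the source; the network only carries the rare alert, which is what lets one administrator watch thousands of such devices.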

Benefits of fog computing
The fog computing concept comes with a long list of benefits. Some of them are:

  • It frees up network capacity - As noted earlier, fog computing uses much less bandwidth, so it avoids bottlenecks and similar congestion. Less data movement on the network frees up capacity that can be used for other things (see the sketch after this list).
  • It is truly real-time - Fog computing responds much faster than any other cloud computing architecture we know today. Since all data analysis is done on the spot, it is a true real-time concept, which makes it a perfect match for the needs of the Internet of Things.
  • It boosts data security - Collected data is more secure when it doesn't travel, because cyber criminals have fewer chances to intercept it and use it for fraudulent purposes. The approach also simplifies data storage compliance, because the data stays in its country of origin; sending it abroad might violate certain laws.
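To make the bandwidth claim concrete, here is a back-of-the-envelope sketch under invented assumptions (the message format, sampling rate, and reporting window are illustrative, not measured figures): a device sampling once per second could instead ship one aggregated summary per minute.

```python
import json
from statistics import mean

# Illustrative assumptions: one JSON-encoded reading per second for one minute.
readings = [20.0 + 0.1 * i for i in range(60)]

# Cloud streaming: every raw reading crosses the network.
raw_bytes = sum(len(json.dumps({"t": i, "v": v})) for i, v in enumerate(readings))

# Fog approach: analyze locally, ship one compact summary per window.
summary = {"min": min(readings), "max": max(readings), "avg": round(mean(readings), 2)}
fog_bytes = len(json.dumps(summary))

print(f"raw stream: {raw_bytes} bytes, fog summary: {fog_bytes} bytes")
print(f"~{raw_bytes / fog_bytes:.0f}x less traffic per device per minute")
```

Even in this toy setup the reduction is large, and it compounds across every device on the network.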

Disadvantages of fog computing
Like any new concept, fog computing also comes with a few disadvantages, although it is hard to say whether they even compare with those of previous computing architectures. Developers cite three main disadvantages of systems built on the fog computing paradigm:

  • Analytics is done locally - You probably noticed that we listed this as a benefit above, the very trick that lets fog computing systems use much less bandwidth. Yet some developers argue that cloud computing should let people access data from anywhere in the world. Fog computing does make the most important IoT data accessible from other locations, but it still keeps piles of less important information in local storage (see the tiering sketch after this list);
  • Some companies don't like their data being off-premises - This is basically the same concern as with any cloud environment. Some people still don't trust outside servers, and since fog computing stores a lot of data on the devices themselves (which are often located outside company offices), part of the developer community perceives this as a risk.
  • The whole system sounds a little confusing - A concept involving a huge number of devices scattered around the world, each storing, analyzing, and sending its own data, can be hard to reason about.
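As a hedged illustration of that first trade-off (the routing rule, priority labels, and store names below are invented for this sketch, not taken from any real fog platform), a node might tier its data: high-priority records are forwarded to a globally accessible cloud store, while routine records stay on the device.

```python
from typing import Dict, List

CLOUD: List[Dict] = []   # stand-in for a globally accessible cloud store
LOCAL: List[Dict] = []   # stand-in for on-device storage

def route(record: Dict) -> None:
    """Send important records to the cloud; keep routine ones at the source."""
    if record.get("priority") == "high":
        CLOUD.append(record)   # accessible from anywhere
    else:
        LOCAL.append(record)   # never leaves the device

for rec in [
    {"device": "valve-3", "priority": "high", "event": "pressure spike"},
    {"device": "valve-3", "priority": "low", "event": "routine heartbeat"},
    {"device": "valve-3", "priority": "low", "event": "routine heartbeat"},
]:
    route(rec)

print(f"cloud holds {len(CLOUD)} record(s), local storage holds {len(LOCAL)}")
```

The bulk of the data stays local, which is exactly what bothers developers who expect everything to be reachable from anywhere.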

More complex hardware and software
Fog computing is a complex architecture, and having every device run its own data analytics requires a combination of hardware and software components. The required hardware includes Wi-Fi routers, computer chips, various switches, and IP cameras. Companies that have already developed their own systems in this field include Cisco, IBM, Intel, EMC, and several others.

Viewed this way, fog computing represents an evolution from a centralized toward a decentralized cloud system. Mobile communications and people's increasingly dynamic lives require resources to be provisioned locally. The progress of the Internet of Things has contributed greatly to fog computing's development, and as big data continues to grow, networks worldwide will increasingly be pushed toward this model.

More Stories By Nate Vickery

Nate M. Vickery is a business consultant from Sydney, Australia. He has a degree in marketing and almost a decade of experience in company management, with a focus on the latest technology trends. Nate is also the editor-in-chief at bizzmarkblog.com.
