Realtime Messaging with Pusher


Every application has its dark corner: some part of the application you are not proud of. That part is working code, it has been in production for some time, but you still have an uneasy feeling about it. You fear that it will come back to bite you at the worst possible time.

At Codeship, one of these parts was our dashboard reload. In this article I will show you how we made it awesome.

The Codeship Project Dashboard

Whether you are on your dashboard, watching the builds of a particular project, or observing a certain build, we update the page without you having to hit reload all the time. Every time a new build starts or we receive new output from your test commands, we update the page and you can see the changes instantly.

Back to the past

Previously, we had the simplest implementation you can think of: a JavaScript timeout on each of the three pages that reloaded the page every 10 seconds. That's not close to instant, but it got the job done and worked for more than a year, which is still impressive. As you can imagine, there was a lot of room for improvement.

Realtime Updates with Pusher

Over time we optimized this feature step by step. We added checks that allowed us to respond with a 304 (Not Modified) so we didn't have to render the page again and again when there were no updates. The next step was to let the client save the last time it got updates from the server and include that information in the next request. This allowed us to deliver only the missing parts. But we still had that 10-second page reload, and we decided to change that!
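To make the idea concrete, here is a minimal sketch of that "Not Modified" check. The method name and response shape are our own illustration, not the actual Codeship code: the client reports when it last got updates, and the server skips rendering if nothing changed since then.

```ruby
# Hypothetical sketch of the "respond with 304 when nothing changed" logic.
# last_update_at: timestamp of the newest change on the server.
# client_since:   timestamp the client sent along (nil on the first request).
def dashboard_response(last_update_at, client_since)
  if client_since && last_update_at <= client_since
    # Nothing new since the client's last poll: skip rendering entirely.
    { status: 304, body: nil }
  else
    # Deliver only what the client is missing.
    { status: 200, body: "rendered dashboard" }
  end
end
```

With a check like this, a 10-second polling loop gets much cheaper, but it is still polling; the point of the rest of the article is to get rid of the loop itself.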

The web has changed dramatically in recent years. WebSockets have been around for quite some time now, and a huge ecosystem has evolved around these new technologies. We never had the intention of implementing our own realtime push infrastructure; that's totally out of focus for us. We needed somebody to handle and provide a reliable push infrastructure for us. In the back of my head I knew about Pusher, having checked their site in the past, but I never had a good use case to actually use it.

The whole change: 37 changed files with 315 additions and 159 deletions.

Implementing Pusher was quite easy and really fun. Let’s go into the details.

Private Channels

Pusher supports private channels. This basically means the Pusher JavaScript client asks your application whether it is allowed to subscribe to a channel. Pusher has a simple naming convention: private channel names need to start with the word "private". Our private channels look like this: "private-build-190852", where 190852 is the build id. The same goes for the project channels.
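The naming convention can be captured in a few tiny helpers. This is a sketch under our own naming assumptions (the helper names are not from the post or from Pusher's API):

```ruby
# Build the private channel name for a single build.
def build_channel(build_id)
  "private-build-#{build_id}"
end

# Build the private channel name for a project.
def project_channel(project_id)
  "private-project-#{project_id}"
end

# Pusher treats any channel starting with "private-" as a private channel,
# which means subscriptions to it must be authorized by your application.
def private_channel?(channel_name)
  channel_name.start_with?("private-")
end
```

Because the record id is embedded in the channel name, the authorization endpoint can later parse the name to decide who may subscribe.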

The JavaScript client sends a POST request to /pusher/auth, and you need to return either a 200 or a 403. You authorize the client with the Pusher service to let them know the client is actually allowed to receive updates. Using the Pusher gem, this is just a few lines of Ruby. How you decide whether a client can subscribe to a private channel is totally up to you; in our case we check whether the user is allowed to access the build or project.

def auth
  if pusher_access.can_access? params[:channel_name]
    # Sign the subscription request so Pusher accepts the client
    response = Pusher[params[:channel_name]].authenticate(params[:socket_id])
    render json: response
  else
    render text: "Not authorized", status: 403
  end
end
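The post doesn't show what `pusher_access.can_access?` looks like, so here is one possible shape, purely as an illustration: parse the channel name back into a record type and id, then ask the current user. The class, pattern, and user methods are all assumptions.

```ruby
# Hypothetical access check for private channel subscriptions.
class PusherAccess
  # Matches "private-build-190852" or "private-project-211".
  CHANNEL_PATTERN = /\Aprivate-(build|project)-(\d+)\z/

  def initialize(user)
    @user = user
  end

  def can_access?(channel_name)
    match = CHANNEL_PATTERN.match(channel_name.to_s)
    # Anything that doesn't follow the naming convention is rejected outright.
    return false unless match

    type = match[1]
    id   = Integer(match[2])
    type == "build" ? @user.can_see_build?(id) : @user.can_see_project?(id)
  end
end
```

Rejecting malformed channel names up front means a typo or a probing client gets a 403 instead of an exception.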

Multiple Channels

If you go to the Codeship dashboard, you can see multiple projects. If you have just one project, you see builds from that project; if you have multiple projects, you see the last 10 builds across those projects. In this case we subscribe to multiple channels. For example, when I go to the dashboard, I am subscribed to the channels "private-project-7978" and "private-project-211". If a new build starts for one of those projects, we send a message to that channel, and everybody who watches that project on their dashboard gets the update.
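The dashboard's subscription list can be derived directly from the visible projects. A minimal sketch, with an illustrative helper name of our own:

```ruby
# One private channel per project visible on the dashboard.
# A client subscribed to all of these receives an update whenever a new
# build starts on any of its projects.
def dashboard_channels(project_ids)
  project_ids.map { |id| "private-project-#{id}" }
end
```

Each of these subscriptions goes through the same /pusher/auth check as a single-channel page, so a user only ever receives updates for projects they can actually access.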

The numbers

On a normal day we push 1.3 million messages and have 300 concurrently connected clients.


What’s next

Right now we don't include the updated parts in the message we send over Pusher. This means the client still reloads the page and gets the updates from the server. If we delivered the updates straight to the client, we would save a lot of unneeded requests, because we already have the changed data in our hands at the time the push is triggered.
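That planned improvement could look something like this: instead of a bare "reload" event, the message carries the changed fragment so the client can patch the page without a follow-up request. The event name and payload shape here are assumptions, not the actual Codeship protocol:

```ruby
# Hypothetical push message that carries the update itself, so the client
# no longer needs to fetch the changed parts from the server.
def build_update_message(build_id, status, html_fragment)
  {
    channel: "private-build-#{build_id}",
    event:   "build:updated",
    data:    { status: status, html: html_fragment }
  }
end
```

A payload like this trades a little message size for one fewer HTTP round-trip per update, which adds up quickly at millions of messages a day.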

What we think

We are very happy with the changes we made and with Pusher itself. It made the application logic a lot simpler. We don't check for unmodified content anymore, since we only trigger reloads after changes have happened. The next step is to push the modified content directly to all connected clients. I'm looking forward to the next improvements we are going to ship.

Let us know your experiences with Pusher. How do YOU achieve realtime messaging for your app?

Improve your dark corners today!


More Stories By Manuel Weiss

I am the cofounder of Codeship – a hosted Continuous Integration and Deployment platform for web applications. On the Codeship blog we love to write about Software Testing, Continuous Integration and Deployment. Also check out our weekly screencast series 'Testing Tuesday'!
