
DevOps Is Changing from Solving Problems to Driving Business

Marketers and DevOps engineers will need to work together to use log data to solve certain problems.

DevOps has traditionally played important roles in development and IT operations, but the practice is quickly becoming core to other business functions such as customer success, business intelligence, and marketing analytics.

Modern marketers are driven by data and rely on many different analytics tools. They need DevOps engineers in general and server log data specifically to do their jobs well. Here's why: server log files contain the only data that is complete and accurate in the context of how search engines such as Google crawl websites.

If a search engine spider encounters an error and does not load a page, the webmaster does not know because traditional traffic analytics tools such as Google Analytics do not track those issues. Log file data, on the other hand, does reveal what problems bots are encountering on a website - and many of those issues can hurt a site's appearance and rankings in Google.

Too many response code errors can lead Google to cut the rate at which it crawls your company's website. You want to monitor and confirm that search engines are crawling everything that you want to appear in public search results (everything else should be blocked from search engine bots). When pages are assigned new URLs, it's important that the redirects pass incoming links to the right destinations.

What Is SEO?
Contrary to what too many charlatans still proclaim (and unfortunately too many people still believe), "SEO" is not a bag of tricks to rank first in Google. This is 2015, not 2000. As I explain in a personal essay and whenever I speak at digital marketing conferences, here is the definition of "SEO":

SEO is helping search engines to crawl, parse, index, and then display your website in organic search results for desired, relevant keywords and search queries.

Server log files - along with XML sitemaps, schema markup, website hierarchy, internal linking practices, meta tags, mobile-responsive design, and site speed - must be examined and addressed when needed to do exactly that.

How to Examine Server Log Files
DevOps engineers have traditionally used proprietary software to analyze the logs of their systems, networks, servers, and applications. However, the open-source ELK Stack - Elasticsearch, Logstash, and Kibana - has become extremely popular and is now used by companies including Netflix, LinkedIn, Facebook, Microsoft, and Cisco. (We use the ELK Stack to monitor our own environment, and to help the DevOps community, our CEO, Tomer Levy, has written a guide to deploying the platform.)
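Regardless of tooling, even a raw access log can be inspected directly. Here is a minimal Python sketch - not any particular product's implementation - that assumes the common Apache/Nginx "combined" log format; the file name access.log and the bot list are placeholders:

```python
import re

# One regex for the common Apache/Nginx "combined" log format:
# IP, identd, user, [timestamp], "request", status, bytes, "referer", "user agent"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+)[^"]*" '
    r'(?P<status>\d{3}) (?P<size>\S+) "[^"]*" "(?P<agent>[^"]*)"'
)

def parse_log(path):
    """Yield one dict per well-formed line of an access log."""
    with open(path) as handle:
        for line in handle:
            match = LOG_PATTERN.match(line)
            if match:
                yield match.groupdict()

# User-agent matching is a heuristic; verifying crawler IPs with
# reverse DNS is the more reliable check.
BOTS = ("Googlebot", "Bingbot", "YandexBot")
bot_hits = [e for e in parse_log("access.log")   # placeholder path
            if any(bot in e["agent"] for bot in BOTS)]
print(len(bot_hits), "crawler requests in the log")
```

The sketches that follow reuse this parse_log helper and the same placeholder path.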

Regardless of how you choose to analyze your server log files, marketers and DevOps engineers will need to work together to use the data to solve certain problems. Here is a partial list of them (with examples from our own web server using one of our analytical dashboards).

What DevOps Engineers Need to See
Server bot crawl volume

The number of requests made by search engine crawlers is important to know. If the marketing and sales teams want website content to be included in search results in Yandex in Russia but the search engine is not crawling your website, that is a significant issue. (In response, you'd want to see the Yandex Webmaster documentation and this reference article on Search Engine Land.)
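To make that concrete, here is a short sketch of tallying crawl volume per engine using the parse_log helper from the earlier example; a missing or tiny count for a bot you care about (YandexBot, in this case) is the red flag:

```python
from collections import Counter

volume = Counter()
for entry in parse_log("access.log"):            # helper from the earlier sketch
    for bot in ("Googlebot", "Bingbot", "YandexBot"):
        if bot in entry["agent"]:
            volume[bot] += 1

# An engine you target that barely appears here is not crawling you.
print(volume.most_common())
```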

Response code errors

For those who might need a refresher, the popular SEO software company Moz has a great guide on the meanings behind different status codes. I have a Logz.io alert system set up to tell me when 4XX and 5XX errors are found because those are significant in both marketing and IT contexts.
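A simple filter over the same parsed entries shows where 4XX and 5XX responses are clustering. The sketch below just prints the worst offenders, which is where a real alert would fire:

```python
from collections import Counter

errors = Counter()
for entry in parse_log("access.log"):            # helper from the earlier sketch
    if entry["status"].startswith(("4", "5")):   # 4XX client and 5XX server errors
        errors[(entry["status"], entry["url"])] += 1

# The most frequent error/URL pairs are the ones worth fixing first.
for (status, url), count in errors.most_common(10):
    print(status, count, url)
```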

Temporary redirects

302 redirects, which are used when a URL is redirected only for a temporary period of time, do not pass what SEOs call "link juice" to the new URL. (The more and better the links that point to a given web page, the better the chance that it will rank highly in search engines.) It's better to use 301 redirects (permanent redirects) instead.
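Finding the 302s to review is a one-liner over the parsed log (again using the earlier parse_log sketch); each URL printed is a candidate for conversion to a 301 if the move is not actually temporary:

```python
# URLs that answered with a 302; review each and convert it to a 301
# if the redirect is meant to be permanent.
temp_redirects = {e["url"] for e in parse_log("access.log") if e["status"] == "302"}
for url in sorted(temp_redirects):
    print(url)
```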

Crawl budget waste & duplicate crawling

Google assigns a crawl budget to every website based on a lot of different factors -- if a website's budget is, say, 1 GB of page data per day, then it is crucial to ensure that the 1 GB consists only of pages that the company wants to appear in public search results.

Even though technical SEOs and DevOps engineers can block all or specific search engines in robots.txt files and meta-robots tags, Google might still be crawling advertising landing pages, internal scripts, web pages with sensitive information, and more. Log files will list every URL that search engines are crawling -- regardless of what you may have instructed them not to access.

If you hit your given crawl limit but still have new material on your website that you want to be found in search results, Google might leave your website before indexing it. Duplicate URL crawling -- often caused by the addition of URL parameters in the tracking of marketing campaigns -- is one of the most common causes of crawl waste.

To fix this issue, see the guides on Google and Search Engine Land here, here, here, and here.
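To gauge how much duplicate crawling is happening before applying those fixes, you can group Googlebot requests by path with the query string stripped; a path crawled under many parameter variants is a likely source of waste. This again reuses the earlier parse_log sketch:

```python
from urllib.parse import urlsplit

# Map each path to the set of query strings Googlebot requested it with.
variants = {}
for entry in parse_log("access.log"):            # helper from the earlier sketch
    if "Googlebot" not in entry["agent"]:
        continue
    parts = urlsplit(entry["url"])
    variants.setdefault(parts.path, set()).add(parts.query)

for path, queries in sorted(variants.items()):
    if len(queries) > 1:                         # same page, many parameter variants
        print(path, "crawled with", len(queries), "parameter variants")
```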

Crawl priority

Google might have deemed an important part of your website not worthy of being crawled often. The log files will show which individual pages, and which subdirectories as a whole, are crawled most and least often. If this is the case, you can change the crawl-priority settings in your XML sitemaps to tell Google that a given part of your site is updated often enough to deserve more frequent crawling.
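As an illustration, crawl counts per top-level directory can be computed from the same parsed entries; the least-crawled sections are the candidates for a higher priority value in your XML sitemaps:

```python
from collections import Counter
from urllib.parse import urlsplit

by_section = Counter()
for entry in parse_log("access.log"):            # helper from the earlier sketch
    if "Googlebot" in entry["agent"]:
        path = urlsplit(entry["url"]).path
        section = "/" + path.lstrip("/").split("/")[0]
        by_section[section] += 1

for section, count in by_section.most_common():
    print(count, section)
```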

Last crawl date

Have you added something to your website that you need Google to index as soon as possible? The log files tell you when a URL was last crawled by a given search engine.
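Here is a sketch of extracting that date from the parsed entries; timestamps in the combined format look like 10/Oct/2015:13:55:36 +0000, and the URL checked at the end is a hypothetical example:

```python
from datetime import datetime

last_crawl = {}
for entry in parse_log("access.log"):            # helper from the earlier sketch
    if "Googlebot" in entry["agent"]:
        seen = datetime.strptime(entry["time"], "%d/%b/%Y:%H:%M:%S %z")
        if entry["url"] not in last_crawl or seen > last_crawl[entry["url"]]:
            last_crawl[entry["url"]] = seen

print(last_crawl.get("/new-landing-page"))       # hypothetical URL to check
```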

Crawl budget

How often Google crawls our website is one important thing that I personally like to check because the overall crawl volume is a rough proxy for how much the search engine "likes" your site. After all, Google does not want to waste resources on poor websites.

From Negative to Positive
DevOps used to be all about problems - engineers, after all, have always monitored platform performance, fixed cluster disconnects, and taken care of similar issues. If an employee at a company knew the operations person, it meant that the employee had a lot of problems.

Today, DevOps has the opportunity to be more visible and help in a positive way by becoming the information driver within an organization. DevOps engineers now provide the data that supports countless business decisions - and marketing is just another area in which they can help.

Logz.io makes log data meaningful by offering ELK, the world's most popular open-source log analytics platform, as a service with features including alerts, role-based access, and unlimited scalability. If you are interested in the technical SEO dashboards in this article (and other dashboards for DevOps purposes), you can get more information here.

More Stories By Samuel Scott

Samuel Scott is Director of Marcom for log analytics software platform Logz.io. Follow him and Logz.io on Twitter.
