Frontend Monitoring for 21st Century Businesses
Whether you run an e-commerce platform or just a website where you showcase your work, websites are becoming the storefronts of most businesses today. This makes it more crucial than ever to keep the whole system under control, from the infrastructure down to the most visible layer.
As we have seen in other posts on this blog, monitoring the backend (the logic that makes a web page work) is far more common and widespread. But we should not forget the frontend (the part users actually interact with). To borrow the analogy of a traditional store: keeping the storefront presentable is just as important as keeping the stockroom in order.
What does frontend monitoring involve?
Lately there is a trend of moving much of the backend's work to the frontend, which blurs the dividing line even further. This is especially true of modern web app technologies, where the client performs as many operations as possible and the server is left with little more than managing the database.
Let's go back to the example of the store
If we continue with the store example, it becomes clear why the frontend deserves the same care as the backend.
Imagine that the store has its warehouse perfectly orchestrated: exhaustive stock control, employees working in perfect sync, and not a speck of dust on the floor. Sounds good, doesn't it? But what if the automatic door takes too long to open or is blocked, or there is no light in the window? How many potential customers do you think are being lost?
The same thing happens on websites, arguably even more so given the sheer number of alternatives available: if a website does not work or takes too long to respond, nobody tries again. Users simply move on to the next one and spend their money there.
Even a small delay in page load causes losses
In 2006, Marissa Mayer, then a vice president at Google (before her move to Yahoo), ran an experiment that increased the number of search results per page from 10 to 30, causing a 20% drop in traffic and revenue.
Analyzing the data in search of the reason for these losses, the team concluded that the traffic drop was caused by the page load time increasing from 0.4 to 0.9 seconds.
A delay of half a second caused losses of 20%. In Mayer's words, "Users respond to speed".
Analysis is not possible without data collection
It is well known that Google is one of the companies with the tightest control over its processes. Marissa Mayer could not have reached the conclusions she did without all the data collected during the experiment.
And not only for A/B tests and experiments: at Muutech we firmly believe that monitoring and analyzing the data you collect is the key to the success of your online business.
According to another study, this time at Amazon, a seemingly insignificant 100 ms delay in page load would cost them 1% in sales. By the same logic, an online broker could be losing millions of dollars every second its website is slower than the competition's.
The importance of frontend monitoring
Without going into too much detail, we can assess the status of a web page or web service by asking ourselves three questions: Is my website available? Is it working properly? Is it fast enough?
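As a rough illustration (this is a minimal sketch, not part of any monitoring product; the URL, timeout and the 0.5 s threshold are placeholder assumptions), the three questions can be turned into a single synthetic check:

```python
import time
import urllib.request
import urllib.error

def check_site(url, timeout=5.0):
    """Fetch a URL once and report availability, status code and elapsed time."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
            resp.read()  # force the full download so elapsed covers the body
    except (urllib.error.URLError, OSError) as exc:
        return {"available": False, "status": None,
                "elapsed": time.monotonic() - start, "error": str(exc)}
    return {"available": True, "status": status,
            "elapsed": time.monotonic() - start, "error": None}

def verdict(result, max_seconds=0.5):
    """Answer the three questions: available? working? fast enough?"""
    available = result["available"]
    working = available and result["status"] == 200
    fast = working and result["elapsed"] <= max_seconds
    return {"available": available, "working": working, "fast_enough": fast}
```

Run periodically (for example from cron), such a check already gives you a crude yes/no answer to each question; real tools refine the same idea.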
Although the theory is simple, answering these questions is often very complicated. Perhaps everything works correctly under your test conditions, but you need to know whether it works well in all cases. Returning to the earlier example: it could turn out that the automatic door's sensor does not detect people wearing black jackets and therefore never opens for them, losing many potential customers.
Zabbix and web monitoring
To carry out this monitoring, at Muutech we rely on the open-source monitoring tool Zabbix, which lets us define web scenarios consisting of one or more requests, or steps, executed periodically from the server to check that the service is working.
This type of monitoring, also called synthetic monitoring, differs from RUM (Real User Monitoring) in that it generates its own traffic, separate from that of real users, in order to make controlled requests and analyze response data such as:
- Download speed for each step
- Number of steps with a fault
- Error messages
- Response time
- Response code
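To make the idea of a multi-step web scenario concrete, here is a hypothetical sketch (illustrative only, not Zabbix code): a runner that walks through a list of steps in order and collects the kind of metrics listed above. The step-dictionary shape and the injectable `fetch` function are assumptions made for this example:

```python
import time
import urllib.request
import urllib.error

def default_fetch(url, timeout=5.0):
    """Fetch a URL and return (status_code, elapsed_seconds, error_message)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
            return resp.status, time.monotonic() - start, None
    except (urllib.error.URLError, OSError) as exc:
        return None, time.monotonic() - start, str(exc)

def run_scenario(steps, fetch=default_fetch):
    """Execute each step in order, in the spirit of a web scenario.

    `steps` is a list of {"name", "url", "expected_status"} dicts.
    Returns per-step speed/status/error data plus the number of failed steps.
    """
    results, failed = [], 0
    for step in steps:
        status, elapsed, error = fetch(step["url"])
        ok = error is None and status == step["expected_status"]
        if not ok:
            failed += 1
        results.append({"step": step["name"], "status": status,
                        "elapsed": elapsed, "error": error, "ok": ok})
    return {"steps": results, "failed_steps": failed}
```

A typical e-commerce scenario might chain steps such as "load home page", "add item to cart" and "reach checkout", scheduled to run every few minutes; a non-zero `failed_steps` would then trigger an alert.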
Whether it's visualizing key graphs, monitoring page availability, storing error histories or even generating automatic alerts, with Zabbix and the help of Muutech you will be able to understand how your website behaves, detect any failure and fix it easily.
Undoubtedly, monitoring websites, and especially the frontend, is more necessary than ever. Just as you would clean the window of your physical store and display your products as attractively as possible, you should be doing the same with your web pages. Whether you run a small e-commerce site or a large company, you need to control what your users are seeing on their screens.