
DDoS Attack

Posted on June 15th, 2010 by Victoria Pal in Tech

The end goal of a denial-of-service (DoS) attack or distributed denial-of-service (DDoS) attack is to make a resource unavailable to its users. The attack can be carried out on different levels, depending on the target and the people involved. In essence, it is a highly intense attempt by one or more people to keep websites or specialized internet services from running properly, causing financial losses to the owner of that resource.

Depending on the motivation behind the attack, different targets may suffer. An attack mounted by a single person with a personal agenda is unlikely to cause much trouble and will usually be temporary. A much bigger problem is an organized attack carried out by a group of individuals in pursuit of financial gain. Such attacks can be sustained for long periods of time and can even ruin companies that rely on their online presence. Banks, online retailers, government sites, payment gateways and even root nameservers can become prey for organized DDoS attacks.

Read more...

Server Cache

Posted on June 10th, 2010 by Victoria Pal in Monitoring

Web caching is the process of storing frequently used objects closer to the user through a browser, proxy or server cache. Caching frequently used content has a positive effect in most cases because it reduces server load, bandwidth and latency. It also makes the web more responsive for users.

Modern browsers support caching and need to download most elements of a page only the first time a user opens it. CSS files, images and other media are then stored in a local cache folder, which leads to faster loading times and a better user experience. The whole process happens without any input from the user.
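
To make the mechanism a bit more concrete (this sketch is mine, not from the post), a server hints at how long a browser may reuse a file through the Cache-Control response header. A minimal Python example, with an assumed one-hour lifetime:

    # Minimal sketch: serve static files with a Cache-Control header so the
    # browser may reuse CSS, images and other media for an hour instead of
    # re-downloading them. The one-hour max-age is an illustrative assumption.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class CachingHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            # Allow any cache (browser or proxy) to keep the response for 3600 s.
            self.send_header("Cache-Control", "public, max-age=3600")
            super().end_headers()

    if __name__ == "__main__":
        # Serves the current directory on http://localhost:8000
        HTTPServer(("", 8000), CachingHandler).serve_forever()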

Read more...

Server Virtualization Is Growing

Posted on June 2nd, 2010 by Victoria Pal in Tech

Server Virtualization"I think there is a world market for maybe five computers" is a famous misquote of Thomas J. Watson. The same Tom Watson, who took Computing Tabulating Recording Corporation and turned into what is today known as IBM. In essence, whoever said that the world will only need 5 computers wasn't totally wrong. Virtualization was first implemented in 1960 to aid better utilization of large mainframe hardware. 50 years later, it has grown to be part of all Fortune 100 success stories.

Lately, "fake servers" became a slightly satiric way to say "virtual servers". Why? Mainly because they are gaining a larger market share. Large enterprises were quick to adopt the virtual machines approach, while small business started late. However, by year-end 2010, enterprises with 100 to 999 employees will have a higher penetration of virtual machines deployed than the Global 500. For long years, small businesses could not afford even the entry level products, and this is what changed lately. Increased competition by server vendors has made server virtualization technology affordable to smaller companies.

Read more...

Remote Monitoring - Server, Desktop, Video and Mind

Posted on May 27th, 2010 by Victoria Pal in Monitoring

I was on the computer the other day doing a search on "Remote Monitoring", and I came across one obvious, two common and one bizarre result.

Remote Server Monitoring

Remote monitoring is a phrase that needs context. That context doesn't have to be in the text itself; it might be the main theme of a site, a book, a conversation, and so on. For example, when you are reading this blog, you are thinking of website monitoring and remote server monitoring, which is quite normal. In fact, it means we've done our job right and you are here to learn more about monitoring various network devices. That is basically what www.websitepulse.com is all about.
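
To put a concrete face on the idea (this example is mine, not WebSitePulse's implementation), the simplest form of remote server monitoring is polling a URL from an outside location and recording whether it responds in time. A minimal Python sketch, with an assumed target URL and timeout:

    # Minimal sketch of a remote HTTP availability check (illustrative only;
    # the URL and timeout below are assumptions, not a real monitoring setup).
    from urllib.request import urlopen
    from urllib.error import URLError

    def check(url: str, timeout: float = 10.0) -> bool:
        """Return True if the URL answers with an HTTP status below 400."""
        try:
            with urlopen(url, timeout=timeout) as response:
                return response.status < 400
        except URLError:
            return False

    if __name__ == "__main__":
        print("UP" if check("https://example.com/") else "DOWN")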

Read more...

The 100% Server Uptime Illusion

Posted on May 25th, 2010 by Victoria Pal in Tech

You've seen 98.9%, 99.9% and even 100% server uptime labels on hosting sites. The first thought that comes to mind is "Well, 99.9% sounds pretty good, 0.1% is no big deal", and that is true as far as it goes. What many people miss is that even if the server is reachable, it isn't necessarily fully functional; plenty of other things can go wrong with it. A 99.9% uptime sticker is no guarantee that your business won't face other problems while the server is up and running. Hosting providers also calculate uptime in ways that aren't intuitive from a user's perspective and don't take certain kinds of website downtime into account.

  • First of all, not all downtime is counted. Scheduled maintenance is not part of the equation: when monthly server uptime is calculated, planned service interruptions, which can add up to several hours, are deducted.
  • Some, if not most, hosting providers tend to overlook shorter downtime periods. 3-5 minutes of unavailability are not considered significant and don't end up on the 99.9% sticker. You can see how multiple short downtime periods can make a difference at the end of the day.
  • Every 0.1% matters. 99.9% uptime means 8 hours and 45 minutes of downtime per year, and dropping a notch to 99.8% doubles that to 17 hours and 30 minutes. A 98.9% label allows for more than 96 hours, roughly four days, per year (see the quick calculation after this list).
  • Resources are limited. "Unlimited" is just a nice way to sugarcoat "enough for the majority of common users". Your business might not be a common one and demand might be high, and there may be other busy sites sharing the same hosting hardware. At some point it can all build up to an overload, resulting in poor loading times, broken transactions, even an unresponsive website. The server will still be online, but the performance won't meet your needs. Application monitoring is a good way to catch such problems quickly and easily.
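
The figures above follow from simple arithmetic. As a quick illustration (my sketch, not part of the original post), here is how an uptime percentage translates into allowed downtime per year:

    # Quick illustration: convert an uptime percentage into allowed downtime
    # per year. The sample percentages are the ones mentioned above.
    HOURS_PER_YEAR = 365 * 24  # 8760

    def downtime_hours(uptime_percent: float) -> float:
        """Hours of downtime per year allowed by a given uptime percentage."""
        return HOURS_PER_YEAR * (100.0 - uptime_percent) / 100.0

    for pct in (99.9, 99.8, 98.9):
        print(f"{pct}% uptime -> {downtime_hours(pct):.1f} hours of downtime per year")
        # 99.9% -> 8.8 h, 99.8% -> 17.5 h, 98.9% -> 96.4 h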

100% uptime is not feasible. Eventually something will go wrong: network backbones will cause problems, power failures will happen, server software and hardware will fail every now and then, and human error will always be a considerable factor. So my advice is: don't try to find the ultimate provider, as it doesn't exist. Instead, find a decent provider with good server uptime and performance, 24-hour technical support and frequent backups. Oh, and why not go for a nice remote website monitoring service, just in case? It might be the best money you've ever spent.

Read more...