Iceland Data Centers – Location, Location, Location

Posted on February 14th, 2012 by Victoria Pal in Tech

Over the last few years, the northern countries have been busy promoting themselves as ideal places to build data centers. Most of them pitch the idea of green data centers, powered by renewable energy, with free cooling, an ideal location and all that jazz.

There is a great site called Data Center Map, which provides detailed information on major data center locations. After only a couple of minutes on the site you will begin to see patterns: data centers are usually built at international fiber optic crossroads, in places with better economic conditions, close to power plants, and anywhere else the location provides an advantage in one way or another.


How WoW Manages Its Downtime

Posted on February 9th, 2012 by Victoria Pal in Tech

World of Warcraft (WoW) servers, like any other servers, go down from time to time. The great thing about WoW downtime is that, most of the time, it is predictable. Server downtime occurs when the Warcraft servers are inoperable or inaccessible to users. It can happen when Realm servers are temporarily taken offline for maintenance, or when an unexpected error appears within the Authentication servers.

Unscheduled downtime at Blizzard rarely lasts more than a couple of hours, and that is largely thanks to their scheduled downtime. Wait, what? Yes: they spend a lot of time maintaining the servers you log in to. Let's be honest, practically every day!


How the Great Firewall of China Works

Posted on January 30th, 2012 by Victoria Pal

The Great Firewall of China

China, home to the largest population of web users in the world, also has one of the most restricted internets, ensuring that netizens can neither post nor read information the government deems threatening.


What Does Robots.txt Do?

Posted on December 28th, 2011 by Victoria Pal in Tech

The robots.txt file simply contains instructions for search engine robots on what to do with a particular website. While search engine robots follow the instructions in that file, spam bots in most cases simply ignore it.

A web robot is a program that checks the content of a web page. Before a robot crawls a website, it first checks the robots.txt file for instructions. A "Disallow" directive, for example, tells the robot not to visit a given set of pages on the site. Web administrators use this file to keep bots from indexing parts of a website for various reasons: they do not want the content to be accessible to other users, the website is under construction, or a certain part of the content must be hidden from the public.
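As a sketch, a minimal robots.txt (the file sits at the root of the site, e.g. example.com/robots.txt) might look like this; the paths shown here are hypothetical:

```
# Rules for all crawlers
User-agent: *
# Keep bots out of the admin area and a section under construction
Disallow: /admin/
Disallow: /beta/
# Everything not disallowed may be crawled
Sitemap: https://www.example.com/sitemap.xml
```

A well-behaved crawler such as Googlebot fetches this file before crawling and skips the disallowed paths; a spam bot is free to ignore it, which is why robots.txt is a request, not an access control.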


The Importance of the WordPress Expires Header

Posted on December 15th, 2011 by Victoria Pal in Tech

The importance of the expires header is growing along with web page designs, which are becoming ever richer in scripts, images, Flash, etc.

As web designs grow more complex, a web page takes longer to load, which is why the site needs an expires header. It simply makes components such as stylesheets and images cacheable; in other words, it prevents unnecessary HTTP requests after the first page view, so load time is reduced.
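On a typical Apache-hosted WordPress site, one common way to set expires headers is with mod_expires rules in the site's .htaccess file. This is a sketch, assuming mod_expires is enabled on the server; the cache lifetimes are illustrative, not recommendations:

```
# .htaccess — requires Apache with mod_expires enabled
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets change rarely, so let browsers cache them for a month
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```

With these rules in place, a returning visitor's browser reuses its cached copies instead of re-requesting each asset, which is exactly the saving in HTTP requests described above.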