3 Ways to go Green with IT

By Tom McDonald | Apr 22, 2011 2:29:00 PM

Upgrading your computer

Everyone likes upgrading their PC because it means a faster computer with more features, but it’s also a great way to cut electricity costs while going green. As technology advances, so do the techniques used to save power. Anyone who had a laptop a decade ago remembers the problems with heat, size, and horrible battery life. Today these problems are barely a concern: laptop batteries last at least 3-4 hours, and many can go to 10 hours or beyond. Breakthroughs in battery technology have helped, but it is the tech industry as a whole that has pushed battery life forward. As new CPUs and memory chips are designed, one of the main goals is that the next generation runs faster while using less electricity and generating less heat. This is done through new fabrication techniques that produce smaller transistors, which allows more of them to be placed on a single chip while requiring less electricity to drive them. Combined with new features that keep energy consumption in mind, this allows computers to lower their clock speeds when idle to consume less power, then ramp back up when speed is needed.
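To put a rough number on the savings described above, here is a minimal sketch in Python. The wattage figures and electricity rate are illustrative assumptions, not measurements from any specific machine:

```python
# Hypothetical figures for illustration only; real wattages vary by model.
OLD_IDLE_WATTS = 120   # older desktop idling at full clock speed
NEW_IDLE_WATTS = 15    # modern machine with idle frequency scaling
HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.15    # assumed electricity price in dollars

def annual_cost(watts, hours=HOURS_PER_YEAR, rate=RATE_PER_KWH):
    """Energy cost for a device drawing `watts` continuously all year."""
    return watts / 1000 * hours * rate

savings = annual_cost(OLD_IDLE_WATTS) - annual_cost(NEW_IDLE_WATTS)
print(f"Estimated annual savings: ${savings:.2f}")
```

Even with conservative assumptions, the gap between an always-on older machine and a power-aware newer one adds up over a year of idle time.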

Read More >

Comparison between a traditional IT BC plan and a VMware implementation

By Tom McDonald | Apr 15, 2011 12:17:00 PM

Many businesses’ IT infrastructures are based around this setup: the operating system is bound to a specific set of hardware, and a specific application is bound to that OS. The server then runs at about 5-10% of its capacity for most of the day, peaking only during heavy usage. The data has to be backed up to a local SAN for recovery purposes, generally requiring special software to ensure it is backed up fully and efficiently.
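That 5-10% utilization figure is what makes consolidation attractive. As a rough sketch (the target utilization is an assumed planning number, not a VMware recommendation), you can estimate how many lightly loaded physical servers could share one host:

```python
# Rough consolidation math using the 5-10% utilization figure above.
# The 65% target is an illustrative assumption that leaves headroom for peaks.
def consolidation_ratio(avg_utilization, target_utilization=0.65):
    """How many lightly loaded servers could share one host while
    keeping combined average load below the target utilization."""
    return int(target_utilization / avg_utilization)

print(consolidation_ratio(0.05))  # servers idling at 5% load
print(consolidation_ratio(0.10))  # servers idling at 10% load
```

By this back-of-the-envelope math, a single host could absorb the average load of six to thirteen such servers, which is the core of the virtualization argument.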

If this is a vital server with a disaster recovery and business continuity plan in place to keep downtime as low as possible, then it will have an identical server installed for failover. This failover server is used only if the original fails, but it still consumes power and space. Not only that, it has to be the identical model, with the same hardware configuration, firmware, and local storage, to ensure immediate compatibility with the original server. This adds cost, since you need a second set of the same hardware, and it limits the business’s upgrade paths.

This setup generally falls into the “Boot and Pray” model of disaster recovery: the complexity of the configuration leaves the admin hoping the failover works rather than being able to guarantee a smooth transition from one server to the other. This has to be done for every vital server that needs a redundant backup, and each one has its own unique setup, creating a great deal of complexity in managing all these different machines. That complexity increases the company’s RTO (recovery time objective) and RPO (recovery point objective) and makes recovery a much larger ordeal.
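The RTO and RPO mentioned above can be sketched with simple arithmetic. All the timing values below are hypothetical examples, not benchmarks of any real deployment:

```python
# Illustrative sketch: worst-case RPO is bounded by the backup interval,
# and RTO is the sum of the steps needed to get back online.
def worst_case_rpo(backup_interval_hours):
    """Maximum data-loss window equals the time between backups."""
    return backup_interval_hours

def estimated_rto(detect_hours, provision_hours, restore_hours):
    """Time to recover: notice the failure, stand up the spare
    hardware, then restore data onto it."""
    return detect_hours + provision_hours + restore_hours

print(worst_case_rpo(24))        # nightly backups: up to a day of lost data
print(estimated_rto(0.5, 4, 3))  # hours until the spare server is serving
```

The point of the sketch is that every manual step in a hardware-bound recovery adds directly to the RTO, which is exactly the complexity the “Boot and Pray” model suffers from.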

Read More >

Disk Fragmentation: how it happens and what defragging actually does

By Tom McDonald | Apr 1, 2011 9:15:00 AM

What Causes Fragmentation

Fragmentation happens over time as you use your computer; it is caused by adding and deleting files. When you delete a file, the file isn’t actually erased, it is just marked as safe to write over. The next time Windows needs to write a new file, it simply looks for the first available spot that is marked as free and writes the data there.
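The process described above can be shown with a toy model in Python. This is a deliberately simplified first-fit allocator, not how Windows file systems actually lay out data:

```python
# Toy model of first-fit allocation on a disk of fixed-size blocks.
# '.' marks a free block; a letter marks a block belonging to that file.

def write_file(disk, name, size):
    """Place `size` blocks of `name` into the first free slots found."""
    placed = 0
    for i, block in enumerate(disk):
        if block == "." and placed < size:
            disk[i] = name
            placed += 1

def delete_file(disk, name):
    """Deleting only marks blocks as reusable; the data isn't wiped."""
    for i, block in enumerate(disk):
        if block == name:
            disk[i] = "."

disk = ["."] * 12
write_file(disk, "A", 4)
write_file(disk, "B", 4)
delete_file(disk, "A")      # leaves a 4-block hole at the front
write_file(disk, "C", 6)    # C fills the hole, then continues after B
print("".join(disk))        # CCCCBBBBCC.. -> file C is split in two pieces
```

File C ends up in two separate pieces because the first free spot was too small to hold all of it, which is exactly what fragmentation is; defragging rearranges blocks so each file is contiguous again.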

Read More >

What is RAM? A quick summary of what RAM is and how upgrading helps you

By Tom McDonald | Mar 30, 2011 2:05:00 PM

Most people have heard the term RAM used among computer users, and although many don’t know exactly what it is or does, they know that having more is probably useful. RAM stands for Random Access Memory, and it exists to work around the slowness of reading data from spinning drives. Your computer reads and writes data on the hard drive, but because traditional hard drives are spinning mechanical devices, they are slow, so slow that the rest of the computer has to wait for the hard drive to finish before moving on to its next task. This creates a “bottleneck”: the slowest part of the computer, which forces everything else down to its pace. Because hard drives could only go so fast, RAM was created to get around this. RAM stores files temporarily on a chip; these are the files the OS needs to access constantly or needs quick access to at any given moment. Splitting storage between the hard drive and RAM gives you one place to keep files permanently and another for the files you need quickly. Those tend to be the programs currently open, which is why a program takes so long to load the first time but only a few seconds to reach after it is loaded.
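The slow-first-load, fast-afterwards behavior described above can be sketched as a simple cache. The file name, the contents, and the artificial delay below are all made up for illustration; the delay stands in for a spinning drive’s mechanical latency:

```python
import time

# Toy illustration of why RAM-resident data loads faster: the first
# access pays the slow "disk" cost; repeats hit the in-memory cache.
DISK = {"report.doc": "quarterly figures"}   # hypothetical slow storage
ram_cache = {}                               # stand-in for RAM

def read_file(name):
    if name in ram_cache:            # fast path: already in memory
        return ram_cache[name]
    time.sleep(0.05)                 # simulated slow mechanical read
    ram_cache[name] = DISK[name]     # keep a copy in RAM for next time
    return ram_cache[name]

t0 = time.perf_counter()
read_file("report.doc")              # slow first load
first = time.perf_counter() - t0

t0 = time.perf_counter()
read_file("report.doc")              # served from the cache
second = time.perf_counter() - t0

print(first > second)                # the cached read is much faster
```

This is the same idea, in miniature, as the operating system keeping open programs in RAM: pay the hard-drive cost once, then serve subsequent accesses from fast memory.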

Read More >

8 Things to ask when choosing the right Managed Services Provider for your IT needs

By Tom McDonald | Mar 24, 2011 10:09:00 AM

Our last article, 5 Different Managed Service Provider (MSP) price models, how to choose the best one for your SMB, focused on the pricing models of different MSPs and how they relate to your own business. This article focuses on what MSPs actually cover and what to look for in the contract itself. Here are 8 topic areas you should ask your MSP about before signing the contract:

Read More >