Shared or Dedicated Hosting – The Final Answer

Entrepreneurs often face the decision between shared and dedicated hosting. Of course, VPS hosting is also a factor, but that’s for a different post. Both shared and dedicated hosting serve a purpose, but which one is right for you?

There are some specific guidelines you should follow when choosing the right type of hosting for your needs. It starts with the nature of your business and goes further into what you expect out of your website. Here are some things to consider before making the decision between shared and dedicated hosting.

Are you an Ecommerce Business?

If you sell online, dedicated hosting is your only real choice. You have an obligation to protect the data of your customers including names, emails, addresses and credit card numbers. With shared hosting, you don’t gain the same security as dedicated hosting.

Along with the security aspect, if you’re an ecommerce business, you probably want to make sure your website performs at its best. This is only going to happen with dedicated hosting, especially if you see large amounts of traffic, even if it’s only for part of the year. Shared hosting simply cannot handle the type of traffic dedicated hosting can handle.

Is your Website Just a Company Profile?

If you don’t have plans to sell anything online and your website will just provide a profile for your company, you may be just fine with shared hosting. Since shared hosting is much less expensive and you won’t need to process any payments or create a large number of product pages, it will work very well for a multi-page company website.

However, if your website starts to receive a large influx of traffic, shared hosting won’t do the trick anymore. As long as you see less than 500 visitors per day, you’ll be fine with shared hosting for a company profile website.

Will SEO be one of your Marketing Strategies?

SEO, or search engine optimization, is a great way to drive traffic to your website and business. If this will be one of your marketing strategies, you may need dedicated hosting. One of the major SEO factors is the speed of your website, and it will load fastest with dedicated hosting.

Shared hosting will work for basic SEO, but if you want to take over the top rankings on Google and really grow your traffic, you need dedicated hosting to support it.

Those only focusing on a smaller, local market (population of 200K or less) will probably do just fine with shared hosting. However, if your local market is all of New York City or you focus on a national or global market, dedicated hosting is probably the best choice.

Do you Plan to Add Blog Posts, Videos and Images?

If you plan to blog and your blog posts will include images and/or video, you may need to upgrade to dedicated hosting at some point. Images and video take up a ton of room, and even though many shared hosting accounts claim to give you unlimited space, they will slow down the load times of your website and of other websites on the server.

Often, when this happens, your hosting company will tell you to upgrade your account to VPS or dedicated hosting, depending on what they offer. If you plan to put up a large number of images or videos, you probably want to start off with dedicated hosting.

Are you Just Getting Started?

Maybe you just started your business and you really don’t know what to do. If you’ll be managing your hosting and your website on your own, start with shared hosting. Even if you have to upgrade in a few months, using shared hosting will help you get your feet wet.

There are several reasons to use shared hosting and several reasons to use dedicated hosting. They both serve a different market. It’s important to understand which type of hosting works best for you before you just jump in and start spending money.

Essential vs. Wildcard vs. EV SSL Certificates

SSL certificates are an important part of online business. They are small digital files that bind a cryptographic key to your organization’s details, helping to keep online transactions safe. Once installed on a server, a certificate activates the padlock that shows up in the browser and allows a secure connection between the browser and the server.

Why Use SSL Certificates

There are two main reasons to use an SSL certificate: authentication and security. An SSL certificate provides not only encryption but also authentication, so you can be sure information is being sent to the right server and not to a hacker’s server. Since your customers’ information passes through several computers before it reaches you, it’s necessary to authenticate your server before the information is delivered.

Security is probably the number one reason to use an SSL certificate. The encryption makes it very difficult for hackers to steal credit card information, passwords, email addresses and other bits of information. The SSL certificate makes the information unreadable to everybody except the server receiving it.

Types of SSL Certificates

For the most part, all SSL certificates use the same methods for protection and authentication. Even so, and although every certificate must be verified by a Certificate Authority, there are a few different types.

Essential SSL

An Essential SSL certificate will secure just one domain. It will secure the www.YourDomain.com and the YourDomain.com versions and the CSR must be generated for both. You will get a green padlock in the address bar of the browser with this type of SSL certificate.

The owner of the Essential SSL certificate will need to complete domain control validation through email. This is done with either the domain’s WHOIS registrant email or an address using admin, hostmaster, postmaster or webmaster. Essential SSL also provides quick issuance.
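For illustration, the CSR itself is usually generated on your server with OpenSSL. This is only a sketch, and the key and CSR file names below are placeholders; your certificate provider may ask for different details:

openssl req -new -newkey rsa:2048 -nodes -keyout yourdomain.key -out yourdomain.csr

You will be prompted for details such as the Common Name, which should match the domain you want to secure.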

Wildcard SSL

A Wildcard SSL certificate will secure one domain and all of its subdomains. It will secure www.YourDomain.com, YourDomain.com and subdomain.YourDomain.com. The CSR only needs to be generated for *.YourDomain.com.

With the Wildcard SSL, you get a green padlock in the address bar of the browser. You will need to complete the domain control validation through email with the WHOIS Registrant’s email or the admin, hostmaster, postmaster or webmaster email options. This SSL Certificate also provides quick issuance.

EV SSL

The EV SSL will only secure one domain in the www.YourDomain.com and YourDomain.com form. The CSR must be generated for both. This SSL certificate will not only provide a green padlock in the address bar of the browser, but will also provide a green address bar.

The legitimacy of your domain must be verified with additional documents provided to the Certificate Authority (Comodo, in this case). The verification process does require a callback to the actual domain owner. Issuance takes time and can take up to 6 weeks with this SSL certificate.

EV certificates provide the maximum amount of trust to visitors, but they also require the most effort to verify.

All three SSL certificates provide different levels of protection. The Essential and Wildcard both provide a $10,000 warranty, while the EV provides a $250,000 warranty. The EV is also the most expensive and the Essential is usually the most affordable. However, the EV takes the longest to verify, as well.

Overall, the three SSL certificates do roughly the same thing. However, they differ in scope, and you may need coverage for your subdomains or want the highest level of trust. Make sure you weigh the three options and figure out what you really need before you purchase.

Is it Time for a Dedicated Server?

You started with a shared hosting account years ago and have since graduated to VPS hosting. Maybe you’ve even gone through a few levels of VPS hosting and you think it’s time for a dedicated server.

Making the jump from shared, cloud or VPS hosting to a dedicated server isn’t always easy. It requires the right budget and the ability to deal with the migration period. Here are a few of the signs you should look for telling you it’s time to upgrade to a dedicated server.

You’ve Seen a Huge Traffic Increase

A traffic increase may be the number one reason to upgrade to a dedicated server: when traffic grows, you need better hosting. If you stay on a shared server, you may be forced to upgrade after crashing the server. In addition, the larger amount of traffic may slow down every site on the server, including yours.

When traffic increases past about 1,000 visitors per day, shared hosting simply won’t cut it anymore. Sure, you can upgrade to VPS hosting, but if your growth in traffic continues, you’ll be on a dedicated server in no time.

You Know You’re About to Experience a High Volume of Traffic

Even if you don’t have a high volume of traffic now, if you know a specific event will cause a huge amount of traffic, you may need a dedicated server. A great example of this has been seen time and time again on the show Shark Tank. Many entrepreneurs go onto the show without the right hosting and watch their website crash within minutes of airing.

Without a dedicated server, or in this case, multiple dedicated servers, those going on Shark Tank may lose their biggest sales day ever. Some have gone in prepared and watched the sales come in, while others have watched their website fall. If you know you will see a huge spike in traffic, a dedicated server is necessary.

Your Site Moves Too Slow

While traffic is one of the major causes of a slow website on shared or VPS hosting, it’s not the only cause. If your site loads slowly, you may need to upgrade your hosting. Often, you need better hosting and more resources to fully support your site. With a dedicated server, the problem will usually go away.

You Plan to Run Multiple Sites

You can run as many domains as you want on a shared hosting or even a VPS hosting package. However, if you run too many sites with too much traffic, it will cause issues. Having a dedicated server to split between your sites will provide the necessary resources to support each one properly.

There are several reasons to upgrade to a dedicated server. If you know you need one, here are a few of the benefits you can expect.

More Control

Shared and VPS hosting don’t give you full control. In fact, no other type of hosting gives you complete server control like dedicated hosting does. You will be able to do anything you want with your server and you won’t need to worry about other sites hogging resources.

Better Site Speed

The fastest hosting on the planet remains dedicated hosting. You won’t find better hosting if you need better speed. With the world needing things instantly and refusing to wait even a few seconds, you cannot afford to have a website that loads slowly.

Better Security

Do you get better security by living on a large property of your own with a gate surrounding it or by living in a neighborhood with hundreds of neighbors? Shared hosting is like living in an apartment building, while VPS hosting is like living in a subdivision. However, dedicated hosting is like living on your own, large, fully secured property.

With dedicated hosting, you don’t have to worry about any type of neighbors causing you issues. Since you’re the only one on the server, any issues come directly from you and nobody else.

Dedicated hosting is the best choice, in many cases, for businesses that have outgrown shared or VPS hosting. It will provide many benefits and will give you a great foundation for your website.

Virtual vs. Physical Server – Which One is Right for You?

Virtualization is a very popular way to build your IT infrastructure. Many companies are choosing cloud and virtual servers over a physical server for their business applications. However, this may not be the right choice for your company.

In some cases, the need for a physical server and the benefits it provides completely outweigh the advantages gained from a virtual or cloud server. Some IT directors will only use physical machines because of the reliability and efficiency.

It’s not a one-size-fits-all situation and every business is different. Depending on your specific needs, you may want a physical server or you may want a virtual server. It all depends on your goals. Here are some of the things to consider before making your final decision.

Cost

For most businesses, the main deciding factor will be cost. You have a budget you must stick to and you may not have enough cash to support physical servers. Virtualization is often chosen to save you money, but there are exceptions to this rule.

The cost for physical servers depends on how many you need to support the traffic you will receive and your data storage needs. It’s important to look at the actual cost of what you need when it comes to both physical and virtual servers. In some cases, those needing multiple servers to support a high traffic site will actually spend less on physical servers than virtual or cloud servers.

Data space and network costs can be very expensive in the cloud environment. If you have a spike in traffic, it can make the overall cost of your virtual servers very expensive. A spike in traffic, with the right physical server set up, won’t cost you anything extra, in most cases.

Performance

Another huge factor to consider for businesses deciding between virtual and physical servers is performance. If you take two physical servers and you virtualize one of them, the regular server will always outperform the virtualized one. There’s a performance penalty for virtualizing.

However, when you take them out of the one-on-one comparison, the efficiency of the virtualized server is always going to be better. Dedicated servers only run at about 30% to 50% of their capacity. This leaves a buffer for traffic spikes, while a virtualized server will use close to 100% of the resources.

Your needs may determine the performance necessary for your business. If you run an ecommerce website and one minute of downtime or slow time could cost you hundreds of dollars, your decision is quite different compared to a company not selling products or services directly on their website.

Security

Maybe you run a company using and collecting sensitive information on a regular basis, so security may be one of your biggest concerns. When comparing the security of a virtual server to a physical server, the physical server will always win: you won’t be sharing resources with anybody else, and there are many extra factors to consider with the security of a virtual server.

There are several important factors to consider when choosing between physical and virtual servers for your business. These three are the main deciding factors for most companies and you must take them into consideration first.

How to Boost Your Server Efficiency

Efficient servers keep customers happy. When your servers lack efficiency, customers may complain, ask for a refund, bash you on social media and give you bad reviews. All of these negative things may make it harder to land new customers and keep current ones.

Instead of losing customers, you just need to understand how to make your servers super-efficient. Here are a few ways you can boost efficiency for your servers.

Management Tools

The right tools can make it easy to manage your servers, optimize them and create a better performing server. When you can get more performance per watt and maximize uptime, you’ll be able to keep your customers happier.

A simple management tool for your servers will help to manage power and provide safeguards against a power outage. It can also help to ensure you maximize the resources of your servers. Other tools will allow you to check the health of your servers and look at each cluster, as necessary.

Management tools go a very long way to ensuring full server efficiency. Make sure you have the right tools in place for monitoring power and the actual servers.

Virtualization

Maybe you don’t have the capital or space for more servers but still want to get the most out of the ones you have; in that case, you can virtualize. This will allow you to create multiple virtual machines out of one physical server, which allows for more efficiency. It takes sophisticated software to partition the servers, but once it’s done, the efficiency will be much better.

With virtualization, you can use multiple operating systems on the same machine while keeping expenses lower. Of course, when you don’t have to buy more servers, you save space and cash.

Buy Recent Equipment

When it’s time to buy new servers, make sure they really are new servers. Buying outdated equipment can cause issues with efficiency. Technology moves so fast that even a server a few years old may be outdated. Using the latest technology is probably the easiest way to increase server efficiency.

Keeping your IT up-to-date is vital to your company, especially if you’re a hosting company. Even if you’re not and you just use your servers for your own business, you need the best possible equipment available. With lesser equipment comes latency and many other issues.

Monitor 24/7

Server issues can happen at any time and they don’t choose when it’s most convenient for you. If you’re not monitoring your servers every single minute of every day, you may not have the most efficient servers possible. What happens if your server goes down and nobody is around to fix the issue?

You can outsource server monitoring to a professional if you don’t have the ability to monitor your servers 24/7. This is probably the most important thing you can do and the best for keeping your servers efficient every single minute of every day.

Server efficiency is vital to your success. Whether you’re a hosting company or you just use the servers for your business, you need the most efficiency possible. Use the tips above to increase server efficiency and keep those machines running well.

Do you Really Need a CDN?

A CDN or Content Delivery Network may be a necessity for your website. However, it’s important to understand whether this is something you need or it’s something you can do without.

The online world has become much more fast-paced and as a website owner, you have to keep up. With bounce rates increasing and those browsing expecting things faster, you need the best possible tools to make your website and blog load fast. A CDN may be one of those tools.

What Exactly is a CDN?

A CDN is a Content Delivery Network, which is basically hosting for static content, such as JavaScript, CSS or images. It takes the burden of serving static content off your main server, allowing your website to serve dynamic content faster without being throttled.

The interconnected servers making up the CDN help take the pressure off of the server you use, making your website faster. They are spread across the globe, which also allows users in different areas of the world to access cached copies of your content from a server located much closer to them.
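In practice, this means your pages reference static files from the CDN’s hostname rather than your own. As a simple sketch (both hostnames are placeholders):

<img src="https://cdn.yourdomain.com/images/logo.png" alt="Logo">

A typical CDN fetches the file from your origin server once, caches it, and serves later requests from a nearby location.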

CDN Benefits

CDNs are known as “low-hanging fruit” compared to other page speed solutions. They are a very easy step to improve the speed of your website and provide plenty of benefits. A faster website leads to lower bounce rates, which has a direct effect on your ranking in the search engines. It also helps to provide a higher quality experience for your visitors.

You will also save bandwidth with a CDN on your host server. This will help to increase your website’s availability and provide a better experience when dealing with excessive traffic.

Often, a CDN will actually result in cleaner HTML markup, giving you even more advantages. Many CDNs also support the latest HTTP protocols, which can give you even more speed.

Integrating a CDN for your Website

The process of adding a CDN to your website is pretty straightforward if you use WordPress, Joomla! or Drupal. After choosing a CDN provider and signing up, you will need to identify the portions of your website that are static content. These portions will be mirrored by the CDN.

Those using a popular CMS, such as WordPress or Joomla!, will be able to do this with a plugin. It will be rather easy, but you may need to modify your DNS records and name servers for your domain. If you’re using a custom website design, it may be a bit harder. If this is the case, you may want to speak with a developer before choosing your CDN.
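As a hedged example of what that DNS change might look like (both hostnames below are placeholders; your CDN provider supplies the real target), a static-content subdomain is often pointed at the CDN with a CNAME record:

cdn.yourdomain.com.    IN    CNAME    yourdomain.cdnprovider.net.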

Make sure you test your website after making all the necessary modifications. You will want to make sure the content mixes properly and SSL issues are resolved. You may also need to adjust your caching rules.

A CDN can be a very helpful tool for speeding up your website. Not every website will require one, but if you want the best speed possible, using the right CDN will make a difference.

What to Expect with Cloud Hosting for the Rest of 2016

Cloud hosting is in a very exciting place right now. It’s no longer just an up-and-coming technology. Many different surveys have shown that it has become mainstream with about 90% adoption.

With the cloud industry evolving and growing at lightning speed, what can we expect for the rest of 2016? There are many things that could happen. Here are five of our predictions for the cloud industry for the rest of this year.

Multi-Cloud and Hybrid Becoming More Popular

We have already seen the hybrid cloud market start to grow, but it’s set to really take center stage throughout the second half of 2016. Since consumers don’t want a “one size fits all” solution, using hybrid cloud solutions is a much better option for many companies. This will cause both the hybrid option and the multi-cloud option to grow quite a bit throughout the rest of the year.

The hybrid solution will allow companies to use a mixture of private and public cloud options for different areas of their business. This may allow for lower costs and better performance. Companies choosing the multi-cloud option will be able to combine technologies to help prevent vendor lock-in.

Security Must Improve

The one downfall to cloud hosting is security. It’s one of the largest concerns for businesses using cloud products and even individuals using the cloud. It’s important for security to improve, as business leaders don’t always know where their security really stands. In addition, hackers have been targeting more small and medium businesses, so any possibility of a hacker attack needs to be minimized.

Security measures may become more granular to help solve the problem. Regardless, security is a huge concern in the cloud industry and it will likely take a few steps forward throughout the rest of 2016.

Huge Overhauls Will Continue

We’ve already seen some companies start moving everything to cloud hosting. This will continue and huge cloud overhauls will happen for many companies. The companies that have already been on a cloud system for a few years will start to deploy new solutions as the technology gets better.

Many businesses have already started to outgrow their systems and will need to rebuild something from the ground up. With new technology available, 2016 is the year where many businesses will invest in new cloud solutions allowing them to rebuild their infrastructure to better fit their needs.

Managed Service Becomes Even More Important

When the cloud first came into existence, businesses were worried more about choosing the best vendor. Now that it has become more mainstream, price wars will heat up and managed services will become far more important.

Since all businesses have access to the same tools and technology, it won’t be about the infrastructure as much as solving the problems some have with the cloud. Managed service providers will become a more viable option for many and far more important throughout the rest of this year. These solutions will provide a competitive advantage for some and a convenience for others.

Containers will Keep Growing

Containers increase efficiency, and they will keep growing. They take up less space than virtual machines because they share the host operating system rather than bundling a full one of their own, which also helps keep overhead down. They won’t replace virtual machines, but they will continue to grow throughout the rest of this year.

These are just a few of the predictions for the fast-growing cloud industry. The industry will continue to grow throughout the rest of this year with plenty of new, exciting technologies coming.

Link Building Strategies That Actually Work for 2016

Some may tell you link building is dead, but they are wrong. There are still many very effective ways to build your links and gain higher rankings on Google and other search engines.

While methods from two years ago may not work anymore, there are ways to build links today. You do need to keep some factors in mind, such as putting content first and seeking high-quality links from related websites. Here are some of the top strategies for building links successfully in 2016.

Find Broken Links

Websites linking to sources often end up with broken links when a source is no longer available or moves to a new URL. If your site has content that fits the topic, you can ask the owner of the website to point the broken link to a functional page on your website. In other words, you may be able to get them to swap in your link in place of the broken one.

Have Images Linked Back to You

By performing a Google Images search, you can find images from your website being used on other websites. Then, you can contact the owner of the website and ask them to put your website in as the source for all of the images they are using. This is a great way to build links without much work at all.

Provide Great Content

One of the best and most organic ways to build links for your website is to provide great content. Simply giving your visitors what they want will get you more visitors. Some of those visitors will be blog and website owners looking to link back to your website in their own content. This can prove to be very powerful and provide plenty of natural links over time.

The Skyscraper Technique

Another content creation strategy that works great for building links is called the Skyscraper Technique. This technique includes finding content that gets a ton of traffic and creating even better content. For example, if you find a list of 10 great ways to build links for SEO, you could create a list of 15 or 20.

Use the Right Tools

Link building tools are not dead and can certainly be used to build great links to your website. However, you do need to be selective in the ones you use. Some of the best include:

  • Buzzstream – Great if you have a team around your website.
  • Quicksprout – A great tool for creating better content than another web page.
  • Kerboo – A data intelligence platform giving you plenty of great data for link building.
  • Offsite Metrics – If you want to monitor the use of your images on other websites, this is the tool for you.

There are a few great tools you can still use to help build backlinks. However, they are no longer directly link building tools. Instead, they are tools that will help make your chosen strategy easier.

If you want to build links to your website and increase your overall ranking, these tips and tools will help you. Be careful and make sure you are only using strategies and tools that won’t cause your site harm.

What are the Major Advantages of Business Colocation?

As a business, hosting is very important. It may be what keeps you going and keeps everything running smoothly.

You have many different hosting options all the way from shared hosting to dedicated and even colocation. For many businesses, colocation makes quite a bit of sense, especially if you already own servers.

Colocation will allow you to store the equipment you use (servers) in rack space located within a secure data center. You will be able to use a public IP address, along with power and bandwidth from the service provider.

Advantages of Colocation for Businesses

When it comes to using colocation for your business servers, there are several advantages. Here are just a few of the major advantages you will gain.

Better Network Security

In most cases, renting data center space will be far more secure than housing your servers at your actual place of business. Most data centers provide top-notch network security with the best firewalls and intrusion detection systems you can find. This will ensure nobody gets in without authorization.

Room for Growth

Maybe your business is growing and you’ve already outgrown the area you house your servers in. With colocation, you have plenty of room to grow and expand, as needed. Data centers are often very large and offer plenty of room for the IT infrastructure of your company to grow. You won’t need to invest as much money into growth, either.

Better Connectivity

With a colocation data center you will gain access to a fully redundant network. This provides better connections for your business applications and keeps everything running without interruption.

Excellent Capabilities

Data centers will give you the ability to use higher levels of bandwidth when you have a huge amount of traffic. Maybe a video went viral, or maybe you just see more traffic during the holiday season; with colocation, you won’t have to worry. Data spikes happen in business, and a colocation center will be able to help you handle the traffic without issue.

Redundant Power

Colocation centers also provide you with redundant power, which is another level of security. It will ensure you stay up when the power goes out. Many use diesel power generators, double battery backup systems and other power backups to ensure your servers are always up.

Moving Towards Cloud Migration

If you plan to move to the cloud in the near future, colocation may be the first step. This allows you to move all of your equipment out of your facility and gain increased capacity and performance for your business. As you move to the cloud, this will be a very important part of the process if you want to ensure a smoother transition.

There are several benefits of colocation for businesses. When you want to grow or you simply need the space your servers are taking up, colocation may be the answer. It will allow you to run your servers without the overhead of a larger facility.

Biggest Advantages of a Dedicated Server

Compared to any other type of hosting, a dedicated server comes with plenty of advantages. While many are moving to the cloud or trusting their cheap shared hosting, dedicated servers still provide the best hosting option for most companies and websites.

A dedicated server certainly provides more power than shared hosting, VPS hosting or cloud hosting. It gives you more control and plenty of resources to use for your website. Here are a few of the biggest advantages you gain with a dedicated server.

Flexibility

Some consider it overrated, but flexibility is vital to the success of many websites and applications. A dedicated server gives you the ability to fully customize the server however you see fit. You can adjust the CPU, disk space, RAM and software, even if you have managed dedicated hosting.

Shared hosting doesn’t allow much flexibility and you’re often stuck with the software they give you. VPS hosting offers some customization and flexibility, but it simply cannot compare to that of a dedicated server.

With a dedicated server, you can customize the server environment to fit your specific needs. You get to choose the software and platform best for your projects.

All the Resources are Yours

Dedicated hosting is the only hosting option offering all of the resources of the server to one client and one client only. Shared hosting uses a “free for all” method of sharing, while VPS hosting partitions the resources. In both cases, you have to share the server with other clients and you don’t gain the full power of the server.

If you need all of the CPU and RAM or you just need the security of knowing you are the only client on the server, a dedicated server is the only way to go.

Better Performance

Dedicated servers outperform all other types of hosting when properly utilized. They provide faster website load times, better uptime and better overall performance. You don’t have to worry about anybody else impacting the performance of the server with a traffic spike or a user error.

More Security

If you’re on a shared server, you count on hundreds, maybe thousands of other clients to keep their sites and accounts secure. A mistake made by someone else may cost you.

With a dedicated server, you don’t have this type of security concern. You’re the only one on the server, so if a mistake is made, it was your fault not someone else’s. You also get the ability to customize the server security however you please.

Unique IP Address

A unique IP address helps with SEO and other parts of any online project. With shared hosting, you share the IP address with a bunch of other websites. If one of those sites becomes a spam site, it may cause you to struggle with your ranking.

Your dedicated server will have a unique IP address only for you. This means you gain full control of it. If you plan to run an eCommerce site, this is vital for security.

There are many benefits and advantages to a dedicated server. Whether you choose to buy a server and use rack space or you plan to rent a server and let another company manage it, you gain all types of benefits compared to shared, VPS or cloud hosting.

Cloud Computing vs. Dedicated Servers – What You Should Know

Everything in the hosting industry seems to be about cloud computing these days. It provides scalability, redundancy and on-demand services, but is it really as good as many advertisers say it is?

While it may not fit with every business or website, it’s important to know the differences between cloud computing and dedicated servers. You want to be sure you get the right hosting and here’s what you should know about both.

What the Heck is the Cloud, Anyway?

Understanding what the cloud is will help in your search for the right hosting. The cloud is often prone to hardware problems and even network issues. It’s not always as good as advertised and can be prone to bottlenecks. The cloud is a shared service where the RAM, CPU and network resources are shared with others.

Which is Faster?

There is no doubt that a good dedicated server will outperform most, if not all, cloud services when it comes to speed. Cloud may provide excellent storage, but it’s not going to provide the same speed. Many use cloud computing due to the scalability. However, a dedicated server can easily be scaled and gives you plenty of speed compared to cloud services.

Disk IO Comparison

When a dedicated server is configured correctly, it often provides excellent value for your money, especially with disk IO. Cloud services, on the other hand, often lead to unpredictable disk IO. In many cases, if one client on cloud hosting starts to see a ton of traffic, it slows everything down for everyone else on the platform. In addition, not all cloud computing disk IO issues are easily solvable within the cloud framework.

Redundancy Comparison

Redundancy is one of the reasons many choose cloud services. A cloud platform is more redundant than a single dedicated server; this is simply the truth and there’s not much you can do about it. However, an individual cloud node isn’t much more reliable than a single dedicated server. If the node fails, it will be much the same as if the CPU or the RAM failed in a dedicated server.

Complexity Comparison

One of the murkier aspects of cloud computing is its complexity. Dedicated is the better choice when complexity is considered. Complexity adds costs, and cloud hosting is certainly more complex than dedicated server hosting. If you offer web design services, using WordPress, Drupal or Joomla with cPanel or Plesk isn’t as easy on cloud services as it is on dedicated servers. You will get more value out of a dedicated server, as well.

Price Comparison

While cloud services may seem cheaper up front, they can get very expensive very quickly. A well-managed dedicated server or servers can provide excellent performance with plenty of value. When comparing the two, you get more value out of a dedicated server than cloud computing, in most cases.

Overall, cloud computing vs. dedicated server hosting will be a debate many will disagree on. For most businesses, a dedicated server will provide more bang for your buck, more speed and a better hosting choice.

An Introduction to Web Servers: Part 2

A web server will typically run as a daemon (system service) under a single application process. That initial process will then spawn child processes that handle virtual servers, individual websites, or even individual requests. As such, a web server could spawn hundreds or even thousands of processes per day, per hour, or even per minute.

For security, child processes normally run as unprivileged users (e.g. nobody) to prevent hackers from gaining access to the server operating system through a back door in the web server. Dynamic web applications can also pose an added security risk, so a web server may run scripting modules as separate CGI programs, rather than as part of the web server itself.
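As a quick sketch of the unprivileged-user idea, this is how it looks in Apache HTTP Server’s main configuration file; the account names are placeholders for whatever low-privilege user your system provides:

User nobody
Group nobody

Child processes then run with that identity, so a compromised request handler has no more rights than the nobody account.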

SSL and TLS are standard on all web servers and facilitate the use of secure transactions via the HTTPS protocol.

Web servers use compression to speed up requests and serve web pages and content using less bandwidth. Gzip compression is a standard method of delivering compressed data.
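As an illustrative example (nginx directives shown here; Apache and other servers have equivalent modules), enabling gzip compression can be a couple of configuration lines that turn it on and list extra content types to compress:

gzip on;
gzip_types text/css application/javascript;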

The standard HTTP port for web servers is port 80, and the standard port for HTTPS is 443; however, most web servers can be configured to run on virtually any port.
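For example, in Apache this is a one-line change in the configuration file, with 8080 used here as an arbitrary alternative port:

Listen 8080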

 

An Introduction to Web Servers: Part 1

Once you have chosen an operating system, set up some basic security, and decided on a web-based control panel, you will need to decide what web server software you will run. Some control panels will install a web server for you, but it may be worth taking the time to choose one that is right for your needs.

Web server software serves the HTML pages stored on a server. In addition, a web server may also process scripts, run additional processes, and connect to a database server. It is essentially the intermediary between your dedicated server and the Web. Without it, no one will ever see what you have on your server. That makes the web server one of the most crucial components of a server, and choosing the right one is critical.

There are a few key factors to consider when selecting a web server:

1. Performance – speed, load averages, and memory consumption

2. Stability and security

3. Scalability

4. Features

Some web servers are fast and lightweight but may not provide as many features. Others may offer a wide range of features but may not scale well. The following are web servers that work well in most situations and have proven to be reliable for small and large websites. They all have the basic features one would expect, like support for virtual servers and SSL (Secure Sockets Layer), as well as static and dynamic web pages. Additionally, each has its own unique features that make it ideal for specific types of websites.

How to Fix the Bash Shellshock Vulnerability on Linux

In the previous post, we explained how to check your Linux server for the highly publicized Shellshock vulnerability in Bash. Fortunately, most, if not all, major Linux distributions have already uploaded the fix into their package management repositories. All you have to do is install the latest version. Unfortunately, there is some evidence to suggest that those updates are currently incomplete. Nevertheless, keeping it updated will thwart potential attackers. Red Hat and other companies are working at this very moment to roll out full fixes. They may even already be available by the time this article is published.

On Debian or Ubuntu distributions, run the following command to update bash:

$ sudo apt-get update && sudo apt-get install --only-upgrade bash

For Red Hat, Fedora, and CentOS, run:

# yum update bash

Once you have ensured that you have the latest version of bash, you should re-run the vulnerability check to see if your server is now safe from it.

You should not need to reboot your server or take any further action once you have completed the update. Nevertheless, over the next few days, more updates might be released as developers and system vendors learn more about the extent of the vulnerability and the types of malware that cyber criminals produce to exploit it.

 

How to Check Your Server for Bash Shellshock Vulnerability

The hosting world has been hit with yet another highly publicized server vulnerability. This one affects the ubiquitous shell program GNU Bash and is referred to as Shellshock. Most Linux, BSD and Mac OS X operating systems and variants use Bash or derivatives of it. All Bash versions between versions 1.14 and 4.3 are potentially vulnerable. Fortunately, it is easy to check for the vulnerability and easy to fix.

To test your Linux server for the vulnerability, login via SSH and type this command from the bash prompt:

env VAR='() { :;}; echo Bash is vulnerable!' bash -c "echo Bash Test"

If the vulnerability is present, the output will look like this:

Bash is vulnerable!

Bash Test

The “Bash is vulnerable” line is where an attacker could potentially inject code through any service or program that uses bash scripting. These programs may be more prevalent than you might think. If your Bash installation is not vulnerable, the output will not print the “Bash is vulnerable” line but should still print “Bash Test”.

In the next post, we will cover ways to fix the vulnerability on a number of Linux distributions.

 

Server Architecture: Past, Present, and Future – Part 3

The Future

One challenge that chip makers have juggled is the power requirements for faster chips and the heat output associated with them. In some cases, cooling fans and heat sinks have become inefficient ways to keep chips cool, leading a few to invest in liquid cooling and other unconventional methods.

As the world becomes more conscious of the need for energy conservation and corporate social responsibility becomes more of a requirement rather than a suggestion, many businesses with servers are looking for ways to reduce energy output and costs. It is, therefore, no surprise that AMD and Intel are introducing chips that are designed to be “green” and reduce energy usage, while still giving applications their necessary power.

Companies like VIA see this as an opportunity to introduce their low-powered embedded chips into the server market. Dell, for example, is using VIA’s nano chips for their mini servers.

Another embedded chip manufacturer, ARM Holdings, plans to introduce server processors based on their Advanced RISC Machine (ARM) chips in the coming years. ARM has become well known for powering mobile phones and other small devices, but the company believes its processors can usher in a new era of low-powered, energy-efficient servers.

Without a doubt, Intel, AMD, and others will have to find ways to compete with this new trend of smaller, more energy-efficient processors. In that regard, the future remains uncertain. What is certain is that server processors will continue to get smaller and faster. There may come a time when a server is kept in the drawer of an office desk, which is a future many business owners and IT professionals will welcome.

Server Architecture: Past, Present, and Future – Part 2

Other companies that produce x86 processors include Cyrix, AMD, and VIA. AMD, in particular, has become Intel’s biggest competitor.

At the turn of the century, Intel introduced their high-powered 64-bit processor line called IA-64, or Itanium. Originally conceived at HP, the project was later joined by Intel with the intention of making server-class processors to compete with IBM PowerPC and Sun SPARC processors. The Itanium mainly powered servers running the HP-UX operating system (HP’s UNIX-like OS), but Microsoft and others also supported it.

AMD’s release of their 64-bit extension for x86 processors (called x86-64), however, put a serious dent in the Itanium market, one that would eventually lead to its demise. Recently, Microsoft announced that it would stop supporting Itanium processors.

In order to compete with AMD’s server processors, such as their Opteron, Intel decided to join in on AMD’s party and create their own x86-64 processors, based on their previous Xeon server models.

Today, multi-core Xeon and Opteron processors are very popular choices for dedicated servers, particularly those that power web applications. Current clock speeds reach as high as 3.8 GHz. Although PowerPC, SPARC, and other architectures still exist, they have only become prevalent in more specialized markets.

Server Architecture: Past, Present, and Future – Part 1

Servers have evolved significantly over the past three decades. At one time, a single server filled an entire room, required its own cooling system, and had processors that ran slower than many of today’s mobile phones. Today, servers are smaller, faster, and more energy efficient, but the most important element is still the processor.

Because server technology is constantly evolving, it is important to know the history of server processors, how they came to be, and where they are headed. Microprocessors have evolved from single-core, inefficient chips to powerful multi-core multitaskers with on-chip cache and incredible speed.

Brief Processor History

Early server processors were primarily RISC (reduced instruction set computing) chips, dating back to 1964. By the standards of the time, these processors were extremely fast and revolutionary.

Later RISC-based processors would include IBM’s PowerPC chips and Sun Microsystems’ SPARC. For several decades (from the 80s until present), these chips have powered many of the world’s servers.

Intel’s 8086 processor introduced a new type of chip architecture, which would come to be known as x86. The 8086 was introduced in 1978 as Intel’s first 16-bit processor. That first chip had a maximum CPU clock of 10 MHz. Although most people think of these processors as being designed primarily for desktop computers, the 16-bit and later 32-bit versions have both powered a multitude of small and large servers.

 

How to Add/Remove Yum Repositories

Red Hat Enterprise Linux, CentOS, Fedora and other Linux distributions based on RHEL all use YUM as a package management system to install, remove, and update software. Each distribution has its own main repository, but you can also install or remove third-party repositories whenever you like.

To add a YUM repository, type as root:

yum-config-manager --add-repo repository_url

For example:

yum-config-manager --add-repo http://www.serverschool.com/serverschool.repo

And the output will look like:

Loaded plugins: product-id, refresh-packagekit, subscription-manager

adding repo from: http://www.serverschool.com/serverschool.repo

grabbing file http://www.serverschool.com/serverschool.repo to /etc/yum.repos.d/serverschool.repo

serverschool.repo | 413 B 00:00

repo saved to /etc/yum.repos.d/serverschool.repo

You can then enable the repository with:

yum-config-manager --enable serverschool

To disable a repository:

yum-config-manager --disable serverschool
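To remove a third-party repository entirely, the usual approach (shown here with the file name from the example above) is to delete its .repo file from /etc/yum.repos.d:

rm /etc/yum.repos.d/serverschool.repo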

For more information about repository management with YUM, see this Red Hat documentation.

 

How to Add and Remove APT Repositories

Ubuntu based Linux distributions rely on a program called APT to handle package management. Using the command “apt-get”, you can install, remove and update other programs. The packages installed with APT are determined by software repositories, and while every Linux distribution has default repositories, you can also add or remove third-party sources.

To add a Personal Package Archive (PPA) to your Ubuntu server or virtual private server (VPS), run this command:

sudo add-apt-repository [ppa name]

For example, if the PPA is ppa:server/school, you would type:

sudo add-apt-repository ppa:server/school

The output will look like this:

tavis@serverschool:~$ sudo add-apt-repository ppa:server/school

[sudo] password for tavis:

Server school is a fake program I made up for the purposes of this article

More info: http://www.serverschool.com

Press [ENTER] to continue or ctrl-c to cancel adding it

…followed by gpg information

Then, to refresh the repository list, you need to run:

sudo apt-get update

Finally, install the software you wanted:

sudo apt-get install server-school

To remove a PPA, you can use a similar command structure:

sudo add-apt-repository --remove [ppa name]

So, to remove the server school repository, you would type:

sudo add-apt-repository --remove ppa:server/school

As with any third-party software, be careful when adding PPAs, especially from sources you do not fully trust. If you are unsure about a PPA, it is best to leave it off of your server.


Top Web Server Software for Dedicated Servers

Netcraft publishes a list of the web’s most widely used web server software every month. Here is a brief look at each of those top web servers and what they can do.

Microsoft IIS (37% market share) – Microsoft Internet Information Services is the web server designed specifically for Microsoft Windows Server operating systems. It has recently gained popularity due to Windows Azure cloud services and is now the most widely used web server on the web. It is proprietary and requires a license for Windows to legally run.

Apache HTTP Server (35% market share) – Apache enjoyed the top spot for most used web server for over a decade, but that reign has come to an end. The free and open source web server software is available for installation on most Linux distributions, BSD, variants of Unix and even Windows.

Nginx (14% market share) – Nginx is known for being a high-performance web server that can handle high load and traffic. Many extremely popular websites with millions of daily visitors depend on it. It is now the third most popular web server and continues to chip away at Apache’s #2 spot. Like Apache, it is also free and open source.

There are a number of other web servers that make up a considerable percentage of the web, including Google’s own custom web server, but since these are not available to the public, we have not included them.

Understanding Systemd and How to Use It: Part 2

In part one, you learned a little about what systemd is and which Linux distributions plan to use it. In part 2, you will learn how to use systemd to start and stop services. We will use Red Hat Enterprise Linux, CentOS and Fedora in the explanation, but most of it will apply to other distributions that use systemd as well.

With the old init system, you could use the “service” command to start and stop services. For example, to restart Apache, you would type from the command line:

service httpd restart

The more direct way of doing it was to find the actual init script in /etc/rc.d/init.d and restart it using that script. The service command has now been replaced by the systemctl command. For now, you can still use the “service” command, and the OS will just remind you that it is no longer the standard way.

[root@serverschool ~]# service sshd restart

Redirecting to /bin/systemctl restart sshd.service

Service scripts now have the “.service” extension, and you can use them by executing the systemctl command:

systemctl restart httpd.service

The important thing to note is that the action is now listed before the unit name: you type “systemctl restart httpd.service” rather than “service httpd restart”.
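Other common actions follow the same pattern, shown here with the same httpd.service unit as an illustration:

systemctl start httpd.service
systemctl stop httpd.service
systemctl status httpd.service
systemctl enable httpd.service

The last command tells systemd to start the service automatically at boot, taking over the job of the old chkconfig utility.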

For more information about the systemctl command, see this documentation.

Understanding Systemd and How to Use It: Part 1

Systemd has gradually made a name for itself in the Linux world and is or will eventually be the default service management system for a number of major Linux distributions. Those accustomed to the old init systems will not find Systemd to be horribly complex, but it does feature some significantly different approaches to service starting and management.

Systemd runs as a daemon, hence the “d” at the end of the name. It manages all other daemons from boot to shutdown. Rather than using a shell script to initialize each daemon, Systemd relies on a declarative configuration file. It also is capable of starting processes concurrently, allowing for faster boot times.

While Systemd has found a home in numerous Linux distributions, it is not without its detractors, including Linux creator Linus Torvalds and Slackware founder Patrick Volkerding, who have criticized the way its development is handled and its use of config files rather than shell scripts. Despite this, Fedora, Arch Linux, CoreOS, openSUSE, Red Hat Enterprise Linux and CentOS all use it by default, with Debian and Ubuntu planning to do the same. Therefore, it is a good idea to learn how to use it, and part 2 of this introduction will start you on that journey.

 

How to Configure Linux Password Policies

One of your best weapons in the fight for server security is strong password management. Using the password policies you set in Linux, you can enforce strong passwords, require password renewals and apply many other effective security measures.

First, you should install the cracklib module for PAM. Cracklib tests password strength. If you are using RHEL, CentOS or Fedora, it is installed by default. You can find password security options in /etc/pam.d

To set the minimum password length, edit /etc/pam.d/system-auth on Red Hat distributions or /etc/pam.d/common-password on Debian distros. The length setting will look something like this:

password requisite pam_cracklib.so retry=3 minlen=8 difok=3

Here, minlen is the minimum length in characters. In this example, the minimum length is 8 characters. The “difok” setting specifies the number of characters that must be different from the previous password.

Next, you can set password complexity in a line that contains “password” and “pam_cracklib.so”. It will look like this:

password requisite pam_cracklib.so retry=3 minlen=10 difok=3 ucredit=-1 lcredit=-2 dcredit=-1 ocredit=-1

“ucredit” controls uppercase letters, “lcredit” lowercase letters, “dcredit” numerals, and “ocredit” symbols. A negative value means at least that many characters of that class are required.
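Password renewals are handled outside of cracklib. As a hedged example (alice is a placeholder username), you can require an existing user to change their password every 90 days, with a 7-day warning, using the chage command:

chage -M 90 -W 7 alice

System-wide defaults for new accounts can be set in /etc/login.defs with the PASS_MAX_DAYS and PASS_WARN_AGE settings.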

For more on PAM and all that it can do to manage your passwords, see the online documentation.

 

 

4 Common Open Source Licenses

As you enter the world of server management, you are likely going to encounter free and open source software. Even a Windows system administrator these days will likely have to at least run Linux in a virtual machine at some point. Therefore, having a little background knowledge on how Linux and other open source software differs from proprietary software can be useful.

The following are 4 common open source licenses:

1. GNU GPL – Perhaps the most widely used open source license, the GPL is also the strictest. While it allows anyone to freely use, download, distribute, modify or even sell the software, it requires that any redistribution carry the same license. Linux is famously licensed under GPL v2.

2. BSD – This type of license is considered more permissive. With it, you can essentially create a proprietary version of the software after you have performed your modifications. You are not required to release anything under the same license.

3. Apache – Similar to the BSD license, Apache allows you to release your changes to the code under a different license. It also includes explicit provisions covering copyrights, trademarks and patents that the BSD license does not mention.

4. GNU LGPL – Designed as a compromise for libraries, the LGPL allows programs under other licenses, including proprietary ones, to link against LGPL-licensed code. Like the GPL, however, it requires that any modifications to the LGPL-licensed code itself be released under the same license.

 

5 Signs You Need a Managed Server

Unmanaged servers are available for lease all over the web. They are cheap, rapidly deployed and usually connected to very fast networks inside secure data centers. Nevertheless, an unmanaged server is not for everyone. Here are five signs you need a managed server rather than an unmanaged one.

  1. Your frustration level has reached an all-time high, and you are finding it difficult to set up server applications and troubleshoot problems.
  2. You want to spend more time on your business and less time on the technology that powers it.
  3. You have suffered some serious security issues and do not know how to fix them or prevent them from happening again in the future.
  4. You have an operational budget you can spend on server management and find that cheaper than hiring full-time IT staff to keep your server running smoothly.
  5. You simply do not have the time to manage your server by yourself.

Managing your own server can be difficult, time-consuming and ultimately expensive if you have to hire staff to take care of it. On the other hand, it might be worth the extra money to have a managed server and not have to spend time dealing with technical issues.

 

Send System Messages to Server Users

If your server has multiple users, you might want an easy way to send messages to them and make sure they receive them. The best way to do that is to send the message directly to their terminals. One tool you can use to do just that is “wall”. With it, you can send a message to all logged-in users at once, and unless they have specifically disabled message reception, they will receive it.

To send a message to logged in users, type wall, press Enter, type the message you want to send and finally press CTRL+D. For example, to send a test message:

# wall

This is a test message

[CTRL+D]

Output:

Broadcast Message from tavis@serverschool

(/dev/pts/2) at 20:35 …

This is a test message

If you do not want the “Broadcast Message” banner, simply use the “-n” option. The message will reach all logged-in users, which is particularly helpful if you need to tell them the server will be unavailable shortly due to maintenance.

# wall

This server will be unavailable in 5 minutes due to scheduled maintenance. Please save your work and log out.
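Because wall reads from standard input, you can also pipe a message to it, which is handy in scripts or cron jobs; the message text here is just an example:

echo "Backups begin in 10 minutes. Expect brief slowdowns." | wall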

For more information about wall, type “man wall” from the command line.

Is Mac OS X Good for Servers?

When you think of Mac OS X, you probably think of iTunes, graphic design, music production and other creative activities. It is primarily a desktop operating system, but Apple does sell a server add-on for its OS. The question is: is that server version useful for real-world server operations?

Some of the advantages of OS X server include:

  • Ease of administration – Like many Apple products, it is designed to be relatively easy to use. It includes graphical administration tools and easy setup of client systems.
  • Low cost – Although it is obviously more expensive than a free Linux distribution, it is still less expensive than Windows Server or a commercial Unix license.
  • Unix strength – Underneath, OS X Server includes many Unix-like tools that give it a surprising amount of power.

Some of the disadvantages are:

  • Not an enterprise OS – Do not expect to easily deploy a cluster of OS X servers. It is not built like an enterprise OS and does not include many of the tools you might want if you were to think big.
  • Hardware – OS X Server runs only on Apple hardware, which is more costly and more difficult to support than the alternatives.
  • Vendor lock-in – If you build your server on OS X, you are locked into the hardware and software. As with any proprietary OS, you give up the freedom to easily migrate to something else.

OS X Server has its pros and cons. Ultimately, if you have a specialized product involving Apple systems, it might make sense. For web hosting or large-scale enterprise use, you will probably want to look elsewhere.

What is a Journaling File System?

As we discussed in a previous post, Linux servers offer many different types of file systems, and every other server operating system also offers a choice of file systems. One type of file system you might encounter is called a journaling file system. What is it and how does it differ from a standard Linux file system?

A journaling file system is designed to protect against data loss by recording disk transactions to a log (the journal) before they are committed. After a system failure, the file system replays or checks the journal against the actual on-disk data and corrects any discrepancies. Without journaling, a single crash could leave the file system corrupted.

The old default Linux file system, Ext2, did not have journaling. Newer Linux file systems such as Ext3 and Ext4 use journaling, and XFS supports it as well. Similarly, older Windows file systems, such as FAT and FAT32, do not support journaling, whereas NTFS does.

The main disadvantage of a journaling file system is that it involves more disk activity than a non-journaling file system. In theory this could make it slower, but with modern disks and file systems you may not notice a difference. There is also some debate about how journaling should be configured on solid state drives (SSDs), or whether it should be used on them at all.
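If you want to check whether an Ext3 or Ext4 volume has a journal, one quick way is to look for the has_journal feature with tune2fs; /dev/sda1 is only an example device name:

tune2fs -l /dev/sda1 | grep -i journal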

For more information about journaling file systems, see this article.

What are binary and source packages?

While learning to use a Linux or BSD dedicated server, you are likely to encounter the terms binary and source software packages. Depending on your actual operating system, it may use one, the other or both as default methods of software installation.

A source package is a file archive that contains the full source code of a piece of software. In order to install that software, you need to unpack the archive and build the software from source using whatever build tools it requires (e.g. make, cmake or others).
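As a rough sketch of a manual source build for a typical autotools-based project (example-1.0 is a made-up archive name, and real projects may document different steps):

tar -xzf example-1.0.tar.gz

cd example-1.0

./configure

make

make install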

Some operating systems, such as Gentoo or FreeBSD, also provide package repositories that allow you to build software from source automatically. The advantage is that programs built from source can be better optimized for your architecture and settings.

A binary package is pre-compiled and built to general specifications that should be compatible with your OS and architecture but may not be tailored to your specific settings. Most Linux distributions include binary package repositories that make installation easy. Binary packages require dependencies that match the specifications used when the packages were originally built, so installing them manually can sometimes be a pain. When you use a repository, however, they are easy to install and much faster than building from source.
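For comparison, installing a binary package from a distribution repository is usually a single command; nginx is used here only as an example package name, and availability depends on your distribution’s repositories.

On Debian or Ubuntu:

apt-get install nginx

On RHEL or CentOS:

yum install nginx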