Data’s Really Moving: Superior Fibre Optic Technology is Here

Reading Time: 3 minutes

In all this talk about servers, databases, cloud storage and the rest, fibre optics have been a bit of a lesser light, despite being the unsung hero of the supercharged digital world we’ve seen take shape. Now some of you will know what a terabyte is, and others won’t. Most of you will be familiar with a gigabyte though, and a terabyte is 1,024 gigabytes.

That’s a lot of capacity, and with that understood, fibre optic cables these days can achieve transfer speeds of up to 255 Tbps. Simply put, that’s blazing fast, and as a Canadian web hosting provider we don’t overlook the value those transfer rates have in letting us work with our clients to make their websites – and in the bigger picture their online marketing efforts – really pay off.

What’s different nowadays is that it’s no longer just a single-core fibre. Fibre optic cables now feature a multi-core glass stem that allows an enormous load of data to flow through them. While this technology is still a long way off from being used extensively throughout cities, it’s setting the tone for the newer upgrades poised to hit the fibre optics market – and they’re not far off.

Information Super Expressway

Many cable providers plan to upgrade their lines, with some of the world’s major cities soon to see cables capable of handling 400 Gbps. That’s a huge upgrade from the current standard of 10 Gbps that many companies use. The newer fibre optic cables also include a host of upgrades and capabilities that really set them apart from their predecessors. Here’s a quick reference of the improvements:

Security – The physical integrity of an optic cable actually plays a big role in how easily attackers can get at the data it carries. Because fibre optic cables transmit data as light rather than electrical signals, there’s no electromagnetic emission to intercept. About the only way to break into them is to physically cut into the cable, and when that happens the light escapes, making it easy for network security to notice the breach.

Design and Speed – These newer materials are lighter and thinner too. They can be more easily wrapped in a protective coating and hung from poles. In North America the majority of wires are still suspended from poles rather than buried in the ground, so weight is a major issue. The reduced weight also allows more strands to be bundled into a single cable, which means more data transfer.

Reliability – Since all of the data is transferred through a core of glass, it’s by and large insulated from any threat that might interrupt or shut down transfer. There’s no fear of electromagnetic interference or radio-frequency interference, otherwise known as EMI and RFI. There’s also very little risk of impedance or crosstalk. Although it’s still susceptible to temperature changes, it’s more reliable when running through water or near industrial equipment.

New fibre optic technology continues to expand in leaps and bounds, and not a moment too soon with the way demand for internet usage continues to skyrocket in North America. The aim, of course, is to reach the point where nearly instantaneous data transmission is available the world over, and for a team as eager as ours when it comes to web hosting and online communications – that sounds pretty darn great!

Cloud Computing: Nearly 50 Years in the Making

Reading Time: 3 minutes


Anyone familiar with the acronym ARPANET? It’s perfectly understandable if you aren’t; even many in the IT world likely wouldn’t have a clue.

So let’s introduce ARPANET and its significance to one of the more prominent technical developments in computing these days.

ARPANET is the network that became the basis for the Internet, and began with the interconnection of four university computers in the late 60s. ARPANET sent information in small units called packets that could be routed on different paths before being reconstructed at their destination. The development of the TCP/IP protocols in the 1970s made it possible to expand the size of the network in a much more orderly way.

The initial purpose was to communicate with and share computer resources among users at the four connected institutions, and with that connection the principle behind cloud computing was born.

Today, we often hear the terms “it’s in the Cloud” or “that’s Cloud-based”. But what does that mean exactly (and, as some might ask, why is Cloud capitalized)?

To put it simply, data existing in a cloud means that it exists on multiple computers at once. Before understanding cloud technology you’ll need to have a basic grasp of server technology. Servers are powerful computers, similar to the technology in a desktop computer, just much more dynamic. When you take all the CPU power and memory of a server and make it a virtual server, the advantage becomes that it isn’t tied to any one physical machine, and can be moved from one host to another as needed and automatically.

This is the “Cloud”. The virtual server that’s created can move from one host to another as needed and be given more resources or drive space by drawing on dedicated storage servers, known as SANs (Storage Area Networks).

Trends in Cloud Technology

So despite being some 48 years in the making, it’s only in the last 15 years or so that Cloud computing has established itself. Even that number may surprise some, who would think it’d be much less given the ‘newness’ of the technology to the public. In the early 2000s, clouds were in use, but they were more privatized. A single company might have a data centre containing clusters of physical and/or virtual servers where they hosted websites, email, or in-house applications for clients.

It was 2007 when we began to see the advent of public cloud storage services such as Dropbox and Google Drive. Users were now able to house their personal files in a secure space. Basically, these spaces are folders that exist within a company’s data centres. This data is hosted in clustered environments so that users can enjoy guaranteed uptime and recovery. Existing in these data centres’ virtual spaces also allows them to offer their services for very little cost, or free, if the space is small enough.

Further, we now have cloud-enabled services such as Google Apps Marketplace and SaaS (Software as a Service) products like Microsoft Office 365. These new technologies allow the cloud to be not just a place where you can store data, but one where you can use entire applications and services – word processing, spreadsheets, CRM programs, creative suites – you name it.

Cloud Computing’s Future

As this technological entity is still evolving, we continue to see new developments but none is as significant as the trend towards mobile device optimization. Clouds may merge more aggressively and we may also see the forced de-privatization of clouds as larger companies force small and medium-sized businesses to bring their data into these more expansive environments in an effort to reach prospective customers more effectively.

Choose a web hosting provider that has their finger on the pulse of digital marketing and web hosting trends.

Google vs Bing

Reading Time: 3 minutes

There’s no debating that Google is the world’s #1 search engine, and any website that exists for e-commerce should be tailored to match (at least reasonably) Google’s ever-changing search algorithms. Reliable web hosting gets you solidly set up on the information superhighway, but you still need to compete.

But what about Bing? Some people are quite surprised to learn that Microsoft’s search engine is still a legitimate competitor to Google when it comes to being someone’s go-to search engine. Bing is also holding its own as a destination for Search Ads, and there’s a good number of reasons why some businesses are still considering Bing for at least part of their ad placements.

Currently, Google owns 65% of the market share for US searches, but that’s down from 72% in 2010. Where has that 7 percent gone? You guessed it – Bing has had the most growth over the last year, moving up to 19.7% (powered in part by Microsoft’s long-running search partnership with Yahoo).

So when it comes to cost-effective search ads – what’s the better choice – Google or Bing?


Keep in mind that Internet Explorer is still the default web browser on Windows devices, and Bing comes as its default search engine.

This is a clear reflection that most Bing users are:

  • Less computer savvy, given that they’re either unable or uninterested in upgrading to a more modern and functional web browser
  • Generally over 35 years of age
  • More of the blue-collar employment type as compared to white-collar

It’s not difficult to see how these findings validate Bing as a still-popular search engine. The blue-collar, over-35 working middle class makes up a HUGE part of the purchasing public in both Canada and America, and – you guessed it – because their PC came with Bing as the default search engine, more often than not Bing is their search engine.


Google handles many times more total searches and holds more of the market share in all countries except Russia, China, South Korea, and Japan. Research suggests that Google users are:

  • Generally younger
  • College / University educated
  • More white-collar than blue-collar
  • Much more tech savvy
  • Facebook users
  • Less likely to have children

So if your products and/or services call for a large-scale or worldwide campaign, Google’s your place. But if they’re best targeted to domestic customers, you should still at least consider Bing. This is particularly true if you’re marketing a product or service that – for example – is geared predominantly to older, male buyers. You know, the ones who very likely wouldn’t even consider installing a different web browser.

Google is much more expensive for CPC (cost per click) rates, rising 26% since 2012 and expected to continue to rise. Conversely, Bing’s current CPC rate is nearly $0.75 less – 33.5% less expensive overall – and clearly offers a much more appealing cost-per-lead rate. That’s really something to take into consideration, especially if you meet the aforementioned criteria for a product or service that goes with an older demographic.
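As a rough back-of-the-envelope sketch (using only the figures cited above, which are from the article and not live rates), the absolute and relative gaps imply approximate average CPCs for each platform:

```python
# Back out implied average CPCs from the article's figures: Bing is
# roughly $0.75 cheaper per click, which is said to be about a 33.5%
# discount relative to Google. These are illustrative numbers only.
def implied_cpcs(difference=0.75, discount=0.335):
    """Derive both averages from an absolute gap and a relative gap.

    discount = difference / google_cpc, so google_cpc = difference / discount.
    """
    google_cpc = difference / discount
    bing_cpc = google_cpc - difference
    return round(google_cpc, 2), round(bing_cpc, 2)

google, bing = implied_cpcs()
print(f"Implied average CPC – Google: ${google}, Bing: ${bing}")
# → Implied average CPC – Google: $2.24, Bing: $1.49
```

Of course, real CPCs vary enormously by keyword and industry; the point is simply that the same ad budget stretches noticeably further on Bing under these assumptions.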

Google is still the undisputed king of search engines, but don’t count Bing out entirely. And as always, content is king. Learn how to create smart ads that bring prospective customers to your website.

Here at 4GoodHosting we are always keen to share digital marketing insights with our web hosting clientele. Get out there and get click-throughs!

For SEO Services please visit

Google News with Schema & More Strategic Descriptiveness with ETA Search Ads

Reading Time: 4 minutes


One thing we can count on in this new year of 2017 is that the world of SEO will continue to change at lightning speed. It’s a day-in, day-out reality for us digital marketers to keep up with customer usage and expectations, Google’s never-ending algorithm updates, and any of the other daily seismic shifts in the world of digital marketing. Complacency can easily destroy your rankings, and rankings are oh so important when it comes to thriving – or merely surviving – in modern marketing.

Considering that 93% of online experiences begin with search, prioritizing the latest best practices in digital marketing and optimization will be critical to the success of any website. So this new year is as good a time as any to take a fresh look at the trends and developments we saw throughout the past year and see how they’ll play a part in dictating where we’re heading next. Let’s get to them, shall we?


Using schema markup is increasingly advisable for advertisers, given recent changes to Google and to user trends. Schema lays out your site more clearly for search engines to understand its nature, and of course this helps ensure that it is displayed correctly. Schema can also be particularly helpful when Google decides to display rich answers, as is the case with quick answers or a rich card.

Google has made known its preference for displaying answers that make it easier for users to find what they’re looking for. Rich snippets are displayed for recipes and videos, AMP articles, local businesses, music, reviews, and TV & movies. Although this may change in the future, using schema markup helps ensure that your site is more likely to be identified and served up accordingly.
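To make this concrete, here’s a minimal sketch of what schema markup looks like for a local business, built as JSON-LD (the format Google recommends for structured data). The business details below are invented purely for illustration; the generated snippet is what would be embedded in the page’s HTML:

```python
import json

# Minimal schema.org LocalBusiness markup. All details here are
# hypothetical placeholders, not a real business.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Web Hosting Co.",
    "url": "https://www.example.com",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Vancouver",
        "addressCountry": "CA",
    },
}

# Wrap the JSON-LD in the script tag that goes in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

The same pattern extends to the other rich-snippet types mentioned above (recipes, reviews, and so on) by swapping the `@type` and its properties.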

Then there’s Google’s growing use of Quick Answers, which jumped from just over 22% in December 2014 to over 40% by the beginning of 2016, and has continued to increase incrementally over the past year. Schema improves the effectiveness of your snippet boxes.

Schema is these days tied very closely to RankBrain and artificial intelligence. This is to be expected, with this machine learning system now standing as Google’s third most important ranking factor and pushing brands to make sure their sites are easy for the machine to interpret. Schema can help make this a reality. As artificial intelligence is likely to grow, using schema now can keep your site prepared for whatever we’ll see in the way of advanced AI in the future – which will almost certainly be considerable!

Hybridization and Getting to Know Google ETAs (Expanded Text Ads)

As users become increasingly sophisticated online and the demands of digital marketing have professionals working in digital proximity like never before, it is the brands that mature with the flow of modern marketing and smartly integrate their digital marketing departments that gain the early benefits.

Mobile users clicking on your PPC ads expect a user experience that’s consistent with what they had when they landed on your site organically. The answer is running hybrid campaigns, and professionals need to know what they entail, how to run them, and where best to invest resources.

Ideas to consider as you begin strategizing:

  • Host training sessions where you help members of different teams familiarize themselves with each other’s goals and strategies
  • Create collaborative projects where members of different teams come together for common objectives
  • Develop common documents between the different teams that define roles, expectations and shared understandings on brand tone and voice

Changes to the Layouts of Your Google PPC Ads

Google has been experimenting this past year with their standard text ads. Specifically, they have been increasing the number of characters allowed in some titles and descriptions. This is a big plus for those of you who know how to use that added space to share text that’s optimized for identifying your business and its nature.

More About Google ETAs

Some marketers continue to be challenged when it comes to taking advantage of this trend, because expanded text ads have not been rolled out everywhere, nor has Google announced that they are permanent. For the accounts that do receive the extra space, however, there are great opportunities to include more keywords and more compelling descriptions to help attract people to the website.

To take advantage of the increased counts for Google ETA ads, you should consider:

  • Continuing to use your main keyword at the beginning of your title and meta description in case you are restricted to the original character limits
  • Using the extra space to expand your description, using keywords very selectively
  • Expanding meta descriptions that are less than 100 characters, to avoid having your description buried under the new longer limits

If there’s one umbrella this can all be tucked under, it’s the overarching reality that you must always be reevaluating your digital marketing strategy and tactics. 4GoodHosting has digital marketing advice for all its web hosting clients, and a wide network of digital marketing industry consulting professionals. All you need to do is ask.

For SEO Services please visit