Weighing Cloud Analytics


Taking a critical look at big data gives companies valuable insights that allow them to make better-informed strategic decisions. Examining these insights – and then making smart moves based on your deductions – gives you an edge over your competitors and provides a more complete picture of your business. Undertaking big data analytics isn’t the simplest of processes, though. You need the computing resources in place for it to be done effectively, and here at 4GoodHosting we like to share what we know as a Canadian web hosting provider who understands Internet marketing as well.

So, What Exactly is Cloud Analytics?

Cloud analytics is a cloud-based solution which enables businesses to carry out analyses and related intelligence procedures through integrated cloud models, whether that’s with hosted data warehouses, SaaS business intelligence (BI) or cloud-powered social media analytics. A whole range of analytical tools and techniques are put to work to help companies extract information from massive data and then present it in a way that is easily categorized, readily available via a web browser, and – most importantly – digestible and comprehensible for those who have interests in it.

A Unified Vision For The Business

Many companies face problems when different elements within the organization do not share the same perception of what is going well for the company – and what isn’t. Often, they are all working from their own data sets without a collectively agreed-upon ‘big picture.’

Cloud analytics makes it easier to identify and firmly define what that big picture should be.

One of the crucial advantages of using cloud analytics is its ability to consolidate big data from all sources and communication channels that a company employs. The capacity that cloud computing offers allows everything to feed in in a linear and timely manner: you can gather large-scale data from all your internal apps, devices, social networks and data subscriptions. Needless to say, that would be difficult to do in-house and on a single network.

Using a cloud-based data management platform lets you easily blend data from a range of sources, enabling it to be matched, merged and cleansed – and the resulting, far more accurate collective data gives you a unified vision of your business.
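To make the match-merge-cleanse idea concrete, here’s a small Python sketch. Everything here is illustrative – the source names, fields, and the choice of email as the matching key are our own, not any particular platform’s API:

```python
# Illustrative sketch only: "blending" customer records from two
# hypothetical sources, matching on email, merging fields, and cleansing
# the values so the match works. No real platform's API is shown here.

def cleanse(record):
    # Normalize strings so records from different sources can be matched.
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def blend(*sources):
    # Match records by email and merge their fields into one unified view.
    unified = {}
    for source in sources:
        for record in source:
            record = cleanse(record)
            unified.setdefault(record["email"], {}).update(record)
    return list(unified.values())

crm = [{"email": "Ana@Example.com", "plan": "Pro"}]           # internal app
social = [{"email": " ana@example.com ", "followers": 1200}]  # social data

merged = blend(crm, social)
print(merged)  # one unified record per customer
```

The payoff is exactly the “unified vision” described above: two records that look different on the surface collapse into one accurate customer view.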

Increased Ease of Accessibility

The key to ensuring everyone sees – and comprehends – this unified vision lies in the ease of accessibility that cloud-based data management platforms provide. Compared to in-house applications, which companies tend to be slow in adopting, cloud-based apps are much easier to use and can often be self-taught, reducing the need for staff training by leaning on their intuitive design.

Further, employees don’t need to create one-off reports or log into separate systems to undertake analytics, which is a big reason the technology tends to be embraced more quickly throughout the company. That rapid uptake makes cloud analytics more accessible to everyone, and the fact that everyone quickly grasps and embraces it makes it an even more intelligent and productive choice.

Improved Collaboration

Many companies struggle to build a system that allows team members to collaborate effectively. A mix of in-house and external systems can be less than conducive to developing analytical models and sharing the results. As a result, development lacks pace, work is completed redundantly, and a good many people never get to contribute. This is particularly true for telecommuting team members.

It’s entirely different with cloud-based data analytics. Teams can work together to curate data, create analytics designs and evaluate outcomes, no matter where they’re based. The fact that each member has access to real-time insights is of particular significance here. It’s a real benefit for operational teams who need those insights to make critical decisions in the interest of the success of the business.


Enhanced Security

Cloud service providers take security very seriously. In fact, most public cloud providers have better security mechanisms in place and are much better at systemic security services than company-managed, in-house systems.

In-house systems generally use a mix of older technologies and legacy apps that have more vulnerabilities than the state-of-the-art systems found in cloud data centres. Cloud systems also have less complex architecture, making them easier to monitor and defend.

The cloud also has an inherent ability to help companies meet recovery time objectives and recovery point objectives should a data disaster occur. Giant backup storage capacity and huge numbers of redundant failover servers make this a cinch for the cloud, whereas for an independent operation it would be a staggering expense.

Here at 4GoodHosting, we are one of Canada’s premier web hosting providers and have our thumb on the pulse of everything related to web hosting, including the newest technological advances that allow you to optimize your Internet marketing efforts and position yourself with maximum efficiency in the business world.

Regulating, Optimizing, and Identifying with Intel’s Smartly Designed Data Center Manager


Few will be better informed than a Canadian web hosting provider about the ways today’s data centers can really ramp up operating costs. There’s a whole host of reasons for that, but nearly every one of them is contained in those centers’ digital architecture, and their sensors and instrumentation specifically. Making correct analyses of where inefficient operation is occurring is often beyond the means of even the most digitally savvy of us, but certainly not for the smart folks at Intel.

Intel’s Data Center Manager (DCM) helps data center operators lower costs and extend infrastructure life spans by automating data collection and presenting insights into ideal operating conditions and configurations. It works by identifying and monitoring as many individual data points as possible, so when there’s a problematic inefficiency, users know exactly where it is.

One of the common issues DCM data reveals is an opportunity to increase the temperature in data centers and thus minimize cooling costs. This shouldn’t come entirely as a surprise, given the ever-increasing workload these data centers face and the way they tend to run hot as a result. There are more data points than ever, and by extracting that data and looking at it from a more objective perspective, you can be confident in choosing to turn up the temperature as a means of lowering your air conditioning costs.

From there, the DCM team can set threshold levels and implement algorithms to try to predict temperatures and to alert datacenter operators of potential problems.
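As a rough illustration of what that threshold-and-prediction step looks like, here’s a hypothetical Python sketch. The moving-average “prediction”, the window size, and the 26.0°C ceiling are our own illustrative choices, not Intel DCM settings:

```python
# Hypothetical sketch of threshold alerting: smooth recent inlet
# temperatures with a moving average and flag readings whose trend
# crosses an operator-set ceiling. Window size and threshold are
# illustrative, not Intel DCM defaults.

from collections import deque

def make_monitor(threshold_c, window=5):
    readings = deque(maxlen=window)

    def observe(temp_c):
        readings.append(temp_c)
        predicted = sum(readings) / len(readings)  # naive "prediction"
        return predicted >= threshold_c            # True means alert
    return observe

observe = make_monitor(threshold_c=26.0)
alerts = [observe(t) for t in (24.0, 25.0, 26.5, 28.0, 29.5)]
print(alerts)  # only the last reading pushes the trend over the ceiling
```

The real product obviously does far more, but the pattern – smooth the data, compare against an operator-set threshold, alert on a crossing – is the core of it.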

That’s just one example of how Intel’s DCM is highly effective in helping to manage a data center and keep its costs controlled. Here’s more:

Languages to Communicate Across OEMs

All hardware manufacturers follow the Intelligent Platform Management Interface, or IPMI, specifications to report performance metrics independently of the hardware’s CPU, firmware, or operating system. Each brand customizes their IPMI feed slightly to differentiate their products, and that’s to be expected.

DCM provides a simplified data feed to infrastructure and application performance managers to interpret or to connect with a facilities management interface. The out-of-band solution has its own discovery mechanism to locate network devices and languages, and if a new language surfaces that’s unrecognizable, it’s added to the library. Intel reports that updating and maintaining this library is a priority.
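The “library of languages” idea can be pictured as a simple translation table: each vendor’s dialect gets mapped onto one canonical schema, and anything unrecognized is queued for addition to the library. This Python sketch is our own illustration – the field names and schema are made up, not Intel’s actual format:

```python
# Our own illustration of normalizing vendor-specific IPMI-style field
# names into one canonical schema. Field names here are invented.

CANONICAL = {
    "Inlet Temp": "inlet_temperature_c",
    "inlet_temp": "inlet_temperature_c",
    "Power Reading": "power_watts",
    "pwr": "power_watts",
}

def normalize(feed):
    out, unknown = {}, []
    for name, value in feed.items():
        if name in CANONICAL:
            out[CANONICAL[name]] = value
        else:
            unknown.append(name)  # candidates to add to the library
    return out, unknown

feed = {"Inlet Temp": 24, "Power Reading": 310, "FanZ": 5200}
normalized, to_learn = normalize(feed)
print(normalized, to_learn)
```

The unknown-name list is the interesting part: it’s why maintaining and updating the library is the ongoing priority Intel describes.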

Virtual Gateway with Remote Diagnostics and Troubleshooting

On the heels of DCM’s success, Intel asked the development team whether they could access any other useful information. By running a remote session, they found they could access logs and BIOS data to monitor system health metrics. DCM’s companion product is called Virtual Gateway, and it features a set of APIs that let data center operators tap into those resources with a keyboard-video-mouse (KVM) interface. Intel’s logic here is the understanding that not many data center operators will want to add more hardware unless it’s absolutely necessary, and Virtual Gateway allows them to avoid that scenario.

Lastly, it’s good to know that all data center hardware built after 2007 will have at least some degree of compatibility with Intel’s Data Center Manager, and that includes many already-installed / long-serving components that are not made by Intel.

No matter what business you’re in, you want to keep operating costs reasonable, and for those of us in the web hosting business this is an extremely valuable tool that allows us to pass the benefits of efficient data center operation along to customers in the form of lower service rates. Here at 4GoodHosting, we’re always on the prowl for any such resource that allows us to do what we do even better day in and day out, and provide you with the best web hosting services at the best prices!

What Exactly? The Ins and Outs of Subdomains


www.merriam-webster.com is pretty much the go-to online dictionary of choice these days, and it offers a full 10 different categorical definitions for the word domain, but most people will understand it to mean ‘space that’s yours.’ The prefix ‘sub-‘ generally indicates a state of being beneath or under the root it’s attached to, whether literally or figuratively. Here at 4GoodHosting, we’re web hosting experts in Canada but we’re the furthest thing from that when it comes to dictionaries.

Just this once though we’ll take a dissection approach to the word subdomain, and talk briefly about how it relates to web hosting as a whole. If we’re taking the literal meaning of it based on what’s explained above, it means ‘under space’ and while that’s vague and indeterminate it’s still fairly applicable.

A subdomain is the part of the website address that comes before the domain name. More in keeping with our definition, though, subdomains are known as ‘third level’ domains, and as such they sit ‘beneath’ a website’s standard URL that’s registered, recognized, and functional within web directories. To put it more simply, a website’s URL will begin with the very recognizable http: (hypertext transfer protocol) – but you’re probably more familiar with what follows it, the subdomain – www.______.com / .ca etc.

Let’s use Merriam-Webster again as our example here. Their URL is https://www.merriam-webster.com/ but you’ll know them and link to them via the address shown on the first line of this blog post – and the ‘www’ at the front of it is a subdomain!

So why subdomains?

Subdomains are commonly used to categorize portions of the website, and they can be easily moved to another server if the category gets very popular.

Subdomains are also used by free web hosting providers to resell web space under their own domain name (e.g. http://membername.hostname.com). Each member will have their own subdomain, but every one of them will still share the domain name of the hosting provider.

Subdomain names are also practical for balancing the web servers of a high-traffic website. Multiple web servers are assigned different subdomains like www.sitename.com, www1.sitename.com, www2.sitename.com etc., though each of them contains the same application code. When a request comes from the browser, the load balancing software redirects it to one of these servers. DNS load balancing is a simple method of load balancing that uses subdomains pointing to different IP addresses.
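The rotation at the heart of DNS load balancing can be sketched in a few lines of Python. The hostnames come from the example above; the IP addresses are illustrative placeholders from the 203.0.113.0/24 documentation range:

```python
# Round-robin across the subdomain servers described above. IPs are
# placeholders from the documentation address range 203.0.113.0/24.

import itertools

SUBDOMAIN_IPS = {
    "www.sitename.com":  "203.0.113.10",
    "www1.sitename.com": "203.0.113.11",
    "www2.sitename.com": "203.0.113.12",
}

rotation = itertools.cycle(SUBDOMAIN_IPS.values())

def resolve_next():
    # Return the server IP the next incoming request should be sent to.
    return next(rotation)

ips = [resolve_next() for _ in range(4)]
print(ips)  # the fourth request cycles back to the first server
```

Because every server runs the same application code, it makes no difference to the visitor which one answers.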

Webmaster Considerations

First and foremost, subdomains make URLs shorter and easier on the eyes. They allow website owners to categorize the content of the website, and – most importantly – facilitate better search engine rankings, as most engines treat the subdomain as a separate website address.

However, there are certain things to consider before setting up subdomains for your website.

One of particular note involves cookies. A cookie set from a subdomain cannot be read from the main domain, and vice versa, because of the security feature tying a cookie to the domain that set it. This is also true for session cookies. If the user is logged in on the main site and then moves to a subdomain, the subdomain site will not be able to access the same session cookie and will assign a new session – which then forces the user to log in again. Implementing URL rewrites instead of session cookies is the common solution here, but it’s something a webmaster needs to be aware of.
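One common alternative to URL rewrites is to scope the session cookie to the parent domain, so every subdomain receives it. Here’s a sketch using Python’s standard http.cookies module; example.com is a placeholder domain:

```python
# Scoping a cookie to the parent domain (note the leading dot) makes it
# valid for all subdomains. Sketched with the standard library;
# example.com is a placeholder.

from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["domain"] = ".example.com"  # valid for all subdomains
cookie["session_id"]["path"] = "/"

# This is the Set-Cookie header a server would send; without the Domain
# attribute the cookie is locked to the exact host that set it.
header = cookie["session_id"].OutputString()
print(header)
```

Whether this or URL rewrites is the better fit depends on the site, but it’s worth knowing both options exist.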

In addition, your website stats will often not include the statistics of the subdomains and you’ll have to set up separate statistics for your subdomains. Subdomains allow a website to be broken down into smaller pieces without losing the brand image associated with the domain name. The subdomains can be hosted on separate servers in order to reduce the burden on the main domain hosting server.

Many web hosting providers do not provide subdomains in their hosting packages and / or charge extra for subdomain setup and maintenance. It’s also common that if you have a subdomain and want to move your site, you have to choose a provider which supports subdomains.

So if you’re looking for Canadian web hosting with subdomains included, we’ve got that for you with our Business and Advanced web hosting packages, competitively priced and featuring all the rock-solid reliability that 4GoodHosting is known for.

The Importance of Link Building for SEO


We’re all familiar with what it means to ‘vouch’ for someone or something. That is, to say that you can attest to the quality or reliability of that individual or product. Many of those competing in the online world may not know the value of link building, however, but your Canadian web hosting provider will be explicitly aware of that value. Google, of course, is the world’s premier search engine, and it uses links to measure the authority of web pages. If a web page attracts links from other pages, Google’s algorithm – the software it uses to rank pages and decide which page is number one for a search result – increases the importance of that page, and thus how high it places in search engine results.

So in a sense it’s just like a vouch for that page – you feel there’s value there, and you’re recommending visitors might want to consider visiting it via your page to gain more in the way of quality information on the subject.

So let’s now take a look at some of the more common link building techniques, and ones that will be much easier to incorporate into your webpages, blog, or other digital communications.

Forums – Link building techniques using forums typically involve creating a profile page and a signature link. Signature links allow links to be created at the bottom of posts or comments left in a forum. Signature links tend to be regarded as low quality though, and while they may send a measurable amount of traffic they likely won’t have any long lasting impact on the pages you link them to. These links are not merit based, and search engines – Google in particular – are getting better at ignoring these links and assigning them no value.

Link Farms – The link farm technique is a standard approach from SEO companies. A link farm can often have a vast number of websites and web pages within it, all under the control of the individual link builder. Building a link farm link allows them to offer a set number of links for a specific fee, and the extent to which this can be done is pretty much unlimited. Where link farming becomes problematic is when it’s discovered by the search engines. If that occurs – and it does much more frequently these days – the links within them will cease to have any positive impact on the sites they link to.

Directories – These are useful and can create links that are nearly guaranteed to provide some value in the sense of attracting traffic to your website. However, they are a form of spam and, as with other spam techniques, links from directories tend to not be merit based and will almost always carry next to no weight in terms of helping to increase your ranking. Notable exceptions here include the DMOZ directory and Yahoo, among others.

Hubs – These are pages created on services like Squidoo. Hub pages, or lenses as they are also referred to, are a great idea and allow people to create pages about a subject matter they are passionate about. The issue here is in the fact that spammers have also realized it’s an easy way to create pages for more illegitimate link building. As a result, using hubs as part of your link building for better SEO isn’t as highly recommended.

Blogs – Genuine blogs are a great source of landing spots for outbound links with your link building. But be wary of those that are known as SPLOGS, meaning spam blogs. They tend to be a favourite spot for link spammers. The most common technique for this spam approach is to leave a comment and associated link to their client’s website. Long story short, an outbound link of this variety will have little value other than a very small amount of web traffic, unless your comment happens to be on a high traffic popular blog. Most however tend to be deleted by the blog’s webmaster.

Social bookmarks – Once upon a time in the early days of Internet marketing, social bookmarking sites were a source of valuable links. Yet again, link spammers have spoiled that for everyone. Most of the credible bookmarking sites now categorize these links as nofollow links, so it’s best to look past them when weighing which ones might work well for you.

Be in touch with us here at 4GoodHosting to learn more about the most effective ways to improve website SEO, including ways to incorporate link building. We want to get ahead, and we imagine you do too. Let’s talk!

For SEO Services please visit http://4goodhosting.com/seo-services.

Build it Better: Getting the Most Out of Your Website

It’s increasingly rare these days to find a business that doesn’t have their own website. As is the case with every one of them, there’s a whole lot of work that goes into a website – planning, designing, and the actual construction of the different HTML elements that together make up those web pages we’re happy to just visit and browse through as we like.

Be aware that there are choices you can make when putting your website together, however, that can have negative repercussions down the line and quite possibly sabotage the successes you envisioned for yourself when you decided to take your business onto the World Wide Web.

Start with the Right Conceptual Thinking

First off, all of the following are FALSE:

  1. I need a flashy website that looks ‘cool’

Believe us when we tell you that substance trumps flash in a big way when it comes to the entirety of your online presence. Stressing to your web designer that it’s all-important to have a ‘cutting edge’ site creates the wrong perspective for what you want to accomplish here.

Keep in mind that this isn’t a competition about ‘looks’, it’s about effectively conveying to your customers THAT WHICH IS IMPORTANT TO THEM – which, in nearly every instance, is a quality product or service that’s priced reasonably and comes with positive reviews from buyers just like them. What your website looks like will pretty much be the last of their considerations.

  2. It’s always good to save a few bucks by having my friend / relative / other design my website for a discount

For sure, hiring designers or developers with many years of experience and a proven portfolio of work can be expensive. The best professionals can and do charge what they do because – in addition to designing the website – they also often research, plan, and strategize the entire conversion process and site flow as well.

Needless to say, you stand to benefit IMMENSELY from this value-added part of their service, and these types of analytical approaches are very likely going to be beyond the ability of the guy or gal you know.

Hiring a qualified design pro can also mean that your project isn’t completed as quickly as you’d like. Without going on at length, we’ll just suffice to say that – again – any delay that’s related to making your website a more soundly constructed portal THAT WORKS TO SERVE YOUR MARKETING INTERESTS effectively is very much worth it in the long run.

Conversions, sales, followers, in-store visits (if you’ve got a brick n mortar) etc etc. – you want them, and working with a reputable and experienced web designer / developer is the straightest path to getting there.

  3. A simple ‘electronic brochure’ is really all I need from my website

Maybe so, if you’re fine with just staying where you are as far as the size and scope of your business (and its success) goes. In all likelihood though, you’d like to see more out of your business, and in this day and age the BEST way to build your business is to promote it effectively online. And if we were to tell you that there’s not too much involved in going from an electronic brochure to a fully-developed website, why wouldn’t you choose the one that’s going to give you more sales / service opportunities and make your brand all the more visible?

Right then.

Taking the Right Approach

This starts with answering one direct question – what is it that you need your website to do? The answer can vary, but it’s safe enough to assume that anyone who’s in business will have ‘helping us sell more of our product(s)’ at or near the top of their list.

So, with that understood, your website MUST:

  • Attract the right customers

Determine what your customer demographic has for a ‘buyer profile’ – how do they buy, when do they buy, what types of advertising / marketing collateral are they attracted to? If you can’t answer these questions yourself, it’s very wise to hire someone who can analyze these correctly for you.

Once these questions are answered, your designer should be able to start making design choices that are in line with the findings. Your website’s primary focus must be on attracting the right prospective customers to your site, keeping them there, and converting as many of those visits as possible.

  • Be optimized for people and their searches through Google, Bing etc.

Are you familiar with the acronym SEO? You should be, and if not then – again – you REALLY need to hire a web copywriter to put together your site content. SEO is search engine optimization, and having quality SEO for your content means your site will be presented higher up in search results conducted by potential visitors / customers.

  • Be sufficiently easy to use

A frustrated or flustered visitor is never a good thing, and 9 times out of 10 they become a contributor to your site’s bounce rate (also a big negative). Your website must be explicitly user friendly, where content is supported, messages are clearly communicated, calls-to-action are firmly identified, and visitors are guided – subtly – toward a conversion.

  • Help you stand out in the crowd

This one tends to be overlooked sometimes, but it’s important to remember that there are likely hundreds, if not thousands, of websites out there that are similar to yours. You need to distinguish your site from them as best you can, ensuring that your visitors are likely to remember you, respect you, and – down the line – even refer you!

Some of this can be overly conceptual for many folks, so if it’s above and beyond your comfort level then do trust your website to a digital marketing professional. You’ll be glad you did.

Determining the Best Website Strategy

Every page of every business website has some type of conversion objective, even if that objective is simply to redirect visitors to the page where your primary conversion objective is accessible.

This last part is important to keep in mind, because it’s a fact that not every visitor to your website will enter it via the landing page. A clear, comprehensive website strategy outlines the paths to conversion from multiple entry points on the website. How visitors move from their entry points, and what type of information they request on their way to a) buying or b) bouncing, is very relevant to take into consideration.

So, we’ll now introduce the 4 types of content found on effective websites:

‘Know’ content – lets them learn about you, your brand, and the products and services that you believe will be of value to them. Always be sure to show yourself to be an authority with your subject matter.

‘Like’ content – helps visitors see who you are, where you stand on certain issues related to your products / services, and what you have to say. It can offer quick looks into your personal life, your personality, and your opinions

‘Trust’ content – to attach credibility, reliability, and trust to you in the eyes of your visitors. Shows you to be an expert with a history of proven results, and conveys an understanding that there’s no risk to choosing you as a retailer or provider.

‘Conversion’ content – utilized to persuade visitors to take specific actions and convert new visitors into clients, customers, subscribers, or any other role in the promotion of your business online

Let’s Get Going

Now you’ll have a broader conceptual approach. Remember – don’t rush headlong into anything; think things through and get second opinions along with revisiting your own.

When you understand your website’s function, eliminate distractions, establish multiple different paths to conversion, and apply the 4-type content process highlighted above, you’ll be on the right path for sure. Last but not least, keep in mind that all websites need web hosting, and quite often a web hosting service provider in Canada will offer Internet marketing services that include optimizing your website. 4GoodHosting certainly does, and you’re encouraged to learn more at www.4goodhosting.com.

Defining DNS…. And What’s Exactly In It For Hackers?

DNS isn’t exactly a buzzword in discussions among web hosting providers or those in the web hosting industry, but it’s darn close to it. DNS is an acronym for Domain Name System, and what DNS does is see to it that after entering a website URL into your browser you end up in the right spot – among the millions upon millions of them – on the World Wide Web.


When you enter this URL, your browser starts trying to figure out where that website is by pinging a series of servers. These could be resolving name servers, authoritative name servers, or domain registrars, among others. But those servers themselves – often located all around the world – are only fulfilling an individual part in the overall process.

The process itself is a verification of identities by means of converting URLs into identifiable IP addresses, which networks use to communicate with each other and by which your browser confirms that it’s taking you down the right path. In a world with literally billions of paths, that’s a more impressive feat than you might think, especially when you consider it’s done in mere seconds and with impressive consistency.
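You can watch that URL-to-IP conversion happen from Python’s standard library; socket.getaddrinfo consults the same DNS resolution chain a browser does. We use “localhost” so the sketch runs even without network access:

```python
# Resolving a hostname to its IP address(es) via the OS resolver, the
# same machinery the browser relies on. "localhost" works offline.

import socket

def resolve(hostname):
    # Return the unique IPv4 addresses a hostname resolves to.
    infos = socket.getaddrinfo(hostname, 80, socket.AF_INET)
    return sorted({info[4][0] for info in infos})

print(resolve("localhost"))  # typically ['127.0.0.1']
```

Swap in any real hostname and you’ll see the address your browser would be handed before it ever fetches a page.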

It’s quite common to hear of DNS in conjunction with DDoS, which is another acronym – this one for Distributed Denial of Service – that’s usually paired with the term ‘attack.’ What DDoS is, and how it’s so often related to DNS, is as follows:

A DDoS attack is a common hack in which multiple compromised computers are used to attack a single system by overloading it with server requests. In a DDoS attack, hackers will often use infected computers to create a flood of traffic originating from many different sources, potentially thousands or even hundreds of thousands. By using all of the infected computers, a hacker can effectively circumvent any blocks that might be put on a single IP address. It also makes it harder to distinguish a legitimate request from one coming from an attacker.

The DNS is compromised in that browsers essentially can’t figure out where to go to find the information to load on the screen. This type of attack typically involves hackers creating a little army of private computers infected with malicious software, known as a botnet. The people participating in the attack often don’t realize their computer has been compromised, and that it’s now part of the growing problem.
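The simplest flavour of defence this implies – noticing when one source sends too many requests too quickly – can be sketched as a sliding-window rate limiter. This is our own illustrative Python sketch; real DDoS mitigation is far more involved, and the limit, window, and IP below are arbitrary:

```python
# Per-source rate limiting: count requests per IP inside a sliding time
# window and refuse any source that exceeds the limit. Illustrative
# values only; real mitigation operates at far larger scale.

from collections import defaultdict, deque

class RateLimiter:
    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent hits

    def allow(self, ip, now):
        q = self.hits[ip]
        while q and now - q[0] > self.window:
            q.popleft()             # drop hits outside the window
        q.append(now)
        return len(q) <= self.max_requests

limiter = RateLimiter(max_requests=3, window_seconds=1.0)
results = [limiter.allow("198.51.100.7", t) for t in (0.0, 0.1, 0.2, 0.3)]
print(results)  # the fourth request inside one second is refused
```

The catch, as the paragraph above explains, is that a botnet spreads the flood across thousands of IPs, which is exactly why per-IP limits alone aren’t enough.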

Why Go To So Much Trouble?

With all of this now understood, it begs the question – What’s in it for hackers to do this?


It’s believed that the initial appeal of hacking is in proving that you can penetrate something / somewhere that’s purported to be impenetrable, and where someone with a skill set similar to yours has gone to significant effort to make it that way. It’s very much a geeks’ chest thumping competition – my virtual handiwork is better than yours!

As hackers become established and the ‘novelty’ of hacking wears off however, these individuals often find new inspiration for their malicious craft. The more time they spend doing it, the sooner they realize that a certain level of skills can introduce them to opportunities for making money with hacking. Among other scenarios, this can be either by stealing credit card details and using them to buy virtual goods, or by getting paid to create malware that others will pay for. And that happens much more often than you might think.

Their creations may silently take over a computer, or subvert a web browser so it goes to a particular site for which they get paid, or lace a website with commercial spam. As the opportunities in the digital world increase, hacking opportunities increase right along with them, and that’s the way it will continue to be.

Here at 4GoodHosting, we are constantly reevaluating the security measures we have in place to defend our clients’ websites from DDoS attacks, as well as keeping on top of industry trends and products that help us keep hackers and their nefarious handiwork away from you and your website. It’s a priority for sure.

Data’s Really Moving: Superior Fibre Optic Technology is Here

In all this talk about servers, databases, cloud storage etc., fibre optics have been a bit of a lesser light despite being the unsung hero of the successes we’ve seen in supercharging the digital world. Now some of you will know what a Terabyte is, and others won’t. Most of you will be familiar with a Gigabyte though, and a Terabyte is 1,024 gigabytes.

That’s a lot of capacity, and with that understood, fibre optic cables these days can achieve transfer speeds up to 255Tbps. Simply put, that’s blazing fast, and as a Canadian web hosting provider we don’t overlook the value those transfer rates have in letting us work with our clients to make their websites – and, in the bigger picture, their online marketing efforts – really pay off.
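To put 255Tbps in perspective, a few lines of arithmetic (using the article’s binary definition of a terabyte, 1,024 GB) show how quickly such a link moves a full terabyte:

```python
# Back-of-the-envelope: time to move one terabyte at 255 Tbps.
# 1 TB here = 1,024 GB = 2**40 bytes, matching the article.

TERABYTE_BITS = 2**40 * 8        # one terabyte expressed in bits
LINK_BPS = 255 * 10**12          # 255 terabits per second

seconds = TERABYTE_BITS / LINK_BPS
print(f"{seconds * 1000:.1f} ms per terabyte")  # roughly 34.5 ms
```

Around a thirtieth of a second for a terabyte – “blazing fast” is no exaggeration.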

What’s different nowadays is that it’s not just one single-core fibre. Fibre optic cables now feature a multi-core glass stem that allows for an enormous load of data to flow through it. While this technology is still a long way off from being used extensively throughout cities, it’s setting the tone for the newer upgrades poised to hit the fibre optics market, and they’re not far off.

Information Super Expressway

Many cable providers plan to upgrade their lines, with some of the major cities in the world soon to see cables capable of handling 400Gbps. That’s a huge upgrade from the current standard of 10Gbps that many companies use. The newer fibre optic cables also include a host of upgrades and capabilities that really set them apart from their predecessors. Here’s a quick reference of the improvements:

Security – The integrity of an optic cable actually plays a role in how easily hackers can force their way into data stores. Fibre optic cables transmit data as light, which makes them very difficult to tap without detection. About the only way to break into them is to physically cut the cord, and when that happens the light escapes, making it easy for network security to notice the breach.

Design and Speed – The newer materials are lighter and thinner too. They can be more easily wrapped in a protective coating and hung from poles. In North America the majority of wires are still suspended from poles rather than buried in the ground, so weight is a major issue. The reduced weight also makes it practical to bundle more fibre strands into a single cable, which means more data transfer.

Reliability – Since all of the data is transferred through a core of glass, it’s by and large entirely insulated from any threat that might interrupt or shut down transfer. There’s no fear of electromagnetic interference or radio-frequency interference, otherwise known as EMI and RFI. There’s also very little risk of impedance or crosstalk. Although it’s still susceptible to temperature changes, it’s more reliable when running through water or near industrial equipment.

New fibre optic technology continues to expand in leaps and bounds, and it’s not a moment too soon with the way demand for internet usage continues to skyrocket in North America. The aim of course is to reach the point where the world enjoys nearly instantaneous data transmission, and with us being as much of an eager beaver as can be when it comes to web hosting and online communications – that sounds pretty darn great!

Cloud Computing: Nearly 50 Years in the Making


Anyone familiar with the acronym ARPANET? Perfectly understandable if you aren’t; even many in the IT world likely wouldn’t have a clue.

So let’s introduce ARPANET and its significance to one of the more predominant technical developments in computing these days.

ARPANET is the network that became the basis for the Internet, and began with the interconnection of four university computers in the late 60s. ARPANET sent information in small units called packets that could be routed on different paths before being reconstructed at their destination. The development of the TCP/IP protocols in the 1970s made it possible to expand the size of the network in a much more orderly way.
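The packet principle ARPANET pioneered is easy to sketch: the sender splits a message into numbered packets, the network may deliver them out of order along different paths, and the receiver reassembles them by sequence number. Here’s a minimal illustration of that idea (the helper names are ours, not actual TCP/IP code):

```python
import random

def to_packets(message, size=4):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the original message regardless of arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("ARPANET sent data in small packets")
random.shuffle(packets)      # packets may arrive via different routes
print(reassemble(packets))   # -> "ARPANET sent data in small packets"
```

The sort on sequence numbers is what lets the destination tolerate any delivery order, which is exactly why packet switching scales so well.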

The initial purpose was to communicate with and share computer resources among users at the four connected institutions, and with that connection the application principle behind cloud computing was born.

Today, we often hear the term “It’s in the Cloud” or “that’s Cloud based”. But what does that mean exactly (and as some might ask, ‘why is Cloud capitalized?’)

To put it simply, data existing in a cloud means that it exists on multiple computers at once. Before understanding cloud technology you’ll need a basic grasp of server technology. Servers are powerful computers, similar in design to a desktop computer, just much more powerful and dynamic. When you take all the CPU power and memory of a server and make it a virtual server, the advantage is that it doesn’t physically exist in any one place, and can be moved from one host to another as needed, and automatically.

This is the “Cloud”. The virtual server that is created can move from one host to another as needed and be given more resources or drive space by drawing on storage servers, known as SANs (Storage Area Networks).

Trends in Cloud Technology

So despite being 48 years in the making, it’s only in the last 15 years or so that Cloud computing has established itself. Even that number may surprise some who would think it’d be much less given the ‘newness’ of the technology to the public. In the early 2000s, clouds were used, but they were more privatized. A single company might have a datacenter containing clusters of physical and/or virtual servers where it hosted websites, email, or in-house applications for clients.

It was 2007 when we began to see the advent of public cloud storage services such as Dropbox and Google Drive. Users were now able to house their personal files in a secure space. Basically, these spaces are folders that exist within a company’s data centres. The data is hosted in clustered environments so that users can enjoy guaranteed uptime and recovery. Existing in these data centres’ virtual spaces also allows the providers to offer their services for very little cost, or free, if the space is small enough.

Further, we now have cloud-enabled services such as the Google Apps Marketplace and SaaS (Software as a Service) products like Microsoft Office 365. These new technologies allow the cloud to be not just a place where you can store data, but one where you can use entire applications and services such as word processing, spreadsheets, CRM programs, creative suites – you name it.

Cloud Computing’s Future

As this technological entity is still evolving, we continue to see new developments but none is as significant as the trend towards mobile device optimization. Clouds may merge more aggressively and we may also see the forced de-privatization of clouds as larger companies force small and medium-sized businesses to bring their data into these more expansive environments in an effort to reach prospective customers more effectively.

Choose 4goodhosting.com and have a web hosting provider that has its thumb on the pulse of digital marketing and web hosting trends.

Google vs Bing

There’s no debating that Google is the world’s #1 search engine, and any website that exists for e-commerce should be tailored to match (at least reasonably) Google’s ever-changing search algorithms. Reliable web hosting is one thing and gets you solidly set up on the information superhighway, but you still need to compete.

But what about Bing? Some people are quite surprised to learn that Microsoft’s search engine is still a legitimate competitor for Google when it comes to being someone’s go-to searcher. Bing is also holding its own when it comes to being a destination for Search Ads, and it seems there’s a good number of reasons why some businesses are still considering Bing for at least part of their ad placements.

Currently, Google owns 65% of market share for US searches, but that’s down from 72% in 2010. Where’s that 7 percent gone? You guessed it – Bing has had the most growth over the last year, moving up to 19.7% (in part continuing to be powered by Microsoft’s search partnership with Yahoo years ago).

So when it comes to cost-effective search ads – what’s the better choice – Google or Bing?


Keep in mind that Internet Explorer is still the default web browser on Windows devices, and Bing comes as its default search engine.

Research suggests that most Bing users are:

  • Less computer savvy, given that they’re either unable or uninterested in switching to a more modern and functional web browser
  • Generally over 35 years of age
  • More of the blue-collar employment type as compared to white-collar

It’s not difficult to see how these findings validate Bing as a still-popular search engine. The blue-collar, over-35 working middle-class crowd makes up a HUGE part of the purchasing public in both Canada and America and – you guessed it – because their PC came with Bing as the default search engine, more often than not Bing is their search engine.


Google has hundreds of thousands more total searches and holds more of the market share in all countries except Russia, China, South Korea, and Japan. Research suggests that Google users are:

  • Generally younger
  • College / University educated
  • More white-collar than blue-collar
  • Much more tech savvy
  • Facebook users
  • Less likely to have children

So if your products and/or services call for a large-scale or worldwide campaign, Google’s your place. But if they’re likely to be best targeted to domestic customers, you should still at least consider Bing. This will be particularly true if you are marketing a product or service that – for example – is geared predominantly to older, male buyers. You know, the ones who very likely wouldn’t even consider installing a different web browser.

Google is much more expensive for CPC (cost-per-click) rates, which have risen 26% since 2012 and are expected to continue rising. Conversely, Bing’s current CPC rate is nearly $0.75 less – 33.5% less expensive overall – and clearly offers a much more appealing cost-per-lead rate. That’s really something to take into consideration, especially if you meet the aforementioned criteria with a product or service that goes with an older demographic.
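To see what that gap means at campaign scale, here’s the arithmetic behind those figures; the per-click prices below are back-calculated from the $0.75 and 33.5% numbers above and are illustrative only:

```python
# Illustrative figures derived from the article: Bing's CPC is roughly
# $0.75 lower than Google's, a saving of about 33.5%.
google_cpc = 0.75 / 0.335      # implied Google CPC, ~$2.24
bing_cpc = google_cpc - 0.75   # implied Bing CPC, ~$1.49

# Cost of buying 1,000 clicks on each platform
clicks = 1000
print(f"Google: ${google_cpc * clicks:,.2f}")
print(f"Bing:   ${bing_cpc * clicks:,.2f}")
print(f"Saving: {1 - bing_cpc / google_cpc:.1%}")  # 33.5%
```

Of course, the cheaper click only wins if Bing’s audience actually converts for your product, which is why the demographic fit discussed above matters as much as the rate itself.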

Google is still the undisputed king of search engines, but don’t count Bing out entirely. And as always, content is king. Learn how to create smart ads that bring prospective customers to your website.

Here at 4goodhosting.ca we are always keen to share digital marketing insights with our web hosting clientele. Get out there and get click-throughs!

For SEO Services please visit 4goodhosting.com

Google News with Schema & More Strategic Descriptiveness with ETA Search Ads


One thing we can count on for this New Year of 2017 is that the world of SEO continues to change at lightning speed. It’s a day-in, day-out reality for us digital marketers to keep up with customer usage and expectations, Google’s never-ending algorithm updates, and the other daily seismic shifts in the world of digital marketing. Complacency will quickly destroy your rankings, and rankings are oh so important when it comes to thriving rather than merely surviving in modern marketing.

Considering that 93% of online experiences begin with search, prioritizing the latest best practices in digital marketing and optimization will be critical to the success of any website. So this new year is as good a time as any to take a fresh look at the trends and developments we saw throughout the past year and see how they’ll play a part in dictating where we’re heading next. Let’s get to them, shall we?


Using schema markup is increasingly advisable for advertisers given changes to Google and user trends. Schema lays out your site more clearly for search engines to understand its nature, and of course this helps ensure that it is displayed correctly. Schema can also be particularly helpful when Google decides to display rich answers, as is the case with quick answers or a rich card.

Google has made known its preference to display answers that make it easier for users to find what they are looking for. Rich snippets are displayed for recipes and videos, AMP articles, local businesses, music, reviews, and TV & movies. Although this may change in the future, using schema helps ensure that your site is more likely to be identified and served up accordingly.

Then there’s Google’s increasing use of Quick Answers, which jumped from just over 22% of queries in December 2014 to over 40% by the beginning of 2016 and has continued to increase incrementally over this past year. Schema improves the odds of your content being chosen for those answer boxes.

Schema is these days tied very closely to RankBrain and artificial intelligence. This is to be expected, with this machine learning system now standing as Google’s third most important ranking factor and pushing brands to make sure their sites are easy for the machine to interpret. Schema can help make this a reality. As artificial intelligence is likely to grow in the future, using schema now can keep your site prepared for whatever we’ll see in the way of advanced AI. Which will almost certainly be considerable!
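As a concrete example, schema.org markup is usually embedded as a JSON-LD block in the page head; here’s a small sketch that generates one for a local business (the business details are made up purely for illustration):

```python
import json

# Hypothetical business details for illustration only
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Web Hosting Co.",
    "url": "https://www.example.com",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Vancouver",
        "addressRegion": "BC",
        "addressCountry": "CA",
    },
}

# Embed in the page as a <script type="application/ld+json"> block
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(business, indent=2)
    + "\n</script>"
)
print(snippet)
```

Because the markup is plain JSON rather than attributes woven through your HTML, it’s easy to generate from a database and easy for Google’s crawler to parse, which is part of why JSON-LD is the format Google recommends for structured data.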

Hybridization and Getting to Know Google ETAs

As users become increasingly sophisticated online and the demands of digital marketing have professionals working in digital proximity like never before, it is the brands that mature with the flow of modern marketing and smartly integrate their digital marketing departments that gain the early benefits.

Mobile users clicking on your PPC ads expect a user experience that’s consistent and continuous with what they had when they landed on your site organically. The answer is running hybrid campaigns, and professionals need to know what they entail, how to run them, and where to best invest resources.

Ideas to consider as you begin strategizing:

  • Host training sessions where you help members of different teams familiarize themselves with each other’s goals and strategies
  • Create collaborative projects where members of different teams come together for common objectives
  • Develop common documents between the different teams that define roles, expectations and shared understandings on brand tone and voice
Changes to the Layouts of Your Google PPC Ads

Google has been experimenting this past year with its standard text ads. Specifically, it has been increasing the number of characters allowed in some of the titles and descriptions. This is a big plus for those of you who know how to use that added space to share text that’s optimized for identifying your business and its nature.

More About Google ETAs

Some marketers continue to be challenged when it comes to taking advantage of this trend because the longer formats have not been rolled out to all advertisers, nor has Google announced that they are permanent. For those who do receive the extra space, however, there are great opportunities to include more keywords and more compelling descriptions to help attract people to the website.

To take advantage of the increased counts for Google ETA ads, you should consider:

  • Continuing to use your main keyword at the beginning of your title and meta description in case you are restricted to the original character limits
  • Using the extra space to expand your description, using keywords very selectively
  • Expanding descriptions that are less than 100 characters, to avoid having yours buried under the new longer limits
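A practical way to apply the advice above is a simple length check against the limits Google announced for expanded text ads at their 2016 launch (two 30-character headlines and an 80-character description); the ad copy below is hypothetical:

```python
# ETA character limits as announced by Google at the 2016 launch
LIMITS = {"headline_1": 30, "headline_2": 30, "description": 80}

def check_eta(ad):
    """Return the list of fields that exceed their ETA character limit."""
    return [field for field, limit in LIMITS.items()
            if len(ad.get(field, "")) > limit]

# Hypothetical ad copy for illustration
ad = {
    "headline_1": "Canadian Web Hosting",
    "headline_2": "Fast, Reliable Servers",
    "description": "Get your site online with hosting built for speed, "
                   "security and support that actually answers.",
}
print(check_eta(ad))  # -> ['description'] (over the 80-character limit)
```

Running a check like this over your whole account before an upload saves the back-and-forth of having individual ads rejected for length.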

If there’s one umbrella this can all be tucked under, it’s the overarching reality that you must always be reevaluating your digital marketing strategy and tactics. 4GoodHosting has digital marketing advice for all its web hosting clients, and has a wide network of digital marketing industry consulting professionals. All you need to do is ask.
