Machine Learning For Improved Data Security

The reality these days is that malware is constantly reinventing itself, and as such the threat to data storage never shrinks to the extent that we’d like it to. Indeed, data breaches have been a major issue for company IT departments for as long as they have been storing data, and there always seems to be a new wrinkle in malware development and distribution to remind us the threat is as present as ever and an inescapable reality.

However, there is a new technology that is genuinely slowing the malware threat in countless industries, and data security stands to benefit from it considerably. Like any Canadian web hosting provider, we’re very attuned to the need for better security for big data, and especially so considering the ever-increasing level of sensitive and personal information being stored in large data centers. We tip our hats to those who have the expertise needed to counter the growth and advances of malware.

The technology we’re talking about is artificial intelligence (AI), and more specifically machine learning (ML) within it. Many insiders claim it will revolutionize the way we go about protecting data. As it is now, companies are dealing with more and more attacks as their networks and the data volumes they handle grow.

Machine Learning From Antivirus Data

One specific area within data security shows especially strong promise for AI. Traditional antivirus (AV) software used the specific signature of a piece of malware to identify it, but that method is not ideal for a number of reasons. By making small changes to their malware to alter the signature slightly, hackers in many cases made it so that the malware could slip past AV software undetected.

Current AI antivirus technology promises a far more sophisticated solution, despite not being AI in the traditional sense. Using machine learning (ML), this technology works by training a program on a large collection of malware data. Eventually it becomes able to recognize the characteristics of potential malware threats in general, and isn’t limited to looking for signatures as the identifier of particular malware.
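To make that concrete, here is a minimal sketch of what ‘training a program on a large collection of malware data’ can look like. It assumes numeric feature vectors have already been extracted from known-good and known-bad files (imported API counts, section entropy, and so on), and it uses scikit-learn’s random forest purely as an illustration; the file names are hypothetical placeholders, and real AV engines are far more elaborate.

```python
# Minimal sketch: signature-free malware detection as classification.
# Assumes feature vectors were already extracted from labeled samples;
# "features.npy" and "labels.npy" are hypothetical placeholder files.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X = np.load("features.npy")  # one row of numeric features per sample
y = np.load("labels.npy")    # 1 = malware, 0 = benign

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

clf = RandomForestClassifier(n_estimators=100)
clf.fit(X_train, y_train)

# The trained model scores unseen samples by their characteristics,
# not by matching a known byte signature.
print("hold-out accuracy:", clf.score(X_test, y_test))
```

The point of the sketch is the shape of the approach: the model learns what malicious files tend to look like, so a brand-new signature alone no longer guarantees a miss.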

This means that provided the programs are kept up to date with new malware samples, so that they are constantly relearning and being challenged, even malware with completely new signatures can be rendered ineffective, and the software doesn’t need to be updated on as strict a schedule.

This is a perfect fit, as there is already a large body of data to train the programs on, and the bulk of new malware is not really ‘new’ – it builds off the foundations of other malicious programs. If your machine learning program has encountered a number of other malware programs with most of the same core functionality, the hacker will need to invest a massive amount of time into creating malware that can disguise itself sufficiently.

Many cybersecurity AI firms will claim this is an all-powerful solution, but that might be too grand a claim. It does provide enough of a deterrent to protect against most typical threats, primarily because hackers aren’t inclined to create a full malware program completely from scratch.

One important point to understand, however, is that without a large enough set of data these programs won’t be able to train themselves as effectively. Currently there is not quite enough data from network attacks to train machine learning programs as reliably as IT security professionals would like. There have been several hopeful attempts to find a suitable dataset, but so far that’s not been accomplished.

AI: The Solution for Human Error

As you might expect, all of these technological advances can be rendered ineffective if human error comes into play. If an authorized person is the one facilitating the breach, even the best security tools won’t be of assistance, and this of course does happen fairly often.

The majority of data breaches are not the result of malware forcing its way through firewalls undetected. Most breaches are the result of a simple mistake, and often it’s negligence or an untimely oversight. And commonly victims will say it’s an unfortunate reflection of the fact that they’re understaffed, underfunded, and undertrained.

Social engineering education is the solution here. Why? Because with it, employees are trained to recognize and defend against common social engineering hacking tactics. When employees are trained, machine learning can be a strong and effective complement to best practices.

One particular tool that carries a big stick in this regard is Onfido (https://onfido.com/gb). It prevents identity fraud by verifying the login with a photo ID, a selfie, and machine learning algorithms. It identifies whether the right person is trying to log in, and then crawls the web for any potential problems with that identity. This technology prevents fraudulent data access even if passwords are compromised.

Monitoring Behavior Patterns

Another machine learning variation is capable of identifying the baseline online behavior for a particular identity, after which any deviation from standard patterns is flagged as a possible threat. It’s not unlike your credit card company calling when someone makes a charge on your card on a different continent, but in the digital landscape instead.
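At its simplest, the idea can be reduced to a statistical baseline plus a deviation test. The sketch below flags any login whose features sit too many standard deviations away from that identity’s history (a z-score test); the features, data, and threshold are illustrative assumptions only, and production systems use far richer models.

```python
# Minimal sketch: flag behavior that deviates from an identity's baseline.
# Each row is one past login: [hour of day, MB transferred] (hypothetical).
import numpy as np

history = np.array([[9, 120], [10, 95], [9, 110], [11, 130]])
mean, std = history.mean(axis=0), history.std(axis=0)

def is_anomalous(event, threshold=3.0):
    # Flag the event if any feature is more than `threshold`
    # standard deviations away from the historical baseline.
    z = np.abs((event - mean) / std)
    return bool((z > threshold).any())

print(is_anomalous(np.array([10, 115])))  # typical login -> False
print(is_anomalous(np.array([3, 900])))   # 3 a.m., huge transfer -> True
```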

It’s very promising for people like us to see how machine learning, and perhaps eventually true AI, can deliver the type of complement that effective data security practices need in response to the new realities of IT security risks. It’s only just beginning to open its lungs and breathe in full, but hopefully once it starts to roar we’ll be able to rest a little bit easier when it comes to knowing data is secure in our data centers.

Be Wary of Backlinks in Disavow Files

The term ‘black hat’ SEO isn’t bandied around as frequently as it used to be, and that’s probably for two reasons. First, developments in search and web development have made it less of the buzzword it once needed to be. And then there’s the fact that engaging in less-than-ethical SEO via shady link building just doesn’t occur like it used to. There are many reasons for that, but Google’s disavow file is a big part of why black hat SEO doesn’t get you very far anymore.

Here at 4GoodHosting, like any Canadian web hosting provider we don’t pretend to be web development experts but we are sufficiently knowledgeable regarding the subject. Sharing anything and everything we know with our customers is part of what we do, with the hope that by enabling those managing their website they’ll get more out of their digital and internet marketing efforts. It’s competitive out there, and the value of strong search engine rankings can’t be overstated.

For those of you who may not be familiar with them, a disavow file is a .txt file containing instructions that tell Google you’d prefer they ignored any number of links pointing to your website. And the reason for doing so? Because those links are having a negative impact on your rankings. Google created it so that webmasters can push back against spammy, unsolicited links, and have Google ignore any that still point to their website after link removal requests have been unsuccessful.

Typically, it’s a last resort option used when certain pesky links aren’t neutralized by standard link removal means. It’s very effective and scads of users have put it to good use over recent years. The best of them contain both root domains and links to individual pages along with comments detailing your removal efforts undertaken for each.
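For illustration, a disavow file is just plain text with one entry per line: a line beginning with domain: disavows every link from a root domain, a bare URL disavows a single page, and lines beginning with # are comments where you can document your removal efforts. The domains and URLs below are hypothetical.

```
# Emailed owner of spamdomain-example.com on 2019-03-01 and 2019-03-15;
# no response, so disavowing the entire domain
domain:spamdomain-example.com
# Removal request for this single page was refused
http://another-example.net/paid-links-page.html
```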

Pay Extra Attention

A recent piece of advice from a Google webmaster trends analyst and industry expert has been warmly received across the industry, and it concerns how some links can actually pair with disavow files to cause your rankings to drop quite noticeably. The long and short of it is that you may wish to adjust the links in your disavow file if your rankings drop after submitting it.

There could be any number of reasons behind your employing a disavow file – maybe you engaged in some shady link building practices, or experienced negative SEO, or simply looked over your backlink profile during an audit and found many low-quality backlinks flagged by link audit tools like CognitiveSEO, SEMrush, or Ahrefs.

These are all good tools for carrying out a link audit, but on occasion there may be some links you’re unsure about, and it’s in these instances that a disavow file can cause your rankings to drop.

What we can learn here is that you have total control over which backlinks appear in your disavow file, and you can add or remove them at any time.

The question is typically whether backlinks are bad but simply haven’t been flagged by Google yet. A primary consideration in determining this is to see if the backlinks are new. If so, they may not have been categorized or flagged by Google as bad backlinks yet, but that does not mean they won’t be in the future.

As a result, it’s not a good idea to spend a huge amount of time chasing the one or two links that look bad but that you wish to remove from your disavow file. It’s better to put that time and energy into creating more high-quality content, but that’s somewhat beside the point here. Another common question is how long it takes for links to count again after being removed from your disavow file.

Google’s algorithms are constantly evolving, and incorporating delays in ranking changes is one of the ways Google deters spammers: they can no longer take advantage of instant trial-and-error testing. When it comes to disavow files, though, the common – and logical – belief is that this gets a little problematic and confusing, because there would be a time delay between removing a link from the disavow file and that link starting to count again.

That’s not quite the case, however. Google has clarified that there is NO artificial delay or time penalty; rather, the lag comes from Google recrawling, re-indexing and reprocessing the affected pages. So while removed links won’t necessarily be reinstated immediately, any wait is a byproduct of reprocessing rather than a deliberate penalty.

Just how long this reprocessing takes is also unclear, but the belief is that it is not something that happens overnight, even if it sometimes appears that way.

The process does take a small amount of time, so don’t expect an immediate recovery and don’t expect your disavow file to be all-powerful in addressing shady links pointing to your website. Do use them, though, and the consensus is that it’s best to let things settle for 3 months after making any significant changes. You can expect some links to be updated faster than others but overall it seems this is the best bet when trying to analyze the effect of any link network changes you make.

There’s much more on this topic discoverable with a quick Google search of ‘best ways to use google disavow files’, and if this is something that’s relevant for you we encourage you to build on the understanding you’ve gained here. Hope it’s been helpful.

How to Fix 502 Bad Gateway Error in WordPress

Website owners will always find it frustrating when they’re forced to deal with site errors when they’re not entirely sure why it is happening. The all-too-common 502 Bad Gateway Error can have various potential causes. When you’re unsure what is causing it, you’re obligated to test multiple solutions until you find the right one.

Here at 4GoodHosting, we were once greenhorns when it came to troubleshooting web development issues, but in the many years since, part of what’s made us a top Canadian web hosting provider is becoming ever more capable of solving our customers’ technical support requests. Enabling them to undertake these ‘repairs’ on their own is beneficial all round, so today we’re going to discuss the 502 error in WordPress and the best methods for remedying the situation. Let’s dig into it.

What’s the 502 Bad Gateway Error?

Browsing a website involves the browser sending many requests to the server hosting the site. In normal situations, when there is nothing amiss, the server sends back the information you asked for and the site loads. However, if a server sitting between you and the site returns an invalid response for those requests – because the connection has timed out or there is some other problem – your browser shows the 502 Bad Gateway error instead.
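Concretely, a 502 means a server acting as a gateway or proxy received a bad reply from the machine behind it, and it passes an error response back to your browser. Stripped down, that response looks something like this (a simplified illustration; the exact headers and page vary by server):

```
HTTP/1.1 502 Bad Gateway
Server: nginx
Content-Type: text/html

<html><body><h1>502 Bad Gateway</h1></body></html>
```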

All of this stems from the fact that the connection between a website and your browser is not a straightforward one. For instance, your browser’s request could be routed through a proxy server before it makes its way to the server hosting the site. When this is the scenario, it can be difficult to pinpoint exactly where the problem is occurring.

Even worse, many websites use more than one server, which makes things more complicated when you have to work out which server’s error is leading to the 502.

Here are the 2 most common causes for the 502 Bad Gateway Error:

  1. Something is wrong with your server

In these cases the server hosting the website has timed out, is currently unavailable, or is not working as it should, and the proxy in front of it receives an invalid response as a result.

  2. One of your themes or plugins includes faulty PHP scripts

Identifying that the issue is located within your own site is generally better news, because it means you’ve got the means of fixing it. If the issue is with the server itself, you need to contact your provider and explain the situation. Most websites cannot afford any long-term downtime, so you really stand to benefit from being able to take action on your end and troubleshoot the problem on your own without delay.

How to Fix the 502 Bad Gateway Error in WordPress

Before you move into troubleshooting mode, you can run a quick test to determine if the server is working as normal. Try running a traceroute (the traceroute command on macOS and Linux, tracert on Windows) to check if the server is reachable. If it is, then the issue is probably exclusively on your end. If the traceroute shows your server is not reachable, you will have to contact your WordPress hosting provider.

Resetting your routers and checking your DNS settings is also worthwhile. Sometimes little fixes like this can make the 502 error disappear, and they’re simple enough steps to try on your own. Failing that, here are the best fixes for re-establishing accessibility for your website.

  1. Clear Your Browser Cache

When your browser relies on its cache rather than loading the latest version of the website from the remote server, it’s possible to see the 502 Bad Gateway error. Fortunately, the fix is easy in these instances: reloading the website a few times should do the trick. However, if the 502 error message is still on your screen and you are on Windows, the CTRL+F5 prompt will make your browser empty its cache and reload the site’s latest version. This method works for nearly all browsers, including Firefox and Chrome.

For OS X things are a little bit different. The prompt is CMD+CTRL+F5 in Safari.

Should this fix fail, you’ll have to manually empty the browser cache. Those using Chrome can do this by going to ‘Settings’ >> ‘Advanced’ and finding the option ‘Clear Browsing Data’.

Next, locate the option ‘Cached Images and Files’ under the ‘Basic’ tab, select it, and then press ‘Clear Data’. Do keep in mind that if the ‘Browsing History’ option is also selected when you press ‘Clear Data’, your browsing history will be deleted along with the cache. If that would be a problem for you, deselect it first.

For Firefox, Internet Explorer, Safari and almost all other browsers, the process is much the same. Try reloading the website again after clearing the browser cache. Usually the 502 error will have been fixed, but if the problem is still there then you need to move on to more heavy-handed fixes.

  2. Temporarily Disable Your CDN

As mentioned, browser requests are sometimes routed through a reverse proxy server, meaning an intermediary sits between your server and your audience’s browsers.

CDN services rely on reverse proxies to route incoming traffic more efficiently. That additional layer sometimes causes trouble when connecting to the website’s origin server, and the 502 Bad Gateway error can stem from this issue as well.

You can determine if your CDN is the cause of the issue by temporarily disabling your CDN. After that, you can test if your site is loading right without the service. The disabling process can be different depending on the CDN services you are applying to your site, but generally it’s not particularly difficult.

After disabling your CDN service, your site will load directly from your origin server, skipping the proxy layer entirely. If the site loads, you’ll know the issue is related to your CDN service. You may have to wait a while before re-enabling the CDN service, and once you do, you will of course want to check that the 502 error has not returned.

If this doesn’t work either, try recommendation number three.

  3. Test Your WordPress Plugins and Themes

Sometimes the issue is directly in the structure and makeup of your WordPress website.

There may be even a tiny error within your WordPress themes or plugins, where one of them is trying to execute a script that the origin server cannot load correctly. This produces an invalid response to the browser’s requests, which leads to the 502 error.

You can only have one WordPress theme active on the site, so it’s simple to deactivate the one you are currently using and switch to the default theme temporarily. Then try to access your site again to determine if the problem is solved. Conversely, you can have multiple plugins active on your site, which means you’ll need to test all of the plugins you are using to find out which one is the culprit.

In instances where you are locked out of the WordPress dashboard because of the error, you must disable these plugins and themes manually. You’re not actually deleting them, though, and reactivating them later shouldn’t take much time.

Make sure you take a backup of your site before trying this.

FileZilla is probably the best choice for doing this. Open your FTP client, log into your site, and navigate to ‘public_html’ >> ‘wp-content’ >> ‘plugins’. Every plugin installed on your site should be locatable there, whether it is active or not.

Right-click on a plugin folder to view the different options that can be applied to it. One of them is ‘Rename’. Choose that option and change the folder name to something like ‘disabled.thepluginname’, making it easily identifiable as a temporary name used only for this purpose.

When you then access your site again, WordPress will treat the renamed plugins as if they were uninstalled, and the site will load without them. If one of these plugins is behind the 502 error, you’ll be able to browse normally now. Also, be sure to use the CTRL+F5 prompt to make sure you’re clearing the browser cache before reloading.

It is recommended to disable your plugins one by one to pinpoint which one is the problem, and remember to restore the plugins you’ve determined are NOT the problem. The same process works for testing your WordPress theme.

Once you find the plugin or theme that’s causing the 502 error, go ahead and delete it and find a replacement. Or, if the plugin or theme is simply outdated, you can try updating it and see whether that fixes the error and improves page loading speed or general performance.

Conclusion

WordPress makes website creation much easier, and troubleshooting it is often not as challenging as you might think – if you know what you’re doing. Some errors have multiple potential causes, but if you can go through a process-of-elimination method and go through these steps here then you’ll find you can do the troubleshooting on your own. Again – always remember to take copies or backups of your website before you make any major changes.

Struts vs Spring as Java Framework

Spring and Struts are two of the most popular frameworks for Java web development, and more than a few of the developers out there will already be familiar with them. If you’ve found the one that works best for you then this week’s blog may be one to pass over, but it should be a good read for those who’d like some insight into which one is best for them. Java doesn’t impose any internal organization on an application, so both Spring and Struts offer a web development framework that lets the user focus on building solid web applications.

There are a handful of different iterations of both. Here at 4GoodHosting, as a leading Canadian web hosting provider we’re very familiar with the value of being in the know regarding frameworks, and considering how ubiquitous Java is for web app developers, we’ll weigh some of the most common differences between Struts and Spring today.

Spring

Spring is a Java web framework. Java relies on objects collaborating and interacting with each other to produce applications, and from these interactions come dependencies that Java itself has no means to organize. Spring’s framework gives these components organization, handling your application’s plumbing to get you up and running quickly.

Spring’s components help you with different elements of your build. Spring MVC handles web applications and replaces the older Struts model with more significant, more functional developments. MVC makes web app building less challenging because your components are separated into three parts: model, view, and controller. The result is that it becomes easier to build and reuse code without too much modification.

Benefits of Spring:

  • Flexible
  • Easily integrated with other programs
  • Code can be tested easily

Spring’s Drawbacks:

  • Complex to learn
  • Less stable than Struts

Struts

Struts is also an open source Java web framework that helps organize the Java components in your app. It follows a front controller pattern and comes with fewer options than Spring, but not significantly fewer. A lot of the difference will come down to your preference.

Struts is an older, legacy framework, and many heritage sites are still built on it. It has integrations with Spring, especially with Struts 2. It remains an accessible framework to learn, but it’s not entirely compatible with modern app development, depending on the scope of the project.

Struts 2 was an outgrowth of the legacy system, and it helps to simplify Struts to make it more compatible with modern web development. It keeps the same architecture as the old system, with refinements and updates to its components. It must be said that it does have a history of security bugs.

Struts Benefits:

  • Simple design
  • Good tag features
  • Multiple view options

Struts Drawbacks:

  • Poor documentation
  • Has frequent compatibility issues

Struts2 vs Spring

A comparison of Struts 2 vs Spring primarily becomes a question of an updated legacy framework versus richer documentation and a thriving community. Struts 2 is more of an enterprise solution, and it really benefits from its elegant workarounds; it streamlines the development cycle very nicely.

In Struts 2, the action is like a controller: a fresh action instance is created every time a request is made, unlike the controllers in Spring’s MVC architecture, which are typically shared across requests.

Spring, however, is really going to be a lot more efficient here. When we compare Struts2 vs Spring, Struts 2 gets high marks for its elegance, but Spring gets right down to the nitty gritty to give you a cleaner result that’s more consistent.

Struts 2 Features:

  • Ajax Support
  • Extensive support for both themes and templates
  • Configurable MVC components
  • Simple and reliable Java Object-based actions

Spring Features

  • Roles through MVC separated neatly
  • Flexibility with scale and highly adaptable
  • Flexible mode provides easy integration process
  • Spring tag library is robust but simple

Spring 2.0 Vs Struts

Spring’s model view controller (MVC) was introduced in response to common issues with Struts. It cleanly delineates the different aspects of your triad – model, view, and controller – making it easier to prototype and test, and it’s not necessary to keep rewriting or modifying code to get the result you’re after. This comparison of Spring 2.0 vs Struts starts with what’s easier.

Struts continues to generate a new action in response to every request. Spring MVC, by contrast, neatly packages that behavior into the controller, where it can be repeated without generating any kind of mess. It’s a neat little package, and popular for that reason.

If stability is your primary aim then you can integrate some aspects of Struts into the Spring framework. However, the Spring architecture does offer more flexibility within your design execution for Spring 2.0 vs Struts.

Struts Features:

  • Stability
  • Explicit control of your design

Spring MVC Features:

  • Clear web development
  • Handles aspects missing from Struts

Choosing Struts Vs Spring

Struts is a legacy system, to the extent that it’s always good to become familiar with how it works. Older systems likely won’t be able to integrate with Spring, which means nearly every developer will encounter something made with Struts in their working life.

Struts still has a pretty dedicated following too, though documentation for it isn’t as vast as it is for Spring. Among its collection of satisfied users are those working within a problematic Java application and enjoying its stricter, more opinionated framework. If you appreciate direction, Struts will make the grade for you. We recommend it if you’re working with legacy programs or with clients unwilling to migrate to something else. Also, for simple applications that won’t have many requests, the structure does feel safer.

Boundaries aren’t always beneficial, however, and so working with Spring provides a much more open framework. It’s less opinionated, so you have the freedom to break from convention more frequently as needed. There’s no shortage of documentation now that the addition of MVC and the veneer of Spring Boot have provided fixes for Spring’s glaring issues. Java can be clunky, but with Spring you really benefit from the organizing framework.

Struts vs Spring: Conclusion

Finding one to be clearly superior to the other is difficult, and most people will feel similarly. We recommend Struts for legacy applications and for its neat, buttoned-up design that will work well for the bulk of developers. Spring will work well with more creative, flexible designs where space is needed to move past convention now and again.

Consider the types of projects you’re going to be working on within Java and choose the one that gives you options to enhance the efficiency and reach of your work. As is always the case, experimentation and being in no real rush to come to a conclusion is the best way to find out which web app development framework is going to be best for you.

Cloudflare Is Changing the Game

In a world where Google, Amazon and Facebook dominate the tech space, Cloudflare has stolen the headlines for the betterment of the internet with its recent announcement. The company announced on its 8th birthday that it would be launching a domain registrar, and it is unlike any we have seen before.

Cloudflare, to the shock of many in the industry, has decided not to charge anything above the wholesale cost that the domain registries themselves charge. That is right; this multi-billion dollar company has chosen to not make a single penny off of your domain registration. In a world where the average Canadian spends between $10-$15 per domain, this is remarkable.

Cloudflare is not a small company; its network handles a meaningful share of all internet traffic through its core business as a content distribution platform and secure infrastructure vendor for millions of clients across the globe. It has also recently announced it is on a path to an IPO and has raised hundreds of millions of dollars in preparation for this. So why do this?

Cloudflare is a unique company in the tech and capital markets, as it is doing two things differently from any other major brand. First, the company does not see the internet as a property you can corner, and instead looks to promote a free, equal and open internet, much like the values of Internet 1.0. Secondly, the company is doing things for the good of the internet, and although this might ultimately fade once the company scales, it is still a refreshing stance from a larger company in the tech space.

This does leave one important question for consumers: what does this mean for the cost and registration of their domain? Well, it is a little up in the air. The Cloudflare system is still being tested and should be live within the month, but it looks to be set up similarly to every other registrar system. If you are up for renewal, it might be time to take a look around and see if you can benefit from using this new system. As well, for those who are operating hosting or other third-party services, the overall cost of getting a website should start to drop if you choose Cloudflare as your registrar.

However, this does still leave some questions. Will other registrars like GoDaddy also drop their prices, or will they continue with the same old pricing going forward? Will Cloudflare offer country-code domains and other TLDs? Finally, will Cloudflare provide an easy-to-use transfer option? These are all tough questions, and we will need to wait and see how Cloudflare’s announcement changes the industry over the next few short weeks.

What are your thoughts? Is this just a bump in the road for the major registry options on the web, or the start of more competitive space for those looking to register domains?

Coming Soon: Quantum Computing

Many people are amazed at just how powerful computers and IT technology have become, and equally blown away by the extent to which they’ve become dominant forces changing nearly every aspect of our lives. There’s the old expression ‘you ain’t seen nothing yet’, and even though the digital revolution has been just that – revolutionary – the expression seems appropriate, as countries, economies, and every aspect of the global community are soon going to be reshaped by quantum computing.

As a leading Canadian web hosting provider, we’re just like everybody else in the IT-related business world in realizing how seismic a shift quantum computing is going to deliver to the world, and experts say the technology could be realized within 10 years from now. When you think about how 2008 doesn’t seem that long ago, that should put it in perspective. Let’s have a look at what exactly quantum computing is, and at how countries are doing their best to be the first to develop and successfully implement it.

Quantum computers are a real handful, even for experts on computing. These machines process information at the elementary particle scale with electrons and photons and the like, and where different laws of physics apply. Conventional computers, on the other hand, process information as a stream of bits, each of which can be either a one or zero in current computing’s binary language.

Quantum bits, known as qubits, can register zero and one simultaneously. In theory, these special properties allow a quantum computer to perform calculations at far higher speeds than current supercomputers. The main value of this would be in the realms of chemistry, materials science and particle physics, and these super powerful machines could make a big difference in areas such as discovering new drugs, optimizing financial portfolios and finding better transportation routes or supply chains. It almost certainly will also advance A.I. – another fast-growing field – since quantum computing could accelerate a computer’s ability to find patterns in large troves of images and other data.
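For the mathematically inclined, there is a standard way of writing this. A qubit’s state is a superposition of both classical values at once (a brief aside; this is textbook notation, not anything specific to one machine):

```latex
% A qubit holds complex amplitudes alpha and beta over both basis states;
% measuring it yields 0 or 1 with probabilities |alpha|^2 and |beta|^2.
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^2 + \lvert \beta \rvert^2 = 1
```

A register of n qubits carries 2^n such amplitudes simultaneously, which is where the theoretical speedups come from.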

Long story short, quantum computers could operate millions of times faster than today’s most advanced supercomputers, analyzing problems in minutes where a conventional computer could take any number of decades or even more than a century to solve. The technology is still in its infancy but it’s very likely that it will have a major impact on A.I., healthcare, transportation, communications, financial services, weather forecasting and much more.

Naturally, this type of power comes with risk. Some have talked of the possibility of a coming ‘cryptocalypse’ in national security, where state secrets, your emails, bank accounts and credit cards are at risk because quantum computers would render traditional internet security programs useless. It’s no surprise, then, that the world’s major powers want to be leading this emerging science.

Race is On

Currently, both the USA and China are spending massive amounts of money trying to be the first to realize working quantum computing. There are many reasons for that, but a principal one is certainly that a quantum computer could, within several decades, be powerful enough to break the codes of today’s best cryptography. The implications for national security interests are obvious.

This isn’t anything of an ‘arms race’ as yet, and experts working in the field say there’s a healthy respect between the two sides, given the exploratory nature of the research so far. The hope in the US is that strong government backing will help attract a broader group of engineers and entrepreneurs to the field, and that it should and likely will be less like the cloister of Manhattan Project physicists developing the first nuclear weapons last century, and more like the collection of tinkerers and programmers who built thriving industries around the personal computer, the internet and smartphone apps.

The implications for healthcare and transportation in particular are potentially huge also, and particularly for the way it could massively reduce inefficiencies and make quicker and more reliable correlations between diseases and causes / cures and allow for transportation advances like ‘smart’ traffic lights and the like.

It promises to be a reshaping of the landscape on the grandest scale, and it’s really all very exciting!

The Deal with Bare Metal Servers

‘Cloud’ is definitely the biggest buzzword in the computing world these days, and while those who are tech savvy will know all about it, even the best of them may not know all of its potential applications. Cloud hosting, for example, is an alternative to having websites hosted on shared hosting or dedicated servers. Oftentimes it’s not easy to determine which type of server is the best fit when choosing from the three main options – shared hosting, dedicated server hosting, and cloud or virtual private server (VPS) hosting.

Shared hosting is by far the most common option for small businesses and individuals, and here at 4GoodHosting we’re like most Canadian web hosting providers in that most of our customers get by just fine with shared hosting plans. They consist of many websites hosted on a single server, and they offer extremely good value for money. A website on shared hosting can handle up to 30,000 visitors per month, and that’s usually no more than most sites will need. Shared hosting also has the advantage of being very simple to set up, making it ideal for the beginner or non-technical user. The packages typically will come with unlimited bandwidth as well.

Dedicated server hosting is quite different, with a single server hosting the website(s) or application(s) of a single user. The dedicated server’s advantage is that the entire server is geared for optimum performance because you have the entirety of it to yourself. Yes, dedicated hosting can be expensive, but that vast amount of processing power means it’s worth the expense if fast page-load times, a dedicated IP, and the ability to handle a lot of traffic – as many as 100,000 visitors per month – are important to you. In addition, dedicated servers are very secure and will allow multiple IPs for services that need to be kept separate.

Moving on to cloud hosting, also known as virtual private server (VPS) hosting, we can say it’s probably the most difficult to describe of the three. To summarize it, it’s like having access to nearly unlimited resources and you access as many or as few of them as you need at any given time. It’s kind of like the best of both worlds; you get the wealth of computing resources that you’d get with a single dedicated server, but for the affordable price of shared hosting. All with real scalability and flexibility included also.

VPS hosting is also good for the more technically inclined out there, because of all the customization you can do if you know what you’re doing.

Bare Metal Servers

There’s a new hosting alternative on the scene now, however, and that’s bare metal server hosting. It’s a relatively recent development that offers a hybrid solution, providing performance and cost-effectiveness by pairing the best parts of dedicated hardware and cloud technology at the same time. They’re not entirely new, and some say they’re just a reinvention of dedicated servers. However, the way they integrate with cloud-based technologies makes them different from dedicated servers by offering increased flexibility and cost control.

Bare metal servers aren’t virtual servers, they’re ‘physical’ ones, and they are ‘single tenant’ – each one belongs to a single customer. No matter how much work it’s running for the customer, or how many simultaneous users it has, a bare metal server is rented exclusively by, and dedicated to, that one customer. Unlike the virtualized servers common in cloud data centres, bare metal servers are never shared between multiple customers.

Significant but short-term processing needs is where a bare metal server shines. Data can be stored, processed or analyzed on a server for as long as needed, and when it’s no longer needed then the server can be wound back down. This means resources aren’t wasted, and you’re not running the server for longer than necessary.

This is quite different from VPS hosting or cloud servers. With those there is a typical cloud server infrastructure, and dozens of virtual machines could be running on the same physical server. Each will have its own processing requirements too. Bare metal servers are single-tenant, so resources are allocated to that one user exclusively, and they can count on guaranteed higher performance.

No Hypervisor = Superior Performance

The hypervisor layer is another term only the most tech savvy of you will be aware of. It’s the virtual machine monitor that creates and runs VMs, and manages the execution of the guest operating systems. Bare metal servers eliminate this layer, which lets them offer higher performance: keeping a hypervisor running drains resources and inevitably leads to some degradation in performance on cloud servers. Bare metal servers have no hypervisor layer because they are dedicated, physical machines.

From a technical perspective, a bare metal server is the same as a dedicated server for all intents and purposes. It offers the same high-performance resources that are dedicated to one user, but has the advantage of flexible, pay-as-you-use billing and you’re not signed to any contracts for your web hosting.

All About the Hybrid

Bare metal servers shine even more when they’re combined with a more traditional cloud infrastructure. Those with an existing cluster of virtual machines hosting their website can link a bare metal server to their VMs and have them work in unison.

High-performance bare metal servers are strongly suited for situations where companies need to perform short-term, data-intensive functions without being slapped with any overhead performance penalties, such as typically will be the case with high-volume data processing. Before them, organizations couldn’t move these workloads into the cloud without being forced to accept lowered performance levels. Bare metal servers have changed that.

A bare metal / cloud hybrid solution may be something your business or organization would like to look into, as it pairs virtualized cloud services with a dedicated server environment that can eliminate the hypervisor overhead without giving up your flexibility, scalability and efficiency.

5 Best Malware Removal Tools

The thing about cyber threats is that as computing technology advances, the scope and capability of malicious software advances too. There’s not much to be done about that, and the two will likely keep pace with each other this way forever. What never changes is that the best defence against malware is to be proactive in keeping your ecosystem free of invaders or infections. That, and being suspicious pretty much any time you have reason to be.

Being on top of security needs like a sentry on watch is pretty much the norm for any Canadian web hosting provider, and here at 4GoodHosting we’re all over ours pretty much all the time. Running data centres puts a whole new scope on defending against malware, but today we’ll discuss what the average individual can do, with a look at our take on the 5 best malware removal tools.

We shouldn’t straight off the hop assume everyone out there’s familiar with what exactly malware is, so let’s give a brief overview of that.

Malware is a condensed term built from ‘malicious software’. What malware does is trick its way into your system – or trick you into allowing it access – and compromise your data. Common types of malware include names you’ve probably heard before: virus, spyware, worm, and trojan. You don’t have to be tech knowledgeable to know when you’ve got one; common symptoms include PC crashes, restarts or freezes, pop-ups and warning messages, and unresponsive systems or similar issues.

In worst-case scenarios the person suffering the malware attack may have their data or system held for ransom, with the attacker demanding money to release or disinfect it. However, in most cases the hackers who go to this kind of trouble will be looking to fry bigger fish than an individual person.

Next, let’s look at common types of malware.

Common Types of Malware

Virus

Viruses appear as executable files that – once permitted to run – corrupt files and damage a system’s core functionality.

Trojan

Trojans are named after the way the Greeks tricked their way into Troy with a subterfuge: the ‘Trojan Horse’ that looked like a harmless gift. A trojan looks and behaves like legitimate software, but it creates loopholes in your system’s security that permit other malware to enter the system.

Spyware

Just as the name suggests, spyware is created to spy on your online activities. Spyware can also gain access to confidential data like credit card PINs, passwords, and more.

Worms

Worms are created to spread between devices, whether across an entire network or locally, and they can cause a whole variety of functional issues and system slowdowns.

Ransomware

We mentioned ransoms earlier, and this type of malware is created to put you in a difficult situation where – the attacker hopes – you’ll agree to pay money to have everything put back to normal. Ransomware can completely lock the system, with the system owner threatened with having all of their data erased.

Adware

Adware is often disguised as advertisements for software, and is used to sabotage your system security and open the system to malware attacks.

Botnet

A botnet is a group of internet-connected devices that are infected and controlled by the same malware, which becomes rooted in all of them at the same time.

Top 5 Best Malware Removal Tools

  1. Spybot – Search & Destroy

Spybot is an anti-adware and anti-spyware tool that’s compatible with Microsoft Windows operating systems, and it is also available in a free trial version. One drawback of Spybot is that it takes a long time to scan drives.

  2. SUPERAntiSpyware

SUPERAntiSpyware is very effective for neutralizing spyware attacks, though it does not offer real-time scanning. It is available in a free version, but to make real use of it you need to buy the licensed version. SUPERAntiSpyware is compatible with Windows operating systems.

  3. Emsisoft Anti-Malware

The Emsisoft anti-malware tool protects your system against ransomware, bots, banking trojans and PUPs. It is equipped with advanced cleaning and restoration capabilities, and it is compatible with Windows operating systems in both free and paid versions.

  4. Combofix

Combofix is freeware designed to target spyware specifically, allowing for manual removal of spyware. It’s compatible with Windows XP, Vista and 7 in both 32-bit and 64-bit versions, but incompatible with Windows 8.1 and 10.

  5. Kaspersky Lab

Kaspersky Lab provides 24/7 real-time scanning of your system, and gets top marks for affordable anti-malware software. It is compatible with both Windows and Mac operating systems, and works to defend against all of the most common malware to ensure it’ll take something very obscure and uncommon to infect your system.

As mentioned, the best defence is being proactive and being suspicious of pretty much any type of 3rd party software offered to you. Having one of these 5 tools at the ready is a good idea if your business needs put you at risk in this regard more than others.

The Dangers of Abandoned Domain Names

Many people will have a domain name they once owned that eventually lost its value and was discarded. Most of those folks won’t have given much thought to it after declining to renew it with their web hosting provider, and 9 times out of 10 it’s true that nothing more will come of it. However, cyber security experts are now letting people know that an abandoned domain name can allow cybercriminals to gain access to email addresses of the company or individual that previously owned it.

Here at 4GoodHosting, we’re not unlike any other Canadian web hosting provider in the way we register domain names for clients across hundreds of different industries. Many of them will have kept that same domain name to this day, but some will have abandoned one or more along the way, whether because they found something better or simply because the domain name wasn’t required anymore.

Here’s what happens when a domain name expires: it goes into a reserved state for a certain time, during which the most recent owner has the ability to reclaim it. If and when that time expires, it becomes available for re-registration to anyone, with no additional cost and no identity or ownership verification. Now, while it’s true that SEO professionals and spam trap operators are good at keeping track of abandoned domain names for various purposes, many of them will not know those names are a potential security risk. So let’s discuss that here today.

Insider Access Information

Look no further for a pressing concern than the fact that the new owner of the domain name can take control of the email addresses of the former owner. The email services can then be configured to receive any number of email correspondences that are sensitive in nature. These accounts can then be used to reset passwords to online services requiring sensitive info like personal details, financial details, client-legal privileged information, and a lot more.

Recently this has been more in the news because of research performed on domain names abandoned by law firms in Australia, cast off as a result of various mergers and acquisitions between companies. These law firms had stored and processed massive amounts of confidential data, and the abandoned domain names still left breadcrumbs that could lead the new owners of those domains to sensitive information.

The possibility of this being VERY problematic should be easy to understand. Email is an essential service in every business, and if a company lost control of its email it could be devastating, especially considering sensitive information and documents are often exchanged over email between clients, colleagues, vendors and service providers due to the simple convenience of doing so.

The study Down Under found that an average of nearly a thousand ‘.au’ domain names (country code TLD for Australia) become expired every day, and we can assume that number is considerably larger here in North America. Further, the list of expiring domain names is typically published in a simple CSV file format and accessible to whoever would like to see it, giving access to anyone who wants to see the domain names that have expired.

Communications stored in the cloud are especially at risk. If the messages aren’t deleted from these cloud platforms, they may remain accessible to the new owner of the domain, and then you have the potential for a leak of sensitive info.

Of further concern is the fact that if that email address has been used to sign up for an account on social media platforms like Facebook, Twitter, or LinkedIn, etc. then the domain’s new owner can reset the passwords and gain access to those accounts.

To avoid this scenario, companies should ensure that the domain name remains registered indefinitely, even if it’s no longer actively used. Any email notifications that may contain confidential information should also be unsubscribed from before a domain is retired.

In addition, disconnecting or closing the accounts that were created using business emails is recommended. Enable two-factor authentication for all the online services that allow it as well, and be sure to do this as soon as possible and leave it in place indefinitely. This is good advice not only for businesses or ventures that make use of multiple domains and have moved on from plenty in the past; it’s good advice for anyone in today’s day and age of cyber threats.
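If you manage a number of domains, it’s also worth keeping an eye on their expiry dates programmatically rather than relying solely on registrar reminder emails. Below is a minimal sketch of one way to do that, using the public RDAP protocol via the rdap.org redirector; the domain name is a placeholder, and a real monitor would add error handling and alerting.

```python
# Minimal sketch: look up a domain's expiration date over RDAP.
# "example.com" is a placeholder; rdap.org redirects the query
# to the registry responsible for the domain's TLD.
import json
import urllib.request

def expiry_date(domain):
    with urllib.request.urlopen(f"https://rdap.org/domain/{domain}") as resp:
        data = json.load(resp)
    # RDAP responses list lifecycle events such as registration,
    # last changed, and expiration, each with an ISO-format date.
    events = {e["eventAction"]: e["eventDate"] for e in data.get("events", [])}
    return events.get("expiration")

print(expiry_date("example.com"))
```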

3 Myths About the Posting of Duplicate Content

Hearing the words ‘duplicate content penalty’ strikes fear in the hearts of most marketers. However, understand that it’s only people with no SEO experience who use this phrase with any frequency. Most have never read Google’s guidelines on duplicate content, and they just somehow conclude that there’s going to be heck to pay if something appears twice online.

Here at 4GoodHosting, part of what makes us a good Canadian web hosting provider is the way we’re frank with our customers about exactly how things are in the world of digital marketing. Is publishing duplicate content advisable? No, it’s certainly not. Is it going to be catastrophic for your visibility online as someone with a real interest in successful digital marketing? Very unlikely, and that goes against what many of you have likely heard.

Let’s bust some duplicate content myths today.

Myth #1: Non-Original Content on a Site Will Mean Lower Rankings Across Your Domain

There has yet to be any evidence that non-original content hurts a site’s ranking, except for in one truly extreme and rare instance. The same day a new website went live, a very lazy PR firm copied the home page text and pasted it into a press release. By putting it on various wire services they immediately created hundreds of versions of the same homepage content plastered all over the web. Google took note, and not in a good way, and the domain was manually blacklisted.

Why was this so much of a problem, when similar instances – albeit on a lesser scale – occur every day? For starters, consider volume: there were hundreds of instances of the same text. Next, timing: all the content appeared at the same time. Finally, context: it was identical homepage copy on a brand new domain.

There’s a lot that will be tolerated, but laziness isn’t going to be. However, this isn’t what people are talking about when they use the phrase ‘duplicate content.’ It takes more than word-for-word copy from one well-known site appearing on another, lesser-known one to make red lights go off at Google.

It’s a fact that many sites – including some of the most popular blogs on the internet – frequently repost articles that first appeared somewhere else. There’s no expectation that this content will rank, but they also know it won’t make their domain less credible.

Myth #2: Scrapers Will Hurt Your Site

Experts familiar with Google Webmaster Tools know that when a scraper site copies a post, any links back to the original site through that copy are effectively discounted. And if you’ve ever seen the analytics for a big blog, you’ll know that some sites get scraped ten times before the clock even reaches 8am. Trackback reports bear this out, and no, big publishers do NOT have a full-time team watching GWT and disavowing links all day. Scrapers and duplicate content are quite simply NOT a priority for them.

Scrapers don’t help or hurt you, primarily because the sites they’re serving aren’t even relevant or visible in the first place, and the scrapers usually take the article verbatim, links and all. Those links pass little or no authority, and the occasional referral visit isn’t going to get those recipients very far 9 times out of 10.

On the very rare occasion that Google does get confused and the copied version of your content is outranking your original, Google will want to know about it. Tell them using the Scraper Report Tool.

Google Authorship is also highly recommended. It’s a way of signing your name to a piece of content, permanently associating you as the author with that content. With Authorship, each piece of content is connected to only one author, and your author profile lists the blogs you’re a ‘contributor to’. No matter how many times it gets scraped, this remains the case.

Keep in mind as well that there is a big difference between scraped content and copyright infringement. Sometimes a company will copy your content (or even your entire site) and claim credit for its creation. Most of you will know what plagiarism means, but for those who don’t, it’s the practice of someone taking your work and passing it off as their own. Scrapers aren’t plagiarizing within the scope of what they do; anyone who signs their name to your work, however, is. It’s a BIG no-no.

Myth #3: Republishing Guest Posts on Your Own Site Will Do Harm

Many contributors are guest bloggers, and it’s unlikely that their usual audience sees all of their guest posts. For this reason it may be tempting to republish these guest posts on one’s own blog. It’s NOT a hard and fast rule, but content on your own site should be strictly original – not for fear of a penalty, but because original content offers value, and that’s good for your web presence in a much more holistic (and rewarding) way.

Some bloggers are actually encouraged to republish their guest posts on their own site after a few weeks go by. Often this is done by adding a specific HTML tag to the post, the canonical link element, which in full looks like this (the URL here is a placeholder for wherever the original version appeared):

<link rel="canonical" href="https://example.com/original-post/" />

Canonical is simply an uncommon word that means ‘official version.’ If you ever republish an article that first appeared elsewhere, using a canonical tag to tell search engines where the original version appeared is wise. Add the tag and republish as you see fit.

If the original was a “how to” post, hold it up to a mirror and write the “how not to” post. Base it on the same concept and research, but use different examples and add more value. This “evil twin” post will be similar, but still original.

Googlebot visits most sites on a daily basis. If it finds a copied version of something a week later on another site, it will identify where the original appeared and move on without creating anything of a fuss. Dinging a domain because unoriginal text was found isn’t nearly the problem for them that others make it out to be.

Fact is, a huge percentage of the internet is duplicate content, and Google is very much aware of it. They’ve been separating originals from copies since 1997, long before the phrase ‘duplicate content’ became a buzzword around 2005.