Protecting a VPN From Data Leaks

One thing that certainly hasn't changed from previous years as we move towards the quarter pole of 2019 is that hackers are keeping IT security teams on their toes as much as ever. That shouldn't come as much of a surprise given the cat and mouse game that's been going on in cyberspace between the two sides for a long time now. Cyber threats are as sophisticated as ever, and for everyday individuals the biggest concern is always that the privacy of sensitive data will be compromised.

One of the most common responses to enhanced and more capable threats is to go with a Virtual Private Network and all the enhanced security features that come with one. Here at 4GoodHosting, we've been promoting them for our customers very actively, likely in the same way every other Canadian web hosting provider has. There's merit to the suggestion, as VPN connections protect online privacy by creating a secure tunnel between the client – who typically uses a personal computing device to connect to the internet – and the Internet.

Nowadays, however, VPN networks aren't as automatic as they once were when it comes to trusting in secure connections and assuming there won't be data leaks. The good news is that even people with the most average levels of digital understanding can be proactive in protecting their VPN from data leaks. Let's look at how that's done here today.

Workings of VPN

A reliable VPN connection disguises the user’s geographical location by giving it a different IP address. There is also architecture in place to encrypt data transmitted during sessions and provide a form of anonymous browsing. As it is with almost all internet tools, however, VPN connections can also face certain vulnerabilities that weaken their reliability. Data leaks are a concern amongst information security researchers who focus on VPN technology, and it’s these issues that are most commonly front and centre among them:

  1. WebRTC Leaks

Web Real-Time Communication (WebRTC) is an evolution of VoIP (Voice over Internet Protocol) for online communications. VoIP is the technology behind popular mobile apps such as Skype and WhatsApp, and it's been the leading force behind making legacy PBX telephone systems at many businesses entirely obsolete.

WebRTC is also extremely valuable with the way that it allows companies to hire the best personnel. Applicants can be directed to a website for online job interviews with no need for Skype or anything similar installed.

Everything would be perfect, except for the fact that the IP addresses of users can be leaked, even through a VPN connection.

  2. DNS Hijacking

It’s fair to say that hijacking domain name system (DNS) servers is one of the most tried-and-true hacking strategies, and interestingly a large portion of that has been made possible by well-intentioned efforts to enact internet censorship. The biggest DNS hijacking operation on the planet is conducted by Chinese telecom regulators through the Great Firewall, put in place with the aim of restricting access to certain websites and internet services.

DNS hijacking encompasses a series of attacks on DNS servers, but arguably the most common one involves taking over a router, server or even an internet connection with the aim of redirecting traffic. By doing so hackers are able to impersonate websites; your intention was to check CBC News, but instead you'll be directed to a page that may resemble it but actually uses code to steal passwords, compromise your identity, or leave you with malware on your device.

Oftentimes WebRTC and DNS hijacking work in conjunction with each other: a malware attack known as DNS Changer can be injected into a system by means of JavaScript execution, followed by a WebRTC call that you're unaware of. Done successfully, it can obtain your IP address.

Other lesser-known vulnerabilities associated with VPN networks involve your public IP address, torrents, and geolocation.

How to Test for Leaks

It might be best to cut right to the chase here – the easiest way to determine if you've got a leak is to visit IPLeak.net, and do it with your VPN turned off. This site is a very handy resource. Once you've run the test, leave the site and turn your VPN back on before repeating it.

Then, you compare results.

The torrents and geolocation tests available are fairly worthwhile themselves, but probably not as telling an indicator as the DNS one. Navigating the internet is done by your device communicating with DNS servers that translate web URLs into numeric IP addresses. In most instances you'll have defaulted to your ISP's servers, and unfortunately those servers tend to be very leaky to begin with.

Leakage through your local servers can serve up your physical location to those with bad intentions, even with a VPN set up and utilized. VPN services route their customers through servers separate from their ISP in an effort to counter these actions.
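If you'd rather script a quick before-and-after check than rely solely on a website, the sketch below does roughly the same thing. It's a minimal example only, assuming Node.js 18+ (for the built-in fetch) and the public echo service api.ipify.org; run it once with the VPN off and once with it on, then compare the output.

```typescript
// leak-check.ts: run once with the VPN off and once with it on, then compare.
import dns from "node:dns";

async function snapshot(): Promise<void> {
  // Public IP as the wider internet sees it
  const res = await fetch("https://api.ipify.org?format=json");
  const { ip } = (await res.json()) as { ip: string };

  // DNS resolvers currently configured on this machine; if these are still
  // your ISP's servers while the VPN is on, DNS queries may be leaking.
  const resolvers = dns.getServers();

  console.log("Public IP  :", ip);
  console.log("DNS servers:", resolvers.join(", "));
}

snapshot().catch((err) => console.error("Leak check failed:", err));
```

If the public IP or the DNS server list doesn't change when the VPN comes on, something is leaking.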

Once you determine your data is leaking, what is there you can do to stop it? Read on.

Preventing Leaks and Choosing the Right VPN

A good suggestion is to disable WebRTC in your browser, and to do so even before installing a VPN solution. Some browser developers have made that the default configuration, while most of the better ones will at least offer it as an option you can toggle.

Search ‘WebRTC’ in the help file of your browser and you may be able to find instructions on how to modify the flags or .config file. Do so with caution, however, and don’t take actions until you’re 100% certain they’re the correct ones or you may risk creating quite a mess for yourself.
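To see for yourself whether WebRTC is still handing out addresses after you've made changes, you can run a quick check from the browser's developer console. This is an illustrative sketch only; the STUN server used here is just an example public endpoint.

```typescript
// webrtc-check.ts: paste the compiled JS into the browser console to see
// whether WebRTC still emits ICE candidates after you've tried to disable it.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // example STUN server
});

pc.onicecandidate = (event) => {
  // Each candidate line can contain a local or public IP address,
  // which is exactly the information a tracking script would harvest.
  if (event.candidate) {
    console.log("ICE candidate:", event.candidate.candidate);
  }
};

// A throwaway data channel is enough to kick off ICE gathering.
pc.createDataChannel("leak-test");
pc.createOffer().then((offer) => pc.setLocalDescription(offer));
```

If WebRTC has been properly disabled, the RTCPeerConnection constructor will fail or no candidates will ever be logged; candidate lines containing your real IP mean the leak is still there.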

Other good preventative measures include:

  • Going with the servers suggested when configuring your VPN – typically not those of your Internet service provider (ISP) but ones maintained by the VPN provider. Not all providers maintain their own, though
  • Aiming for a VPN with upgraded protocols that make it compatible with the new IPv6 addressing system. Without one, you'll have a much greater risk of leaks. If you're about to move to a VPN, this should be one of your primary determinations
  • Making sure your VPN uses the newest version of the OpenVPN protocol, especially if you're on a Windows 10 device (it has a very problematic default setting where the fastest DNS server is chosen automatically; OpenVPN prevents this)

Overall, the security of tunneled connections is going to be compromised big time by a leaky VPN. If the security of your data is a priority for you, then you should be evaluating VPN products, reading their guides and learning about best ways to secure your system against accidental leaks.

Keep in mind as well this isn’t a ‘set it and forget it’ scenario either. You need to check for leakage from time to time to ensure nothing has changed with your system. Last but not least, make sure the VPN you use has a kill-switch feature that will cut off your connection immediately if a data leak is detected.

New Epic Quickly Becoming Browser of Choice for Those Big on Privacy

Things change quickly in the digital world, and what was barely even on the radar can become a front and centre issue overnight in some cases. Go back 10 years and the issue of privacy in web browsing wasn’t something the vast majority of people paid even the slightest bit of attention to. Nowadays, however, it’s definitely a hot-button topic given all the news that’s come out about web browsing histories and the like being tracked, monitored, and then made available to whoever doesn’t mind paying for information about what people like YOU search for online.

Some people don’t have a problem with that. Other people have quite a significant problem with that. If you’re part of the second group there then you may have already switched over to using a web browser like DuckDuckGo or something similar. It’s a fine privacy-promoting web browser in itself, but it’s a bit of a generalist in that it works suitably well across the board but not especially well for any one framework.

And that's where and why Epic coming onto the scene is as noteworthy as it is. It is a Chromium-based browser designed to ensure privacy without giving up anything in speed or functionality. It blocks ads as well as prevents user tracking, and also includes built-in protection against a wide range of surveillance methods, cryptocurrency mining scripts among them.

It promises to be just what the Doctor ordered for those who think these types of overwatch activities are unacceptable, and here at 4GoodHosting we’re like any other quality Canadian web hosting provider in that we agree with you wholeheartedly. Let’s take a look at what makes this new no-tracking web browser such a good fit and why it promises to be especially well received.

Surfers 1 / Watchers 0

It's a real shame that the innocence and carefreeness of using the world wide web to gain information is gone now, and that government agencies, corporations, and malicious hackers lurk in the shadows taking notes; that's entirely unacceptable. Even those who aren't overly incensed at having their privacy violated will almost certainly choose to stay 'incognito' if the opportunity to do so exists.

Epic's creator, Alok Bhardwaj, attributes much of his motivation to build such a resource to coming to understand that, on average, there are some 10 or so trackers on pretty much every website you visit. For some sites, there are up to 30 or 40 companies logging your visit.

Fortunately, his new Epic browser includes built-in protection against a wide range of surveillance tactics, and without any of the BS like what was seen in 2015 in the States with AT&T’s policy where subscribers had to pay up to 50% more to secure a reasonable level of privacy.

The original version of Epic has been around since August of 2018, but the Chromium-based version of it is still new to the scene. It allows users to enjoy private browsing without sacrificing speed or functionality, and also blocks ultrasound signal tracking and cryptocurrency mining scripts. Plus, with a new mobile browser on the way, Epic continues to take actions that support the company’s belief in a free internet.

 

Sight for Sore Eyes: Privacy-Focused Web Browser

U.S. President Donald Trump's 2017 decision to scrap the internet privacy rules passed by the Federal Communications Commission the previous year put an effective end to internet users having more rights concerning what service providers can do with their data. Here in Canada we certainly haven't been immune to the increasingly grey areas of what can and can't be done as far as monitoring a web browser user's history.

Likely no one needs convincing that relying on governmental agencies to solve data privacy issues will result in little if anything being done. So we're left to take matters into our own hands as much as we can. Good news on that front, as Epic is an exceptionally private browsing experience that's also fast and intuitive, and based on Google's open-source Chromium project for long-term practicality in the bigger picture of things.

That perspective was very important in the development of this new browser, according to Bhardwaj. Microsoft announced that the company would build their next browser on Chromium, and so the decision was made to build a browsing experience that’s very private, but just as fast as using Google Chrome.

Mission Accomplished

We'd say it is – Epic is one of the simplest, most private, and fastest browsers on the market today, and it's really raised the bar that was set by the original private browser, Tor (which is still a great browser FWIW, still doing very well and also offering an extremely anonymous service).

One area where Epic meets a need that Tor can’t, however, is with malicious cryptocurrency activities. Hackers have used Tor to steal cryptocurrency from users, and fairly recently too.

Long story short, Epic is the only private browser out there that just works out of the box with a high level of privacy and speed, and it doesn't have any of the issues where advanced security protocols render certain websites undeliverable. In the event that one won't load, Epic lets you turn off the proxy and ad blocking features for a particular website if needed.

Other appealing features:

  • Free VPN
  • 1-click encrypted proxy
  • Blocks fingerprinting and ultrasound signaling
  • Locally stored database of the top 10,000 websites in the world

Coming to Mobile Soon

Epic is expected to launch the company’s mobile browser before long. They expect their mobile browsers to be even more significant than the desktop browsers, given the scale that mobile’s going to operate on. With the extent to which most of us use our smartphones for internet search queries, there’s no doubt that this mobile browser release will put Epic even more in the spotlight in the near future.

5G Networks: What to Expect

We don’t know about you, but for those of us here it doesn’t seem like it was that long ago that 3G Internet speeds were being revelled in as the latest and greatest. Things obviously change fast, as 3G has been in the rear view mirror for a long time now, and the reality is that the newest latest and greatest – 4G – is about to join it there.

Here at 4GoodHosting, the fact we’re a leading Canadian web host makes us as keen to learn more about what the new 5G networks have in store for us as anyone else who’s in the digital space day in and out. It appears that we’re in for quite a treat, although there are some who suggest tempering expectations. That’s to be expected anytime wholesale changes to infrastructure key to big-picture operations are forthcoming.

Nonetheless, we’re supposed to be immersed in the 5G world before the end of next year. Mobile 5G is expected to start making appearances in cities around North America this year, with much more extensive rollouts expected in 2020 so a discussion of what we can all expect from 5G is definitely in order. Let’s do it.

What is 5G, and How’s It Going to Work?

To cut right to it, 5G is the next generation of mobile broadband that will augment 4G LTE connections for now before eventually replacing them. 5G is promising to deliver exponentially faster download and upload speeds along with drastically reduced latency – the time it takes devices to communicate with each other across wireless networks. Right, that alone is worthy of some serious fanfare, but fortunately there’s even more to this.

But before getting into additional benefits expected to be seen with 5G networks, let’s have a look at what makes them different from 4G ones and how exactly these new super networks are predicted to function.

Spectrum-Specific Band Function

It's important to start with an understanding of the fact that, unlike LTE, 5G is going to operate on three different spectrum bands. The lowest will be the sub-1GHz spectrum bands (as designated by bodies like the GSMA / ITU). These are what's known as low-band spectrums, and they're the ones used for LTE by most carriers in North America. This spectrum is quite literally running out of steam, so it's ready to be replaced. It does provide great area coverage and signal penetration, but peak data speeds never exceed 100Mbps and often you're not anywhere close to even that.

Mid-band spectrums provide faster speeds and lower latency, but the long-standing complaint related to them is that they fail to penetrate buildings, and peak speeds top out at around 1Gbps.

High-band spectrums (aka mmWave) are what most people think of when they think of 5G, and they can offer peak speeds up to 10Gbps along with impressively low latency most of the time. The major drawback here though? Coverage area is small and building penetration is poor.

It appears that most carriers are going to start out by piggybacking 5G on top of their 4G LTE networks, and then nationwide 5G-exclusive networks will be built. Providers are very aware that small cells are going to be required so that these souped-up 4G LTE networks don't have their 5G appeal diminished by poor penetration rates and intermittently average download speeds.

In this regard, we all stand to benefit from the industry being cautious about not rolling out 5G on its own and then having growing pains with these networks.

Right, some people may not be familiar with small cells. They're low-power base stations that cover small geographic areas and allow carriers using mmWave for 5G to offer better overall coverage. Beamforming will be used to improve 5G service on the mid-band by sending a single focused signal to each and every user in the cell, with the systems using it monitoring each user to make sure they have a consistent signal.

Latency promises to be nearly if not entirely non-existent between the small cells and beamforming within 5G-enabled 4G LTE networks.

Examples of How 5G SHOULD Make Things Better

  1. Improved broadband

The reality today is that carriers are running out of LTE capacity in many major metropolitan areas. In some spots, users are already experiencing noticeable slowdowns during busy times of day. 5G will add huge amounts of spectrum in bands that have not been dedicated for commercial broadband traffic.

  2. Autonomous vehicles

Uber may have a devil of a time getting a foothold in Vancouver, but you can likely expect to see autonomous vehicles made possible with ubiquitous 5G deployment. The belief is that it will make it possible for your vehicle to communicate with other vehicles on the road, provide information to other vehicles regarding road conditions, and share performance information with both drivers and automakers.

This application has a TON of promise, and it's definitely one to keep an eye on.

  3. Public Infrastructure & Safety

It's also predicted that 5G will allow cities and other municipalities to operate with greater efficiency. All sorts of civic maintenance processes will be made more efficient by means of 5G networks.

  4. Remote Device Control

The remarkably low levels of latency expected with 5G make it so that remote control of heavy machinery may become possible. This means fewer actual people in hazardous environments, and it will also allow technicians with specialized skills to control machinery from any location around the globe.

  5. Health Care

5G and its super low latency may also be huge for health care applications. Since URLLC (ultra-reliable low-latency communication) reduces 5G latency even further than what you'll see with enhanced mobile broadband, we may see big improvements in telemedicine, remote recovery and physical therapy via AR, precision surgery, and even remote surgery in the very near future once 5G becomes the norm.

One of the most beneficial potential advances that may come with 5G as it concerns healthcare is that hospitals may be able to create massive sensor networks to monitor patients, allow physicians to prescribe smart pills to track compliance, and let insurers monitor subscribers to determine appropriate treatments and processes.

  6. IoT

Last but certainly not least is the way 5G will benefit the Internet of Things. As it is now, sensors that can communicate with each other tend to require a lot of resources and really drain LTE data capacity.

With 5G and its fast speeds and low latencies, the IoT will be powered by communications among sensors and smart devices. These devices will require fewer resources than the ones currently in use, and there are huge efficiencies to be had with connecting to a single base station.

It’s interesting to think that one day 5G will probably be as long-gone and forgotten as 3G is now, despite the fanfare we all gave it many years ago. You can’t stop progress in the digital world, and it’s fair to say that 99% of us wouldn’t want to even if we could.

 

Getting Ready for Wi-Fi 6: What to Expect

Most people aren't familiar with Wi-Fi beyond understanding that it means a wireless internet connection. Those same people won't be aware that in the last near decade the digital world has moved from Wi-Fi 4 to Wi-Fi 5, and now Wi-Fi 5 is set to be replaced by Wi-Fi 6. What's to be made of all of this for the average person who only knows that the Wi-Fi networks in their home and office are essential parts of their connected day-to-day, and that the Wi-Fi in Starbucks is pretty darn convenient as well?

The numeric chain that identifies a Wi-Fi standard is something they may well recognize, though. 802.11 is the standard, but the Wi-Fi 4 you had from 2009 to 2014 is a different 802.11 from the Wi-Fi 5 you've had since then until now. What's to come later this year with Wi-Fi 6 will be a different 802.11 again. Right, we get you – what's the difference exactly?

Here at 4GoodHosting, we’re like any quality Canadian web hosting provider in that the nature of our work and interests makes it so that we pick up on these things, if for no other reason than we’re exposed to and working with them on a regular basis. Much of the time these little particulars related to computing, web hosting, and digital connectivity aren’t worth discussing in great detail.

However, because Wi-Fi is such an essential and much-appreciated resource for all of us, we thought we'd take a look today at the 'new' Wi-Fi set to arrive later this year.

Wi-Fi 6: Problem Solver

When the average person looks at '802.11ac', they won't get the significance of it. The fact is, however, they should be able to, and what Wi-Fi 6 is being designed to be is a solution to that problem.

What we’re going to see is the beginning of generational Wi-Fi labels.

Let's make you aware that there is a collective body known as the Wi-Fi Alliance. They are in charge of deciding, developing, and designating Wi-Fi standards. We are all aware of how devices are becoming more complex and internet connections are evolving, and when they do the process of delivering wireless connections also changes.

As a result, Wi-Fi standards — the technical specifications that manufacturers follow to create Wi-Fi products — need to be updated from time to time so that new technology can flourish and compatibility extends to nearly the entire range of devices out there.

As mentioned though, the naming of Wi-Fi standards is totally foreign to the average person if they ever try to figure out what that numeric 802-something chain stands for. The Wi-Fi Alliance's response is now to simply refer to the number of the generation. Not only will this apply to the upcoming Wi-Fi 6, it will also be retroactive and thus apply to older standards. For example:

802.11n (2009) – Wi-Fi 4

802.11ac (2014) – Wi-Fi 5

802.11ax (expected late 2019) – Wi-Fi 6

It's easier to see how this is a better classification approach, but there's likely going to be a period of confusion where some products are labeled with the old code and some are just called Wi-Fi 4 or Wi-Fi 5 when they're functionally interchangeable as far as 'type' is concerned. Eventually, however, this should be resolved as older product labeling is phased out and everyone – or most people at least – becomes familiar with the new Wi-Fi classifications. In all honesty, if you pay even the slightest amount of attention you'll begin to notice the difference without having to put much thought into it.

How Wi-Fi 6 Will Be Different – And Better

The biggest impetus to create Wi-Fi 6 was to better accommodate all the many new Wi-Fi technologies that have been emerging, and Wi-Fi 6 helps standardize them. Here are the most relevant developments, and exactly what they should mean for your wireless network.

Lower Latency

Lower latency is a BIG plus that's going to come with Wi-Fi 6, and you'll probably notice it right quick. Reduced latency means shorter or no delay times as data is sent, very similar to ping rate and other such measurements. Low latency connections improve load times and prevent disconnects and other issues more effectively. Wi-Fi 6 lowers latency compared to older Wi-Fi standards by using more advanced technology like OFDMA (orthogonal frequency-division multiple access). Long story short, it's going to pack data into a signal much more completely and reliably.

Speed

Wi-Fi 6 will also be faster, and considerably faster compared to Wi-Fi 5. By offering full support for technologies like MU-MIMO, connection quality will improve for compatible mobile devices in a big way, and content delivery should speed up accordingly. These improvements aren't as tied to your Internet connection speed as you might think, either; they can and likely will improve the speed of your Wi-Fi data and let you receive more information, more quickly.

Now a question we imagine will come up for most of you – will all routers be able to work with the new 802.11ax standard? No, they won’t. If your router is especially dated, you should happily accept the fact it’s time to get a newer model. It will be 100% worth it, don’t have any doubts about that.

Wi-Fi 6 is also going to mean fewer dead zones, as a result of expanded beamforming capabilities being built into it. ‘Beamforming’, you say? That’s the name for the trick your router uses to focus signals on a particular device, and that’s quite important if the device is having difficulty working with a connection. The new WiFi 6 802.11ax standard expands the range of beamforming and improves its capabilities. Long story short again, ‘dead zones’ in your home are going to be MUCH less likely.

Improved Battery Life

Wi-Fi 6 is going to mean better battery life, and we’ll go right ahead and assume that’s going to be most appealing for a lot of you who are away from home for long periods of the day and taking advantage of Wi-Fi connectivity fairly often throughout.

One of the new technologies that Wi-Fi 6 is set up to work with is called TWT, or target wake time. It assists connected devices with customizing when and how they 'wake up' to receive data signals from Wi-Fi. Devices are able to 'sleep' while waiting for the next necessary Wi-Fi transmission, and battery drain is reduced as a result. Your phone doesn't sleep entirely, only the parts of it that handle Wi-Fi.

Everybody will like the idea of more battery life and less time spent plugging in to recharge.

Keep an Eye Out for the Wi-Fi 6 Label

How will you know if a router, phone or other device works with the new 802.11ax standard? Simply look for the phrase ‘Wi-Fi 6’ on packaging, advertisements, labels or elsewhere. Look up the brand and model # online if for some reason you don’t see it on the packaging. The Wi-Fi Alliance has also suggested using icons to show the Wi-Fi generation. These icons appear as Wi-Fi signals with a circled number within the signal.

Identifying these icons should help you pick out the right device. If not, you can of course always ask the person behind the till and they should be knowledgeable regarding this (if they work there you'd have to assume they would be).

Keep in mind that most of the devices around 2020 and later are expected to be Wi-Fi 6, and so we’ll have to wait a year or so before they start to populate the market.

 

Chromium Manifest V3 Updates May Disable Ad Blockers

It’s likely that a good many of you are among the thousands upon thousands of people who have an Ad Blocker installed for your web browsers of choice. Some people do use them simply to avoid the nuisance of having to watch ad after ad, and it’s people like these that have necessitated some sites to insist that you ‘whitelist’ them in order to proceed into the website they want to visit. That’s perfectly understandable, as those paying advertisers are the way the website generates income for the individual or business.

For others, however, we spend a great deal of our working day researching and referencing online, and having to watch ads before getting to the content we need gets in the way of our work. For us, an ad blocker is much more a tool of necessity than convenience. Still, we get caught up in more than a few sites that insist on being whitelisted too. For me, my ad blocker is a godsend and I don't whitelist any website or disable my ad blocker for any of them.

Here at 4GoodHosting, part of what makes us a good Canadian web hosting provider is having built up an insight into what really matters to our customers. The bulk of them are people who use the Information Superhighway as a production resource rather than web 'surfers' for whom it's more of an entertainment one. That's why today's news is sure to be very relevant for most of our customers.

Weakened WebRequest APIs

Some of you may not know how your ad blocker works, and that's perfectly normal. As long as it does its job, you don't really need to know. Chromium is the open-source project that Google's Chrome browser is built on, and changes made there can be expected to reach what has become most people's web browser of choice.

However, Chromium developers have shared in the last few weeks that among the updates they are planning for Manifest V3 is one that will restrict the blocking version of the webRequest API. The alternative they're introducing is called the declarativeNetRequest API.

After becoming aware of it, many ad blocker developers expressed their belief that the introduction of the declarativeNetRequest API will mean many already existing ad blockers won’t be ‘blocking’ much of anything anymore.

One industry expert stated on the subject, “If this limited declarativeNetRequest API ends up being the only way content blockers can accomplish their duty, this essentially means that two existing and popular content blockers like uBO and uMatrix will cease to be functional.”

What is the Manifest V3 Version?

It’s basically a mechanism through which specific capabilities can be restricted to a certain class of extensions. These restrictions are indicated in the form of either a minimum, or maximum, version.

Why the Update?

Currently, the webRequest API allows extensions to intercept requests and then modify, redirect, or block them. The basic flow of handling a request using this API is as follows, with a minimal sketch of the blocking style shown after the list:

  • Chromium receives the request / queries the extension / receives the result
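In concrete terms, a blocking webRequest handler in an extension's background script looks something like the following. This is a minimal sketch, assuming a Manifest V2 extension that has the webRequest and webRequestBlocking permissions, with the ad host used purely as a placeholder.

```typescript
// background.ts (Manifest V2 style): the extension is consulted synchronously
// for every matching request and can cancel it on the spot.
chrome.webRequest.onBeforeRequest.addListener(
  (details) => {
    // Returning { cancel: true } blocks the request outright.
    console.log("Blocking request to", details.url);
    return { cancel: true };
  },
  { urls: ["*://ads.example.com/*"] }, // placeholder filter pattern
  ["blocking"]
);
```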

However, in Manifest V3 the use of this API will have its blocking form limited quite significantly. The non-blocking form of the API, which permits extensions to observe network requests but not modify, redirect, or block them, will not be discouraged. In addition, the exact limitations to be put on the webRequest API have yet to be determined.

Manifest V3 is set to make the declarativeNetRequest API the primary content-blocking API in extensions. This API allows extensions to tell Chromium up front what to do with a given class of request, instead of Chromium forwarding each request to the extension, which enables Chromium to handle requests synchronously. Google insists this API is overall a better performer and provides better privacy guarantees to users – the latter of which is of course very important these days.
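For comparison, under the declarative model the extension registers rules ahead of time and never sees the individual requests. Here's a rough sketch, assuming a Manifest V3 extension with the declarativeNetRequest permission; the rule ID and filter are placeholders.

```typescript
// background.ts (Manifest V3 style): the extension registers rules up front
// and Chromium applies them itself, without consulting the extension per request.
chrome.declarativeNetRequest.updateDynamicRules({
  removeRuleIds: [1], // replace any previous version of this rule
  addRules: [
    {
      id: 1,
      priority: 1,
      action: { type: "block" },
      condition: {
        urlFilter: "||ads.example.com^", // placeholder filter
        resourceTypes: ["script", "image", "xmlhttprequest"],
      },
    },
  ],
});
```

The concern ad blocker developers have is visible right in the shape of that API: the filtering logic lives in Chromium's own rule matcher rather than in the extension's engine.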

Consensus Among Ad Blocker Developers and Maintainers?

When informed about this coming update, many developers were concerned that the change would end up completely disabling all ad blockers. The worry was that the proposed declarativeNetRequest API would make it impossible to develop new and functional filtering engine designs, because the declarativeNetRequest API is no more than the implementation of one specific filtering engine, and some ad blocker developers have commented that it's very limited in its scope.

It's also believed that with the declarativeNetRequest API, developers will be unable to implement other features, such as blocking media elements that are larger than a set size and disabling JavaScript execution through the injection of CSP directives, among others.

Others are making the comparison to Safari's content blocking API, which essentially puts limits on the number of admissible rules. Safari introduced that API fairly recently, and the belief is that it's part of why Google has gone in this direction too. Many seem to think that extensions written against that API are more usable, but still fall well short of the full power of uBlock Origin. The hope is that this API won't be the last of its kind in the foreseeable future.

Google Chrome Solution for ‘History Manipulation’ On Its Way

No one will need to be convinced of the fact there's a massive number of shady websites out there designed to ensnare you for any number of no-good purposes. Usually you're rerouted to them when you take a seemingly harmless action, and then often you're unable to back yourself out of the site once you've unwillingly landed on it. Nobody wants to be on these spammy or malicious pages, and you're stressing out every second longer that you're there.

The well-being of web surfers who also happen to be customers or friends here at 4GoodHosting is important to us, and being proactive in sharing all our wisdom about anything and everything related to the web is part of what makes us one of the best Canadian web hosting providers.

It's that aim that has us sharing this news with you here today: Google understands the unpleasantness that comes with being locked into a website and has plans to make it remediable pretty quick here.

The first time something like this occurs you'll almost certainly be clicking on the back button repeatedly before realizing it's got no function. Eventually you'll come to realize that you've got no other recourse than to close the browser, and oftentimes you'll quit Chrome altogether ASAP and then launch it again for fear of inheriting a virus or something of the sort from the nefarious site.

How History Manipulation Works, and what Google is Doing About It

You'll be pleased to hear the Chrome browser will soon be armed with specific protection measures to prevent this happening. The way the 'back' button gets broken here is through something the Chrome team calls 'history manipulation'. What it involves is the malicious site stacking dummy pages onto your browsing history, and these work to fast-forward you back to the unintended destination page you were trying to get away from. A simplified sketch of the trick follows.
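To make that concrete, here's a deliberately simplified sketch of the kind of script such a page might run. It only uses the standard History API and is illustrative, not a reproduction of any specific malicious page.

```typescript
// history-trap.ts: a simplified illustration of 'history manipulation'.
// The page floods the session history with dummy entries, so pressing Back
// just walks through them (or gets bounced forward again) instead of leaving.
for (let i = 0; i < 20; i++) {
  // Each call adds a history entry without navigating anywhere.
  history.pushState({ dummy: i }, "", `#trap-${i}`);
}

// When the user presses Back, immediately push them forward again.
window.addEventListener("popstate", () => {
  history.pushState({ dummy: "again" }, "", "#trapped");
});
```

A couple of dozen junk entries plus a popstate handler that pushes the user forward again is all it takes to make the back button feel dead, and it's exactly these patterns that Chrome's upcoming countermeasures aim to detect and ignore.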

Fortunately, Chrome developers aren’t letting this slide. There are upcoming changes to Chromium’s code which will facilitate the detection of these dummy history entries and then flag sites that use them.

The aim is to allow Chrome to ignore the entirety of these false history entries to make it so that you’re not buried in a site that you had no intention of landing on and the back button functions just as you expect it to.

This development is still in its formative stages, and we should be aware that these countermeasures aren’t even in the pre-release test versions of Chrome yet. However, industry insiders report that testing should begin within the next few weeks or so, and all signs point towards the new feature being part of the full release version of the web browser.

In addition, this being a change to the Chromium engine makes it so that it may eventually benefit other browsers based on it. Most notable of these is Microsoft Edge, making it so that the frustrations of a paralyzed back button will be a thing of the past for either popular web browser. So far there’s no industry talk of Apple doing the same for Safari, but one can imagine they’ll be equally on top of this in much the same way.

Merry Christmas from 4GoodHosting

Given it’s the 24th of December here we of course would like to take this opportunity to wish a Merry Christmas to one and all. We hope you are enjoying the holidays with your family and this last week of 2018 is an especially good one. We can reflect on 2018, and look forward to an even more prosperous year in 2019.

Happy Holidays and best wishes, from all of us to all of you!

The Surprising Ways We Can Learn About Cybersecurity from Public Wi-Fi

A discussion of cybersecurity isn’t exactly a popular topic of conversation for most people, but those same people would likely gush at length if asked about how fond of public wi-fi connections they are! That’s a reflection of our modern world it would seem; we’re all about digital connectivity, but the potential for that connectivity to go sour on us is less of a focus of our attention. That is until it actually does go sour on you, of course, at which point you’ll be wondering why more couldn’t have been done to keep your personal information secure.

Here at 4GoodHosting, cybersecurity is a big priority for us the same way it should be for any of the best Canadian web hosting providers. We wouldn’t have it any other way, and we do work to keep abreast of all the developments in the world of cybersecurity, and in particular these days as it pertains to cloud computing. We recently read a very interesting article about how our preferences for the ways we (meaning the collective whole of society) use public wi-fi can highlight some of the natures and needs related to web security, and we thought it would be helpful to share it and expand on it for you with our blog this week.

Public Wi-Fi and Its Perils

Free, public Wi-Fi is a real blessing for us when mobile data is unavailable, or scarce as is often the case! Yet few people really know how to articulate exactly what the risks of using public Wi-Fi are and how we can protect ourselves.

Let's start with this: when you join a public hotspot without protection and begin to access the internet, the packets of data moving from your device to the router are public and thus open to interception by anyone. Yes, SSL/TLS technology exists, but all that's required for a cybercriminal to snoop on your connection is some relatively simple Linux software that he or she can find online without much fuss.

Let’s take a look at some of the attacks that you may be subjected to due to using a public wi-fi network on your mobile device:

Data monitoring

Wi-Fi adapters are usually set to 'managed' mode. The adapter then acts as a standalone client connecting to a single router for Internet access, and the interface ignores all data packets with the exception of those that are explicitly addressed to it. However, some adapters can be configured into other modes. In 'monitor' mode, an adapter captures all wireless traffic in a certain channel, no matter who the source or intended recipient is. In monitor mode the adapter is also able to capture data packets without being connected to a router, giving it the ability to sniff and snoop on every piece of data it likes provided it can get its hands on it.

It should be noted that not all commercial Wi-Fi adapters are capable of this; it's cheaper for manufacturers to produce models that handle 'managed' mode exclusively. Still, should someone get their hands on one and pair it with some simple Linux software, they'll then be able to see which URLs you are loading plus the data you're providing to any website not using HTTPS – names, addresses, financial accounts, etc. That's obviously going to be a problem for you.

Fake Hotspots

Snaring unencrypted data packets out of the air is definitely a risk of public wi-fi, but it’s certainly not the only one. When connecting to an unprotected router, you are then giving your trust to the supplier of that connection. Usually this trust is fine, your local Tim Horton’s probably takes no interest in your private data. However, being careless when connecting to public routers means that cybercriminals can easily set up a fake network designed to lure you in.

Once this illegitimate hotspot has been created, all of the data flowing through it can then be captured, analysed, and manipulated. One of the most common choices here is to redirect your traffic to an imitation of a popular website. This clone site will serve one purpose; to capture your personal information and card details in the same way a phishing scam would.

ARP Spoofing

The reality unfortunately is that cybercriminals don’t even need a fake hotspot to mess with your traffic.
Every device on a Wi-Fi or Ethernet network has a unique MAC address. This is an identifying code used to ensure data packets make their way to the correct destination. Routers and all other devices discover this information using the Address Resolution Protocol (ARP).

Take this example: your smartphone sends out a request asking which device on the network is associated with a certain IP address. The requested device then provides its MAC address, ensuring the data packets are physically directed to what's been determined to be the correct destination. The problem is that these ARP replies can be impersonated, or 'faked'. Your smartphone might send a request for the address of the public Wi-Fi router, and a different device will answer it with a false address.

Provided the signal of the false device is stronger than the legitimate one, your smartphone will be fooled. Again, this can be done with simple Linux software.

Once the spoofing has taken place, all of your data will be sent to the false router, which can subsequently manipulate the traffic however it likes.
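One rough way to sanity-check for this on your own machine is to look at the ARP table and flag any MAC address that claims more than one IP. The sketch below does that with Node.js by shelling out to the system's arp command; the output format of arp -a varies by platform, so treat the parsing as illustrative only.

```typescript
// arp-check.ts: crude heuristic, since one MAC answering for several IPs
// (especially your gateway's IP) can be a sign of ARP spoofing.
import { execSync } from "node:child_process";

const table = execSync("arp -a").toString();
const macToIps = new Map<string, string[]>();

for (const line of table.split("\n")) {
  // Matches the common "(192.168.1.1) at aa:bb:cc:dd:ee:ff" shape.
  const match = line.match(/\(([\d.]+)\) at ([0-9a-f:]+)/i);
  if (!match) continue;
  const [, ip, mac] = match;
  const key = mac.toLowerCase();
  macToIps.set(key, [...(macToIps.get(key) ?? []), ip]);
}

for (const [mac, ips] of macToIps) {
  if (ips.length > 1) {
    console.warn(`Suspicious: ${mac} is answering for multiple IPs: ${ips.join(", ")}`);
  }
}
```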

MitM – ‘Man-in-the-Middle’ Attacks

A man-in-the-middle attack (MITM) is a reference to any malicious action where the attacker secretly relays communication between two parties, or alters it for whatever malevolent reason. On an unprotected connection, a cybercriminal can modify key parts of the network traffic, redirect this traffic elsewhere, or fill an existing packet with whatever content they wish.

Examples of this could be displaying a fake login form or website, changing links, text, pictures, or more. Unfortunately, this isn’t difficult to do; an attacker within reception range of an unencrypted wi-fi point is able to insert themselves all too easily much of the time.

Best Practices for Securing your Public Wi-Fi Connection

The ongoing frequency of these attacks definitely serves to highlight the importance of basic cybersecurity best practices. Following these ones will counteract most public Wi-Fi threats effectively:

  1. Have Firewalls in Place

An effective firewall will monitor and block any suspicious traffic flowing between your device and a router. Yes, you should always have a firewall in place and your virus definitions updated as a means of protecting your device from threats you have yet to come across.

While it's true that properly configured firewalls can effectively block some attacks, they're not a 100% reliable defender, and you're definitely not exempt from danger just because of them. They primarily help protect against malicious traffic, not malicious programs, and one of the most frequent instances where they don't protect you is when you are unaware of the fact you're running malware. Firewalls should always be paired with other protective measures, with antivirus software being the best of them.

  2. Software updates

Software and system updates are also biggies, and should be installed as soon as you can do so. Staying up to date with the latest security patches is a very proven way to have yourself defended against existing and easily-exploited system vulnerabilities.

  3. Use a VPN

No matter if you're a regular user of public Wi-Fi or not, a VPN is an essential security tool that you can put to work for you. VPNs serve you here by generating an encrypted tunnel that all of your traffic travels through, ensuring your data is secure regardless of the nature of the network you're on. If you have reason to be concerned about your security online, a VPN is arguably the best safeguard against the risks posed by open networks.

That said, free VPNs are not recommended, because many of them have been known to monitor and sell users' data to third parties. You should choose a service provider with a strong reputation and a strict no-logging policy.

  4. Use common sense

You shouldn't fret too much over hopping onto public Wi-Fi without a VPN, as the majority of attacks can be avoided by adhering to a few tried-and-true safe computing practices. First, avoid making purchases or visiting sensitive websites like your online banking portal. In addition, it's best to stay away from any website that doesn't use HTTPS. The popular browser extension HTTPS Everywhere can help you here. Make use of it!

The majority of modern browsers also now have in-built security features that are able to identify threats and notify you if they encounter a malicious website. Heed these warnings.

Go ahead and make good use of public Wi-Fi and all the email checking, web browsing, and social media socializing goodness it offers, but just be sure that you're not putting yourself at risk while doing so.

Linux or Windows

The vast majority of websites are hosted on either Linux or Windows OS servers, and the market share is now shifting towards Linux according to a recent report from W3tech. Consumer surveys indicated that Unix servers make up some 66% of all web servers while Windows accounts for just over 33%. For most this isn’t going to be something they’ll give any consideration to, and it’s true that websites with standard HTML pages will be served equally well with either OS.

These days greater numbers of websites have been 'revamped' since their inception and now feature dynamic design elements that enhance the UX for viewers. If you are planning to design or redesign your website to be much more engaging, to work with forms, and to execute web applications, both systems will serve your needs.

Linux and Windows are pretty much neck and neck when it comes to functionality. Each works with a number of frameworks and front end programming languages, and have impressive features when it comes to hosting. Linux and Windows handle data in the same way too, and both sport easy, convenient and fast FTP tools to serve a wide range of file management functions.

Nine times out of 10 you’ll be at your best with either, and at 4GoodHosting our Linux and Windows web hosting specs make us one of the best Canadian web hosting providers with data centers in both Eastern and Western Canada.

Our standard web hosting is via ultra-fast, dual-parallel processing Hexa Core Xeon Linux-based web servers with the latest server software installations, and our Windows hosting includes full support for the entire spectrum of frameworks and languages: ASP.NET, Frontpage, Choice of MySQL, MSSQL 2000 or 2005 DB, ATLAS, Silverlight, ASP 3.0, PHP4 & PHP5, and Plesk.

Let's have a look at the differences between them.

Price

The most significant difference between Linux and Windows web hosting is the core operating system on which the server(s) and user interface run. Linux uses some form of the Linux kernel, and these are usually free. There are some paid distributions, Red Hat being a good one, which comes with a number of special features aimed at better server performance. With Windows you’ll have a licensing fee because Microsoft develops and owns its OS and hardware upgrade needs can be a possibility too. We like Linux because over its lifespan, Linux servers generally cost significantly less than a similar one that’s Windows-based.

Software Support

Before choosing an OS, you'll also have to consider the script languages and database applications that are required to host the website on it. If your website needs Windows-based scripts or database applications to display correctly, then a Windows web hosting platform is probably best for you. Sites developed with Microsoft ASP.NET, ASP Classic, MSSQL, MS Access, or SharePoint technologies will also head over to the Windows side.

Conversely, if your website requires Linux-based script or database software, then a Linux-based web hosting platform is going to be your best choice. Plus, anyone planning to use Apache modules, NGINX or development tools like Perl, PHP, or Python with a MySQL database will enjoy the large support structure for these formats found with Linux.

Control Panel And Dev Tools

Another consideration with these two web hosting options is that Linux offers control panels like cPanel or WHM, and Windows uses Plesk. There are fundamental differences between them. cPanel has a simple user-friendly interface and users can download applications, such as WordPress, phpBB, Drupal, Joomla, and more, with super simple one-click installs. Creating and managing MySQL databases and configuring PHP is easy, and cPanel automatically updates software packages too. Plus, we like our Linux hosted websites for people new to the web. cPanel makes it easy for even people with no coding knowledge to create websites, blogs, and wiki pages. You can get tasks done faster without having to learn the details of every package installed.

Plesk is very versatile in that it can help you run the Windows version of the Linux, Apache, MySQL, and PHP stack. Plesk also supports Docker, Git, and other advanced security extensions. Windows servers have many unique and practical tools available as well, such as the Microsoft Web Platform Installer (Web PI) for speedier installation of the IIS (Internet Information System web server), MSSQL, and ASP.NET stack.

Because it's been on the field longer, there are loads of open-source Linux applications available online. Windows hosting has fewer apps to choose from, but you have the security of knowing they are all from vetted, licensed providers. This increases the rate at which you can move ahead with database deployment.

Performance And Security

A reputable Canadian web host can be expected to secure your website within its data centres, but online attacks on Windows servers over the last few years suggest they may be more of a red flag here than Linux servers. That's not to say that Linux – or any OS that has been or ever will be developed – won't have any security issues. Solid security is a product of good passwords, applying necessary patches, and sound support practices.

Further, Linux servers are pretty much universally considered superior to Windows for stability and reliability. They rarely need to be rebooted, and configuration changes rarely require a restart. Running multiple database and file servers on Windows can make it unstable, and another small difference is that Linux files are case-sensitive while Windows files are not.

Penguin for the Win

Your choice of server should be dictated by the features & database application needed for the proper functioning of your hosting or website development project. Those of you working on your own external-facing site and looking for a combination of flexibility and stability will be set up perfectly with Linux and cPanel. Those working in a complex IT environment with existing databases and legacy applications running on Windows servers will be best served being hosted on a Windows OS server.

Site Isolation from Google Promises to Repel More Malware Attacks


Security in the digital business world is really a challenge these days, and the world wide web is becoming as full of nefarious characters as the town of Machine, the 'End of the Line' as it were, in Dead Man, the cool monochrome Western with Johnny Depp from the '90s. A few months back we detailed the big bad Spectre vulnerability that had come onto the scene and posed major threats regarding the insecurity of data for any type of website handling sensitive personal information.

It continues to be a ‘thing’, and in response to it Google recently enabled a new security feature in Chrome that secures users from malicious attacks like Spectre. It’s called Site Isolation, and is a new feature available with Chrome 67 on Windows, Mac, Linux, and Chrome OS. Here at 4GoodHosting, we’re a Canadian web hosting provider that puts an emphasis on this for obvious reasons, always seeking to be as on top of our clients’ web hosting needs as effectively as possible.

Google’s experimentation with Site Isolation has been going on since Chrome 63, and they’ve patched a lot of issues before enabling it by default for all Chrome users on desktop.

Chrome's multi-process architecture allows different tabs to employ different renderer processes. Site Isolation functions by limiting each renderer process to documents from a single site. Chrome then relies on the operating system to mitigate attacks between processes, and thus between sites.

Google has stated that in Chrome 67, Site Isolation has been enabled for 99% of users on Windows, Mac, Linux, and Chrome OS, according to a recent post on their company blog, stating further that ‘even if a Spectre attack were to occur in a malicious web page, data from other websites would generally not be loaded into the same process, and so there would be much less data available to the attacker. This significantly reduces the threat posed by Spectre.’

Additional known issues in Chrome for Android have been identified and are being worked on. Site Isolation for Chrome for Android should be ready with Chrome 68.

Need for Speed

Quick mention as well to Speed Update for Google Search on mobile. With this new feature the speed of pages will be a ranking factor for mobile searches. Of course, page speed has already been factoring into search engine rankings for some time now, but it was primarily based on desktop searches.

All of this is based on the unsurprising finding that people want answers to their searches as fast as possible, and page loading speed is an issue. Keeping that in mind, Google's new feature for mobile users will only affect pages that are painfully slow, and that has to be considered a good thing. Average pages should remain unaffected by and large.

We’re always happy to discuss in more detail how our web hosting service comes with the best in security and protective measures for your website when it’s hosted with us, and we also offer very competitively priced SSL certificates for Canadian websites that go a long way in securing your site reliably. Talk to us on the phone or email our support team.

Seven Steps to a Reliably Secure Server

In a follow up to last week’s blog post where we talked about how experts expect an increase in DDoS attacks this year, it makes sense for us to this week provide some tips on the best way to secure a server. Here at 4GoodHosting, in addition to being a good Canadian web hosting provider we also try to take an interest in the well being of clients of ours who are in business online. Obviously, the premise of any external threat taking them offline for an extended period of time will endanger the livelihood of their business, and as such we hope these discussions will prove valuable.

Every day we’re presented with new reports of hacks and data breaches causing very unwelcome disruptions for businesses and users alike. Web servers tend to be vulnerable to security threats and need to be protected from intrusions, hacking attempts, viruses and other malicious attacks, but there’s no replacing a secure server with its role for a business that operates online and engages in network transactions.

They tend to be the target because they are many times all too penetrable for hackers, and add to that the fact they’re known to contain valuable information. As a result, taking proper measures to ensure you have a secure server is as vital as securing the website, web application, and of course the network around it.

Your first decisions to evaluate are the server, OS, and web server you'll choose to collectively function as the server you hope will be secure, and then the kind of services that run on it. No matter which particular web server software and operating system you choose to run, you must take certain measures to increase your server security. For starters, you'll need to review and configure every aspect of your server in order to secure it.

It's best to maintain a multi-faceted approach that offers in-depth security, because each security measure implemented stacks an additional layer of defence. The following is a list we've assembled from many different discussions with web development and security experts that individually and collectively will help strengthen your web server security and guard against cyberattacks, stopping them essentially before they even have the chance to get 'inside' and wreak havoc.

Let’s begin;

  1. Automated Security Updates

Unfortunately, most vulnerabilities come with a zero-day status. Before you know it a public vulnerability can be utilized to create a malicious automated exploit. Your best defence is to ALWAYS keep an eye on the ball when it comes to receiving security updates and putting them into place. Now of course your eye isn't available 24/7, but you can and should be applying automatic security updates and security patches as soon as they are available through the system's package manager. If automated updates aren't available, you need to find a better system – pronto.

  2. Review Server Status and Server Security

Being able to quickly review the status of your server and check whether there are any problems originating from CPU, RAM, disk usage, running processes and other metrics will often help pinpoint server security issues much faster. In addition, ubiquitous command line tools can also review the server status; a tiny scripted example follows. Each of your network service logs, database logs (Microsoft SQL Server, MySQL, Oracle), and site access logs present on a web server are best stored in a segregated area and checked with regularity. Be on the lookout for strange log entries. Should your server be compromised, having a reliable alerting and server monitoring system standing guard will prevent the problem from snowballing and allow you to take strategic reactive measures.
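As a small illustration of that kind of quick status review, here's a sketch that grabs load average, memory, and disk usage from a Linux host using Node.js. It assumes the df command is available, and it's meant as a starting point rather than a monitoring system.

```typescript
// server-status.ts: one-off status snapshot for a Linux host running Node.js.
import os from "node:os";
import { execSync } from "node:child_process";

const [load1, load5, load15] = os.loadavg();
const freeMemMb = Math.round(os.freemem() / 1024 / 1024);
const totalMemMb = Math.round(os.totalmem() / 1024 / 1024);

console.log(`Load average (1/5/15 min): ${load1} / ${load5} / ${load15}`);
console.log(`Memory: ${freeMemMb} MB free of ${totalMemMb} MB`);

// Disk usage via the system's own df; warn on anything 90% full or more.
const dfLines = execSync("df -P").toString().split("\n").slice(1);
for (const line of dfLines) {
  const cols = line.trim().split(/\s+/);
  if (cols.length < 6) continue;
  const usedPct = parseInt(cols[4], 10);
  if (usedPct >= 90) console.warn(`Disk nearly full: ${cols[5]} at ${cols[4]}`);
}
```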

  3. Perimeter Security With Firewalls

Seeing to it that you have a secure server involves the installation of security applications like border routers and firewalls, ready and proven effective for filtering known threats, automated attacks, malicious traffic, DDoS floods, and bogon IPs, plus any untrusted networks. A local firewall will be able to actively monitor for attacks like port scans and SSH password guessing and effectively neutralize their threat. Further, a web application firewall helps to filter incoming web page requests that are made for the explicit purpose of breaking or compromising a website.

  4. Use Scanners and Security Tools

Fortunately, we've got many security tools (URLScan, ModSecurity) typically provided with web server software to aid administrators in securing their web server installations. Yes, configuring these tools can be a laborious and time-consuming process, particularly with custom web applications, but the benefit is that they add an extra layer of security and give you serious reassurance.

Scanners can help automate the process of running advanced security checks against the open ports and network services to ensure your server and web applications are secure. It most commonly will check for SQL injection, web server configuration problems, cross site scripting, and other security vulnerabilities. You can even get scanners that can automatically audit shopping carts, forms, dynamic web content and other web applications and then provide detailed reports regarding their detection of existing vulnerabilities. These are highly recommended.

  5. Remove Unnecessary Services

Typical default operating system installations and network configurations (Remote Registry Services, Print Server Service, RAS) will not be secure. Ports are left vulnerable to abuse with larger numbers of services running on an operating system. It’s therefore advisable to switch off all unnecessary services and then disable them. As an added bonus, you’ll be boosting your server performance by doing this with a freeing of hardware resources.

  6. Manage Web Application Content

The entirety of your web application or website files and scripts should be stored on a separate drive, away from the operating system, logs and any other system files. By doing so it creates a situation where even if hackers gain access to the web root directory, they’ll have absolutely zero success using any operating system command to take control of your web server.

  7. Permissions and Privileges

File and network services permissions are imperative points for having a secure server, as they help limit any potential damage that may stem from a compromised account. Malicious users can compromise the web server engine and use the account in order to carry out malevolent tasks, most often executing specific files that work to corrupt your data or encrypt it to their specifics. Ideally, file system permissions should be granular. Review your file system permissions on a VERY regular basis to prevent users and services from engaging in unintended actions. In addition, consider removing the “root” account to enable login using SSH and disabling any default account shells that you do not normally choose to access. Make sure to use the least privilege principle to run specific network service, and also be sure to restrict what each user or service can do.

Securing web servers can make it so that corporate data and resources are safe from intrusion or misuse. We’ve clearly established here that it is about people and processes as much as it is about any one security ‘product.’ By incorporating the majority (or ideally all) measures mentioned in this post, you can begin to create a secure server infrastructure that’s supremely effective in supporting web applications and other web services.