DNS Flag Day This Past Friday: What You Need to Know About Your Domain

We’re a few days late getting to this, but we’ve chosen to make DNS Flag Day our topic this week because its ramifications will be of ongoing significance for pretty much anyone with an interest in digital marketing and the World Wide Web as a whole. Those people will very likely be familiar with DNS and what the abbreviation stands for, but for anyone who doesn’t, DNS is the Domain Name System.

DNS has been an integral part of the information superhighway’s infrastructure for nearly as long as the Internet itself has been in existence. So what’s its significance? Well, in the Internet’s early days there wasn’t a perceived need for the levels of security that we know are very much required these days. There was much more in the way of trust and less in the way of pressing concerns. There weren’t a lot of people using it, and as such the importance of DNS as a core service didn’t receive much focus and it wasn’t developed with much urgency.

Any Canadian web hosting provider will be on the front lines of any developments regarding web security measures, and here at 4GoodHosting we’re no exception. Offering customers the best in products and services that make their website less vulnerable is always going to be a priority. Creating informed customers is something we believe in too, and that’s why we’re choosing to get you in the know regarding DNS Flag Day.

What Exactly is this ‘Flag Day’?

The long and short of this is that this past Friday, February 1, 2019, was the official DNS Flag Day. So, for the last 3 days, some organisations may have had a non-functioning domain. Not likely many of them, but many will see their domains now being unable to support the latest security features – making them an easier target for network attackers.

How and why? Well, a little bit of background info is needed. These days DNS has a wide-spread complexity, which is ever more necessary because cyber criminals are launching ever more complex and disruptive distributed denial of service (DDoS) attacks aimed at a domain’s DNS. They’ve been having more success, and when they do it works out that no functioning DNS = no website.

Developers have done their part to counter these threats quite admirably, most notably with the many workarounds put in place to guarantee that DNS can continue to function as part of a rapidly growing internet.

The situation over recent years is one where a combination of protocol and product evolution has pushed and pulled DNS in all sorts of different directions. This naturally means complications, and technology implementers typically have to weigh this ever-growing number of changes against the associated risks.

Cutting to the chase a bit again, the workarounds have ended up allowing legacy behaviours and slowing down DNS performance for everyone.

To address these problems, as of last Friday, vendors of DNS software – as well as large public DNS providers – have removed certain DNS workarounds that many people have been consciously or unconsciously relying on to protect their domains.

Flag’s Up

The reason this move had to be made is because broken implementations and protocol violations have resulted in delayed response times, far too much complexity, and difficulty with upgrading to new features. DNS Flag Day has now put an end to the blanket accommodation of many of these workarounds.

The change will affect sites with software that doesn’t follow published standards. For starters, domain timeouts will now be identified as a sign of a network or server problem. Moving forward, DNS servers that do not respond to Extension Mechanisms for DNS (EDNS) queries will be regarded as dead servers, and requests for the domains they serve won’t get a response back to the browser.

Test Your Domain

If you’re the type to be proactive about these things then here’s what you can do. You can test your domain and your DNS servers with the EDNS compliance tester. You’ll receive a detailed technical report indicating whether your test failed, partially failed, or was successful.

Failures in these tests are caused by broken DNS software or broken firewall configuration, which can be remediated by upgrading DNS software to the latest stable version and re-testing. If the tests still fail, organisations will need to look further into their firewall configuration.
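
For those who’d like to script a quick sanity check of their own, here’s a minimal sketch that sends a plain DNS query and an EDNS-enabled query to a name server and reports whether each gets an answer. It assumes the third-party dnspython library, and example.com and the 8.8.8.8 resolver are placeholders only; the official compliance tester linked above runs a far more thorough battery of tests.

  # Minimal EDNS sanity check - a sketch, not a replacement for the full tester.
  # Requires: pip install dnspython
  import dns.exception
  import dns.message
  import dns.query
  import dns.rcode

  def check_edns(domain, nameserver, timeout=3.0):
      # Send one query without EDNS and one with EDNS0; a compliant
      # server should answer both without timing out.
      for label, use_edns in (("plain", None), ("EDNS0", 0)):
          query = dns.message.make_query(domain, "SOA", use_edns=use_edns)
          try:
              response = dns.query.udp(query, nameserver, timeout=timeout)
              print(f"{label} query answered, rcode={dns.rcode.to_text(response.rcode())}")
          except dns.exception.Timeout:
              print(f"{label} query timed out - post flag day this looks like a dead server")

  check_edns("example.com", "8.8.8.8")   # placeholder domain and resolver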

In addition to the initial testing, it’s recommended that businesses that rely on their online presence (which really is every one of them these days) use the next three months to make sure their domain meets what’s now required of it. Organizations with multiple domains clustered on a single network in a shared server arrangement may well find there’s an increased chance of being caught up in a DDoS attack on another domain sitting near to theirs.

Also, if you’re using a third-party DNS provider, most attacks on the network won’t be aimed at you, but you’re still at risk due to being on shared hosting. VPS hosting does eliminate this risk, and VPS web hosting Canada is already a better choice for sites that need a little more ‘elbow room’ when it comes to bandwidth and such. If VPS is something that interests you, 4GoodHosting has some of the best prices on VPS hosting packages and we’ll be happy to set you up. Just ask!

DNS Amplification and DNS Flood Risks

We’re now going to see more weak domains spanning the internet than ever before, and this makes it so that there is even more opportunity for cyber criminals to exploit vulnerable DNS servers through any number of different DDoS attacks.

DNS amplification is one of them, and it involves attackers sending small look-up queries with a spoofed source IP – the IP of the target. The target is then overloaded with DNS responses far larger than it’s able to handle. The result is that legitimate DNS queries are blocked and the organization’s network is hopelessly backed up.
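
The numbers below are illustrative assumptions rather than measurements, but they show why the technique is so attractive to attackers: a tiny spoofed query can trigger a response many times its size, all of it aimed at the victim.

  # Back-of-the-envelope amplification math (illustrative figures only).
  query_bytes = 60        # a small spoofed DNS query
  response_bytes = 3000   # a much larger response, e.g. one carrying DNSSEC records

  factor = response_bytes / query_bytes
  print(f"Amplification factor: ~{factor:.0f}x")
  print(f"1 Mbps of spoofed queries becomes ~{factor:.0f} Mbps hitting the target")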

Another one is the DNS flood, which involves waves of UDP requests – generated by scripts running on compromised botnet machines – being aimed at the DNS servers hosting specific websites. These floods tie up server-side assets like memory and CPU until the servers can no longer keep up.

Layer 7 (application layer) attacks will almost certainly be on the rise now too, including those targeting DNS services with HTTP and HTTPS requests. These attacks are built to target applications with requests that look like legitimate ones, which can make them particularly difficult to detect.

What’s Next

Cyber-attacks will continue, and continue to evolve. Organizations will continue to spend time, money, and resources on security. As regards DNS, it’s now possible to corrupt and take advantage of what was once a fail-safe part of web security. The measures taken on DNS Flag Day have been put in place to address this problem, and it’s important that you now confirm your domain meets the new requirements. Again, use the link above to test yours.

There’s going to be a bit of a rough patch for some, but this is a positive step in the right direction. DNS is an essential part of the wider internet infrastructure. Entering or leaving a network is going to be less of a simple process now, but it’s the way it has to be.

Chromium Manifest V3 Updates May Disable Ad Blockers

It’s likely that a good many of you are among the thousands upon thousands of people who have an ad blocker installed for their web browser of choice. Some people use them simply to avoid the nuisance of having to watch ad after ad, and it’s people like these that have led some sites to insist you ‘whitelist’ them before you can proceed into the site. That’s perfectly understandable, as those paying advertisers are how the website generates income for the individual or business.

For others, however, who spend a great deal of the working day researching and referencing online, having to watch ads before getting to the content we need is a real drag on productivity. For us, an ad blocker is much more a tool of necessity than of convenience. Still, we get caught up in more than a few sites that insist on being whitelisted too. For me, my ad blocker is a godsend and I don’t whitelist any website or disable my ad blocker for any of them.

Here at 4GoodHosting, part of what makes us a good Canadian web hosting provider is having built up an insight into what really matters to our customers. The bulk of them are people who use the Information Superhighway as a production resource rather than web ‘surfers’ for whom it’s more of an entertainment one. That’s why today’s news is something that’s sure to be very relevant for most of our customers.

Weakened WebRequest APIs

Some of you may not know how your ad blocker works, and that’s perfectly normal. As long as it does its job, you don’t really need to know. Chromium is the open-source browser project that Google’s Chrome is built on, so changes made there eventually reach what has become most people’s web browser of choice.

However, Chromium developers in the last few weeks have shared that among the updates they are planning for Manifest V3 is one that will restrict the blocking version of the webRequest API. The alternative they’re introducing is called the declarativeNetRequest API.

After becoming aware of it, many ad blocker developers expressed their belief that the introduction of the declarativeNetRequest API will mean many already existing ad blockers won’t be ‘blocking’ much of anything anymore.

One industry expert stated on the subject, “If this limited declarativeNetRequest API ends up being the only way content blockers can accomplish their duty, this essentially means that two existing and popular content blockers like uBO and uMatrix will cease to be functional.”

What is the Manifest V3 Version?

It’s basically a mechanism through which specific capabilities can be restricted to a certain class of extensions. These restrictions are indicated in the form of either a minimum or a maximum version.

Why the Update?

Currently, the webRequest API allows extensions to intercept requests and then modify, redirect, or block them. The basic flow of handling a request using this API is as follows:

  • Chromium receives the request / queries the extension / receives the result

However, in Manifest V3 the blocking form of this API will be limited quite significantly. The non-blocking form of the API – which permits extensions to observe network requests but not modify, redirect, or block them – will not be discouraged. In addition, the exact limitations to be placed on the webRequest API have yet to be determined.

Manifest V3 is set to make the declarativeNetRequest API the primary content-blocking API in extensions. This API allows extensions to tell Chromium what to do with a given request, instead of Chromium forwarding the request to the extension, which enables Chromium to handle a request synchronously. Google insists this API is a better performer overall and provides better privacy guarantees to users – the latter of which is of course very important these days.

Consensus Among Ad Blocker Developers and Maintainers?

When informed about this coming update many developers were concerned that the change will end up completely disabling all ad blockers. The concern was that the proposed declarativeNetRequest API will result in it being impossible to develop new and functional filtering engine designs. This is because the declarativeNetRequest API is no more than the implementation of one specific filtering engine, and some ad blocker developers have commented that it’s very limited in its scope.

It’s also believed that with the declarativeNetRequest API developers will be unable to implement other features, such as blocking media elements that are larger than a set size and disabling JavaScript execution through the injection of CSP directives, among others.

Others are making the comparison to Safari’s content blocking API, which essentially puts limits on the number of admissible rules. Apple introduced that similar API recently, and the belief is that’s the direction Google is now going in too. Many seem to think that extensions written against that API are usable, but still fall well short of the full power of uBlock Origin. The hope is that this API won’t be the only option available in the foreseeable future.

Dedicated IP Addresses and SEO

Even the most layman of web endeavourers will be familiar with the acronym SEO. We imagine there are very few individuals anywhere who don’t know it stands for search engine optimization, or who don’t understand just how integral SEO is to having success in digital marketing. Most people with a small business that relies on its website for maximum visibility with prospective customers will hire an SEO professional to optimize their site. That continues to be highly recommended, and for 9 out of 10 people it is NOT something you can do effectively on your own, no matter how much you’ve read online or how many YouTube videos you’ve watched.

Here at 4GoodHosting, we are like any other top Canadian web hosting provider in that we offer SEO optimization services for our clients. Some people will think that choosing the best keywords and having them at the ideal density is most integral to good SEO, and by and large that’s true. But there are a number of smaller yet still significant factors that influence SEO, and they’ll be beyond the wherewithal of most people.

Whether websites benefit from a Dedicated IP address rather than a Shared IP address isn’t something you’ll hear discussed regularly. When you learn that the answer is yes, they do, and exactly why, it’s a switch many people will want to consider if they currently have a Shared IP address. Let’s have a look at why that is today.

What Exactly Is an IP address?

For some, we may need to start at the very beginning, so let’s begin by defining what exactly an IP address is. Any device connected to the Internet has a unique IP address, and that’s true whether it’s a PC, laptop, mobile device, or your web host’s server. It’s made up of four numbers separated by dots, each ranging from 0 to 255. Here’s an example of one:

1.25.255.255

This numerical code identifies the machine you are using. Once it’s identified – and it has to be – the Internet is then able to send data to it, and you can access the hundreds of thousands of websites along the Information Superhighway.
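
If you’re curious what address a given hostname maps to, the Python standard library can tell you in a couple of lines; the domains below are placeholders only. If several sites hosted on the same server all return the same address, they’re sharing an IP – which is exactly the arrangement discussed next.

  # Look up the IPv4 address behind a hostname (placeholder domains shown).
  import socket

  for name in ("example.com", "example.org"):     # hypothetical domains
      print(name, "->", socket.gethostbyname(name))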

What’s a Shared IP address?

In most instances, the server your web host uses to host your site will be a single machine with a matching single IP address. For most people – and nearly all who go with the most basic hosting package without giving it much thought – you’ll be set up in an arrangement where the server is hosting thousands of websites like yours. It’s not ‘dedicated’ to you and your site exclusively.

Instead, all of the websites hosted on it will be represented by the single IP address allocated to the web host’s server. Now if your website is more of a personal venture or hobby and it’s NOT going to be a leverage point in trying to secure more business, shared hosting will probably be fine. Alternately, if page rankings are a priority for you then shared hosting may be putting you at a disadvantage.

The solution? A Dedicated IP address for your Canadian website. If you need one, we can take care of that for you quickly and fairly easily. But we imagine you’ll need more convincing, so let’s move now to explaining what constitutes a Dedicated IP address.

The Dedicated IP Address

A Dedicated IP address typically goes hand in hand with having your own server, with only one website on it – yours. It is common, however, for more than one site to reside on a given server. What defines a Dedicated IP address is that it’s an IP address allocated to a single website, instead of one being assigned to the server and representing every website hosted there by default.

The Purpose of Dedicated IP Addresses

The primary appeal of Dedicated IP addresses is that they make large ecommerce operations more secure, particularly where sensitive data like credit card numbers is concerned. On a more individual scale, though, a Dedicated IP address is superior for SEO interests as well.

Why is that? Let’s list all of the reasons here:

1. Speed

When you share space, you share resources, and as far as shared web hosting and shared IP addresses are concerned that means you are sharing bandwidth. The long and short of it is all those other sites on the same server will be slowing yours down. That might be a problem in itself, but if it isn’t, then the way slow site speeds push you further down Google’s rankings will be.

Adding a unique IP address to your site will not automatically mean it loads faster, but migrating to a Dedicated Server with a Dedicated IP address definitely will. Sites with a Dedicated IP address are faster, more reliable, and more secure, and that’s a big deal.

2. SSL

For nearly 5 years now Google has been giving preference to websites that have added an SSL 2048-bit key certificate. The easiest way to see whether that’s been done or not is seeing the site’s URL change from HTTP to HTTPS. SSL sites typically utilize unique IP addresses. Google continues to insist that SSL impacts less than 1% of searches, but it’s a factor nonetheless and is another benefit of a Dedicated IP address.

SSL can make your website more trusted over public networks and can help it operate marginally faster, and the bigger benefit is that the site isn’t held back in Google’s results the way it would be without an SSL cert. The majority of ecommerce sites with a Dedicated IP address will also have an SSL cert.
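
If you want to confirm a site is serving HTTPS with a valid certificate, a short standard-library check like the sketch below will do it (the hostname is a placeholder); a failed handshake or an expired ‘notAfter’ date is your cue to get a certificate sorted out.

  # Quick HTTPS/certificate check using only the standard library.
  import socket
  import ssl

  hostname = "example.com"   # placeholder - substitute your own domain
  context = ssl.create_default_context()

  with socket.create_connection((hostname, 443), timeout=5) as sock:
      with context.wrap_socket(sock, server_hostname=hostname) as tls:
          cert = tls.getpeercert()
          print("Negotiated:", tls.version())            # e.g. TLSv1.3
          print("Certificate valid until:", cert["notAfter"])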

3. Malware

Malware is software that’s designed and disseminated for the explicit purpose of throwing wrenches into the gears of a working web system. Unfortunately, having thousands of websites on a shared server drastically increases the risk of being exposed to malware if you’re one of them. Further, when you share an IP address with any site that’s been infected with malware, your site is penalized despite the fact it’s not you who’s been infected.

In these cases, you’ll be best served by going with a Dedicated IP address and choosing a more reliable Canadian web hosting provider that has measures in place to prevent malware from making its way into the servers in the first place. A Dedicated IP means you’re standing alone, and you’re regarded accordingly.

How Do I Get a Dedicated IP Address?

If you’re with us here at 4GoodHosting, all you need to do is ask. We’ve been setting our customers up with Dedicated IP addresses for quite some time now, and you’ll find that when you do so through us it’s not nearly as pricey as you might expect.

It’s highly recommended for any ecommerce site, or one that’s used for strategic business aims, and it’s fair to say you really can’t go wrong moving to a dedicated server if you’ve made the commitment to do anything and everything to protect your SEO and your page rankings moving forward. The vast majority of people see it as a wise investment, and of course you always have the option of switching back to a shared hosting arrangement if over time you don’t see any real difference or benefit for you.

Global Environmental Sustainability with Data Centers

Last week we talked about key trends for software development expected in 2019, and today we’ll discuss another trend for the coming year that’s a bit more of a given: datacenters will have even more demands placed on their capacities as we continue to become ever more of a digital working world.

Indeed, datacenters have grown to be key partners for enterprises, rather than being just an external service utilized for storing data and business operation models. Even the smallest of issues in datacenter operations can impact business.

While datacenters are certainly the lifeblood of every business, they also have global impacts, particularly as it relates to energy consumption. Somewhere in the vicinity of 3% of total worldwide electricity consumption comes from datacenters, and to put that in perspective, that’s more than the entire power consumption of the UK.

Datacenters also account for 2% of global greenhouse gas emissions and 2% of electronic waste (aka e-waste). Many people aren’t aware of the extent to which our increasingly digital world impacts the natural one so directly, but it really does.

Like any good Canadian web hosting provider who provides the service for thousands of customers, we have extensive datacenter requirements ourselves. Most will make efforts to ensure their datacenters operate as energy-efficiently as possible, and that goes along with the primary aim – making sure those data centers are rock-solid reliable AND as secure as possible.

Let’s take a look today at what’s being done around the globe to promote environmental sustainability with data centers.

Lack of Environmental Policies

Super Micro Computer recently put out a report entitled ‘Data Centers and the Environment’, and it stated that 43% of organizations don’t have an environmental policy, with another 50% having no plans to develop any such policy anytime soon. Reasons why? High costs (29%), lack of resources or understanding (27%), and another 14% simply don’t make environmental issues a priority.

The aim of the report was to help datacenter managers better understand the environmental impact of datacenters, provide quantitative comparisons of other companies, and then in time help them reduce this impact.

Key Findings

28% of businesses take environmental issues into consideration when choosing datacenter technology

Priorities that came before it for most companies surveyed were security, performance, and connectivity, though 9% of companies considered ‘green’ technology to be the foremost priority. When it comes to actual datacenter design, however, the number of companies who put a priority on energy efficiency jumps all the way to 59%.

The Average PUE for a Datacenter is 1.89

Power Usage Effectiveness (PUE) is the ratio of the total energy consumed by a datacenter to the energy delivered to its IT equipment. The report found the average datacenter PUE is approximately 1.6, but many enterprise datacenters (over two-thirds) come in with a PUE over 2.03.

Further, it seems some 58% of companies are unaware of their datacenter’s PUE. Only a meagre 6% come in within the 1.0 to 1.19 range.
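
The calculation itself is simple, as the sketch below shows; the kilowatt-hour figures are illustrative assumptions, with 1.0 being the theoretical ideal where every watt entering the building reaches the IT equipment.

  # PUE = total facility energy / energy delivered to IT equipment.
  def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
      return total_facility_kwh / it_equipment_kwh

  print(pue(1890, 1000))   # 1.89 - the survey's reported average
  print(pue(1100, 1000))   # 1.10 - within the rare 1.0-1.19 band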

24.6 Degrees C is the Average Datacenter Temperature

It’s common for companies to run datacenters at higher temperatures to reduce strain on HVAC systems and increase savings on energy consumption and related costs. The report found 43% of the datacenters have temperatures ranging between 21 degrees C and 24 degrees C.

The primary reasons indicated for keeping datacenters at lower temperatures are reliability and performance. Hopefully these operators will soon learn that recent advancements in server technology have optimized thermal designs, and newer datacenter designs make use of free-air cooling. With them, datacenters can run at ambient temperatures up to 40 degrees C with no decrease in reliability or performance, while also improving PUE and saving costs.

Another trend in datacenter technology is immersion cooling, where server hardware is cooled by being entirely immersed in a non-conductive liquid. We can expect to see more of this type of datacenter technology rolled out this year too.

3/4 of Datacenters Have System Refreshes Within 5 Years

Datacenters and their energy consumption can be optimized by regularly updating systems and adding modern technologies that consume less power. The report found that approximately 45% of datacenter operators refresh their systems within every 3 years, while 28% of them do it every four to five years. It also seems that the larger the company, the more likely they are to do these refreshes.

8% Increase in Datacenter E-Waste Expected Each Year

It’s inevitable that electronic waste (e-waste) is created when datacenters dispose of server, storage, and networking equipment. It’s a bit of a staggering statistic when you learn that around 20 to 50 million metric tons of e-waste is disposed of every year around the world, and the main reason it’s so problematic is that e-waste deposits heavy metals and other hazardous waste into landfills. If left unchecked, and we continue to produce it as we have, e-waste disposal will increase by 8% each year.

Some companies partner with recycling companies to dispose of e-waste, and some repurpose their hardware in any one of a number of different ways. The report found that some 12% of companies don’t have a recycling or repurposing program in place, typically because of high costs, difficulty finding partners or providers in their area, and a lack of proper planning.

On a more positive note, many companies are adopting policies to address the environmental issues that stem from their datacenter operations. Around 58% of companies already have an environmental policy in place or are developing one.

We can all agree that datacenters are an invaluable resource and absolutely essential for the digital connectivity of our modern world. However, they are ‘power pigs’ as the expression goes, and it’s unavoidable that they are given the sheer volume of activity that goes on within them every day. We’ve seen how they’ve become marginally more energy efficient, and in this year to come we will hopefully see more energy efficiency technology applied to them.

Key Trends in Software Development Expected for 2019

Here we are into the first week of 2019 and as expected we’ve got a whole lot on the horizon this year in the way of software development. We live in a world that’s more and more digital all the time, and the demands put on the software development industry are pretty much non-stop in response to this ongoing shift. Often times it’s all about more efficient ‘straight lining’ of tasks as well as creating more of a can-do environment for people who need applications and the like to work smarter.

Here at 4GoodHosting, a part of what makes us a reputable Canadian web hosting provider is the way we stay abreast of developments – not only in the web hosting industry, but also in the areas of computing and computing technology that have a direct relevance for our clients.

Today we’re going to discuss the key trends in software development that are expected for this coming year.

Continuing to Come a Long Way

Look back 10 years and you’ll surely agree the changes in the types of applications and websites that have been built – as well as how they’ve been built – are really quite something. The web of 2008 is almost unrecognizable. Today it is very much an app and API economy. It was only about 10 years ago that JavaScript frameworks were the newest and best thing around, but now building for browsers exclusively is very much a thing of the past.

In 2019 the priorities placed on progressive web apps, artificial intelligence, and native app development will remain. As adoption increases and new tools emerge, we can expect to see more radical shifts in the ways we work in the digital world. There’s going to be less in the way of ‘cutting edge’ and more in the way of refinements on technology, reflecting developers now having a better understanding of how technologies can be applied.

The biggest thing for web developers now is that they need to expand upon the stack as applications become increasingly lightweight (in large part due to libraries and frameworks like Vue and React), and data grows to be more intensive, which can be attributed to the range of services upon which applications and websites depend.

Reinventing Modern JavaScript Web Development

One of the things that’s being seen is how topics that previously weren’t included under the umbrella of web development – microservices and native app development most notably – are now very much part of the need-to-know landscape.

The way many aspects of development have been simplified has forced developers to evaluate how these aspects fit together more closely. With all the layers of abstraction in modern development, the way things interact and work alongside each other becomes even more important. Having a level of wherewithal regarding this working relationship is very beneficial for any developer.

Those who’ve adapted to the new realities well will now agree that it’s no longer a case of writing the requisite code to make something run on the specific part of the application being worked on. Rather, it’s about understanding how the various pieces fit together from the backend to the front.

In 2019, developers will need to dive deeper and become inside-out familiar with their software systems. Being genuinely comfortable with backends will be an increasingly necessary starting point. Diving into the cloud and understanding that dynamic is also highly advisable. It will be wise to start playing with microservices, and rethinking and revisiting languages you thought you knew is a good idea too.

Be Familiar With infrastructure to Tackle Challenges of API development

Some will be surprised to hear it, but as the stack shrinks and the responsibilities of web developers shift we can expect that having an understanding of the architectural components within the software being built will be wholly essential.

That reality is put in place by DevOps, and essentially it has made developers responsible for how their code runs once it hits production. As a result, the requisite skills and toolchain for the modern developer is also expanding.

RESTful API Design Patterns and Best Practices

You can make your way into software architecture through a number of different avenues, but exploring API design is likely the best of them. Hands on RESTful API Design gives you a practical way into the topic.

REST is the industry standard for API design, and the diverse range of tools and approaches is making client management a potentially complex but interesting area. GraphQL, a query language developed by Facebook, is frequently talked about as a REST killer, while Redux and Relay – a pair of libraries for managing data in React applications – have both seen a significant amount of interest over the last year as key tools for working with APIs.
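
To give a flavour of what ‘REST as the industry standard’ looks like in practice, here is a minimal sketch of a read-only resource using Flask; the framework choice, route names, and data are all assumptions for illustration, not a prescribed design.

  # A minimal RESTful resource: nouns in the URL, HTTP verbs for the action.
  # Requires: pip install flask
  from flask import Flask, jsonify

  app = Flask(__name__)

  ARTICLES = {1: {"id": 1, "title": "DNS Flag Day"},
              2: {"id": 2, "title": "Manifest V3"}}

  @app.route("/api/articles", methods=["GET"])
  def list_articles():
      return jsonify(list(ARTICLES.values()))

  @app.route("/api/articles/<int:article_id>", methods=["GET"])
  def get_article(article_id):
      article = ARTICLES.get(article_id)
      if article is None:
          return jsonify(error="not found"), 404
      return jsonify(article)

  if __name__ == "__main__":
      app.run(port=5000)   # then GET http://localhost:5000/api/articles/1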

Microservices for Infrastructure Responsibility

Microservices are becoming the dominant architectural mode, and that’s the reason we’re seeing such an array of tools capable of managing APIs. Expect a whole lot more of them to be introduced this year, and be proactive in finding which ones work best for you. While you may not need to implement microservices now, if you want to be building software in 5 years’ time then you really should become thoroughly familiar with the principles behind microservices and the tools that can assist you when using them.

We can expect to see containers being one of the central technologies driving microservices. You could run microservices in a virtual machine, but as they’re harder to scale than containers you likely wouldn’t see the benefits you’ll expect from a microservices architecture. As a result, really getting to know core container technologies should also be a real consideration.

The obvious place to start is with Docker. Developers need to understand it to varying degrees, but even those who don’t think they’ll be using it immediately will agree that the real-world foundation in containers it provides will be valuable knowledge to have at some point.

Kubernetes warrants mention here as well, as it is the go-to tool that allows you to scale and orchestrate containers. It offers control over how you scale application services in a way that would have been unimaginable a decade ago.

A great way for anyone to learn how Docker and Kubernetes come together as part of a fully integrated approach to development is with Hands on Microservices with Node.js.

Continued Embracing of the Cloud

It appears the general trend is towards full stack, and for this reason developers simply can’t afford to ignore cloud computing. The levels of abstraction it offers, and the various services and integrations that come with the leading cloud services make it so that many elements of the development process are much easier.

Issues surrounding scale, hardware, setup and maintenance nearly disappear entirely when you use cloud. Yes, cloud platforms bring their own set of challenges, but they also allow you to focus on more pressing issues and problems.

More importantly, however, they open up new opportunities. First and foremost among them is that going serverless becomes a possibility. Doing so allows you to scale incredibly quickly by running everything on your cloud provider.
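
A serverless function really is as small as it sounds. The sketch below is the shape of a Python handler as AWS Lambda invokes it, with the event fields and the API-Gateway-style response being assumptions that depend on what actually triggers the function.

  # Minimal serverless handler in the shape AWS Lambda expects for Python.
  import json

  def handler(event, context):
      # 'event' carries whatever the trigger sends; 'name' here is hypothetical.
      name = event.get("name", "world")
      return {
          "statusCode": 200,                     # API Gateway proxy-style reply
          "body": json.dumps({"message": f"Hello, {name}"}),
      }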

There are other advantages too, like when you use cloud to incorporate advanced features like artificial intelligence into your applications. AWS has a whole suite of machine learning tools; AWS Lex helps you build conversational interfaces, and AWS Polly turns text into speech. Azure Cognitive Services has a nice array of features for vision, speech, language, and search.

As a developer, it’s going to be increasingly important to see the cloud as a way of expanding the complexity of applications and processes while keeping them agile. Features and optimizations that previously might have been sluggish or impossible can and should be developed as necessary and then incorporated. Leveraging AWS and Azure (among others) is going to be something that many developers do with success in the coming year.

Back to Basics with New languages & Fresh Approaches

All of this ostensible complexity in contemporary software development may lead some to think that languages don’t matter as much as they once did. It’s important to know that’s definitely not the case. Building up a deeper understanding of how languages work, what they offer, and where they come up short can make you a much more accomplished developer. Doing what it takes to be prepared is really good advice for what’s sure to be an ever-more unpredictable digital world this year and in the years to follow.

We can expect to see a trend where developers go back to a language they know and explore a new paradigm within it, or they learn a new language from scratch.

Never Time to Be Complacent

We’ll reiterate what the experts we read are saying: in just a matter of years, much of what is ‘emerging’ today will be old hat. It’s helpful to take a look at the set of skills many full stack developer job postings are requiring. You’ll see that the demands are so diverse that adaptability should be a real priority for a developer who wants to remain upwardly mobile within his or her profession. Without doubt it will be immensely valuable both for your immediate projects and future career prospects.

Top-5 Strategic Technology Trends Expected for 2019

Here we are on the final day of the year, and most will agree that 2018 has seen IT technology expand in leaps and bounds exactly as it was expected to. In truth, it seems every year brings us a whole whack of new technology trends cementing themselves in the world of IT, web, and computing development. Not surprisingly, the same is forecast for 2019.

Here at 4GoodHosting, a significant part of what makes us one of the many good Canadian web hosting providers is that we enjoy keeping abreast of these developments and then aligning our resources and services with them when it’s beneficial for our customers to do so.

Worldwide IT spending for 2019 is projected to be in the vicinity of $3.8 trillion. That will be a 3.2% increase from the roughly $3.7 trillion spent this year. That’s a LOT of money going into the research and development shaping the digital world that’s so integral to the professional and personal lives of so many of us.

So for the last day of 2018 let’s have a look at the top 5 strategic technology trends we can expect to become the norm over the course of the year that starts tomorrow.

  1. Autonomous Things

We’ve all heard the rumblings that we’re on the cusp of the robot age, and it seems that may be true. Autonomous things like robots, drones and autonomous vehicles use AI to automate functions that were previously performed by humans. This type of automation goes beyond that provided by rigid programming models, with these automated things using AI to deliver advanced behaviours that come from interacting more naturally with their surroundings and with people – when necessary.

The proliferation of autonomous things will constitute a real shift from stand-alone intelligent things to collections of them that will collaborate very intelligently. Multiple devices will work together, and without human input if it’s not required – or not conducive to more cost-effective production or maintenance.

The last part of that is key, as the way autonomous things can reduce production costs by removing the employee cost from the production chain wherever possible is going to have huge ramifications for unskilled labour. As the saying goes – you can’t stop progress.

  2. Augmented Analytics

Augmented analytics is a specific area of augmented intelligence, and what’s most relevant here is the way we’ll see it start to use machine learning (ML) to transform how analytics content is developed, shared, and consumed. The forecast is that augmented analytics capabilities will quickly move into mainstream adoption and affix themselves as a key feature of data preparation, data management, process mining, modern analytics, data science platforms and business process management.

We can also expect to see automated insights from augmented analytics being embedded in enterprise applications. Look for HR, finance, marketing, customer service, sales, and asset management departments to be optimizing the decisions and actions of all employees within their context. These insights from analytics will no longer be utilized by analysts and data scientists exclusively.

The way augmented analytics will automate data preparation, insight generation and insight visualization, plus eliminate the need for professional data scientists in many cases, promises to be a huge paradigm shift too. It’s expected that through 2020 the number of citizen data scientists will grow five times faster than the number of ‘industry-expert’ data scientists, and these citizen data scientists will then fill the data science and machine learning talent gap resulting from the shortage and high cost of traditional data scientists.

  3. AI-Driven Development

We should also expect to see the market shift from the old way, where professional data scientists would partner with application developers to create most AI-enhanced solutions, to a newer one where a professional developer can operate on their own using predefined models delivered as a service. The developer is now provided with an ecosystem of AI algorithms and models, along with development tools tailored to integrating AI capabilities and models into workable solutions that weren’t reachable before.

AI being applied to the development process itself leads to another opportunity for professional application development that serves the aim to automate various data science, application development and testing functions. 2019 will be the start of a 3-year window where it’s forecast that at least 40% of new application development projects will have AI co-developers working within the development team.

  4. Digital Twins

Much as the name suggests, a digital twin is a digital representation of a real-world entity or system, and we can expect them to start being increasingly common over the coming year. So much so in fact that by 2020 it is estimated that there will be more than 20 billion connected sensors and endpoints serving digital twins working on millions and millions of different digital tasks.

These digital twins will be deployed simply at first, but we can expect them to evolve over time, gaining ever-greater abilities to collect and visualize the right data, apply the right analytics and rules, and respond effectively to business objectives.

Digital twins of organizations will help drive efficiencies in business processes, plus create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically, and we can look for this trend to really start picking up steam in 2019.

  5. Immersive Experience

The last trend we’ll touch on here today is the one that most people will be able to relate to on an everyday level. We’re all seeing the changes in how people interact with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are revolutionizing the way people interact with the digital world, as well as how they regard it overall. It is from this combined shift in perception and interaction models that future immersive user experiences will be shaped.

2019 should see thinking continue to move past individual devices and fragmented user interface (UI) technologies toward a multichannel and multimodal experience. The relevance of it all will be in how the experience connects people with the digital world across the hundreds of edge devices surrounding them – traditional computing devices, wearables, automobiles, environmental sensors and consumer appliances will all increasingly be part of the ‘smart’ device crowd as we move forward.

In the bigger picture, this multi-experience environment will create an ambient experience where the spaces that surround us create a ‘digital entirety’ rather than the sum of individual devices working together. In a sense it will be like the environment itself is the digital processor.

We’ll discuss more about what’s forecast to be in store for web hosting and computing in 2019 in the following weeks, but for now we’d like to say Happy New Year to you, and we continue to appreciate your choosing us as your web hosting provider. Here’s to a positive and productive coming year for all of you.


Google Chrome Solution for ‘History Manipulation’ On Its Way

No one needs to be convinced that there’s a massive number of shady websites out there designed to ensnare you for any number of no-good purposes. Usually you’re rerouted to them when you take a seemingly harmless action, and then often you’re unable to back yourself out of the site once you’ve unwillingly landed on it. Nobody wants to be on these spammy or malicious pages, and you’re stressing out every second longer that you’re there.

The well-being of web surfers who also happen to be customers or friends here at 4GoodHosting is important to us, and being proactive in sharing all our wisdom about anything and everything related to the web is part of what makes us one of the best Canadian web hosting providers.

It’s that aim that has us sharing this news with you here today – that Google understands the unpleasantness that comes with being locked into a website and has plans to make it remediable pretty quickly.

The first time something like this occurs you’ll almost certainly be clicking on the back button repeatedly before realizing it’s got no function. Eventually you’ll come to realize that you’ve got no recourse other than to close the browser, and oftentimes you’ll quit Chrome altogether ASAP and then launch it again for fear of inheriting a virus or something of the sort from the nefarious site.

How History Manipulation Works, and What Google is Doing About It

You’ll be pleased to hear the Chrome browser will soon be armed with specific protection measures to prevent this from happening. The way the ‘back’ button gets broken here is something the Chrome team calls ‘history manipulation’. What it involves is the malicious site stacking dummy pages onto your browsing history, and these work to bounce you straight back to the unintended destination page you were trying to get away from.

Fortunately, Chrome developers aren’t letting this slide. There are upcoming changes to Chromium’s code which will facilitate the detection of these dummy history entries and then flag sites that use them.

The aim is to have Chrome ignore these false history entries entirely, so that you’re not buried in a site you had no intention of landing on and the back button functions just as you expect it to.

This development is still in its formative stages, and we should be aware that these countermeasures aren’t even in the pre-release test versions of Chrome yet. However, industry insiders report that testing should begin within the next few weeks or so, and all signs point towards the new feature being part of the full release version of the web browser.

In addition, this being a change to the Chromium engine makes it so that it may eventually benefit other browsers based on it. Most notable of these is Microsoft Edge, making it so that the frustrations of a paralyzed back button will be a thing of the past for either popular web browser. So far there’s no industry talk of Apple doing the same for Safari, but one can imagine they’ll be equally on top of this in much the same way.

Merry Christmas from 4GoodHosting

Given it’s the 24th of December here we of course would like to take this opportunity to wish a Merry Christmas to one and all. We hope you are enjoying the holidays with your family and this last week of 2018 is an especially good one. We can reflect on 2018, and look forward to an even more prosperous year in 2019.

Happy Holidays and best wishes, from all of us to all of you!

Why 64-Bit is Leaving 32-Bit in the Dust with Modern Computing

Having to choose between 32-bit and 64-bit options when downloading an app or installing a game is pretty common, and many PCs will have a sticker on them that reads 64-bit processor. You’ll be hard pressed to find a sticker on one that reads 32-bit. It’s pretty easy to conclude, like you do with most things, that more is better, but why is that exactly? Unless you’re a genuinely computer-savvy individual, you won’t know the real significance of the difference between the two.

There is some meat to that though, and here at 4GoodHosting, as a top Canadian web hosting provider, we try to keep our thumb on the pulse of the web hosting and computing world. Having a greater understanding of what exactly is ‘under the hood’ of your desktop or notebook, and what’s advantageous – or not – about it, is helpful. So let’s have a look at the important difference between 32-bit and 64-bit computing today.

Why Bits Matter

First and foremost, it’s about capability. As you might expect, a 64-bit processor is more capable than a 32-bit processor, and primarily because it can handle more data at once. A greater number of computational values can be taken on by a 64-bit processor and this includes memory addresses. This means it’s able to access over four billion times the physical memory of a 32-bit processor. With the ever-greater memory demands of modern desktop and notebook computers, that’s a big deal.

The key practical difference, though, is something else: 32-bit processors can handle only a limited amount of RAM (in Windows, 4GB or less) without difficulty, while 64-bit processors can take on much more. The ability to do this, however, depends on your operating system being able to take advantage of that greater access to memory. Run Windows 10 or later on a 64-bit PC and you won’t need to worry about the limits.
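
The 4GB ceiling isn’t arbitrary; it falls straight out of the arithmetic, as this quick back-of-the-envelope calculation shows.

  # How many distinct memory addresses each word size can form.
  addresses_32 = 2 ** 32    # 4,294,967,296 addresses -> 4 GB
  addresses_64 = 2 ** 64    # about 1.8e19 addresses  -> 16 exabytes in theory

  print(f"32-bit: {addresses_32 / 2**30:.0f} GB addressable")
  print(f"64-bit: {addresses_64 / 2**60:.0f} EB addressable")
  print(f"That's {addresses_64 // addresses_32:,} times the address space")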

The proliferation of 64-bit processors and larger capacities of RAM have led both Microsoft and Apple to upgrade versions of their operating systems now designed to take full advantage of the new technology. OS X Snow Leopard for Mac was the first fully 64-bit operating system to arrive, nearly 10 years ago in 2009. iPhone was the first smartphone with a 64-bit chip, the Apple A7.

Basic versions of the Microsoft Windows OS have software limitations on the amount of RAM available for use by applications. Even in the Ultimate and Professional versions of the operating system, 4GB is the maximum usable memory for the 32-bit edition. Before you conclude that going 64-bit is the route to nearly unlimited processing capability, however, understand that any real jump in power comes from software designed to operate within that architecture.

Designed to Make Use of Memory

These days, the recommendation is that you shouldn’t have less than 8GB of RAM to make the best use of applications and video games designed for 64-bit architecture. This is especially useful for programs that can store a lot of information for immediate access, and ones that regularly open multiple large files at the same time.

Another plus is that most software is backwards compatible, which allows you to run 32-bit applications in a 64-bit environment without performance issues or extra work on your part. There are exceptions to this, the most notable of them being virus protection software and drivers. These usually require the version that matches your hardware if they’re going to function properly.

Same, But Different

There’s likely no better example of this difference than one found right within your file system. If you’re a Windows user, you’ve likely noticed that you have two Program Files folders; the first is labeled Program Files, while the other is labeled Program Files (x86).

Applications installed on a Windows system share resources (called DLL files), and how these are structured depends on whether they’re used for 64-bit applications or 32-bit ones. Should a 32-bit application reach out for a DLL and discover that it’s a 64-bit version, it’ll respond quite simply in one way – by refusing to run.

32-bit (x86) architecture has been in use for a good long time now, and there are still plenty of applications that run on it. How they run on some platforms is changing, however. Modern 64-bit systems can run 32-bit and 64-bit software precisely because they have two separate Program Files directories. 32-bit applications are shuffled off to the appropriate x86 folder, and Windows then responds by serving up the right DLL – the 32-bit version in this case. Applications in the regular Program Files directory, on the other hand, are served the 64-bit content.
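
A quick way to see which side of the fence a program sits on is to ask the runtime itself; in Python that takes two lines, and the same idea applies to any application quietly refusing to load a DLL of the wrong flavour.

  # Is this interpreter a 32-bit or 64-bit build?
  import platform
  import sys

  print(platform.architecture()[0])   # '32bit' or '64bit'
  print(sys.maxsize > 2 ** 32)        # True on a 64-bit build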

Naturally, we can expect 32-bit computing architecture to go the way of the Dodo bird before long, but it’s interesting to note that the superiority of 64-bit comes from more than just a doubling of bits between the two.


Top 5 Programming Languages for Taking On Big Data

In today’s computing world, ‘big data’ – data sets that are too large or complex for traditional data-processing application software – is increasingly common, and having the ability to work with it is increasingly an expected requirement of IT professionals. One of the most important decisions these individuals have to make is deciding on a programming language for big data manipulation and analysis. More is now required than simply understanding big data and framing the architecture to solve it. Choosing the right language means you’re able to execute effectively, and that’s very valuable.

As a proven reliable Canadian web hosting provider, here at 4GoodHosting we are naturally attuned to developments in the digital world. Although we didn’t know what it would come to be called, we foresaw the rise of big data, but we didn’t entirely foresee just how much influence it would have for all of us who take up some niche in information technology.

So with big data becoming even more of a buzz term every week, we thought we’d put together a blog about what seems to be the consensus on the top 5 programming languages for working with Big Data.

Best languages for big data

All of these 5 programming languages make the list because they’re both popular and deemed to be effective.

Scala

Scala blends object-oriented and functional programming paradigms very nicely, and is fast and robust. It’s a popular language choice for many IT professionals needing to work with big data. Another testament to its functionality is that both Apache Spark and Apache Kafka have been built on top of Scala.

Scala runs on the JVM, meaning that code written in Scala can be easily incorporated within a Java-based Big Data ecosystem. A primary factor differentiating Scala from Java is that Scala is a lot less verbose. What would take hundreds of lines of confusing-looking Java code can be done in 15 or so lines of Scala. One drawback attached to Scala, though, is its steep learning curve, especially compared to languages like Go and Python, and in some cases this difficulty puts off beginners looking to use it.

Advantages of Scala for Big Data:

  • Fast and robust
  • Suitable for working with Big Data tools like Apache Spark for distributed Big Data processing
  • JVM compliant, can be used in a Java-based ecosystem

Python

Python’s been earmarked as one of the fastest growing programming languages of 2018, and it benefits from the way its general-purpose nature allows it to be used across a broad spectrum of use-cases. Big Data programming is one of the primary ones.

Many Python-based libraries for data analysis and manipulation – pandas, NumPy, and SciPy among them – are being used within Big Data frameworks to clean and manipulate large chunks of data ever more frequently. In addition, most popular machine learning and deep learning frameworks like Scikit-learn, Tensorflow and others are written in Python too, and are being applied within the Big Data ecosystem much more often.

One negative for Python, however, is its slowness, which is one reason why it’s not yet an established Big Data programming language. While it is indisputably easy to use, Big Data professionals have found systems built with languages such as Java or Scala to be faster and more robust.

Python makes up for this by going above and beyond with other qualities. It is primarily a scripting language, so interactive coding and development of analytical solutions for Big Data is made easy as a result. Python also has the ability to integrate effortlessly with existing Big Data frameworks – Apache Hadoop and Apache Spark most notably – which allows you to perform predictive analytics at scale without any problem (see the short sketch after the list below).

Advantages of Python for big data:

  • General-purpose
  • Rich libraries for data analysis and machine learning
  • Ease of use
  • Supports iterative development
  • Rich integration with Big Data tools
  • Interactive computing through Jupyter notebooks
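
To give a small taste of the workflow those libraries enable, here’s a pandas sketch of the usual load-clean-aggregate pattern; the CSV file and column names are hypothetical.

  # Load, clean, and aggregate a chunk of data with pandas.
  # Requires: pip install pandas
  import pandas as pd

  df = pd.read_csv("events.csv")                    # hypothetical input file
  df = df.dropna(subset=["user_id"])                # drop incomplete rows

  revenue_by_country = (df.groupby("country")["revenue"]
                          .sum()
                          .sort_values(ascending=False))
  print(revenue_by_country.head(10))                # top 10 markets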

R

Those of you who put a lot of emphasis on statistics will love R. It’s referred to as the ‘language of statistics’, and is used to build data models which can be implemented for effective and accurate data analysis.

Large repositories of R packages (CRAN, the Comprehensive R Archive Network) set you up with pretty much every type of tool you’d need to accomplish any task in Big Data processing. From analysis to data visualization, R makes it all doable. It can be integrated seamlessly with Apache Hadoop, Apache Spark and most other popular frameworks used to process and analyze Big Data.

The easiest flaw to find with R as a Big Data programming language is that it’s not much of a general-purpose language. Code written in R is generally not production-deployable and has to be translated to some other programming language like Python or Java. For building statistical models for Big Data analytics, however, R is hard to beat overall.

Advantages of R for big data:

  • Ideally designed for data science
  • Support for Hadoop and Spark
  • Strong statistical modelling and visualization capabilities
  • Support for Jupyter notebooks

Java

Java is the proverbial ‘old reliable’ of programming languages for big data. Many of the traditional Big Data frameworks, like Apache Hadoop and the collection of tools within its ecosystem, are based in Java and still used in many enterprises today. This goes along with the fact that Java is the most stable and production-ready of the four languages we’ve covered here so far.

Java’s primary advantage is its large ecosystem of tools and libraries for interoperability, monitoring and much more, the bulk of which have already proven themselves trustworthy.

Java’s verbosity is its primary drawback. Having to write hundreds of lines of code for a task that would require only 15-20 lines in Python or Scala is a big minus for many developers, though the lambda expressions introduced in Java 8 do counter this somewhat. Another consideration is that Java, unlike newer languages such as Python, doesn’t lend itself to the same kind of interactive, iterative development, although it is expected that future releases of Java will continue to address this.

Java’s history and the continued reliance on traditional Big Data tools and frameworks mean that Java is unlikely to ever be displaced from the list of preferred Big Data languages.

Advantages of Java for big data:

  • Array of traditional Big Data tools and frameworks written in Java
  • Stable and production-ready
  • Large ecosystem of tried & tested tools and libraries

Go

Last but not least here is Go, one of the programming languages that’s gained a lot of ground recently. Designed by a group of Google engineers who had become frustrated with C++, Go is worthy of consideration simply because it powers many tools used in Big Data infrastructure, including Kubernetes, Docker and several others.

Go is fast, easy to learn, and applications written in it are fairly easy to both develop and deploy. What may be more relevant, though, is that as businesses look at building data analysis systems that can operate at scale, Go-based systems are a great fit for integrating machine learning and undertaking parallel processing of data. The fact that other languages can be interfaced with Go-based systems with relative ease is a big plus too.

Advantages of Go for big data:

  • Fast and easy to use
  • Many tools used in the Big Data infrastructure are Go-based
  • Efficient distributed computing

A few other languages deserve honourable mentions here too – Julia, SAS and MATLAB being the most notable ones – but the five we’ve covered edge them out on speed, efficiency, ease of use, documentation, or community support, among other things.

Which Language is Best for You?

This really depends on the use-case you’ll be developing. If your focus is hardcore data analysis involving a lot of statistical computing, R would likely be your best choice. On the other hand, if your aim is to develop streaming applications, Scala is your guy. If you’ll be using machine learning to leverage Big Data and develop predictive models, Python is probably best. And if you’re building Big Data solutions with traditionally-available tools, you shouldn’t stray from the old faithful – Java.

Combining the power of two languages to get a more efficient and powerful solution might be an option too. For example, you can train your machine learning model in Python and then deploy it with Spark in distributed mode. Ultimately it will all come down to how efficiently your solution functions, and more importantly, how quickly and accurately it does its work.
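As a rough illustration of that mix-and-match approach, here’s a minimal sketch – assuming scikit-learn and PySpark are installed, and using made-up file and column names – of training a model locally in Python and then scoring a much larger dataset across a Spark cluster:

  import pandas as pd
  from pyspark.sql import SparkSession
  from sklearn.linear_model import LogisticRegression

  spark = SparkSession.builder.appName("train-local-score-distributed").getOrCreate()

  # Train locally on a modest sample (hypothetical file and columns)
  sample = pd.read_csv("sample.csv")
  model = LogisticRegression().fit(sample[["feature_a", "feature_b"]].values, sample["label"])

  # Broadcast the fitted model so every Spark executor gets a copy
  bc_model = spark.sparkContext.broadcast(model)

  def score_partition(rows):
      # Runs on the executors: apply the broadcast model to each row
      for row in rows:
          features = [[row["feature_a"], row["feature_b"]]]
          yield (row["id"], float(bc_model.value.predict(features)[0]))

  # Score the full dataset in distributed fashion
  big_df = spark.read.csv("big_dataset.csv", header=True, inferSchema=True)
  predictions = big_df.rdd.mapPartitions(score_partition).toDF(["id", "prediction"])
  predictions.show(5)

This is only a teaching sketch – in practice you’d batch the predictions per partition rather than calling the model row by row – but it shows how naturally the two ecosystems fit together.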

 

The Surprising Ways We Can Learn About Cybersecurity from Public Wi-Fi

A discussion of cybersecurity isn’t exactly a popular topic of conversation for most people, but those same people would likely gush at length if asked about how fond of public wi-fi connections they are! That’s a reflection of our modern world it would seem; we’re all about digital connectivity, but the potential for that connectivity to go sour on us is less of a focus of our attention. That is until it actually does go sour on you, of course, at which point you’ll be wondering why more couldn’t have been done to keep your personal information secure.

Here at 4GoodHosting, cybersecurity is a big priority for us the same way it should be for any of the best Canadian web hosting providers. We wouldn’t have it any other way, and we do work to keep abreast of all the developments in the world of cybersecurity, and in particular these days as it pertains to cloud computing. We recently read a very interesting article about how our preferences for the ways we (meaning the collective whole of society) use public wi-fi can highlight some of the natures and needs related to web security, and we thought it would be helpful to share it and expand on it for you with our blog this week.

Public Wi-Fi and Its Perils

Free, public Wi-Fi is a real blessing for us when mobile data is unavailable, or scarce as is often the case! Yet few people can really articulate exactly what the risks of using public wi-fi are, or how we can protect ourselves from them.

Let’s start with this: when you join a public hotspot without protection and begin to access the internet, the packets of data moving from your device to the router are public and thus open to interception by anyone. Yes, SSL/TLS technology exists, but all that’s required for a cybercriminal to snoop on your connection is some relatively simple Linux software that he or she can find online without much fuss.

Let’s take a look at some of the attacks that you may be subjected to due to using a public wi-fi network on your mobile device:

Data monitoring

Wi-fi adapters are usually set to ‘managed’ mode, where the adapter acts as a standalone client connecting to a single router for Internet access, and the interface ignores all data packets except those explicitly addressed to it. However, some adapters can be configured into other modes. In ‘monitor’ mode the adapter captures all wireless traffic on a given channel, no matter who the source or intended recipient is, and it can do so without even being connected to a router. In other words, it can sniff and snoop on every piece of data it can get its hands on.

It should be noted that not all commercial wi-fi adapters are capable of this, as it’s cheaper for manufacturers to produce models that handle ‘managed’ mode exclusively. Still, should someone get their hands on one and pair it with some simple Linux software, they’ll then be able to see which URLs you are loading plus the data you’re providing to any website not using HTTPS – names, addresses, financial accounts and so on. That’s obviously going to be a problem for you.
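To make the point concrete without getting into anything nefarious: with an adapter already switched into monitor mode and the third-party scapy library, a few lines of Python are enough to observe the raw wireless frames flying past. The interface name below is an assumption (monitor-mode interfaces are often exposed as something like wlan0mon), and this sketch only tallies frames per transmitter rather than reading anyone’s data:

  # Requires scapy, root privileges, and an adapter already in monitor mode
  from collections import Counter
  from scapy.all import sniff, Dot11

  frame_counts = Counter()

  def tally(packet):
      # Count frames per transmitting address; payloads are deliberately ignored
      if packet.haslayer(Dot11) and packet.addr2:
          frame_counts[packet.addr2] += 1

  # Capture 200 frames from the (assumed) monitor-mode interface, then report
  sniff(iface="wlan0mon", prn=tally, store=False, count=200)
  print(frame_counts.most_common(5))

The point isn’t the script itself – it’s that the barrier to passively watching unencrypted wireless traffic really is this low.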

Fake Hotspots

Snaring unencrypted data packets out of the air is definitely a risk of public wi-fi, but it’s certainly not the only one. When connecting to an unprotected router, you are giving your trust to the supplier of that connection. Usually that trust is warranted – your local Tim Hortons probably takes no interest in your private data – but being careless when connecting to public routers means that cybercriminals can easily set up a fake network designed to lure you in.

Once this illegitimate hotspot has been created, all of the data flowing through it can then be captured, analysed, and manipulated. One of the most common choices here is to redirect your traffic to an imitation of a popular website. This clone site will serve one purpose; to capture your personal information and card details in the same way a phishing scam would.

ARP Spoofing

The reality, unfortunately, is that cybercriminals don’t even need a fake hotspot to mess with your traffic. Every device on a Wi-Fi or Ethernet network has a unique MAC address, an identifying code used to ensure data packets make their way to the correct destination. Routers and all other devices discover this information using the Address Resolution Protocol (ARP).

Take this example: your smartphone sends out a request asking which device on the network is associated with a certain IP address, and the device in question replies with its MAC address, ensuring the data packets are physically directed to what’s determined to be the correct destination. The problem is that ARP replies can be impersonated, or ‘spoofed’. Your smartphone might send a request for the address of the public wi-fi router, and a different device can answer with a false address.

Provided the signal of the false device is stronger than that of the legitimate one, your smartphone will be fooled. Again, this can be done with simple Linux software.

Once the spoofing has taken place, all of your data will be sent to the false router, which can subsequently manipulate the traffic however it likes.
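For the more technically inclined, here’s a hedged sketch of a simple sanity check you could run – it uses the third-party scapy library and a placeholder gateway IP, and it’s a teaching aid rather than a production detector. It broadcasts an ARP “who-has” request for the gateway address and warns if more than one device answers, which is a classic sign that something is impersonating the router:

  # Requires the scapy package and usually root/administrator privileges
  from scapy.all import ARP, Ether, srp

  GATEWAY_IP = "192.168.1.1"  # placeholder: your public hotspot's gateway

  # Broadcast an ARP "who-has" request for the gateway address
  request = Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=GATEWAY_IP)
  answered, _ = srp(request, timeout=2, verbose=False)

  # Collect every MAC address that claims to be the gateway
  macs = {reply.hwsrc for _, reply in answered}

  if len(macs) > 1:
      print("Warning: multiple devices answered for the gateway -", macs)
  elif macs:
      print("Gateway MAC:", macs.pop())
  else:
      print("No reply received")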

MitM – ‘Man-in-the-Middle’ Attacks

A man-in-the-middle attack (MITM) is a reference to any malicious action where the attacker secretly relays communication between two parties, or alters it for whatever malevolent reason. On an unprotected connection, a cybercriminal can modify key parts of the network traffic, redirect this traffic elsewhere, or fill an existing packet with whatever content they wish.

Examples of this could be displaying a fake login form or website, changing links, text, pictures, or more. Unfortunately, this isn’t difficult to do; an attacker within reception range of an unencrypted wi-fi point is able to insert themselves all too easily much of the time.

Best Practices for Securing your Public Wi-Fi Connection

The ongoing frequency of these attacks definitely serves to highlight the importance of basic cybersecurity best practices. Following these ones will counteract most public wi-fi threats effectively:

  1. Have Firewalls in Place

An effective firewall will monitor and block any suspicious traffic flowing between your device and a router. Yes, you should always have a firewall in place and your virus definitions updated as a means of protecting your device from threats you have yet to come across.

While it’s true that properly configured firewalls can effectively block some attacks, they’re not a 100% reliable defender, and you’re definitely not exempt from danger just because of them. They primarily help protect against malicious traffic rather than malicious programs, and one of the most frequent instances where they don’t protect you is when you’re unknowingly running malware. Firewalls should always be paired with other protective measures, with antivirus software being the best of them.

  2. Software updates

Software and system updates are also biggies, and they should be installed as soon as you’re able to do so. Staying up to date with the latest security patches is a proven way to defend yourself against existing and easily-exploited system vulnerabilities.

  3. Use a VPN

No matter whether you’re a regular user of public Wi-Fi or not, a VPN is an essential security tool that you can put to work for you. VPNs serve you here by generating an encrypted tunnel that all of your traffic travels through, ensuring your data is secure regardless of the nature of the network you’re on. If you have reason to be concerned about your security online, a VPN is arguably the best safeguard against the risks posed by open networks.

That said, free VPNs are not recommended, because many of them have been known to monitor and sell users’ data to third parties. You should choose a service provider with a strong reputation and a strict no-logging policy.

  4. Use common sense

You shouldn’t fret too much over hopping onto public Wi-Fi without a VPN, as the majority of attacks can be avoided by adhering to a few tried-and-true safe computing practices. First, avoid making purchases or visiting sensitive websites like your online banking portal. In addition, it’s best to stay away from any website that doesn’t use HTTPS; the popular browser extension HTTPS Everywhere can help you here. Make use of it!
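The same HTTPS rule applies if you ever write scripts or apps that send data over public networks. Here’s a minimal sketch using the widely-used Python requests library (the URL is a placeholder): it refuses plain HTTP outright and lets certificate verification, which requests performs by default, fail loudly rather than silently accepting a tampered connection:

  import requests

  URL = "https://example.com/account"  # placeholder address

  # Refuse to talk over plain HTTP at all
  if not URL.startswith("https://"):
      raise ValueError("Refusing to send data over an unencrypted connection")

  try:
      # Certificate verification is on by default; never pass verify=False
      response = requests.get(URL, timeout=10)
      response.raise_for_status()
      print("Secure connection established:", response.status_code)
  except requests.exceptions.SSLError:
      print("Certificate check failed - the connection may be intercepted")
  except requests.exceptions.RequestException as err:
      print("Request failed:", err)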

The majority of modern browsers also now have in-built security features that are able to identify threats and notify you if they encounter a malicious website. Heed these warnings.

Go ahead and make good use of public Wi-Fi and all the email checking, web browsing, and social media socializing goodness these networks offer – just be sure that you’re not putting yourself at risk while doing so.