Top-5 Strategic Technology Trends Expected for 2019

Reading Time: 5 minutes

Here we are on the final day of the year, and most will agree that 2018 has seen IT technology expand in leaps and bounds exactly as it was expected to. In truth, it seems every year brings us a whole whack of new technology trends cementing themselves in the world of IT, web, and computing development. Not surprisingly, the same is forecast for 2019.

Here at 4GoodHosting, a significant part of what makes us one of the many good Canadian web hosting providers is that we enjoy keeping abreast of these developments and then aligning our resources and services with them when it’s beneficial for our customers to do so.

Worldwide IT spending for 2019 is projected to be in the vicinity of $3.8 trillion. That would be a 3.2% increase from the roughly $3.7 trillion spent this year. That’s a LOT of money going into the research and development shaping the digital world that’s so integral to the professional and personal lives of so many of us.

So for the last day of 2018, let’s have a look at the top 5 strategic technology trends we can expect to become the norm over the course of the year that’ll start tomorrow.

  1. Autonomous Things

We’ve all heard the rumblings that we’re on the cusp of the robot age, and it seems that may be true. Autonomous things like robots, drones and autonomous vehicles use AI to automate functions previously performed by humans. This type of automation goes beyond that provided by rigid programming models; these automated things use AI to deliver advanced behaviours shaped by interacting more naturally with their surroundings and with people – when necessary.

The proliferation of autonomous things will constitute a real shift from stand-alone intelligent things to collections of them that will collaborate very intelligently. Multiple devices will work together, and without human input if it’s not required – or not conducive to more cost-effective production or maintenance.

The last part of that is key, as the way autonomous things can reduce production costs by removing the employee cost from the production chain wherever possible is going to have huge ramifications for unskilled labour. As the saying goes – you can’t stop progress.

  2. Augmented Analytics

Augmented analytics is a specific area of augmented intelligence, and what’s most relevant to our discussion here is the way it will start to use machine learning (ML) to transform how analytics content is developed, shared, and consumed. The forecast is that augmented analytics capabilities will quickly reach mainstream adoption and affix themselves as a key feature of data preparation, data management, process mining, modern analytics, data science platforms and business process management.

We can also expect to see automated insights from augmented analytics being embedded in enterprise applications. Look for HR, finance, marketing, customer service, sales, and asset management departments to be optimizing the decisions and actions of all employees within their context. These insights from analytics will no longer be utilized exclusively by analysts and data scientists.

The way augmented analytics will automate the data preparation, insight generation and insight visualization processes, plus eliminate the need for professional data scientists in many cases, promises to be a huge paradigm shift too. It’s expected that through 2020 the number of citizen data scientists will grow five times faster than the number of ‘industry-expert’ data scientists, and these citizen data scientists will then fill the data science and machine learning talent gap resulting from the shortage and high cost of traditional data scientists.

  3. AI-Driven Development

We should also expect to see the market shift from the old model, where professional data scientists partnered with application developers to create most AI-enhanced solutions, to a newer one where a professional developer can operate alone using predefined models that are now delivered as a service. The developer is now provided with an ecosystem of AI algorithms and models, along with development tools tailored to integrating AI capabilities and models into workable solutions that weren’t reachable before.

AI being applied to the development process itself creates another opportunity for professional application development, one that serves to automate various data science, application development and testing functions. 2019 will be the start of a 3-year window in which it’s forecast that at least 40% of new application development projects will have AI co-developers working within the development team.

  4. Digital Twins

Much as the name suggests, a digital twin is a digital representation of a real-world entity or system, and we can expect them to start being increasingly common over the coming year. So much so in fact that by 2020 it is estimated that there will be more than 20 billion connected sensors and endpoints serving digital twins working on millions and millions of different digital tasks.

These digital twins will be deployed simply at first, but we can expect them to evolve over time, gaining ever-greater abilities to collect and visualize the right data, apply the right analytics and rules, and respond effectively to business objectives.

Organizational digital twins will help drive efficiencies in business processes, plus create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically, and we can look for this trend to really start picking up steam in 2019.
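To make the concept a little more concrete, here’s a minimal sketch in Python of what a simple digital twin might look like; the pump, its sensor readings, and the maintenance threshold are all hypothetical, purely for illustration.

```python
# A minimal, hypothetical sketch of the digital twin idea: a software object
# that mirrors the live state of a physical asset from sensor readings and
# applies a simple rule against a business objective.

class PumpTwin:
    """Digital twin of a (hypothetical) industrial pump."""

    MAX_SAFE_TEMP_C = 80.0  # assumed threshold, for illustration only

    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.readings = []  # history of (timestamp, temperature) pairs

    def ingest(self, timestamp, temperature_c):
        """Mirror a new sensor reading into the twin's state."""
        self.readings.append((timestamp, temperature_c))

    def needs_maintenance(self):
        """A trivial rule: flag the asset if it is trending too hot."""
        recent = [temp for _, temp in self.readings[-5:]]
        return bool(recent) and sum(recent) / len(recent) > self.MAX_SAFE_TEMP_C

twin = PumpTwin("pump-042")
twin.ingest("2019-01-01T00:00:00", 82.5)
print(twin.needs_maintenance())  # True once readings trend past the threshold
```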

  5. Immersive Experience

The last trend we’ll touch on here today is the one most people will be able to relate to on an everyday level. We’re all seeing the changes in how people interact with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are revolutionizing the way people interact with the digital world, as well as how they regard it overall. It is from this combined shift in perception and interaction models that future immersive user experiences will be shaped.

2019 should see thinking move beyond individual devices and fragmented user interface (UI) technologies toward a multichannel and multimodal experience. The relevance of it all will be in how the experience connects people with the digital world across the hundreds of edge devices surrounding them – traditional computing devices, wearables, automobiles, environmental sensors and consumer appliances will all increasingly be part of the ‘smart’ device crowd as we move forward.

In the bigger picture, this multi-experience environment will create an ambient experience where the spaces that surround us create a ‘digital entirety’ rather than the sum of individual devices working together. In a sense it will be like the environment itself is the digital processor.

We’ll discuss more of what’s forecast to be in store for web hosting and computing in 2019 in the following weeks, but for now we’d like to say Happy New Year to you, and we continue to appreciate your choosing us as your web hosting provider. Here’s to a positive and productive coming year for all of you.


Google Chrome Solution for ‘History Manipulation’ On Its Way

Reading Time: 3 minutes

No one needs to be convinced that there’s a massive number of shady websites out there designed to ensnare you for any number of no-good purposes. Usually you’re rerouted to them when you take a seemingly harmless action, and then often you’re unable to back yourself out of the site once you’ve unwillingly landed on it. Nobody wants to be on these spammy or malicious pages, and you’re stressing out every second longer that you’re there.

The well-being of web surfers who also happen to be customers or friends here at 4GoodHosting is important to us, and being proactive in sharing all our wisdom about anything and everything related to the web is part of what makes us one of the best Canadian web hosting providers.

It’s that aim that has us sharing this news with you here today – that Google understands the unpleasantness that comes with being locked into a website, and has plans to remedy it pretty quickly.

The first time something like this occurs you’ll almost certainly click the back button repeatedly before realizing it’s got no function. Eventually you’ll come to realize you’ve got no recourse other than to close the browser, and more often than not you’ll quit Chrome altogether ASAP and then launch it again for fear of inheriting a virus or something of the sort from the nefarious site.

How History Manipulation Works, and what Google is Doing About It

You’ll be pleased to hear the Chrome browser will soon be armed with specific protection measures to prevent this from happening. The way the ‘back’ button gets broken here is the result of something the Chrome team calls ‘history manipulation’. What it involves is the malicious site stacking dummy pages onto your browsing history, and these work to fast-forward you back to the unintended destination page you were trying to get away from.

Fortunately, Chrome developers aren’t letting this slide. There are upcoming changes to Chromium’s code which will facilitate the detection of these dummy history entries and then flag sites that use them.

The aim is to allow Chrome to ignore the entirety of these false history entries to make it so that you’re not buried in a site that you had no intention of landing on and the back button functions just as you expect it to.

This development is still in its formative stages, and we should be aware that these countermeasures aren’t even in the pre-release test versions of Chrome yet. However, industry insiders report that testing should begin within the next few weeks or so, and all signs point towards the new feature being part of the full release version of the web browser.

In addition, this being a change to the Chromium engine makes it so that it may eventually benefit other browsers based on it. Most notable of these is Microsoft Edge, making it so that the frustrations of a paralyzed back button will be a thing of the past for either popular web browser. So far there’s no industry talk of Apple doing the same for Safari, but one can imagine they’ll be equally on top of this in much the same way.

Merry Christmas from 4GoodHosting

Given it’s the 24th of December here we of course would like to take this opportunity to wish a Merry Christmas to one and all. We hope you are enjoying the holidays with your family and this last week of 2018 is an especially good one. We can reflect on 2018, and look forward to an even more prosperous year in 2019.

Happy Holidays and best wishes, from all of us to all of you!

Why 64-Bit is Leaving 32-Bit in the Dust with Modern Computing

Reading Time: 3 minutes

Having to choose between 32-bit and 64-bit options when downloading an app or installing a game is pretty common, and many PCs will have a sticker on them that reads 64-bit processor. You’ll be hard pressed to find a sticker on one that reads 32-bit. It’s pretty easy to conclude, like you do with most things, that more is better – but why is that exactly? Unless you’re a genuinely computer-savvy individual, you likely won’t know the real significance of the difference between the two.

There is some meat to that notion though, and here at 4GoodHosting, as a top Canadian web hosting provider, we try to have our thumb on the pulse of the web hosting and computing world. Having a greater understanding of what exactly is ‘under the hood’ of your desktop or notebook, and what’s advantageous – or not – about it, is helpful. So let’s have a look at the important differences between 32-bit and 64-bit computing today.

Why Bits Matter

First and foremost, it’s about capability. As you might expect, a 64-bit processor is more capable than a 32-bit processor, and primarily because it can handle more data at once. A greater number of computational values can be taken on by a 64-bit processor and this includes memory addresses. This means it’s able to access over four billion times the physical memory of a 32-bit processor. With the ever-greater memory demands of modern desktop and notebook computers, that’s a big deal.

The key difference comes down to memory. 32-bit processors can handle only a limited amount of RAM (in Windows, 4GB or less), while 64-bit processors can take on much more. Making use of that, however, depends on your operating system being able to take advantage of the greater access to memory. Run a 64-bit build of Windows 10 or later on a PC and you won’t need to worry about the limits.
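The arithmetic behind that 4GB ceiling is simple enough to verify yourself. A quick Python illustration of the address-space math:

```python
# A 32-bit pointer can distinguish only 2^32 distinct byte addresses,
# while a 64-bit pointer can distinguish 2^64.
ADDRESSES_32 = 2 ** 32
ADDRESSES_64 = 2 ** 64

print(ADDRESSES_32)                  # 4294967296 addressable bytes...
print(ADDRESSES_32 / (1024 ** 3))    # ...which is exactly 4.0 GiB
print(ADDRESSES_64 // ADDRESSES_32)  # 4294967296 -- 'over four billion times' more
```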

The proliferation of 64-bit processors and larger capacities of RAM have led both Microsoft and Apple to release versions of their operating systems designed to take full advantage of the new technology. OS X Snow Leopard, arriving nearly 10 years ago in 2009, was Apple’s first fully 64-bit operating system for the Mac. The iPhone 5s was the first smartphone with a 64-bit chip, the Apple A7.

Basic versions of the Microsoft Windows OS had software limitations on the amount of RAM available for use by applications. Even in the Ultimate and Professional versions of the operating system, 4GB is the maximum usable memory the 32-bit edition can address. Before you think going 64-bit is the solution to nearly unlimited processing capability, however, understand that any real jump in power comes from software designed to operate within that architecture.

Designed to Make Use of Memory

These days, the recommendation is that you shouldn’t have less than 8GB of RAM to make the best use of applications and video games designed for 64-bit architecture. This is especially useful for programs that can store a lot of information for immediate access, and ones that regularly open multiple large files at the same time.

Another plus is that most software is backwards compatible, which allows you to run 32-bit applications in a 64-bit environment without performance issues or extra work on your part. There are exceptions to this, the most notable being virus protection software and drivers. Hardware of that sort usually requires the proper version be installed if it’s going to function properly.

Same, But Different

There’s likely no better example of this difference than one found right within your file system. If you’re a Windows user, you’ve likely noticed that you have two Program Files folders; the first is labeled Program Files, while the other is labeled Program Files (x86).

All applications installed on a Windows system share resources (called DLL files), and how those are structured depends on whether they’re used for 64-bit applications or 32-bit ones. Should a 32-bit application reach out for a DLL and discover that it’s a 64-bit version, it’ll respond quite simply in one way – by refusing to run.

32-bit (x86) architecture has been in use for a good long time now, and there are still plenty of applications that run on 32-bit architecture. How they run on some platforms is changing, however. Modern 64-bit systems can run 32-bit and 64-bit software side by side because they have two separate Program Files directories. 32-bit applications are shuffled off to the appropriate x86 folder, and Windows then responds by serving up the right DLL – the 32-bit version in this case. Applications in the regular Program Files directory are served the separate 64-bit content in the same way.
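If you’re ever unsure which architecture a program on your machine is running under, here’s a small Python check (assuming you have a Python interpreter installed) that reports whether the running build is 32-bit or 64-bit:

```python
# Report whether this Python build is 32-bit or 64-bit -- which in turn
# determines which Program Files directory and DLLs Windows serves it.
import platform
import struct

print(platform.architecture()[0])  # e.g. '64bit' or '32bit'
print(struct.calcsize("P") * 8)    # pointer size in bits: 64 or 32
```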

Naturally, we can expect 32-bit computing architecture to go the way of the Dodo bird before long, but it’s interesting to note that the superiority of 64-bit comes from more than just a doubling of bits between the two.


Top 5 Programming Languages for Taking On Big Data

Reading Time: 5 minutes

In today’s computing world, ‘big data’ – data sets that are too large or complex for traditional data-processing application software – is increasingly common, and having the ability to work with it is increasingly a to-be-expected requirement for IT professionals. One of the most important decisions these individuals have to make is deciding on a programming language for big data manipulation and analysis. More is now required than simply understanding big data and framing an architecture to handle it. Choosing the right language means you’re able to execute effectively, and that’s very valuable.

As a proven reliable Canadian web hosting provider, here at 4GoodHosting we are naturally attuned to developments in the digital world. Although we didn’t know what it would come to be called, we foresaw the rise of big data – but we didn’t entirely foresee just how much influence it would have for all of us who take up some niche in information technology.

So with big data becoming even more of a buzz term every week, we thought we’d put together a blog about what seems to be the consensus on the top 5 programming languages for working with Big Data.

Best languages for big data

All of these 5 programming languages make the list because they’re both popular and deemed to be effective.

Scala

Scala blends object-oriented and functional programming paradigms very nicely, and is fast and robust. It’s a popular language choice for many IT professionals needing to work with big data. Another testament to its functionality is that both Apache Spark and Apache Kafka have been built on top of Scala.

Scala runs on the JVM, meaning that code written in Scala can be easily incorporated within a Java-based Big Data ecosystem. A primary factor differentiating Scala from Java is that Scala is a lot less verbose. What would take hundreds of lines of confusing-looking Java code can often be done in 15 or so lines of Scala. One drawback attached to Scala, though, is its steep learning curve, especially compared to languages like Go and Python, and in some cases this difficulty puts off beginners looking to use it.

Advantages of Scala for Big Data:

  • Fast and robust
  • Suitable for working with Big Data tools like Apache Spark for distributed Big Data processing
  • JVM compliant, can be used in a Java-based ecosystem

Python

Python’s been earmarked as one of the fastest growing programming languages in 2018, and it benefits from the way its general-purpose nature allows it to be used across a broad spectrum of use-cases, with Big Data programming being one of the primary ones.

Many of the libraries used within Big Data frameworks to clean and manipulate large chunks of data – pandas, NumPy and SciPy among them – are Python-based. In addition, the most popular machine learning and deep learning frameworks like Scikit-learn and TensorFlow are written in Python too, and are being applied within the Big Data ecosystem much more often.
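As a quick taste of the kind of cleanup work these libraries handle, here’s a hedged pandas sketch; the file name and column names are hypothetical:

```python
# Load a CSV, drop incomplete rows, and aggregate -- typical first steps
# in cleaning a chunk of data before deeper analysis.
import pandas as pd

df = pd.read_csv("events.csv")              # hypothetical input file
df = df.dropna(subset=["user_id"])          # discard rows missing a user id
df["amount"] = df["amount"].fillna(0.0)     # fill missing amounts with zero
summary = df.groupby("user_id")["amount"].sum()
print(summary.head())
```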

One negative for Python, however, is its relative slowness, which is a key reason it’s not a fully established Big Data programming language yet. While it is indisputably easy to use, Big Data professionals have found systems built with languages such as Java or Scala to be faster and more robust.

Python makes up for this by going above and beyond with other qualities. It is primarily a scripting language, so interactive coding and development of analytical solutions for Big Data is made easy as a result. Python also has the ability to integrate effortlessly with the existing Big Data frameworks – Apache Hadoop and Apache Spark most notably. This allows you to perform predictive analytics at scale without any problem.
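A minimal sketch of that Spark integration through the pyspark package might look like the following; the HDFS path and column name are hypothetical:

```python
# Spin up a Spark session from Python and run a distributed aggregation.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("big-data-sketch").getOrCreate()
df = spark.read.csv("hdfs:///data/events.csv", header=True, inferSchema=True)
df.groupBy("country").count().show()  # executed in parallel across the cluster
spark.stop()
```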

Advantages of Python for big data:

  • General-purpose
  • Rich libraries for data analysis and machine learning
  • Ease of use
  • Supports iterative development
  • Rich integration with Big Data tools
  • Interactive computing through Jupyter notebooks

R

Those of you who put a lot of emphasis on statistics will love R. It’s referred to as the ‘language of statistics’, and is used to build data models which can be implemented for effective and accurate data analysis.

The large repository of R packages (CRAN, the Comprehensive R Archive Network) sets you up with pretty much every type of tool you’d need to accomplish any task in Big Data processing. From analysis to data visualization, R makes it all doable. It can be integrated seamlessly with Apache Hadoop, Apache Spark and most other popular frameworks used to process and analyze Big Data.

The easiest flaw to find with R as a Big Data programming language is that it’s not much of a general-purpose language. Code written in R is not production-deployable and generally has to be translated into another programming language like Python or Java. For building statistical models for Big Data analytics, however, R is hard to beat.

Advantages of R for big data:

  • Ideally designed for data science
  • Support for Hadoop and Spark
  • Strong statistical modelling and visualization capabilities
  • Support for Jupyter notebooks

Java

Java is the proverbial ‘old reliable’ as a programming language for big data. Many of the traditional Big Data frameworks like Apache Hadoop, along with the collection of tools within its ecosystem, are written in Java and still used in many enterprises today. This goes along with the fact that Java is the most stable and production-ready of the four languages we’ve covered so far.

Java’s primary advantage is the ability to use a large ecosystem of tools and libraries for interoperability, monitoring and much more, the bulk of which have already been proven trustworthy.

Java’s verbosity is its primary drawback. Having to write hundreds of lines of code in Java for a task that would require only 15-20 lines in Python or Scala is a big minus for many developers, though the lambda functions introduced in Java 8 do counter this somewhat. Another consideration is that, unlike newer languages such as Python, Java does not support iterative development. It is expected that future releases of Java will address this, however.

Java’s history and the continued reliance on traditional Big Data tools and frameworks will mean that Java will never be displaced from a list of preferred Big Data languages.

Advantages of Java for big data:

  • Array of traditional Big Data tools and frameworks written in Java
  • Stable and production-ready
  • Large ecosystem of tried & tested tools and libraries

Go

Last but not least here is Go, one of the programming languages that’s gained a lot of ground recently. Designed by a group of Google engineers who had become frustrated with C++, Go is worthy of consideration simply because it powers many tools used in Big Data infrastructure, including Kubernetes, Docker and several others.

Go is fast, easy to learn, and it is fairly easy to develop – and deploy – applications with this language. What might be more relevant, though, is that as businesses look at building data analysis systems that can operate at scale, Go-based systems are a great fit for integrating machine learning and undertaking parallel processing of data. That other languages can be interfaced with Go-based systems with relative ease is a big plus too.

Advantages of Go for big data:

  • Fast and easy to use
  • Many tools used in the Big Data infrastructure are Go-based
  • Efficient distributed computing

A few other languages deserve honourable mentions here too – Julia, SAS and MATLAB being the most notable ones. Our five, however, beat them on speed, efficiency, ease of use, documentation, or community support, among other things.

Which Language is Best for You?

This really depends on the use-case you will be developing. If your focus is hardcore data analysis involving a lot of statistical computing, R would likely be your best choice. On the other hand, if your aim is to develop streaming applications, Scala is your guy. If you’ll be using machine learning to leverage Big Data and develop predictive models, Python is probably best. And if you’re building Big Data solutions with traditionally-available tools, you shouldn’t stray from the old faithful – Java.

Combining the power of two languages to get a more efficient and powerful solution might be an option too. For example, you can train your machine learning model in Python and then deploy it with Spark in distributed mode. All of this will depend on how efficiently your solution is able to function, and more importantly, how speedily and accurately it’s able to work.
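As one hedged sketch of that two-language pattern, you might train a scikit-learn model locally and then broadcast it to Spark executors so records can be scored in parallel; the toy data and names below are purely illustrative:

```python
# Train locally with scikit-learn, then score at scale with PySpark.
from pyspark.sql import SparkSession
from sklearn.linear_model import LogisticRegression
import numpy as np

# Fit a tiny model on toy data (stands in for your real training set).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
model = LogisticRegression().fit(X, y)

spark = SparkSession.builder.appName("score-at-scale").getOrCreate()
bcast = spark.sparkContext.broadcast(model)  # ship the fitted model to executors

rows = spark.sparkContext.parallelize([[0.5], [2.5]])
preds = rows.map(lambda row: int(bcast.value.predict([row])[0])).collect()
print(preds)  # e.g. [0, 1]
spark.stop()
```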


The Surprising Ways We Can Learn About Cybersecurity from Public Wi-Fi

Reading Time: 6 minutes

A discussion of cybersecurity isn’t exactly a popular topic of conversation for most people, but those same people would likely gush at length if asked about how fond of public wi-fi connections they are! That’s a reflection of our modern world it would seem; we’re all about digital connectivity, but the potential for that connectivity to go sour on us is less of a focus of our attention. That is until it actually does go sour on you, of course, at which point you’ll be wondering why more couldn’t have been done to keep your personal information secure.

Here at 4GoodHosting, cybersecurity is a big priority for us, the same way it should be for any of the best Canadian web hosting providers. We wouldn’t have it any other way, and we work to keep abreast of all the developments in the world of cybersecurity, particularly these days as it pertains to cloud computing. We recently read a very interesting article about how the ways we (meaning the collective whole of society) use public wi-fi can highlight the nature and needs of web security, and we thought it would be helpful to share and expand on it in our blog this week.

Public Wi-Fi and Its Perils

Free, public Wi-Fi is a real blessing for us when mobile data is unavailable, or scarce, as is often the case! Yet few people can really articulate exactly what the risks of using public wi-fi are and how we can protect ourselves.

Let’s start with this: when you join a public hotspot without protection and begin to access the internet, the packets of data moving from your device to the router are public and thus open to interception by anyone. Yes, SSL/TLS technology exists, but all that’s required for a cybercriminal to snoop on your connection is some relatively simple Linux software that he or she can find online without much fuss.

Let’s take a look at some of the attacks that you may be subjected to due to using a public wi-fi network on your mobile device:

Data monitoring

Wi-Fi adapters are usually set to ‘managed’ mode, in which the adapter acts as a standalone client connecting to a single router for Internet access. The interface then ignores all data packets except those explicitly addressed to it. However, some adapters can be configured into other modes. In ‘monitor’ mode, an adapter captures all wireless traffic on a certain channel, no matter who the source or intended recipient is. In monitor mode the adapter is also able to capture data packets without being connected to a router, giving it the ability to sniff and snoop on every piece of data it can get its hands on.

It should be noted that not all commercial wi-fi adapters are capable of this, as it’s cheaper for manufacturers to produce models that handle ‘managed’ mode exclusively. Still, should someone get their hands on one and pair it with some simple Linux software, they’ll then be able to see which URLs you are loading, plus the data you’re providing to any website not using HTTPS – names, addresses, financial accounts and so on. That’s obviously going to be a problem for you.
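To illustrate just how little is involved, here’s a hedged sketch using the Python scapy library that prints the plaintext payload of unencrypted HTTP packets as they pass by – something to run only on a network you own or are authorized to test:

```python
# Capture unencrypted HTTP traffic and print the readable payload.
# Anything sent to port 80 without TLS travels in the clear.
from scapy.all import sniff, TCP, Raw

def show_plaintext(pkt):
    if pkt.haslayer(TCP) and pkt[TCP].dport == 80 and pkt.haslayer(Raw):
        print(pkt[Raw].load[:100])  # first bytes of the unencrypted payload

sniff(filter="tcp port 80", prn=show_plaintext, count=10)
```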

Fake Hotspots

Snaring unencrypted data packets out of the air is definitely a risk of public wi-fi, but it’s certainly not the only one. When connecting to an unprotected router, you are giving your trust to the supplier of that connection. Usually this trust is fine – your local Tim Hortons probably takes no interest in your private data. However, being careless when connecting to public routers means that cybercriminals can easily set up a fake network designed to lure you in.

Once this illegitimate hotspot has been created, all of the data flowing through it can then be captured, analysed, and manipulated. One of the most common choices here is to redirect your traffic to an imitation of a popular website. This clone site will serve one purpose; to capture your personal information and card details in the same way a phishing scam would.

ARP Spoofing

The reality unfortunately is that cybercriminals don’t even need a fake hotspot to mess with your traffic.
Every device on a Wi-Fi or Ethernet network has a unique MAC address. This is an identifying code used to ensure data packets make their way to the correct destination. Routers and all other devices discover this information using the Address Resolution Protocol (ARP).

Take this example: your smartphone sends out a request asking which device on the network is associated with a certain IP address. The requested device then provides its MAC address, ensuring the data packets are physically directed to what’s determined to be the correct location. The problem is that ARP replies can be impersonated, or ‘spoofed’. Your smartphone might send a request for the address of the public wi-fi router, and a different device will answer with a false address.

Provided the signal of the false device is stronger than that of the legitimate one, your smartphone will be fooled. Again, this can be done with simple Linux software.

Once the spoofing has taken place, all of your data will be sent to the false router, which can subsequently manipulate the traffic however it likes.
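To make the ARP exchange concrete, here’s a hedged scapy sketch of the legitimate lookup described above – a spoofer simply answers this same question with a false MAC. The router IP shown is hypothetical:

```python
# Broadcast 'who has this IP?' and read the MAC address in the reply.
from scapy.all import ARP, Ether, srp

target_ip = "192.168.1.1"  # hypothetical router address
query = Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=target_ip)
answered, _ = srp(query, timeout=2, verbose=False)
for _, reply in answered:
    print(reply.psrc, "is at", reply.hwsrc)  # the IP -> MAC mapping
```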

MitM – ‘Man-in-the-Middle’ Attacks

A man-in-the-middle attack (MITM) is a reference to any malicious action where the attacker secretly relays communication between two parties, or alters it for whatever malevolent reason. On an unprotected connection, a cybercriminal can modify key parts of the network traffic, redirect this traffic elsewhere, or fill an existing packet with whatever content they wish.

Examples of this could be displaying a fake login form or website, changing links, text, pictures, or more. Unfortunately, this isn’t difficult to do; an attacker within reception range of an unencrypted wi-fi point is able to insert themselves all too easily much of the time.

Best Practices for Securing your Public Wi-Fi Connection

The ongoing frequency of these attacks definitely serves to highlight the importance of basic cybersecurity best practices. Following the ones below will counteract most public wi-fi threats effectively.

  1. Have Firewalls in Place

An effective firewall will monitor and block any suspicious traffic flowing between your device and a router. Yes, you should always have a firewall in place and your virus definitions updated as a means of protecting your device from threats you have yet to come across.

While it’s true that properly configured firewalls can effectively block some attacks, they’re not a 100% reliable defender, and you’re definitely not exempt from danger just because of them. They primarily help protect against malicious traffic, not malicious programs, and one of the most frequent instances where they don’t protect you is when you’re unknowingly running malware. Firewalls should always be paired with other protective measures, with antivirus software being the best of them.

  2. Software updates

Software and system updates are also biggies, and should be installed as soon as you can do so. Staying up to date with the latest security patches is a very proven way to have yourself defended against existing and easily-exploited system vulnerabilities.

  3. Use a VPN

Whether you’re a regular user of public Wi-Fi or not, a VPN is an essential security tool you can put to work for you. VPNs serve you here by generating an encrypted tunnel that all of your traffic travels through, ensuring your data is secure regardless of the nature of the network you’re on. If you have reason to be concerned about your security online, a VPN is arguably the best safeguard against the risks posed by open networks.

That said, free VPNs are not recommended, because many of them have been known to monitor and sell users’ data to third parties. You should choose a service provider with a strong reputation and a strict no-logging policy.

  4. Use common sense

You shouldn’t fret too much over hopping onto public Wi-Fi without a VPN, as the majority of attacks can be avoided by adhering to a few tried-and-true safe computing practices. First, avoid making purchases or visiting sensitive websites like your online banking portal. In addition, it’s best to stay away from any website that doesn’t use HTTPS; the popular browser extension HTTPS Everywhere can help you here. Make use of it!
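Part of why HTTPS matters here is that modern clients verify TLS certificates by default, so a forged certificate from a fake hotspot fails loudly instead of silently handing over your traffic. A small illustration with Python’s requests library (the URLs are just well-known test examples):

```python
# requests verifies TLS certificates by default on every HTTPS call.
import requests

resp = requests.get("https://example.com")  # certificate checked automatically
print(resp.status_code)

# An invalid certificate -- like one a fake hotspot might present --
# raises an SSLError rather than quietly exposing your data.
try:
    requests.get("https://expired.badssl.com")
except requests.exceptions.SSLError as err:
    print("TLS verification failed:", type(err).__name__)
```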

The majority of modern browsers also now have in-built security features that are able to identify threats and notify you if they encounter a malicious website. Heed these warnings.

Go ahead and make good use of public Wi-Fi and all the email-checking, web browsing, and social media goodness it offers – just be sure that you’re not putting yourself at risk while doing so.