Global Environmental Sustainability with Data Centers

Last week we talked about the key trends in software development expected for 2019, and today we’ll discuss another trend for the coming year that’s a bit more of a given: datacenters will have even more demands placed on their capacities as the working world continues to become ever more digital.

Indeed, datacenters have grown to be key partners for enterprises, rather than being just an external service utilized for storing data and business operation models. Even the smallest of issues in datacenter operations can impact business.

While datacenters are certainly the lifeblood of every business, they also have global impacts, particularly when it comes to energy consumption. Datacenters account for somewhere in the vicinity of 3% of total worldwide electricity consumption, and to put that in perspective, that’s more than the entire power consumption of the UK.

Datacenters also account for 2% of global greenhouse gas emissions and 2% of electronic waste (aka e-waste). Many people aren’t aware of the extent to which our increasingly digital world impacts the natural one so directly, but it really does.

Like any good Canadian web hosting provider serving thousands of customers, we have extensive datacenter requirements ourselves. Most providers will make efforts to ensure their datacenters operate as energy-efficiently as possible, and that goes along with the primary aims – making sure those datacenters are rock-solid reliable AND as secure as possible.

Let’s take a look today at what’s being done around the globe to promote environmental sustainability with data centers.

Lack of Environmental Policies

Super Micro Computer recently put out a report entitled ‘Data Centers and the Environment’, and it stated that 43% of organizations don’t have an environmental policy, with another 50% having no plans to develop any such policy anytime soon. The reasons why? High costs (29%), a lack of resources or understanding (27%), and another 14% simply don’t make environmental issues a priority.

The aim of the report was to help datacenter managers better understand the environmental impact of datacenters, provide quantitative comparisons with other companies, and in time help them reduce this impact.

Key Findings

28% of Businesses Take Environmental Issues into Consideration When Choosing Datacenter Technology

Priorities that came before it for most companies surveyed were security, performance, and connectivity, and only 9% of companies considered ‘green’ technology to be the foremost priority. When it comes to actual datacenter design, however, the number of companies that put a priority on energy efficiency more than doubles, to 59%.

The Average PUE for a Datacenter is 1.89

Power Usage Effectiveness (PUE) is the ratio of the total energy consumed by a datacenter to the energy delivered to its IT equipment. The report found the average datacenter PUE is approximately 1.89, and many enterprise datacenters (over two-thirds of them) come in with a PUE over 2.03.

Further, it seems some 58% of companies are unaware of their datacenter’s PUE. Only a meagre 6% come in within the most efficient range, between 1.0 and 1.19.
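
To make the ratio concrete, here’s a minimal sketch of the PUE calculation in Python – the energy figures are purely hypothetical and only there for illustration:

```python
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT equipment energy."""
    return total_facility_energy_kwh / it_equipment_energy_kwh

# Hypothetical monthly figures: everything the facility draws vs. what reaches the IT gear.
total_energy = 1_890_000  # kWh for the whole facility (servers plus cooling, lighting, etc.)
it_energy = 1_000_000     # kWh delivered to servers, storage and networking equipment

print(f"PUE: {pue(total_energy, it_energy):.2f}")  # -> PUE: 1.89
```

A PUE of 1.0 would mean every watt entering the facility reaches the IT equipment; the further the number climbs above 1.0, the more energy is being lost to cooling and other overhead.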

24.6 Degrees C is the Average Datacenter Temperature

It’s common for companies to run datacenters at higher temperatures to reduce strain on HVAC systems and increase savings on energy consumption and related costs. The report found 43% of the datacenters have temperatures ranging between 21 degrees C and 24 degrees C.

The primary reasons indicated for not running datacenters at higher temperatures are concerns over reliability and performance. Hopefully these operators will soon come to learn that recent advancements in server technology have optimized thermal designs, and newer datacenter designs make use of free-air cooling. With them, datacenters can run at ambient temperatures of up to 40 degrees C with no decrease in reliability or performance, while also improving PUE and reducing costs.

Another trend in data center technology is immersion cooling, where server hardware is cooled by being fully immersed in a thermally conductive but electrically non-conductive liquid. We can expect to see more of this type of datacenter technology rolled out this year too.

3/4 of Datacenters Have System Refreshes Within 5 Years

Datacenters and their energy consumption can be optimized with regular system refreshes and the addition of modern, low-power technologies. The report found that approximately 45% of data center operators refresh their systems at least once every 3 years, while 28% do it every four to five years. It also seems that the larger the company, the more likely it is to do these refreshes.

8% Increase in Datacenter E-Waste Expected Each Year

It’s inevitable that electronic waste (e-waste) is created when datacenters dispose of server, storage, and networking equipment. It’s a bit of a staggering statistic when you learn that around 20 to 50 million metric tons of e-waste is disposed of every year around the world, and the main reason it’s so problematic is that e-waste deposits heavy metals and other hazardous materials into landfills. If left unchecked and we continue to produce it as we have, e-waste disposal will increase by 8% each year.

Some companies partner with recycling companies to dispose of e-waste, and some repurpose their hardware in any one of a number of different ways. The report found that some 12% of companies don’t have a recycling or repurposing program in place, typically because it’s costly, recycling partners or providers are difficult to find in their area, or there’s a lack of proper planning.

On a more positive note, many companies are adopting policies to address the environmental issues that stem from their datacenter operations. Around 58% of companies already have an environmental policy in place or are developing one.

We can all agree that datacenters are an invaluable resource and absolutely essential to the digital connectivity of our modern world. However, they are ‘power pigs’, as the expression goes, and that’s unavoidable given the sheer volume of activity that goes on within them every day. We’ve seen how they’ve become marginally more energy efficient, and in the year to come we will hopefully see more energy efficiency technology applied to them.

Key Trends in Software Development Expected for 2019

Here we are in the first week of 2019, and as expected we’ve got a whole lot on the horizon this year in the way of software development. We live in a world that’s more and more digital all the time, and the demands put on the software development industry are pretty much non-stop in response to this ongoing shift. Oftentimes it’s all about more efficient ‘straight lining’ of tasks, as well as creating more of a can-do environment for people who need applications and the like to work smarter.

Here at 4GoodHosting, a part of what makes us a reputable Canadian web hosting provider is the way we stay abreast of developments – not only in the web hosting industry, but also in the ones that have direct relevance for our clients in the way they’re connected to computing and computing technology.

Today we’re going to discuss the key trends in software development that are expected for this coming year.

Continuing to Come a Long Way

Look back 10 years and you’ll surely agree the changes in the types of applications and websites that have been built – as well as how they’ve been built – are really quite something. The web of 2008 is almost unrecognizable. Today it is very much an app and API economy. It was only about 10 years ago that JavaScript frameworks were the newest and best thing around, but now building exclusively for browsers is very much a thing of the past.

In 2019 we’re going to see priorities remain on progressive web apps, artificial intelligence, and native app development. As adoption increases and new tools emerge, we can expect to see more radical shifts in the ways we work in the digital world. There’s going to be less in the way of ‘cutting edge’ and more in the way of refinements on technology, reflecting developers now having a better understanding of how technologies can be applied.

The biggest thing for web developers now is that they need to expand beyond their corner of the stack, as applications become increasingly lightweight (in large part due to libraries and frameworks like Vue and React) and data grows more intensive, which can be attributed to the range of services upon which applications and websites now depend.

Reinventing Modern JavaScript Web Development

One of the things that’s being seen is how topics that previously weren’t included under the umbrella of web development – microservices and native app development most notably – are now very much part of the need-to-know landscape.

The way many aspects of development have been simplified has forced developers to evaluate how these aspects fit together more closely. With all the layers of abstraction in modern development, the way things interact and work alongside each other becomes even more important. Having a level of wherewithal regarding this working relationship is very beneficial for any developer.

Those who’ve adapted to the new realities well will now agree that it’s no longer a case of writing the requisite code to make something run on the specific part of the application being worked on. Rather, it’s about understanding how the various pieces fit together from the backend to the front.

In 2019, developers will need to dive deeper and become inside-out familiar with their software systems. Being explicitly comfortable with backends will be an increasingly necessary starting point. Diving into the cloud and understanding that dynamic is also highly advisable. It will be wise to start playing with microservices, and rethinking and revisiting languages you thought you knew is a good idea too.

Be Familiar with Infrastructure to Tackle the Challenges of API Development

Some will be surprised to hear it, but as the stack shrinks and the responsibilities of web developers shift we can expect that having an understanding of the architectural components within the software being built will be wholly essential.

That reality is put in place by DevOps, which has essentially made developers responsible for how their code runs once it hits production. As a result, the requisite skills and toolchain for the modern developer are also expanding.

RESTful API Design Patterns and Best Practices

You can make your way into software architecture through a number of different avenues, but exploring API design is likely the best of them. Hands on RESTful API Design gives you a practical way into the topic.

REST is the industry standard for API design, and the diverse range of tools and approaches is making client management a potentially complex but interesting area. GraphQL, a query language developed by Facebook, is challenging REST’s dominance, while Redux and Relay – a pair of libraries for managing data in React applications – have both seen a significant amount of interest over the last year as key tools for working with APIs.
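
To make REST’s resource-oriented conventions a little more concrete, here’s a minimal sketch of an API in Python using Flask – the `/books` resource and the in-memory store are invented purely for illustration:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# An in-memory store standing in for a real database.
books = {1: {"id": 1, "title": "Hands on RESTful API Design"}}

@app.route("/books", methods=["GET"])
def list_books():
    # GET on the collection returns every resource.
    return jsonify(list(books.values()))

@app.route("/books", methods=["POST"])
def create_book():
    # POST on the collection creates a new resource.
    new_id = max(books) + 1
    books[new_id] = {"id": new_id, "title": request.json["title"]}
    return jsonify(books[new_id]), 201

@app.route("/books/<int:book_id>", methods=["GET"])
def get_book(book_id):
    # GET on a specific resource returns it, or 404 if it doesn't exist.
    book = books.get(book_id)
    return (jsonify(book), 200) if book else ("Not found", 404)

if __name__ == "__main__":
    app.run(port=5000)
```

The pattern to note is that URLs name nouns (resources), while the HTTP verbs – GET, POST, PUT, DELETE – express the actions taken on them.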

Microservices for Infrastructure Responsibility

Microservices are becoming the dominant architectural mode, and that’s the reason we’re seeing such an array of tools capable of managing APIs. Expect a whole lot more of them to be introduced this year, and be proactive in finding which ones work best for you. While you may not need to implement microservices now, if you want to be building software in 5 years’ time then you really should become explicitly familiar with the principles behind microservices and the tools that can assist you when using them.

We can expect to see containers being one of the central technologies driving microservices. You could run microservices in a virtual machine, but as they’re harder to scale than containers you likely wouldn’t see the benefits you’ll expect from a microservices architecture. As a result, really getting to know core container technologies should also be a real consideration.

The obvious place to start is with Docker. Developers need to understand it to varying degrees, but even those who don’t think they’ll be using it immediately will agree that the real-world foundation in containers it provides will be valuable knowledge to have at some point.

Kubernetes warrants mention here as well, as it is the go-to tool for scaling and orchestrating containers. It offers control over how you scale application services in a way that would have been unimaginable a decade ago.

A great way for anyone to learn how Docker and Kubernetes come together as part of a fully integrated approach to development is with Hands on Microservices with Node.js.

Continued Embracing of the Cloud

It appears the general trend is towards full stack, and for this reason developers simply can’t afford to ignore cloud computing. The levels of abstraction it offers, and the various services and integrations that come with the leading cloud services make it so that many elements of the development process are much easier.

Issues surrounding scale, hardware, setup and maintenance nearly disappear entirely when you use cloud. Yes, cloud platforms bring their own set of challenges, but they also allow you to focus on more pressing issues and problems.

More importantly, however, they open up new opportunities. First and foremost among them is that going serverless becomes a possibility, which allows you to scale incredibly quickly by running everything on your cloud provider.
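
As a rough sketch of what ‘serverless’ looks like in practice, here’s a minimal AWS Lambda handler in Python; the greeting logic is just a placeholder, and the event shape assumes the function sits behind an API Gateway proxy integration:

```python
import json

def lambda_handler(event, context):
    # AWS invokes this entry point on demand; there's no server for you to provision or scale.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The cloud provider spins up as many copies of the function as incoming traffic demands and bills per invocation, which is where the ‘scale incredibly quickly’ part comes in.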

There are other advantages too, like when you use the cloud to incorporate advanced features such as artificial intelligence into your applications. AWS has a whole suite of machine learning tools; Amazon Lex helps you build conversational interfaces, and Amazon Polly turns text into speech. Azure Cognitive Services has a nice array of features for vision, speech, language, and search.

As a developer, it’s going to be increasingly important to see the cloud as a way of expanding the complexity of applications and processes while keeping them agile. Features and optimizations that previously might have been found sluggish or impossible can now be developed as necessary and then incorporated. Leveraging AWS and Azure (among others) is going to be something that many developers will do with success in the coming year.

Back to Basics with New Languages & Fresh Approaches

All of this ostensible complexity in contemporary software development may lead some to think that languages don’t matter as much as they once did. It’s important to know that’s definitely not the case. Building up a deeper understanding of how languages work, what they offer, and where they come up short can make you a much more accomplished developer. Doing what it takes to be prepared is really good advice for what’s an ever-more unpredictable digital world, this year and in the years to follow.

We can expect to see a trend where developers go back to a language they know and explore a new paradigm within it, or they learn a new language from scratch.

Never Time to Be Complacent

We’ll reiterate what the experts we read are saying: in just a matter of years, much of what is ‘emerging’ today will be old hat. It’s helpful to take a look at the set of skills many full stack developer job postings are requiring. You’ll see that the demands are so diverse that adaptability should be a real priority for a developer who wants to remain upwardly mobile within the profession. Without doubt it will be immensely valuable both for your immediate projects and your future career prospects.

Top-5 Strategic Technology Trends Expected for 2019

Here we are on the final day of the year, and most will agree that 2018 has seen IT technology expand in leaps and bounds exactly as it was expected to. In truth, it seems every year brings us a whole whack of new technology trends cementing themselves in the world of IT, web, and computing development. Not surprisingly, the same is forecast for 2019.

Here at 4GoodHosting, a significant part of what makes us one of the many good Canadian web hosting providers is that we enjoy keeping abreast of these developments and then aligning our resources and services with them when it’s beneficial for our customers to do so.

Worldwide IT spending for 2019 is projected to be in the vicinity of $3.8 trillion. That will be a 3.2% increase from the roughly $3.7 trillion spent this year. That’s a LOT of money going into the research and development shaping the digital world that’s so integral to the professional and personal lives of so many of us.

So for the last day of 2018, let’s have a look at the top five strategic technology trends we can expect to become the norm over the course of the year that starts tomorrow.

  1. Autonomous Things

We’ve all heard the rumblings that we’re on the cusp of the start of the robot age, and it seems that may be true. Autonomous things like robots, drones and autonomous vehicles use AI to automate functions that were previously performed by humans. This type of automation goes beyond that provided by rigid programming models, and these autonomous things use AI to deliver advanced behaviors, interacting more naturally with their surroundings and with people when necessary.

The proliferation of autonomous things will constitute a real shift from stand-alone intelligent things to collections of them that will collaborate very intelligently. Multiple devices will work together, and without human input if it’s not required – or not conducive to more cost-effective production or maintenance.

The last part of that is key, as the way autonomous things can reduce production costs by removing the employee cost from the production chain wherever possible is going to have huge ramifications for unskilled labour. As the saying goes – you can’t stop progress.

  2. Augmented Analytics

Augmented analytics can be defined as a focus on a specific area of augmented intelligence, and what’s most relevant to our discussion here is the way we’ll see it start to use machine learning (ML) to transform how analytics content is developed, shared, and consumed. The forecast seems to be that augmented analytics capabilities will quickly move into mainstream adoption and affix themselves as a key feature of data preparation, data management, process mining, modern analytics, data science platforms and business process management.

We can also expect to see automated insights from augmented analytics being embedded in enterprise applications. Look for HR, finance, marketing, customer service, sales, and asset management departments to be optimizing the decisions and actions of all employees within their context. These insights from analytics will no longer be utilized exclusively by analysts and data scientists.

The way augmented analytics will automate the data preparation, insight generation and insight visualization processes, plus eliminate the need for professional data scientists, promises to be a huge paradigm shift too. It’s expected that through 2020 the number of citizen data scientists will grow five times faster than the number of ‘industry-expert’ data scientists, and this citizen variety will then fill the data science and machine learning talent gap resulting from the shortage and high cost of traditional data scientists.

  3. AI-Driven Development

We should also expect to see the market shift from the old model, where professional data scientists would partner with application developers to create most AI-enhanced solutions, to a newer one where a professional developer can operate alone using predefined models delivered as a service. The developer is now provided with an ecosystem of AI algorithms and models, along with development tools tailored to integrating AI capabilities and models into solutions that weren’t reachable before.

AI being applied to the development process itself creates another opportunity for professional application development, one that serves the aim of automating various data science, application development and testing functions. 2019 will be the start of a 3-year window in which it’s forecast that at least 40% of new application development projects will have AI co-developers working within the development team.

  4. Digital Twins

Much as the name suggests, a digital twin is a digital representation of a real-world entity or system, and we can expect them to start being increasingly common over the coming year. So much so in fact that by 2020 it is estimated that there will be more than 20 billion connected sensors and endpoints serving digital twins working on millions and millions of different digital tasks.

These digital twins will be deployed simply at first, but we can expect them to evolve over time, gaining ever-greater abilities to collect and visualize the right data, apply the right analytics and rules, and respond effectively to business objectives.

Digital twins of organizations will help drive efficiencies in business processes, plus create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically, and we can look for this trend to really start picking up steam in 2019.

  5. Immersive Experience

The last trend we’ll touch on here today is the one that most people will be able to relate to on an everyday level. We’re all seeing the changes in how people interact with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are revolutionizing the way people interact with the digital world, as well as how they regard it overall. It is from this combined shift in perception and interaction models that future immersive user experiences will be shaped.

2019 should see thinking continue to move beyond individual devices and the fragmented user interface (UI) technologies used for them, toward a multichannel and multimodal experience. The relevance of it all will be in how the experience connects people with the digital world across the hundreds of edge devices surrounding them – traditional computing devices, wearables, automobiles, environmental sensors and consumer appliances will all increasingly be part of the ‘smart’ device crowd as we move forward.

In the bigger picture, this multi-experience environment will create an ambient experience where the spaces that surround us create a ‘digital entirety’ rather than the sum of individual devices working together. In a sense it will be like the environment itself is the digital processor.

We’ll discuss more about what’s forecast to be in store for web hosting and computing in 2019 in the following weeks, but for now we’d like to say Happy New Year to you, and we continue to appreciate your choosing us as your web hosting provider. Here’s to a positive and productive coming year for all of you.

 

 

Google Chrome Solution for ‘History Manipulation’ On Its Way

No one will need to be convinced of the fact that there’s a massive number of shady websites out there designed to ensnare you for any number of no-good purposes. Usually you’re rerouted to them when you take a seemingly harmless action, and then you’re often unable to back yourself out of the site once you’ve unwillingly landed on it. Nobody wants to be on these spammy or malicious pages, and you’re stressing out every second longer that you’re there.

The well-being of web surfers who also happen to be customers or friends here at 4GoodHosting is important to us, and being proactive in sharing all our wisdom about anything and everything related to the web is a part of what makes us one of the best Canadian web hosting providers.

It’s that aim that has us sharing this news with you here today – that Google understands the unpleasantness that comes with being locked into a website, and has plans to make it remediable pretty quickly.

The first time something like this occurs you’ll almost certainly be clicking on the back button repeatedly before realizing it’s got no function. Eventually you’ll come to realize that you’ve got no other recourse than to close the browser, and most often you’ll quit Chrome altogether ASAP and then launch it again for fear of inheriting a virus or something of the sort from the nefarious site.

How History Manipulation Works, and what Google is Doing About It

You’ll be pleased to hear the Chrome browser will soon be armed with specific protection measures to prevent this happening. The way the ‘back’ button is broken here is through something the Chrome team calls ‘history manipulation’. What it involves is the malicious site stacking dummy pages onto your browsing history, and these work to fast-forward you back to the unintended destination page you were trying to get away from.

Fortunately, Chrome developers aren’t letting this slide. There are upcoming changes to Chromium’s code which will facilitate the detection of these dummy history entries and then flag sites that use them.

The aim is to allow Chrome to ignore the entirety of these false history entries to make it so that you’re not buried in a site that you had no intention of landing on and the back button functions just as you expect it to.

This development is still in its formative stages, and we should be aware that these countermeasures aren’t even in the pre-release test versions of Chrome yet. However, industry insiders report that testing should begin within the next few weeks or so, and all signs point towards the new feature being part of the full release version of the web browser.

In addition, this being a change to the Chromium engine makes it so that it may eventually benefit other browsers based on it. Most notable of these is Microsoft Edge, making it so that the frustrations of a paralyzed back button will be a thing of the past for either popular web browser. So far there’s no industry talk of Apple doing the same for Safari, but one can imagine they’ll be equally on top of this in much the same way.

Merry Christmas from 4GoodHosting

Given it’s the 24th of December here we of course would like to take this opportunity to wish a Merry Christmas to one and all. We hope you are enjoying the holidays with your family and this last week of 2018 is an especially good one. We can reflect on 2018, and look forward to an even more prosperous year in 2019.

Happy Holidays and best wishes, from all of us to all of you!

Why 64-Bit is Leaving 32-Bit in the Dust with Modern Computing

Having to choose between 32-bit and 64-bit options when downloading an app or installing a game is pretty common, and many PCs will have a sticker on them that reads 64-bit processor. You’ll be hard pressed to find a sticker on one that reads 32-bit. It’s pretty easy to conclude, like you do with most things, that more is better, but why is that exactly? Unless you’re a genuinely computer-savvy individual, you likely won’t know the real significance of the difference between the two.

There is some meat to that conclusion though, and here at 4GoodHosting, as a top Canadian web hosting provider, we try to have our thumb on the pulse of the web hosting and computing world. Having a greater understanding of what exactly is ‘under the hood’ of your desktop or notebook and what’s advantageous – or not – about that is helpful. So let’s have a look at the important difference between 32-bit and 64-bit computing today.

Why Bits Matter

First and foremost, it’s about capability. As you might expect, a 64-bit processor is more capable than a 32-bit processor, primarily because it can handle more data at once. A 64-bit processor can take on a greater number of computational values, including memory addresses, which means it’s able to address over four billion times the physical memory of a 32-bit processor. With the ever-greater memory demands of modern desktop and notebook computers, that’s a big deal.

Here’s the key practical difference: 32-bit processors can only address a limited amount of RAM (in Windows, 4GB or less), while 64-bit processors can take on far more. Realizing that capacity, however, depends on your operating system being able to take advantage of the greater access to memory. Run a 64-bit version of Windows 10 on your PC and you won’t need to worry about those limits.
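
If you’re curious which kind of build you’re actually running, a quick sketch in Python shows the word size of the interpreter itself (note this reflects the Python build, which can be 32-bit even on a 64-bit OS):

```python
import platform
import struct
import sys

# Pointer size in bytes multiplied by 8 gives the word size of this Python build.
print(f"This Python build is {struct.calcsize('P') * 8}-bit")

# platform.architecture() reports the same figure alongside the executable format.
print(platform.architecture())

# sys.maxsize is 2**31 - 1 on 32-bit builds and 2**63 - 1 on 64-bit builds.
print(f"sys.maxsize = {sys.maxsize}")
```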

The proliferation of 64-bit processors and larger capacities of RAM has led both Microsoft and Apple to release versions of their operating systems designed to take full advantage of the new technology. OS X Snow Leopard, which arrived nearly 10 years ago in 2009, was Apple’s first fully 64-bit desktop operating system, and the iPhone 5s was the first smartphone with a 64-bit chip, the Apple A7.

Basic versions of the Microsoft Windows OS have software limitations on the amount of RAM available for use by applications. Even in the Ultimate and Professional versions of the operating system, 4GB is the maximum usable memory the 32-bit edition can address. Before you conclude that going 64-bit means nearly unlimited capability, however, understand that any real jump in power comes from software designed to operate within that architecture.

Designed to Make Use of Memory

These days, the recommendation is that you shouldn’t have less than 8GB of RAM to make the best use of applications and video games designed for 64-bit architecture. This is especially useful for programs that can store a lot of information for immediate access, and ones that regularly open multiple large files at the same time.

Another plus is that most software is backwards compatible, which allows you to run 32-bit applications in a 64-bit environment without performance issues or extra work on your part. There are exceptions to this, the most notable being virus protection software and hardware drivers, which usually require the version that matches your system if they’re going to function properly.

Same, But Different

There’s likely no better example of this difference than one found right within your file system. If you’re a Windows user, you’ve likely noticed that you have two Program Files folders; the first is labeled Program Files, while the other is labeled Program Files (x86).

Applications installed on a Windows system share resources called DLL files, and how those are structured depends on whether they’re built for 64-bit applications or 32-bit ones. Should a 32-bit application reach out for a DLL and discover it’s a 64-bit version, it’ll respond in one simple way – by refusing to run.

32-bit (x86) architecture has been in use for a good long time now, and there are still plenty of applications that run on it. How they run on some platforms is changing, however. Modern 64-bit systems can run both 32-bit and 64-bit software, and the reason is that they have 2 separate Program Files directories. 32-bit applications are shuffled off to the appropriate x86 folder, and Windows then responds by serving up the right DLL – the 32-bit version in this case. On the other side of things, 64-bit applications in the regular Program Files directory are served the 64-bit versions.
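
You can see the two directories for yourself on a 64-bit Windows machine by printing the environment variables Windows sets for them – a small illustrative check, and on a 32-bit system the second variable simply won’t exist:

```python
import os

# Set by 64-bit Windows; 64-bit applications install under the first path.
print("64-bit applications:", os.environ.get("ProgramFiles"))        # e.g. C:\Program Files
# 32-bit applications are redirected to the (x86) directory instead.
print("32-bit applications:", os.environ.get("ProgramFiles(x86)"))   # e.g. C:\Program Files (x86)
```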

Naturally, we can expect 32-bit computing architecture to go the way of the Dodo bird before long, but it’s interesting to note that the superiority of 64-bit is the sum of more than just being a doubling of bits between the two.

 

Top 5 Programming Languages for Taking On Big Data

In today’s computing world, ‘big data’ – data sets that are too large or complex for traditional data-processing application software – is increasingly common, and having the ability to work with it is increasingly a to-be-expected requirement for IT professionals. One of the most important decisions these individuals have to make is deciding on a programming language for big data manipulation and analysis. More is now required than simply understanding big data and framing the architecture to solve it. Choosing the right language means you’re able to execute effectively, and that’s very valuable.

As a proven reliable Canadian web hosting provider, here at 4GoodHosting we are naturally attuned to developments in the digital world. Although we didn’t know what it would come to be called, we foresaw the rise of big data, but we didn’t entirely foresee just how much influence it would have for all of us who occupy some niche in information technology.

So with big data becoming even more of a buzz term every week, we thought we’d put together a blog about what seems to be the consensus on the top 5 programming languages for working with Big Data.

Best languages for big data

All of these 5 programming languages make the list because they’re both popular and deemed to be effective.

Scala

Scala blends object-oriented and functional programming paradigms very nicely, and is fast and robust. It’s a popular language choice for many IT professionals needing to work with big data. Another testament to its functionality is that both Apache Spark and Apache Kafka have been built on top of Scala.

Scala runs on the JVM, meaning that code written in Scala can be easily incorporated within a Java-based Big Data ecosystem. A primary factor differentiating Scala from Java is that Scala is a lot less verbose. What would take hundreds of lines of confusing-looking Java code can often be done in 15 or so lines of Scala. One drawback attached to Scala, though, is its steep learning curve, especially compared to languages like Go and Python, and in some cases this difficulty puts off beginners looking to use it.

Advantages of Scala for Big Data:

  • Fast and robust
  • Suitable for working with Big Data tools like Apache Spark for distributed Big Data processing
  • JVM compliant, so it can be used in a Java-based ecosystem

Python

Python’s been earmarked as one of the fastest growing programming languages in 2018, and it benefits from the way its general-purpose nature allows it to be used across a broad spectrum of use cases, with Big Data programming being one of the primary ones.

Many of the libraries being used within Big Data frameworks to clean and manipulate large chunks of data – pandas, NumPy and SciPy among them – are Python-based. In addition, most popular machine learning and deep learning frameworks like scikit-learn and TensorFlow are written in Python too, and are being applied within the Big Data ecosystem much more often.

One negative for Python, however, is its slowness, which is one reason why it’s not an established Big Data programming language yet. While it is indisputably easy to use, Big Data professionals have found systems built with languages such as Java or Scala to be faster and more robust.

Python makes up for this by going above and beyond with other qualities. It is primarily a scripting language, so interactive coding and development of analytical solutions for Big Data is made easy as a result. Python also has the ability to integrate effortlessly with the existing Big Data frameworks – Apache Hadoop and Apache Spark most notably. This allows you to perform predictive analytics at scale without any problem.
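
As a small sketch of that integration – assuming PySpark is installed, and treating the file path and column names as placeholders – reading a large CSV with Spark and pulling a compact aggregate back into pandas looks roughly like this:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("big-data-sketch").getOrCreate()

# Spark distributes the read and the aggregation across the cluster.
df = spark.read.csv("hdfs:///data/transactions.csv", header=True, inferSchema=True)

summary = (
    df.groupBy("region")
      .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("orders"))
)

# Only the small aggregated result is collected back into a local pandas DataFrame.
print(summary.toPandas())
```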

Advantages of Python for big data:

  • General-purpose
  • Rich libraries for data analysis and machine learning
  • Ease of use
  • Supports iterative development
  • Rich integration with Big Data tools
  • Interactive computing through Jupyter notebooks

R

Those of you who put a lot of emphasis on statistics will love R. It’s referred to as the ‘language of statistics’, and is used to build data models which can be implemented for effective and accurate data analysis.

Large repositories of R packages (CRAN, the Comprehensive R Archive Network) set you up with pretty much every type of tool you’d need to accomplish any task in Big Data processing. From analysis to data visualization, R makes it all doable. It can be integrated seamlessly with Apache Hadoop, Apache Spark and most other popular frameworks used to process and analyze Big Data.

The easiest flaw to find with R as a Big Data programming language is that it’s not much of a general-purpose language. Code written in R is generally not production-deployable and has to be translated to some other programming language like Python or Java. For building statistical models for Big Data analytics, however, R is hard to beat overall.

Advantages of R for big data:

  • Ideally designed for data science
  • Support for Hadoop and Spark
  • Strong statistical modelling and visualization capabilities
  • Support for Jupyter notebooks

Java

Java is the proverbial ‘old reliable’ as a programming language for big data. Many of the traditional Big Data frameworks, like Apache Hadoop and the collection of tools within its ecosystem, are written in Java and still used in many enterprises today. This goes along with the fact that Java is the most stable and production-ready language of the four we’ve covered so far.

Java’s primary advantage is its large ecosystem of tools and libraries for interoperability, monitoring and much more, the bulk of which have already proven trustworthy.

Java’s verbosity is its primary drawback. Having to write hundreds of lines of code in Java for a task that would require only 15-20 lines in Python or Scala is a big minus for many developers. The lambda functions introduced in Java 8 do counter this somewhat. Another consideration is that Java does not support iterative development the way newer languages like Python do. It is expected that future releases of Java will address this, however.

Java’s history and the continued reliance on traditional Big Data tools and frameworks will mean that Java will never be displaced from a list of preferred Big Data languages.

Advantages of Java for big data:

  • Array of traditional Big Data tools and frameworks written in Java
  • Stable and production-ready
  • Large ecosystem of tried & tested tools and libraries

Go

Last but not least here is Go, one of the programming languages that’s gained a lot of ground recently. Designed by a group of Google engineers who had become frustrated with C++, Go is worthy of consideration simply because it powers many tools used in Big Data infrastructure, including Kubernetes, Docker and several others.

Go is fast, easy to learn, and it is fairly easy to develop – and deploy – applications with this language. What might be more relevant, though, is that as businesses look at building data analysis systems that can operate at scale, Go-based systems are a great fit for integrating machine learning and undertaking parallel processing of data. That other languages can be interfaced with Go-based systems with relative ease is a big plus too.

Advantages of Go for big data:

  • Fast and easy to use
  • Many tools used in the Big Data infrastructure are Go-based
  • Efficient distributed computing

A few other languages deserve honourable mentions here too – Julia, SAS and MATLAB being the most notable. Our five, however, came out ahead on speed, efficiency, ease of use, documentation, or community support, among other things.

Which Language is Best for You?

This really depends on the use case you’ll be developing. If your focus is hardcore data analysis involving a lot of statistical computing, R would likely be your best choice. On the other hand, if your aim is to develop streaming applications, Scala is your best bet. If you’ll be using machine learning to leverage Big Data and develop predictive models, Python is probably best. If you’re building Big Data solutions with traditionally-available tools, you shouldn’t stray from the old faithful – Java.

Combining the power of two languages to get a more efficient and powerful solution might be an option too. For example, you can train your machine learning model in Python and then deploy it with Spark in distributed mode. All of this will depend on how efficiently your solution is able to function and, more importantly, how speedily and accurately it’s able to work.
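
Here’s a rough sketch of that two-language combination in Python – a tiny scikit-learn model trained locally and then used to score a Spark DataFrame through a pandas UDF; the data and column names are invented for illustration:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf

# 1. Train locally in Python on a toy dataset.
model = LogisticRegression().fit([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1])

# 2. Broadcast the fitted model to the Spark executors.
spark = SparkSession.builder.appName("deploy-sklearn-on-spark").getOrCreate()
bc_model = spark.sparkContext.broadcast(model)

@pandas_udf("double")
def predict(feature: pd.Series) -> pd.Series:
    # Each executor scores its partition of rows with the broadcast model.
    return pd.Series(bc_model.value.predict(feature.to_frame()).astype(float))

# 3. Apply the model to a distributed DataFrame.
df = spark.createDataFrame([(0.5,), (1.5,), (2.5,)], ["feature"])
df.withColumn("prediction", predict("feature")).show()
```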

 

The Surprising Ways We Can Learn About Cybersecurity from Public Wi-Fi

A discussion of cybersecurity isn’t exactly a popular topic of conversation for most people, but those same people would likely gush at length if asked about how fond of public wi-fi connections they are! That’s a reflection of our modern world it would seem; we’re all about digital connectivity, but the potential for that connectivity to go sour on us is less of a focus of our attention. That is until it actually does go sour on you, of course, at which point you’ll be wondering why more couldn’t have been done to keep your personal information secure.

Here at 4GoodHosting, cybersecurity is a big priority for us the same way it should be for any of the best Canadian web hosting providers. We wouldn’t have it any other way, and we do work to keep abreast of all the developments in the world of cybersecurity, and in particular these days as it pertains to cloud computing. We recently read a very interesting article about how our preferences for the ways we (meaning the collective whole of society) use public wi-fi can highlight some of the natures and needs related to web security, and we thought it would be helpful to share it and expand on it for you with our blog this week.

Public Wi-Fi and Its Perils

Free, public Wi-Fi is a real blessing when mobile data is unavailable or scarce, as is often the case! But few people really know how to articulate exactly what the risks of using public Wi-Fi are and how we can protect ourselves.

Let’s start with this: when you join a public hotspot without protection and begin to access the internet, the packets of data moving from your device to the router are public and thus open to interception by anyone. Yes, SSL/TLS technology exists, but all that’s required for a cybercriminal to snoop on your connection is some relatively simple Linux software that he or she can find online without much fuss.

Let’s take a look at some of the attacks that you may be subjected to due to using a public wi-fi network on your mobile device:

Data monitoring

Wi-Fi adapters are usually set to ‘managed’ mode, where the adapter acts as a standalone client connecting to a single router for Internet access, and the interface ignores all data packets except those explicitly addressed to it. However, some adapters can be configured into other modes. In ‘monitor’ mode, the adapter captures all wireless traffic on a given channel, no matter who the source or intended recipient is. In monitor mode the adapter is also able to capture data packets without being connected to a router, letting it sniff and snoop on every piece of data it can get its hands on.

It should be noted that not all commercial Wi-Fi adapters are capable of this, as it’s cheaper for manufacturers to produce models that handle ‘managed’ mode exclusively. Still, should someone get their hands on one and pair it with some simple Linux software, they’ll then be able to see which URLs you are loading plus the data you’re providing to any website not using HTTPS – names, addresses, financial accounts, etc. That’s obviously going to be a problem for you.

Fake Hotspots

Snaring unencrypted data packets out of the air is definitely a risk of public Wi-Fi, but it’s certainly not the only one. When connecting to an unprotected router, you are giving your trust to the supplier of that connection. Usually this trust is fine – your local Tim Hortons probably takes no interest in your private data. However, being careless when connecting to public routers means that cybercriminals can easily set up a fake network designed to lure you in.

Once this illegitimate hotspot has been created, all of the data flowing through it can then be captured, analysed, and manipulated. One of the most common choices here is to redirect your traffic to an imitation of a popular website. This clone site will serve one purpose; to capture your personal information and card details in the same way a phishing scam would.

ARP Spoofing

The reality unfortunately is that cybercriminals don’t even need a fake hotspot to mess with your traffic.
Every device on a Wi-Fi or Ethernet network has a unique MAC address. This is an identifying code used to ensure data packets make their way to the correct destination. Routers and other devices discover this information via the Address Resolution Protocol (ARP).

Take this example: your smartphone sends out a request asking which device on the network is associated with a certain IP address. The requested device then provides its MAC address, ensuring the data packets are physically directed to the correct location. The problem is that ARP replies can be impersonated, or spoofed. Your smartphone might send a request for the address of the public Wi-Fi router, and a different device will answer with a false address.

Provided the signal of the false device is stronger than that of the legitimate one, your smartphone will be fooled. Again, this can be done with simple Linux software.

Once the spoofing has taken place, all of your data will be sent to the false router, which can subsequently manipulate the traffic however it likes.
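
One simple way to spot-check your own machine for this – sketched here for Unix-like systems, since `arp -a` output formatting varies by OS – is to scan the ARP cache for any MAC address answering for more than one IP, a common symptom of spoofing:

```python
import re
import subprocess
from collections import defaultdict

# Dump the system ARP cache; format differs between macOS, Linux and Windows.
output = subprocess.run(["arp", "-a"], capture_output=True, text=True).stdout

ip_pattern = re.compile(r"\((\d{1,3}(?:\.\d{1,3}){3})\)")
mac_pattern = re.compile(r"([0-9a-f]{1,2}(?::[0-9a-f]{1,2}){5})", re.IGNORECASE)

mac_to_ips = defaultdict(set)
for line in output.splitlines():
    ip, mac = ip_pattern.search(line), mac_pattern.search(line)
    if ip and mac:
        mac_to_ips[mac.group(1).lower()].add(ip.group(1))

for mac, ips in mac_to_ips.items():
    if len(ips) > 1:
        print(f"Warning: {mac} is answering for multiple IPs: {sorted(ips)}")
```

It’s only a heuristic – and no substitute for a VPN – but a gateway whose MAC address suddenly changes or shows up against several IPs is worth being suspicious of.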

MitM – ‘Man-in-the-Middle’ Attacks

A man-in-the-middle attack (MITM) is a reference to any malicious action where the attacker secretly relays communication between two parties, or alters it for whatever malevolent reason. On an unprotected connection, a cybercriminal can modify key parts of the network traffic, redirect this traffic elsewhere, or fill an existing packet with whatever content they wish.

Examples of this could be displaying a fake login form or website, changing links, text, pictures, or more. Unfortunately, this isn’t difficult to do; an attacker within reception range of an unencrypted wi-fi point is able to insert themselves all too easily much of the time.

Best Practices for Securing your Public Wi-Fi Connection

The ongoing frequency of these attacks definitely serves to highlight the importance of basic cybersecurity best practices. Following these will counteract most public Wi-Fi threats effectively.

  1. Have Firewalls in Place

An effective firewall will monitor and block any suspicious traffic flowing between your device and a router. Yes, you should always have a firewall in place and your virus definitions updated as a means of protecting your device from threats you have yet to come across.

While it’s true that properly configured firewalls can effectively block some attacks, they’re not a 100% reliable defender, and you’re definitely not exempt from danger just because of them. They primarily help protect against malicious traffic, not malicious programs, and one of the most frequent instances where they don’t protect you is when you are unknowingly running malware. Firewalls should always be paired with other protective measures, with antivirus software being the best of them.

  2. Software updates

Software and system updates are also biggies, and should be installed as soon as you can do so. Staying up to date with the latest security patches is a very proven way to have yourself defended against existing and easily-exploited system vulnerabilities.

  3. Use a VPN

Whether you’re a regular user of public Wi-Fi or not, a VPN is an essential security tool that you can put to work for you. VPNs serve you here by generating an encrypted tunnel that all of your traffic travels through, ensuring your data is secure regardless of the nature of the network you’re on. If you have reason to be concerned about your security online, a VPN is arguably the best safeguard against the risks posed by open networks.

That said, free VPNs are not recommended, because many of them have been known to monitor and sell users’ data to third parties. You should choose a service provider with a strong reputation and a strict no-logging policy.

  4. Use common sense

You shouldn’t fret too much over hopping onto public Wi-Fi without a VPN, as the majority of attacks can be avoided by adhering to a few tried-and-true safe computing practices. First, avoid making purchases or visiting sensitive websites like your online banking portal. In addition, it’s best to stay away from any website that doesn’t use HTTPS. The popular browser extension HTTPS Everywhere can help you here. Make use of it!

The majority of modern browsers also now have in-built security features that are able to identify threats and notify you if they encounter a malicious website. Heed these warnings.

Go ahead and make good use of public Wi-Fi and all the email checking, web browsing, and social media socializing goodness it offers – just be sure that you’re not putting yourself at risk while doing so.

Determining a Domain Name’s Worth

All of us have heard the stories of people who smartly purchased the rights to domain names they foresaw being in demand in the future, then sold them for a tidy profit some time later. Then there was the well-publicized story of a former Google employee who owned google.com for a whole minute and was handsomely rewarded by the Internet giant for giving it back to them in 2015. That same year Google became a subsidiary of Alphabet, and they wisely nipped any problem in the bud by acquiring abcdefghijklmnopqrstuvwxyz.com shortly thereafter.

Here at 4GoodHosting, we register many new domain names for clients every month as a Canadian web hosting provider that offers the service free with our web hosting packages. If you’ve identified the perfect domain for your website, you can request it right here – https://4goodhosting.com/domain-name – and provided it’s available we can secure it for you. For those of you who have ever wondered about the dollar value of your domain name, you might be surprised to learn that you can actually come to an approximate valuation of it with a few online tools.

Even if your domain name is the most obscure one imaginable and would almost certainly never be in demand, this is quite interesting to learn more about.

Domain Hoarding?

The first thing to understand here is that there are hundreds of thousands of domain names that have been registered but do not have a website attached to them. Nearly all of them have been acquired by individuals who see the possibility of selling them in the future. There are some very prominent examples of this, like when the Expedia group paid $11 million for Hotels.com, or the person who registered FB.com receiving millions for it.

If your domain is not unique and describes the nature of your business, or uses a term that would apply to similar businesses or ventures elsewhere, then there may be resale value to the domain name. In some instances, there will be individuals who are willing to pay to assume ownership of it. Most of the time they’ll reach out to the owner by having their web hosting provider contact yours, and in rarer instances the domain owner will be aware of growing interest in the domain and put it ‘up for sale.’

What Makes a Domain Name Valuable?

For the most part a domain name is only worth as much as someone is willing to pay for it. For some domains, however, there are certain attributes that might make it have greater value:

  • Length – Shorter domain names tend to be easy to remember and require less effort to type them into a browser. Generally speaking, shorter domains tend to be worth more than longer ones.
  • Number of words – One-word domain names are always the most valuable, but combining 2 words to make business names (LinkedIn, Facebook) is a trend and has led to 2-word domain names being worth more too. Combining 3 words is almost unheard of and not recommended, so this type of domain would be by and large useless.
  • Accurate spelling – It’s true that some big brands will buy up domain names that are similarly spelled to their primary domain name. A popular domain name that’s correctly spelled will have more resale value if it is ever made available.
  • Domain name age and activity – Domains that have been live and accessible for a long time will come with built-in SEO attributes. This of course gives them significant value, with whoever buying the name not having to work as hard to get favourable search engine results from it.
  • TLD – A top-level domain (TLD) is extremely important to the value of any domain name. This is the suffix of your domain name, and with .com domains being the most common and popular, it is these that are the most valuable. Acquiring one will cost more than the .org or .net version of the same domain name. Niche TLDs like .pizza will typically have little to no resale value.

Finding Out If a Domain Has Value

There are a handful of domain name appraisal services online, and most won’t cost you anything to use them. Do keep in mind that the values these services place on domains are only approximations, so don’t take any valuation provided by them to be a 100% reliable estimation of what a domain name is worth.

Free Valuator is the best among them in our opinion. You can get a value estimation for a domain name in a matter of seconds and they can also introduce you to a professional domain name value assessor if you are considering making your domain name available. Estibot is another one, and it gets a mention here because it uses a different approach to determining how much a website name is worth. It actually uses mathematical models to calculate the value of a domain name.

What’s Next?

After you’ve checked the value of your domain name, you have 2 options; if it’s valuable you can go ahead and make it available for sale. Putting it on a domain auction site, like the one at GoDaddy, is a popular choice. Alternately, you might want to contact a professional domain name broker. They’ll have the knowledge and connections to get you the biggest $ return for your domain name. This is the best course of action if you think a big brand might want your domain name.

Next, if you think the value is bound to be greater in the future then you could sit tight and wait to see if that happens. If this is what you choose to do then taking steps to improve the value of your domain name, like adding content to your site and other approaches to boost its SEO value, is a smart move.

Have a domain name that’s estimated to be much more valuable than you thought? We’d like to hear about it here.

New Blockchain Development Kit Arrives from Microsoft

Blockchain isn’t exactly a household name in the digital commerce world – yet – but for those of us on the inside track it’s already well established as the next big thing as far as grand-scale transactional computing is concerned. For those who aren’t familiar with it, we’ll explain briefly here: blockchain is a shared distributed ledger technology where each transaction is digitally signed to ensure its authenticity and integrity. From a ‘what does that mean for me’ perspective, it’s a new and very powerful means of upping security for digital transactions as well as ensuring pinpoint accuracy.

Right, now that we’ve got the basic explanation out of the way we’re going to come at this from an angle that’s designed for those of you already very much in the know regarding blockchain. Here at 4GoodHosting, we’re like any leading Canadian web hosting provider in that a good many of our customers have ecommerce websites where secure transactions are an absolute priority. As such, blockchain can’t arrive in full soon enough and that’s why recent news from Microsoft is very promising.

Microsoft is about to offer a new serverless blockchain development kit powered by its intelligent cloud platform – Azure. As of now it’s being called the ‘Azure Blockchain Development Kit’ and the aim with it is to facilitate seamless integration of blockchain with the best of Microsoft and other third-party SaaS offerings. The Principal Program Manager at Microsoft states that it will enable users to build key management, off-chain identity and data monitoring and messaging APIs into reference architectures that can be used to quickly build blockchain applications.

It is expected to have 3 major capabilities:

  • Integrating data and systems
  • Connecting interfaces
  • Deploying smart contracts and blockchain networks

It should enable organizations and individuals to connect to blockchain through familiar user interfaces. The development kit will come ready with voice and SMS interfaces, Internet of Things device integration, support for Android and iOS mobile clients, virtual assistants, and bots. Voice and SMS interfaces promise to be especially useful to developers building tracking and supply chain solutions.

In addition, it will be compatible with other ledger technologies like Ethereum and Bitcoin.

Alongside the new Azure Blockchain Development Kit, Microsoft is also announcing the release of a set of Flow Connectors and Logic Apps for blockchain. The Ethereum blockchain connector will allow users to call contract actions, deploy contracts, trigger other Logic Apps, and read contract state. This is important because end-to-end blockchain solutions require integration with data, software, and media that live 'off chain', as this state is referred to.
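To give a sense of what 'reading contract state' means at the code level, here is a generic sketch using the web3.js library directly. This is not the API of the Azure kit or the Logic Apps connector – the RPC endpoint, contract address, and ABI fragment below are placeholder assumptions.

    import Web3 from "web3"; // npm install web3

    // Placeholder values - substitute your own node endpoint, contract address, and ABI.
    const RPC_URL = "https://your-ethereum-node.example"; // hypothetical endpoint
    const CONTRACT_ADDRESS = "0x0000000000000000000000000000000000000000";
    const ABI = [
      // minimal ABI fragment describing one read-only function
      { name: "totalSupply", type: "function", stateMutability: "view", inputs: [], outputs: [{ name: "", type: "uint256" }] },
    ] as const;

    async function readContractState(): Promise<void> {
      const web3 = new Web3(RPC_URL);
      const contract = new web3.eth.Contract(ABI, CONTRACT_ADDRESS);
      // A 'call' reads on-chain state without sending a transaction or paying gas.
      const supply = await contract.methods.totalSupply().call();
      console.log(`totalSupply: ${supply}`);
    }

    readContractState().catch(console.error);

The point of the connectors is that Logic Apps and Flow users get this kind of read/deploy/call capability without writing the plumbing themselves.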

This new product from Microsoft is the next step in its quest to make blockchain more accessible and affordable for anyone who has an idea of what they might be able to do with it. Because it is based on a serverless architecture, the expectation is that it will further reduce costs, making it accessible to every blockchain enthusiast, ISV, and enterprise.

The Kit is built atop Microsoft's investments in blockchain technology and connects to Azure's compute, storage, data, and integration services, which have already proven reliable. Over recent years, Microsoft has been working on extending the use of blockchain and related distributed ledger technologies. The idea is that new digital identities will eventually come together to promote greater personal security, privacy, and control.

It should be mentioned that other major players (Google most notably) have also recently launched similar blockchain development kits. What remains to be seen is what developers think of Microsoft's offering and how practical it is for them.

Microsoft has a white paper on how to deploy any decentralized application using the Azure Blockchain Development Kit. You can download it here, and overall the development of Blockchain is definitely something worth keeping tabs on as it continues to change the landscape of the ecommerce world.

 

Testing – And Improving – Page Speed for More Responsive Sites

In all the recent hubbub about HTTPS, GDPR regulation, and the like, there's been some degree of neglect of the importance of website loading speeds. Most people behind a website won't need to be told what bounce rates are, or that in general people tend to be just as impatient when it comes to viewing a website as they are with nearly everything else in their lives. Page speed has been a part of the Google algorithm for many years; in fact, it's been a big deal for the better part of 10 years now.

Here at 4GoodHosting, the nature of our business and the fact that we're a Canadian web hosting provider with our thumb on the pulse of the web hosting industry mean we really grasp the importance of issues like these when it comes to website performance. We're 10 months removed from Google starting to educate us all about how page speed matters for the user experience. The focus has of course shifted to mobile search in a big way, and again that's quite natural given the way mobile has become the predominant search method.

At the start of 2018 Google announced its 'speed update', saying that it would only affect a small percentage of sites that were offering a painfully slow user experience. Most people have gotten on board with it sufficiently over the last year, but for those who have yet to, let's spend today discussing how to test and improve website page speed.

How To Test Your Site

There are plenty of choices when it comes to online services you can use to gauge how quick your site is, but Google's two are really all you need to consider here. First up is PageSpeed Insights, which provides a reasonably accurate overview of how your site is performing along with some things you can do to improve it. What we've learned from it is that render-blocking resources (scripts and stylesheets that have to load before the rest of the page can be displayed) are the culprit most of the time. This issue isn't easy to remedy, but it's one you have to tackle.

If mobile is your primary focus, then Google's mobile-specific test is the one for you. It compares your site to other mobile sites and delivers a percentage score. Keep in mind that for both of these tools the numbers are estimates; while they'll likely be fairly close to accurate, you shouldn't take them as definitive findings.
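If you'd rather script these checks than run them by hand, PageSpeed Insights is also available as an HTTP API. Here's a minimal sketch in TypeScript (Node 18+ for the built-in fetch); the v5 endpoint is Google's documented one, but verify the current parameters, response fields, and API-key requirements against their docs before depending on them:

    // Query the PageSpeed Insights v5 API for a performance score.
    async function checkPageSpeed(url: string, strategy: "mobile" | "desktop" = "mobile"): Promise<void> {
      const endpoint = new URL("https://www.googleapis.com/pagespeedonline/v5/runPagespeed");
      endpoint.searchParams.set("url", url);
      endpoint.searchParams.set("strategy", strategy);
      // endpoint.searchParams.set("key", "YOUR_API_KEY"); // optional API key for higher quotas

      const res = await fetch(endpoint.toString());
      if (!res.ok) throw new Error(`PageSpeed request failed: ${res.status}`);
      const data = await res.json();

      const score = data.lighthouseResult?.categories?.performance?.score; // 0..1
      console.log(`${url} (${strategy}) performance score: ${score !== undefined ? Math.round(score * 100) : "n/a"}`);
    }

    checkPageSpeed("https://www.example.com").catch(console.error);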

This leads to the next part of our discussion here – tips you can implement to improve your page loading speeds.

How To Improve Your Page Load Times

There's much you can do to speed up your site. Sometimes you'll be addressing platform-specific problems, while in other instances they will be more general issues. Some of these changes you can implement yourself, but for others you may need to bring in someone more web-savvy than yourself.

  1. Better Hosting

Inexpensive shared hosting means your site is on a server filled with other domains like yours. This often leads to a slower site due to a lack of available resources on the server. The simple fix is to move to better hosting. A dedicated server or VPS hosting in Canada is an option, but for many smaller sites and interests it's going to be an expensive and really unnecessary solution. However, it should be something to consider if shared hosting is behind your website's slow page loading times.

  2. Optimize Your Images

Plain and simple, compressing your images and reducing their size is the easiest and arguably the most effective way to improve page load speed times. Optimization can be done in an offline editor or with an online service; one of the best is kraken.io, which in our opinion is better than the Adobe compression tool for smaller image sizes.
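If you'd rather batch-compress images yourself before uploading them, a Node.js library like sharp can do it in a few lines. The file names, maximum width, and quality setting below are example assumptions – tune them for your own site:

    import sharp from "sharp"; // npm install sharp

    // Resize and recompress a source image for the web.
    async function optimizeImage(input: string, output: string): Promise<void> {
      await sharp(input)
        .resize({ width: 1600, withoutEnlargement: true }) // cap the width, never upscale
        .jpeg({ quality: 75, mozjpeg: true })              // recompress as a smaller JPEG
        .toFile(output);
    }

    optimizeImage("hero-original.jpg", "hero-web.jpg").catch(console.error);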

  3. Cache Your Site For Speed Gains

Caching your site can speed it up enormously. When you cache a page, the server keeps a snapshot of it handy and is able to deliver it to the visitor much more quickly than it could by rebuilding the page each time. This can be done in numerous different ways; WordPress users can do it with the W3 Total Cache plugin, which has a large number of options you should familiarize yourself with.
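Conceptually, page caching just means keeping a rendered copy of each page and serving it until it goes stale. Plugins like W3 Total Cache handle this for you, but a bare-bones sketch of the idea looks something like this (the render callback and the five-minute TTL are placeholders):

    // Bare-bones page cache: keep a rendered snapshot per URL and reuse it until a
    // time-to-live (TTL) expires. Real caching plugins add invalidation rules,
    // disk or object storage, and cache busting on top of this basic idea.
    type CachedPage = { html: string; expiresAt: number };

    const cache = new Map<string, CachedPage>();
    const TTL_MS = 5 * 60 * 1000; // 5 minutes - an arbitrary example value

    async function getPage(url: string, render: (url: string) => Promise<string>): Promise<string> {
      const hit = cache.get(url);
      if (hit && hit.expiresAt > Date.now()) {
        return hit.html;                  // fast path: serve the stored snapshot
      }
      const html = await render(url);     // slow path: build the page from scratch
      cache.set(url, { html, expiresAt: Date.now() + TTL_MS });
      return html;
    }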

  4. Use A Content Delivery Network

Content delivery networks assume an extremely valuable role in the internet's infrastructure. A CDN delivers a webpage or any file to a user from the closest geographic location available. The benefits of doing this are that it is far more efficient, conserves bandwidth, protects the origin network, and improves the user experience by delivering the asset more quickly.

CDNs are fairly commonplace now, with estimates suggesting that 40% of all sites are using one. The best ones will be able to offer speed gains and protection from DDoS attacks.
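A CDN can only serve from its edge locations what your origin allows it to cache, so part of the job is sending sensible caching headers for static assets. Here is a small sketch for a Node/Express origin (the path, max-age, and port are example assumptions; fingerprint your file names so updated assets get new URLs):

    import express from "express"; // npm install express

    const app = express();

    // Serve static assets with a long max-age so the CDN (and browsers) can cache
    // them at the edge instead of hitting the origin on every request.
    app.use("/static", express.static("public", {
      maxAge: "365d",   // example value - safe only with fingerprinted file names
      immutable: true,
    }));

    app.listen(3000);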

  5. Minimize The Number Of HTTP Requests

An onslaught of HTTP requests – requests for information from your server – can overwhelm a website. When someone visits your site, their browser requests the various files needed to load the page. Most of these requests happen sequentially, so more external files mean more requests, and that means a slower load time for that user.

One tip here is to take all of your CSS files and combine them into a single file. This can be done with JavaScript files too. Consolidating as many files as possible to reduce the number of HTTP requests is highly recommended.
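As a sketch of that consolidation step (build tools like webpack, or your CMS's minification plugin, normally do this for you; the file names here are examples):

    import { readFile, writeFile } from "fs/promises";

    // Concatenate several stylesheets into one file so the browser makes a single
    // request instead of one per file. The same approach works for JavaScript,
    // though script ordering then matters.
    async function bundleCss(inputs: string[], output: string): Promise<void> {
      const parts = await Promise.all(inputs.map((file) => readFile(file, "utf8")));
      await writeFile(output, parts.join("\n"));
    }

    bundleCss(["reset.css", "layout.css", "theme.css"], "site.bundle.css").catch(console.error);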

  6. Disable Hotlinking

Hotlinking is when other sites leech your image content: visitors to another site receive an image loaded from your server. This means your monthly bandwidth is effectively being stolen, but the fix is quick, easy, and effective – on an Apache server, edit your .htaccess file to block requests with foreign referers.
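Those .htaccess rules (Apache's mod_rewrite) boil down to a referer check. If your site is served by Node.js rather than Apache, the same idea can be expressed as Express middleware – a rough sketch, with placeholder domain names, and keeping in mind that the Referer header can be absent or spoofed, so treat this as a deterrent rather than hard security:

    import express from "express"; // npm install express

    const app = express();
    const ALLOWED_HOSTS = ["example.com", "www.example.com"]; // your own domains - placeholders

    // Block image requests whose Referer points at some other site.
    app.use("/images", (req, res, next) => {
      const referer = req.get("referer");
      if (referer) {
        try {
          const host = new URL(referer).hostname;
          if (!ALLOWED_HOSTS.includes(host)) {
            return res.status(403).send("Hotlinking not permitted");
          }
        } catch {
          // malformed Referer - let it through rather than break legitimate visitors
        }
      }
      next(); // no Referer (direct visits, some browsers) or our own site: allow
    });

    app.use("/images", express.static("public/images"));
    app.listen(3000);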

  7. Serve Your Pages Via AMP

Google has put a lot of time and effort into improving the web for mobile, and it has pushed websites to improve the user experience when it's diminished by a slow connection. One of its recent innovations is AMP – Accelerated Mobile Pages. These pages load faster and serve ads faster as well, which benefits visitors dealing with a slower internet connection. AMP pages use AMP HTML, a special JavaScript library, and a cache to serve pages.

AMP pages have been well received by larger news sites, which have a specific need to serve pages as quickly as possible.

  8. Use An External Commenting System

Having an engaged user base is very desirable, but how commenting is set up on the site can be an issue here. It can be a real problem for on-page SEO, and it can lead to the page loading much more slowly. A popular fix is to 'lazy load' the comments, so that the page doesn't serve up this user-generated content to Google's web crawlers and instead only shows it to real visitors.
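The 'lazy load' approach boils down to not fetching the comments until the reader actually scrolls near them. Here is a browser-side sketch using IntersectionObserver – the element ID and the comments endpoint are placeholder assumptions for whatever your commenting setup provides:

    // Load the comments only when their placeholder scrolls into view.
    const commentsSection = document.querySelector<HTMLElement>("#comments");

    if (commentsSection) {
      const observer = new IntersectionObserver(async (entries, obs) => {
        if (!entries.some((entry) => entry.isIntersecting)) return;
        obs.disconnect();                                   // only ever load once
        const res = await fetch("/api/comments?post=123");  // hypothetical endpoint
        commentsSection.innerHTML = await res.text();       // inject the rendered comments
      }, { rootMargin: "200px" });                          // start loading a little before it's visible

      observer.observe(commentsSection);
    }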

Another fix for commenting problems is to use an external commenting service; adding Disqus to your platform is a good choice here.