Tim Berners-Lee Interview: his recent thoughts on the current state of the Internet and its future.

Reading Time: < 1 minute

The man who played the central hand in inventing the HTTP protocol for the then-forthcoming World Wide Web recently shared his thoughts on some key issues facing the current and future Internet. Mr. Berners-Lee was tasked with helping computers communicate over a network; he and others created the protocol that transmits web pages between computers over a distributed IP network.


Mr. Berners-Lee worked as a contractor at CERN from June – December 1980. While there, he proposed a project based on the concept of “hypertext” to facilitate sharing and updating information among researchers. To demonstrate it, he built a prototype system named “Enquire”.

After leaving CERN in late 1980, he went to work at John Poole’s Image Computer Systems, Ltd, in Bournemouth, Dorset, where he ran the company’s technical side for three years. The project he worked on was a “real-time remote procedure call”, which gave him experience in computer networking. In 1984, he returned to CERN as a fellow. By 1989, CERN was the largest Internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the Internet.

“I just had to take the hypertext idea and connect it to the Transmission Control Protocol (TCP) and domain name system ideas and ‘ta-da!’ the World Wide Web … Creating the web was really an act of desperation, because the situation without it was very difficult when I was working at CERN later. Most of the technology involved in the web, like the hypertext, like the Internet, multi-font text objects, had all been designed already. I just had to put them together. It was a step of generalizing, going to a higher level of abstraction, thinking about all the documentation systems out there as being possibly part of a larger imaginary documentation system.”

Sir Tim Berners-Lee surprised attendees at a recent independent film festival in California. He spoke about net neutrality and the scandal surrounding the “elusive promise” of a truly free Internet. The subject of net neutrality is brought to the forefront in the independent film “For Everyone.Net”, which hopes to put a friendly face on the fight for a free Web. This is why Berners-Lee came to an independent film festival, where his words were recorded.

“We should be able to use the Web without worrying about being spied on and without finding that you can’t get to places because the ISP you use has got a deal with somebody else,” said Berners-Lee. “The incentive whether you’re a government or a company to mess up net neutrality is control—for example, if you’re an oppressive government and you control the Internet, you can use it in all kinds of nasty ways. If you look at the world through blurry spectacles, you see a slow and uneven march towards more openness. It’s true that when you go to China you can’t get to YouTube, but you can get to Vimeo. You can’t get to Twitter, but you can get to lots of other sites. China charts on the Web index not because they have an excellent record when it comes to censorship, but because they have a massive amount of stuff happening there—a huge amount of e-commerce, a huge amount of social chat,” he explained. “They’re using the Internet and it’s having a big positive economic effect. I hope that what will happen, bit by bit, is each country realizes it needs to play a part, just from a purely economic point of view. Companies have to be able to see and understand what the outside world is like, and the outside world has to be able to understand what China is like.”

Internet growth projection summary from the Cisco Visual Networking Index forecast

Reading Time: < 1 minute


This article covers the results of the annual Cisco Visual Networking Index forecast, which analyzes Internet Protocol (IP) networking growth and trends worldwide.

The report quantitatively projects the volume of IP traffic expected to travel public and private networks, including overall Internet traffic, mobile data traffic generated by consumer and business users, and global and regional growth rates for residential, consumer mobile, and business services.

Through 2016, annual global IP traffic is forecast to reach 1.3 ZB (zettabytes), or 1.3 trillion gigabytes. The projected increase in global traffic between 2015 and 2016 alone is more than 330 EB (exabytes), which is almost the total amount of global IP traffic generated in 2011 (369 exabytes).

This level of traffic growth is driven by a number of factors:

  1. More Internet users: By 2016, there are expected to be 3.4 billion Internet users online, about 45% of the world’s projected population.
  2. An increasing number of devices: The ubiquity of mobile phones, tablets, and other smart devices, as well as machine-to-machine (“M2M”) connections, is driving up the demand for connectivity. By 2016, the forecast states there will be nearly 18.9 billion network connections, almost 2.5 connections for each person on earth, compared with 10.3 billion in 2011.
  3. More video: By 2016, 1.2 million video minutes, the equivalent of 833 days (or over two years), will travel the Internet every second.
  4. Faster broadband speeds: The average fixed broadband speed is expected to increase nearly fourfold, from 9 megabits per second (Mbps) in 2011 to 34 Mbps in 2016.
  5. Wi-Fi growth: By 2016, over half of the world’s Internet traffic is expected to come from Wi-Fi connections.

Total expected growth in ‘bytes’:

Global IP traffic is expected to reach 1.3 ZB (zettabytes) per year, or 110 exabytes per month, by 2016; nearly a fourfold increase from approximately 31 exabytes per month in 2011.

Average global IP traffic in 2016 is expected to reach 150PB (petabytes) per hour, the equivalent of 278 million people streaming an HD movie (at an average streaming speed of 1.2 Mbps) simultaneously.
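
That streaming equivalence is simple arithmetic to sanity-check: divide 150 PB per hour by what one 1.2 Mbps stream consumes in an hour (assuming decimal units, i.e. 1 PB = 10^15 bytes, as traffic reports typically use):

```python
# Sanity check: how many simultaneous 1.2 Mbps HD streams fit in 150 PB/hour?
PB = 10**15                               # bytes in a petabyte (decimal)
traffic_bits_per_hour = 150 * PB * 8      # convert bytes to bits

stream_rate_bps = 1.2 * 10**6             # 1.2 megabits per second
bits_per_stream_hour = stream_rate_bps * 3600

streams = traffic_bits_per_hour / bits_per_stream_hour
print(f"{streams / 1e6:.0f} million simultaneous streams")  # 278 million
```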

Regional IP traffic trends:

By 2016, the Asia-Pacific region is forecast to generate the most IP traffic (40.5 exabytes per month), maintaining the top spot over North America (27.5 exabytes per month), the second-largest traffic generator.

The fastest-growing IP-traffic regions for the forecast period (2011–2016) are the Middle East and Africa (58% compound annual growth rate, roughly tenfold growth) and Latin America (49% CAGR, roughly sevenfold growth).

At the country level, India is expected to have the highest IP-traffic growth rate, with a 62% CAGR from 2011 to 2016. In a second-place tie, Brazil and South Africa both have 53% CAGRs over the forecast period.
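
The relationship between these CAGR figures and total growth over the five-year window is just compounding, which is easy to verify:

```python
# Total growth factor implied by a compound annual growth rate (CAGR)
def growth_multiple(cagr: float, years: int = 5) -> float:
    """Growth factor after compounding `cagr` annually for `years` years."""
    return (1 + cagr) ** years

print(round(growth_multiple(0.58), 1))  # 9.8  -> Middle East & Africa, roughly tenfold
print(round(growth_multiple(0.49), 1))  # 7.3  -> Latin America, roughly sevenfold
print(round(growth_multiple(0.62), 1))  # 11.2 -> India
```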

Through 2016, the highest traffic-generating countries will be the United States (22 exabytes per month) and China (12 exabytes per month).

Cybersecurity investments by companies in Canada increased 82% over the past year

Reading Time: < 1 minute


Investments in cybersecurity by Canadian companies increased 82% from 2015 into 2016.

However, “security incidents” increased by about 160% over the same time span, as reported at the beginning of 2016 by PricewaterhouseCoopers (PwC).

Cybersecurity spending in 2015 represented about 5% of average overall IT spending within datacenters. The PwC report indicates that Canadian companies are more likely than the global average to employ an information security strategy and/or active monitoring and analysis, or what is generally termed “security intelligence”.

PwC states that all information technology businesses need to better understand the full range of problems that cybersecurity breaches can cause their organizations. Beyond data loss, there are impacts on competitiveness, service quality, finances, reputation, employee retention, and in some cases the health and safety of both employees and the public. The Global State of Information Security (GSIS) Survey 2016 indicates a gap in executive understanding of the threat landscape.

In Canada, “cybersecurity insurance” coverage grew to 59% of companies in 2015, the report stated; 54% now use big-data security analytics, and use of cloud-based security services has grown to 64%, on par with global averages for cloud security services.

The PwC Canada report states there are three areas where public- and private-sector organizations are investing in cybersecurity right now: solutions to manage how employees, customers, and third parties access and use data; outsourced “Managed Security Services” to monitor and detect security events more efficiently; and “data privacy compliance” for mandatory breach notifications.

4GoodHosting employs the most appropriate, and always the latest, versions of our cybersecurity suite to guard our customers’ web servers. 4GoodHosting has never had a data breach, or servers brought down by outside hackers, and we intend to keep it that way. The security and availability of our customers’ websites and servers is always our top concern.

“HaLow” is a new Wi-Fi standard that offers double the range

Reading Time: 2 minutes


HaLow There!

“HaLow” (pronounced “halo”) is a new Wi-Fi standard that offers double the range and uses half as much power; or, succinctly, “low-power, long-range Wi-Fi”.
( 802.11ah standard )

The Wi-Fi Alliance has finally approved the eagerly anticipated 802.11ah Wi-Fi standard and nicknamed it “HaLow”. It uses less power, has better wall penetration, and can be thought of much like a long-range Bluetooth signal. Basically, HaLow is all about low-power, long-range Wi-Fi, which is critical to the production of small, affordable smart devices.

“HaLow”-enabled devices will operate at a lower frequency, in the unlicensed 900 MHz band (incorporating the IEEE 802.11ah protocol standard), farther from the microwave-oven frequencies around 2.4 GHz. This lower frequency is also what extends the range beyond that of a typical router or device on the current 2.4 GHz standard, uses less power, and provides better wall penetration.
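
The range claim follows from basic radio physics: free-space path loss grows with frequency, so at 900 MHz a signal loses about 8.5 dB less than at 2.4 GHz over the same distance. A quick sketch of the idealized numbers (free-space model only; real walls, antennas, and interference change the results considerably):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# At any fixed distance, the 900 MHz band has less loss than 2.4 GHz:
advantage_db = fspl_db(100, 2.4e9) - fspl_db(100, 900e6)
print(round(advantage_db, 1))  # 8.5 dB in favor of 900 MHz

# In free space the same link budget therefore reaches the frequency ratio farther:
print(round(2.4e9 / 900e6, 2))  # 2.67x the range
```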

The new Wi-Fi standard is seen as essential for the Internet of Things (IoT) and various futuristic connected-home devices. The biggest obstacle has been power: gadgets like home-security sensors, smart bulbs, cameras, smartwatches, connected cars, digital healthcare products, and wearables, as well as devices in the industrial and retail sectors, have had to supply enough power to send data long distances, constantly, to remote hubs or routers. Today’s most prevalent Wi-Fi standard, in the 2.4 GHz spectrum, doesn’t allow for long battery life at such transmission distances.

4GoodHosting’s New Year 12 Top Website Script Recommendations

Reading Time: 3 minutes


WordPress

is web software you can use to create a beautiful website or blog. We like to say that WordPress is both free and priceless at the same time. The core software is built by hundreds of community volunteers, and when you’re ready for more, there are thousands of plugins and themes available to transform your site into almost anything you can imagine. Over 60 million people have chosen WordPress to power the place on the web they call “home”; we’d love you to join the family.

Abante Cart

is a free, PHP-based eCommerce solution that gives merchants the ability to create an online business and sell products online quickly and efficiently. The AbanteCart application is built and supported by experienced enthusiasts who are passionate about their work and their contribution to the rapidly evolving eCommerce industry. AbanteCart is more than just a shopping cart; it is a rapidly growing eCommerce platform with many benefits.

Booked

(formerly phpScheduleIt) is a simple but powerful reserve-anything scheduler. With flexible layouts, custom rules, a powerful administrative backend, and an unbelievably simple user experience, Booked can fit almost any need. From conference rooms to lab equipment to airplanes – it’s Booked.

If the service is free, you are the product!

Reading Time: 5 minutes

The Resistance
They are the digital dissenters.
They see tech companies tracking our every move.
They want to go back to the basics – to a world where the interests of
humans come before robots, algorithms and the needs of Silicon Valley.

“Techno-skeptics” is the current name given to a growing minority of such people. They are emerging today because they don’t want to see a “dystopian” future (dark, “1984”-ish, a police state; the opposite of “utopian”) form around all of humanity, as we are seemingly constantly marketed to and driven to adopt ever more digital technology. “Humanists” may be the most fitting new term.

The point being made by the techno-skeptics is that our culture, the way it is being influenced, seems to take less pleasure in simple, wholesome, family- or community-oriented matters, while having an insatiable appetite for ever more digital entertainment technology: ever more “feature-rich” cell phones, mobile “wearables”, and even wired clothing products that somehow interface us to the Internet of Things.

This deep-thinking minority group of “humanists” has grown to believe that we humans, and our most important needs and sense of priority, are getting lost in the current technology frenzy. Are we developing technology, such as artificial intelligence and robots, that is destined to replace most of the human workforce?

These techno-doubters are even going so far as to address their views in public, as their thinking is that “too much of a good thing” will have the adverse effect of re-arranging priorities away from what is actually more important: health, sanity, and family. They believe our society is being fundamentally changed for the worse, and it seems they have some good points, so their movement is growing and gaining some notoriety. Many of them also believe we are being slowly damaged biologically by the microwave radio signals that our cell phones, laptops, and office and home wireless routers constantly emit. If there is truth to that, we should indeed go back to the drawing board and invent safer methods of wireless data transmission.

IPv6 – The future of Internet IP addressing…

Reading Time: 5 minutes


An IP (Internet Protocol) address is basically a postal address for each and every Internet-connected device. Without one, websites would not know where to send information each time you perform a search or try to access a website. IPv4 offers only about 4.3 billion IP addresses (specifically 4,294,967,296), in the familiar dotted format x.x.x.x, where each of the four numbers ranges from 0 to 255. Techniques such as Network Address Translation (NAT) extended the life of IPv4, because NAT allows multiple devices to connect to the Internet through a single IP address, with the router in a particular household or business keeping track of which local devices are receiving and sending data packets. But without IPv6, the Internet’s expansion and innovation could be limited, and the underlying infrastructure could theoretically become increasingly complex to manage; so a more expansive addressing protocol has been deemed necessary.
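
The bookkeeping NAT performs can be pictured as a translation table that maps each public port back to the private device that opened the connection. Here is a toy sketch (illustrative only; real NAT rewrites packets inside the router, and the addresses below are example values):

```python
# Toy port-based NAT table: many private devices share one public IP
public_ip = "203.0.113.7"   # example/documentation address
nat_table = {}              # public_port -> (private_ip, private_port)
next_port = 40000

def outbound(private_ip: str, private_port: int) -> tuple:
    """Record an outbound flow; return the (public_ip, public_port) it maps to."""
    global next_port
    nat_table[next_port] = (private_ip, private_port)
    mapping = (public_ip, next_port)
    next_port += 1
    return mapping

def inbound(public_port: int) -> tuple:
    """Route a reply arriving on the public IP back to the right device."""
    return nat_table[public_port]

# Two devices in the house share the single public address:
a = outbound("192.168.0.10", 51000)   # ("203.0.113.7", 40000)
b = outbound("192.168.0.11", 51000)   # ("203.0.113.7", 40001)
print(inbound(40001))                 # ("192.168.0.11", 51000)
```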

IPv6, the latest (and possibly the ultimate) addressing protocol, holds 2^128, or 340,282,366,920,938,463,463,374,607,431,768,211,456 (340 billion billion billion billion), IP addresses. That is enough to give at least 4.3 billion IP addresses, the addressing space of the entire current Internet, individually to every person on Earth; in other words, at least 7 billion copies of the current Internet!

Why the IPv6 protocol architects decided on such an enormous address space is unknown; surely 2^64, or 18,446,744,073,709,551,616 (about 18.4 quintillion), addresses would already have been plenty. It can seem excessive on the planners’ part, when each packet could instead have carried 64 bits of extra data. However, if we ever want to give an IP address to every mappable cubic centimeter of Earth’s entire atmosphere, IPv6 will provide future generations that capability, and more.
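
Python’s arbitrary-precision integers make the address-space arithmetic above easy to reproduce:

```python
ipv4_space = 2 ** 32          # 4,294,967,296 addresses
ipv6_space = 2 ** 128         # the full IPv6 address space
world_population = 7 * 10**9  # rough figure used in this article

print(ipv6_space)  # 340282366920938463463374607431768211456

# Full "IPv4 Internets" available per person under IPv6:
per_person = ipv6_space // world_population // ipv4_space
print(per_person > 10**19)  # True: far more than one whole Internet's worth each
```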

What does your website’s “About Page” say to your audience?

Reading Time: 3 minutes


One of the most-visited pages on a website, besides the front page, is the “About [Us]” page. People want to know where a particular website is coming from, so they can better judge and trust the source of information or your business. However, most websites don’t utilize the “About Us” page to build a better relationship with their visitors.

It is often used to present key information about yourself and/or your business. It is a page that can help you form a more personal connection with each of your visitors.

If people can relate to you or have respect for the history of your business, they will be much more inclined to do business with you.

Be Humble

When you elaborate on yourself, your accomplishments, or your business, you should do so in a way that won’t turn off your readers. If you just list your accomplishments, that will likely be off-putting, which will make it harder for them to relate to you.

A way around just listing your accomplishments is to speak of them inside a story; the more humorous, the better.

The focus of your About Page should really be about your visitors

People like to have the focus on themselves as well. If your About page just talks about you and your accomplishments, it will be like having a conversation with somebody who only talks about themselves. So try a different approach and make your “About” page about your visitors. One way to do that is to tell them all about the benefits they will gain if they keep reading your website. The aim is to keep it as interesting as possible; if you address their needs right away, they will stick around.

SEO Clarity into 2016 – Part II

Reading Time: 4 minutes


Part II, as continued from our previous week’s article:
https://4goodhosting.com/blog/seo-clarity-2016/

I. Keyword Research

There has been significant change in SEO methods over the past year, but the fundamentals of keyword research have remained stable.

You still want to identify the words and phrases that prospects most often use to seek out your products and/or services. There are a variety of ways to do this effectively:

· Analyze your website analytics to identify the keywords that are already generating traffic to your site, particularly which keywords are most likely generating sales. Then use that information to find new, related keywords, and to build additional content based on those words.
· Use keyword research tools like Google’s Keyword Planner, Searchmetrics’ Keyword Analysis tool, SEMrush, and Ubersuggest to find popular keywords and phrases that are not overly competitive.
· Focus on longer keywords, as natural-language searching becomes more popular.
· Examine the competition’s top-ranking pages to identify which keywords they are targeting.

J. Link Building



The “old school” concept of link building, which mainly consisted of reaching out to other sites to exchange links, manually submitting your domain to web directories, and placing “spammy” comments and forum posts, is still dead and buried, maybe forever.

Savvy users know that these previous link-building tactics are not only ineffective; they can bring ranking penalties on a site. However, non-linked mentions (so-called “implied links”) are now being considered in the ranking algorithms. This has changed the industry’s perspective on link building, and it has become apparent that building a brand is more important than building links. Business owners and marketers should work towards organically earning links by publishing high-quality content.

While the weight given to backlinks as a ranking factor is changing, they remain significant. Here are the content types that are most likely to accumulate a high number of links:
· Opinion forming journalism
· Research-backed content
· Long-form content over 1,000 words

SEO Clarity into 2016

Reading Time: 4 minutes


This is going to be a two-part article, with the parts about equal in size; Part II comes next week. With so much to cover, we don’t want to overwhelm you, or overwhelm ourselves, by putting it all together at once. Topics are presented in no particular order, as every site needs varying amounts of different things.

Many of these ideas are in alignment with Searchmetrics’ 2015 Ranking Factors report, which placed extra emphasis on optimization for mobile this year.

Perhaps you are still using SEO techniques and strategies from yesteryear, or lessons learned back in 2010? 2015 was a tumultuous year in the world of SEO, and it truly is hard to stay on top of it all. We have witnessed some significant shifts over the past years when it comes to getting your link onto that coveted first page. Successful ranking optimization doesn’t happen automatically, unless you hire a professional, and it is hard to keep up if you don’t have the time or ability to adapt to Google updates and other changes in the wild, wild web.

Part I of this report will cover much of what you will need to know about search ranking optimization in 2016.

A) Keywords within content:

Continue to use your site-specific set of keywords throughout your content. One thing that will likely never change is the basic fact that the search keywords you have identified for your site should be implemented smartly into your content: in the title, in headers and sub-headers, and in introduction and conclusion paragraphs. According to the most recent Searchmetrics report, top-ranking pages increased the total set of identifiable keywords integrated into the body of their page text. Keywords are one of the most important ranking factors, but not the only one; you should devote time to understanding the rest of the techniques listed below and in next week’s Part II.

B) Content structure:

Properly structuring content in the most logical way possible is good both for rankings and for user experience. There are a number of ways to iron out the optimal structure for your content:

  • Use unordered lists (bullet points; <li> tags) to clearly section the page information into more readable chunks.
  • Utilize internal links in your text to guide both visitors and search engines through relevant content on your site. Use of external links may also be beneficial, since Google considers that being a good neighbor to similar-themed sites.
  • Use interactive elements where possible, such as menus and buttons. Just stay away from hiding your content behind JavaScript; as this past week’s article explains, Google’s search robots don’t read dynamic JavaScript content.

Searchmetrics’ 2015 Ranking Factors report goes into more detail on how we can structure our content in order to rank.

C) Mobile-friendliness:

Since Google’s “mobile-friendly” update at the beginning of 2015, mobile-friendliness has become an ever bigger ranking factor. It’s no longer enough to optimize for desktop and ignore mobile users. If you haven’t already ensured your site uses a “responsive” (mobile-friendly) design, or otherwise put a dedicated mobile site or app in place, keep that agenda item on a sticky note: mobile sites are here to stay. Mobile pages should load even quicker than desktop pages, as they are naturally smaller in size than their desktop twins.

Using bulleted lists can greatly help, as can using slightly larger font sizes.

Google recently announced that more searches happen on mobile devices than on desktop devices; which is hard to believe for some, but easy to believe for others.

E) Site speed & file sizes:

These factors are important for ranking in both mobile and desktop searches. The faster your host, the higher your ranking will float. (4GoodHosting offers the fastest type of hosting possible: SSD hosting.) Keep file sizes (images, videos, etc.) as small as possible without diminishing proper visibility. Searchmetrics discovered that top-ranking pages loaded in an average of 1.17 seconds for desktop results and 1.2 seconds for mobile. Google’s PageSpeed Insights tool can help you figure out how quickly your site is loading (keep in mind this tool works on a page-by-page basis, not site-wide). Pingdom is another similar and reliable tool.
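
For a rough first-order number without an external tool, you can time how long a single page’s HTML takes to fetch. This sketch measures only the network fetch of the HTML itself (no images, scripts, or rendering), so treat it as a lower bound rather than a PageSpeed substitute; the URL in the example is a placeholder:

```python
import time
import urllib.request

def fetch_time_seconds(url: str) -> float:
    """Time a single GET of a page's HTML (no assets, no rendering)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

# Example usage (hypothetical URL):
# print(f"{fetch_time_seconds('https://www.example.com/'):.2f} s")
```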

F) Use of <header> & <meta> tags:

This is still of critical importance: 99% of top-10 search results have a meta description in their pages, and 80% use at least one <H1>. Meta descriptions help Google and other search engines better know how to list your pages. Also ensure your <H1> tags and descriptions are unique and accurately describe the principal subject matter of your page.
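
You can audit a page for these two elements with the standard library’s HTML parser. A minimal sketch (a real audit would also check that the description text is non-empty and unique across pages):

```python
from html.parser import HTMLParser

class SEOTagAudit(HTMLParser):
    """Records whether a page has a meta description and counts <h1> tags."""
    def __init__(self):
        super().__init__()
        self.has_meta_description = False
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name", "").lower() == "description":
            self.has_meta_description = True
        elif tag == "h1":
            self.h1_count += 1

audit = SEOTagAudit()
audit.feed('<head><meta name="description" content="..."></head>'
           '<body><h1>Hi</h1></body>')
print(audit.has_meta_description, audit.h1_count)  # True 1
```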

G) Word counts:

Mobile content should be shorter than desktop content; it is simply easier for a person to read more on a larger screen. The average word count for top-ranking mobile pages in 2015 was 869, compared with 688 the year before. These numbers are of course far lower than the average content lengths for top-ranking desktop pages. If your site is responsive, you will have to decide on the right balance, as the same content will be shown on all devices. According to the report, the average word count for top-ranking desktop content is between 1,200 and 1,400 words per page.

Google has in the past shown a preference for more comprehensive content; the desktop figure above is up from 903 words in 2014. So, when creating content, focus on providing comprehensive coverage of your topic, as concisely and eloquently as possible. Finding the right balance is key for this ranking factor.
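
To see where your own pages fall relative to these word-count averages, a rough count of a page’s visible text takes only a few lines (a naive sketch: it skips <script> and <style> blocks but otherwise counts whatever text the HTML contains):

```python
from html.parser import HTMLParser

class TextCounter(HTMLParser):
    """Counts words of visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())

counter = TextCounter()
counter.feed("<body><h1>Title</h1>"
             "<p>Twelve hundred words is the sweet spot.</p></body>")
print(counter.words)  # 8
```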