Site Isolation from Google Promises to Repel More Malware Attacks

Security in the digital business world is a real challenge these days, and the world wide web is becoming as full of nefarious characters as the town of Machine, the 'end of the line' in the cool monochrome '90s Western Dead Man starring Johnny Depp. A few months back we detailed the big bad Spectre vulnerability that had come onto the scene, posing a major threat to data security for any website handling sensitive personal information.

It continues to be a 'thing', and in response Google recently enabled a new security feature in Chrome that protects users from attacks like Spectre. It's called Site Isolation, and it's available with Chrome 67 on Windows, Mac, Linux, and Chrome OS. Here at 4GoodHosting, we're a Canadian web hosting provider that puts an emphasis on security for obvious reasons, always seeking to stay on top of our clients' web hosting needs as effectively as possible.

Google’s experimentation with Site Isolation has been going on since Chrome 63, and they’ve patched a lot of issues before enabling it by default for all Chrome users on desktop.

Chrome's multi-process architecture allows different tabs to employ different renderer processes. Site Isolation functions by limiting each renderer process to documents from a single site. Chrome can then rely on the operating system to block attacks between processes, and therefore between sites.

Google has stated that in Chrome 67, Site Isolation has been enabled for 99% of users on Windows, Mac, Linux, and Chrome OS. A recent post on the company blog adds that 'even if a Spectre attack were to occur in a malicious web page, data from other websites would generally not be loaded into the same process, and so there would be much less data available to the attacker. This significantly reduces the threat posed by Spectre.'

Additional known issues in Chrome for Android have been identified and are being worked on. Site Isolation for Chrome for Android should be ready with Chrome 68.

Need for Speed

A quick mention as well for the Speed Update for Google Search on mobile. With this new feature, page speed will be a ranking factor for mobile searches. Of course, page speed has already been a factor in search engine rankings for some time now, but it was primarily based on desktop searches.

All of this is based on unsurprising findings showing that people want answers to their searches as fast as possible, and that page loading speed is a factor. Keeping that in mind, Google's new feature for mobile users will only affect pages that are painfully slow, and that has to be considered a good thing. Average pages should remain unaffected, by and large.

We’re always happy to discuss in more detail how our web hosting service comes with the best in security and protective measures for your website when it’s hosted with us, and we also offer very competitively priced SSL certificates for Canadian websites that go a long way in securing your site reliably. Talk to us on the phone or email our support team.

Seven Steps to a Reliably Secure Server

In a follow-up to last week's blog post, where we talked about how experts expect an increase in DDoS attacks this year, it makes sense for us to provide some tips this week on the best way to secure a server. Here at 4GoodHosting, in addition to being a good Canadian web hosting provider, we also take an interest in the well-being of our clients who do business online. Obviously, the prospect of an external threat taking them offline for an extended period of time would endanger the livelihood of their business, and as such we hope these discussions prove valuable.

Every day we're presented with new reports of hacks and data breaches causing very unwelcome disruptions for businesses and users alike. Web servers tend to be vulnerable to security threats and need to be protected from intrusions, hacking attempts, viruses, and other malicious attacks, and there's no substitute for a secure server for a business that operates online and engages in network transactions.

Servers tend to be the target because they are often all too penetrable for hackers, and they're known to contain valuable information. As a result, taking proper measures to ensure you have a secure server is as vital as securing the website, the web application, and of course the network around it.

Your first decisions are the server, OS, and web server software that will collectively function as the server you hope to secure, along with the kinds of services that run on it. No matter which particular web server software and operating system you choose to run, you must take certain measures to increase your server security. For starters, you'll need to review and configure every aspect of your server in order to secure it.

It’s best to maintain a multi-faceted approach that offers in-depth security because each security measure implemented stacks an additional layer of defence. The following is a list we’ve assembled from many different discussion with web development and security experts that individually and collectively will help strengthen your web server security and guard against cyberattacks, stopping them essentially before they even have the chance to get ‘inside’ and wreak havoc.

Let's begin:

  1. Automated Security Updates

Unfortunately, many vulnerabilities arrive with zero-day status. Before you know it, a public vulnerability can be weaponized into a malicious automated exploit. Your best defence is to ALWAYS keep your eye on the ball when it comes to receiving security updates and putting them into place. Now of course your eye isn't available 24/7, but you can and should be applying automatic security updates and security patches as soon as they are available through the system's package manager. If automated updates aren't available, you need to find a better system – pronto.

  2. Review Server Status and Server Security

Being able to quickly review the status of your server and check whether there are any problems originating from CPU, RAM, disk usage, running processes, or other metrics will often help you pinpoint server security issues much faster. Ubiquitous command-line tools can also report on server status. Your network service logs, database logs (Microsoft SQL Server, MySQL, Oracle), and site access logs are best stored in a segregated area and checked regularly; be on the lookout for strange log entries. Should your server be compromised, having a reliable alerting and server monitoring system standing guard will keep the problem from snowballing and allow you to take strategic reactive measures.
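
To make that concrete, here is a minimal status-snapshot sketch in Python, assuming the third-party psutil package is installed (pip install psutil); the alert thresholds are hypothetical and should be tuned to your own server's baseline.

```python
# Minimal server-status snapshot using psutil (third-party package).
import psutil

CPU_ALERT = 90.0    # percent; illustrative thresholds, tune to your baseline
DISK_ALERT = 90.0

def snapshot():
    cpu = psutil.cpu_percent(interval=1)       # CPU load sampled over 1 second
    mem = psutil.virtual_memory().percent      # RAM in use
    disk = psutil.disk_usage("/").percent      # root filesystem usage
    procs = len(psutil.pids())                 # running process count
    print(f"CPU {cpu}% | RAM {mem}% | Disk {disk}% | {procs} processes")
    if cpu > CPU_ALERT or disk > DISK_ALERT:
        print("ALERT: investigate before the problem snowballs")

if __name__ == "__main__":
    snapshot()
```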

  3. Perimeter Security With Firewalls

Seeing to it that you have a secure server involves installing security applications like border routers and firewalls that are proven effective at filtering known threats, automated attacks, malicious traffic, and bogon IPs, providing DDoS filtering, and blocking untrusted networks. A local firewall can actively monitor for attacks like port scans and SSH password guessing and effectively neutralize the threat. Further, a web application firewall helps filter incoming web page requests made for the explicit purpose of breaking or compromising a website.
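
As a rough illustration of the local monitoring described above, the following Python sketch tallies failed SSH logins from an OpenSSH auth log. The log path and threshold are assumptions that vary by distribution, and a production setup would hand this job to a proven tool like fail2ban.

```python
# Count failed SSH login attempts per source IP from an OpenSSH auth log.
import re
from collections import Counter

AUTH_LOG = "/var/log/auth.log"   # assumption: Debian/Ubuntu default location
THRESHOLD = 5                    # illustrative failed-attempt limit

failures = Counter()
with open(AUTH_LOG, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = re.search(r"Failed password .* from (\S+)", line)
        if match:
            failures[match.group(1)] += 1    # tally by source IP

for ip, count in failures.most_common():
    if count >= THRESHOLD:
        print(f"{ip}: {count} failed SSH logins - candidate for blocking")
```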

  4. Use Scanners and Security Tools

Fortunately, we have many security tools (UrlScan, ModSecurity) typically provided with web server software to aid administrators in securing their web server installations. Yes, configuring these tools can be a laborious and time-consuming process, particularly with custom web applications, but the benefit is that they add an extra layer of security and give you serious reassurance.

Scanners can help automate the process of running advanced security checks against open ports and network services to ensure your server and web applications are secure. They most commonly check for SQL injection, web server configuration problems, cross-site scripting, and other security vulnerabilities. You can even get scanners that automatically audit shopping carts, forms, dynamic web content, and other web applications, then provide detailed reports on the vulnerabilities they detect. These are highly recommended.
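
A full scanner does far more than this, but the sketch below shows the basic open-port check such tools automate, using only Python's standard library. The host and port list are illustrative, and you should only scan machines you own.

```python
# Toy open-port check: the first step a network scanner automates.
import socket

HOST = "127.0.0.1"                      # assumption: scanning your own host
PORTS = [21, 22, 25, 80, 443, 3306]     # common service ports, as examples

for port in PORTS:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(0.5)                # fail fast on filtered ports
    is_open = sock.connect_ex((HOST, port)) == 0    # 0 means connected
    sock.close()
    status = "OPEN - verify it is intended" if is_open else "closed"
    print(f"port {port}: {status}")
```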

  5. Remove Unnecessary Services

Typical default operating system installations and network configurations (Remote Registry Services, Print Server Service, RAS) will not be secure. The more services running on an operating system, the more ports are left vulnerable to abuse. It's therefore advisable to switch off all unnecessary services and disable them from starting again at boot. As a bonus, you'll boost your server's performance by freeing hardware resources.
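
Before switching services off, you need to see what's actually listening. Here is a minimal Python sketch of that inventory step, again assuming psutil is installed; run it with root privileges for complete process detail.

```python
# List listening TCP sockets and their owning processes (psutil required).
import psutil

for conn in psutil.net_connections(kind="tcp"):
    if conn.status == psutil.CONN_LISTEN:
        proc = psutil.Process(conn.pid).name() if conn.pid else "?"
        addr = f"{conn.laddr.ip}:{conn.laddr.port}"
        # Anything you don't recognize here is a candidate to disable.
        print(f"{addr:<22} {proc}")
```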

  6. Manage Web Application Content

The entirety of your web application or website files and scripts should be stored on a separate drive, away from the operating system, logs, and any other system files. That way, even if hackers gain access to the web root directory, they'll have zero success using operating system commands to take control of your web server.

  7. Permissions and Privileges

File and network service permissions are an essential part of a secure server, as they help limit any potential damage from a compromised account. Malicious users can compromise the web server engine and use its account to carry out malevolent tasks, most often executing specific files that corrupt your data or encrypt it for ransom. Ideally, file system permissions should be granular. Review your file system permissions on a VERY regular basis to prevent users and services from engaging in unintended actions. In addition, consider disabling direct root login over SSH, and disable the default shells of any accounts that you do not normally access. Make sure to run each specific network service under the principle of least privilege, and also be sure to restrict what each user or service can do.
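
As one small, concrete slice of such a permissions review, this standard-library Python sketch walks a directory tree and flags world-writable files; the web-root path is a hypothetical example to adjust for your own layout.

```python
# Flag world-writable files under a web root (standard library only).
import os
import stat

WEB_ROOT = "/var/www"   # assumption: adjust to your own layout

for dirpath, _dirnames, filenames in os.walk(WEB_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            mode = os.stat(path).st_mode
        except OSError:
            continue                    # unreadable entry; skip it
        if mode & stat.S_IWOTH:         # world-writable bit is set
            print(f"world-writable: {path}")
```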

Securing web servers can make it so that corporate data and resources are safe from intrusion or misuse. We’ve clearly established here that it is about people and processes as much as it is about any one security ‘product.’ By incorporating the majority (or ideally all) measures mentioned in this post, you can begin to create a secure server infrastructure that’s supremely effective in supporting web applications and other web services.

IT Security Insiders: Expect an Escalation in DDoS Attacks for Duration of 2017

The long and short of it is that Internet security will always be a forefront topic in this industry. That’s a reflection of both the never-ending importance of keeping data secure given the predominance of e-commerce in the world today and the fact that cyber hackers will never slow in their efforts to get ‘in’ and do harm in the interest of making ill-gotten financial gains for themselves.

So with the understanding that the issue of security / attacks / preventative measures is never going to be moving to the back burner, let’s move forward to discuss what the consensus among web security experts is – namely, that DDoS Attacks are likely to occur at an even higher rate than previously for the remainder of 2017.

Here at 4GoodHosting, in addition to being one of the best web hosting providers in Canada, we're very active in keeping on top of trends in the web-based business and design worlds, as they tend to have great relevance to our customers. As such, we think this particular piece of news is worthy of some discussion.

Let’s have at it – why can we expect to see more DDoS attacks this year?

Data ‘Nappers and Ransom Demands

As stated, IT security professionals predict that DDoS attacks will be more numerous and more pronounced in the year ahead, and many have started preparing for attacks that could cause outages worldwide in worst-case scenarios.

One such scenario could be – brace yourselves – a worldwide Internet outage. Before you become overly concerned, however, it would seem that the vast majority of security teams are already taking steps to stay ahead of these threats, with ‘business continuity’ measures increasingly in place to allow continued operation should any worst-case scenario come to fruition.

Further, these same insiders say that the next DDoS attack will be financially motivated. While there are continued discussions about attackers taking aim at nation states, security professionals conversely believe that criminal extortionists are the most likely group to successfully undertake a large-scale DDoS attack against one or more specific organizations.

As an example of this, look no further than the recent developments regarding Apple and their being threatened with widespread wiping of devices by an organization calling itself the ‘Turkish Crime Family’ if the computing mega-company doesn’t cough up $75,000 in cryptocurrency or $100,000 worth of iTunes gift cards.

A recent survey of select e-commerce businesses found that 46% of them expect to be targeted by a DDoS attack over the next 12 months. Should that attack come with a ransom demand like the one above, it may be particularly troublesome for any management group (given the fact that nearly ALL of them will not have the deep pockets that Apple has).

Further, the same study found that a concerning number of security professionals believe their leadership teams would struggle to come up with any other solution than to give in to any ransom demands. As such, having effective protection against ransomware and other dark software threats is as important as it’s ever been.

Undercover Attacks

We need to mention as well that these same security professionals are also worried about smaller, low-volume DDoS attacks that last 30 minutes or less. These have come to be classified as 'Trojan Horse' DDoS attacks, and the problem is that they typically will not be mitigated by most legacy DDoS mitigation solutions. One common ploy is to use such an attack as a distraction that diverts the guards and opens the gates for a separate, larger DDoS attack.

Citing the same survey yet again, fewer than 30% of IT security teams have enough visibility built into their networks to mitigate attacks that do not exceed 30 minutes in length. Further, there is the possibility of hidden effects of these attacks on their networks, like undetected data theft.

Undetected data theft is almost certainly more of a problem than many are aware – and particularly with the fast-approaching GDPR deadline which will make it so that organizations could be fined up to 4% of global turnover in the event of a major data breach deemed to be ‘sensitive’ by any number of set criteria.

Turning Tide against ISPs

Many expect regulatory pressure to be applied against ISPs that are perceived as insufficient in protecting their customers against DDoS threats. Of course, there is the question as to whether an ISP is to blame for not mitigating a DDoS attack when it occurs, but the consensus seems to be that it is, more often than not. This suggests that the majority would not find their own security teams to be responsible.

The trend seems to be to blame upstream providers for not being more proactive when it comes to DDoS defense. Many believe the best approach to countering these increasing attacks is to have ISPs that are equipped to defend against DDoS attacks, by both protecting their own networks and offering more comprehensive solutions to their customers via paid-for, managed services that are proven to be effective.

We are definitely sympathetic to anyone who has concerns regarding the possibility of these attacks and how they could lead to serious losses should they be able to wreak havoc and essentially remove the site from the web for extended periods of time. With the news alluded to earlier that there could even be a worldwide Internet outage before long via the new depth and complexity of DDoS attacks, however, it would seem that anyone with an interest in being online for whatever purpose should be concerned as well.

Understanding the New ‘Perimeter’ Against Cyber Attacks

If you yourself haven't been the victim of a cyber attack, you very likely know someone else who has; in fact, the numbers suggest that upwards of 90% of organizations experienced at least SOME level of IT security breach in the past year. Further, it's believed that one in six organizations have had a significant security breach during the same period.

Here at 4GoodHosting, we’ve established ourselves as a top Canadian web hosting provider but we’re always keen to explore industry trends – positive and negative – that impact what matters to our customers. And our array of customers covers pretty much any type of interest one could have in operating on the World Wide Web.

Cyberattacks have pretty much become a part of everyday life. While we're not suggesting that these types of incidents are 'inevitable', there is only so much any one individual or IT team can do to guard against them. Yes, there are standard PROACTIVE web security protocols to follow, but we won't look at those here, given that they are quite commonly understood by those of you who have them as part of your job responsibilities within an organization.

Rather, let’s take a look at being REACTIVE in response to a cyber attack here, and in particular with tips on how to disinfect a data centre and beef it up against further transgressions.

Anti-Virus and Firewalls – Insufficient

It would seem that the overwhelming trend in cloud data security revolves around firewalls, and a belief that they form a sufficiently effective perimeter. Oftentimes, however, exceptions are made to allow cloud applications to run, and in so doing the door is opened for intrusions to occur.

So much for firewalls securing the enterprise.

Similarly, anti-virus software can no longer keep pace with the immense volume of new viruses and variants being created in cyberspace nearly every day. A reputable cybersecurity firm recently announced the discovery of a new Permanent Denial-of-Service (PDoS) botnet named BrickerBot, which serves to render the victim's hardware entirely useless.

A PDoS attack, or 'phlashing' as it's also referred to, can damage a system so extensively that full replacement or reinstallation of hardware is required, and unfortunately these attacks are becoming more prevalent. It is true that there are plenty of useful tools out there, such as Malwarebytes, that should be used to detect and cleanse the data centre of any detected or suspected infections.

Making Use of Whitelisting And Intrusion Detection

Whitelisting is a good way to strengthen your defensive lines and isolate rogue programs that have successfully infiltrated your data center. Also known as application control, whitelisting involves a short list of the applications and processes that have been authorized to run. This strategy limits use by means of a “deny-by-default” approach so that only approved files or applications are able to be installed. Dynamic application whitelisting strengthens security defenses and helps with preventing malicious software and other unapproved programs from running.
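
To illustrate the deny-by-default idea, here is a toy Python sketch of hash-based application control. The approved digest is a placeholder, and real products enforce this at the operating-system level rather than in a script.

```python
# Toy "deny-by-default" application control: a binary may run only if
# its SHA-256 digest appears on the approved list.
import hashlib

APPROVED_DIGESTS = {
    "0" * 64,   # placeholder; put your vetted binaries' real digests here
}

def sha256_of(path):
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)    # hash the file in chunks
    return digest.hexdigest()

def may_run(path):
    return sha256_of(path) in APPROVED_DIGESTS   # unknown binaries are denied

print(may_run("/bin/ls"))   # False until its digest is explicitly approved
```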

Modern networking tools should also be integrated as part of your security arsenal, and if they are configured correctly they can highlight abnormal patterns that may be cause for concern. As an example, intrusion detection can be set up to trigger when any host uploads a significant load of data several times over the course of a day; a minimal sketch of that trigger follows. The idea is to flag abnormal user behaviour and help contain existing threats.
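
Here is that trigger as a minimal Python sketch. The per-host byte budget and the traffic records are synthetic examples; a real deployment would read flow data from your network monitoring stack.

```python
# Flag hosts whose daily upload volume exceeds a byte budget.
from collections import defaultdict

DAILY_LIMIT = 500 * 1024 * 1024   # illustrative: 500 MB per host per day

# Synthetic (host, bytes_sent) records standing in for real flow data.
uploads = [("10.0.0.4", 120_000_000), ("10.0.0.9", 610_000_000)]

totals = defaultdict(int)
for host, sent in uploads:
    totals[host] += sent

for host, total in totals.items():
    if total > DAILY_LIMIT:
        print(f"{host} uploaded {total:,} bytes today - check for exfiltration")
```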

Security Analytics

What's the best way to augment current security practices? Experts in this area increasingly advocate real-time analytics used in tandem with specific methodologies that focus on likely attack vectors. This approach revolves around seeing the web as a hostile environment filled with predators. In the same way behavioural analytics are used in protecting against cyber terrorists, we need to take an in-depth look at patterns to better detect internal security threats.

However, perhaps the most important thing to realize is that technology alone will never solve the problem. Perfect your email filters and the transgressors will move to mobile networks. Improve those and they'll jump to social media accounts. The solution must address the source and the initial point of entry, with training and education implemented so that the people in a position to respond and 'nip it in the bud' are explicitly aware of these attacks just as they begin.

End-user Internet security awareness training is the answer, but we are only in the formative stages of making it accessible for users across all the different types of organizations. Much of it is about teaching users not to do inadvisable things like clicking on suspect URLs in emails, or opening attachments that let in the black hats.

Putting all staff through requisite training may be expensive and time consuming / productivity draining, but we may soon be at the point where it's no longer an option NOT to have these types of educational programs. The new reality is that what we previously referred to as 'the perimeter' no longer really exists, or if it does, it's by and large ineffective in preventing the entirety of cyber attacks. The 'perimeter' is now every single individual on their own, and accordingly the risks are even greater, with the weakest link in the chain essentially determining the strength of your entire system defences.

Defining DNS… And What Exactly Is In It For Hackers?

DNS isn't exactly a buzzword in discussions among web hosting providers or those in the web hosting industry, but it's darn close to it. DNS is an acronym for Domain Name System, and what DNS does is see to it that after entering a website URL into your browser you end up in the right spot, among the millions upon millions of them, on the World Wide Web.

When you enter this URL, your browser starts trying to figure out where that website is by pinging a series of servers. These could be resolving name servers, authoritative name servers, or domain registrars, among others. But those servers themselves – often located all around the world – are only fulfilling an individual part in the overall process.

The process itself is a verification of identities by means of converting URLs into identifiable IP addresses, through which the networks communicate with each other and by which your browser confirms that it's taking you down the right path. In a world with literally billions of paths, that's a more impressive feat than you might think, especially when you consider it's done in mere seconds and with impressive consistency.
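
For the curious, the name-to-address conversion at the heart of this process can be seen with two lines of standard-library Python; the domain here is just an example.

```python
# Ask the system's DNS resolver to turn a name into an address.
import socket

ip = socket.gethostbyname("example.com")
print(ip)   # an IPv4 address such as 93.184.216.34 at the time of writing
```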

It's quite common to hear of DNS in conjunction with DDoS, which is another strange acronym, one that's paired with the term 'attack' to name a now-familiar phenomenon. What DDoS is, and how it's so often related to DNS, is as follows:

A DDoS attack is a common hack in which multiple compromised computers are used to attack a single system by overloading it with server requests. In a DDoS attack, hackers will often use infected computers to create a flood of traffic originating from many different sources, potentially thousands or even hundreds of thousands. By using all of the infected computers, a hacker can effectively circumvent any blocks that might be put on a single IP address. It also makes it harder to distinguish a legitimate request from one coming from an attacker.
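
To see why the many-sources aspect matters, here is a toy sliding-window rate limiter in Python: a single noisy IP trips the limit and is easy to block, while thousands of bots each staying under it are not. The window and limit values are illustrative.

```python
# Toy sliding-window rate limiter, per source IP.
import time
from collections import defaultdict, deque

WINDOW = 10.0    # seconds; illustrative values
LIMIT = 100      # max requests per IP inside the window

hits = defaultdict(deque)

def allow(ip, now=None):
    """Return True if this request from `ip` is within its budget."""
    now = time.monotonic() if now is None else now
    queue = hits[ip]
    while queue and now - queue[0] > WINDOW:
        queue.popleft()              # forget requests outside the window
    if len(queue) >= LIMIT:
        return False                 # one flooding IP is easy to stop...
    queue.append(now)
    return True                      # ...100,000 bots under the limit are not
```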

DNS is compromised in such a way that browsers essentially can't figure out where to go to find the information to load on the screen. This type of attack typically involves hackers building a small army of private computers infected with malicious software, known as a botnet. The people participating in the attack often don't realize their computer has been compromised and is now part of the growing problem.

Why Go To So Much Trouble?

With all of this now understood, it begs the question – What’s in it for hackers to do this?

It’s believed that the initial appeal of hacking is in proving that you can penetrate something / somewhere that’s purported to be impenetrable, and where someone with a skill set similar to yours has gone to significant effort to make it that way. It’s very much a geeks’ chest thumping competition – my virtual handiwork is better than yours!

As hackers become established and the ‘novelty’ of hacking wears off however, these individuals often find new inspiration for their malicious craft. The more time they spend doing it, the sooner they realize that a certain level of skills can introduce them to opportunities for making money with hacking. Among other scenarios, this can be either by stealing credit card details and using them to buy virtual goods, or by getting paid to create malware that others will pay for. And that happens much more often than you might think.

Their creations may silently take over a computer, or subvert a web browser so it goes to a particular site for which they get paid, or lace a website with commercial spam. As the opportunities in the digital world increase, hacking opportunities increase right along with them, and that's the way it will continue to be.

Here at 4GoodHosting, we are constantly reevaluating the security measures we have in place to defend our clients’ websites from DDoS attacks, as well as keeping on top of industry trends and products that help us keep hackers and their nefarious handiwork away from you and your website. It’s a priority for sure.

DNS – “What Is It and How Does It Work?”

DNS is an acronym that stands for "domain name system". Without domain names, we wouldn't have an easy way of getting to a particular website; we'd be left typing numbers like 192.14.78.251, or worse, the new IPv6 addresses, which are much longer (too long, in fact, to provide an example here).

In this article we will give an overview of how DNS functions. The most important thing our web hosting customers can learn from this is how DNS records are changed or updated to point to your web host's name servers. Name servers are themselves domain names, which also rely on the DNS system to be converted into your web server's numerical IP address.

“Name Servers”

Name servers enable people to use their domain name to access your web server, which then in turn directs your site visitors to your specific web hosting account and files rather than to a complex IP address. DNS also makes economical "shared hosting" possible, since the server's IP address can be reused for dozens of different websites.

Your name servers are the most important detail of your domain record; their purpose, again, is to redirect a visitor's web browser to the place where your site is being hosted, that is, a web server on a rack in a data room someplace.

Modifying your domain name server(s) enables you to change your web host without having to transfer your domain to another registrar.

Name servers can also be referred to as DNS servers, which can create confusion since the two terms are synonymous.

DNS Records

DNS refers to the layer of the internet stack, very similar to a database application, that contains the domain names, name servers, IP addresses, and personal or company registration information for every public site on the Internet.
DNS records contain various types of data, syntax, and commands that determine how a web server responds to lookup requests.

Some of the most common record types, defined below, with a lookup sketch in Python after the list:

· "A"-record. The actual web server IP address that is associated with the domain.

· "CNAME"-record. An alias record that points one name, commonly a subdomain of your domain, at another domain name.

· "MX"-record. This refers to the specific mail servers that may optionally be used with your domain, such as using Gmail with your domain for email.

· "NS"-record. The name servers that are currently set for your domain.

· "SOA"-record. Information about your domain, like when your domain was last updated, plus relevant contact information.

· "TXT"-record. Free-form text holding any additional information about the domain.
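
Here is the lookup sketch promised above, assuming the third-party dnspython package is installed (pip install dnspython; version 2.x for the resolve API); the domain is a placeholder.

```python
# Query each common record type for a domain using dnspython.
import dns.resolver

DOMAIN = "example.com"   # placeholder domain

for rtype in ("A", "CNAME", "MX", "NS", "SOA", "TXT"):
    try:
        answers = dns.resolver.resolve(DOMAIN, rtype)
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        continue                    # the domain has no record of this type
    for record in answers:
        print(f"{rtype}: {record}")
```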

As you can see, there are numerous components to your DNS records, but most of this information can't and shouldn't be altered. The main component of your DNS records that will concern you, if you ever have to change web hosts, is your name servers.

Changing Name Servers

Registrars are responsible for allowing you to edit your DNS record name servers. By default, a newly registered domain is usually assigned the same host and registrar automatically upon registration. Domain transfers usually carry over the same name server information from the previous registrar.

However, if your domain is not registered and also hosted in the same place, then you’ll follow the general steps below to update the name servers.

4GoodHosting customers can follow these instructions to change their name servers:

1. Locate Domain Management

Every registrar has domain management tools which allow you to edit your name servers. This ability will usually be found in the domain management area of your client account/portal.

2. Find Your Name Servers

Under each of your individual domain(s) you’ll be able to change your name servers. Your name servers will look something like this:
ns1.4GoodHosting.com
ns2.4GoodHosting.com
You will need to change both the "primary" and "secondary" name server. The second server exists in the rare event that the first one crashes, or some other condition prevents it from being reached.

3. Setting New Name Servers

Simply change your existing name servers to the new name servers and click “Update” or “Save”. These changes don’t take place immediately across the entire internet. Domain name server updates usually take anywhere from 4-24 hours to ‘propagate’ throughout the global DNS internet system.

With this information, you are able to understand some key functionality of how the internet logically works. If you have any remaining questions concerning your domains or domain records, please contact us at support @ 4goodhosting.com for a rapid response to your inquiries.

Please contact us at one of the very best Canadian Web Hosting Companies, 4GoodHosting.

Amnesty International Report on Instant Messaging Services and Privacy

Skype and Snapchat, among other companies, have failed to adopt basic privacy protections, as recently stated in Amnesty International's special "Message Privacy Ranking" report, which compares 11 popular instant messaging services.

Companies were ranked based on their recognition of online threats to human rights, default deployment of end-to-end encryption, user disclosure, government disclosure, and publishing of the technical details of their encryption.

“If you think instant messaging services are private, you are in for a big surprise. The reality is that our communications are under constant threat from cybercriminals and spying by state authorities. Young people, the most prolific sharers of personal details and photos over apps like Snapchat, are especially at risk,” Sherif Elsayed-Ali, Head of Amnesty International’s Technology and Human Rights Team said in a statement.

"Snapchat" scored only 26 points in the report (out of 100), and BlackBerry was rated even worse at 20 points. Skype has weak encryption, scoring only 40.

The middle group in the rankings included Google, which scored a 53 for its Allo, Duo, & Hangouts apps, Line and Viber, with 47 each, and Kakao Talk, which scored a 40.

The report also stated that "due to the abysmal state of privacy protections there was no winner."

On a side note, protecting privacy rights is also part of the motivation behind the Let's Encrypt project, which we use to supply free SSL certificates.

Amnesty International has petitioned messaging services to apply “end-to-end encryption” (as a default feature) to protect: activists, journalists, opposition politicians, and common law-abiding citizens world-wide. It also urges companies to openly publish and advertise the details about their privacy-related practices & policies.

Regarding the most popular instant messaging app, WhatsApp, Facebook has thrown everybody a new surprise twist.

WhatsApp is updating its privacy policy: Facebook wants your data, and end-to-end encryption is reportedly going to be shut off soon.
WhatsApp, now owned by Facebook, stirred up some uproar this week after announcing that it's changing its terms and privacy policy to *allow* data to be shared with Facebook. It means that for the first time WhatsApp will give permission for accounts to be connected to Facebook. This comes after pledging, in 2014, that it wouldn't do so; it has now backtracked.

WhatsApp now says that it will give the social networking site more data about its users – allowing Facebook to suggest phone contacts as “friends”.

“By coordinating more with Facebook, we’ll be able to do things like track basic metrics about how often people use our services and better fight spam on WhatsApp,” Whatsapp has written.

“By connecting your phone number with Facebook’s systems, Facebook can offer better friend suggestions and show you more relevant ads if you have an account with them. … For example, you might see an ad from a company you already work with, rather than one from someone you’ve never heard of.”

Many aren’t pleased with the move, especially since WhatsApp previously promised not to change its privacy settings.
If you want to carry on using WhatsApp, you can't opt out of the Facebook connection feature, as the update of terms and privacy policy is compulsory. "This allows us to do things like improve our app's performance and better coordinate," says WhatsApp.

The app's end-to-end encryption will reportedly be stopped as well, even though the company implemented it earlier this year and claimed it made conversations more secure.

The popular messaging service’s recent change in privacy policy to start sharing users’ phone numbers with Facebook—the first policy change since WhatsApp was acquired by Facebook in 2014 – has attracted regulatory scrutiny in Europe.

The Italian antitrust watchdog on Friday also announced a separate probe into whether WhatsApp obliged users to agree to sharing personal data with Facebook.

The European Union’s 28 data protection authorities said in a statement they had requested WhatsApp stop sharing users’ data with Facebook until the “appropriate legal protections could be assured” to avoid falling foul of EU data protection law.

WhatsApp's new privacy policy involves the sharing of information with Facebook for purposes that were not included in the terms of service when users signed up, raising questions about the validity of users' consent, according to the European data protection authorities known collectively as the Article 29 Working Party (WP29).

The WP29 group also urges WhatsApp to stop passing user data to Facebook while it investigates the legality of the arrangement.
Subsequently, a spokeswoman for WhatsApp said the company was working with data protection authorities to address their questions.

Facebook has had run-ins with European privacy watchdogs in the past over its processing of users’ data. However, the fines that regulators can levy are paltry in comparison to the revenues of the big U.S. tech companies concerned.

The European regulators will discuss the Yahoo and WhatsApp cases in November.

“The Article 29 Working Party (WP29) has serious concerns regarding the manner in which the information relating to the updated Terms of Service and Privacy Policy was provided to users and consequently about the validity of the users’ consent,” it writes.

“WP29 also questions the effectiveness of control mechanisms offered to users to exercise their rights and the effects that the data sharing will have on people that are not a user of any other service within the Facebook family of companies.”

We haven’t heard of any discussion within Canada as of yet.

Thank you for reading the 4GoodHosting blog. We would love to hear from you.

On Choosing the Best CMS for Your Particular Needs

You may have heard of the three most popular content management applications, WordPress, Drupal, and Joomla, but you may not be sure which one is best for your needs. Perhaps you remain curious, so we will focus on the two 'other' choices besides WordPress: Drupal and Joomla.

Each particular CMS provides the basic functions of adding, deleting, and publishing various types of content. Each program has different strong points (and weaknesses) which should be considered as a whole prior to making your ultimate decision.

First, write down your business' objectives and goals. This is the first step in selecting the CMS application best suited to your particular business needs and, ultimately, to optimally serving your business' unique target audience.

Choosing the right CMS (not to be confused with CNS, the central nervous system) is the backbone of your project, and getting it right will save you a great deal of headaches later. A reliable web host with super customer support also saves you from initial and future headaches. With 4GoodHosting.ca you can get both ultra-reliable hosting and the CMS of your choice for free: Joomla, Drupal, or of course WordPress, or any of the 200+ free scripts we offer with any of our hosting packages.

Drupal:

As of 2016, an estimated 1 million+ websites are built atop the Drupal CMS. Drupal is common among government offices, universities and colleges, non-governmental organizations, and Canadian and global enterprises. America's White House website takes advantage of Drupal's strong website security features. Drupal is a comprehensive, expandable, powerful content management framework suitable as the foundation of virtually any type of website.

Drupal’s Advantages:

  • Tested enterprise-level security; advanced control over URL structure
  • Lots of functionality, including advanced menu management, graphics modification utilities, poll management, and administration/user management
  • Built for high performance; pages load fast because of its default caching features
  • Ability to handle large amounts of content and data
  • Extensive selection of themes, modules and extensions
  • Ideal for community platform sites (requiring multiple user types: admins, editors, logged-in users needing customized content, private groups, etc.)
  • Large, robust community generally responsive to inquiries and concerns
  • Good SEO configurability
  • Clean, professional-looking designs/themes

Drupal’s Disadvantages:

  • High/technical learning curve; not user-friendly
  • Developer skills needed to install and apply upgrades, requiring experienced knowledge of PHP and HTML as well as CSS
  • More expensive: premium themes and plugins (modules) are priced considerably higher than those for, say, WordPress (or Joomla)

Big name Brands who are Using Drupal:

  • The Weather Channel
  • NBC.com
  • Twitter
  • Oxford University
  • Verizon Wireless
  • The White House
  • The Economist Magazine
  • Forbes Magazine

Joomla:

Joomla is another good option for small to mid-sized websites or e-commerce stores (or for building a community or social network with membership features, forums, a newsroom, articles, and a writing staff). However, if you need something more powerful for larger/enterprise projects where scalability, stability, and high versatility are essential, then learning and using Drupal would be more appropriate.

Joomla is an increasingly popular CMS platform. Trailing only WordPress, it is the second most widely adopted CMS, currently powering over 3 million websites.

Joomla's level of complexity sits somewhere between WordPress (the simplest) and the more advanced, enterprise-class Drupal.

Joomla is extensible, and can be extended to produce entirely new functionality. It has won the Packt Open Source Award several years in a row.

Joomla entails a slight learning curve, particularly for novices, yet webmasters usually wind up happy with its built-in features.

Joomla’s Advantages:

  • Installation is simple (developer knowledge of CSS, PHP, or HTML is not required), and updates and installs are easily done through a web browser
  • E-commerce made easy
  • Thousands of free extensions available (for increased functionality of your site)
  • Advanced administration panel offers many functions for complete optimization
  • Manage users simply and easily
  • Joomla's application framework makes it possible for developers to create powerful add-ons
  • URLs generated are SEO friendly
  • Active community support (programmer tools and tutorials for users)

Joomla’s Disadvantages:

  • Some learning curve to ride, but not as much as with Drupal
  • About half of the plugins/extensions and modules are paid
  • Limited configurability options (particularly for advanced users); limited "access control list" (ACL) support
  • Occasional compatibility issues with some plugins, which require some PHP skill to iron out so the functions work properly

Big name Brands who are Using Joomla:

  • IKEA
  • IHOP
  • Harvard University (Graduate School of Arts and Sciences)

If you have some experience with content management systems, you're considering alternatives to WordPress, and the prospect of diving into Drupal seems daunting, then Joomla might be your best option. Thank you for choosing 4GoodHosting.com as your 5.0 Google-rated, A+ BBB Canadian web host.

“Irish” (Similarly Canadian) Search Warrant Found Invalid – Microsoft Currently Victorious in Fight for User Privacy

(US & Canadian News) Microsoft achieved a huge victory in regards to user privacy (which certainly has an effect on Canadians using Microsoft products and services: email, cloud storage, Skype, etc.) on July 14th. An appeals court has ruled that a federal warrant to seize email from a Microsoft server in Ireland is invalid.

Federal investigators received a spy warrant (for email contents) as part of a criminal investigation in December 2013, which touched off a debate between the tech-industry and law enforcement about jurisdiction & data storage.

The timing of this coincides with Microsoft's Worldwide Partner Conference (WPC), where the company's president and chief legal officer Brad Smith promoted a vision for the internet that respects people's rights and is "governed by good law."

Microsoft said: “We obviously welcome today’s decision by the United States Court of Appeals for the Second Circuit. The decision is important for three reasons: it ensures that people’s privacy rights are protected by the laws of their own countries; it helps ensure that the legal protections of the physical world apply in the digital domain; and it paves the way for better solutions to address both privacy and law enforcement needs.”

Privacy protections for information stored on paper should persist as data moves to the cloud. This decision helps ensure this result.

— Brad Smith (@BradSmi) July 14, 2016

Microsoft has publicly acknowledged a need for cloud providers, particularly those based in the U.S., to win back over consumer trust.

Like-minded lobby groups including the EFF (Electronic Frontier Foundation) and the i2Coalition, big tech companies such as Rackspace, Apple, Amazon, Cisco, Hewlett-Packard, and Verizon, and notably in this case Ireland's Parliament, each submitted briefs in support of Microsoft's position.

“We conclude that Congress did not intend the SCA’s warrant provisions to apply extraterritorially,” the judges said in the ruling (PDF). “The focus of those provisions is protection of a user’s privacy interests. Accordingly, the SCA does not authorize a US court to issue and enforce an SCA warrant against a United States‐based service provider for the contents of a customer’s electronic communications stored on servers located outside the United States.”

Thank you for reading and sharing the 4GoodHosting Blog.

IPv6 – The future of Internet IP addressing…

An IP (Internet Protocol) address is basically a postal address for each and every Internet-connected device. Without one, websites would not know where to send information each time you perform a search or try to access a website. IPv4 offers only about 4.3 billion IP addresses (specifically 4,294,967,296), in the familiar dotted format x.x.x.x, where each of the four numbers ranges from 0 to 255. Techniques such as Network Address Translation (NAT) extended the life of IPv4, because NAT allows multiple devices to connect to the Internet through a single IP address, with the router in a particular household or business keeping track of which local devices are sending and receiving data packets. But without IPv6, the Internet's expansion and innovation could be limited, and the underlying infrastructure could theoretically become increasingly complex to manage; so a more expansive addressing protocol has been deemed necessary.

IPv6, the latest (and possibly the ultimate) addressing protocol, holds 2^128, or 340,282,366,920,938,463,463,374,607,431,768,211,456 (340 billion billion billion billion), IP addresses. That is enough to give at least 4.3 billion IP addresses, the addressing space of the entire current internet, individually to every person on Earth; in other words, 7 billion copies of today's Internet!

Why the IPv6 protocol architects decided on such an unnecessarily huge address space is unknown; surely 2^64, or 18,446,744,073,709,551,616 (about 18.4 quintillion), would already have been more than enough. It can seem like an excessive call on the planners' part, when each packet could instead have carried 64 bits of extra data. However, if we ever want to give an IP address to every mappable cubic centimeter of Earth's entire atmosphere, IPv6 will provide future generations that capability, and more.
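
As a closing aside, the address-space arithmetic in this section is easy to verify with Python's arbitrary-precision integers and the standard ipaddress module; the IPv6 address shown is from the 2001:db8::/32 documentation range, used as a safe example.

```python
# Verify the IPv4/IPv6 address-space arithmetic from this section.
import ipaddress

ipv4_space = 2 ** 32    # 4,294,967,296 IPv4 addresses
ipv6_space = 2 ** 128   # 340,282,366,920,938,463,463,374,607,431,768,211,456

print(f"{ipv4_space:,}")
print(f"{ipv6_space:,}")
print(f"{ipv6_space // ipv4_space:,}")   # IPv4-internet-sized blocks available

print(ipaddress.ip_address("2001:db8::1").version)   # -> 6
```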