The Appeal of Hybrid Cloud Hosting

Most of you will need no introduction to the functionality and application of cloud computing, but those of you who aren’t well versed in the ins and outs of web hosting may be less familiar with cloud hosting and what makes it significantly different from standard web hosting. Fewer still will know of hybrid hosting and the way it has made significant inroads into the hosting market, with very specific appeal for certain web users with business and / or management interests.

Here at 4GoodHosting, we’ve done well establishing ourselves as a quality Canadian web hosting provider, and part of what’s allowed us to do that is keeping our thumb on the pulse of our industry and sharing developments with our customers in language they can understand. Hybrid hosting may well be a good fit for you, and as such we’re happy to share what we know about it.

If we had to give a brief overview of it, we’d say that hybrid hosting is meant for site owners who want the highest level of data security along with the economic benefits of the public cloud. Privacy continues to be of primary importance, but the mix of public and private cloud environments, and the specific security, storage, and / or computing capacities that come with the pairing, is very appealing.

What Exactly is the Hybrid Cloud?

A hybrid cloud combines private and public cloud services that communicate via encrypted technology allowing for data and / or app portability. It consists of three individual parts: the public cloud, the private cloud, and a cloud service and management platform.

Both the public and private clouds are independent elements, allowing you to store and protect your data in your private cloud while employing all of the advanced computing resources of the public cloud. To summarize, it’s a very beneficial arrangement where your data is especially secure but you’re still able to bring in all the advanced functionality and streamlining of processes that come with cloud computing.

If you have no concerns regarding the security of your data, you are: a) lucky, and b) likely to be quite fine with a standard cloud hosting arrangement.

If that’s not you, read on…

The Benefits of Hybrid Clouds

One of the big pluses of hybrid cloud hosting is being able to keep your private data private in an on-prem, easily accessible private infrastructure. You don’t need to push all your information through the public Internet, yet you’re still able to utilize the economical resources of the public cloud.

Further, hybrid hosting allows you to leverage the flexibility of the cloud, taking advantage of computing resources only as needed, and – most relevantly – also without offloading ALL your data to a 3rd-party datacenter. You’re still in possession of an infrastructure to support your work and development on site, but when that workload exceeds the capacity of your private cloud, you’re still in good hands via the failover safety net that the public cloud provides.

Utilizing a hybrid cloud can be especially appealing for small and medium-sized business offices. It lets you keep company systems like CRMs, scheduling tools, and messaging portals, plus fax machines, security cameras, and other security / safety fixtures like smoke or carbon monoxide detectors, connected and working together as needed, without the same risk of web-connection hardware failure or security compromise.

The Drawbacks of Hybrid Clouds

The other side of the hybrid cloud’s pros and cons is that maintaining and managing such a massive, complex, and expensive infrastructure can be a demanding task. Assembling your hybrid cloud can also cost a pretty penny, so it should only be considered if it promises to be REALLY beneficial for you. Keep in mind as well that hybrid hosting is less than ideal in instances where data transport on both ends is sensitive to latency, which of course makes offloading to the cloud impractical for the most part.

Good Fits for Hybrid Clouds

The hybrid cloud tends to be a more suitable fit for businesses with an emphasis on security, or others with extensive and unique physical data needs. Here’s a list of a few sectors, industries, and markets that have been eagerly embracing the hybrid cloud model:

  • Finance sector – the appeal for them is in the decreased on-site physical storage needs and lowered latency
  • Healthcare industry – often to overcome regulatory hurdles put in place by compliance agencies
  • Law firms – protecting against data loss and security breaches
  • Retail market – for handling compute-heavy analytics data tasks

We’re fortunate that these types of technologies continue to evolve as they have, especially considering the ever-growing predominance of web-based business and communication infrastructures in our lives and the data storage demands and security breach risks that go along with them.

Seven Steps to a Reliably Secure Server

In a follow-up to last week’s blog post, where we talked about how experts expect an increase in DDoS attacks this year, it makes sense for us to provide some tips this week on the best ways to secure a server. Here at 4GoodHosting, in addition to being a good Canadian web hosting provider, we also take an interest in the well-being of those clients of ours who are in business online. Obviously, the premise of any external threat taking them offline for an extended period of time endangers the livelihood of their business, and as such we hope these discussions will prove valuable.

Every day we’re presented with new reports of hacks and data breaches causing very unwelcome disruptions for businesses and users alike. Web servers tend to be vulnerable to security threats and need to be protected from intrusions, hacking attempts, viruses and other malicious attacks, and there’s no substitute for a secure server for a business that operates online and engages in network transactions.

Servers tend to be targets because they are often all too penetrable for hackers, and they’re known to contain valuable information. As a result, taking proper measures to ensure you have a secure server is as vital as securing the website, the web application, and of course the network around it.

Your first decisions are the server, the OS, and the web server software that will collectively function as the server you hope will be secure, along with the kinds of services that run on it. No matter which particular web server software and operating system you choose to run, you must take certain measures to increase your server security. For starters, you will need to review and configure every aspect of your server in order to secure it.

It’s best to maintain a multi-faceted approach that offers defence in depth, because each security measure implemented stacks an additional layer of defence. The following is a list we’ve assembled from many different discussions with web development and security experts; individually and collectively, these measures will help strengthen your web server security and guard against cyberattacks, essentially stopping them before they even have the chance to get ‘inside’ and wreak havoc.

Let’s begin:

  1. Automated Security Updates

Unfortunately, most vulnerabilities come with zero-day status. Before you know it, a public vulnerability can be utilized to create a malicious automated exploit. Your best defence is to always keep your eye on the ball when it comes to receiving security updates and putting them into place. Of course your eye isn’t available 24/7, so you can and should be applying automatic security updates and security patches as soon as they are available through the system’s package manager. If automated updates aren’t available, you need to find a better system, pronto.

  2. Review Server Status and Server Security

Being able to quickly review the status of your server and check whether there are any problems originating from CPU, RAM, disk usage, running processes and other metrics will often help pinpoint server security issues much faster. In addition, ubiquitous command line tools can also review the server status. Each of your network service logs, database logs (Microsoft SQL Server, MySQL, Oracle), and site access logs present on a web server is best stored in a segregated area and checked with regularity. Be on the lookout for strange log entries. Should your server be compromised, having a reliable alerting and server monitoring system standing guard will prevent the problem from snowballing and allow you to take strategic reactive measures.
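To make log review concrete, here’s a minimal, illustrative Python sketch that counts failed SSH logins per source IP, the kind of ‘strange log entries’ worth flagging. The sample log lines and the three-strikes threshold are hypothetical; in practice you would read your real auth log.

```python
import re
from collections import Counter

# Hypothetical sample of sshd log lines; in practice you would read
# from /var/log/auth.log or your syslog equivalent.
SAMPLE_LOG = """\
Apr 10 03:12:01 web1 sshd[411]: Failed password for root from 203.0.113.7 port 52113 ssh2
Apr 10 03:12:04 web1 sshd[411]: Failed password for root from 203.0.113.7 port 52114 ssh2
Apr 10 03:12:09 web1 sshd[411]: Failed password for admin from 203.0.113.7 port 52115 ssh2
Apr 10 03:15:40 web1 sshd[498]: Accepted publickey for deploy from 198.51.100.22 port 40022 ssh2
"""

FAILED = re.compile(r"Failed password for \S+ from (\S+)")

def suspicious_ips(log_text, threshold=3):
    """Return IPs with at least `threshold` failed login attempts."""
    counts = Counter(m.group(1) for m in FAILED.finditer(log_text))
    return {ip: n for ip, n in counts.items() if n >= threshold}

print(suspicious_ips(SAMPLE_LOG))  # flags 203.0.113.7 with 3 failures
```

A real monitoring setup would run a check like this on a schedule and feed the results into your alerting system rather than printing them.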

  3. Perimeter Security With Firewalls

Seeing to it that you have a secure server involves installing security applications like border routers and firewalls that are proven effective at filtering known threats, automated attacks, malicious traffic, DDoS attacks, bogon IPs, and any untrusted networks. A local firewall will be able to actively monitor for attacks like port scans and SSH password guessing and effectively neutralize their threat. Further, a web application firewall helps filter incoming web page requests made for the explicit purpose of breaking or compromising a website.
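As a rough illustration of the kind of filtering a perimeter firewall performs, here’s a toy Python sketch. The blocked networks and allowed ports are made-up examples, and a real deployment would enforce this in the firewall or border router itself, not in application code:

```python
import ipaddress

# Hypothetical rule set: block example bad networks plus private (bogon)
# space arriving from outside, and allow only a few service ports inbound.
BLOCKED_NETS = [ipaddress.ip_network(n) for n in ("198.51.100.0/24", "10.0.0.0/8")]
ALLOWED_PORTS = {22, 80, 443}

def allow_packet(src_ip, dst_port):
    """Deny-by-default check: drop blocked sources and unknown ports."""
    ip = ipaddress.ip_address(src_ip)
    if any(ip in net for net in BLOCKED_NETS):
        return False
    return dst_port in ALLOWED_PORTS

print(allow_packet("203.0.113.9", 443))   # True: clean source, allowed port
print(allow_packet("198.51.100.4", 443))  # False: blocked network
print(allow_packet("203.0.113.9", 3306))  # False: database port not exposed
```

The design choice worth noting is deny-by-default: anything not explicitly allowed is dropped, which is exactly how production firewall rule sets should be structured.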

  4. Use Scanners and Security Tools

Fortunately, there are many security tools (URLScan, ModSecurity) typically provided with web server software to aid administrators in securing their web server installations. Yes, configuring these tools can be a laborious and time-consuming process, particularly with custom web applications, but the benefit is that they add an extra layer of security and give you serious reassurance.

Scanners can help automate the process of running advanced security checks against open ports and network services to ensure your server and web applications are secure. A scanner most commonly checks for SQL injection, web server configuration problems, cross-site scripting, and other security vulnerabilities. You can even get scanners that automatically audit shopping carts, forms, dynamic web content and other web applications, and then provide detailed reports on the vulnerabilities they detect. These are highly recommended.
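To give a feel for what such a scanner does internally, here’s a deliberately crude Python sketch that flags request parameters matching a couple of well-known attack signatures. The patterns are simplified illustrations only; real scanners use far more sophisticated checks.

```python
import re

# Crude, illustrative signatures only; real scanners use far deeper analysis.
SIGNATURES = {
    "sql_injection": re.compile(r"('|--|\bUNION\b|\bOR\b\s+1=1)", re.IGNORECASE),
    "xss": re.compile(r"<\s*script", re.IGNORECASE),
}

def scan_params(params):
    """Return a list of (param, issue) findings for suspicious values."""
    findings = []
    for name, value in params.items():
        for issue, pattern in SIGNATURES.items():
            if pattern.search(value):
                findings.append((name, issue))
    return findings

print(scan_params({"id": "1 OR 1=1", "q": "<script>alert(1)</script>", "page": "2"}))
```

Even a toy like this shows why scanners matter: the suspicious values are obvious to a pattern check but easy for a busy human reviewer to miss in a large access log.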

  5. Remove Unnecessary Services

Typical default operating system installations and network configurations (Remote Registry Service, Print Server Service, RAS) are not secure. The more services running on an operating system, the more ports are left vulnerable to abuse. It’s therefore advisable to switch off all unnecessary services and disable them from starting at boot. As an added bonus, you’ll boost your server performance by freeing hardware resources.

  6. Manage Web Application Content

The entirety of your web application or website files and scripts should be stored on a separate drive, away from the operating system, logs and any other system files. That way, even if hackers gain access to the web root directory, they’ll have zero success using operating system commands to take control of your web server.
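One way to audit this separation is a simple path check. The sketch below, with assumed system paths and assumed web root locations, verifies that the web root neither lives inside nor contains common system directories:

```python
import os.path

# Hypothetical list of system directories to keep the web root away from.
SYSTEM_PATHS = ["/etc", "/var/log", "/usr"]

def webroot_is_isolated(webroot):
    """True if the web root neither contains nor lives inside system paths."""
    webroot = os.path.abspath(webroot)
    for sys_path in SYSTEM_PATHS:
        common = os.path.commonpath([webroot, sys_path])
        if common in (webroot, sys_path):
            return False
    return True

print(webroot_is_isolated("/srv/www"))      # True: separate tree
print(webroot_is_isolated("/var/log/www"))  # False: nested inside the log tree
```

A check like this could run as part of deployment validation, so a misconfigured virtual host never quietly lands inside a system directory.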

  7. Permissions and Privileges

File and network service permissions are imperative for a secure server, as they help limit any potential damage that may stem from a compromised account. Malicious users can compromise the web server engine and use its account to carry out malevolent tasks, most often executing specific files that corrupt your data or encrypt it to their specifications. Ideally, file system permissions should be granular. Review your file system permissions on a VERY regular basis to prevent users and services from engaging in unintended actions. In addition, consider disabling direct ‘root’ login over SSH, and disable any default account shells that you do not normally access. Make sure to use the least-privilege principle when running each specific network service, and also restrict what each user or service can do.
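As a small example of granular permission review, this Python sketch flags world-writable files, one of the most common over-permissive mistakes. The demonstration file is a throwaway; on a real server you would walk the web root and system directories instead:

```python
import os
import stat
import tempfile

def world_writable(path):
    """True if `path` grants write permission to all users (the 'other' bit)."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IWOTH)

# Demonstration with a throwaway file rather than a real server path.
with tempfile.NamedTemporaryFile(delete=False) as f:
    risky = f.name
os.chmod(risky, 0o777)        # deliberately over-permissive
print(world_writable(risky))  # True: flag this for remediation

os.chmod(risky, 0o640)        # owner read/write, group read, others nothing
print(world_writable(risky))  # False: acceptable
os.unlink(risky)
```

Combined with `os.walk`, the same test makes a quick audit script you can schedule alongside your other monitoring.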

Securing web servers helps ensure that corporate data and resources are safe from intrusion or misuse. We’ve clearly established here that it is about people and processes as much as it is about any one security ‘product.’ By incorporating the majority (or ideally all) of the measures mentioned in this post, you can begin to create a secure server infrastructure that’s supremely effective in supporting web applications and other web services.

IT Security Insiders: Expect an Escalation in DDoS Attacks for Duration of 2017

The long and short of it is that Internet security will always be a forefront topic in this industry. That’s a reflection of both the never-ending importance of keeping data secure given the predominance of e-commerce in the world today and the fact that cyber hackers will never slow in their efforts to get ‘in’ and do harm in the interest of making ill-gotten financial gains for themselves.

So with the understanding that the issue of security / attacks / preventative measures is never going to move to the back burner, let’s discuss the consensus among web security experts: namely, that DDoS attacks are likely to occur at an even higher rate than before for the remainder of 2017.

Here at 4GoodHosting, in addition to being one of the best web hosting providers in Canada, we’re very active in keeping on top of trends in the web-based business and design worlds, as they tend to have great relevance to our customers. As such, we think this particular piece of news is worthy of some discussion.

Let’s have at it – why can we expect to see more DDoS attacks this year?

Data ‘Nappers and Ransom Demands

As stated, IT security professionals predict that DDoS attacks will be more numerous and more pronounced in the year ahead, and many have started preparing for attacks that could cause outages worldwide in worst-case scenarios.

One such scenario could be – brace yourselves – a worldwide Internet outage. Before you become overly concerned, however, it would seem that the vast majority of security teams are already taking steps to stay ahead of these threats, with ‘business continuity’ measures increasingly in place to allow continued operation should any worst-case scenario come to fruition.

Further, these same insiders say that the next DDoS attack will be financially motivated. While there are continued discussions about attackers taking aim at nation states, security professionals conversely believe that criminal extortionists are the most likely group to successfully undertake a large-scale DDoS attack against one or more specific organizations.

As an example of this, look no further than the recent developments regarding Apple and their being threatened with widespread wiping of devices by an organization calling itself the ‘Turkish Crime Family’ if the computing mega-company doesn’t cough up $75,000 in cryptocurrency or $100,000 worth of iTunes gift cards.

A recent survey of select e-commerce businesses found that 46% of them expect to be targeted by a DDoS attack over the next 12 months. Should that attack come with a ransom demand like the one above, it may be particularly troublesome for any management group, given that nearly ALL of them lack the deep pockets that Apple has.

Further, the same study found that a concerning number of security professionals believe their leadership teams would struggle to come up with any other solution than to give in to any ransom demands. As such, having effective protection against ransomware and other dark software threats is as important as it’s ever been.

Undercover Attacks

We should mention as well that these same security professionals are also worried about smaller, low-volume DDoS attacks that last 30 minutes or less. These have come to be classified as ‘Trojan horse’ DDoS attacks, and the problem is that they typically will not be mitigated by most legacy DDoS mitigation solutions. One common ploy used by hackers is to employ such an attack as a distraction that diverts the guards and opens up the gates for a separate, larger DDoS attack.
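To illustrate why visibility at short time scales matters, here’s a hedged Python sketch of a sliding-window burst detector. The limit and window values are arbitrary examples, and real mitigation would run in the network layer rather than application code:

```python
from collections import deque

class BurstDetector:
    """Flag a source exceeding `limit` requests within `window` seconds.

    Aimed at the short, low-volume bursts described above, which longer
    sampling intervals can miss entirely.
    """
    def __init__(self, limit=100, window=10.0):
        self.limit = limit
        self.window = window
        self.times = {}  # source -> deque of request timestamps

    def hit(self, src, now):
        q = self.times.setdefault(src, deque())
        q.append(now)
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.limit

det = BurstDetector(limit=5, window=10.0)
flagged = [det.hit("203.0.113.7", t) for t in range(8)]  # 8 hits in 8 seconds
print(flagged[-1])  # True: the burst exceeded 5 requests in the window
```

The sliding window is the key design choice: an average computed over an hour would smooth a 30-second burst into invisibility, which is precisely the blind spot these attacks exploit.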

Citing the same survey yet again, fewer than 30% of IT security teams have enough visibility built into their networks to mitigate attacks that do not exceed 30 minutes in length. Further, there is the possibility of hidden effects of these attacks on their networks, such as undetected data theft.

Undetected data theft is almost certainly more of a problem than many are aware of, particularly with the fast-approaching GDPR deadline, which will make it so that organizations could be fined up to 4% of global turnover in the event of a major data breach deemed ‘sensitive’ by any number of set criteria.

Turning Tide against ISPs

Many expect regulatory pressure to be applied against ISPs that are perceived to be insufficient in protecting their customers against DDoS threats. Of course, there is the question as to whether an ISP is to blame for not mitigating a DDoS attack when it occurs, but again it seems the consensus is that it is, more often than not. This seems to suggest that the majority would look beyond their own security teams when assigning responsibility.

The trend seems to be to blame upstream providers for not being more proactive when it comes to DDoS defense. Many believe the best approach to countering these increasing attacks is to have ISPs that are equipped to defend against DDoS attacks, by both protecting their own networks and offering more comprehensive solutions to their customers via paid-for, managed services that are proven to be effective.

We are definitely sympathetic to anyone who has concerns regarding the possibility of these attacks and how they could lead to serious losses should they wreak havoc and essentially remove a site from the web for extended periods of time. With the news alluded to earlier that there could even be a worldwide Internet outage before long, given the new depth and complexity of DDoS attacks, it would seem that anyone with an interest in being online for whatever purpose should be concerned as well.

Understanding the New ‘Perimeter’ Against Cyber Attacks


If you yourself haven’t been the victim of a cyber attack, you very likely know someone else who has; in fact, the numbers suggest that upwards of 90% of organizations experienced at least SOME level of IT security breach in the past year. Further, it’s believed that one in six organizations have had significant security breaches during the same period.

Here at 4GoodHosting, we’ve established ourselves as a top Canadian web hosting provider but we’re always keen to explore industry trends – positive and negative – that impact what matters to our customers. And our array of customers covers pretty much any type of interest one could have in operating on the World Wide Web.

Cyberattacks have pretty much become a part of everyday life. While that’s not to suggest these types of incidents are ‘inevitable’, there is only so much any one individual or IT team can do to guard against them. Yes, there are standard PROACTIVE web security protocols to follow, but we won’t look at those here, given that they are quite commonly understood by those of you who have them as part of your job responsibilities within an organization.

Rather, let’s take a look at being REACTIVE in response to a cyber attack, and in particular at tips on how to disinfect a data centre and beef it up against further transgressions.

Anti-Virus and Firewalls – Insufficient

It would seem that the overwhelming trend with cloud data security revolves around the utilization of firewalls, in the belief that they form a sufficiently effective perimeter. Oftentimes, however, exceptions are made to allow cloud applications to run, and in doing so the door is opened for intrusions to occur.

So much for firewalls securing the enterprise.

Similarly, anti-virus software can no longer keep pace with the immense volume of new viruses and variants being created in cyberspace nearly every day. A reputable cybersecurity firm recently announced the discovery of a new Permanent Denial-of-Service (PDoS) botnet named BrickerBot, which serves to render the victim’s hardware entirely useless.

A PDoS attack – or ‘phlashing’ as it’s also referred to – can damage a system so extensively that full replacement or reinstallation of hardware is required, and unfortunately these attacks are becoming more prevalent. It is true that there are plenty of useful tools out there, such as Malwarebytes, that should be used to detect and cleanse the data centre of any detected or suspected infections.

Making Use of Whitelisting And Intrusion Detection

Whitelisting is a good way to strengthen your defensive lines and isolate rogue programs that have successfully infiltrated your data center. Also known as application control, whitelisting involves a short list of the applications and processes that have been authorized to run. This strategy limits use by means of a “deny-by-default” approach so that only approved files or applications are able to be installed. Dynamic application whitelisting strengthens security defenses and helps with preventing malicious software and other unapproved programs from running.
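The deny-by-default idea can be sketched minimally in Python using content hashes as the whitelist key. The ‘approved’ script below is a made-up stand-in for a vetted binary, and real application-control products work at the OS kernel level rather than like this:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical whitelist of approved programs, keyed by content hash.
APPROVED = {sha256_of(b"#!/bin/sh\necho backup\n")}

def may_execute(program_bytes: bytes) -> bool:
    """Deny-by-default: only run programs whose hash is on the whitelist."""
    return sha256_of(program_bytes) in APPROVED

print(may_execute(b"#!/bin/sh\necho backup\n"))   # True: approved program
print(may_execute(b"#!/bin/sh\nrm -rf /data\n"))  # False: unknown program
```

Hashing the content, rather than trusting a filename or path, is what makes the approach robust: a rogue program renamed to look legitimate still fails the check.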

Modern networking tools should also be integrated as part of your security arsenal, and if they are configured correctly they can highlight abnormal patterns that may be a cause for concern. As an example, intrusion detection can be set up to be triggered when any host uploads a significant load of data several times over the course of a day. The idea is to eliminate abnormal user behaviour and help with containing existing threats.
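The upload-volume trigger described above could be sketched like this in Python. The 500 MB daily threshold is an assumed policy, and a real deployment would feed this from flow logs or a network tap rather than manual calls:

```python
from collections import defaultdict

class UploadMonitor:
    """Alert when a host uploads more than `threshold` bytes in a day."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.totals = defaultdict(int)  # host -> cumulative bytes today

    def record(self, host, nbytes):
        """Add an observed upload; return True once the host crosses the line."""
        self.totals[host] += nbytes
        return self.totals[host] > self.threshold

mon = UploadMonitor(threshold=500_000_000)  # 500 MB/day, an assumed policy
print(mon.record("db1", 200_000_000))  # False: within the daily budget
print(mon.record("db1", 350_000_000))  # True: 550 MB cumulative, alert
```

In practice the totals would reset each day and thresholds would vary per host class, but the core idea is the same: containing a threat starts with noticing the abnormal pattern.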

Security Analytics

What’s the best way to augment current security practices? Experts in this area are increasingly advocating real-time analytics used in tandem with specific methodologies that focus on likely attack vectors. This approach revolves around seeing the web as a hostile environment filled with predators. In the same way behavioural analytics are used in protecting against cyber terrorists, we need to take an in-depth look at patterns to better detect internal security threats.

However, perhaps the most important thing to realize is that technology alone will never solve the problem. Perfect your email filters and the transgressors will move to mobile networks. Improve those filters and they’ll jump to social media accounts. The solution must address the source and initial entry points, with training and education implemented so that the people in a position to respond and ‘nip it in the bud’ are explicitly aware of these attacks just as they first begin.

End-user Internet security awareness training is the answer, but we are only in the formative stages of making it accessible for all the different types of users. Much of it is about teaching users not to do inadvisable things like clicking on suspect URLs in emails, or opening attachments that let in the bad hats.

Putting all staff through requisite training may be expensive and time-consuming / productivity-draining, but we may soon be at the point where it’s no longer an option NOT to have these types of educational programs. The new reality is that what we previously referred to as ‘the perimeter’ no longer really exists, or if it does it’s by and large ineffective in preventing the entirety of cyber attacks. The ‘perimeter’ is now every single individual on their own, and accordingly the risks are even greater, with the weakest link in the chain essentially determining the strength of your entire system defences.

Amnesty International Report on Instant Messaging Services and Privacy


Skype & Snapchat, among other companies, have failed to adopt basic privacy protections, as recently stated in Amnesty International’s special report “Message Privacy Ranking”. The report compares 11 popular instant messaging services.

Companies were ranked based on their recognition of online threats to human rights, default deployment of end-to-end encryption, user disclosure, government disclosure, and publishing of the technical details of their encryption.

“If you think instant messaging services are private, you are in for a big surprise. The reality is that our communications are under constant threat from cybercriminals and spying by state authorities. Young people, the most prolific sharers of personal details and photos over apps like Snapchat, are especially at risk,” Sherif Elsayed-Ali, Head of Amnesty International’s Technology and Human Rights Team said in a statement.

Snapchat scored only 26 points in the report (out of 100), and BlackBerry was rated even worse at 20 points. Skype has weak encryption, scoring only 40.

The middle group in the rankings included Google, which scored a 53 for its Allo, Duo, & Hangouts apps; Line and Viber, with 47 each; and Kakao Talk, which scored a 40.

The report also stated that “due to the abysmal state of privacy protections there was no winner.”

On a side note, protecting privacy rights is also part of the motivation behind the Let’s Encrypt project, which we use to supply free SSL certificates.

Amnesty International has petitioned messaging services to apply “end-to-end encryption” (as a default feature) to protect: activists, journalists, opposition politicians, and common law-abiding citizens world-wide. It also urges companies to openly publish and advertise the details about their privacy-related practices & policies.

Regarding the most popular instant messaging app, WhatsApp, Facebook has thrown everybody a new surprise twist.

WhatsApp is updating its privacy policy; Facebook wants your data, and end-to-end encryption is reportedly going to be shut off soon.
WhatsApp, now owned by Facebook, caused some uproar this week after announcing that it’s changing its terms of privacy to *allow* data to be shared with Facebook. This means that for the first time WhatsApp will give permission to connect accounts to Facebook, after pledging in 2014 that it wouldn’t do so; it has now backtracked.

WhatsApp now says that it will give the social networking site more data about its users – allowing Facebook to suggest phone contacts as “friends”.

“By coordinating more with Facebook, we’ll be able to do things like track basic metrics about how often people use our services and better fight spam on WhatsApp,” WhatsApp has written.

“By connecting your phone number with Facebook’s systems, Facebook can offer better friend suggestions and show you more relevant ads if you have an account with them. … For example, you might see an ad from a company you already work with, rather than one from someone you’ve never heard of.”

Many aren’t pleased with the move, especially since WhatsApp previously promised not to change its privacy settings.
If you want to carry on using WhatsApp, you can’t opt out of the Facebook connection feature, as the update to the terms and privacy policy is compulsory. “This allows us to do things like improve our app’s performance and better coordinate,” says WhatsApp.

The app’s end-to-end encryption will reportedly also be stopped, even though the company implemented it earlier this year and claimed it made conversations more secure.

The popular messaging service’s recent change in privacy policy to start sharing users’ phone numbers with Facebook—the first policy change since WhatsApp was acquired by Facebook in 2014 – has attracted regulatory scrutiny in Europe.

The Italian antitrust watchdog on Friday also announced a separate probe into whether WhatsApp obliged users to agree to sharing personal data with Facebook.

The European Union’s 28 data protection authorities said in a statement they had requested WhatsApp stop sharing users’ data with Facebook until the “appropriate legal protections could be assured” to avoid falling foul of EU data protection law.

WhatsApp’s new privacy policy involves the sharing of information with Facebook for purposes that were not included in the terms of service when users signed up, raising questions about the validity of users’ consent, according to the Article 29 Working Party (WP29), the group of European data protection authorities that has responded to the change.

The WP29 group also urges WhatsApp to stop passing user data to Facebook while it investigates the legality of the arrangement.
Subsequently, a spokeswoman for WhatsApp said the company was working with data protection authorities to address their questions.

Facebook has had run-ins with European privacy watchdogs in the past over its processing of users’ data. However, the fines that regulators can levy are paltry in comparison to the revenues of the big U.S. tech companies concerned.

The European regulators will discuss the Yahoo and WhatsApp cases in November.

“The Article 29 Working Party (WP29) has serious concerns regarding the manner in which the information relating to the updated Terms of Service and Privacy Policy was provided to users and consequently about the validity of the users’ consent,” it writes.

“WP29 also questions the effectiveness of control mechanisms offered to users to exercise their rights and the effects that the data sharing will have on people that are not a user of any other service within the Facebook family of companies.”

We haven’t heard of any discussion within Canada as of yet.

Thank you for reading the 4GoodHosting blog. We would love to hear from you.

Google & Facebook will be Building a Big Trans-Pacific Fiber-Optic Cable


Map published by Facebook

Google and Facebook are partnering to pay for the laying of what will be one of the highest-capacity undersea data cables, piping data in the form of light all the way across the Pacific and bridging Los Angeles and Hong Kong.

This project is the second such partnership Facebook has joined, and it is yet another recent example of big business in the submarine fiber-optic cable industry, which has traditionally been dominated by a group of private and government carriers.

Companies like Facebook, Google, Microsoft, and Amazon operate huge-scale data centers that deliver various internet services to people worldwide. These internet big boys are quickly reaching a point where their global bandwidth needs are so high that it makes more sense for them to fund cable construction projects directly, rather than purchase capacity from established carriers.

Earlier this year, in May 2016, Facebook announced that it had teamed up with Microsoft on a high-capacity cable across the Atlantic called “MAREA”. This cable will link internet backbone hubs in Virginia Beach, Virginia, and Bilbao, Spain. Telefonica will administer this future transatlantic data line.

Europe and the Asia-Pacific region are important markets for internet services giants, and these cables will boost bandwidth levels between the companies’ data centers in the US and abroad.

The submerged fibre line is called the “Pacific Light Cable Network” (PLCN), named after Pacific Light Data Communications, Inc., the third partner in the project.

Both the MAREA and Pacific Light cables will be built by “TE SubCom”, one of the biggest names in the submarine fibre optic cable industry.

The 120Tbps (Terabits per second) PLCN system will provide greater diversity in transpacific cable routes, as Facebook recently published. “Most Pacific subsea cables go from the United States to Japan, and this new direct route will give us more diversity and resiliency in the Pacific,” Facebook’s article states.


One difference between PLCN and MAREA and traditional transoceanic cable systems is that they will be interoperable with different networking equipment, rather than being designed to function with specific or proprietary landing-station technologies. Companies will be able to choose whichever optical equipment best fits their needs, and equipment refreshes can occur as optical technology improves.

When equipment can be replaced by better technology at a quicker pace, costs should go down and bandwidth rates should increase more quickly.

Another cable, “FASTER”, backed by Google and several Asian telecommunications and IT services companies, became operational in early 2016. Yet another big submarine cable project is the “New Cross Pacific Cable System” (NCP), which is backed by Microsoft and several Asian telecoms. NCP is expected to light up in 2017.

Also earlier this year, Amazon Web Services made its first direct investment in a submerged cable – helping make the planned “Hawaiki” Submarine Cable project between the US, Australia, and New Zealand possible. Both of the aforementioned Pacific cables are to land in Oregon.

High-speed optical cable is bringing the world together faster than ever before. At the speed of light, approximately 186,000 miles per second, data could circle the whole world more than 7 times a second.
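That “seven times a second” figure is easy to sanity-check with rough numbers. Note that light in optical fibre actually travels about a third slower than in a vacuum, so real-world figures are lower; the constants below are approximations for illustration only:

```python
# Back-of-envelope check of the "seven times a second" claim.
# Both constants are approximate figures, not precise measurements.
SPEED_OF_LIGHT_MPS = 186_000       # miles per second, in a vacuum
EARTH_CIRCUMFERENCE_MI = 24_901    # miles, at the equator

laps_per_second = SPEED_OF_LIGHT_MPS / EARTH_CIRCUMFERENCE_MI
print(f"Light could lap the Earth about {laps_per_second:.1f} times per second")
# In fibre (~2/3 the vacuum speed), that drops to roughly 5 laps per second.
```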

Due to factors such as this, 4GoodHosting.com intends to continue serving websites all over the world, and reaching a larger, global market of new customers who wish to have their website hosted from Canada (a most low-key and relaxed country).

United States/Canada to cede internet oversight to emerging UN global cabal


The name of the organization is set to remain the same, “ICANN”, but the people in power over the organization are about to shift, without your vote, on Oct. 1st, to an assemblage of the world body politic, mostly composed of despots and dictators. The effects of this are still to be seen, but one key person in today’s political fray, presidential candidate Donald Trump, opposes the lackluster plan.

Basically, the current domain name system as you have grown to know and trust it is about to undergo changes that could easily lead to intimidation and censorship of free speech. Protection of grassroots political speech is also at risk. The world can go on with fewer adult websites, but the world simply won’t be as good or nice of a place without freedom of speech in other regards. The founders of America frequently stated that the citizenry must always be vigilant and jealously guard their rights and freedoms. This is the attitude now on the podium of Donald Trump, the only candidate opposed to the plan to hand over control of the internet to a conflicted party of those seeking ever more power in the world.

U.S. Republican presidential nominee Donald Trump is currently voicing opposition to the semi-secretively planned transition of oversight of the internet’s domain name system (DNS) from US-based governance to the UN, a global organization of political stakeholders from around the world – a move which could hand control of the internet itself to authoritarian regimes such as China and Saudi Arabia, foreseeably threatening online freedom. The DNS is basically a directory for internet-connected devices that translates domain names into numerical IP addresses.
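Since the DNS is, at heart, just a name-to-address lookup, the idea can be shown in a couple of lines of standard-library Python. We use “localhost” here purely as an example so the lookup resolves without a network connection:

```python
# The DNS in miniature: translate a human-readable host name into a
# numerical IP address. "localhost" is chosen only so this runs offline.
import socket

address = socket.gethostbyname("localhost")
print(address)  # typically "127.0.0.1"
```

A real domain name works the same way; the resolver simply consults the global DNS hierarchy instead of the local hosts file.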

He strongly contends that the US Congress should act swiftly to block the handover, scheduled to occur next week on Oct. 1, 2016 – or, as his campaign policy director Stephen Miller stated, “internet freedom will be lost for good, since there will be no way to make it great again once it is lost.”

The ‘handover’ of the internet DNS was proposed in March 2014, implying the transfer of US oversight of the nonprofit Internet Corporation for Assigned Names and Numbers (ICANN), and is soon expected to fully occur unless Congress now acts quickly and votes to block the move. The US National Telecommunications and Information Administration (NTIA) recently signed off on the agreement.

Some congressional negotiators are currently working to finalize an agreement on a new spending package, due September 30th, that allegedly contains a provision to delay the transition.

Democratic presidential candidate Hillary Clinton supports the Obama administration’s planned transition to give the UN control over everyone’s domain names. If that happens, a highly entrenched foreign political power would have control should anyone’s domain expire earlier than expected, without the option to renew – and perhaps even with more authoritarian controls.

These are the workings of the current government and system that everyone worldwide is paying taxes to support. Please share this article and/or speak your mind directly with others who should be concerned as well.


On Choosing the Best CMS for Your Particular Needs


You may have heard of the three most popular content management applications – WordPress, Drupal, and Joomla – but be unsure which one is best for your needs. Since WordPress needs little introduction, we will focus on the two ‘other’ choices: Drupal and Joomla.

Each CMS provides the same basic functions: adding, deleting, and publishing various types of content. Each program has different strong points (and weaknesses) which should be considered as a whole prior to making your ultimate decision.

First, write down your business’ objectives and goals. This should be the first step in selecting the CMS application best suited to your particular business needs – and, ultimately, to optimally serving your business’ unique target audience.

Choosing the right CMS as the backbone of your project will save you a great deal of headaches later. A reliable web host with superb customer support also saves you from initial and future headaches. With 4GoodHosting.ca you can get both ultra-reliable hosting and the CMS of your choice for free – Joomla, Drupal, or of course WordPress – or any of the 200+ free scripts we offer with any of our hosting packages.

Drupal:

As of 2016, an estimated 1 million+ websites are built atop the Drupal CMS. Drupal is common among government offices, universities and colleges, non-governmental organizations, and Canadian and global enterprises. America’s White House website takes advantage of Drupal’s strong security features. Drupal is a comprehensive, expandable, powerful content management framework suitable as the foundation of virtually any type of website.

Drupal’s Advantages:

  • Tested enterprise-level security; advanced control over URL structure
  • Lots of functionality – including advanced menu management, graphics modification utilities, poll management, and administration/user management
  • Built for high performance; pages load fast because of its default caching features
  • Ability to handle large amounts of content & data
  • Extensive selection of themes, modules & extensions
  • Ideal for community platform sites (requiring multiple users – admins, editors, logged-in users requiring customized content, private groups, etc.)
  • Large, robust community generally responsive to inquiries and concerns
  • Good SEO configurability
  • Clean, professional-looking designs/themes

Drupal’s Disadvantages:

  • High/technical learning curve; not user-friendly
  • Developer skills needed to install and apply upgrades, requiring experienced knowledge of PHP and HTML as well as CSS
  • More expensive: premium themes and plugins (modules) are priced considerably higher than, say, WordPress’ (or Joomla’s)

Big name Brands who are Using Drupal:

  • The Weather Channel
  • NBC.com
  • Twitter
  • Oxford University
  • Verizon Wireless
  • The White House
  • The Economist Magazine
  • Forbes Magazine

Joomla:

Joomla is another good option for small to mid-sized websites or e-commerce stores – or for building a community or social network with membership features, forums, a newsroom, articles, and a writing staff. However, if you need something more powerful for larger/enterprise projects where scalability, stability, and high versatility are essential, then learning and using Drupal would be more appropriate.

Joomla is becoming an increasingly popular CMS platform. Trailing only WordPress, it is the 2nd most widely adopted CMS, currently powering over 3 million websites.

Joomla’s level of complexity falls somewhere between WordPress (the simplest) and the more advanced, enterprise-class Drupal.

Joomla can be extended in order to produce entirely new functionality, and it has won the Packt Open Source Award several years in a row.

Joomla entails a slight learning curve, particularly for novices, yet webmasters usually wind up happy with its built-in features.

Joomla’s Advantages:

  • Installation is simple (developer knowledge of CSS, PHP, or HTML is not required); updates are easily installed through a web browser
  • E-commerce made easy
  • Thousands of free extensions available (for increased functionality of your site)
  • Advanced administration panel offers many functions for complete optimization
  • Manage users simply and easily
  • Joomla’s application framework makes it possible for developers to create powerful add-ons
  • Generated URLs are SEO-friendly
  • Active community support (programmer tools and tutorials for users)

Joomla’s Disadvantages:

  • Some learning curve – but not as steep as Drupal’s
  • About half of the plugins/extensions & modules must be purchased
  • Limited configurability options (particularly for advanced users); limited “access control list” (ACL) support
  • Occasional compatibility issues with some plugins, which can require some PHP skill to iron out

Big name Brands who are Using Joomla:

  • IKEA
  • IHOP
  • Harvard University (Graduate School of Arts and Sciences)

If you have some experience with content management systems, you don’t want WordPress for whatever reason, and diving into Drupal seems daunting, then Joomla might be your best option. Thank you for choosing 4GoodHosting.com as your 5.0 Google rated, A+ BBB Canadian Web Host.

“Irish” (Similarly Canadian) Search Warrant Found Invalid – Microsoft Currently Victorious in Fight for User Privacy


(US & Canadian News) Microsoft championed a huge victory for user privacy (which certainly affects Canadians using Microsoft products and services: email, cloud storage, Skype, etc.) on July 28th. An appeals court ruled that a federal warrant to seize email from a Microsoft server in Ireland is invalid.

Federal investigators obtained a warrant (for email contents) as part of a criminal investigation in December 2013, which touched off a debate between the tech industry and law enforcement about jurisdiction and data storage.

The timing of this coincides with Microsoft’s Worldwide Partner Conference (WPC), where the company’s president and chief legal officer Brad Smith promoted a vision for the internet that respects people’s rights and is “governed by good law.”

Microsoft said: “We obviously welcome today’s decision by the United States Court of Appeals for the Second Circuit. The decision is important for three reasons: it ensures that people’s privacy rights are protected by the laws of their own countries; it helps ensure that the legal protections of the physical world apply in the digital domain; and it paves the way for better solutions to address both privacy and law enforcement needs.”

Privacy protections for information stored on paper should persist as data moves to the cloud. This decision helps ensure this result.

— Brad Smith (@BradSmi) July 14, 2016

Microsoft has publicly acknowledged a need for cloud providers, particularly those based in the U.S., to win back consumer trust.

Like-minded supporters included lobby groups such as the EFF (Electronic Frontier Foundation) and the i2Coalition, big tech companies such as Rackspace, Apple, Amazon, Cisco, Hewlett-Packard, and Verizon – and, notably in this case, Ireland’s Parliament – each of which submitted briefs in support of Microsoft’s position.

“We conclude that Congress did not intend the SCA’s warrant provisions to apply extraterritorially,” the judges said in the ruling (PDF). “The focus of those provisions is protection of a user’s privacy interests. Accordingly, the SCA does not authorize a US court to issue and enforce an SCA warrant against a United States‐based service provider for the contents of a customer’s electronic communications stored on servers located outside the United States.”

Thank you for reading and sharing the 4GoodHosting Blog.

Hosting Upgrade Considerations


Is your website becoming much more popular?

If you are searching for reliable yet inexpensive, fast-loading website hosting, “shared web hosting” and “VPS” (Virtual Private Server) hosting are two good, but not identical, options. Shared hosting is the most common choice, at rock-bottom cost, but many businesses eventually outgrow its limitations.

Migrating from a shared server plan to a VPS (or to an entirely dedicated, standalone server) is typically the next step.

4GoodHosting provides free upgrade migration services, to ease any nervousness when you decide to move your website to its own server.

Some signs you have outgrown Shared Web Hosting:

Skyrocketing Traffic

Shared hosting is ideal for low-traffic websites. If you notice your traffic increasing consistently, or if you offer high-bandwidth content such as video, you may need to upgrade to a VPS (Virtual Private Server) for dedicated bandwidth and a lower-latency (faster), less congested network connection.

As your business and website grow in size, your email, disk space, CPU, and RAM (random access memory) requirements will also eventually surpass your existing shared hosting plan. The growth of your business will often dictate the need for upgrades.
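If you have shell access to your server, a rough check like the one below can hint at whether you are pressing up against those limits. This is just an illustrative sketch using Python’s standard library on a Unix-like host; the 80% thresholds are example figures, not actual hosting-plan limits:

```python
# Rough resource check for a Unix-like web server.
# The 80% thresholds below are illustrative, not plan limits.
import os
import shutil

total, used, free = shutil.disk_usage("/")
disk_pct = used / total * 100

load_1min, _, _ = os.getloadavg()   # 1-, 5-, 15-minute load averages
cores = os.cpu_count() or 1
load_per_core = load_1min / cores

print(f"Disk usage: {disk_pct:.0f}%")
print(f"Load per core: {load_per_core:.2f}")

if disk_pct > 80 or load_per_core > 0.8:
    print("Consider upgrading to a VPS or dedicated server.")
```

Sustained load per core near 1.0, or disk usage climbing toward your plan’s quota, are the classic signs it is time to move up.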

Choosing between VPS and dedicated server

Perhaps your website would be best served by renting your own private server (a standalone web server with its own dual power supply). First, however, consider the differences between VPS and dedicated servers to find out which one best suits your application – including cost, as a VPS is less expensive than leasing dedicated equipment in our data center.

In either case, 4GoodHosting offers numerous advantages, such as 24/7 customer support, “RAID” hard drive and SSD redundancy, dual-coast backups, disaster recovery servers, plus the flexibility of upgrading or downgrading your server hosting package whenever you need, with free migrations.