Higher than Anticipated Cloud Costs the Norm for Businesses

Over the past 10 years the digital expanse for businesses has become nearly all-encompassing. Without going into too much detail, that's the primary reason so many businesses have had to embrace the cloud – quite simply, their operations necessitate it. That goes beyond data too, and companies utilizing the IoT are especially in need of everything cloud computing makes possible.

Now there are many instances in life where something can be described as both a blessing and a curse. That probably doesn't legitimately apply to cloud computing, but for some of these businesses it may be at least somewhat applicable based on all the added costs they have had no choice but to assume as they move more and more into the cloud. Damned if you do, damned if you don't might well apply here too.

That doesn't apply to us here at 4GoodHosting in the same way it won't for any good Canadian web hosting provider. But enough of the audience here is likely making that move to an ever greater extent, and so we'll make the expensiveness of cloud computing for businesses our topic for this week's entry.

Higher than Expected Costs

Even though most businesses are happy with the rate at which their company is transforming to the cloud, higher-than-expected costs are making some businesses revisit how they are allocating the needed monies for it. All of this is based on a Cloud Impact Study in which senior IT professionals across large organizations in the US, Canada, and UK were asked about their organization's spending on cloud infrastructure.

36% of the respondents reported that delivering cost predictability is one of the key challenges facing their organization. And the overarching belief is that while cloud costs are high, the benefits don't line up with how much is being spent. In addition, many companies are not getting the benefit of the comprehensive management strategy they expected.

The adjoining belief is that much of this stems from having rushed into cloud service adoption when the pandemic nudged many businesses online. The popularity of multicloud is a factor here too, with 59% of respondents saying their organizations prefer to combine public and private cloud services.

Mitigated Challenges = Lower Costs

There are plenty of reasons why diversifying across more than one provider is appealing, and a big one is the hope of saving costs by cherry-picking only the parts they need from different service providers. But even in best-case scenarios, it seems these businesses continue to face higher-than-expected cloud costs.

Any and all will know that embracing digital transformation is important, but more could be done to mitigate the challenges facing businesses, at least to some extent. Investing in the cloud components essential for business continuity and growth in turbulent times is 100% worth it for all these companies, even if the cost of doing so is unappetizing. Increasing ITDM knowledge and understanding, or employing a multicloud specialist provider, will go a long way toward improving the cost-benefit ratio, especially as the period of economic uncertainty continues to be a detriment to businesses worldwide.

Benefits of Website Isolation for Shared Hosting

There have been plenty of entries here over the years where we've talked about the many benefits of VPS web hosting, and for those who truly need a private server it's not even a consideration to go with a more affordable shared web hosting package. Websites that need elbow room to do what they do when pages load are often a big part of a business's online presence, so while it would be nice to have a less expensive web hosting arrangement, it's just not possible with anything less than VPS hosting.

With that understood, anyone with a website that doesn't need to be accommodated to such an extent can and will be fine with shared web hosting most of the time. There are smaller and less dynamic e-commerce websites that have been on shared hosting plans for years. There are more valid concerns around the security and integrity of websites hosted in shared environments though, and that's always going to be the case. Website isolation can be a partial fix for that, and it's what we're going to go over with this week's entry.

It's a topic that is going to be noteworthy for any Canadian web hosting provider, and here at 4GoodHosting we are the same in the way we put an emphasis on providing valuable information for anyone who has a website for their business, venture, or personal interest and wants to be proactive in ensuring that everything stays uneventful with their site and that it's always problem-free alongside the information superhighway.

Isolation = Better Security

Businesses, bloggers, and web developers often rely on shared hosting environments because of their affordability and simpler general management. But security is nearly always a challenge in these environments. Website isolation can be part of a solution for reducing risks, and there are 5 primary advantages to putting website isolation in place.

Better Overall Security – Greater vulnerability to security breaches comes naturally with shared hosting, where multiple websites share the same server resources. Implementing isolation techniques helps contain the impact of a security breach, making it more likely that a compromised website does not affect others on the same server. Maintaining the overall security and integrity of all websites hosted on the shared server is always going to be important for businesses and website owners alike.

Effective Resource Management – Shared hosting environments have all websites sharing the server's resources, with CPU, memory, and bandwidth split between them. Maintaining the stability of the entire server depends on each website operating independently. Proper resource allocation can prevent resource-intensive websites from causing performance issues or crashes that may well affect other websites on the server (a minimal sketch of this idea follows this list).

Safeguarding Privacy and Data – Users are only going to fully trust in the protection of their privacy if the web host's infrastructure and measures can be relied on to block unauthorized access to sensitive data like user information and database content. Techniques that restrict access to authorized users only play a key role in safeguarding the integrity of websites on shared servers, and this is particularly important for sites that handle sensitive customer data or payment information.

Regulation Compliance – Many industries are subject to regulations like the General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA). Having isolation measures in place often goes a long way toward meeting these regulations and their requirements, as it helps block unauthorized access to sensitive data.

Protecting Website Performance – Having consistent performance levels for all websites hosted on a shared server can be a challenge, but it’s a necessity if a host is going to legitimately offer it to web hosting customers. Techniques that ensure each website operates independently can help prevent resource-intensive websites from monopolizing server resources, leading to improved performance across all hosted websites.
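
How a host actually enforces this varies – CloudLinux LVE, Linux cgroups, and per-account PHP-FPM pools are all common approaches – but the underlying idea is capping what any one account's processes can consume. Below is a minimal Python sketch of that idea using the standard library's Unix-only resource module; the limit values are hypothetical examples for illustration, not anything a particular host uses.

```python
# Minimal sketch: capping one process's resource use, the same idea
# shared hosts apply per account with tools like cgroups or CloudLinux LVE.
# Unix-only (uses the standard library's resource module); the limit
# values below are hypothetical examples.
import resource

# Cap CPU time at 30 seconds (soft and hard limits)
resource.setrlimit(resource.RLIMIT_CPU, (30, 30))

# Cap total address space (memory) at 512 MB
mem_bytes = 512 * 1024 * 1024
resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))

# Cap the number of open files
resource.setrlimit(resource.RLIMIT_NOFILE, (256, 256))

print("CPU limit:", resource.getrlimit(resource.RLIMIT_CPU))
print("Memory limit:", resource.getrlimit(resource.RLIMIT_AS))
```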

Isolation techniques do much for maintaining security, privacy, and performance in shared hosting environments. They allow for better performance metrics by having each website and user account operate independently, and this contributes to protecting the integrity of websites on shared servers.

Anyone considering a shared web hosting provider will do well to look into any isolation features they offer, and it’s good to ask about what measures they have in place to guarantee the safety and integrity of your online presence.

Digging into Deep Learning

Anything in life that requires learning in order to have functional ability is going to also have a hands-on experiential component if you’re truly going to become adept at whatever it is. There is value in learning how to do it, but usually the most integral part of the learning comes from actually being at ground level and doing it. These days AI is the buzzword connected to nearly everything in the digital world, and that world is where an increasingly large part of everything we have and make use of has its roots.

We've talked about machine learning before, and it's a concept that's fairly well understood within the parameters of how advanced computing has made advances in leaps and bounds over recent years. As the saying goes, you certainly can't stop progress, and the type of deep learning that people are talking about in relation to supercomputers and the like builds on machine learning and improves on it with a whole lot of experiential learning.

It is a profound subject, and more so for anyone or any business where much of what you do is being constantly reshaped by computing technology. That certainly applies to us here at 4GoodHosting in the same way it would for any reputable Canadian web hosting provider, and as AI continues its meteoric rise to prominence and increasing dominance in the space, it makes sense for us to have a look at deep learning and how it contributes so influentially to supercomputing.

Data Abundance

Deep learning is an advanced artificial intelligence technique whose usefulness comes from abundant data and increased computing power. Online language translation, automated face-tagging in social media, smart replies in your email, and the new wave of generative models are among the many reasons it's the main technology behind many of the applications we use every day.

Everyone will have heard all the buzz around the ChatGPT AI-powered chatbot and the way it is revolutionizing pretty much everything. Part of its strength is in how it can relate images to text descriptions, an example of how deep-learning systems that model the relation between images and text have massive potential for 'smarter' and more intuitive computing.

As mentioned, deep learning is a subset of machine learning, but one of the newer types. Classic rule-based AI systems, and machine-learning algorithms that develop their behavior by processing annotated examples in training, are much less relevant now. That relevance has shifted to deep learning and neural networks.

Classic machine-learning algorithms solve many problems that rule-based programs can't, but they don't do well with soft data such as video, images, sound files, and unstructured text. The best example of this is with predictive modelling, where the contributions of domain experts, computer programmers, and mathematicians aren't needed to anywhere near the same extent anymore. What takes the place of all that is an artificial neural network.

Human Brain Inspiration

Deep-learning algorithms tackle the same queries and challenges using deep neural networks, a type of software architecture that takes a lot of its framework from the human brain. Neural networks are layers upon layers of variables that adjust themselves based on what they can recognize in the properties of the data they are trained on. This makes them capable of tasks like classifying images and converting speech to text.

Neural networks are especially capable at going over unstructured data and finding common patterns in it, and in recent years the greater availability and affordability of storage, data, and computing resources have solidified neural networks as one of the pinnacles of AI innovation.
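
To make 'layers of variables adjusting themselves' concrete, here is a minimal sketch of a tiny two-layer network learning the classic XOR toy problem with plain gradient descent, using only numpy. Everything in the sketch – the layer sizes, learning rate, and step count – is an illustrative choice of ours; real deep-learning systems stack many more layers and train on vastly more data, but the self-adjustment mechanics are the same.

```python
import numpy as np

# Toy data: the XOR problem, a classic task a single layer can't solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # first layer of variables
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # second layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass: data flows through the layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error
    grad_out = (out - y) * out * (1 - out)
    grad_h = grad_out @ W2.T * h * (1 - h)

    # The "self-adjustment": nudge every variable against its gradient
    W2 -= 0.5 * h.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0)
    W1 -= 0.5 * X.T @ grad_h
    b1 -= 0.5 * grad_h.sum(axis=0)

print(out.round(2))  # should approach [[0], [1], [1], [0]]
```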

Applications

Computer Vision

The science of having software make sense of the content of images and video is referred to as computer vision. Deep learning has made extensive progress here, and one of the best use-case examples is cancer detection. This type of deep learning is also well established in many of the applications people use every day. Take Apple's Face ID, for example – it uses computer vision to recognize your face, the same way Google Photos does for features like searching for objects and scenes as well as image correction.

Voice and Speech Recognition

Most people have taken advantage of Siri or Alexa at some point, and deep-learning algorithms are very much at work here too as they play a role in how your voice is converted into text commands. Google’s keyboard app is Gboard, and it uses deep learning to deliver on-device, real-time speech transcription that types as you speak.

Natural Language Processing and Generation

Natural language processing (NLP) is the science of extracting the meaning from unstructured text, and developers of classic software have been looking for a workable fix to that challenge for decades. It’s not possible to define ALL the different nuances and hidden meanings of written language with computer rules, but neural networks trained on large bodies of text can execute many NLP tasks with accurate results.
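
As a quick illustration of how accessible those trained networks have become, here's a minimal sketch using the open-source Hugging Face transformers library (our choice for the example, not something named above); a few lines get you a working pretrained sentiment classifier.

```python
# Minimal sketch assuming the Hugging Face `transformers` package is
# installed (pip install transformers); downloads a pretrained model
# on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("The support team resolved my hosting issue in minutes.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```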

Google Translate is another popular resource, and it also experienced a boost in performance when deep learning was incorporated into it. Smart speakers use deep-learning NLP to understand the various nuances contained in spoken commands, basic examples being requests for weather forecasts or directions.

Benefits of Domain Privacy for Business Websites

It's fortunate that these days a greater number of people and businesses who are online are aware of the need to be smart about when and where they make phone numbers and email addresses available. Phishing scams are likely if you're negligent about where your contact details are accessible, and in worst-case scenarios identity theft or even having your website hijacked can happen.

Anyone who knows anything about acting will be familiar with what a stand-in does for an actor on set, and they're very valuable that way. Domain privacy works kind of like that stand-in: it's a service that replaces the individual's contact information with that of the service provider. Doing this secures you against anyone who might be trying to access your details with the aim of initiating fraud or malevolent practice.

Business owners investing in expensive websites will want to consider getting the domain privacy add-on when building their website or having it built for them. Keep in mind as well that contact information doesn't need to be publicly displayed on your website for it to still be available. The aim needs to be making sure it doesn't end up in WHOIS records, and this is a way to protect yourself from spambots that exclusively check WHOIS data.

This option is going to make a lot of sense for many individuals and businesses, and the growing need for expanded web security practices makes domain privacy a topic that any good Canadian web hosting provider is going to take an interest in, and that's true for us here at 4GoodHosting too. So this is what we'll look at with our blog entry this week.

Whose Name?

The way domain privacy functions starts with how domain names are registered. That's something we are explicitly familiar with, and most domain names in use on the Internet are registered under the oversight of an organization called ICANN – the Internet Corporation for Assigned Names and Numbers. Domain owners must provide their name and contact information when purchasing a new domain name, and that information is then listed in a database named WHOIS.

Once it is catalogued in the directory it becomes accessible to anyone on the internet via the many free domain lookup tools, which anyone with an internet connection can find easily and use. Domain privacy makes it possible to hide your contact information from the WHOIS directory, with a random email address and the registrar's own contact details made available instead of your own.
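
To see just how accessible that directory is, here's a minimal sketch of a raw WHOIS lookup in Python; the protocol is simply a plain-text exchange over TCP port 43. The server shown handles .com domains, and the queried domain is just an example.

```python
# Minimal sketch of a raw WHOIS lookup: a plain-text query over TCP
# port 43. Server and domain below are example values.
import socket

def whois_query(domain: str, server: str = "whois.verisign-grs.com") -> str:
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

print(whois_query("example.com"))
```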

We should note as well that there are legitimate reasons why ICANN makes it a requirement that every website owner put their contact details on display publicly. One of the bigger ones is the way it makes it easier for law enforcement agencies to track people in case there's illegal activity on their websites. Helping lawyers hold website owners accountable in cases involving plagiarism and copyright infringement is important too.

It also makes it possible for people who are interested in buying your domain to contact you, although those aren't always harmless communications either. The bad side of it can be people trying to send you marketing spam over email or phone. Hackers have used contact information to hijack websites, and publicly listed site addresses end up targeted for phishing scams very regularly.

Look to EU’s GDPR

Hackers and spammers often have dedicated bots at work crawling through WHOIS directories in order to generate lists of contact details and then pick and choose the ones where they think they see the most promise with whatever it is they’re aiming to do. When contact details don’t show up on the WHOIS records it’s much more likely a business will evade the attention of such systems and steer themselves clear of any of these types of problems.

Here in North America we might want to look at what has happened with the European Union's General Data Protection Regulation and how it relates to better protections for businesses that need to make contact information available. This set of data regulations aimed at protecting internet privacy is going far with this, with legislation resulting in everyone protected under EU law seeing their contact details redacted in WHOIS listings across the internet.

Rather than the actual contact information, what people will apparently see is 'redacted for privacy'. It may be wise for policy makers here in North America to do the same thing to offer people those same types of assurances when they're doing business online.

Reddit Now Looking to get Paid for Role in A.I. Development

Gleaning insight from conversation with others is what has made Reddit the hit it has been in the digital space, and if you have any area of interest you can always find at least one subreddit on it, and usually more than a few to look through. It's one of the most engaging social networking sites if you like to ingest your information by consuming text, and the simplicity of it is really nice too. 57 million people visit Reddit every day to chat and broaden their horizons with whatever subject it is they're interested in, and it's a great resource for that.

Looking at it from a different angle, you might be surprised to learn that Reddit chats have also served as a free teaching aid for companies like Google, Microsoft and – most notably these days – OpenAI. Reddit conversations have been used in the development of the giant artificial intelligence systems created by these three, and we're seeing how those systems are already becoming such a big deal in the tech industry.

It's definitely a development that everyone in our industry will take note of given the connection, and here at 4GoodHosting we're like any other quality Canadian web hosting provider in that seeing an online chat forum become an open-source type of asset for these big players in the digital world is something we'll be keen to follow as well as share here with our blog.

Charging for Access

And it is definitely interesting to see that Reddit wants to be paid for accessing its application programming interface now. That’s the method through which outside entities can download and process the social network’s vast selection of person-to-person conversations, and the heads of the company at Reddit don’t think they should be making that value available for free.
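
For a sense of what that access looks like in practice, here's a minimal sketch that pulls a public subreddit listing as JSON with Python's requests library. The subreddit and User-Agent string are placeholder values of ours, and any real use is subject to Reddit's API terms and, going forward, whatever pricing it settles on.

```python
# Minimal sketch of pulling public Reddit conversations as JSON.
# Subreddit and User-Agent are placeholder values; real use is subject
# to Reddit's API terms (and now, potentially, its pricing).
import requests

url = "https://www.reddit.com/r/webhosting/top.json"
headers = {"User-Agent": "example-research-script/0.1"}
resp = requests.get(url, headers=headers, params={"limit": 5, "t": "week"})
resp.raise_for_status()

for post in resp.json()["data"]["children"]:
    print(post["data"]["title"])
```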

So what we'll see now is Reddit charging for access to the conversations it hosts for the purpose of developing A.I. systems like ChatGPT, OpenAI's popular program. Reddit is also acting as it prepares for a possible initial public offering on Wall Street this year. As of now it makes most of its money through advertising and e-commerce transactions on its platform, and the discussion is ongoing around what it will charge for A.P.I. access.

The New Bots

Here are the other chatbots powered by artificial intelligence that have been utilizing Reddit as a development resource:

ChatGPT – the MUCH talked-about artificial intelligence language model from the research lab OpenAI. It is able to respond to complex questions, write poetry, generate code, translate languages, and even plan vacations. GPT-4, introduced in mid-March, is even able to respond to images.

Bing – Microsoft has a similar chatbot, capable of having open-ended text conversations around virtually any topic, though it apparently gave occasionally inaccurate, misleading, and weird responses that put it in a bit of a negative light when it first came out.

Bard – This is Google's chatbot, originally conceived as a creative tool designed to draft emails and poems. Notable for it is the way it can generate ideas, answer questions with facts or opinions, or write blog posts entirely on its own.

Ernie – Here is the lesser light of the group for sure, but Chinese search giant Baidu came out with its rival to ChatGPT in March. The name is a shortened version of Enhanced Representation through Knowledge Integration, but it hasn't made anything near the splash these other three have.

LLMs

L.L.M.s are sophisticated algorithms that companies like Google and OpenAI have developed, and these algorithms are able to take Reddit conversations as data, adding it to the vast pool of material being fed into the L.L.M.s to develop them.

Other types of companies are seeing value in conversational data like Reddit's, and also in what they have with images. Most people will know Shutterstock, an image hosting service. It sold image data to OpenAI to help create DALL-E, the A.I. program that creates vivid graphical imagery with only a text-based prompt required.

Artificial intelligence makers need two significant things to ensure their models continue to improve. The first is a staggeringly large amount of computing power, and the second is an enormous amount of data. Many of the major players in A.I. development have plenty of computing power, but still need to go outside their own networks for the data needed to improve their algorithms.

Other sources utilized alongside Reddit are Wikipedia, millions of digitized books, and academic articles. Reddit has also had a long-standing symbiotic relationship with the search engines of companies like Google and Microsoft, whose crawlers have been indexing Reddit's web pages to make the information available in search results.

With L.L.M.s though, the dynamic is different, as they obtain as much data as they can to create new A.I. systems like the chatbots. Reddit's A.P.I. is still going to be free to developers who want to build applications that help people use Reddit, and there's also an aim to incorporate more so-called machine learning into how the site operates. One possible benefit of that is that it might identify A.I.-generated text on Reddit and label it as such.