Reducing Costs in the Cloud using Machine Learning

In the modern-day business environment, where organizations are consistently asked to do more with less, businesses of all sizes continue to move to the cloud en masse. The switch to a cloud-based platform from a traditional on-premises installation is driven largely by a desire to reduce costs, with added benefits such as optimized performance and higher levels of compliance.

A comparison of cloud-based platforms to on-premises installations

However, organizations that use the cloud for hosting purposes alone have barely scratched the surface of its functionality. While features such as resource pooling and automatic backups are commonly used, others, such as machine learning in the cloud, remain underutilized.

Machine learning can broadly be defined as the science of feeding computers data to autonomously improve their learning over time, and it is starting to explode in the cloud-sphere. Azure was one of the first cloud platforms to announce machine learning capabilities (back in late 2014) and Amazon followed suit in 2015. More recently, Google launched Google Cloud Machine Learning in 2017, while Amazon introduced a more advanced machine learning platform called SageMaker at re:Invent 2017.

Machine learning in the cloud, at its core, comprises three separate requirements:

  1. The first, and most important, part of the machine learning process is converting unstructured data into understandable, value-rich data. The core principle is simple: without good data, there is no value to be derived – clean data is essential for machine learning purposes. Cloud providers make it easy to import your data – Amazon SageMaker integrates natively with S3, RDS, DynamoDB, and Redshift, and all three major providers allow you to bring in your own data as well.
  2. The second piece of the puzzle is the machine learning algorithm that the data runs through. SageMaker, for example, offers many built-in algorithms such as Linear Learner, XGBoost, and K-Means. It also allows you to use your own trained algorithms, which are packaged as Docker images (like the built-in ones). This gives you the flexibility to use almost any algorithm code, regardless of implementation language, dependent libraries, or frameworks.
  3. Lastly, machine learning scientists need a certain level of insight to be able to connect models and algorithms to business processes. Industry-standard processes for data mining (illustration below) show that understanding the core business and data is key to modeling algorithms. Without an understanding of what data can help make impactful business decisions, the output of any algorithm is useless.
An illustration of the Cross Industry Standard Process for Data Mining
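The three requirements above can be sketched end to end in miniature. The snippet below is a local stand-in written in plain Python – not any cloud provider's API – and every number in it is invented for illustration. It cleans raw records, runs them through a simple least-squares algorithm, and ties the output back to a business question:

```python
# Minimal end-to-end sketch of the three requirements:
# 1) prepare the data, 2) run it through an algorithm, 3) connect it to the business.
# All values are invented for illustration.

raw = [("10", "120"), ("20", "ERROR"), ("30", "310"), ("40", "405"), (None, "90")]

# 1) Data preparation: keep only rows that parse cleanly as numbers.
clean = []
for x, y in raw:
    try:
        clean.append((float(x), float(y)))
    except (TypeError, ValueError):
        continue

# 2) Algorithm: one-variable least-squares fit, y ~ a*x + b.
n = len(clean)
sx = sum(x for x, _ in clean)
sy = sum(y for _, y in clean)
sxx = sum(x * x for x, _ in clean)
sxy = sum(x * y for x, y in clean)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# 3) Business insight: does projected demand at x = 50 exceed our capacity of 450?
projected = a * 50 + b
needs_scaling = projected > 450
```

In a real cloud setup the cleaning and training steps would be handled by managed services (for example, a SageMaker built-in algorithm reading from S3), but the shape of the pipeline is the same.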

Machine learning in the cloud really is as simple as finding the right dataset (or collecting the data using built-in cloud monitoring services), training the right algorithm, and continuing to fine-tune the machine learning system. The barrier to entry is not as large as organizations perceive it to be, and not every user of a cloud-based machine learning system must be a trained machine learning scientist.

At this point, you may be wondering – how can I use these systems to reduce costs and increase operational intelligence in my current cloud setup?

One of the most important ways an organization can reduce costs is by proactively predicting performance issues and remediating them before they impact the business. Specifically, running a dataset of historical uptime/downtime metrics through a machine learning algorithm can help you identify when a server or system is likely to fail in the future. Using this knowledge, you can avert work slowdowns or stoppages and, in the process, avoid a substantial amount of lost productivity.
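As a sketch of that idea – assuming a simple linear-trend model and made-up hourly metrics, not any cloud provider's actual monitoring API – the snippet below fits a trend line to a server's recent error-rate samples and flags the server if the extrapolated rate crosses an alert threshold within the next half day:

```python
from statistics import mean

# Hourly error-rate samples for one server (invented data).
hours = [0, 1, 2, 3, 4, 5]
error_rate = [0.010, 0.012, 0.015, 0.019, 0.024, 0.030]

# Fit error_rate ~ slope * hour + intercept by least squares.
mx, my = mean(hours), mean(error_rate)
slope = (sum((x - mx) * (y - my) for x, y in zip(hours, error_rate))
         / sum((x - mx) ** 2 for x in hours))
intercept = my - slope * mx

# Flag the server if the projected rate crosses 5% within the next 12 hours.
ALERT_THRESHOLD = 0.05
projected = slope * (hours[-1] + 12) + intercept
at_risk = projected > ALERT_THRESHOLD
```

A production system would use a far richer model and metrics streamed from a monitoring service, but even this toy trend line captures the principle: act on the projection before the outage happens.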


In addition to technical cost reduction, machine learning in the cloud is already being used to cut costs by improving processes across many industries. One great use case is how Google Cloud augments typical help-desk scenarios with Cloud AI using an event-based serverless architecture. It also has the potential to completely revolutionize healthcare, as it can provide alerts from real-time patient data while also enabling proactive health management for existing patients.

In summary, while cloud migrations are ubiquitous, the true power of the cloud remains untapped as long as services like machine learning stay underutilized. These tools, offered by all the major cloud providers, can help organizations by providing proactive analyses and minimizing operational waste.



Best Practices for AWS EC2

There are several things to consider before clicking that “Launch” button in the AWS (Amazon Web Services) console. The more you plan and take into consideration ahead of time, the more headaches you can save yourself down the road. I will go over some fundamental best practices to consider before launching your EC2 instance, covering storage, security, backup/recovery, and finally management. Continue reading “Best Practices for AWS EC2”

Improving Vendor-Provided Visio Stencils that are Less than Great

As an IT professional, I’ve had occasion to use Microsoft Visio perhaps two or three thousand times. Given that it has been the de facto standard in creating visual representations of complex architectures and other technological concepts for close to two decades, it’s likely you’ve encountered the tool yourself.

Generally speaking, it’s all well and good to open the application, choose applicable stencils for your project (bundled or downloaded from a third party), and start dragging shapes onto the page. A few labels and some strategically placed lines, and you’ve got yourself a passable diagram ready to share with your colleagues… yay. There are times, however, when one of these third parties makes available a well-intended, but ultimately awful, set of stencils that they invite you to use to document their nifty widgets and doodads… thus was my experience recently when I downloaded Amazon’s AWS Visio stencils, and this blog post details what I did about it.

Continue reading “Improving Vendor-Provided Visio Stencils that are Less than Great”

AWS Certification – Solutions Architect Training Insights

AWS Certification Solutions Architect - Associate Level by Buddy Brooks at Easy Dynamics

Amazon Web Services offers certification testing for IT professionals interested in advancing their careers in the Amazon realm.

In September 2015, I completed the certification exam for the AWS Certified Solutions Architect – Associate. It is by far one of the most difficult certification exams I have ever taken. The exam is designed to measure your “technical expertise in designing and deploying scalable, highly available, and fault tolerant systems on AWS.” Continue reading “AWS Certification – Solutions Architect Training Insights”

AWS S3 Storage Concepts for Windows Admins


If you happen to be a Windows Admin (you know who you are) and are beginning to work in the Amazon Web Services environment, there are some things you need to know! Unless you are at least a little familiar with Linux, you are in for an adventure in confusion. As with any product, there is a learning curve when you first begin to explore it. The first challenge is learning to “speak the same language.” Even though I have experience using Linux, I still stumbled until I digested the explanations. Be careful! The language/concept barrier can lead to a great deal of confusion and difficulty. There are many AWS terms that carry special meaning. In fact, it is almost a different way of thinking. This blog will cover how I went about breaking the barrier and how you can too!

Continue reading “AWS S3 Storage Concepts for Windows Admins”