According to one widely cited estimate, 78% of US small businesses will have fully adopted cloud computing by 2020.

Reasons Why You Need to Take Advantage of the Cloud Today

Boost Collaboration – The cloud allows you to access your files from anywhere. That means you can collaborate as long as you have an Internet-enabled device. It’s also beneficial if for some reason your usual computer or server is out of action.

More Engagement – The cloud is by far the best system of engagement for meeting the needs of customers. The ability to pull up files and presentations from anywhere provides more tools to work with.

Speed – The cloud enables companies to innovate faster. Kuma Games, for example, has worked with IBM to offer episodic video games via the cloud, making them faster, graphically superior, and higher performing than competitors’ titles.

Achieving these three benefits doesn’t require any significant investment. The cloud has caught on so quickly with businesses of all sizes because of how affordable it is. CloudBerry, for example, has managed to scale its services and make them as flexible as possible, so companies pay for exactly what they use.

The Cloud and Disaster Recovery

That said, when you move to the cloud you should integrate it with your disaster recovery plan, so that if something goes wrong you can continue to deliver services to customers. There are several reasons to combine your contingency plans with the cloud.

The benefits of combining the cloud with a disaster recovery plan include:

  • Better protection. The cloud helps ensure your data is recoverable thanks to strong encryption and the fact that your data is replicated across multiple locations. Even a natural disaster is unlikely to wipe you out.
  • Low cost of ownership. There’s little capital expense involved. All you have to do is pay the subscription for the service you use, and there are no ongoing maintenance costs to take into account.
  • Ease of use. Cloud interfaces are typically easy to use and require little technical expertise.


Benefits of a Cloud-Based Model in Today’s Competitive Environment, Where Uptime and Scalability Are Challenges

  1. Reduced time to benefit

Unlike the traditional model, in SaaS the software (application) is already installed and configured. The user can provision a cloud instance and have the application ready for use within a couple of hours. This reduces the time spent on installation and configuration and avoids many of the issues that can get in the way of software deployment.

  2. Lower costs

SaaS has a cost advantage, since it usually resides in a shared or multi-tenant environment where hardware and software license costs are low compared with the traditional model.

Another advantage is that the customer base can grow, since SaaS allows small and medium businesses (SMBs) to use software that they otherwise could not afford due to high license costs.

Maintenance costs are reduced as well, since the SaaS provider owns the environment and it is split among all customers that use that solution.

  3. Scalability and integration

Usually, SaaS solutions reside in cloud environments that are scalable and integrate with other SaaS offerings. Compared with the traditional model, users do not have to buy another server or software; they only need to enable a new SaaS offering, and the SaaS provider owns the server capacity planning.

  4. New releases (upgrades)

SaaS providers upgrade the solution, and it becomes available to their customers. The costs and effort associated with upgrades and new releases are lower than in the traditional model, which usually forces the user to buy an upgrade package and install it, or to pay for specialized services to get the environment upgraded.

  5. Easy to use and perform proofs of concept

SaaS offerings are easy to use since they already come with best practices and samples built in. Users can run proofs of concept and test the software’s functionality, or a new release’s features, in advance. They can also run more than one instance with different versions to do a smooth migration. Even for large environments, users can use SaaS offerings to test the software before buying it.

Before Moving to a Cloud-Based Model, Every Company Should Ask Itself These 12 Cloud Questions

#1: Are you using the right tools? Sixty percent of UK IT managers surveyed in The Register’s cloud survey said they were using VPN connections, but only 34 percent said they were using cloud firewalls or encrypting data at rest. The numbers continued to drop for other preventative measures, down to the bottom of the list, where only 15 percent said they were using obfuscation or tokenization of sensitive data.

#2: What cloud technologies are being shared, and with whom? Cloud service providers often share infrastructure, platforms and applications to deliver their services in a scalable way.

“Whether it’s the underlying components that make up this infrastructure (e.g. CPU caches, GPUs, etc.) that were not designed to offer strong isolation properties for a multi-tenant architecture (IaaS), re-deployable platforms (PaaS), or multi-customer applications (SaaS), the threat of shared vulnerabilities exists in all delivery models,” writes the Cloud Security Alliance.

#3: How do you define and determine the best ways to deal with cloud abuse? The Cloud Security Alliance defines cloud abuse as “a bad guy using a cloud service to break an encryption key too difficult to crack on a standard computer. Another example might be a malicious hacker using cloud servers to launch a DDoS attack, propagate malware, or share pirated software.” 

#4: Do you allow employees to use their own devices? The rise of bring-your-own-device (BYOD) and bring-your-own-application (BYOA) means that many cloud services and tools are sneaking into organizations under the noses of IT leaders. In a recent survey, more than half of the IT respondents said that when it came to cloud services, the biggest challenge was assessing the security risk before employee adoption.

#5: Are you ready for next-generation technology and the Internet of Things (IoT)? Gartner predicts that the IoT market will grow to 26 billion units by 2020. With the proliferation of connected devices, is it any surprise that IT managers are increasingly concerned about the security risk of those devices?

#6: How do you protect credentials from theft? In 2010, Amazon was subject to a cross-site attack that used malicious scripts in a benign account to launch more attacks. Many companies are prohibiting the sharing of accounts and now require strong two-factor authentication techniques.

#7: How do you identify and stop malicious insiders? A 2015 Experian study claimed that employees, particularly those working remotely or using their own mobile devices, accounted for more than half of security incidents last year. A current or former employee, contractor, or business partner with access through IaaS, PaaS, SaaS, or traditional infrastructure can often be the source of an enterprise’s greatest risk.

#8: How do you handle the riskiest of apps, storage? Cloud-based storage applications have access to very sensitive corporate data, particularly financial data.

#9: Is your cloud service provider responsible for security? To fully secure data in the cloud, enterprise IT teams should never solely rely on their cloud provider. Ensure you have a solid security strategy in place that is agnostic to the location of your data and applications.

#10: How flexible and collaborative is your IT department in meeting the challenges associated with new technologies and quickly responding to security threats? The majority of IT managers are seeing a shift toward more collaboration and pooling of previously siloed resources, opening up opportunities for better cloud security measures.

#11: Are your cloud-based applications being monitored for inbound and outbound traffic anomalies? The difference between a minor incident and massive breach often comes down to the ability to quickly detect, contain and mitigate an attack. Analysts at the Ponemon Institute estimate it took retailers, on average, 197 days to identify an advanced threat and 39 days to contain it, while financial services organizations needed 98 days to identify and 26 to contain.

#12: What is your company policy when it comes to managing sensitive data and file sharing? On average, more than 25 percent of employees will upload files containing sensitive data to the cloud.

And When To Say No To The Cloud

With cloud adoption on the rise, should you jump on the bandwagon? Depending on your business needs, it may make more sense to wait.

Cloud adoption rates are increasing as more organizations turn to a truly distributed infrastructure model and use more WAN-based tools. As underlying hardware components become better and more bandwidth becomes available, cloud computing has become a legitimate consideration for a wide array of industry verticals. Everyone should be either adopting it or at least considering it, right? Not so fast.

In numerous conversations with customers using various technologies, I hear a lot of discussion about cloud computing. However, these conversations are changing. Managers are no longer asking what the cloud is; now they want to know whether they really need it.

The reality is simple: Some businesses just don’t.

The term “cloud” really just means data distribution over the WAN. This can be a private, public, or hybrid model. Because of the massive presence of the Internet, most organizations are already using the cloud without even knowing it. They’re utilizing cloud computing components within their means and needs.

On the other hand, some organizations keep a completely localized environment and only use WAN-based technologies to share files, store backups, or host sites on the Internet. Really, all of these technologies were available before the cloud became popular. Because of this, administrators are asking, “Why do I need more when I already have so much?” Depending on their business needs, they may be quite right.

Too many organizations get caught up in the hype of the cloud conversation without doing a true cost/benefit analysis. This can involve several business stakeholders, interviews with internal and external resources, and a clear vision of where the organization is going. Instead of jumping on the cloud bandwagon, organizations should take the time to understand the cloud’s pros and cons and how those fit their business strategies.

There are distinct advantages to moving to a cloud model, including improved disaster recovery, backup and storage, testing and development, easier management, and enabling data center consolidation.

At the same time, it’s important to remember that cloud computing also has drawbacks:

  • Certain knowledge levels are required. Remember, the cloud isn’t just one platform — it’s a lot of different technologies all working together to bring you data. Your organization will need to have virtualization, application, security, and cloud experts on hand to guide the whole process along.
  • Management and monitoring can be a challenge. Improper resource allocation can make cloud computing a serious cost center for any organization.
  • Security and data control, in some cases, may still be an issue for you. If you’re bound by compliance requirements, putting data into the cloud can violate some rules. Also, the cloud can sometimes be a dangerous place; a recent Amazon Web Services console breach is evidence of one instance where a DDoS attack impacted a major cloud provider.
  • Reliability isn’t a given. There have been major outages, which have forced some businesses to rethink the cloud model. For example, outages at AWS have caused companies like Netflix to go down for extended periods of time.

In some cases, a cloud model is just not the right fit. Whether it’s cost prohibitive or it just doesn’t provide any additional benefits, cloud computing may not be the right choice for the time being.

However, I’m not saying to be complacent. Complacency in IT can ruin an organization or a career. Take the time to fully understand the cloud model and how it may — or may not — fit your business.

As with any technology, there will be benefits and challenges. In some cases, moving to the cloud may just not be conducive to the goals of the organization. It’s quite possible that a company has no intention of expanding or moving its infrastructure to the Internet. Or there may not be a need to offload workloads into the cloud. Also, there may be other good technologies to help deliver data and content to the end-user.

The bottom line is: The cloud model is powerful, and many organizations are adopting some part of it. But with any tool, piece of software, or technological advancement, there needs to be a fit.

Thanks and Regards

Vijay Jain

+91 9870 291860



Microsoft Azure vs. Amazon Web Services: Cloud Comparison

Azure vs. AWS: Side-by-Side Feature & Services Comparison

While some of the features, services and options that you’ll find in Azure and AWS can’t be fully compared to one another, many come pretty close. Here’s our attempt at a side-by-side comparison between the two cloud platforms.


Category | Microsoft Azure | Amazon Web Services (AWS)
---------|-----------------|--------------------------
Available Regions | Azure Regions | AWS Global Infrastructure
Compute Services | Virtual Machines (VMs) | Elastic Compute Cloud (EC2)
 | Cloud Services, Azure Websites and Apps | Amazon Elastic Beanstalk
 | Azure Visual Studio Online | None
Container Support | Docker Virtual Machine Extension | EC2 Container Service (Preview)
Scaling Options | Azure Autoscale | Auto Scaling
Analytics/Hadoop Options | HDInsight (Hadoop) | Elastic MapReduce (EMR)
Government Services | Azure Government | AWS GovCloud
App/Desktop Services | Azure RemoteApp | Amazon WorkSpaces, Amazon AppStream
Storage Options | Azure Storage (Blobs, Tables, Queues, Files) | Amazon Simple Storage Service (S3)
Block Storage | Azure Blob Storage | Amazon Elastic Block Store (EBS)
Hybrid Cloud Storage | StorSimple | None
Backup Options | Azure Backup | Amazon Glacier
Storage Services | Azure Import Export | Amazon Import/Export
 | Azure File Storage | AWS Storage Gateway
 | Azure Site Recovery | None
Content Delivery Network (CDN) | Azure CDN | Amazon CloudFront
Database Options | Azure SQL Database | Amazon Relational Database Service (RDS), Amazon Redshift
NoSQL Database Options | Azure DocumentDB | Amazon DynamoDB
 | Azure Managed Cache (Redis Cache) | Amazon ElastiCache
Data Orchestration | Azure Data Factory | AWS Data Pipeline
Networking Options | Azure Virtual Network | Amazon VPC
 | Azure ExpressRoute | AWS Direct Connect
 | Azure Traffic Manager | Amazon Route 53
Load Balancing | Load Balancing for Azure | Elastic Load Balancing
Administration & Security | Azure Active Directory | AWS Directory Service, AWS Identity and Access Management (IAM)
Multi-Factor Authentication | Azure Multi-Factor Authentication | AWS Multi-Factor Authentication
Monitoring | Azure Operational Insights | Amazon CloudTrail
 | Azure Application Insights | Amazon CloudWatch
 | Azure Event Hubs | None
 | Azure Notification Hubs | Amazon Simple Notification Service (SNS)
 | Azure Key Vault (Preview) | AWS Key Management Service
Compliance | Azure Trust Center | AWS CloudHSM
Management Services & Options | Azure Resource Manager | Amazon CloudFormation
API Management | Azure API Management | None
Automation | Azure Automation | AWS OpsWorks
 | Azure Batch, Azure Service Bus | Amazon Simple Queue Service (SQS), Amazon Simple Workflow (SWF)
 | None | AWS CodeDeploy
 | Azure Scheduler | None
 | Azure Search | Amazon CloudSearch
Analytics | Azure Stream Analytics | Amazon Kinesis
Email Services | Azure BizTalk Services | Amazon Simple Email Service (SES)
Media Services | Azure Media Services | Amazon Elastic Transcoder
 |  | Amazon Mobile Analytics, Amazon Cognito
Other Services & Integrations | Azure Machine Learning (Preview) | None
 | None | AWS Lambda (Preview)
 | None | AWS Config (Preview)



Amazon WorkSpaces: Your Desktop in the AWS Cloud


The cloud-based virtualized desktop is, according to many cloud experts, the up and coming next step toward a complete takeover of all of our computing activities by the cloud. Virtualized desktops hosted in the cloud can take two different forms:

  • VDI (Virtualized Desktop Infrastructure) that is provided over the Internet, or
  • Desktop as a Service (DaaS), which is still a form of virtualized desktops but is a true multi-tenant cloud service

Persistent desktops: what problems do they solve?

The lack of consistency has long been a source of frustration for computer users; it’s a pain to have to spend time changing settings, or to adapt to a different (even if only slightly) interface, when switching from one device to another. It’s not uncommon, either, to find that the document a user was working on was saved to one device’s local hard drive and either isn’t available at all on the current device (in the worst case, it must be recreated), or the user loses productivity time establishing a connection back to the home or work network where the document is stored in order to retrieve it and continue working on it.

Another common scenario: the user has access to his or her documents (perhaps because they’re stored on a cloud service, perhaps because the user transferred a copy via a USB drive or email), but then finds that the application needed to work with them isn’t installed on the new device. Granted, this is less of an issue than it used to be, now that functional online versions of Microsoft Office or Google Docs can be used from any machine, and many mobile apps can be downloaded quickly and easily (without paying again if the user already owns them) from a mobile OS vendor’s store, but it still happens. This is particularly true for custom line-of-business applications.

A big advantage of persistent desktops is that both the user’s applications and the user’s data are always in the same place and accessible through the virtualized desktop, so that switching from one device to another suddenly becomes a seamless experience.

From the company’s point of view, the ability to quickly deploy desktops to new users can be a big plus, especially in special cases such as mergers and acquisitions, which seem to be increasingly common in many industries these days. You can bring in a large number of new employees and quickly get them up and running.

The other big thing for administration is that deploying virtual desktops gives you more control over them, more easily, than you might have with dozens, hundreds or even thousands of individual computer desktops. And this is true independently of the hardware. That is, users can bring their own devices – laptops and tablets – or work from their home desktop systems, and you still have control over their work desktops that they’re accessing with those devices. What’s not to like?

Concerns and issues surrounding Desktop as a Service

In spite of the benefits of delivering user desktops as a service over the Internet, as described above, there are downsides, as there are with any technology. Some users and IT pros may be resistant to the idea of a desktop that lives in the cloud. There may be concerns around security and privacy, which are common in relation to any transition to a cloud computing experience. In the case of the desktop, reliability and accessibility might also be an issue for some users, who fear that a loss of Internet connection (or just a loss of connectivity between the user’s machine and the server) could result in an almost total loss of productivity since the desktop is the location, for the average user, where everything lives. It’s their window to the computing world and if that window is closed for any reason, they may feel lost.

Admins may have the same and/or different concerns about DaaS. The major cloud providers offer SLAs that generally start at around “three nines” or 99.9% up time. That’s pretty standard throughout the industry and it sounds good – but in reality that translates to about eight and three quarters hours of down time per year, or almost forty-four minutes per month. While that’s not a lot, that much time without the use of his/her desktop could make a big difference if a particular user happens to be working to a very tight deadline on a critical project at the time that the cloud service goes down.
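Those downtime figures follow directly from the SLA percentage. A quick sanity check in plain Python:

```python
# Convert an SLA uptime percentage into the downtime budget it implies.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def allowed_downtime(sla_percent: float) -> dict:
    """Return the yearly/monthly downtime allowed under an uptime SLA."""
    down_fraction = 1 - sla_percent / 100
    hours_per_year = HOURS_PER_YEAR * down_fraction
    return {
        "hours_per_year": round(hours_per_year, 2),
        "minutes_per_month": round(hours_per_year * 60 / 12, 1),
    }

print(allowed_downtime(99.9))   # → {'hours_per_year': 8.76, 'minutes_per_month': 43.8}
print(allowed_downtime(99.99))  # "four nines", for comparison
```

At "three nines" this confirms the figures above: about 8.76 hours per year, or roughly 44 minutes per month.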

Another common problem is adapting to the occasional (or sometimes frequent) latency issues that can plague DaaS implementations. Latency doesn’t just add up to a performance hit – it also makes for a frustrating experience for users who aren’t used to the “sit and wait” situation when performing tasks on their desktops. There has to be enough bandwidth to give users an experience that’s the same as or close to what they’re used to when working on a local desktop, because otherwise you’ll end up with very unhappy users.

Different types of applications are more or less affected by latency (or perhaps more accurately, the effects of latency will be more or less noticed by users). Real time communications and collaboration tools such as Skype are noticeably affected, as are multi-media applications that involve high quality video. Applications such as email, or browsing low bandwidth web sites (mostly text and photos), on the other hand, won’t be noticeably affected.

Weighing the pros and cons

Many companies are coming to the conclusion that the drawbacks of putting user desktops in the cloud are outweighed by the benefits, and in particular the cost benefits. Providing the CPU, RAM, and disk space for individual workstations can be much more expensive than virtualizing those resources, and DaaS solutions generally deliver significant cost savings over on-prem VDI for most organizations, thanks to economies of scale and because on-prem VDI carries both expensive administrative overhead and a large capital expenditure.

DaaS, like other “as a service” computing, cuts the need for capital investments and shifts that cost to fixed and predictable on-going monthly or annual fees, thus moving big chunks of budget from CapEx to OpEx (capital to operational expenses). It also provides fast scalability (both up and down) and fits better into today’s “agile” model of doing business.

Once you’ve decided that DaaS is the right option for your org, you’re faced with the challenge of evaluating different DaaS providers and determining which is right for your needs. A comprehensive comparison of DaaS providers is beyond the scope of this article, but many companies today are using Amazon’s AWS (Amazon Web Services) for their IaaS, PaaS, cloud storage, and other cloud computing needs. If you’re already an AWS customer or if you’re considering AWS services in general, it makes sense to check out their DaaS offering when you decide to put some or all of your users’ desktops into the cloud.

Amazon’s DaaS: Workspaces in the AWS cloud

Amazon’s DaaS offering is called Amazon WorkSpaces. This can be a lower cost alternative to expensive and difficult to configure (and manage) VDI deployments in your on-premises data center, while giving users similar functionality. The nice thing for users about VDI, in comparison to traditional local desktops, is that they are able to have the same experience and interface regardless of whether they’re connecting from a PC at the office, a home computer or a laptop. They can even get that same computing environment when using a Mac or iPad, a Chromebook, or an Android tablet (including, of course, Amazon’s own Kindle Fire), as Workspaces supports all of these.

Workspaces is a robust DaaS solution that will work in conjunction with your company’s Active Directory, making it easy for users to sign onto their desktops with their current credentials that they use in the enterprise. It also makes things easy for admins, taking much of the burden of deploying and managing VDI off of you; Amazon takes care of such tedious tasks as desktop OS patching, and there are a number of different “bundles” of the service that you can subscribe to, depending on what your hardware and software needs are.


But what is it really going to cost?

We all know that service providers’ claims of gargantuan savings by adopting their services sometimes pan out and sometimes they don’t. Hosted desktops have been touted in many circles as a way to lower your TCO but sometimes there are hidden costs. In general, DaaS is more cost effective than VDI, which can be difficult to scale and upgrade because it’s usually built on enterprise (vs. cloud) infrastructures.

Within the DaaS options, though, there is a wide variance in pricing structures and ultimate costs. Some DaaS providers set a minimum number of desktops that you have to order, or set minimum usage requirements. Some charge licensing fees for the operating system separately. Others require that you commit to a long term contract (one year is common) so you’re locked in for that time period even if you find the DaaS solution doesn’t meet your needs.

When Amazon first released its WorkSpaces service in 2013, Gene Marks asked whether it was too good to be true. He concluded that the cost – which on the face of it seemed considerably lower than that of the company hosting his 10-person company’s applications – would, after adding in the cost of Exchange and migrating databases, end up close to the same per-month, per-user outlay.

That was two years ago and the market has matured somewhat over that time. Amazon offers three WorkSpaces bundles as well as a couple of applications options, which we will look at now.

AWS WorkSpaces pricing options

As a cloud service, WorkSpaces is a subscription that you pay for on a per-desktop, per-month basis. Unlike with some services, you are not required to sign a contract that locks you into WorkSpaces for any set period of time. You can delete some or all of your WorkSpaces as your changing needs dictate, and you’re charged only for the WorkSpaces you use that month. That means if you have a user who takes a two-month leave of absence and that WorkSpace is never launched, you won’t be billed for it. That’s a big advantage over deploying your own desktops (for instance, from a local Remote Desktop Server), since you would incur the costs of that desktop whether or not it was used.

In order to provision WorkSpaces for your users, you have to have an AWS account; it’s not a standalone service. Amazon offers free accounts for one year that include 750 hours of EC2 usage (Windows or Linux, t2 level), 5 GB of S3 storage, RDS (relational database), DynamoDB (NoSQL database), and a number of other services. Note that after one year you have to pay regular rates, and you have to sign up with a credit card to get the free trial, since you’re charged if you exceed the usage caps. Individual users themselves do not need AWS accounts.

The basic WorkSpace for each user runs a Windows 7 desktop experience (delivered from Windows Server 2008 R2) and comes with Internet Explorer 11, Firefox, and 7-Zip already installed. You can install your own software using Amazon WAM (WorkSpaces Application Manager), which comes in two versions: Lite, which is free, and a more full-featured Standard version, which costs $5 per user per month. We’ll discuss later in this article series how to use WAM to add software to WorkSpaces.
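Provisioning is exposed through the AWS API as well as the console. As a minimal sketch (assuming the boto3 SDK and configured AWS credentials; the directory ID, user name, and bundle ID shown are placeholders, not real values), creating a WorkSpace looks like this:

```python
# Sketch: provisioning WorkSpaces via the AWS API (boto3).
# All IDs below are placeholders -- look up real ones with
# describe_workspace_directories() and describe_workspace_bundles().

def build_workspace_request(directory_id: str, user_name: str, bundle_id: str) -> dict:
    """Build one entry for the WorkSpaces create_workspaces() call."""
    return {
        "DirectoryId": directory_id,
        "UserName": user_name,
        "BundleId": bundle_id,
    }

def provision(requests: list) -> None:
    """Submit the requests to AWS (needs credentials; not run here)."""
    import boto3
    client = boto3.client("workspaces")
    resp = client.create_workspaces(Workspaces=requests)
    for failed in resp.get("FailedRequests", []):
        print("failed:", failed.get("ErrorMessage"))
    for ws in resp.get("PendingRequests", []):
        print("pending:", ws.get("WorkspaceId"))

# Example (requires an AWS account and credentials):
# provision([build_workspace_request("d-1234567890", "jdoe", "wsb-placeholder")])
```

The call is asynchronous: accepted requests come back in `PendingRequests` and reach the user once the WorkSpace finishes building.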

Cost per user

The cost per user for WorkSpaces depends on the hardware configuration that you need for each desktop. That, of course, depends on what software the users run, how much data they need to store, the number of applications they need to run at the same time, and so forth; in other words, whether the work scenario is light usage, typical office usage, or heavy usage with resource-intensive applications. If users only need to check email and do web searches, hardware requirements are minimal. If they will be working with video editing, CAD, or other “power user” applications, they will need more processor, memory, and storage.

Amazon offers three different hardware configurations, which Amazon refers to as “bundles”:

  • For light users, the Value package provides one virtual processor, 2 GB of memory and 10 GB of storage. This is similar to a low-end PC or a mid-range tablet, although most top tier smart phones today actually have more memory and storage than this. The price is $21 per user per month if you “BYOL” (bring your own license, discussed in more detail below) or $25/user/month if you don’t already have the requisite Windows 7 licenses for your users.
  • For the average user, the Standard package costs an extra $10 per month per user. For that extra cost, you double the CPU to two virtual processors, double the RAM to 4 GB, and increase the storage five-fold to 50 GB. This should suffice for most office productivity and communications programs.
  • If you have power users who need to do heavy lifting from their desktops, then you’ll want to check into the Performance package. It’s pricey at $56/user/month with your own license or $60 without, but it ups the memory to 7.5 GB and increases storage space to 100 GB, which is enough to get some serious work done. CPU stays the same at two virtual processors.

The pricing mentioned above is for customers in the U.S./North America. Pricing in Europe is different (slightly higher) and a bit higher still in Asia Pacific regions (Sydney, Tokyo, Singapore).

As mentioned above, you can install your own software. Another option is to purchase the “Plus” add-on to any of the three packages. For an extra $15 per user per month (in all regions), you get Microsoft Office Professional and Trend Micro Security Services already installed.
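To compare bundles for a given team size, the list prices above can be dropped into a small calculator. A sketch using the US figures quoted in this article (note: the Standard bundle’s $31/$35 base is inferred from its being “an extra $10” over Value, and all prices will drift over time, so treat these as illustrative only):

```python
# Rough per-month cost estimator using the US pricing quoted in the article.
BUNDLE_PRICE = {            # (BYOL price, non-BYOL price), per user per month
    "value":       (21, 25),
    "standard":    (31, 35),   # assumed: Value + $10, per the article's wording
    "performance": (56, 60),
}
PLUS_ADDON = 15  # "Plus" add-on: Office Professional + Trend Micro, per user/month

def monthly_cost(bundle: str, users: int, byol: bool = False, plus: bool = False) -> int:
    """Estimate the total monthly bill for `users` desktops on one bundle."""
    base = BUNDLE_PRICE[bundle][0 if byol else 1]
    return users * (base + (PLUS_ADDON if plus else 0))

print(monthly_cost("value", 10))                  # → 250
print(monthly_cost("standard", 10, byol=True))    # → 310
print(monthly_cost("performance", 5, plus=True))  # → 375
```

Because billing is per WorkSpace per month, scaling the estimate up or down is just a matter of changing the `users` argument.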

You can also create custom images that you configure with the desired applications and settings and then deploy to your users. We’ll talk about how to do that later. Admins can create as many as five custom images for an AWS account (per region). You can install any software you want that is compatible with Windows 7, but of course you are responsible for having the proper licenses for the programs that you install.


The Bring Your Own License option is for those organizations that already have licenses for Windows 7 through a Microsoft Volume Licensing agreement with a software assurance contract. Doing this saves you money (about $4 per user per month) but it also makes getting started with WorkSpaces a little more complicated. In practice, you’ll probably need to work with your company’s Microsoft volume licensing representative (to verify that your licenses are eligible for BYOL) and with the AWS account manager for the particulars on uploading your Windows 7 images and making an AMI (Amazon Machine Image). All of this typically takes a week or two so you might not be able to get started immediately as you can without BYOL.

Windows 7 Professional or Enterprise edition can be used to create your AMI. After you import the image, you’ll need to build a custom bundle that includes that image. You’ll need to activate the OS, which can be done with Microsoft activation servers within your virtual private cloud (VPC) or that can be accessed from your VPC. We’ll go into deeper detail about how to do all this in a later section.

Note that unlike the non-volume licensed WorkSpaces, there is a minimum number of WorkSpaces that you have to launch per month in order to use the BYOL option. At the time of this writing, that number is 200. For this reason, as well as the VL agreement requirement, BYOL is only feasible for large organizations, not for small businesses.

The cloud security dilemma

Security is always a consideration when facing the “to the cloud or not to the cloud” choice. Data breaches have become so commonplace that they’re hardly “news” anymore, yet they continue to dominate the headlines; a study from the Ponemon Institute found that more than 40% of companies experienced some type of breach in 2014, including big names such as JPMorgan Chase, Home Depot and, of course, the infamous Target case.

Organizations are terrified of being the next victim (and the loss of customers that can result from the bad publicity). Data leaks can occur in many different ways and employees who access sensitive data on their desktops present one attack vector. While virtual desktops have security advantages, they can bring new challenges. This is where the DaaS provider you choose can make a difference.

In search of a secure WorkSpace

In looking at the security of a DaaS solution, the questions that you want to ask and the features that you want to look for are similar to those you must consider with any cloud service. A secure DaaS implementation hinges on a number of factors:

  • Secure logon to the service: user authentication must be strong in order to prevent unauthorized users from accessing desktops where they can access sensitive information.
  • Reliable identity services: regardless of the strength of the authentication protocols, authentication is built on the foundation of identity, so the identity database itself must be secure.
  • Encryption: when the entire desktop is being delivered to the user over the Internet, it should be encrypted to prevent interception by unauthorized persons.
  • Effective key management: the keys used to encrypt the virtual drives on which the desktop “lives” must be protected.
  • Physical security: the servers in the provider’s datacenter where the applications actually run have to be secured from access by unauthorized or malicious persons, both internal and external.

AWS WorkSpaces security

With AWS WorkSpaces, Amazon implements a number of different security mechanisms in an effort to address the above issues.

Log-in security

WorkSpaces admins can choose from a few different ways to allow their users to log onto the WorkSpaces desktops. The simplest is to have users create credentials (user name and password) of their choice after you provision their desktops. Most medium to large (and many small) organizations will already have an Active Directory deployment and you can integrate WorkSpaces with your Active Directory domain to make it easy for users – they sign in with their familiar AD credentials.

Identity and authentication

Users can log on with credentials stored in a directory that’s maintained and managed by Amazon on their servers, or with AD credentials, depending on how WorkSpaces has been configured. Active Directory is, of course, the standard identity repository on Windows-based networks, and that holds for Amazon’s cloud-based services too once an organization has integrated its on-premises AD with AWS.

You can integrate WorkSpaces with your RADIUS server if you have one. Amazon added this feature in August of 2014 and Microsoft RADIUS servers are supported, along with others. For redundancy and high availability, you can set it up to use multiple RADIUS servers, with or without a load balancer.

Admins configure the RADIUS integration through the WorkSpaces admin console (in the Directories section) and there’s no extra cost. You’ll need to configure the IP address(es) for your RADIUS server(s) or load balancer, the port your RADIUS server uses, a shared secret, and select the protocol you set up for your RADIUS endpoints. You can also configure server timeout in seconds and maximum number of retries to connect to the RADIUS server (up to 10).
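The settings listed above map onto a fairly small configuration payload. Here is a sketch of assembling and sanity-checking it; the field names mirror those used by the AWS Directory Service RADIUS integration, but treat the exact names, defaults, and validation rules as illustrative assumptions rather than the authoritative API contract.

```python
# Hypothetical helper that assembles the RADIUS settings described above.
# Field names loosely follow the AWS Directory Service EnableRadius API,
# but this is a sketch, not a drop-in replacement for the console.

def build_radius_settings(server_ips, shared_secret,
                          port=1812, protocol="PAP",
                          timeout_seconds=4, max_retries=4):
    """Assemble and sanity-check the RADIUS configuration dictionary."""
    if not 1 <= max_retries <= 10:   # the console caps retries at 10
        raise ValueError("max_retries must be between 1 and 10")
    if protocol not in ("PAP", "CHAP", "MS-CHAPv1", "MS-CHAPv2"):
        raise ValueError("unsupported protocol: " + protocol)
    return {
        "RadiusServers": list(server_ips),  # server or load-balancer IPs
        "RadiusPort": port,
        "SharedSecret": shared_secret,
        "AuthenticationProtocol": protocol,
        "RadiusTimeout": timeout_seconds,
        "RadiusRetries": max_retries,
    }

settings = build_radius_settings(["10.0.1.5", "10.0.2.5"], "s3cret")
# A boto3 caller would then pass a payload like this to
# ds.enable_radius(DirectoryId=..., RadiusSettings=settings).
```

Listing two server IPs, as here, is how you would wire up the redundant multi-server setup the text mentions when you are not fronting the servers with a load balancer.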

To log in, users provide their AD user name and password and then enter a one-time passcode generated by a hardware or software token, giving the protection of multifactor authentication (MFA).

When using MFA, you can use either hardware or software tokens. Google Authenticator is a popular software-based solution. If your RADIUS server is running on Linux, you can use a Pluggable Authentication Module (PAM) library to enable the use of Google Authenticator. MFA works for users who access WorkSpaces through client devices running Windows, Mac OS X, Chrome OS, iOS, Android or Kindle OS.
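Under the hood, Google Authenticator-style one-time passcodes are time-based OTPs as specified in RFC 6238: an HMAC-SHA1 over the current 30-second time-step counter, dynamically truncated to a short decimal code. A minimal standard-library sketch shows the whole mechanism:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, then
    dynamic truncation down to a short decimal code."""
    counter = unix_time // step                       # which 30s window
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                           # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: 20-byte ASCII secret, T=59 -> 94287082
print(totp(b"12345678901234567890", 59, digits=8))  # 94287082
```

Because both the token and the RADIUS server derive the code from a shared secret plus the clock, no network round-trip to the token is ever needed, which is what makes the scheme practical for offline phone apps.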

Controlling user access

You can limit the access that your users have to applications and other resources from their WorkSpaces.

It’s easy to keep a user from accessing his/her WorkSpace if the person leaves the company or for some other reason needs to be blocked permanently or temporarily; you simply disable the account in whichever directory is storing the user identities (your Active Directory if you’ve integrated AD with WorkSpaces or the Amazon directory service if you haven’t).

Note that Amazon Identity and Access Management (IAM) users are not given access to WorkSpaces resources by default. You probably already know that IAM is a means for allowing and denying permissions to resources via policies that can be attached to individual users, groups, or the resources themselves.

You would have to make a policy in IAM that grants the specific users permission to create and manage resources for WorkSpaces and EC2. Then you need to attach the policy to whichever users (or groups of users) you want to be able to access the WorkSpaces resources.

Amazon provides in their documentation a sample policy statement that can be used to grant permission to perform all WorkSpaces tasks for IAM users. You’ll find that sample policy, along with more information on specifying WorkSpaces resources in IAM policies, on the AWS web site.
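To make the shape of such a policy concrete, here is a hedged sketch of an IAM policy document of the kind described above. It is loosely modeled on Amazon’s published sample, but the specific action list here is illustrative; consult the AWS documentation for the authoritative statement before attaching anything to real users.

```python
import json

# Illustrative IAM policy document granting WorkSpaces management rights.
# Loosely modeled on Amazon's sample policy -- not a substitute for it.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowWorkSpacesManagement",
            "Effect": "Allow",
            "Action": [
                "workspaces:*",              # all WorkSpaces operations
                "ds:*",                      # Directory Service operations
                "ec2:CreateVpc",             # network plumbing WorkSpaces
                "ec2:CreateSubnet",          #   provisioning depends on
                "ec2:CreateInternetGateway",
            ],
            "Resource": "*",
        }
    ],
}

# The serialized JSON is what gets attached to a user or group in IAM.
print(json.dumps(policy, indent=2))
```

In practice you would attach this to a group rather than to individual users, so that access can be granted and revoked by changing group membership alone.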

You can also control and limit access to network resources (including resources that reside on the Internet) from WorkSpaces by using VPC security groups. You might remember that VPC security groups behave sort of like virtual firewalls, because they control the inbound and outbound traffic to AWS virtual private clouds.

WorkSpaces will create a security group that’s assigned to all of the WorkSpaces you have provisioned to users in your directory. You can also create additional security groups through the WorkSpaces console. If you want to allow Internet access from WorkSpaces, you need to assign a public IP address, and you need to set this up before you provision the WorkSpaces because it will only apply to those that are created after you enable this setting. If you already have WorkSpaces provisioned, it is possible to manually assign those WorkSpaces an Elastic IP address; the AWS documentation includes instructions on how to do that.

Software Security

As all IT professionals know, one of the most important aspects of computer and network security is the patching of vulnerabilities in the software as quickly as possible, before their existence becomes widely known and attackers seize the opportunity to exploit them. WorkSpaces desktops are running popular applications on a popular client operating system and so security updates are just as important for these virtual desktops as they are for any network client.

Amazon gives you, as the WorkSpaces admin, control over the installation of security patches on the users’ WorkSpaces. This can be done through the Windows Update service that’s built into all modern versions of Windows, and Windows Update is turned on by default on all new WorkSpaces. If you prefer, however, you can use a patch management solution of your own choice, both to update Windows and Microsoft applications and to update third party apps.

Another “must have” for best security is anti-virus/anti-malware and you can install your favorite AV/AM software on the users’ WorkSpaces just as you install them on Windows client computers on your premises. You get Trend Micro AV as part of the package if you purchase one of the WorkSpaces “Plus” bundles (Value Plus, Standard Plus or Performance Plus), along with the Microsoft Office applications.

AWS WorkSpaces finally gets VoIP integration

The signs that unified communications is moving to the cloud keep coming. Now it’s Amazon Web Services that’s getting in on the game – not so much as a provider, but by updating its existing virtual desktop infrastructure offering, WorkSpaces, to integrate more easily with VoIP and UC.

Now, organizations can add VoIP and UC capabilities to their cloud-based virtual desktops. According to an article on The Register, the new feature works by taking audio from any client – say, Skype for Business or WebEx – and running it through WorkSpaces.

This could make the lives of engineers and administrators a little easier while also making the VDI service far more useful. AWS launched WorkSpaces back in 2013, and there are examples of attempts to install softphones on it almost from the beginning (at least from 2014). But WorkSpaces wasn’t equipped to connect to audio devices, making it less than ideal for organizations that wanted not only the benefits of VDI but also VoIP or full-on UC suites.

Thanks And Regards

Vijay Jain


What is Windows Azure Pack?


Cloud computing is making big inroads into companies today. Smaller businesses are taking advantage of Microsoft cloud services like Windows Azure, Windows Intune and Office 365 to migrate their line-of-business applications and services to the cloud instead of hosting them on-premises. The reasons for doing this include greater scalability, improved agility, and cost savings.

Large enterprises tend to be more conservative with regards to new technologies, mainly because of the high costs involved in widespread rollout of new service models and integrating them with the organization’s existing datacentre infrastructure. Windows Azure Pack is designed to help large enterprises overcome these obstacles by providing a straightforward path for implementing hybrid solutions that embrace both the modern datacentre and cloud hosting providers.

What is Windows Azure Pack?

To understand what Windows Azure Pack is, you first need to be familiar with Windows Azure, Microsoft’s public cloud platform. To understand what Windows Azure is all about, here are some brief excerpts from my recent book Introducing Windows Azure for IT Professionals: Technical Overview from Microsoft Press:

As a cloud platform from Microsoft that provides a wide range of different services, Windows Azure lets you build, deploy, and manage solutions for almost any purpose you can imagine. In other words, Windows Azure is a world of unlimited possibilities. Whether you’re a large enterprise spanning several continents that needs to run server workloads, or a small business that wants a website that has a global presence, Windows Azure can provide a platform for building applications that can leverage the cloud to meet the needs of your business…

Let’s look at the definition that Microsoft uses for describing Windows Azure:

Windows Azure is an open and flexible cloud platform that enables you to quickly build, deploy, and manage applications across a global network of Microsoft-managed datacentres. You can build applications using any language, tool, or framework. And you can integrate your public cloud applications with your existing IT environment.

This definition tells us that Windows Azure is a cloud platform, which means you can use it for running your business applications, services, and workloads in the cloud. But it also includes some key words that tell us even more:

  • Open – Windows Azure provides a set of cloud services that allow you to build and deploy cloud-based applications using almost any programming language, framework, or tool.
  • Flexible – Windows Azure provides a wide range of cloud services that can let you do everything from hosting your company’s website to running big SQL databases in the cloud. It also includes different features that can help deliver high performance and low latency for cloud-based applications.
  • Microsoft-managed – Windows Azure services are currently hosted in several datacenters spread across the United States, Europe, and Asia. These datacenters are managed by Microsoft and provide expert global support on a 24x7x365 basis.
  • Compatible – Cloud applications running on Windows Azure can easily be integrated with on-premises IT environments that utilize the Microsoft Windows Server platform.

So what is WAP, really?

  • Windows Azure Pack’s main entry point is the portal, or ‘Control Panel’ – a better-known term in the hosting industry.
  • The Windows Azure Pack Portal brings together a plethora of services and products to provide a single place to provision and manage resources for an enterprise or a hosting provider.

Services that are available out of the box with WAP:

  • Virtual Machines (via System Center VMM integration)
  • Websites (distributed, multi-tenant, highly available web hosting service)
  • Database (via SQL Server and MySQL)
  • Automation (via System Center Runbooks)

What is interesting about the WAP Portal is that, like never before, it brings different Microsoft products together under one single umbrella, both for IT administrators and for consumers such as development teams.

While this is not what Windows Azure Pack is mainly advertised for, this makes life so much simpler for administrators and consumers alike.

Take, for example, the traditional way to get a database server and a web server: you send a request to the IT department, which then goes through the approval process, procurement, allocation, management, Windows updates, and so on. With Windows Azure Pack, all these types of assets can be provisioned, managed and monitored in one central place.

Windows Azure Pack Portal serves as the single place where the following Microsoft products come together:

  • System Center VMM, Hyper-V
  • IIS Web Server (yes, it does)
  • SQL Server
  • Azure Service Bus

In addition to these Microsoft products getting delivered via the portal, the WAP Portal is also highly extensible for 3rd-party services and products to integrate with. The out-of-the-box MySQL database-as-a-service is one example; the Cloud Cruiser billing platform integration is another. If you are a hosting service provider, you can differentiate quickly by integrating your services with WAP, or vice versa, with Custom Resource Provider extensions.

Windows Azure provides businesses with four basic categories of cloud-based services:

  • Compute services
  • Network services
  • Data services
  • App services

At the core of the Windows Azure platform is its ability to execute applications running in the cloud. Windows Azure currently provides four different models for doing this: Web Sites, Virtual Machines, Cloud Services, and Mobile Services. Together these four approaches comprise the compute services portion of the Windows Azure platform, and they can either be used separately or combined together to build more complex solutions that can meet specific business needs…


Windows Azure Web Sites is a scalable, secure, and flexible platform you can use for building web applications that run your business, extend the reach of your brand, and draw in new customers. It has an easy-to-use self-service portal with a gallery of the world’s most popular web solutions including DotNetNuke, CakePHP, DasBlog, WordPress, and many others. Or you can simply create a new website from scratch and then install a tool like WebMatrix—a free, lightweight web development tool that supports the latest web technologies such as ASP.NET, PHP, HTML5, CSS3, and Node. You can use WebMatrix to create websites and publish applications for Windows Azure. And if you use Microsoft Visual Studio as a development environment, you can download and install a Windows Azure SDK so you can build applications that can take advantage of the scalable cloud computing resources offered by Windows Azure…

Creating a new website with Windows Azure is so easy we have to show you how to do it. Begin by logging on to the Windows Azure Management Portal using your Microsoft Account username and password. Then select the Web Sites tab on the left and either click Create A Web Site or click the New button on the command bar at the bottom as shown here:

Figure 1: You can create a new website using Windows Azure



The command bar then expands, as shown in the next figure, and allows you to quickly create a new website with no additional configuration, a custom website with either a new or existing database, or a new web application based on an application framework, blog engine, template, or any other app available in the Windows Azure gallery.

Figure 2: The Quick Create option for a Web Site


Windows Azure Virtual Machines is a scalable, on-demand IaaS platform you can use to quickly provision and deploy server workloads into the cloud. Once deployed, you can then configure, manage, and monitor those virtual machines, load-balance traffic between them, and connect them to other Windows Azure Cloud Services running web roles and worker roles. You can copy virtual hard disks (VHDs) from your on-premises environment into Windows Azure to use as templates for creating new virtual machines. And you can copy VHDs out of Windows Azure and run them locally in your datacenter.

You can create new virtual machines from a standard image available in the Windows Azure gallery. Standard images are included for current versions of Windows Server and for different flavors of Linux. Standard images are also available for Microsoft SharePoint, Microsoft SQL Server, and Microsoft BizTalk Server pre-installed on Windows Server. Standard images are a great way of quickly provisioning new virtual machines, but you can also use images you created on-premises to deploy new virtual machines.

Creating a new virtual machine in Windows Azure is easy. Just open the Windows Azure Management Portal and select the Virtual Machines tab on the left and click the New button in the command bar at the bottom. The command bar expands and displays two options for creating virtual machines: Quick Create or From Gallery.

The Quick Create option lets you create a new virtual machine which you can configure later. As Figure 3 shows, all you need to specify for this option is the DNS name for your virtual machine, the image to use as a template for your virtual machine, the size of the virtual machine (number of cores), a user name and password for administrative access to the virtual machine, and the region or affinity group to which the virtual machine should be assigned:
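Of the Quick Create fields, the DNS name is the one most likely to be rejected, because it has to form a valid URL prefix. A small validator sketches the commonly enforced label rules (3–63 characters, lowercase letters, digits, and interior hyphens); the portal’s exact rules may differ, so treat these as an illustrative assumption.

```python
import re

# Hypothetical validator for the Quick Create DNS-name field. The exact
# rules are the portal's to enforce; the ones below (3-63 chars, starts
# with a lowercase letter, ends with a letter or digit, hyphens only in
# the middle) are a common DNS-label convention, assumed for illustration.
DNS_LABEL = re.compile(r"^[a-z][a-z0-9-]{1,61}[a-z0-9]$")

def valid_vm_dns_name(name: str) -> bool:
    return bool(DNS_LABEL.match(name))

print(valid_vm_dns_name("contoso-web01"))   # True
print(valid_vm_dns_name("-bad-name"))       # False: leading hyphen
```

Checking the name client-side like this is purely a convenience; the portal remains the authority on what it will accept.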

Figure 3: The Quick Create option for a Virtual Machine

The other option, called From Gallery, lets you create a virtual machine by specifying advanced options presented in a series of pages. The first page shown in Figure 4 allows you to choose an image to be used as a template when creating your virtual machine…



Figure 4: You can choose an image on which your new virtual machine will be based.


Windows Azure Pack vs. Windows Azure

Let’s review the definition that Microsoft uses for describing Windows Azure:

Windows Azure is an open and flexible cloud platform that enables you to quickly build, deploy, and manage applications across a global network of Microsoft-managed datacenters. You can build applications using any language, tool, or framework. And you can integrate your public cloud applications with your existing IT environment.

Now let’s examine how Microsoft describes Windows Azure Pack. First, here’s how they define Windows Azure Pack on their Server and Cloud Platform site:

The Windows Azure Pack is a collection of Windows Azure technologies available to Microsoft customers at no additional cost. Once installed in your datacenter, the Windows Azure Pack integrates with System Center and Windows Server to help provide a self-service portal for managing services such as websites, Virtual Machines, and Service Bus; a portal for administrators to manage resource clouds; scalable web hosting; and more.

Next, here’s how Microsoft defines Windows Azure Pack in the TechNet Library:

Windows Azure Pack for Windows Server is a collection of Windows Azure technologies, available to Microsoft customers at no additional cost for installation into your data center. It runs on top of Windows Server 2012 R2 and System Center 2012 R2 and, through the use of the Windows Azure technologies, enables you to offer a rich, self-service, multi-tenant cloud, consistent with the public Windows Azure experience.

Comparing these various definitions and reading the linked resources enables us to conclude the following about how Windows Azure Pack compares to Windows Azure:

  • Both platforms provide a set of cloud services that allow you to build and deploy cloud-based applications using almost any programming language, framework, or tool. But while Windows Azure provides a broad range of several dozen different cloud services, Windows Azure Pack provides only a subset of these services, primarily Web Sites, Virtual Machines and Service Bus.
  • Cloud applications running on either platform can easily be integrated with on-premises IT environments that utilize Windows Server to enable you to build hybrid solutions.
  • While Windows Azure is hosted in globally distributed datacenters managed by Microsoft, Windows Azure Pack is something you can deploy within your own datacenter.

Thanks And Regards

Vijay Jain


Creating a Cloud Readiness Assessment


It’s a lot easier to move your infrastructure into the cloud than to migrate everything back into a private data centre. The idea is to make sure you deploy the right workloads and have the correct deployment methodology throughout the entire process.

When cloud computing started getting popular, organizations began pushing more of their environments into a public or hybrid cloud model. Although this was absolutely a great move by many of these businesses, some began to feel the pains of putting the wrong application or database into a public cloud. User, data, and workload proximity are critical, as is deploying the right workload against the proper type of cloud model.

Before you migrate a workload into a colocation or public cloud provider space, there are some key infrastructure aspects to consider. One of the best ways to prep your entire organization for a potential cloud move is to utilize a cloud readiness assessment. Working with a cloud-ready partner can really help this process along. Here’s the challenge: every business and every data center is unique. However, the methodology around a readiness assessment can be standardized to some extent.

That said, here are some key points to consider in a Cloud Readiness Assessment Project:

  • Your business model and goals. It’s hard to narrow this down in just one article, but the first thing to understand will be your current business model and where your organization is headed. Are you planning aggressive expansion? Are you planning on taking on additional users or branches? Are you deploying a new type of application or product? Are there core reasons to move an application, data set or entire platform into the cloud? Through research and working with cloud and industry professionals, you’ll be able to create a business model that will scale from your current platform into the cloud. Here’s why this is important: ROI. Through your use-case and business model analysis, you may very well find that moving to a cloud platform is not financially viable. Or you might require a different approach.
  • Your user base. In today’s ever-evolving technology world, the end-user has become even more critical. The always-on generation is now demanding their data anywhere, anytime, and on any device. How capable will your cloud platform be to deliver this rich content to your end-users? How well can you ensure an optimal user experience in the cloud? During your assessment, take the time to do a very good user survey. Find out how they compute, which devices they use, and what resources they are accessing. The last thing you want to do is build a cloud platform without direct end-user input.
  • Your existing physical infrastructure. Are you sitting on new gear or are you overdue for a hardware refresh? All of this is part of the cloud assessment process. Your ability to replicate into the cloud will be directly impacted by your current underlying physical environment. The reality is simple: if your gear is extremely outdated, you may need to fix some in-house issues before moving into the cloud. A workload running on a certain type of physical system now may behave very differently in the cloud later. If your environment is pretty much new, consider various cloud options. In some cases, organizations ship their own servers into a cloud provider’s data center. The need to upgrade or implement new hardware requirements can definitely add to the bottom line of any cloud migration project.
  • Your existing logical infrastructure. We operate in a virtualized world. Software-defined technologies, advanced levels of virtualization, cloud computing, and mobility are all influencing our data center and business models. With that in mind, a cloud readiness assessment must examine both the physical and logical aspects of your environment. Are you already virtualizing your applications? How old are those apps? Can pieces of your environment even run on a cloud platform? For replication purposes, do you need to upgrade your own virtual systems? Even beyond the physical aspect, working with the data side of your environment is going to be the most challenging. Applications, their dependencies, and the data associated with them are all important considerations during an assessment.
  • Selecting the optimal cloud option. The progression of cloud infrastructure offers an organization a number of options. Colocation, various cloud models, and even the hybrid approach are all viable for the modern business. The important piece is selecting the right option. To give you a realistic perspective, in some cases it makes sense to build out your own data center because your business model, user-base, and future business goals all require it. The point is that there are a number of options to work with.

The cloud can be a powerful tool. Already, many organizations are building their business process around the capabilities of their technology platform. As always, any push towards a new infrastructure will require planning, and a good use-case analysis. In the case of cloud computing, running a cloud readiness assessment can save quite a few headaches in the future. Basically, you’ll be able to better understand your current capabilities and what the optimal type of infrastructure would be. Ultimately, this helps align your IT capabilities directly with the goals of your organization.
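The ROI question raised above often reduces to a simple break-even calculation: how many months of operating savings does it take to recover the one-time cost of migrating? The sketch below uses entirely hypothetical figures, but it captures the kind of arithmetic a readiness assessment produces.

```python
# Toy ROI comparison of the kind a readiness assessment produces.
# All dollar figures are hypothetical placeholders, not market prices.

def months_to_break_even(migration_cost, onprem_monthly, cloud_monthly):
    """Months until cumulative savings repay the one-time migration cost.
    Returns None when cloud is never cheaper on a monthly basis."""
    monthly_saving = onprem_monthly - cloud_monthly
    if monthly_saving <= 0:
        return None   # cloud costs more each month: no break-even point
    return migration_cost / monthly_saving

# e.g. a $120k migration, $6k/month on-prem ops vs $4k/month in the cloud
print(months_to_break_even(120_000, 6_000, 4_000))  # 60.0 months
```

A five-year break-even, as in this example, is exactly the kind of result that might push an organization toward a hybrid approach, or toward keeping a workload in-house until its hardware depreciates.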



Private vs. Public vs. Hybrid Cloud: Which One to Choose?


Most enterprise IT departments now manage applications across multiple environments in a dizzyingly complex overall IT architecture. They also must constantly reevaluate their unique mix of on-premises, private cloud and public cloud infrastructure to meet new business goals and determine how applications can be migrated to the public cloud in a cost-effective way.

This is no small feat. Dozens or even hundreds of applications built at different times, in different languages, and by different teams need to be evaluated for migration to the cloud, which often requires deep knowledge of the existing IT infrastructure as well as the public cloud resources that could replace these functions.

Ultimately, enterprises must determine the hosting solution that suits each application: on-premises, private cloud, public cloud, or hybrid cloud. Below we outline some basic considerations and cloud comparisons, as well as best practices for how to integrate and manage these complex deployments.

Public Cloud

By now, most organizations understand the cost benefits of an IaaS provider like Amazon Web Services, including a low and predictable cost of ownership and a shift from a capital expenditure to an operating expenditure. This makes it possible to significantly reduce an organization’s upfront costs, its on-going costs of IT labour and potentially its tax liability.

The technical benefits are equally attractive: scalability, automated deployments, and greater reliability, to name a few. There are also very few technical limitations that would prevent an organization from moving their infrastructure to AWS; almost every function a traditional resource supports in the private cloud or in a datacenter could be replicated in AWS.

These application tiers are especially well suited to the public cloud:

  • Long-term storage, including tape storage, which has significantly more cost-effective solutions in AWS (Glacier and Storage Gateway’s Virtual Tape Library)
  • Data storage of any kind, especially if you are currently hosting physical media that fails often or needs to be replaced (S3 is an infinitely expandable, low-cost storage resource)
  • The web tier of an application that is bursty or highly seasonal (EC2, Auto Scaling, ELBs)
  • The web tier of an application that is mission-critical or latency-intolerant (Custom Auto Scaling groups and automated deployments with Puppet scripts)
  • Any new application for which demand is uncertain, especially microsites or other interactive properties for marketing and ad campaigns
  • Testing environments, because it is so much easier to spin instances up and down for load testing.
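The reason bursty web tiers fit the public cloud so well is target-tracking scaling: size the fleet so average utilization stays near a set point. The toy function below mimics that logic with hypothetical numbers; the real EC2 Auto Scaling service implements this (plus cooldowns, health checks, and more) for you.

```python
import math

# Toy version of the target-tracking idea behind Auto Scaling: resize the
# fleet so average CPU lands near a target. Purely illustrative -- the
# actual EC2 Auto Scaling service handles this (and much more) itself.

def desired_capacity(current_instances: int, avg_cpu: float,
                     target_cpu: float = 50.0,
                     min_size: int = 2, max_size: int = 20) -> int:
    """Scale the fleet proportionally to observed load, clamped to bounds."""
    raw = math.ceil(current_instances * avg_cpu / target_cpu)
    return max(min_size, min(max_size, raw))

print(desired_capacity(4, 90.0))  # burst: 90% CPU on 4 nodes -> 8 nodes
print(desired_capacity(4, 10.0))  # quiet: scale in to the minimum of 2
```

The clamping to a minimum and maximum is the part that makes this safe in practice: seasonal traffic can never scale the fleet to zero, and a runaway metric can never scale costs without bound.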

Enterprises must then decide whether they want to manage their public cloud infrastructure themselves or outsource it to a managed cloud services provider. A managed cloud services provider can maintain the entire cloud infrastructure (web servers, application servers, load balancing, custom failover scripts) and some may also be able to integrate with on-premises or private cloud solutions to provide a single monitoring interface.

Note that compliance requirements no longer necessitate a private cloud solution rather than a public cloud solution. AWS has been on the leading edge of compliance in the cloud for several years, and while there is lingering skepticism, the adoption of the AWS cloud by the largest and most complex healthcare and financial institutions is an indication of the degree to which AWS ensures compliance and security in the cloud.

Private Cloud

Although there are many advantages to the public cloud, enterprises very rarely deploy 100% of their applications into the public cloud. Logistically, it is often much simpler to move from your on-premises environment to a private cloud than from on-premises to public cloud.

Private cloud environments can be configured to support any application, just as your datacenter currently hosts it. Private cloud is an especially attractive option if certain features in legacy applications prevent some applications from operating well in the public cloud.

Here are some indicators that your application would be a good candidate for maintenance in a private cloud:

  • You are using Oracle RAC (shared storage) and require dedicated infrastructure for compliance. The shared storage equivalent in AWS, RDS, is not HIPAA-compliant.
  • You need high performance access to a file system, as in a media company that creates or produces large video files.
  • An application is poorly written and infrequently used, and therefore not worth the effort of migrating to the public cloud.
  • The application has very predictable usage patterns and low storage costs.
  • An application is unstable and heavily trafficked, but current IT staff is unfamiliar with the application. This may instead be a case for partial rewriting in the cloud.
  • The engineering team responsible for maintaining the application is not equipped for migrating the application in a cost-effective time frame. This may instead be a case for bringing on a managed cloud service provider.

A private cloud solution can be implemented in your on-premises datacenter with a virtualization layer such as VMware, though many mid-sized and large enterprises let a managed private cloud services provider maintain servers, storage, network, and application infrastructure.

On-Premises Servers

While cloud-based infrastructure has many advantages, some applications would see little to no cost benefit from migrating to the cloud. This is usually the case when you have invested significant capital in on-premises infrastructure, such as high-performance databases, that is specially configured to support that application.

Here are some situations where on-premises infrastructure might work best for your application:

  • The cost savings of cloud storage and compute resources do not outweigh the significant capital already invested in on-premises solutions
  • Your application already sees high performance and high availability from custom infrastructure
  • You produce large multimedia files that your in-house staff needs low-latency access to for editing purposes
  • An email platform that is high-volume, time-sensitive, and confidential. For example, some brokerage houses send very large volumes of email early each trading day.

Applications that meet these requirements are often not well-suited to the cloud. Often it would be wiser financially to maintain the infrastructure until its value has depreciated.

Hybrid Cloud

Ninety percent (90%) of enterprises say they are going to pursue a hybrid cloud solution this year. As explained above, enterprise architecture is often so complex that a hybrid cloud solution — where public, private or on-premises infrastructure supports a single application — is the best solution.

Hybrid architectures are especially attractive for large organizations that want to explore the flexibility and scalability of the public cloud. An audit will not always reveal how an application will perform in the public cloud, so enterprises choose to test a single tier in the public cloud while maintaining key infrastructure on their private cloud or dedicated infrastructure.

A hybrid system is also a good solution if there is institutional hesitancy about the security of the public cloud for sensitive data (whether this is justified or not). Frankly, it is often easier to convince internal executive or IT teams to experiment with cloud solutions rather than adopt them wholesale. Maintaining veteran IT staff and legacy applications on legacy infrastructure while opening new lines of business in the cloud is a cost-effective solution that also manages institutional risk.

Finally, an important thing to understand about hybrid environments is that they are only as strong as the integrations that unite them. Performance monitoring, regular testing, and data ingress and egress procedures will reveal future areas of difficulty as well as signal when and how to further evolve the application. The team orchestrating the infrastructure is almost always more important than the specific type of cloud solution you choose.



IoT and Cloud Computing


The Internet of Things (IoT) is about devices connected to the Internet that perform the processes and services supporting our basic needs, the economy, health, and the environment, while cloud computing acts as the front end through which the IoT is accessed. Cloud computing has become a popular service with many attractive characteristics and advantages. At its core, cloud computing means that users perform computing tasks using services delivered entirely over the Internet. Today the IoT holds real promise for everyday life. If a worker needs to finish a report for a manager but suddenly runs out of storage space on the computer, there is no problem as long as the computer is connected to the Internet: the worker can use a cloud service to finish the work, because the data is managed by the server. Another example: if your phone develops a problem and you need to format it, you can use a Google application such as Picasa to store your pictures on the Internet, then load them back through the application at any time.


Cloud computing and the Internet of Things are tightly coupled. The growth of the IoT and the rapid development of related technologies are creating a widespread network of connected "things," which in turn produces large amounts of data that must be stored, processed, and accessed. Cloud computing serves as a paradigm for big data storage and analytics. While the IoT is exciting on its own, the real innovation will come from combining it with cloud computing [6]. The combination of cloud computing and IoT can enable sensing services and powerful processing of sensing data streams. For example, cloud computing allows sensing data to be stored and then used intelligently for smart monitoring and actuation with smart devices. Two systems are used in the cloud: one to transform data into insight, and one to drive productive, cost-effective actions from those insights. Through this process the cloud effectively serves as the brain that improves decision-making and optimization for Internet-connected interactions. However, when IoT meets cloud, new challenges arise. There is an urgent need for novel network architectures that seamlessly integrate the two; the critical concerns during integration are QoS and QoE, as well as data security, privacy, and reliability [8]. Cloud computing provides the virtual infrastructure for utility computing, integrating applications, storage devices, monitoring devices, visualization platforms, analytics tools, and client delivery. It offers a utility-based model that enables businesses and users to access applications on demand, anytime and anywhere.
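The store-then-act loop described above (ingest sensing data, derive an insight, drive an action) can be sketched in a few lines. The sensor name, threshold, and the in-memory "cloud store" below are invented purely for illustration, not taken from any real IoT platform:

```python
from statistics import mean

class CloudStore:
    """Toy stand-in for a cloud backend: stores sensor readings,
    derives an insight, and drives an actuation decision."""

    def __init__(self, threshold):
        self.threshold = threshold   # hypothetical comfort limit (deg C)
        self.readings = {}           # sensor id -> list of values

    def ingest(self, sensor_id, value):
        # Step 1: sensing data is stored in the cloud.
        self.readings.setdefault(sensor_id, []).append(value)

    def insight(self, sensor_id):
        # Step 2: transform stored data into an insight (here, a mean).
        return mean(self.readings[sensor_id])

    def actuate(self, sensor_id):
        # Step 3: drive an action from the insight (smart actuation).
        return "cooling_on" if self.insight(sensor_id) > self.threshold else "idle"

store = CloudStore(threshold=25.0)
for temp in (24.0, 26.0, 28.0):      # simulated temperature stream
    store.ingest("room-1", temp)

print(store.insight("room-1"))       # 26.0
print(store.actuate("room-1"))       # cooling_on
```

In a real deployment the dictionary would be a cloud database and the actuation a message back to the device, but the three-step shape is the same.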


The first characteristic of IoT cloud computing is on-demand self-service, which means the service is there when you need it. Cloud computing resources are web-based services that users can access on their own, without help or permission from anyone else; all that is required is an Internet connection.

The second characteristic is broad network access, which means there are many connectivity options. Cloud computing resources can be accessed through any device with a network connection, such as tablets, mobile phones, and laptops. With the help of IoT, cloud services can be reached from many network-enabled devices, so users can work from whichever device they prefer. Without network access the cloud cannot function at all, which is why connectivity is so important today.

Furthermore, the third characteristic is resource pooling, which means resources are shared among everyone who can address them. Users who know a resource's address can access it anytime and anywhere, which makes it easy to reach what they need whenever they have time to do so. In an IoT context, an IP address could be assigned to every "thing" on the planet, so that each device can be addressed just like any other computing resource.

Moreover, the fourth characteristic is rapid elasticity, which means you get exactly what you need. Cloud computing gives you the freedom to fit the service to your requirements: you can quickly and easily change software features and add or remove users. This characteristic empowers IoT by providing elastic computing power, storage, and networking.
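Behind rapid elasticity there is usually a simple scaling rule: add capacity when utilization is high, release it when utilization is low. A minimal sketch of such a rule follows; the thresholds and instance limits are invented for illustration and are not from any particular cloud provider:

```python
def scale(instances, load_per_instance, high=0.7, low=0.3,
          min_instances=1, max_instances=10):
    """Simple threshold autoscaler.

    load_per_instance: average utilization of each instance (0.0-1.0).
    Scale out above `high`, scale in below `low`, stay put otherwise.
    """
    if load_per_instance > high:
        instances += 1               # elastic scale-out
    elif load_per_instance < low:
        instances -= 1               # elastic scale-in
    # Clamp to the allowed range.
    return max(min_instances, min(instances, max_instances))

print(scale(3, 0.9))   # 4 -> load is high, add capacity
print(scale(3, 0.1))   # 2 -> load is low, release capacity
print(scale(3, 0.5))   # 3 -> within the band, no change
```

Real autoscalers add cooldown periods and smoothing so that a brief spike does not cause the fleet to flap, but the core decision is this threshold check.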

Lastly, the fifth characteristic of IoT cloud computing is measured service, which means you pay for what you use. The cloud provider meters usage of each service: storage, processing, bandwidth, and active user accounts. The meter rises with consumption, so the more you use, the more you pay; this model is called Pay Per Use (PPU). In the IoT context, the ever-growing network of physical objects with IP addresses for Internet connectivity, and the communication between these objects and other Internet-enabled devices and systems, is paid for in the same way we pay for an Internet connection.
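The PPU model amounts to multiplying metered usage by a unit rate for each of the dimensions named above. The rates and usage figures below are made up for illustration, not real provider pricing:

```python
# Hypothetical unit rates for the metered dimensions named above.
RATES = {
    "storage_gb":    0.02,   # per GB-month stored
    "compute_hours": 0.10,   # per instance-hour of processing
    "bandwidth_gb":  0.05,   # per GB transferred
    "user_accounts": 1.00,   # per active account
}

def monthly_bill(usage):
    """Pay-per-use: the charge is the sum of usage * rate per service."""
    return round(sum(usage[k] * RATES[k] for k in usage), 2)

usage = {"storage_gb": 100, "compute_hours": 200,
         "bandwidth_gb": 50, "user_accounts": 5}
print(monthly_bill(usage))   # 100*0.02 + 200*0.10 + 50*0.05 + 5*1.00 = 29.5
```

The point is that the meter, not a flat fee, determines the bill: double the usage in any dimension and that line item doubles with it.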
Service Models

Service delivery in cloud computing comprises three service models: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). Software-as-a-Service (SaaS) provides applications to the cloud's end users, accessed mainly through a web portal or via web-service technologies based on service-oriented architectures [9]. These services can be seen as application service providers (ASPs) at the application layer. Usually a specific company runs, maintains, and supports the service so that it can be used over the long term [9]. The Platform-as-a-Service (PaaS) stack consists of an environment for developing and provisioning cloud applications. The main users of this layer are developers who want to build and run a cloud application for a particular purpose. The platform provides a proprietary language and a set of important basic services to ease communication, monitoring, billing, and other components, for example to ease startup or to ensure an application's scalability and flexibility. Possible disadvantages include limitations on the supported programming languages, the programming model, the ability to access resources, and persistency.

Infrastructure-as-a-Service (IaaS) provides the necessary hardware and software upon which a customer can build a customized computing environment. Computing, data storage, and communication resources are linked together with these essential IT resources to ensure that applications can be provisioned on cloud resources and new services can be built on the higher layers. These stack models are the medium through which the IoT is used and delivered to users in different ways, because the IoT ecosystem includes any form of technology that can connect to the Internet: connected cars, wearables, TVs, smartphones, fitness equipment, robots, ATMs, vending machines, and all of the vertical applications, security and professional services, analytics, and platforms that come with them. Hence these stack models play an important role in maintaining the relationship between the IoT and the cloud.
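One way to keep the three models straight is by who manages which layer of the stack. The split below is the conventional one, sketched as a lookup; the layer names are the usual shorthand, not tied to any particular provider:

```python
# Layers the *provider* manages under each service model;
# everything else in the stack is the customer's responsibility.
PROVIDER_MANAGED = {
    "IaaS": ["hardware", "virtualization"],
    "PaaS": ["hardware", "virtualization", "os", "runtime"],
    "SaaS": ["hardware", "virtualization", "os", "runtime", "application"],
}

STACK = ["hardware", "virtualization", "os", "runtime", "application"]

def customer_managed(model):
    """Return the layers the customer still manages under `model`."""
    return [layer for layer in STACK if layer not in PROVIDER_MANAGED[model]]

print(customer_managed("IaaS"))  # ['os', 'runtime', 'application']
print(customer_managed("PaaS"))  # ['application']
print(customer_managed("SaaS"))  # []
```

Reading the output top to bottom shows why the models are often drawn as a stack: each one hands the provider one more layer, until SaaS leaves the customer managing nothing but their own data and usage.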
Deployment Models

Cloud computing consists of four deployment models: private cloud, public cloud, community cloud, and hybrid cloud. A private cloud's infrastructure is provisioned for exclusive use by a single organization comprising multiple consumers, such as business units. It may be owned, managed, and operated by the organization, a third party, or some combination of them, and it may exist on or off premises. A public cloud is created for open use by the general public and sells services to anyone on the Internet (currently Amazon Web Services is the largest public cloud provider). This model suits businesses that need to manage load spikes or run applications consumed by many users, which would otherwise require a large investment in infrastructure. The public cloud also helps reduce capital expenditure and brings down operational IT costs. A community cloud is managed and used by a particular group of organizations with shared interests, such as specific security requirements or a common mission. A hybrid cloud is a combination of two or more distinct cloud infrastructures (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability. Normally, non-critical information is outsourced to the public cloud, while business-critical services and data are kept within the control of the organization.

In conclusion, the IoT will dramatically change the way we live our daily lives and what information is stored about us. Cloud services are available anytime and anywhere, as long as the device is connected to the Internet, through the service models of Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). The cloud is the technology best suited to filtering, analyzing, storing, and accessing this information in useful ways, and the deployment model is chosen based on the group, community, and purpose involved.