| | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Office 365 SKUs & Plans Inclusion | Office 365 Business Essentials, Office 365 Business Premium, Office 365 Enterprise E1, Office 365 Education E1, Office 365 Government E1 | Office 365 Enterprise E3, Office 365 Education E3, Office 365 Government E3, Office 365 Enterprise E4, Office 365 Education E4, Office 365 Government E4 | Office 365 Enterprise K1, Office 365 Government K1 |
Feature availability, by Exchange Online plan:

| Planning and Deployment | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Hybrid deployment supported | | | |
| IMAP migration supported | | | |
| Cutover migration supported | | | |
| Staged migration supported | | | |

| Permissions | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Role Assignment Policies | | | |

| Message Policy and Compliance | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Archiving of Exchange Online-based Mailboxes | | | |
| Cloud-Based Archiving of On-Premises Mailboxes | | | |
| Retention Tags and Retention Policies | | | |
| Encryption of data at rest (BitLocker) | | | |
| IRM using Azure RMS (requires add-on purchase; included in E3 & E4) | | | |
| IRM using Windows Server AD RMS | | | |
| Office 365 Message Encryption (depends on Azure RMS) | | | |
| In-Place Hold and Litigation Hold | | | |
| Data Loss Prevention | | | |

| Anti-Spam and Anti-Malware Protection | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Built-In Anti-Spam Protection | | | |
| Customize Anti-Spam Policies | | | |
| Built-In Anti-Malware Protection | | | |
| Customize Anti-Malware Policies | | | |
| Quarantine – administrator management | | | |
| Quarantine – end-user self-management | | | |

| Mail Flow | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Custom Routing of Outbound Mail | | | |
| Secure Messaging with a Trusted Partner | | | |
| Conditional Mail Routing | | | |
| Adding a Partner to an Inbound Safe List | | | |
| Hybrid Email Routing | | | |

| Recipients | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Offline Address Book | | | |
| Address Book Policies | | | |
| Hierarchical Address Book | | | |
| Address Lists and Global Address List | | | |
| External Contacts (global) | | | |
| Universal Contact Card | | | |
| Contact Linking with Social Networks | | | |
| Conference Room Management | | | |

| Reporting Features and Troubleshooting Tools | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Office 365 admin center reports | | | |
| Excel Reporting Workbook | | | |
| Web Services Reports | | | |
| Unified Messaging Reports | | | |

| Sharing and Collaboration | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Site Mailboxes (requires SharePoint Online) | | | |

| Clients and Mobile Devices | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Outlook Web App | | | |
| POP and IMAP | | | (no IMAP) |
| EWS Application support | | | |
| Outlook for Mac | | | |

| Voice Message Services | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Third-Party Voice Mail Interoperability | | | |
| Skype for Business Integration | | | |

| High Availability and Business Continuity | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Mailbox Replication at Data Centers | | | |
| Deleted Mailbox Recovery | | | |
| Deleted Item Recovery | | | |
| Single Item Recovery | | | |

| Interoperability, Connectivity, and Compatibility | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Skype for Business Presence in OWA | | | |
| EWS Connectivity Support | | | |
| SMTP Relay Support | | | |

| Exchange Online Administration and Management | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Microsoft Office 365 portal access | | | |
| Microsoft Office 365 admin center access | | | |
| Exchange admin center access | | | |
| Remote Windows PowerShell access | | | |
| ActiveSync Policies for Mobile Devices | | | |

| Customization, Add-ins, and Resources for Exchange Online | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Outlook Web App Web Parts | | | |
| Outlook Add-Ins and Outlook MAPI | | | |
Limits, by Exchange Online plan:

| Address Book Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Offline Address Book (OAB) | 250 | 250 | 250 |
| Address Book Policies (ABP) | 250 | 250 | 250 |
| Global Address Lists | 250 | 250 | 250 |

| Storage Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| User Mailboxes | 50 GB | 50 GB | 2 GB |
| Archive Mailboxes | Shared with Primary | No Limit | Not Available |
| Shared Mailboxes | 50 GB | 50 GB | Not Available |
| Resource Mailboxes | 50 GB | 50 GB | 50 GB |
| Public Folder Mailboxes (Max 50) | 50 GB | 50 GB | Not Available |

| Mailbox Folder Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Maximum number of messages per mailbox folder | 1 million | 1 million | 1 million |
| Maximum number of messages per folder in the Recoverable Items folder | 3 million | 3 million | 3 million |
| Maximum number of subfolders per mailbox folder | 1,000 | 1,000 | 1,000 |
| Maximum folder hierarchy depth | 300 | 300 | 300 |
| Maximum number of public folders | 100,000 | 100,000 | Not Available |
| Maximum number of subfolders per public folder | 1,000 | 1,000 | Not Available |
| Message Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Message size limit | 25 MB | 25 MB | 25 MB |
| Message size limit – migration | 150 MB | 150 MB | 150 MB |
| Subject length limit | 255 characters | 255 characters | 255 characters |
| File attachments limit | 250 attachments | 250 attachments | 250 attachments |
| File attachment size limit | 25 MB | 25 MB | 25 MB |
| Multipart message limit | 250 parts | 250 parts | 250 parts |
| Embedded message depth limit | 30 embedded messages | 30 embedded messages | 30 embedded messages |

| Recipient and Sender Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Recipient rate limit | 10,000 recipients per day | 10,000 recipients per day | 10,000 recipients per day |
| Recipient limit | 500 recipients | 500 recipients | 500 recipients |

| Distribution Group Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Maximum number of distribution group members | 100,000 members | 100,000 members | 100,000 members |
| Limit for sending messages to a large distribution group | 5,000 or more members | 5,000 or more members | 5,000 or more members |
| Maximum message size for large distribution groups | 2 MB | 2 MB | 2 MB |

| Transport and Inbox Rule Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Maximum number of transport rules | 300 rules | 300 rules | 300 rules |
| Maximum size of an individual transport rule | 8 KB | 8 KB | 8 KB |
| Character limit (in KB) for all regular expressions used in all transport rules | 20 KB | 20 KB | 20 KB |
| Maximum number of recipients added to a message by all transport rules | 100 recipients | 100 recipients | 100 recipients |
| Forwardee limit | 10 recipients | 10 recipients | 10 recipients |
| Number of times a message is redirected | 1 redirection | 1 redirection | 1 redirection |

| Moderation Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Maximum size of the arbitration mailbox | 10 GB | 10 GB | 10 GB |
| Maximum number of moderators | 10 moderators | 10 moderators | 10 moderators |
| Expiration for messages waiting for moderation | 2 days | 2 days | 2 days |
| Maximum rate for expired moderation notification messages | 300 expiration notifications per hour | 300 expiration notifications per hour | 300 expiration notifications per hour |

| Exchange ActiveSync Limits | Exchange Online Plan 1 | Exchange Online Plan 2 | Exchange Online Kiosk |
|---|---|---|---|
| Exchange ActiveSync device limit | 100 | 100 | 100 |
| Exchange ActiveSync device deletion limit | 20 | 20 | 20 |
| Exchange ActiveSync file attachment limit | 25 MB | 25 MB | 25 MB |
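Several of the message and recipient limits tabulated above can be checked client-side before a message is submitted. A minimal sketch using the limit values from the tables; the function and field names are our own, not part of any Exchange API:

```python
# Illustrative pre-send check against the Exchange Online limits tabulated above.
# The limit values come from the tables; the function and field names are our own.

EXCHANGE_ONLINE_LIMITS = {
    "message_size_mb": 25,   # message size limit
    "subject_chars": 255,    # subject length limit
    "attachments": 250,      # file attachments limit
    "recipients": 500,       # recipients per message
}

def violations(message):
    """Return the names of the limits this message would exceed."""
    failed = []
    if message["size_mb"] > EXCHANGE_ONLINE_LIMITS["message_size_mb"]:
        failed.append("message_size_mb")
    if len(message["subject"]) > EXCHANGE_ONLINE_LIMITS["subject_chars"]:
        failed.append("subject_chars")
    if message["attachment_count"] > EXCHANGE_ONLINE_LIMITS["attachments"]:
        failed.append("attachments")
    if len(message["recipients"]) > EXCHANGE_ONLINE_LIMITS["recipients"]:
        failed.append("recipients")
    return failed

msg = {
    "size_mb": 30,  # over the 25 MB cap
    "subject": "Quarterly report",
    "attachment_count": 3,
    "recipients": ["a@contoso.com", "b@contoso.com"],
}
print(violations(msg))  # a 30 MB message trips only the size limit
```

Note that the service enforces these limits on its own side; a check like this only gives users earlier, friendlier feedback.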
During his keynote address at vForum 2015 in Mumbai, Sanjay Poonen, Executive Vice President and General Manager, End-User Computing (EUC), VMware, drew the audience's attention to a photo projected on the giant screen and remarked that much has changed in the data center and security segments since the man in the picture spoke out on security and privacy. Poonen was referring to 32-year-old Edward Joseph Snowden, whose disclosures about global surveillance programs run by various governments sparked intense debate on data security and privacy and ultimately catalysed the concept of data localisation, indirectly driving the growth of the data center industry across the globe. The story for India, which has always been at the forefront of technology adoption, is no different.
The need for greater storage capacity, the continuous rise in data usage, the government's focus on digitisation and the strategy of tech giants to diversify by establishing local data centers are propelling demand for data centers across the globe. In India, the segment is further buoyed by positive sentiment around government programmes such as Digital India, Make in India and Smart Cities, and by a strong resurgence of growth-related projects across verticals such as manufacturing, e-commerce & retail, IT/ITeS, BFSI (primarily non-critical workloads) and emerging verticals like education, hospitality, healthcare and communications & media.
According to research carried out by Gartner, the Indian data center infrastructure and solutions market will grow 5.2 percent year-on-year to total $2 billion in 2016. The research firm also believes this segment will be the fastest growing market, with spending forecast to increase 5.9 percent in 2016.
Even at the global level, the macro trends driving the growth of data, and in turn the data center industry, remain more or less the same: the proliferation of tablets and smartphones, coupled with the content required to satisfy seemingly insatiable end users. Billions of dollars are spent on data center infrastructure to meet the growing demands of businesses and their customers. Having the right data center infrastructure in place has become a new "arms race" for companies trying to differentiate themselves and meet growing demand in this crowded, technology-driven market. Says Sanjay Poonen, "It is the most exciting time to be in technology. Cloud, virtualisation, analytics and security are coming together in such a way that it is the right time for companies to be ready for any trend."
The growth drivers
In India, around 2005, the trend shifted from paper-based to digital information management, and data centers became common and essential to the functioning of business systems for enterprises and governments. Apart from private data centers, the Government of India also set up many data centers in the states under the leadership of the National Informatics Centre (NIC). Since then, the segment has moved so far that cloud, mobility and virtualisation are no longer emerging trends that enterprises merely aspire to; they have gradually become mainstream.
“With technological advancement and most businesses expanding beyond geographical boundaries in the last couple of years, the generation and consumption of data has increased manifold, thereby creating a strong need for storage capacity. Additionally, the growth in software adoption with the proliferation of the cloud has made the concept of software-defined another business reality, with businesses adopting this model in the quest for efficiency, scalability and security,” says Srikanth Karnakota, who heads the server and cloud business for Microsoft India as Director. Karnakota also believes that the advent of digitisation across sectors and convergence of the “Digital Enterprise” has further fueled the demand for data centers.
Agreeing with Karnakota, Ankesh Kumar, who leads product management & marketing at Emerson Network Power India as director, says, "Cloud and virtualisation in the data center is no longer a myth with quite a few prominent players like AWS, Netmagic and Microsoft announcing extensive plans to invest in state-of-the-art data center facilities in the country."
Many experts we spoke with were unanimous in the view that several factors are giving steady pace to data center business growth in the country: companies consolidating their collection of server rooms and data centers into centralised regional sites to cut costs; virtualisation and compression technology enabling companies to deploy more, and more data-heavy, applications; and the exponential growth of cloud-based solutions, with cloud vendors needing more data center infrastructure to support their offerings.
In India, the major demand is coming from the SMB and SME sector, which is mostly adopting hybrid cloud, a combination of on-premises and cloud offerings. Also, to reduce cost and risk, data center facilities are being designed and constructed from integrated, prefabricated modules. This approach enables organisations to develop fully customised, high-performance data centers in far less time than traditional processes take. Overall, the focus has shifted towards ensuring modularity with a razor-sharp focus on maintaining energy efficiency.
Following Edward Snowden's revelations, many countries, including India, are mulling legislation on data sovereignty, popularly known as data localisation, and multinational companies are therefore setting up local data centers in advance. Recently, we have seen announcements from Microsoft on this front.
Analysts from International Data Corporation (IDC) expect the data center business to continue to increase for some time before these organisations also start looking at increased regulation and compliance issues. “Government approach in this regard would also matter with respect to the tenacity of regulations and verticals, going forward,” says Gaurav Sharma, research manager – enterprise & IPDS at IDC India.
Ankesh Kumar of Emerson Network Power also believes that capital expenditure, operating expenditure, along with regulatory compliance and security guidelines impact the decision of enterprises in setting up data centers.
Who is leading the growth
Many tech giants, such as Microsoft, Amazon, IBM, NTT Communications and Tata Communications, as well as the government's own National Informatics Centre, are now racing to set up data centers.
Microsoft has commissioned three “hyper-scale” data centers in India. The firm has already started a private preview of these data centers with over 100 existing customers on board across different segments such as BFSI, government, manufacturing and start-ups. Microsoft will be spending Rs 1,400 crore on setting up these data centers. The company has already set up the cloud data centres in Mumbai, Pune and Chennai. Microsoft India’s cloud business is growing at over 105% annually and with the establishment of the local data centers, it is expecting a further acceleration in its business.
Amazon will establish multiple data centers in India in 2016 with an investment of millions of dollars. Amazon has 12,000 active AWS customers in India, across enterprises, small to medium size businesses, and startups. AWS pioneered cloud infrastructure services and is the world's largest provider in the space, with revenue expected to reach $6.2 billion in 2015, out of Amazon's overall revenue of over $90 billion (most revenue still comes from the e-commerce business, but AWS is growing at 40-50% annually). Research firm Synergy estimates that AWS's revenue from cloud infrastructure services in the first quarter of 2015 was larger than the combined revenue of its four main competitors: IBM, Microsoft, Google and Salesforce. Its Indian clients currently mostly use its Singapore data center.
IBM launched a 30,000 sq. ft. cloud center offering cloud services in Airoli, Mumbai late last year, and a second data center is expected to be ready later this year.
NTT Communications plans to invest $100 million in developing a new data center in Mumbai through its subsidiary Netmagic Solutions. NTT acquired Netmagic, a managed services provider, in 2012. The new data center will be spread across 300,000 square feet and host up to 3,000 server racks with 20-28 MW energy capacity. The company is likely to announce global cloud services to facilitate multinational companies.
Tata Communications will invest more than $200 million (Rs 1,200 crore) towards doubling its data centre capacity in India to 10 lakh square feet over three years. The company owns an undersea cable network and provides wholesale communication and data centre services to corporates. The Tata Group company leads the data centre market with a 31% share in India, and counts a global social media company among its customers.
Google’s investments in data centers in Asia are outside India. New investments will take Google’s data center investments in Singapore to $500 million and in Taiwan to $600 million.
US-based Pi Datacentres plans to invest Rs 600 crore to set up a facility in India and expects to start operations from March 2016. The data centre would be a purpose-built greenfield facility with a constructed area of more than 5 lakh square feet.
Nashik-headquartered ESDS Software Solution is investing about Rs 335 crore to set up three data centres in India. The company will invest Rs 200 crore for the Navi Mumbai centre and Rs 100 crore for Bengaluru, while that for Nashik would be Rs 35 crore. With this investment, the company will create a 2 lakh sq ft data centre in Navi Mumbai, a 1 lakh sq ft data center in Bengaluru and a 50,000 sq ft data centre in Nashik.
On the other hand, companies such as Oracle are also considering setting up a data center in the country. Recently, in an interview to Express Computer, Prashant Ketkar, Vice President – Oracle Cloud, Oracle, said, "As we focus more and more on India, we will have to think about a data center presence in the country. We are in the process of figuring out how we can get more involved and how can we facilitate that. When is the right time to make the investment and how can we overcome some infrastructural challenges like shortage of electricity grids etc. We do see ourselves setting up a data center in the future."
Key obstacles and hurdles
Adopting efficient technologies and architectures for proper space utilisation, designing for minimum redundancy, constant updates and skill building, maintenance and tighter SLAs, optimisation and cost efficiency with converged infrastructure, flash, SDI and the like, securing bigger budgets, and energy efficiency are continuous challenges that every enterprise organisation needs to work on. The most challenging task, however, has been security. In the recent past it has proven to be both boon and bane: security concerns push companies to open data centers across different locations, while security fears keep other companies from moving to the cloud or adopting virtualisation.
“This is true to a certain extent. Giving physical control over to a third party is sometimes challenging – especially in the context of security. The other way to look at it is that, now one has access to the best in class security frameworks as the capital expenditure is shifted to cloud operators. However, a combination of private and public cloud federation can alleviate most of the issues,” says Sajan Paul, director – systems engineering, India & SAARC, Juniper Networks.
Adding to what Sajan opined, Rajesh Shetty, Vice President – Sales, South, Cisco India, says, "While considering the overall framework, security is a very important requirement as multiple mobile devices connect to the network, specifically with regard to the mechanism for these devices to connect wirelessly to the network."
What lies ahead
With the proliferation of e-commerce, the focus on digitisation and India being a nerve center of major IT activity, demand for storage will continue to rise, resulting in phenomenal growth of data centers in the country.
In India, Mumbai, Chennai, New Delhi, Bangalore and Kolkata are the most favourable sites for setting up data centers. Mumbai's position on the west coast is ideal: it is well connected, with multiple submarine cables landing in the region, running through the Middle East to Europe one way and through South East Asia the other. Cochin, further down the coast, is also connected to multiple submarine cables, which makes it another attractive destination.
“In the future, large enterprises will continue to invest in infrastructure replacement and growth related projects covering enterprise mobility, cloud and big data solutions. Also, they will be focusing on building intelligent data centers that focus on optimising existing hardware assets by using additional software capabilities. This will drive increased attention on newer trends such as public cloud and integrated systems,” opines Paul of Juniper Networks.
While data security and regulatory concerns pose a big challenge for many enterprises in the global data center industry, they also create a massive opportunity for firms that can help customers manage compliance risk and shape it through their data center strategy. Like global CIOs and CTOs, many Indian tech heads are taking a hybrid data center approach: renting data center space, turning some applications over to a Software as a Service (SaaS) vendor or disaster recovery (DR) to an Infrastructure as a Service (IaaS) provider, and continuing to update and virtualise their own data center infrastructure while developing an internal cloud. As IDC observes, the future continues to lie in the hybrid data center.
In the future, growth will be driven primarily by data center hosting players, high-speed Internet bandwidth service providers, hardware vendors, power and cooling solution providers, and system integrators. As the market expands, new players beyond IBM, Microsoft, AWS, Netmagic and Tata Communications may also move into the data center investment market. And as the market continues to mature, we are likely to see greater creativity and complexity in the investment structures and options available.
It's very clear that cloud is now impacting organizations of all sizes and across all verticals. Cisco recently reported that by 2019, more than 86 percent of workloads will be processed by cloud data centers. Furthermore, global spending on IaaS reached more than US$16.5 billion in 2015, an increase of 32.8 percent from 2014, according to Gartner's latest forecast. Finally, a recent Gartner report finds that the use of cloud computing is growing and that by 2016 it will account for the bulk of new IT spend.
A critical point to remember here is that cloud computing isn't just one overarching umbrella term. Rather, today's cloud ecosystem is a collection of services, infrastructure and resources delivered for various use-cases. With that in mind, let's look at some modern cloud services and how they impact your business.
- Network-as-a-Service. As more users connect to the cloud, data centers will need to figure out a better way to deliver high-quality, low-latency network services. Already, we're seeing NaaS become a key category of cloud computing, where specific delivery models define how users consume these services. For example, Bandwidth-on-Demand (BoD) can be considered a NaaS service model in which bandwidth dynamically adapts to the live requirements of traffic. Furthermore, this can be configured based on the number of connections, the nodes connected to the data center, and where traffic priority policies integrate. As more users connect to the data center for things like streaming, data sharing, and consuming compute cycles, delivering high-quality network services will be an absolute necessity.
NaaS isn’t a new concept, but its deployment has been hindered by some of the same concerns that have affected other cloud computing services — especially questions about the provider’s ability to guarantee high availability (HA). Other concerns include dealing with service level agreements (SLAs), compliance issues related to data sovereignty and the possibility of vendor lock-in.
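The Bandwidth-on-Demand model described above can be pictured as a priority-ordered allocator that adapts grants to live demand. A minimal sketch; the tenant names, priorities and capacity figures are illustrative assumptions, not from any real BoD product:

```python
# Hypothetical sketch of Bandwidth-on-Demand: allocation adapts to live
# demand, with higher-priority tenants (lower priority number) served first.

def allocate_bandwidth(capacity_mbps, demands):
    """demands: list of (tenant, requested_mbps, priority).
    Returns {tenant: granted_mbps}, handing out capacity by priority."""
    grants = {}
    remaining = capacity_mbps
    for tenant, requested, _priority in sorted(demands, key=lambda d: d[2]):
        granted = min(requested, remaining)  # never exceed what is left
        grants[tenant] = granted
        remaining -= granted
    return grants

# Illustrative tenants competing for a 1 Gbps link at one moment in time.
demands = [("video-stream", 600, 1), ("backup", 500, 2), ("bulk-sync", 300, 3)]
print(allocate_bandwidth(1000, demands))
# video-stream gets its full 600, backup the remaining 400, bulk-sync nothing
```

Re-running the allocator as demand changes is what makes the bandwidth "on demand"; a real NaaS controller would also enforce the SLA and policy concerns discussed above.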
- Network as a Service (NaaS) is a cloud model for delivering network services virtually, through either a subscription or a 'pay as you use' service model. With NaaS, all that is required of the customer is a computer with an internet connection to the NaaS portal made available by a NaaS/cloud provider. NaaS simplifies network architecture through virtualization.
- Cloud providers are able to control the network throughout, from provisioning through operation to decommissioning, and to link the software to the business support systems for necessary processes such as billing.
Various modes of cloud networking:
- Network-as-a-Service (NaaS): a managed networking service solution delivered on demand and as a service. This form of networking may or may not be cloud based, depending on your business requirements.
- Cloud-based networking: a virtualised network based in the cloud. Infrastructure and services all take place in the cloud, including management of the network as well as policy and data forwarding or switching.
- Cloud-enabled networking: with this type of networking, the management of the network and its policies is done in the cloud; however, data is not stored in the cloud and all actions are performed locally via an application or client.
Benefits and concerns of cloud networking
Benefits of NaaS
- Operational benefits from centralised policy-based flow control
- Optimal flexibility in capacity control
- On-demand network resource usage and/or procurement
- Optimal network activity and/or bandwidth utilisation with minimal downtime
- Ability to deliver new network capabilities and services without the requirement of configuring individual devices
- Fast deployment and no time spent on installing and configuring networking equipment
- Simplified management and maintenance, as the cloud provider maintains the network
- Analytics, detailed reports and insight into how services are functioning are easily obtained
- Improved network efficiency
- Improved accessibility and mobility: the network can be accessed from any internet-capable mobile device, anywhere, at any time
- Option for disaster recovery
- Instantly increased scalability; the ability to rapidly add capacity is hugely beneficial
- Cost savings through reduced or even no capital investment and no future purchasing of hardware upgrades or software
Concerns holding back cloud networking
- Availability guarantees: concerns about service outage or degradation. Companies need resiliency details, such as redundancy and backup procedures, in place for maximum availability of data.
- Security: providers need to assure customers that they alone have access to their data and that only they can make changes to it. Security certifications obtained by the provider may help alleviate this concern.
- Compliance: many companies face compliance regulations that they are required to abide by to function within the law. Cloud providers should be transparent with companies about the encryption methods used, reporting capabilities and data location.
- Privacy: maintaining the privacy of company data is also a valid concern. Customers need assurance that their data is not monitored by the cloud provider or outsiders. Authentication techniques and encryption methods can lessen this concern.
- Lack of industry standards
- Vendor lock-in
- Decreased control of the infrastructure
- Concern of losing data
- Concerns of integration into an already mature environment
Looking at the benefits and concerns surrounding networking in the cloud, a few broad benefits stand out, specifically in the areas of independence (segregated networks are possible), resilience (critical applications can be treated with special care), bursting (extra network capacity can be obtained on demand at periods of peak usage) and analytics (reporting on performance is simpler).
While these benefits hold great promise, it's essential not to overlook possible operational implications when moving from the present network infrastructure to a cloud-based one. Moving to a cloud networking landscape implies a noteworthy transfer of ownership, as with many of the other cloud models now commonly in use. It's important that this is managed with great care and that all necessary agreements and training are in place to ensure a good comfort level for all involved.
Initial thoughts and steps towards the transformation to a cloud networking environment
When contemplating the concerns around a cloud network, many of them can be alleviated by thorough investigation leading to a reputable cloud vendor that you have confidence in. A vendor with a proven track record and reputation in the market is always a good starting point.
Some areas to consider when researching and determining a suitable vendor:
- Cost or billing options: it's always good to get these upfront to avoid unnecessary tension and conflict further down the line
- Architecture and operating system support levels
- Ability to orchestrate across varied consumption models
- Data protection, compliance and security offered. The security you require is highly dependent on the type of data you are responsible for. Be sure to make this a priority.
- Services or features included in the services offered
- Vendor performance and availability SLAs
- Contracts that provide agility and flexibility that can adapt over time if required
- Time to configure
- Vendor track record and reputation
- A vendor that is innovative and knowledgeable with a passion for the area of business
- Vendor location
Although cloud networking per se may not be commonplace as yet, especially for wide area networks and networks that span entire regions, there is no reason why you can't take the necessary steps to prepare for the transformation. Some forward thinking now may ease the transition later, as networks are likely to be next to join the cloud landscape.
Apart from NaaS, below are a few more as-a-service models:
- Data-as-a-Service. With more users comes a lot more data. In this service model, data is delivered on demand in a manner that keeps the underlying information clean and agile. The idea is to offer data to various systems, different types of applications, and different user groups, available regardless of whether the user is inside the organization or outside it. Furthermore, policies can be wrapped around this data to further enhance QoS, integrity, and agility. Already, big cloud vendors are utilizing a variety of DaaS models to enhance the data delivery process. Providers like Microsoft Azure deliver and store data via three different methods – queues, tables, and blobs. The future of DaaS is bright: organizations will want to further control and optimize both structured and unstructured data sets, with applications ranging from optimized data delivery to big data analytics.
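One rough way to picture the queue/table/blob split mentioned above is as a routing decision made per data item. The classification rules and field names below are our own illustrative simplification, not Azure's actual API:

```python
# Illustrative router in the spirit of the queue/table/blob split described
# above. The thresholds and field names are assumptions for the sketch.

def storage_target(item):
    """Pick a delivery store for a data item: small transient messages go to
    a queue, structured keyed records to a table, and large unstructured
    payloads to blob storage."""
    if item.get("transient") and item["size_bytes"] <= 64 * 1024:
        return "queue"
    if item.get("schema"):  # carries a column schema => structured record
        return "table"
    return "blob"           # everything else: unstructured payloads

print(storage_target({"transient": True, "size_bytes": 512}))          # queue
print(storage_target({"schema": ["id", "name"], "size_bytes": 2048}))  # table
print(storage_target({"size_bytes": 10**9}))                           # blob
```

A real DaaS layer would attach the QoS and integrity policies discussed above to each of these paths rather than just choosing a store.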
- Backend-as-a-Service. This one is becoming very popular – very fast. The major influx of users coming in via mobile devices has created a boom in mobile application development. BaaS allows both web and mobile application platforms to link to backend cloud storage services. This helps provide optimized features around push notifications to a variety of devices, complete user management, and the ability to integrate with other social networking platforms. By utilizing SDKs and various APIs, BaaS is able to directly integrate various cloud services with both web and mobile applications. Already, there is a broad focus where open platforms aim to support every major platform, including iOS, Android, Windows, and Blackberry. Furthermore, BaaS platforms aim to further enhance the mobile computing experience by integrating with cloud-ready hosting vendors like Azure, Rackspace and EC2. Still curious? Take a look at what some BaaS providers have been doing. For example, DreamFactory provides a truly open-source software platform capable of integrating with any cloud or data center provider. Basically, it gives you the back-end and you create the front-end app.
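The SDK/API integration described above usually boils down to REST calls against the provider's backend. A minimal sketch of what a client SDK assembles for a push notification; the endpoint URL, app ID and payload shape are entirely hypothetical, not any real BaaS vendor's API:

```python
# Hedged sketch of a BaaS-style push-notification call. The endpoint and
# payload shape are hypothetical; the pattern (REST + JSON behind an SDK)
# is how BaaS platforms typically expose backend features to apps.
import json

def build_push_request(app_id, device_tokens, message):
    """Assemble the HTTP request a mobile client SDK would send to a
    (hypothetical) BaaS push endpoint."""
    return {
        "url": f"https://api.example-baas.com/v1/apps/{app_id}/push",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"to": device_tokens, "alert": message}),
    }

req = build_push_request("demo-app", ["token-1", "token-2"], "Build finished")
print(req["url"])
```

An actual SDK would add authentication headers and send this over HTTPS; the point is that the backend, not the app, owns user management and delivery.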
- Disaster Recovery-as-a-Service. A very popular cloud service is one that revolves around corporate and infrastructure resiliency. DR-as-a-Service comes in a number of flavors capable of supporting many different use cases. For example, if you have critical workloads that must stay up at all times, you can create a hot, mirrored site capable of load-balancing users and workloads should an emergency occur. Similarly, if you have apps or resources which aren’t as critical, you can create a warm or cold site which allows for fast failover at lower cost while still meeting business needs. The difference here is your tolerance for downtime. One major recommendation is to conduct a business impact analysis (BIA). This helps you understand which systems must stay up, which can sustain a little bit of downtime, and which are not critical, allowing you to design a DR strategy that best fits your business and users’ needs.
- Storage-as-a-Service. Maybe you’re trying to reduce your data center footprint or maybe you’re just trying to extend your data center ecosystem. Whatever the case, storage options are now great when it comes to cloud. Major vendors like AWS and Azure offer very specific storage services. Similarly, traditional data center providers and cloud hosting shops also offer various storage services. This is great to support applications living in the cloud, new users coming in to request services or resources, and it helps evolve business strategy. Storage can be tricky. When moving workloads into the cloud for storage purposes, make sure to understand requirements and performance metrics. User experience is critical when you begin to migrate workloads into a cloud storage ecosystem.
- Software-as-a-Service (Desktops and Apps). One of the founding types of cloud services, SaaS has really come a long way. In fact, I’m bundling desktop delivery as well as application delivery into this category. There has been an absolute resurgence behind cloud-based software and desktop delivery. Organizations are seeing direct benefits in working with cloud systems which can, for example, control and deliver entire VDI environments. Now, you have more resources, powerful optimization layers, and even user control methodologies all living in the cloud. This means that use cases which you never thought were possible in the cloud can now be delivered in an “as-a-Service” ecosystem. Mid-market and SMB organizations are seeing the direct benefit of delivering powerful desktops and applications directly via the cloud. This introduces greater competitive capabilities and helps create better data center economics.
- Infrastructure-as-a-Service (and Everything-as-a-Service). Data center resources have become a lot more powerful and are capable of supporting more diverse workloads. Cloud services now allow you to utilize specific physical resources, within your cloud provider’s environment, to deliver a variety of applications and use cases. Today, many organizations are evaluating their own data center and business strategies. Do you invest in more on-premises infrastructure or do you utilize the cloud? The reality is that many organizations are finding an even balance in using hybrid cloud services to create a very agile business. Data center and cloud providers are offering many different types of services to get you to host “everything” in their cloud ecosystem. Today, there are many options for the type of infrastructure you can deploy, where cloud can have an impact, and where you can align your business.
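The hot/warm/cold decision described in the DR-as-a-Service bullet above can be sketched in a few lines. This is an illustrative sketch only: the RTO thresholds and workload names are hypothetical, and in practice the business impact analysis would supply the real numbers.

```python
# Hypothetical sketch: mapping a workload's BIA downtime tolerance
# (recovery time objective, in hours) to a DR site tier.
# The thresholds below are illustrative, not prescriptive.

def dr_tier(rto_hours: float) -> str:
    """Pick a DR tier from a workload's recovery time objective."""
    if rto_hours < 1:        # must stay up at all times -> mirrored hot site
        return "hot"
    elif rto_hours <= 24:    # fast failover acceptable -> warm standby
        return "warm"
    else:                    # non-critical -> low-cost cold site
        return "cold"

# Hypothetical workloads and their RTOs from a BIA:
workloads = {"order-processing": 0.25, "intranet-wiki": 12, "archive-reports": 72}
plan = {name: dr_tier(rto) for name, rto in workloads.items()}
print(plan)  # {'order-processing': 'hot', 'intranet-wiki': 'warm', 'archive-reports': 'cold'}
```

The point of the sketch is simply that the BIA output, not the technology, should drive which tier each workload lands in.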
If you’re looking at cloud hosting options and are feeling overwhelmed, don’t be. Take a step back and realize that cloud is a powerful ecosystem capable of fitting in with your very specific needs. If you lack expertise, work with a partner to help you understand where cloud fits in with your business strategy, or contact me on my mobile at +91 9870291860.
It’s much easier to move into a cloud environment or utilize a cloud service than ever before. Organizations looking to stay agile and competitive in today’s market must absolutely look at ways cloud can be incorporated into the business.
Thanks and Regards
Cloud Solution Consultant and Microsoft Licensing Sales Specialist for West Region at Dimension Data (NTT Group)
According to the latest statistics, 78% of US small businesses have now fully adopted cloud computing.
Reasons Why You Need to Take Advantage of the Cloud Today
Boost Collaboration – The cloud allows you to access your files from anywhere. That means you can collaborate as long as you have an Internet-enabled device. It’s also beneficial if for some reason your usual computer or server is out of action.
More Engagement – The cloud is by far the best system of engagement for meeting the needs of customers. The ability to pull up files and presentations from anywhere provides more tools to work with.
Speed – The cloud enables companies to move forward faster with innovation. Kuma Games has worked with IBM to offer episodic video games via the cloud. They have made them faster, graphically superior, and higher performing than their competitors.
To achieve these three benefits doesn’t require any significant investment. The cloud has caught on so quickly with businesses of all sizes because of how affordable it is. CloudBerry, for example, has managed to scale its services and make them as flexible as possible so companies are only using exactly what they pay for.
The Cloud and Disaster Recovery
Nevertheless, when you go with the cloud you should integrate your disaster recovery services. This means that if something goes wrong you can continue to deliver services to customers. There are many reasons why you may want to combine your contingency plans with the cloud.
The benefits of combining the cloud with a disaster recovery plan include:
- Better protection. The cloud helps ensure your data is always recoverable thanks to strong encryption and the fact that your data is now replicated across multiple locations. Even a natural disaster can no longer wipe you out.
- Low cost of ownership. There’s little capital expense involved. All you have to do is pay the subscription for the service you happen to be using. There are no on-going maintenance costs to take into account.
- Ease of use. Cloud interfaces are incredibly easy to use because they require little technical experience.
Benefits of a Cloud-Based Model in Today’s Competitive Environment, Where Uptime and Scalability Are Challenges
- Reduced time to benefit
Unlike the traditional model, in SaaS the software (application) is already installed and configured. The user can provision an instance in the cloud and have the application ready for use in a couple of hours. This reduces the time spent on installation and configuration, and can reduce the issues that get in the way of software deployment.
- Lower costs
Compared with the traditional model, SaaS offers lower costs, since it usually resides in a shared or multi-tenant environment where the hardware and software license costs are low.
Another advantage is that the customer base can be increased, since it allows small and medium businesses (SMBs) to use software that they otherwise would not use due to the high cost of licensing.
Maintenance costs are reduced as well, since the SaaS provider owns the environment and those costs are split among all the customers that use the solution.
- Scalability and integration
Usually, SaaS solutions reside in cloud environments that are scalable and integrate with other SaaS offerings. Compared with the traditional model, users do not have to buy another server or software. They only need to enable a new SaaS offering, and in terms of server capacity planning, the SaaS provider owns that responsibility.
- New releases (upgrades)
SaaS providers upgrade the solution and it becomes available for their customers. Costs and effort associated with upgrades and new releases are lower than the traditional model that usually forces the user to buy an upgrade package and install it, or pay for specialized services to get the environment upgraded.
- Easy to use and perform proof of concepts
SaaS offerings are easy to use since they already come with best practices and samples built in. Users can do proofs of concept and test the software functionality or a new release feature in advance. They can also run more than one instance with different versions and do a smooth migration. Even for large environments, users can use SaaS offerings to test the software before buying it.
But Before Moving to a Cloud-Based Model, Every Company Should Ask Itself 12 Cloud Questions
#1: Are you using the right tools? 60 percent of UK IT managers surveyed in The Register‘s cloud survey said they were using VPN connections, but only 34 percent said they were using cloud firewalls or encrypting data at rest. “The numbers continued to drop in regards to other preventative measures until the bottom of the list, where only 15 percent said they were using obfuscation or tokenization of sensitive data.”
#2: What cloud technologies are being shared, and with whom? Cloud service providers often share infrastructure, platforms and applications to deliver their services in a scalable way.
“Whether it’s the underlying components that make up this infrastructure (e.g. CPU caches, GPUs, etc.) that were not designed to offer strong isolation properties for a multi-tenant architecture (IaaS), re-deployable platforms (PaaS), or multi-customer applications (SaaS), the threat of shared vulnerabilities exists in all delivery models,” writes the Cloud Security Alliance.
#3: How do you define and determine the best ways to deal with cloud abuse? The Cloud Security Alliance defines cloud abuse as “a bad guy using a cloud service to break an encryption key too difficult to crack on a standard computer. Another example might be a malicious hacker using cloud servers to launch a DDoS attack, propagate malware, or share pirated software.”
#4: Do you allow employees to use their own devices? The rise of bring-your-own-device (BYOD) and bring-your-own-application (BYOA) means that many cloud services and tools are sneaking into organizations under the noses of IT leaders. In a recent survey, more than half of the IT respondents said that when it came to cloud services, the biggest challenge was assessing the security risk before employee adoption.
#5: Are you ready for next-generation technology and the Internet of Things (IoT)? Gartner predicts that the IoT market will grow to 26 billion units by 2020. With the proliferation of connected devices, is it any surprise that IT managers are increasingly concerned about the security risk of those devices?
#6: How do you protect credentials from theft? In 2010, Amazon was subject to a cross-site scripting attack that used malicious scripts in a benign account to launch more attacks. Many companies now prohibit the sharing of accounts and require strong two-factor authentication techniques.
#7: When do you identify and stop malicious insiders? A 2015 Experian study claimed that employees, particularly those working remotely or using their own mobile device, accounted for more than half of security incidents last year. A current or former employee, contractor, or a business partner with access through IaaS, PaaS, SaaS or traditional infrastructure, can often be the source of an enterprise’s greatest risk.
#8: How do you handle the riskiest of apps, storage? Cloud-based storage applications have access to very sensitive corporate data, particularly financial data.
#9: Is your cloud service provider responsible for security? To fully secure data in the cloud, enterprise IT teams should never solely rely on their cloud provider. Ensure you have a solid security strategy in place that is agnostic to the location of your data and applications.
#10: How flexible and collaborative is your IT department in meeting the challenges associated with new technologies and quickly responding to security threats? The majority of IT managers are seeing a shift toward more collaboration and pooling of previously siloed resources, opening up opportunities for better cloud security measures.
#11: Are your cloud-based applications being monitored for inbound and outbound traffic anomalies? The difference between a minor incident and massive breach often comes down to the ability to quickly detect, contain and mitigate an attack. Analysts at the Ponemon Institute estimate it took retailers, on average, 197 days to identify an advanced threat and 39 days to contain it, while financial services organizations needed 98 days to identify and 26 to contain.
#12: What is your company policy when it comes to managing sensitive data and file sharing? On average, more than 25 percent of employees will upload files containing sensitive data to the cloud.
And When To Say No To The Cloud
With cloud adoption on the rise, should you jump on the bandwagon? Depending on your business needs, it may make more sense to wait.
Cloud adoption rates are increasing as more organizations turn to a truly distributed infrastructure model and use more WAN-based tools. As underlying hardware components become better and more bandwidth becomes available, cloud computing has become a legitimate consideration for a wide array of industry verticals. Everyone should be either adopting it or at least considering it, right? Not so fast.
In numerous conversations with different customers using various technologies, I hear a lot of discussion about cloud computing. However, these conversations are changing. Managers no longer ask what the cloud is; now, they want to know whether they really need it.
The reality is simple: Some businesses just don’t.
The term “cloud” really just means data distribution over the WAN. This can be a private, public, or hybrid model. Because of the massive presence of the Internet, most organizations are already using the cloud without even knowing it. They’re utilizing cloud computing components within their means and needs.
On the other hand, some organizations keep a completely localized environment and only use WAN-based technologies to share files, store backups, or host sites on the Internet. Really, all of these technologies were available before the cloud became popular. Because of this, administrators are asking, “Why do I need more when I already have so much?” Depending on their business needs, they may be quite right.
Too many organizations get caught up in the hype of the cloud conversation without really doing a true cost/benefit analysis. This can involve several business stakeholders, interviews with internal and external resources, and a clear vision for where the organization is going. Instead of jumping on the cloud bandwagon, organizations should take the time and understand cloud pros and cons and how those fit with their business strategies.
There are distinct advantages to moving to a cloud model, including improved disaster recovery, backup and storage, testing and development, easier management, and enabling data center consolidation.
At the same time, it’s important to remember that cloud computing also has these drawbacks:
- Certain knowledge levels are required. Remember, the cloud isn’t just one platform — it’s a lot of different technologies all working together to bring you data. Your organization will need to have virtualization, application, security, and cloud experts on hand to guide the whole process along.
- Management and monitoring can be a challenge. Improper resource allocation can make cloud computing a serious cost center for any organization.
- Security and data control, in some cases, may still be an issue for you. If you’re bound by some type of compliance requirements, putting data into the cloud can violate some rules. Also, the cloud can sometimes be a dangerous place. A recent Amazon Web Services console breach is evidence of just one instance where a DDoS attack impacted a major cloud provider.
- Reliability isn’t a given. There have been major outages, which have forced some businesses to rethink the cloud model. For example, outages at AWS have caused companies like Netflix to go down for extended periods of time.
In some cases, a cloud model is just not the right fit. Whether it’s cost prohibitive or it just doesn’t provide any additional benefits, cloud computing may not be the right choice for the time being.
However, I’m not saying to be complacent. Complacency in IT can ruin an organization or a career. Take the time to fully understand the cloud model and how it may — or may not — fit your business.
As with any technology, there will be benefits and challenges. In some cases, moving to the cloud may just not be conducive to the goals of the organization. It’s quite possible that a company has no intention of expanding or moving its infrastructure to the Internet. Or there may not be a need to offload workloads into the cloud. Also, there may be other good technologies to help deliver data and content to the end-user.
The bottom line is: The cloud model is powerful, and many organizations are adopting some part of it. But with any tool, piece of software, or technological advancement, there needs to be a fit.
Thanks and Regards
+91 9870 291860
Azure vs. AWS: Side-by-Side Feature & Services Comparison
While some of the features, services and options that you’ll find in Azure and AWS can’t be fully compared to one another, many come pretty close. Here’s our attempt at a side-by-side comparison between the two cloud platforms.
Thanks and Regards
Amazon WorkSpaces: Your Desktop in the AWS Cloud
The cloud-based virtualized desktop is, according to many cloud experts, the up and coming next step toward a complete takeover of all of our computing activities by the cloud. Virtualized desktops hosted in the cloud can take two different forms:
- VDI (Virtualized Desktop Infrastructure) that is provided over the Internet, or
- Desktop as a Service (DaaS), which is still a form of virtualized desktops but is a true multi-tenant cloud service
Persistent desktops: what problems do they solve?
Lack of consistency has long been a source of frustration for computer users, and it’s a pain to have to either spend time changing settings or adapt to a different (even if only slightly) interface when switching from one device to another. It’s not uncommon, either, to find that the document a user was working on with one device was saved to that device’s local hard drive and either isn’t available at all on the current device (and in a worst-case scenario, will have to be recreated), or the user must lose productivity time establishing a connection back to the home or work network where the document is stored in order to retrieve it and continue working on it.
Another common scenario is that the user has access to his or her documents (perhaps because they’re stored on a cloud service, perhaps because the user transferred a copy by putting it on a USB drive or sending it via email) – but then finds that the application that’s needed to work with them isn’t installed on the new device. Granted, this is less of an issue than it used to be, now that functional online versions of Microsoft Office programs or Google Docs can be used from any machine, and many mobile apps can be downloaded quickly and easily (and without paying for them again if the user already owns them) from a mobile OS vendor’s store – but it still happens. This is particularly true in the case of custom line-of-business applications.
A big advantage of persistent desktops is that both the user’s applications and the user’s data are always in the same place and accessible through the virtualized desktop, so that switching from one device to another suddenly becomes a seamless experience.
From the company’s point of view, the ability to quickly deploy desktops to new users can be a big plus, especially in special cases such as mergers and acquisitions, which seem to be increasingly common in many industries these days. You can bring in a large number of new employees and quickly get them up and running.
The other big thing for administration is that deploying virtual desktops gives you more control over them, more easily, than you might have with dozens, hundreds or even thousands of individual computer desktops. And this is true independently of the hardware. That is, users can bring their own devices – laptops and tablets – or work from their home desktop systems, and you still have control over their work desktops that they’re accessing with those devices. What’s not to like?
Concerns and issues surrounding Desktop as a Service
In spite of the benefits of delivering user desktops as a service over the Internet, as described above, there are downsides, as there are with any technology. Some users and IT pros may be resistant to the idea of a desktop that lives in the cloud. There may be concerns around security and privacy, which are common in relation to any transition to a cloud computing experience. In the case of the desktop, reliability and accessibility might also be an issue for some users, who fear that a loss of Internet connection (or just a loss of connectivity between the user’s machine and the server) could result in an almost total loss of productivity since the desktop is the location, for the average user, where everything lives. It’s their window to the computing world and if that window is closed for any reason, they may feel lost.
Admins may have the same and/or different concerns about DaaS. The major cloud providers offer SLAs that generally start at around “three nines,” or 99.9% uptime. That’s pretty standard throughout the industry and it sounds good – but in reality it translates to about eight and three quarters hours of downtime per year, or almost forty-four minutes per month. While that’s not a lot, that much time without a desktop could make a big difference if a particular user happens to be working to a very tight deadline on a critical project at the time the cloud service goes down.
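As a quick sanity check on the “three nines” arithmetic above, here is a small Python sketch that converts an SLA percentage into allowed downtime:

```python
# Convert an SLA availability figure into allowed downtime.
HOURS_PER_YEAR = 24 * 365  # 8760

def downtime_per_year_hours(sla: float) -> float:
    """Hours of downtime per year permitted by an availability SLA."""
    return HOURS_PER_YEAR * (1 - sla)

def downtime_per_month_minutes(sla: float) -> float:
    """Average minutes of downtime per month permitted by the SLA."""
    return downtime_per_year_hours(sla) * 60 / 12

# "Three nines" = 99.9% uptime:
print(round(downtime_per_year_hours(0.999), 2))     # 8.76 hours per year
print(round(downtime_per_month_minutes(0.999), 1))  # 43.8 minutes per month
```

That 8.76 hours per year is exactly the “eight and three quarters hours” quoted above, and the monthly figure rounds to the “almost forty-four minutes” mentioned there.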
Another common problem is adapting to the occasional (or sometimes frequent) latency issues that can plague DaaS implementations. Latency doesn’t just add up to a performance hit – it also makes for a frustrating experience for users who aren’t used to the “sit and wait” situation when performing tasks on their desktops. There has to be enough bandwidth to give users an experience that’s the same as or close to what they’re used to when working on a local desktop, because otherwise you’ll end up with very unhappy users.
Different types of applications are more or less affected by latency (or perhaps more accurately, the effects of latency will be more or less noticed by users). Real time communications and collaboration tools such as Skype are noticeably affected, as are multi-media applications that involve high quality video. Applications such as email, or browsing low bandwidth web sites (mostly text and photos), on the other hand, won’t be noticeably affected.
Weighing the pros and cons
Many companies are coming to the conclusion that the drawbacks of putting user desktops in the cloud are outweighed by the benefits, and in particular the cost benefits. Providing the CPU, RAM, and disk space for individual workstations can be much more expensive than virtualizing those resources, and DaaS solutions generally add up to significant cost savings over on-prem VDI for most organizations, thanks to economies of scale and the lower administrative overhead and capital expenditure compared with the latter.
DaaS, like other “as a service” computing, cuts the need for capital investments and shifts that cost to fixed and predictable on-going monthly or annual fees, thus moving big chunks of budget from CapEx to OpEx (capital to operational expenses). It also provides fast scalability (both up and down) and fits better into today’s “agile” model of doing business.
Once you’ve decided that DaaS is the right option for your org, you’re faced with the challenge of evaluating different DaaS providers and determining which is right for your needs. A comprehensive comparison of DaaS providers is beyond the scope of this article, but many companies today are using Amazon’s AWS (Amazon Web Services) for their IaaS, PaaS, cloud storage, and other cloud computing needs. If you’re already an AWS customer or if you’re considering AWS services in general, it makes sense to check out their DaaS offering when you decide to put some or all of your users’ desktops into the cloud.
Amazon’s DaaS: Workspaces in the AWS cloud
Amazon’s DaaS offering is called Amazon WorkSpaces. This can be a lower-cost alternative to expensive and difficult-to-configure (and manage) VDI deployments in your on-premises data center, while giving users similar functionality. The nice thing for users about VDI, in comparison to traditional local desktops, is that they are able to have the same experience and interface regardless of whether they’re connecting from a PC at the office, a home computer, or a laptop. They can even get that same computing environment when using a Mac or iPad, a Chromebook, or an Android tablet (including, of course, Amazon’s own Kindle Fire), as WorkSpaces supports all of these.
WorkSpaces is a robust DaaS solution that works in conjunction with your company’s Active Directory, making it easy for users to sign on to their desktops with the credentials they already use in the enterprise. It also makes things easy for admins, taking much of the burden of deploying and managing VDI off of you; Amazon takes care of such tedious tasks as desktop OS patching, and there are a number of different “bundles” of the service that you can subscribe to, depending on your hardware and software needs.
But what is it really going to cost?
We all know that service providers’ claims of gargantuan savings by adopting their services sometimes pan out and sometimes they don’t. Hosted desktops have been touted in many circles as a way to lower your TCO but sometimes there are hidden costs. In general, DaaS is more cost effective than VDI, which can be difficult to scale and upgrade because it’s usually built on enterprise (vs. cloud) infrastructures.
Within the DaaS options, though, there is a wide variance in pricing structures and ultimate costs. Some DaaS providers set a minimum number of desktops that you have to order, or set minimum usage requirements. Some charge licensing fees for the operating system separately. Others require that you commit to a long term contract (one year is common) so you’re locked in for that time period even if you find the DaaS solution doesn’t meet your needs.
When Amazon first released the WorkSpaces service in 2013, Gene Marks over at Forbes.com asked if it was too good to be true. He concluded that the cost, which on the face of it seemed considerably lower than that of the company he was using to host his 10-person company’s applications, would end up at close to the same per-month, per-user outlay after adding in the cost of Exchange and migrating databases.
That was two years ago and the market has matured somewhat over that time. Amazon offers three WorkSpaces bundles as well as a couple of applications options, which we will look at now.
AWS WorkSpaces pricing options
As a cloud service, WorkSpaces is a subscription that you pay for on a per-desktop, per-month basis. Unlike with some services, you are not required to sign a contract that locks you into WorkSpaces for any set period of time. You can delete some or all of your WorkSpaces as your changing needs dictate, and you’re charged only for those WorkSpaces that you use that month. That means if you have a user who takes a two-month leave of absence and that WorkSpace is never launched, you won’t be billed for it. That’s a big advantage over deploying your own desktops (for instance, from a local Remote Desktop Server), since you would incur the costs of that desktop whether or not it was used.
In order to provision WorkSpaces to your users, you have to have an AWS account; it’s not a standalone service. Of course, Amazon offers free accounts for one year that include 750 hours of EC2 usage (Windows or Linux at the t2 level), 5 GB of S3 storage, as well as RDS (relational database), DynamoDB (NoSQL database), and a number of other services. Note that after one year you have to pay regular rates, and you have to sign up with a credit card to get the free trial, since you’re charged if you exceed the usage caps. Individual users themselves do not need to have AWS accounts.
The basic WorkSpace for each user runs the Windows 7 desktop operating system experience, hosted on Windows Server 2008 R2, and has Internet Explorer 11, Firefox, and 7-Zip installed already. You can install your own software. You do this using Amazon WAM (WorkSpaces Application Manager), which comes in two versions: Lite (which is free) and a more full-featured Standard version, which costs $5 per month per user. We’ll discuss later in this article series how to use WAM to add software to WorkSpaces.
Cost per user
The cost per user for Work Spaces depends on the hardware configuration that you need for each desktop. That, of course, is dependent on what software the users use, how much data they need to store, the number of applications they need to be able to run at the same time, and so forth. In other words, will the work scenarios be light usage, typical/average office usage, or heavy usage with resource-intensive applications. If the users only need to check email and do web searches, hardware requirements are minimal. If the users will be working with video editing or CAD programs or other “power user” type use cases, they will need more processor, memory and storage.
Amazon offers three different hardware configurations, which Amazon refers to as “bundles”:
- For light users, the Value package provides one virtual processor, 2 GB of memory and 10 GB of storage. This is similar to a low-end PC or a mid-range tablet, although most top tier smart phones today actually have more memory and storage than this. The price is $21 per user per month if you “BYOL” (bring your own license, discussed in more detail below) or $25/user/month if you don’t already have the requisite Windows 7 licenses for your users.
- For the average user, the Standard package will run you an extra $10 per month per user. For that extra cost, you double the CPU to two virtual processors, double the RAM to 4 GB, and increase the storage space five-fold to 50 GB. This should suffice for most office productivity and communications programs.
- If you have power users who need to do heavy lifting from their desktops, then you’ll want to check into the Performance package. It’s pricey at $56/user/month with your own license or $60 without, but it ups the memory to 7.5 GB and increases storage space to 100 GB, which is enough to get some serious work done. CPU stays the same at two virtual processors.
The pricing mentioned above is for customers in the U.S./North America. Pricing in Europe is different (slightly higher) and a bit higher still in Asia Pacific regions (Sydney, Tokyo, Singapore).
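The bundle pricing above can be sketched as a small cost calculator. The figures are the 2015 U.S. prices quoted in this article (the Standard prices assume the $10 step-up applies on top of the Value bundle, and the $4 BYOL discount applies uniformly); check current AWS pricing before relying on any of them.

```python
# Per-user monthly prices as quoted above (2015 U.S. figures, illustrative).
BUNDLES = {
    #              (BYOL, no BYOL)
    "Value":       (21, 25),
    "Standard":    (31, 35),   # assumed: Value price plus the $10 step-up
    "Performance": (56, 60),
}

def monthly_cost(bundle: str, users: int, byol: bool = False) -> int:
    """Total monthly cost in USD for a number of users on one bundle."""
    per_user = BUNDLES[bundle][0 if byol else 1]
    return per_user * users

# 50 average users on Standard, without volume licensing:
print(monthly_cost("Standard", 50))              # 1750
# The same users with BYOL save $4 each per month:
print(monthly_cost("Standard", 50, byol=True))   # 1550
```

Even at this coarse level, the calculation makes it easy to see how the $4-per-user BYOL discount scales with headcount.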
As mentioned above, you can install your own software. Another option is to purchase the “Plus” add-on to any of the three packages. For an extra $15 (in all regions), you get Microsoft Office Professional and Trend Micro Security Services already installed.
You can also create custom images that you configure with the desired applications and settings and then deploy to your users. We’ll talk about how to do that later. Admins can create as many as five custom images for an AWS account (per region). You can install any software you want that is compatible with Windows 7, but of course you are responsible for having the proper licenses for the programs that you install.
The Bring Your Own License option is for those organizations that already have licenses for Windows 7 through a Microsoft Volume Licensing agreement with a software assurance contract. Doing this saves you money (about $4 per user per month) but it also makes getting started with WorkSpaces a little more complicated. In practice, you’ll probably need to work with your company’s Microsoft volume licensing representative (to verify that your licenses are eligible for BYOL) and with the AWS account manager for the particulars on uploading your Windows 7 images and making an AMI (Amazon Machine Image). All of this typically takes a week or two, so you might not be able to get started as quickly as you can without BYOL.
Windows 7 Professional or Enterprise edition can be used to create your AMI. After you import the image, you’ll need to build a custom bundle that includes that image. You’ll also need to activate the OS, which can be done using Microsoft activation servers hosted within your virtual private cloud (VPC) or reachable from it. We’ll go into deeper detail about how to do all this in a later section.
Note that unlike the non-volume licensed WorkSpaces, there is a minimum number of WorkSpaces that you have to launch per month in order to use the BYOL option. At the time of this writing, that number is 200. For this reason, as well as the VL agreement requirement, BYOL is only feasible for large organizations, not for small businesses.
The cloud security dilemma
Security is always a consideration when facing the “to the cloud or not to the cloud” choice. Data breaches have become so commonplace that they’re almost not “news” anymore, yet they continue to dominate the headlines; a study from the Ponemon Institute found that more than 40% of companies experienced some type of breach in 2014, including big names such as JPMorgan Chase, Home Depot and of course the infamous Target case.
Organizations are terrified of being the next victim (and the loss of customers that can result from the bad publicity). Data leaks can occur in many different ways and employees who access sensitive data on their desktops present one attack vector. While virtual desktops have security advantages, they can bring new challenges. This is where the DaaS provider you choose can make a difference.
In search of a secure WorkSpace
In looking at the security of a DaaS solution, the questions that you want to ask and the features that you want to look for are similar to those you must consider with any cloud service. A secure DaaS implementation hinges on a number of factors:
- Secure logon to the service: user authentication must be strong in order to prevent unauthorized users from accessing desktops where they can access sensitive information.
- Reliable identity services: regardless of the strength of the authentication protocols, authentication is built on the foundation of identity, so the identity database itself must be secure.
- Encryption: when the entire desktop is being delivered to the user over the Internet, it should be encrypted to prevent interception by unauthorized persons.
- Effective key management: the keys used to encrypt the virtual drives on which the desktop “lives” must be protected.
- Physical security: the servers in the provider’s datacenter where the applications actually run have to be secured from access by unauthorized or malicious persons, both internal and external.
AWS WorkSpaces security
With AWS WorkSpaces, Amazon implements a number of different security mechanisms in an effort to address the above issues.
WorkSpaces admins can choose from a few different ways to allow their users to log onto the WorkSpaces desktops. The simplest is to have users create credentials (user name and password) of their choice after you provision their desktops. Most medium to large (and many small) organizations will already have an Active Directory deployment and you can integrate WorkSpaces with your Active Directory domain to make it easy for users – they sign in with their familiar AD credentials.
Identity and authentication
Users can log on with credentials stored in a directory that’s maintained and managed by Amazon on their servers, or with AD credentials, depending on how WorkSpaces has been configured. Active Directory is, of course, the standard identity repository on Windows-based networks, and that holds true for Amazon’s cloud-based services when an organization has integrated its on-premises AD with AWS.
You can integrate WorkSpaces with your RADIUS server if you have one. Amazon added this feature in August of 2014 and Microsoft RADIUS servers are supported, along with others. For redundancy and high availability, you can set it up to use multiple RADIUS servers, with or without a load balancer.
Admins configure the RADIUS integration through the WorkSpaces admin console (in the Directories section) and there’s no extra cost. You’ll need to configure the IP address(es) for your RADIUS server(s) or load balancer, the port your RADIUS server uses, a shared secret, and select the protocol you set up for your RADIUS endpoints. You can also configure server timeout in seconds and maximum number of retries to connect to the RADIUS server (up to 10).
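To make the settings above concrete, here is a hypothetical sketch that models them as a small Python class with the sanity checks the console implies (port range, retries capped at 10). The class and field names are my own for illustration; they are not part of any AWS SDK or API.

```python
# Hypothetical model of the RADIUS settings the WorkSpaces admin
# console asks for (Directories section). Not an AWS API.

from dataclasses import dataclass

@dataclass
class RadiusSettings:
    server_ips: list          # RADIUS server(s) or load balancer address(es)
    shared_secret: str
    port: int = 1812          # standard RADIUS authentication port
    protocol: str = "PAP"     # must match your RADIUS endpoint configuration
    timeout_seconds: int = 30
    max_retries: int = 3      # the console allows up to 10

    def validate(self):
        if not self.server_ips:
            raise ValueError("at least one RADIUS server IP is required")
        if not self.shared_secret:
            raise ValueError("a shared secret is required")
        if not (1 <= self.port <= 65535):
            raise ValueError("port must be a valid UDP port")
        if not (1 <= self.max_retries <= 10):
            raise ValueError("max retries must be between 1 and 10")
        return True

settings = RadiusSettings(server_ips=["10.0.0.5"], shared_secret="s3cret")
print(settings.validate())   # True
```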
To log in, users provide their AD user name and password and then enter a one-time passcode generated by a hardware or software token, giving the protection of multifactor authentication (MFA).
You can use either hardware or software tokens with Amazon’s MFA. Google Authenticator is a popular software-based solution. If your RADIUS server is running on Linux, you can use a Pluggable Authentication Module (PAM) library to enable the use of Google Authenticator. MFA works for users who access WorkSpaces through client devices running Windows, Mac OS X, Chrome OS, iOS, Android or Kindle OS.
Controlling user access
You can limit the access that your users have to applications and other resources from their WorkSpaces.
It’s easy to keep a user from accessing his/her WorkSpace if the person leaves the company or for some other reason needs to be blocked permanently or temporarily; you simply disable the account in whichever directory is storing the user identities (your Active Directory if you’ve integrated AD with WorkSpaces or the Amazon directory service if you haven’t).
Note that Amazon Identity and Access Management (IAM) users are not given access to WorkSpaces resources by default. You probably already know that IAM is a means for allowing and denying permissions to resources via policies that can be attached to individual users, groups, or the resources themselves.
You would have to make a policy in IAM that grants the specific users permission to create and manage resources for WorkSpaces and EC2. Then you need to attach the policy to whichever users (or groups of users) you want to be able to access the WorkSpaces resources.
Amazon provides in their documentation a sample policy statement that can be used to grant permission to perform all WorkSpaces tasks for IAM users. You’ll find that sample script, along with more information on specifying WorkSpaces resources in IAM policies, on the AWS web site.
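To give a flavor of what such a policy looks like, here is a minimal sketch of an IAM policy document in the standard IAM JSON format. This is an illustration in the spirit of Amazon’s sample, not a copy of it; consult the AWS documentation for the authoritative version and for narrower, least-privilege action lists.

```python
# A minimal sketch of an IAM policy document granting WorkSpaces
# permissions. Illustrative only -- see the AWS docs for the real sample.

import json

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # workspaces:* covers all WorkSpaces actions; the console also
            # relies on EC2 actions for the underlying networking pieces.
            "Action": ["workspaces:*", "ec2:*"],
            "Resource": "*",
        }
    ],
}

policy_json = json.dumps(policy, indent=2)
print(policy_json)
```

Once created, this policy document would be attached to the specific IAM users or groups that should manage WorkSpaces, as described above.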
You can also control and limit access to network resources (including resources that reside on the Internet) from WorkSpaces by using VPC security groups. You might remember that VPC security groups behave sort of like virtual firewalls, because they control the inbound and outbound traffic to AWS virtual private clouds.
WorkSpaces will create a security group that’s assigned to all of the WorkSpaces you have provisioned to users in your directory. You can also create additional security groups through the WorkSpaces console. If you’re going to want to allow Internet access from WorkSpaces, you need to assign a public IP address, and you need to set this up before you provision the WorkSpaces because it will only apply to those that are created after you enable this setting. If you already have WorkSpaces provisioned, it is possible to manually assign those WorkSpaces an Elastic IP address; the AWS documentation provides instructions on how to do that.
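As a simple illustration of the egress-control idea, the following hypothetical helper (my own, not an AWS API call) checks whether a set of security-group egress rules, expressed as CIDR strings, would permit Internet-bound traffic:

```python
# Illustrative helper: does any egress CIDR rule cover the whole IPv4
# range (0.0.0.0/0), i.e. allow outbound Internet access?

import ipaddress

def allows_internet_egress(egress_rules):
    """egress_rules: list of CIDR strings, e.g. ["10.0.0.0/16", "0.0.0.0/0"]."""
    for cidr in egress_rules:
        net = ipaddress.ip_network(cidr)
        if net.prefixlen == 0:   # a /0 prefix matches every destination
            return True
    return False

print(allows_internet_egress(["10.0.0.0/16"]))                 # False
print(allows_internet_egress(["10.0.0.0/16", "0.0.0.0/0"]))    # True
```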
As all IT professionals know, one of the most important aspects of computer and network security is the patching of vulnerabilities in the software as quickly as possible, before their existence becomes widely known and attackers seize the opportunity to exploit them. WorkSpaces desktops are running popular applications on a popular client operating system and so security updates are just as important for these virtual desktops as they are for any network client.
Amazon gives you, as the WorkSpaces admin, control over the installation of security patches on the users’ WorkSpaces. This can be done through the Windows Update service that’s built into all modern versions of Windows, and Windows Update is turned on by default on all new WorkSpaces. If you prefer, however, you can use a patch management solution of your own choice, both to update Windows and Microsoft applications and to update third party apps.
Another “must have” for best security is anti-virus/anti-malware and you can install your favorite AV/AM software on the users’ WorkSpaces just as you install them on Windows client computers on your premises. You get Trend Micro AV as part of the package if you purchase one of the WorkSpaces “Plus” bundles (Value Plus, Standard Plus or Performance Plus), along with the Microsoft Office applications.
AWS WorkSpaces finally gets VoIP integration
The signs that unified communications is moving to the cloud continue to come. Now, it’s Amazon Web Services that’s getting in on the game – not so much as a provider, but rather by updating its existing virtual desktop infrastructure offering, WorkSpaces, to integrate more easily with VoIP and UC.
Now, organizations can add VoIP and UC capabilities to their cloud-based virtual desktops. According to an article on The Register, the new feature works by taking audio from any client – say, Skype for Business or WebEx – and running it through WorkSpaces.
This could make the lives of engineers and administrators a little easier while also making the VDI service far more useful. AWS launched WorkSpaces back in 2013, and there are examples of attempts to install softphones on it almost from the beginning (at least from 2014). But WorkSpaces wasn’t equipped to connect to audio devices, making it less than ideal for organizations that not only wanted the benefits of VDI, but also VoIP or full-on UC suites.
Windows Azure Pack
Cloud computing is making big inroads into companies today. Smaller businesses are taking advantage of Microsoft cloud services like Windows Azure, Windows Intune and Office 365 to migrate their line-of-business applications and services to the cloud instead of hosting them on-premises. The reasons for doing this include greater scalability, improved agility, and cost savings.
Large enterprises tend to be more conservative with regards to new technologies, mainly because of the high costs involved in widespread rollout of new service models and integrating them with the organization’s existing datacentre infrastructure. Windows Azure Pack is designed to help large enterprises overcome these obstacles by providing a straightforward path for implementing hybrid solutions that embrace both the modern datacentre and cloud hosting providers.
What is Windows Azure Pack?
To understand what Windows Azure Pack is, you first need to be familiar with Windows Azure, Microsoft’s public cloud platform. To understand what Windows Azure is all about, here are some brief excerpts from my recent book Introducing Windows Azure for IT Professionals: Technical Overview from Microsoft Press:
As a cloud platform from Microsoft that provides a wide range of different services, Windows Azure lets you build, deploy, and manage solutions for almost any purpose you can imagine. In other words, Windows Azure is a world of unlimited possibilities. Whether you’re a large enterprise spanning several continents that needs to run server workloads, or a small business that wants a website that has a global presence, Windows Azure can provide a platform for building applications that can leverage the cloud to meet the needs of your business…
Let’s look at the definition that Microsoft uses for describing Windows Azure:
Windows Azure is an open and flexible cloud platform that enables you to quickly build, deploy, and manage applications across a global network of Microsoft-managed datacentres. You can build applications using any language, tool, or framework. And you can integrate your public cloud applications with your existing IT environment.
This definition tells us that Windows Azure is a cloud platform, which means you can use it for running your business applications, services, and workloads in the cloud. But it also includes some key words that tell us even more:
- Open– Windows Azure provides a set of cloud services that allow you to build and deploy cloud-based applications using almost any programming language, framework, or tool.
- Flexible– Windows Azure provides a wide range of cloud services that can let you do everything from hosting your company’s website to running big SQL databases in the cloud. It also includes different features that can help deliver high performance and low latency for cloud-based applications.
- Microsoft-managed– Windows Azure services are currently hosted in several datacenters spread across the United States, Europe, and Asia. These datacenters are managed by Microsoft and provide expert global support on a 24x7x365 basis.
- Compatible– Cloud applications running on Windows Azure can easily be integrated with on-premises IT environments that utilize the Microsoft Windows Server platform.
So what is WAP really?
- Windows Azure Pack’s main entry point is the portal or ‘Control Panel’ – a better-known term in the hosting industry.
- Windows Azure Pack Portal brings together a plethora of services and products to provide a single place to provision and manage resources for an enterprise or a hosting provider.
Services that are available out of the box with WAP:
- Virtual Machines (via System Center VMM integration)
- Websites (distributed, multi-tenant, highly available web hosting service)
- Database (via SQL Server and MySQL)
- Automation (via System Center Runbooks)
What is interesting about the WAP Portal is that, like never before, it brings different Microsoft products under one single umbrella, both for IT administrators and for consumers such as development teams.
While this is not what Windows Azure Pack is mainly advertised for, this makes life so much simpler for administrators and consumers alike.
Take, for example, the traditional way to get a database server and a web server: you send a request to the IT department, which then goes through the approval process, procurement, allocation, management, Windows updates, yada yada yada. With Windows Azure Pack, all these types of assets can be provisioned, managed and monitored in one central place.
Windows Azure Pack Portal serves as the single place where the following Microsoft products come together:
- System Center VMM, Hyper-V
- IIS Web Server (yes, it does)
- SQL Server
- Azure Service Bus
In addition to these Microsoft products getting delivered via the portal, the WAP Portal is also highly extensible, allowing third-party services and products to integrate with it. The out-of-the-box MySQL database-as-a-service is one example; the Cloud Cruiser billing platform integration is another example of a third-party service. If you are a hosting service provider, you can differentiate quickly by integrating your services with WAP, or vice versa, with Custom Resource Provider extensions.
Windows Azure provides businesses with four basic categories of cloud-based services:
- Compute services
- Network services
- Data services
- App services
At the core of the Windows Azure platform is its ability to execute applications running in the cloud. Windows Azure currently provides four different models for doing this: Web Sites, Virtual Machines, Cloud Services, and Mobile Services. Together these four approaches comprise the compute services portion of the Windows Azure platform, and they can either be used separately or combined together to build more complex solutions that can meet specific business needs…
Windows Azure Web Sites is a scalable, secure, and flexible platform you can use for building web applications that run your business, extend the reach of your brand, and draw in new customers. It has an easy-to-use self-service portal with a gallery of the world’s most popular web solutions including DotNetNuke, CakePHP, DasBlog, WordPress, and many others. Or you can simply create a new website from scratch and then install a tool like WebMatrix—a free, lightweight web development tool that supports the latest web technologies such as ASP.NET, PHP, HTML5, CSS3, and Node. You can use WebMatrix to create websites and publish applications for Windows Azure. And if you use Microsoft Visual Studio as a development environment, you can download and install a Windows Azure SDK so you can build applications that can take advantage of the scalable cloud computing resources offered by Windows Azure…
Creating a new website with Windows Azure is so easy we have to show you how to do it. Begin by logging on to the Windows Azure Management Portal at https://manage.windowsazure.com using your Microsoft Account username and password. Then select the Web Sites tab on the left and either click Create A Web Site or click the New button on the command bar at the bottom as shown here:
Figure 1: You can create a new website using Windows Azure
The command bar then expands, as shown in the next figure, and allows you to quickly create a new website with no additional configuration, a custom website with either a new or existing database, or a new web application based on an application framework, blog engine, template, or any other app available in the Windows Azure gallery.
Figure 2: The Quick Create option for a Web Site
Windows Azure Virtual Machines is a scalable, on-demand IaaS platform you can use to quickly provision and deploy server workloads into the cloud. Once deployed, you can then configure, manage, and monitor those virtual machines, load-balance traffic between them, and connect them to other Windows Azure Cloud Services running web roles and worker roles. You can copy virtual hard disks (VHDs) from your on-premises environment into Windows Azure to use as templates for creating new virtual machines. And you can copy VHDs out of Windows Azure and run them locally in your datacenter.
You can create new virtual machines from a standard image available in the Windows Azure gallery. Standard images are included for current versions of Windows Server and for different flavors of Linux. Standard images are also available for Microsoft SharePoint, Microsoft SQL Server, and Microsoft BizTalk Server pre-installed on Windows Server. Standard images are a great way of quickly provisioning new virtual machines, but you can also use images you created on-premises to deploy new virtual machines.
Creating a new virtual machine in Windows Azure is easy. Just open the Windows Azure Management Portal and select the Virtual Machines tab on the left and click the New button in the command bar at the bottom. The command bar expands and displays two options for creating virtual machines: Quick Create or From Gallery.
The Quick Create option lets you create a new virtual machine which you can configure later. As Figure 3 shows, all you need to specify for this option is the DNS name for your virtual machine, the image to use as a template for your virtual machine, the size of the virtual machine (number of cores), a user name and password for administrative access to the virtual machine, and the region or affinity group to which the virtual machine should be assigned:
Figure 3: The Quick Create option for a Virtual Machine
The other option, called From Gallery, lets you create a virtual machine by specifying advanced options presented in a series of pages. The first page shown in Figure 4 allows you to choose an image to be used as a template when creating your virtual machine…
Figure 4: You can choose an image on which your new virtual machine will be based.
Windows Azure Pack vs. Windows Azure
Let’s review the definition that Microsoft uses for describing Windows Azure:
Windows Azure is an open and flexible cloud platform that enables you to quickly build, deploy, and manage applications across a global network of Microsoft-managed datacenters. You can build applications using any language, tool, or framework. And you can integrate your public cloud applications with your existing IT environment.
Now let’s examine how Microsoft describes Windows Azure Pack. First, here’s how they define Windows Azure Pack on their Server and Cloud Platform site:
The Windows Azure Pack is a collection of Windows Azure technologies available to Microsoft customers at no additional cost. Once installed in your datacenter, the Windows Azure Pack integrates with System Center and Windows Server to help provide a self-service portal for managing services such as websites, Virtual Machines, and Service Bus; a portal for administrators to manage resource clouds; scalable web hosting; and more.
Next, here’s how Microsoft defines Windows Azure Pack in the TechNet Library:
Windows Azure Pack for Windows Server is a collection of Windows Azure technologies, available to Microsoft customers at no additional cost for installation into your data center. It runs on top of Windows Server 2012 R2 and System Center 2012 R2 and, through the use of the Windows Azure technologies, enables you to offer a rich, self-service, multi-tenant cloud, consistent with the public Windows Azure experience.
Comparing these various definitions and reading the linked resources enables us to conclude the following about how Windows Azure Pack compares to Windows Azure:
- Both platforms provide a set of cloud services that allow you to build and deploy cloud-based applications using almost any programming language, framework, or tool. But while Windows Azure provides a broad range of several dozen different cloud services, Windows Azure Pack provides only a subset of these services, primarily Web Sites, Virtual Machines and Service Bus.
- Cloud applications running on either platform can easily be integrated with on-premises IT environments that utilize Windows Server to enable you to build hybrid solutions.
- While Windows Azure is hosted in globally distributed datacenters managed by Microsoft, Windows Azure Pack is something you can deploy within your own datacenter.