"Peak 10 is a hybrid infrastructure provider across the nation. We are in the thick of things when it comes to hybrid IT," explained Michael Fuhrman, Chief Technology Officer at Peak 10, in this SYS-CON.tv interview at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.
Monthly archive: July 2017
The top three approaches for improving cloud migration and security
For many enterprises, migrating toward a cloud-delivered approach for IT systems is an attractive proposition. Cost efficiency and business agility are big drivers for CIOs to make the move. Most modern companies have either started migrating to a public cloud or are in the early planning and analysis phases of doing so.
At the same time, making the jump from on-site infrastructure to cloud-hosted platforms is not free of challenges, such as regulations, data governance, and billing and cost management. One of the CIO’s highest priorities must be to minimise migration risk.
According to a LinkedIn Information Security Community survey, 49 percent of CIOs and CSOs feel that one of the major barriers to cloud adoption is the fear of data loss and leakage, and 59 percent believe that traditional network security tools and appliances work only somewhat, or not at all, in the cloud.
Before an organisation makes the leap to the cloud, it’s imperative for CIOs and CSOs to address the following risks and concerns:
Regulatory requirements: Depending on the industry, a company may be subject to more stringent regulations such as PCI DSS (payment cards), SOX (financial reporting), and HIPAA (health data). While the cloud doesn’t change the requirements needed to meet those regulatory standards, it often means that an organisation will need to leverage new approaches and technology. Examples include identity and access management (IAM), audit logging and anomaly detection, and incident response and responsible disclosure.
Data governance: In addition to, or as part of, regulatory requirements, an organisation needs a well-formed strategy for data governance and data locality. As with on-premises systems, CIOs need to make sure they have a well-defined data access policy in place so that users can’t access or move data unless they are first approved. Beyond access control, encryption of sensitive data (both in transit and at rest) should be implemented; in the case of HIPAA, it’s required.
Infrastructure and application security: One of the main changes to infrastructure security in the cloud is the move to a software-defined security model rather than a hardware-defined, appliance- and perimeter-based model. The same network planning needs to take place up-front, but it should be done with the understanding that there is no true perimeter and that all resources are elastic.
Because of this elastic, programmatic environment, it’s advisable to have a continuous change monitoring solution in place so that there are never any configuration “surprises” that could expose critical data or assets. In addition to infrastructure security, application security testing should ideally be performed for every update that is delivered, to provide continuous security assurance.
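As a minimal illustration of the continuous change monitoring idea, the sketch below compares a snapshot of resource configurations against an approved baseline and flags any drift. The resource names and fields are hypothetical and not tied to any particular cloud provider; a real deployment would pull snapshots from the provider's APIs on a schedule.

```python
# Minimal sketch of configuration drift detection (illustrative only).
# Resource names and fields are hypothetical.

APPROVED_BASELINE = {
    "web-sg": {"ingress_ports": [443], "public": False},
    "db-vol": {"encrypted": True, "public": False},
}

def detect_drift(current_snapshot: dict) -> list:
    """Return a list of (resource, field, expected, actual) differences."""
    findings = []
    for resource, expected in APPROVED_BASELINE.items():
        actual = current_snapshot.get(resource, {})
        for field, expected_value in expected.items():
            if actual.get(field) != expected_value:
                findings.append((resource, field, expected_value, actual.get(field)))
    return findings

if __name__ == "__main__":
    # A snapshot where someone has opened port 22 and disabled volume encryption.
    snapshot = {
        "web-sg": {"ingress_ports": [443, 22], "public": False},
        "db-vol": {"encrypted": False, "public": False},
    }
    for resource, field, expected, actual in detect_drift(snapshot):
        print(f"DRIFT: {resource}.{field} expected {expected!r}, found {actual!r}")
```

Run continuously, a check like this turns configuration "surprises" into alerts that can be triaged before they expose data.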
The best migration approaches
Once the IT department has fully addressed these risk factors, it can move on to planning the cloud migration approach that best meets the company’s business objectives and requirements. While there are a number of approaches used in the industry, these are the most common:
Lift and shift: This approach involves mapping the on-premises hardware and/or VMs to similarly sized cloud instances. For example, if a company’s front-end application server has 4 CPUs, 64GB of RAM, and 512GB of local storage, it would use a cloud instance that matches that configuration as closely as possible (a rough matching sketch follows this list). The challenge with this approach is that on-premises systems are typically over-provisioned in order to meet peak loads, as they lack the elastic, auto-scaling features of the cloud. This results in increased cloud costs, which may be fine if this is a short-term approach.
Refactor and rearchitect: To make the most of cloud features such as auto-scaling, migration can be the forcing function to take some time and re-architect the application to be more performant and to keep costs under control. It is also a good time to re-evaluate technology choices, as a company may be able to switch some solutions from more expensive commercial products to open-source or cloud-native offerings.
Shelve and spend: This third approach involves retiring a monolithic on-premises application and moving to a SaaS solution. An example would be an HCM (Human Capital Management) application, which is often a disparate set of code bases tied together with a relational database, migrating to an offering such as Workday HCM. This modernises the business logic and offloads the operational burden of the service and infrastructure to the SaaS provider.
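To make the lift-and-shift mapping concrete, here is a rough sketch that picks the cheapest cloud instance type able to cover an on-premises server's specification. The instance catalogue and prices below are invented for illustration and are not any provider's actual price list.

```python
# Rough lift-and-shift sizing sketch (illustrative only).
# The instance catalogue below is invented; substitute your provider's
# actual instance types and on-demand prices.

INSTANCE_CATALOGUE = [
    # (name, vCPUs, RAM in GB, hourly price in USD)
    ("small-4x32",  4,  32, 0.20),
    ("medium-4x64", 4,  64, 0.35),
    ("large-8x128", 8, 128, 0.70),
]

def match_instance(cpus: int, ram_gb: int):
    """Return the cheapest catalogue entry that covers the on-prem spec."""
    candidates = [i for i in INSTANCE_CATALOGUE if i[1] >= cpus and i[2] >= ram_gb]
    if not candidates:
        return None
    return min(candidates, key=lambda i: i[3])

if __name__ == "__main__":
    # The front-end server from the example above: 4 CPUs, 64GB of RAM.
    name, vcpus, ram, price = match_instance(4, 64)
    print(f"Closest match: {name} ({vcpus} vCPUs, {ram}GB) at ${price:.2f}/hour, "
          f"roughly ${price * 24 * 365:,.0f}/year before discounts")
```

The annualised figure makes the over-provisioning problem visible: an instance sized for peak load runs, and is billed, around the clock unless the workload is re-architected to scale elastically.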
While there are a number of hurdles and challenges to overcome when it comes to cloud migration, these approaches can help CIOs and CSOs take the best route to capitalize on the benefits of moving to the cloud while minimising risk.
[video] Leveraging the Full Potential of Cloud with @Ocean9Inc | @CloudExpo #SAP #API #Cloud #DataCenter
"We are focused on SAP running in the clouds, to make this super easy because we believe in the tremendous value of those powerful worlds – SAP and the cloud," explained Frank Stienhans, CTO of Ocean9, Inc., in this SYS-CON.tv interview at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.
Microsoft announces $23 billion revenue quarter as strong Azure growth continues
Microsoft has announced revenues of $23.3 billion (£17.9bn) in its most recent financial quarter – with cloud again at the heart of its success.
This will be something of a familiar story to those who have taken notice of Microsoft’s financials over the past year or so, with CEO Satya Nadella telling analysts that financial year 2017 “all up was a tremendous year of customer momentum with cloud, AI, and digital transformation.”
Microsoft does not disclose specific product revenues, but it did give a guideline: Azure revenue went up 97% year over year, with Office 365 commercial revenue up 43%. The company instead puts its revenues into a few buckets. ‘Productivity and business processes’ – the Office 365 side – was at $8.45 billion over the most recent quarter, an increase of 21%, while ‘intelligent cloud’ – Azure and server products – rose 10% to $7.43bn.
“Our technology world view of an intelligent cloud and an intelligent edge is resonating with businesses everywhere,” Nadella said, as transcribed by Seeking Alpha. “Every customer I talk to is looking for both innovative technology to drive new growth, as well as a strategic partner who can help build their own digital capability. Microsoft is that trusted partner.”
Among the quarter’s highlights for Microsoft were the acquisition of Cloudyn in June, to help Azure customers manage and optimise their cloud usage, as well as joining the open source Cloud Foundry Foundation as a gold member and an expanded partnership with cloud storage provider Box focused on machine learning. The latter angle was noted by Nadella, who told analysts: “The core currency of any business going forward will be the ability to convert their data into AI that drives competitive advantage.”
Meanwhile, another company whose cloud arm stands out from the rest of its results is SAP, which reported yesterday. Revenues for the German software giant were at €5.8 billion (£5.2bn) for Q2, with cloud revenues seeing the biggest growth, up 29% to €932 million.
You can read the full Microsoft release here.
Cloud security spending goes up for organisations as app-level responsibility bites
Organisations are more likely to prefer storing data in the cloud instead of on a legacy system – but are spending significantly on security to keep up.
That is the latest finding from Clutch, a B2B research firm, which put out the latest report from its annual cloud computing survey earlier this week.
Nearly 70% of the 283 IT professionals polled said they would be more comfortable storing data in the cloud; yet more than half of companies surveyed admitted to spending more than $100,000 per year on additional cloud security features. 22% of respondents spend at least $500,000 on additional cloud security features per year, while 8% spend more than $1 million.
Of the security measures available, additional encryption was the most popular among respondents, while almost two thirds (65%) of businesses said they follow regulatory standards from the Cloud Security Alliance.
The report also delved into who should do what when it comes to cloud security. As this publication has explored this month, through a report from Barracuda Networks, there appears to be a disconnect among organisations around the shared responsibility model for infrastructure as a service.
Application-level controls, identity and access management, and endpoint protection, among others, are the customer’s responsibility, as outlined by both Microsoft and Amazon Web Services (AWS) in their documentation. Clutch argues that the high investment in cloud security is related to the risks that are outside the cloud provider’s control.
“There is suddenly a number of people recognising that application-level security needs to be done by the user, not the vendor,” said Haresh Kumbhani, founder and CEO of cloud consulting provider Zymr. “If this is the case, then they need to invest top dollar in securing the data.”
Almost a quarter (23%) of respondents said they use Internet of Things (IoT) services on the cloud, although when it came to security on top of it – a significant threat, given the frequent global cyberattacks which invariably make the headlines – it was described by Jamie MacQuarrie, co-founder of Appivo, as ‘nascent’. “For every company that properly locks down IoT-enabled machines on a factory floor, you have thousands of unsecured ‘smart’ lightbulbs,” he said.
You can read the full report here.
IBM adds four new cloud data centres as second quarter results hit
IBM has issued its financial results for Q217, with revenues down 5% year over year but with its cloud arm leading ‘continued growth in strategic imperatives’.
According to the release (pdf), total revenues were at $19.3 billion (£14.9bn), compared to $20.2bn this time last year. The first half of the year totalled $37.4bn, down from $38.9bn in 2016. IBM said its second quarter cloud revenues were at $3.9 billion, up 15%, with cloud revenues over the last 12 months totalling $15.1bn.
IBM puts its revenues into four primary buckets: cognitive solutions, at $4.6bn; global business services, at $4.1bn; technology services and cloud platforms, at $8.4bn; and systems, including systems hardware and operating systems software, at $1.7bn.
“In the second quarter, we strengthened our position as the enterprise cloud leader and added more of the world’s leading companies to the IBM cloud,” said Ginni Rometty, IBM chief executive officer in a statement. “We continue to innovate, adding regtech capabilities to our portfolio of Watson offerings; developing solutions based on emerging technologies such as blockchain; and reinventing the IBM mainframe by enabling clients to encrypt all data, all the time.”
IBM’s focus on blockchain and artificial intelligence (AI) was made abundantly clear at the company’s InterConnect event in Las Vegas in March. Rometty told attendees of her belief that blockchain “will do for trusted transactions what the internet has done for information”, and how quantum computing will solve problems businesses ‘never knew [they] had’.
Signifying the cloud push, the company also announced the arrival of four new cloud data centres yesterday, with two opening their doors in London and the others in San Jose and Sydney. IBM’s global cloud data centre footprint now sits at almost 60, across 19 countries.
The company name-dropped two customers, in the shape of Bit.ly and oilfield services provider Halliburton, with John Considine, general manager for cloud infrastructure services, saying: “We continue to expand our cloud capacity in response to growing demand from clients who require cloud infrastructure and cognitive services to help them compete on a global scale.”
Given the continued downturn in overall revenues – revenue falling for the 21st consecutive quarter – analysts would be expected to have a pessimistic outlook. An excoriating note from Jefferies, reiterating ‘underperform’, said that while IBM’s Watson is one of the most complete cognitive platforms, the company is ‘outgunned’ in the war for AI talent and return on investment could be negligible.
Yet writing for Seeking Alpha, Thomas Pangia – a long-time IBM supporter – argued the company’s declining revenue trend should be a thing of the past by 2019, adding that its strong cash flow was also a positive.
Among the highlights for IBM this quarter were the purchase of Verizon’s cloud and managed hosting business, a collaboration with Nutanix to help enterprises with hyperconverged deployments, and a cloudy client win with American Airlines.
Navigating pain points when migrating your enterprise to the cloud
Many enterprises are jumping on the cloud to modernize their IT. Cloud computing not only provides on-demand, highly scalable compute, storage and network infrastructure; it also allows IT to spin up an environment in minutes, driving agility the enterprise has never seen before.
However, moving workloads to cloud or refactoring them to run natively in the cloud is not easy. Once those workloads are running in the cloud, monitoring and managing them is required. IT needs all the help it can get.
Fortunately, an entire cloud channel ecosystem has evolved to help companies plan for the cloud, move to the cloud and maximize their investments in the cloud. There are partners specializing in strategy and assessment, those who focus on implementation, others which do monitoring, management and security, and some companies that handle it all.
Then of course there are hundreds of vendors offering commercial and open source tools to automate various steps of the journey, and others which deliver analytics to monitor performance and aid decision-making and refinement.
Let’s look at three of the top challenges for midsize and larger companies when migrating to public cloud infrastructure, and how channel partners can help:
Understanding ROI
CIOs need to make a sound case for saving money over time by moving significant portions of their environment to the public cloud. This can be a complicated endeavor to calculate, as enterprises have a mix of legacy and homegrown applications, third-party systems such as HR and finance, usually a few SaaS applications, and likely more than one data center. The cost evaluation must consider assessment, planning and migration costs; application modernization requirements; and any training and staffing needs (a back-of-the-envelope sketch follows below).
How the channel helps: Skilled partners can help assess and compare on-premises infrastructure costs to the cloud with better accuracy and speed. Some channel companies have developed strong methodologies and best practices; others have powerful tools that can map your on-premises infrastructure to the cloud, right-size your resources in the cloud, and identify infrastructure and application inter-dependencies.
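As a back-of-the-envelope illustration of the kind of comparison a partner would formalize, the sketch below amortizes on-premises hardware and operations against projected cloud spend. Every figure in it is a placeholder, not a benchmark.

```python
# Back-of-the-envelope cloud ROI comparison (all figures are placeholders).

def on_prem_annual_cost(hardware_capex, amortisation_years, ops_staff, power_and_dc):
    """Amortised yearly cost of running the workload on-premises."""
    return hardware_capex / amortisation_years + ops_staff + power_and_dc

def cloud_annual_cost(monthly_infrastructure, migration_cost, migration_years, training):
    """Yearly cloud cost, with one-off migration and training spread over the plan."""
    return monthly_infrastructure * 12 + (migration_cost + training) / migration_years

if __name__ == "__main__":
    on_prem = on_prem_annual_cost(hardware_capex=600_000, amortisation_years=4,
                                  ops_staff=250_000, power_and_dc=80_000)
    cloud = cloud_annual_cost(monthly_infrastructure=28_000, migration_cost=150_000,
                              migration_years=3, training=40_000)
    print(f"On-premises: ${on_prem:,.0f}/year   Cloud: ${cloud:,.0f}/year")
    print(f"Estimated annual saving: ${on_prem - cloud:,.0f}")
```

The value a partner adds is in getting the inputs right: real amortization schedules, actual utilization, and the hidden costs of modernization and retraining.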
Determining and achieving business advantage
Of course, moving to the cloud is not just about saving money. It’s about gaining new capabilities from the rapid scale, elastic workload and geolocation benefits of public cloud infrastructure. If you need to support a business unit in London, you don’t need to contract with a separate data center provider to make it happen. The major cloud providers have data centers all around the world, offering reliability and failover capabilities for 24-hour businesses. Rapid scale can provide a competitive advantage, and outside experts with plenty of experience migrating companies to the cloud can help a business understand exactly what’s needed to get there.
How the channel helps: Companies should look for a partner that has experience in modernizing and refactoring applications within the industry in which they work. You should engage partners that have experience with DevOps and TechOps tools, and those that can help you with continuous integration and continuous delivery (CI/CD), build automation and more.
Managing the cloud
Once you have successfully moved workloads or deployed new services in the cloud, you need a partner to monitor and manage your cloud environment. Even though cloud providers manage the infrastructure, you still must manage your own workloads. Cloud infrastructure services generate tons of data, events and alerts that need to be analyzed. Also, you are constantly incurring costs that can be reduced by optimizing your resources.
How the channel helps: Cloud service providers can manage and monitor your cloud environment, ensuring your services are running efficiently. Cloud optimization requires deep expertise with tools and techniques, something a good partner can bring.
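As one small example of the optimization a managed partner automates, the sketch below flags instances whose average CPU utilization suggests they could be downsized. The utilization data, threshold and saving estimate are made up for illustration; in practice the metrics would come from the provider's monitoring APIs.

```python
# Flag likely over-provisioned instances from utilisation data (illustrative).

UTILISATION_REPORT = [
    # (instance_id, avg_cpu_percent, monthly_cost_usd)
    ("app-01", 62.0, 410.0),
    ("app-02",  7.5, 410.0),
    ("batch-1", 3.2, 980.0),
]

DOWNSIZE_THRESHOLD = 15.0  # average CPU % below which a smaller size is suggested

def downsize_candidates(report, threshold=DOWNSIZE_THRESHOLD):
    """Return instances that look over-provisioned, with a rough 50% saving estimate."""
    return [(iid, cpu, cost * 0.5) for iid, cpu, cost in report if cpu < threshold]

if __name__ == "__main__":
    for iid, cpu, saving in downsize_candidates(UTILISATION_REPORT):
        print(f"{iid}: avg CPU {cpu:.1f}% -> consider a smaller size (~${saving:.0f}/month saved)")
```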
When it comes to making transformational changes in how IT resources are developed, managed, provisioned and delivered, most companies will need some outside help. Partners that can offer a program incorporating competitively priced solutions for the above three key pain points will have an edge in the cloud services market.
[session] Demystifying #Kubernetes | @DevOpsSummit #CloudNative #Serverless #DevOps
Kubernetes is an open source system for automating deployment, scaling, and management of containerized applications. Kubernetes was originally built by Google, leveraging years of experience with managing container workloads, and is now a Cloud Native Computing Foundation (CNCF) project. Kubernetes has been widely adopted by the community, is supported on all major public and private clouds, and is gaining rapid adoption in enterprises. However, Kubernetes may seem intimidating and complex to learn. This is because Kubernetes is more of a toolset than a ready solution. Hence it’s essential to know when and how to apply the appropriate Kubernetes constructs.
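As a small taste of those constructs, the sketch below uses the official Kubernetes Python client to declare a three-replica Deployment running a stock nginx image. It assumes a working kubeconfig and cluster; the names and labels are arbitrary examples.

```python
# Minimal Deployment example using the official Kubernetes Python client
# (pip install kubernetes). Assumes kubectl is already configured for a cluster.
from kubernetes import client, config

def make_deployment(name: str = "web", replicas: int = 3) -> client.V1Deployment:
    """Declare a simple Deployment running a stock nginx container."""
    container = client.V1Container(
        name=name,
        image="nginx:1.13",
        ports=[client.V1ContainerPort(container_port=80)],
    )
    pod_template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=pod_template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=name),
        spec=spec,
    )

if __name__ == "__main__":
    config.load_kube_config()  # uses the current kubectl context
    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(namespace="default", body=make_deployment())
    print("Deployment 'web' created with 3 replicas")
```

Even this small example touches several constructs (Pods, labels, selectors, Deployments), which is why knowing when to reach for each one matters more than memorising any single manifest.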
What’s Driving Subscription Services? | @CloudExpo #DaaS #XaaS #Cloud
From personal care products to groceries and movies on demand, cloud-based subscriptions are fulfilling the needs of consumers across an array of market sectors. Nowhere is this shift to subscription services more evident than in the technology sector. By adopting an Everything-as-a-Service (XaaS) delivery model, companies are able to tailor their computing environments to shape the experiences they want for customers as well as their workforce.
Google Cloud catches up to AWS with Transfer Appliance
Google Cloud has caught up to AWS with a physical ‘Transfer Appliance’ to move your data from your own local servers into the giant’s cloud.
Amazon already has a solution, which it calls ‘Snowball’, featuring 50TB or 80TB capacities in a ruggedised appliance that the company sends to your premises so you can fill it with your data locally before it heads back to your preferred AWS data centre. The idea, of course, is that you benefit from a much quicker transfer without the latency and cost of uploading over a standard WAN (wide area network).
If you have a large amount of data and don’t want to pay through the roof, Google’s solution may benefit you more than Amazon’s. The web giant has topped the capacity of Amazon’s comparable offerings with a 100TB/2U base Transfer Appliance and a 480TB/4U variant. Both are designed to fit into 19-inch racks.
Google has provided this handy chart of the estimated time differences between a physical and online transfer:
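The arithmetic behind such a chart is simple enough to sketch. The link speeds and utilization factor below are assumptions for illustration, not Google's figures.

```python
# Rough transfer-time comparison for bulk data (figures are assumptions).

def days_to_upload(terabytes: float, link_mbps: float, utilisation: float = 0.7) -> float:
    """Days to push `terabytes` over a WAN link at `link_mbps`, partially utilised."""
    bits = terabytes * 8 * 10**12                      # decimal terabytes to bits
    seconds = bits / (link_mbps * 10**6 * utilisation) # effective throughput
    return seconds / 86_400

if __name__ == "__main__":
    for size_tb in (100, 480):
        for link_mbps in (100, 1_000):
            print(f"{size_tb}TB over a {link_mbps}Mbps link: "
                  f"~{days_to_upload(size_tb, link_mbps):.0f} days online")
```

On those assumptions, 100TB over a 100Mbps link takes on the order of four months online, versus the days-to-weeks cycle of shipping, filling and ingesting a physical appliance.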
“Using a service like Google Transfer Appliance meant I could transfer hundreds of terabytes of data in days not weeks,” comments Tom Taylor, Head of Engineering at The Mill. “Now we can leverage all that Google Cloud Platform has to offer as we bring narratives to life for our clients.”
As for pricing, the 100TB model is priced at $300, plus shipping via FedEx (approximately $500); the 480TB model is priced at $1,800, plus shipping (approximately $900). Initially, the appliance will only be available in the US.
It’s worth noting, of course, that Amazon still takes the crown if you need to transfer an insane amount of data with its 100PB (yes, petabyte) truck it calls the Snowmobile. Before the 45-foot long ruggedized shipping container – which is pulled by a semi-trailer truck – rolls out to your premises, you will need an initial assessment.
Are you impressed with Google’s Transfer Appliance? Share your thoughts in the comments.