AWS launches new features at Chicago Summit

Amazon Web Services has launched a number of new features, along with the announcement that AWS Import/Export Snowball is now available in four new regions, including Europe.

Speaking at the AWS Chicago Summit, the team announced several updates, including new security features, tools which simplify the movement of data around an organization’s cloud, platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, testing features, and authentication services.

First up is an update to AWS Device Farm, a service initially introduced last June which enables customers to test mobile apps on real devices. The service is built on the concept of ‘write once, test everywhere’, giving developers the chance to test apps in more than 200 unique environments (a variety of carriers, manufacturers, models, operating systems and so on). The update now provides customers with remote access to devices for interactive testing.

Writing on the AWS blog, Jeff Barr, Chief Evangelist at Amazon Web Services, said: “you simply open a new session on the desired device, wait (generally a minute or two) until the device is available, and then interact with the device via the AWS Management Console. You can gesture, swipe, and interact with devices in real time directly through your web browser as if the device was on your desk or in your hand. This includes installing and running applications.”
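For teams that script their Device Farm workflows, remote access sessions can also be requested through the API. The following is a minimal, illustrative boto3 sketch; the project and device ARNs are placeholders, and Device Farm is assumed to be used from its home region, us-west-2.

```python
# Hypothetical boto3 sketch of requesting a Device Farm remote access session;
# the ARNs below are placeholders for a real project and device.
import boto3

# Device Farm is served out of the us-west-2 region.
devicefarm = boto3.client("devicefarm", region_name="us-west-2")

session = devicefarm.create_remote_access_session(
    projectArn="arn:aws:devicefarm:us-west-2:111122223333:project:EXAMPLE",
    deviceArn="arn:aws:devicefarm:us-west-2::device:EXAMPLE",
    name="interactive-debug-session",
)

# Poll the session until the device has been allocated, then drive it
# interactively from the AWS Management Console.
status = devicefarm.get_remote_access_session(
    arn=session["remoteAccessSession"]["arn"]
)
print(status["remoteAccessSession"]["status"])
```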

Amazon S3 and Snowball, both designed to speed up the data migration process, also received attention during the event. AWS Import/Export Snowball was launched for customers who need to move larger amounts of data, generally 10 terabytes or more, and has now been beefed up once again. New features for S3 make use of the AWS edge infrastructure to increase transfer speed, while Snowball gains a larger-capacity appliance and is now available in four new regions.

“Many AWS customers are now using AWS Import/Export Snowball to move large amounts of data in and out of the AWS Cloud,” said Barr. “The original Snowball appliances had a capacity of 50 terabytes. Today we are launching a newer appliance with 80 terabytes of capacity.”
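On the S3 side, the edge-based speed-up described above is S3 Transfer Acceleration. Below is a hedged boto3 sketch of switching it on for a bucket and then uploading through the accelerated endpoint; the bucket and file names are placeholders.

```python
# Illustrative boto3 sketch: enable S3 Transfer Acceleration for a bucket and
# upload through the accelerated (edge) endpoint.
import boto3
from botocore.config import Config

s3 = boto3.client("s3")
s3.put_bucket_accelerate_configuration(
    Bucket="example-bucket",
    AccelerateConfiguration={"Status": "Enabled"},
)

# A second client configured to route transfers via the accelerate endpoint.
s3_accelerated = boto3.client(
    "s3", config=Config(s3={"use_accelerate_endpoint": True})
)
s3_accelerated.upload_file(
    "large-dataset.tar.gz", "example-bucket", "large-dataset.tar.gz"
)
```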

Amazon Kinesis, a service which enables users to manage data that is streamed into the cloud, has been updated to allow users to deploy, run, and scale Elasticsearch in the AWS Cloud, as well as to integrate with Amazon CloudWatch, AWS’s monitoring service.
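As an illustration of the producer side of such a streaming pipeline, the hedged boto3 sketch below writes a JSON record to a Kinesis Firehose delivery stream; the stream name is a placeholder and the stream is assumed to have already been configured with an Elasticsearch destination.

```python
# Illustrative sketch only: push a JSON record into a Kinesis Firehose
# delivery stream (assumed to deliver into an Elasticsearch domain).
import json
import boto3

firehose = boto3.client("firehose")

record = {"sensor_id": "device-42", "temperature": 21.7}
firehose.put_record(
    DeliveryStreamName="example-to-elasticsearch",
    Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
)
```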

The Cognito service allows apps to add authentication, user management, and data synchronization without having to write backend code or manage any infrastructure. The ‘Your User Pools’ feature update allows developers to build a user directory that can scale to hundreds of millions of users, to help manage the authentication process.

“Using a user pool gives you detailed control over the sign-up and sign-in aspects of your web and mobile SaaS apps, games, and so forth,” said Barr. “Building and running a directory service at scale is not easy, but is definitely undifferentiated heavy lifting, with the added security burden that comes when you are managing user names, passwords, email addresses, and other sensitive pieces of information. You don’t need to build or run your own directory service when you use Cognito Identity.”
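For context, a user pool is consumed from application code through the Cognito Identity Provider API. The snippet below is a minimal, hypothetical sketch of the sign-up and confirmation flow with boto3; the app client ID, username, password and confirmation code are all placeholders.

```python
# Hypothetical sketch of the Cognito user pool sign-up flow with boto3.
import boto3

idp = boto3.client("cognito-idp")

idp.sign_up(
    ClientId="EXAMPLE_APP_CLIENT_ID",
    Username="jane.doe@example.com",
    Password="CorrectHorseBatteryStaple1!",
    UserAttributes=[{"Name": "email", "Value": "jane.doe@example.com"}],
)

# Once the user enters the verification code that Cognito sends to them:
idp.confirm_sign_up(
    ClientId="EXAMPLE_APP_CLIENT_ID",
    Username="jane.doe@example.com",
    ConfirmationCode="123456",
)
```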

Finally, Elastic Beanstalk, which automatically deploys and runs apps on Amazon’s cloud infrastructure, has been updated with support for managed platform updates. Developers can now select a maintenance window, and the new feature will automatically update the environment to the latest platform version.

“The updates are installed using an immutable deployment model to ensure that no changes are made to the existing environment until the updated replacement instances are available and deemed healthy (according to the health check that you have configured for the application),” said Barr.
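Managed platform updates are driven by environment option settings. The following boto3 sketch shows one way they might be enabled; the environment name and maintenance window are placeholders, and depending on the environment a service role may also need to be configured.

```python
# Sketch: enable managed platform updates for an existing Elastic Beanstalk
# environment by setting the managed-actions option namespaces.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.update_environment(
    EnvironmentName="example-env",
    OptionSettings=[
        {"Namespace": "aws:elasticbeanstalk:managedactions",
         "OptionName": "ManagedActionsEnabled", "Value": "true"},
        {"Namespace": "aws:elasticbeanstalk:managedactions",
         "OptionName": "PreferredStartTime", "Value": "Sun:02:00"},
        {"Namespace": "aws:elasticbeanstalk:managedactions:platformupdate",
         "OptionName": "UpdateLevel", "Value": "minor"},
    ],
)
```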

Leadership restructure has little impact as VMware reports 5% growth

VMware has reported healthy growth during its Q1 earnings call, despite disruption in the management team over the period.

Revenues for the first quarter were reported at $1.59 billion, an increase of 5% in comparison to the same period in 2015, though license revenues saw a drop of 1% to $572 million. The company now expects second-quarter revenue of $1.66 billion to $1.71 billion, compared with analysts’ average estimate of $1.66 billion.

“Q1 was a good start to 2016, both for results and against our strategic goal of building momentum for our newer growth businesses and in the cloud,” said Patrick Gelsinger, CEO at VMware. “Our results were in line with our expectations for the period and support our outlook for the full year.”

Over the course of the period there may have been concerns surrounding changes in the leadership team, and how a restructure would impact the performance of the business as a whole. Carl Eschenbach announced last month he would be leaving his post as VMware President and COO to join venture capital firm Sequoia Capital as a partner. CFO Jonathan Chadwick also left the business in January.

Eschenbach joined the firm in 2002 as VP of Sales, was appointed co-President and COO in 2011 and eventually stand-alone President in 2012. During Eschenbach’s time at VMware, revenues grew from $31 million in 2002 to more than $6 billion in 2015. The changes in leadership do not appear to have stifled the company’s performance, as its cloud business units performed healthily over the first quarter.

“We think on the executive side, it really is the combination of being able to attract new players than – I mentioned Rajiv (Rajiv Ramaswami, GM, Networking and Security) we brought in a leader for China, Bernard (Bernard Kwok, Greater China President); we’ve been able to continue to attract talent,” said Gelsinger. “We’ve also had commented on our very strong bench, and – like Maurizio (Maurizio Carli, VP Worldwide Sales), we had brought him over from Europe a year plus ago to prepare for this eventuality, and so we had been grooming and preparing for these transitions.”

The company also reported healthy growth for its cloud business unit, including NSX, VSAN, End-User Computing and the vCloud Air Network. The company highlighted that standalone vSphere license bookings were less than 35% of total bookings, a figure which was more than 50% two years ago. The team claims this reduction demonstrates that the product offering has been successfully diversified.

“Turning to hybrid cloud. Total bookings for vCloud Air Network grew over 25% year-over-year,” said Zane Rowe, CFO at VMware. “We see significant interest from cloud and service providers around the world wanting to utilize our hybrid cloud technologies. For example, as Pat mentioned earlier, IBM will be delivering a complete SDDC offering based on VMware’s technologies across their expanded footprint of cloud data centres worldwide. vCloud Air also performed well in Q1 with large enterprise customer adoption.”

In terms of long-term strategy, Gelsinger outlined a three-point plan to facilitate VMware’s growth in the cloud market. Firstly, the business will consolidate its position in the private cloud space, a segment it describes as the ‘foundation of our business’. Secondly, through the vCloud Air service and the vCloud Air Network, the company aims to encourage its customers to extend their private clouds into the public cloud. And finally, it will focus on connecting, managing and securing end points across a range of public clouds, including Amazon Web Services and Microsoft Azure.

The top five in-demand cloud security skills for 2016


The cloud computing market continues to expand at a phenomenal rate. According to a recent report from IDC, worldwide spending on public cloud services will grow 19.4% annually through to 2019, six times the rate of overall IT spend growth, roughly doubling spending from $70 billion to $141 billion.

As more businesses transition to the cloud, demand will rise for IT professionals with the skills to make the most of the technology, in a market which, according to WANTED Analytics, already offers 18 million jobs worldwide. When considering this transition, a report from BT suggests security remains the number one concern for businesses looking to implement cloud technology. Therefore, IT professionals with cloud security skills will be more in demand than most.

To help you take advantage of this growing opportunity, take a look at this list of the top five in-demand cloud security skills.

Compliance

The instant a business makes use of a cloud storage or backup service, compliance becomes an issue. Moving data from internal to external storage requires a business to closely examine how that data is kept, to ensure it remains compliant with laws and industry regulations.

Understanding which data types you can and can’t move to the cloud, asking the right questions of providers and ensuring the correct terms are written into service level agreements (SLAs) are all critical to maintaining a business’s cloud compliance. Professionals who can demonstrate skills in these areas will be in huge demand, and crucial to any organisation using cloud storage services. Getting on the wrong side of cloud compliance can cost organisations millions in fines, not to mention the reputational damage it can cause.

Ethical hacking

One of the more exciting skills on the list is that of ethical hacking. Businesses are increasingly on the lookout for professionals with ethical hacking skills as they look to test the security of their private, public and hybrid cloud deployments.

These ethical hackers combine a range of tools with hacking and penetration techniques to search for weaknesses in a business’s systems. The Certified Ethical Hacker course from EC-Council is a great way to develop these skills, with new modules dedicated to cloud computing.

Platform specific knowledge

When it comes to selecting a cloud platform, businesses are actively seeking professionals who can demonstrate in-depth knowledge of the leading providers. This platform-specific knowledge is hugely important when it comes to security.

These professionals help the business choose a platform by understanding how cloud service providers implement security. This can include knowledge of everything from how each provider enforces physical security of its infrastructure and facilities through to the inbuilt security features that can be used once on a platform.

For leading providers like Amazon Web Services and Microsoft Azure, there are a range of cloud courses available to help develop this platform specific knowledge.

Communication

Interestingly, among the most in-demand cloud security skills is one entirely unrelated to the technologies themselves: communication. Both cloud and security involve complex technological concepts and jargon that require explanation. The ability to communicate these concepts and requirements to management and non-technical staff is a critical part of ensuring cloud security is correctly implemented in an organisation.

In fact, the (ISC)² Global Information Security Workforce Study places communication as the most important skill contributing to being a successful information security professional. Amazingly, 90% of the 14,000 respondents ranked it as number one.

Encryption

TalkTalk’s recent data breach cost the company £60m. The cause: cyber attackers intercepting unencrypted data.

Encryption remains one of the most effective ways to achieve data security. With the volume of data now being transmitted from businesses to the cloud, professionals who can apply strong encryption whilst also understanding the different cloud encryption services on offer will continue to be a valuable and sought-after asset.
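To make the idea concrete, here is an illustrative Python sketch of client-side encryption, i.e. encrypting a file before it is handed to any cloud storage service. It uses the third-party "cryptography" package as an assumption; the file names are placeholders, and safe key management (for example, via a key management service) is the real skill and is out of scope here.

```python
# Illustrative only: symmetric client-side encryption of a file before upload
# to cloud storage, using the "cryptography" package's Fernet recipe.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, store and retrieve this from a KMS
fernet = Fernet(key)

with open("customer-records.csv", "rb") as infile:
    ciphertext = fernet.encrypt(infile.read())

with open("customer-records.csv.enc", "wb") as outfile:
    outfile.write(ciphertext)
```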

Conclusion

These are some of the hottest security skills demanded of the modern cloud professional. Developing your skills in just one of these areas will open the door to new and exciting opportunities in 2016 and beyond. Why not start today?

Read more: The top five in-demand cloud skills for 2016

Threads of Digital Architecture | @CloudExpo #IoT #M2M #DigitalTransformation

The notion of customer journeys, of course, is central to the digital marketer’s playbook. Clearly, enterprises should focus their digital efforts on such journeys, as they represent customer interactions over time. But making customer journeys the centerpiece of the enterprise architecture leaves more questions than answers.

The challenge arises when EAs consider the context of the customer journey in the overall architecture as well as the architectural elements that make up each customer journey. After all, dividing up the world into familiar layers like process, data, and technology is relatively straightforward. There are overlaps and ambiguities at the edges to be sure, but everyone approaches such layers with a relatively common understanding of the overall scope of each layer and what belongs within each one.


Redundancy Doesn’t Equal Backup | @CloudExpo #Cloud

Currently, the preferred method of data protection for cloud giants such as Google is to replicate data across different locations (i.e., data centers) rather than performing a true backup. This is done because a true backup seems logistically too complicated given the amount of data these giants store. These companies have turned to replicated data because the assumed risk of all replicas being lost simultaneously is extremely slim. This risk assumption may be accurate, but it does not take into account unintentional data destruction.


Introducing the 2016 DZone Guide to Data Persistence | @CloudExpo #Cloud

Data persistence has a way of sneaking up on developers. You start out with a simple, straightforward database that can functionally hold the data you’re working with and the data you need to work with later. But as your needs change, you start to modify it here and there, until it becomes a brittle tangle of tables and keys and indexes. You need something more dynamic, but you need to retain the ability to retrieve your persistent data. Now you’re miles from where you started, and staring down the possibility of having to adopt a whole new system. Persistent data is fundamental to nearly any operation that utilizes a database management system (DBMS), but while the growth of dynamic data has led to sophisticated and reliable new relational management systems, real-world techniques for optimal data storage and retrieval have sometimes been lost in the shuffle. With that in mind, we’ve put together a new edition of our comprehensive DZone Guide to Data Persistence.


Digital Influencer @DHinchcliffe | @ThingsExpo #IoT #DigitalTransformation

In the digital arena, it’s impossible to avoid Dion Hinchcliffe. Perhaps you’ve seen one of his numerous keynotes or joined one of his workshops. Maybe you’ve read one of his books, Web 2.0 Architectures (with coauthors James Governor and Duane Nickull) or Social Business By Design (Peter Kim, coauthor). Or possibly you’ve spotted some of his steady stream of articles over the years, for ZDNet, ebizQ, and many others.


Telstra launches one-to-many Cloud Gateway offering

Australian telco Telstra has bolstered its position in the growing cloud market with the launch of Cloud Gateway.

Cloud Gateway is Telstra’s new solution which enables businesses to connect to multiple public cloud environments, providing a one-to-many “gateway” model via Telstra’s IP network.

“Most organisations don’t realise the full value of cloud out of a single service,” said Philip Jones, Global Products and Solutions at Telstra. “Instead, our customers are investing in sophisticated hybrid cloud environments, which come with their own range of fragmented networking challenges.

“These include managing multiple vendors, portals and contracts, while trying to maintain a high level of security, performance and operational efficiency. We believe that just because these solutions are sophisticated, doesn’t mean that they should also be complex. Cloud Gateway is Telstra’s simple way to connect multiple clouds, and create hybrid environments.”

The product offering will enable Australian customers to connect to Microsoft Azure, Office 365, AWS, IBM SoftLayer, and VMware vCloud Air, while international customers can only connect to AWS and IBM SoftLayer for the moment.

“Telstra is very well positioned to help customers with hybrid and multi-cloud strategies, as we bring the cloud and the network together,” said Jones. “The network is the fundamental piece of the puzzle that helps provide a secure and reliable application experience. Having a single touchpoint also helps reduce IT complexity, enabling our customers to maximise the benefits of investing in cloud.”

Telstra has been making moves in the cloud space in recent months, following the announcement of a cloud innovation centre in February. The centre was launched alongside partners AWS and Ericsson with a focus on accelerating the adoption of cloud technologies.

“Telstra’s vision is to build a trusted network service for mission critical cloud data, and we are excited to explore the opportunity of bringing this vision to life with Ericsson and AWS,” said Vish Nandlall, CTO of Telstra, at the time of the announcement. “The Cloud Innovation Center at Gurrowa intends to bring together cloud experts from Ericsson, AWS and Telstra to encourage cloud adoption and the development of new business opportunities for Telstra and our customers.”

Announcing @Stratoscale to Exhibit at @CloudExpo New York | #Cloud

SYS-CON Events announced today that Stratoscale, the software company developing the next generation data center operating system, will exhibit at SYS-CON’s 18th International Cloud Expo®, which will take place on June 7-9, 2016, at the Javits Center in New York City, NY.

Stratoscale is revolutionizing the data center with a zero-to-cloud-in-minutes solution. With Stratoscale’s hardware-agnostic, Software Defined Data Center (SDDC) solution to store everything, run anything and scale everywhere, IT teams are empowered to take control of their data centers. Stratoscale offers a hyperconverged cloud supporting OpenStack out of the box.


Microsoft enters the containers race

Microsoft has cashed in on one of the industry’s trending technologies, with the announcement of the general availability of the Azure Container Service.

The Microsoft container service, initially announced in September 2015 and released for public preview in February, is built on open source technology and offers a choice between the DC/OS and Docker Swarm orchestration engines.

“I’m excited to announce the general availability of the Azure Container Service; the simplest, most open and flexible way to run your container applications in the cloud,” said Ross Gardler, Senior Program Manager at Microsoft, on the company’s blog. “Organizations are already experimenting with container technology in an effort to understand what they mean for applications in the cloud and on-premises, and how to best use them for their specific development and IT operations scenarios.”

While the growth of container technology has been well documented in recent months, a number of industry commentators have been concerned about how well the technology is understood within enterprise organizations themselves. A recent survey from the Cloud & DevOps World event highlighted that 74% of respondents agreed with the statement “everyone has heard of containers, but no-one really understands what containers are.”

Aside from confusion surrounding the definition and use cases of containers, the Microsoft team believes the growth of the technology is being stunted by management and orchestration challenges. While the technology offers organizations numerous benefits, traditional means of managing it have proven to be ineffective.

“Azure Container Service addresses these challenges by providing simplified configurations of proven open source container orchestration technology, optimized to run in the cloud,” said Gardler. “With just a few clicks you can deploy your container-based applications on a framework designed to help manage the complexity of containers deployed at scale, in production.”

Alongside the availability announcement, Microsoft has also joined a new open source DC/OS project, enabling customers to use Mesosphere’s Data Center Operating System to orchestrate their container projects. The project brings together the expertise of more than 50 partners to drive usability within the software-defined economy.

The Docker Swarm version ensures any Docker-compliant tooling can be used with the service. Azure Container Service provides a ‘Docker native’ solution using the same open source technologies as Docker’s Universal Control Plane, allowing customers to upgrade as and when required.
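Since the Swarm version exposes a standard Docker endpoint, everyday Docker tooling works against it unchanged. The sketch below is a hedged illustration using the Docker SDK for Python; the tunnel address, image and port mapping are assumptions for illustration, with the endpoint typically reached over an SSH tunnel to the Swarm master.

```python
# Illustrative sketch: point the standard Docker SDK for Python at a
# (hypothetical) local tunnel to an ACS Swarm master and run a container.
import docker

client = docker.DockerClient(base_url="tcp://localhost:2375")

container = client.containers.run("nginx:latest", detach=True, ports={"80/tcp": 80})
print(container.id)
```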