[video] Protect Your Organization with BMC BladeLogic | @CloudExpo @BMCSoftware #Cloud

Digital initiatives create new ways of conducting business, which in turn bring increasingly advanced security and regulatory compliance challenges with exponentially more damaging consequences. In a 2016 BMC and Forbes Insights survey, 97% of executives said they expect a rise in data breach attempts in the next 12 months. Sixty percent said operations and security teams have only a general understanding of each other’s requirements, resulting in a “SecOps gap” that leaves organizations unable to mobilize to protect themselves. The result: many enterprises face unnecessary risks of data loss and production downtime.


Storage Wars: Cloud vs. the Card for Storing Mobile Content

In May, Samsung announced what it describes as the world’s highest-capacity microSD card. The Samsung EVO+ 256GB microSD card has enough space to store more than 55,000 photos, 46 hours of HD video or 23,500 MP3 files and songs. It can be used for phones, tablets, video cameras and even drones. It’s set to be available in 50 countries worldwide.

The announcement of Samsung’s new card comes at a time when the amount of mobile content that consumers are creating, consuming and storing on their smartphones and mobile devices is increasing at an exponential rate. The growing number of connected devices with advanced features, including high-resolution cameras, 4K video filming and faster processors, is fuelling a global ‘content explosion’. The content being created today is richer and heavier than ever, placing a growing strain on device storage capacities, which can compromise data and impair the user experience.

Earlier this year, 451 Research and Synchronoss Technologies charted the growth of smartphone content and found that the average smartphone user now generates 911MB of new content every month. At this rate, a typical 16GB smartphone – which already has almost 11GB of user content on it – will fill up in less than two months. Given that a high proportion of smartphone owners have low-capacity devices (31% have 16GB models; 57% have 32GB or smaller), many will – if they haven’t already – quickly find themselves having to make difficult decisions. At the moment, this means frequently having to remove photos, videos and apps to make room for new ones.
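As a rough back-of-the-envelope check on that figure, the sketch below reproduces the arithmetic; the operating system overhead is an assumption made for illustration, not a number from the research.

```python
# Rough arithmetic behind the "fills up in less than two months" claim.
# The OS/system overhead figure is an assumption, not one quoted by 451 Research.
nominal_capacity_gb = 16
system_overhead_gb = 3.5          # assumed space taken by the OS and preinstalled apps
existing_content_gb = 11          # "almost 11GB of user content"
new_content_gb_per_month = 0.911  # 911MB of new content every month

free_gb = nominal_capacity_gb - system_overhead_gb - existing_content_gb
months_until_full = free_gb / new_content_gb_per_month
print(round(months_until_full, 1))  # ~1.6 months on these assumptions
```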

It’s also surprising that almost half of smartphone users have no off-device storage in place at all, despite the variety of storage options available. One option is a hardware solution like a memory card. Samsung claims its new microSD card delivers a seamless user experience when accessing, sharing and storing content between different devices (depending on compatibility, of course). Samsung’s suggested price for this experience is $250. However, there is another storage option for end-users: the cloud.

Cloud-based storage arguably provides a more flexible and secure method for end-users to back up, transfer and restore their precious content. A memory card, like a phone, can be damaged, lost or stolen. In contrast, the cloud is an ever-present secure repository that retains and restores consumers’ files, photos and media, even if they lose or damage their device or card. However, even in the US, the most mature market for consumer uptake of cloud storage services, more than half of smartphone users are not currently using the cloud to manage their smartphone content.

But why should operators care? 

Subscriber loyalty to operators is being tested. Rather than receive a subsidised handset as part of a contract with an operator, growing numbers of people purchase their devices directly from the manufacturer, often on a regular subscription agreement. Rather than commit to a long-term contract, these consumers enter into no-obligation, rolling, connectivity-only agreements with their operator.

Offering consumers access to a personal cloud platform is an important opportunity for operators to re-engage with these consumers and keep them tied to their services. Helping subscribers manage the spiralling volumes of their content could be much more effective for operators than faddish offers and promotional bundles to keep subscribers connected to their brand and their ecosystem.

There is already plenty of cloud competition in the market – Google Drive, iCloud, Dropbox and Box among them – but hosted storage and access has the potential to be much more than a “me too” play for operators, or even an answer to churn.

Cloud services can be a viable revenue generator for operators in their own right. They equip operators with an attractive channel for brand partnerships and developers to reach subscribers with an expanded ecosystem of services. Considerable productivity and profitability benefits can also be found, including reducing time on device-to-device content transfer and freeing up operators’ in-store staff for more in-depth customer engagement.

Operators shouldn’t approach the provision of cloud technology with unease. After all, their core business is all about providing secure wireless transport for voice – and, increasingly, data – quickly, at scale, and to a wide range of mobile phones and other connected devices. Cloud storage and access is the natural extension of this business. Of course, given the current climate of heightened awareness around privacy and security, it’s crucial to work with a vendor with a strong track record. However, operators should realise they’re in a stronger position than they think when it comes to providing cloud services.

Written by Ted Woodbery, VP, Marketing & Product Strategy at Synchronoss Technologies

What does Clinton have in store for the tech industry?

Hillary Clinton has recently released her campaign promises for the technology sector should she be elected as President Obama’s successor in November, reports Telecoms.com.

The technology agenda focused on a vast and varied number of issues within the technology industry, including the digital job front, universal high-speed internet for the US, data transmission across jurisdictions, technological innovation and the adoption of technology in government. Although the statement does indicate a strong stance on moving technology to the top of the political agenda, there does seem to be an element of ‘buzzword chasing’ to gain the support of the country’s tech giants.

“Today’s dynamic and competitive global economy demands an ambitious national commitment to technology, innovation and entrepreneurship,” the statement read. “America led the world in the internet revolution, and, today, technology and the internet are transforming nearly every sector of our economy—from manufacturing and transportation, to energy and healthcare.”

But what did we learn about America’s technology future?

Focus on 5G and new technologies

One of the more prominent buzzwords through the beginning of 2016 has been 5G, seemingly the go-to phrase for the majority of new product launches and marketing campaigns. The Clinton team has aligned itself with the buzz by committing to deploying 5G networks (no timeframe is given), as well as opening up opportunities for a variety of next-generation technologies.

“Widely deployed 5G networks, and new unlicensed and shared spectrum technologies, are essential platforms that will support the Internet of Things, smart factories, driverless cars, and much more—developments with enormous potential to create jobs and improve people’s lives,” the statement said.

The deployment of 5G has been split into two separate areas. Firstly, use of the spectrum will be reviewed with the intention of identifying underutilized bands, including those reserved for the government, and reallocating them to improve the speed of deployment. Secondly, government research grants will be awarded to various vendors to advance wireless and data technologies directed towards social priorities, including healthcare, the environment, public safety and social welfare.

A recent report from Ovum highlighted that the US is on the right track for the deployment of 5G, with the analyst team believing it will be one of the leading countries for the technology. Ovum analysts predict there will be at least 24 million 5G subscribers by the end of 2021, of which 40% will be located in North America.

Data Transmission between US and EU

From a data transmission perspective, the Clinton team is seemingly taking issue with the European Court of Justice’s decision to strike down Safe Harbour, and the varied reception for the EU-US Privacy Shield. It would appear the Clinton team is under the assumption the deal between the EU and US was struck down for economic reasons, as opposed to data protection.

“The power of the internet is in part its global nature. Yet increasing numbers of countries have closed off their digital borders or are insisting on “data localization” to attempt to maintain control or unfairly advantage their own companies,” the statement said. “When Hillary was Secretary of State, the United States led the world in safeguarding the free flow of information including through the adoption by the OECD countries of the first Internet Policymaking Principles.

“Hillary supports efforts such as the U.S.-EU Privacy Shield to find alignment in national data privacy laws and protect data movement across borders. And she will promote the free flow of information in international fora.”

While it could be considered encouraging that the mission of the Clinton team is to open up the channels between the two regions again, it does seem to have missed the point of why the agreement was shot down in the first place. The statement seemingly implies EU countries refused the agreement on the grounds of protecting their own economic interests, as opposed to privacy concerns and the US attitude to government agencies’ access to personal information.

Safe Harbour, the initial transatlantic agreement, was shot down last October, though its proposed successor has come under similar criticism. Only last month the European Data Protection Supervisor, Giovanni Buttarelli, outlined concerns over whether the proposed agreement will provide adequate protection against indiscriminate surveillance, as well as over its obligations on oversight, transparency, redress and data protection rights.

“I appreciate the efforts made to develop a solution to replace Safe Harbour but the Privacy Shield as it stands is not robust enough to withstand future legal scrutiny before the Court,” said Buttarelli. “Significant improvements are needed should the European Commission wish to adopt an adequacy decision, to respect the essence of key data protection principles with particular regard to necessity, proportionality and redress mechanisms. Moreover, it’s time to develop a longer term solution in the transatlantic dialogue.”

The Clinton team can continue to discuss changes to transatlantic data transmission policy should it choose, however it is highly unlikely any positive moves will be made until it gets to grips with the basic concerns of EU policymakers.

Access to Government data

Currently there are certain offices and data sets which are accessible to the general public, though this is an area which would be expanded under a Clinton administration. The concept is a sound one: giving entrepreneurs and businesses access to the data could provide insight into how money could be saved or used more efficiently, or how new technologies could be implemented to improve the effectiveness of government. There could, however, be a downside.

“The Obama Administration broke new ground in making open and machine-readable the default for new government information, launching Data.gov and charging each agency with maintaining data as a valuable asset,” the statement said. “Hillary will continue and accelerate the Administration’s open data initiatives, including in areas such as health care, education, and criminal justice.”

The downside has the potential to ruin any politician. The program opens the door to criticism from all sides, and will offer ammunition to any opposition.

Connecting American Citizens

One of the document’s main areas of focus was the commitment to ensuring each household and business has the opportunity to be connected to high-speed broadband. While this could be considered an effective sound bite for the party, it is not a new idea by any means. A recent report highlighted that a surprising number of Americans do not currently have access to broadband. Although it might be expected that those in rural communities would struggle at times, the report indicated 27% of New York and 25% of Los Angeles residents respectively would be classed in the “Urban Broadband Unconnected” category, which could be considered more unusual.

The Connect America Fund, Rural Utilities Service Program and Broadband Technology Opportunities Program are all well-established operations (the Rural Utilities Service Program has been around since 1935) that previous presidents have also banged the drum for. Clinton has said very little new here and has made little fresh commitment to the initiatives.

The team has, however, committed to a $25 billion Infrastructure Bank which will enable local authorities to apply for grants to make improvements. This is a new concept which Clinton plans to introduce, though how it will be funded, what the criteria for applications will be, and whether there are any stipulations on which vendors the money can be spent with, are not detailed.

Understanding UK cloud adoption strategies: Where are we currently at?


Last week, as part of London Technology Week, we held a breakfast panel discussion looking at cloud adoption strategies and overcoming cloud security and compliance challenges. Industry experts from Cisco, techUK and Behind Every Cloud, as well as cloud users, joined us to discuss and explore adoption strategies for ensuring success. In particular, we wanted to understand what is working in the UK and what is not. What are companies aiming to achieve, and where are they in their cloud adoption journeys?

Simon Herbert from Cisco kicked off the discussion by saying that companies will undoubtedly be at different stages, but what is important is to understand the priorities and desired outcomes for the business. Simon went on to say that it’s about understanding where you are and where you need to be, and developing a cloud adoption strategy in line with this. In Herbert’s eyes, moving to the cloud is not simply a matter of flicking a switch; it requires a more considered approach.

Sue Daley from techUK wholeheartedly agreed and said that at techUK they are focused on helping organisations optimise cloud and make the most of their cloud investment. Daley went on to say that here in the UK we have a very vibrant cloud computing market, with lots of opportunities to use cloud and a strong appetite amongst UK businesses.

Sue talked about the six key areas in techUK’s Cloud 20/20 vision aimed at keeping the UK at the forefront of cloud adoption: enabling data portability and system interoperability within the cloud computing ecosystem; building trust in the security of cloud services; making sure we have the right regulatory environment to support cloud; addressing the culture change that cloud brings; ensuring effective public sector adoption; and, lastly, making sure that here in the UK there is a communications infrastructure ready for mass cloud adoption.

Ray Bricknell from Behind Every Cloud talked about the fact that many of the companies he works with are mid-tier financial services companies who are not ‘born in the cloud’ and therefore not adopting cloud as quickly as you would expect.  He said that often there is a mandate from the CEO to move everything to the cloud but the reality is that many of these organisations are still at early stages.  They are trying to figure out which workloads to prioritise and who they should work with, often driven by a specific project rather than a big bang approach.

Krisztian Kenderesi from Bluestone Financial Services Group, an iland customer, agreed and said that often the cloud decision is not an IT decision but a business one.  He went on to say that Bluestone is using iland’s Disaster Recovery-as-a-Service and again that was driven by a specific need at the time.

As a panel we discussed some of the barriers to adoption and why some companies are not using cloud. Without a doubt, data privacy, data protection, security and compliance are some of the main reasons. Bricknell suggested that one of the biggest barriers is a perception issue around cloud. He went on to say that it is an urban myth that cloud is cheaper, but when an organisation considers the security, scalability and availability available with the right cloud solution, the cost of achieving the same in-house becomes prohibitively expensive.

Kenderesi replied that at Bluestone they were motivated by cost saving initiatives as well as a ‘cloud-first’ mentality. He also went on to say that by using a DRaaS provider like iland a company could reduce overall in-house costs by 40% and achieve much better recovery points.

The conversation moved on to managing the cloud. Bricknell said that many of the companies he has worked with often don’t expect the move to the cloud to involve such a big management task to integrate information from multiple cloud solutions. He commented that a lot of vendors don’t want to tell you what is going on under the hood, and stressed how important it is to have transparency with your cloud provider. Daley stated that interoperability and data portability in the cloud ecosystem are growing issues being discussed.

The panel concluded by summing up what IT leaders should look for when transitioning to cloud. Transparency and visibility are key. Shadow IT is prevalent, so interoperability and compliance with other vendors need to be considered. Also, imminent data protection changes around liability and data controls will introduce a lot more data protection requirements. We all agreed that when cloud becomes more strategic you need a much more open and trusted relationship with multiple providers, whereas when it is ad hoc, it doesn’t matter so much if the provider you work with is more proprietary.

Where security is concerned, the online threat environment is constantly evolving, and therefore again you need an open and trusted relationship with your cloud service provider in order to constantly adjust to new threats. At the same time, providers must tell their customers of any issues or breaches so that they are correctly dealt with at the time. And finally, it is highly unlikely that an organisation will work with just one provider; most companies will spread their risk across multiple vendors, which is why visibility into the service is absolutely key.

NTT makes play for IaaS

NTT Communications has announced the deployment of managed private cloud solutions to HPE and NTT customers in the US in a play for the IaaS market.

Although not hitting the headlines as regularly as competitors such as AWS, Google and Microsoft, NTT has been recognized in the IaaS market by Gartner, and already has a strong presence in the US. Although noted as a niche player in the IaaS segment, NTT offers two platforms to global customers: NTT Enterprise Cloud and Cloudn. Gartner has noted that NTT does little to differentiate itself from the rest of the market, though it does have a healthy ecosystem of partners to compensate.

The new proposition will enable joint customers of HPE and NTT to purchase the company’s IaaS portfolio solutions, including cloud migration services, data centre consolidation, managed infrastructure services, and disaster recovery-as-a-service. The NTT team claim it is one of HPE’s first service provider partners capable of providing managed private cloud environments using the new HPE Helion CloudSystem.

“NTT America, the U.S. subsidiary of NTT Com, provides flexible, agile and cost-effective private hybrid cloud solutions to the NTT Com and HPE customer base,” said Indranil Sengupta, Regional Vice President of Product Management at NTT America. “These solutions can be delivered at NTT Com data centres, customer premises or at third party data centres.

“The solution architecture allows customers to leverage their current investments and augment with additional services that they need to run their business efficiently. All of NTT Com’s cloud solutions focus on the five key considerations of security, compliance, migration, legacy integration and change management.”

While a niche player in the market, the move could represent a strategic win for the NTT team, which already has a healthy reputation in North America and a growing customer base. It would also be considered a timely move, as trends in the industry are leaning more towards multi-cloud propositions, where decision makers are more open to working with different cloud providers for different workloads and data sets.

5G will be commercially ready by 2021 – Ovum

Analyst firm Ovum has released its inaugural 5G Subscription Forecasts this week, which estimates there will be 24 million 5G subscriptions worldwide at the end of 2021, reports Telecoms.com.

The team at Ovum believe 5G commercial services will be a normalized aspect of the telco landscape by 2020, though the market will be dominated by operators in North America and Asia, which will account for as much as 80% of global 5G subscriptions by 2021. Europe would only take 10% of the pie, with the Middle East and Africa splitting the remaining 10%.

While 5G could be considered an advertising buzzword within the telco industry on the whole, the need for speed has been driven primarily by advancements in user device capabilities. Claims to be the first vendor to have achieved 5G status are not uncommon; Nokia was the latest to make such a statement, claiming its 5G network stack, which combines radio access technology with a cloud-based packet core running on top of an AirFrame data centre platform, is the foundation of a commercial 5G architecture. This in itself is a bold claim, as 5G standards are yet to be fully ratified.

“The main use case for 5G through 2021 will be enhanced mobile broadband services, although fixed broadband services will also be supported, especially in the US,” said Mike Roberts, Ovum Practice Leader for carrier strategy and technology. “Over time 5G will support a host of use cases including Internet of Things and mission-critical communications, but Ovum does not believe those use cases will be supported by standardized 5G services through 2021.”

While there have been plenty of announcements claiming organizations are 5G-ready, a number of these have been excluded from the research. Numerous operators have claimed they will be launching 5G prior to the 2020 date stated in the research, though these are services which will generally be deployed on networks and devices which don’t comply with 5G standards. These examples are seemingly nothing more than advertising ploys playing on the excitement of 5G. Ovum defines a 5G subscription as an active connection to a 5G network via a 5G device. 5G is further defined as a system based on and complying with 3GPP 5G standards.

In the UK, Ofcom has stated 5G services could be commercially available by 2020, though this claim has not been backed up by the team at Ovum. While it has not directly commented on the state of play in the UK, Ovum believe the majority of subscribers will be located in the US, Japan, China, and South Korea, countries where the operators have set more aggressive deadlines for the introduction of 5G services.

One area which has not been cleared up is the future spectrum requirements of 5G. Use cases for the high-speed networks have already been outlined, ranging from scientific research to financial trading and weather monitoring, though how the supply of spectrum will be split between immediate and future needs is still unknown.

“5G must deliver a further step change in the capacity of wireless networks, over and above that currently being delivered by 4G,” said Steve Unger, Group Director at Ofcom, on the regulator’s website. “No network has infinite capacity, but we need to move closer to the ideal of there always being sufficient capacity to meet consumers’ needs.”

A disaster recovery plan: What is your IT team keeping from you?


Your disaster recovery program is like a parachute – you don’t want to find yourself in freefall before you discover it won’t open. But amid hastening development cycles, and cost, resource and time pressures, many CIOs are failing to adequately prioritise DR planning and testing.

While IT teams are running to stand still with day-to-day responsibilities, DR efforts tend to be focused solely on infrastructure, hardware and software, neglecting the people and processes needed to execute the plan. At best, this runs the risk of failed recovery testing. At worst, a business may be brought to its knees at a time of actual disaster without any chance of a swift recovery.


Your team may be reluctant to flag areas of concern, or admit that they aren’t confident your DR plan will work in practice. Perhaps they’re relying on the belief that “disaster” is a statistically unlikely freak of nature (we all know hurricanes hardly ever happen in Hertford, Hereford and Hampshire) rather than a mundane but eminently more probable hardware failure or human error. It’s possible that at least one of these admissions may be left unspoken in your own organisation:

“We’re not confident of meeting our RTOs/RPOs”

Even if you passed your last annual DR test, it’s only a predictor of recovery, not a guarantee. Most testing takes place under managed conditions and takes months to plan, whereas in real life, outages strike without notice. Mission-critical applications have multiple dependencies that change frequently, so without ongoing tests, a recovery plan that worked only a few months ago might now fail to restore availability to a critical business application.
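By way of illustration, here is a minimal sketch of how a team might track what a test actually achieved against its recovery time objectives (RTOs) and recovery point objectives (RPOs); every application name and figure below is invented for the example.

```python
# Hypothetical RTO/RPO check: compare what a DR test achieved against the
# objectives agreed with the business. All names and figures are illustrative.
objectives = {
    # app: maximum tolerable downtime (RTO) and data loss (RPO), in minutes
    "order-system": {"rto_min": 60,  "rpo_min": 15},
    "reporting":    {"rto_min": 480, "rpo_min": 240},
}

last_test = {
    "order-system": {"downtime_min": 95,  "data_loss_min": 10},
    "reporting":    {"downtime_min": 320, "data_loss_min": 60},
}

for app, target in objectives.items():
    result = last_test[app]
    rto_met = result["downtime_min"] <= target["rto_min"]
    rpo_met = result["data_loss_min"] <= target["rpo_min"]
    print(f"{app}: RTO {'met' if rto_met else 'missed'}, RPO {'met' if rpo_met else 'missed'}")
```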

“Our DR plan only scratches the surface”

Many organisations overlook the impact of disruption on staff and the long-term availability of their data centres. How long you can support an outage at your recovery centre – whether that’s days or weeks – will determine your DR approach. Can you anticipate what you would do in a major disaster if you lost power, buildings or communication links? What if you can’t get the right people to the right places? How well is everyone informed of procedures and chains of command? People and processes are as relevant as technology when it comes to rigorous DR planning.

“We know how to fail over… just not how to fail back”

Failback – reinstating your production environment – can be the most disruptive element of a DR execution, because most processes have to be performed in reverse. Yet organisations often omit the process of testing their capabilities to recover back to the primary environment. When push comes to shove, failure to document and test this component of the DR plan could force a business to rely on its secondary site for longer than anticipated, adding significant costs and putting a strain on staff.
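One way to picture the “in reverse” point: if failover is an ordered list of actions, failback is broadly the same list walked backwards, with each action inverted. The steps below are a simplified, hypothetical sketch rather than a complete procedure.

```python
# Each tuple pairs a failover action with its failback inverse (hypothetical steps).
steps = [
    ("quiesce the primary site",                 "resume services at the primary site"),
    ("promote the replica at the recovery site", "demote the replica back to standby"),
    ("redirect traffic to the recovery site",    "redirect traffic back to the primary site"),
]

def plan(direction):
    """Return the ordered actions for 'failover' or 'failback'."""
    if direction == "failover":
        return [forward for forward, _ in steps]
    # Failback: the same steps in reverse order, using the inverse action each time.
    return [inverse for _, inverse in reversed(steps)]

print(plan("failover"))
print(plan("failback"))
```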

“Our runbooks are a little dusty”

How often do you evaluate and update your runbooks? Almost certainly not frequently enough. They should contain all the information your team needs to perform day-to-day operations and respond to emergency situations, including resource information about your primary data centre and its hardware and software, and step-by-step recovery procedures for operational processes. If this “bible” isn’t kept up to date and thoroughly scrutinised by key stakeholders, your recovery process is likely to stall, if not grind to a halt.
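As an illustration only, a runbook entry might capture something like the following; the fields, names and review window are assumptions rather than a prescribed format.

```python
from datetime import date, timedelta

# Hypothetical runbook entry for a single application.
runbook_entry = {
    "application": "order-processing",
    "primary_datacentre": {"site": "DC-East", "hosts": ["app01", "db01"]},
    "dependencies": ["payments-api", "auth-service"],
    "recovery_steps": [
        "Confirm the scope of the outage and declare disaster via the escalation chain",
        "Restore the latest database backup at the recovery site",
        "Start the application tier and verify connectivity to dependencies",
        "Redirect DNS/load balancing to the recovery site",
    ],
    "owners": {"primary": "ops-oncall", "escalation": "head-of-infrastructure"},
    "last_reviewed": "2016-05-01",
}

def is_stale(entry, max_age_days=90):
    """Flag runbook entries that have not been reviewed recently."""
    last = date.fromisoformat(entry["last_reviewed"])
    return date.today() - last > timedelta(days=max_age_days)

print(is_stale(runbook_entry))
```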

“Change management hasn’t changed”

Change is a constant of today’s highly dynamic production environments, in which applications can be deployed, storage provisioned and new systems set up with unprecedented speed. But the ease and frequency with which these changes are introduced means they’re not always reflected in your recovery site. The deciding factor in a successful recovery is whether you’ve stayed on top of formal day-to-day change management so that your secondary environment is in perfect sync with your live production environment.
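A minimal sketch of that deciding factor, assuming each site can export a simple map of components to versions or configuration hashes; the component names and versions are invented.

```python
# Report components that are missing or out of sync at the recovery site.
def config_drift(primary, recovery):
    drift = {}
    for component, expected in primary.items():
        actual = recovery.get(component)
        if actual != expected:
            drift[component] = {"primary": expected, "recovery": actual}
    return drift

primary_site  = {"app-server": "v2.4.1", "db-schema": "migration-118", "lb-config": "hash-9f2c"}
recovery_site = {"app-server": "v2.3.0", "db-schema": "migration-118"}

print(config_drift(primary_site, recovery_site))
# {'app-server': {'primary': 'v2.4.1', 'recovery': 'v2.3.0'},
#  'lb-config': {'primary': 'hash-9f2c', 'recovery': None}}
```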

“Our backup is one size fits all”

In today’s increasingly complex IT environments, not all applications and data are created equal. Many organisations default to backing up all their systems and both transactional and supportive records en masse, using the same method and frequency. Instead, applications and data should be prioritised according to business value: this allows each tier to be backed up on a different schedule to maximise efficiency and, during recovery, ensures that the most critical applications are restored soonest.
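To make the tiering idea concrete, here is a sketch under assumed tiers and schedules; none of the applications, recovery point objectives or methods below come from the article.

```python
# Assumed tiering policy: a lower restore_order means more critical, restored sooner.
backup_policy = {
    "tier-1": {"rpo_hours": 1,   "method": "continuous replication", "restore_order": 1},
    "tier-2": {"rpo_hours": 24,  "method": "nightly disk backup",    "restore_order": 2},
    "tier-3": {"rpo_hours": 168, "method": "weekly archive",         "restore_order": 3},
}

applications = {
    "payment-processing": "tier-1",   # transactional, revenue-critical
    "crm":                "tier-2",
    "internal-wiki":      "tier-3",   # supportive records
}

def restore_sequence(apps, policy):
    """Order applications so the most critical are restored first."""
    return sorted(apps, key=lambda name: policy[apps[name]]["restore_order"])

print(restore_sequence(applications, backup_policy))
# ['payment-processing', 'crm', 'internal-wiki']
```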

“Backing up isn’t moving us forward”

Backups are not, in isolation, a complete DR solution, but data management is a critical element of a successful recovery management plan. Whether you’re replicating to disk, tape or a blend of both, shuttling data between storage media is achingly slow. And if it takes forever to move and restore data, then regular testing becomes even less appealing. But foregoing a regular test restoration process simply because of time-to-restore concerns is a recipe for data loss in the event of an outage.
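A back-of-the-envelope illustration of why that matters; the data volume and throughput figures are assumptions, not benchmarks for any particular storage medium.

```python
# Estimate the hours needed to restore a given volume of data
# at a given sustained throughput.
def restore_hours(data_tb, throughput_mb_per_s):
    return (data_tb * 1024 * 1024) / throughput_mb_per_s / 3600

print(round(restore_hours(10, 100), 1))  # ~29.1 hours for 10TB at an assumed 100 MB/s
print(round(restore_hours(10, 500), 1))  # ~5.8 hours at an assumed 500 MB/s
```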

“We don’t have the bandwidth for testing”

Testing the recovery procedures of individual applications is a whole different ballgame from recreating a data centre from scratch. Trying to squeeze the whole exercise into a 72-hour testing window won’t do – that’s just enough time to marshal the right employees and ask them to participate in the test when it’s not part of their core function. So companies often end up winging it with whatever resources they have on hand, rather than mapping out the people they need to conduct and validate a truly indicative test.

“We don’t want to do it…but we’re not keen on someone else doing it”

Trying to persuade employees that an outsource option for recovery is in their best interests can be like selling Christmas to turkeys. 


But in fact, partnering with a recovery service provider actively complements in-house skill sets by allowing your people to focus on projects that move your business forward rather than operational tasks. It is also proven to boost overall recoverability. Managed recovery doesn’t have to be an all-or-nothing proposition, either, but a considered and congruous division of responsibilities.

With always-on availability becoming a competitive differentiator, as well as an operational must-have, you don’t have the luxury of trusting to luck that your DR plans will truly hold up in the event of a disaster.

The first step to recovery starts with admitting you have a problem and asking for help.

Read more: Will your current DR plans truly hold up in the event of a disaster?

Chromebooks Are The Next Best Thin Client

When Google introduced the Chromebook, many people thought that it was not a good idea to build laptops specifically for browser use. The lack of traditional storage, incompatibility with Windows applications, and need for a continuous Internet connection were some of the limitations of this product. However, Chromebooks have surprised everyone with their rising sales. […]

The post Chromebooks Are The Next Best Thin Client appeared first on Parallels Blog.

Architecting Change | @CloudExpo #Cloud #Agile #DigitalTransformation

In order to deal with disruptive business environments – as well as introducing disruption intentionally to shake up the competition – organizations must become better at dealing with change generally.
In other words, change itself must become a core competency.

Furthermore, the EA should play a central role in this pursuit of greater business agility, as there are many different types of change that require different approaches. Taking a difficult, multifaceted problem and breaking it up into its core pieces in order to apply appropriate solutions, after all, is a core strength of the EA.
