What Is ADC Delivery and How Does It Bring Value To IT organizations?

In today’s dynamically changing business world, organizations are required to provide 24/7 access to resources. Speed and efficiency have become essential requirements. With the advent of smartphones, virtual offices have taken a new shape. Moreover, a versatile range of devices has come into the network. Businesses not only have to provide continuous delivery of […]

The post What Is ADC Delivery and How Does It Bring Value To IT organizations? appeared first on Parallels Blog.

What did BCN readers say last week?

Over the past week, we took the opportunity to gauge the opinion of the BCN readership on industry trends and issues through a number of polls. Here’s what we found out:

Will Microsoft’s lawsuit be successful? 58% say no

For the most part, Microsoft’s lawsuit has been kept out of the headlines. This is unlikely to indicate the whole episode is unimportant to the industry; more likely, the story has been overshadowed by the ongoing saga between Apple and the FBI.

In any case, Microsoft filed a lawsuit against the US government, citing the First and Fourth Amendments with regard to government agencies using secrecy orders to access its customers’ emails or records. From Microsoft’s perspective, the company should have the right to tell customers the government is accessing their data, except in exceptional circumstances. The government disagrees.

While the tech giant has taken it upon itself to fight the good fight alone, BCN readers are a bit more sceptical about the venture’s chances of success. Only 42% believe Microsoft’s lawsuit will be successful, though this is a question which is unlikely to be answered for a substantial period of time. Any decision will be appealed by the losing party, dragging out any ruling or change in government practice.

When will containers hit the mainstream? 21% say right now

Containers are one of the hottest trends of 2016. We recently ran a buzzword-buster article discussing not only what containers actually are, but more importantly what their value to the enterprise is. Since then there have been numerous announcements focused on the technology, from Microsoft to Red Hat to Juniper, indicating containers are starting to gain some traction.

But how much of the press is a smoke-screen and how much is reality? In short, it’s looking quite positive.

Cloud took a healthy amount of time to be trusted and understood by the mainstream market, and perhaps it is this longer adoption period which has accelerated containers as a technology. 21% of BCN readers said they are already using the technology in a meaningful way in their business, 50% believe they will be within the next 1-2 years, and only 29% said it will take longer than three years.

Who is the best innovator in the cloud industry? 75% still say AWS

Last week AWS launched a host of new features at the AWS Chicago Summit, ranging from new security features and tools which simplify the movement of data around an organization’s cloud, to platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, testing features and authentication services.

Although this is the first major update from AWS in some time, Google and Microsoft have been feverishly bolstering their offerings over the last six months, ranging from new hires to new features and new acquisitions. Industry insiders have even told us at BCN that AWS could be seen to be sitting back too much, offering Google and Microsoft the opportunity to improve their own standing and make up ground on the number one player in the cloud space.

BCN readers do not agree, however. 75% believe AWS is still far and away the industry leader, 10% believe AWS, Google and Microsoft are all on par, while 15% believe innovation has faltered at AWS and the rest of the industry is catching up fast.

Is DevOps mainstream? 48% say no

DevOps is another of the buzzwords which has floated over from 2015 into 2016. However, as buzzwords go, few have captured the attention of the industry in the same manner. Such is the prominence of DevOps, it seems as though every company is now a DevOps specialist, DevOps expert or DevOps-orientated organization.

In fact, it isn’t only vendors who have adopted DevOps; pretty much every enterprise decision maker has DevOps on their lips too. The main concern here is that the definition of DevOps appears lost on certain organizations. Yes, there are genuine practitioners of the methodology, but there are also a host of people who have taken up the phrase without fully understanding its implications or the means to implement such an idea.

And it would appear BCN readers agree with that assumption. Despite DevOps being one of the most used words in the cloud industry, only 52% of our readers believe DevOps has hit the mainstream market.

Verizon launches NFV OpenStack cloud deployment over five data centres

Verizon has completed the launch of its NFV OpenStack cloud deployment project across five of its US data centres, alongside Big Switch Networks, Dell and Red Hat.

The NFV project is claimed to be the largest OpenStack deployment in the industry, and Verizon is currently expanding it to a number of domestic data centres and aggregation sites. The company also expects the deployment to be adopted in edge network sites by the end of the year, as well as at a number of Verizon’s international locations, though a time-frame for the international sites was not disclosed.

“Building on our history of innovation, this NFV project is another step in building Verizon’s next-generation network – with implications for the industry,” said Adam Koeppe, VP of Network Technology Planning at Verizon. “New and emerging applications are highlighting the need for collaborative research and development in technologies like NFV. We consider this achievement to be foundational for building the Verizon cloud that serves our customers’ needs anywhere, anytime, any app.”

Verizon worked with Big Switch Networks, Dell and Red Hat to develop the OpenStack pod-based design, which went from idea to deployment of more than 50 racks in five data centres in nine months. The design includes a spine-leaf fabric for each pod, controlled through a Neutron plugin to the Red Hat OpenStack Platform. The multi-vendor project uses Big Switch’s SDN controller software to manage Dell switches, orchestrated by the Red Hat OpenStack Platform.

“Dell’s Open Networking initiative delivers on the promise of bringing innovative technology, services and choice to our customers and Verizon’s NFV project is a testament to that vision,” said Tom Burns, GM of Dell’s networking business unit. “With the open source leadership of Red Hat, the SDN expertise of Big Switch and the infrastructure, service and support at scale from Dell, this deployment demonstrates a level of collaboration that sets the tone for the Open Networking ecosystem. This is just the beginning.”

Juniper boosts security capabilities with two new product offerings

Juniper Networks has launched a number of new cloud and virtualised service offerings as part of its software-defined secure networks framework.

The new offerings include a new containerised virtual firewall called cSRX and a multi-core version of the Juniper Networks vSRX. The company claims the new vSRX version is ten times faster than its nearest competitor and creates new possibilities for agile and flexible virtual firewalls, while cSRX is claimed to be the industry’s first containerised firewall.

“As the security landscape continues to evolve, it is more important than ever to work together to combat cyber threats,” said Kevin Walker, Security CTO at Juniper Networks. “These key additions to our security portfolio will further our Software-Defined Secure Networks vision and greatly benefit our customers. Our products provide the best opportunity to create secure networks through policy, detection and enforcement. We are excited to be releasing the most flexible firewall solutions in the market and continue to showcase our commitment to bringing SDSN to organisations across the globe.”

Juniper believes the faster vSRX offering and the scalability of the containerised cSRX, combined with the higher density of services on the Intel Xeon processor family, will increase an organization’s capability to detect threats.

“Juniper Networks is delivering significant scale and total cost of ownership advantages to its customers with the new cSRX, which fundamentally changes how security is deployed and illustrates the power of Software-Defined Secure Networks to provide a holistic network protection paradigm,” said Mihir Maniar, VP of Security Product Management at Juniper Networks. “Moreover, with the addition of our 100 Gbps vSRX, our security portfolio is further advancing the industry’s highest performing virtual firewall.”

GE launches asset management offering for manufacturing industry

GE Digital has launched its suite of Asset Performance Management (APM) solutions, a cloud-based offering running on its Predix platform, to monitor industrial and manufacturing equipment and software.

The company claims industrial customers can now use data and cloud-based analytics to improve the reliability and availability of their GE and non-GE assets. While APM would generally not be considered a new concept, GE claims its offering is the first commercially available to support the industrial data generated by a company’s assets, both physical and software-based.

The launch builds on underlying IoT trends within the industrial and manufacturing industry towards a proactive performance strategy for assets, repairing them before a maintenance issue arises as opposed to reacting to a fault.

“GE’s deep expertise in developing and servicing machines for industry gives us a greater understanding of real business operations and the insights to deliver on industry needs,” said Derek Porter, GM for Predix Applications at GE Digital. “With the launch of our APM solutions suite, GE is commercialising its own best practices for customers.”

The offering is split into three tiers. Firstly, a machine and equipment health reporting system provides a health-check on the asset, detailing performance levels in real-time. Secondly, a reliability tool predicts potential problems within an asset, allowing engineers to schedule maintenance activities. And finally, a maintenance optimization tool will be available later in 2016 to optimize long-term maintenance strategies, which GE claims will enable customers to extend the lifecycle of the asset and reduce downtime.


The company also launched the generally available module of GE Digital’s Brilliant Manufacturing software suite, Efficiency Analyzer, which will be available through a new SaaS pricing model. Once again, the product is built on the need to analyse and act on data collected within manufacturing operations to improve operational efficiency. One of the first use cases advertised by the company is within its own transportation division.

“GE’s Brilliant Manufacturing Suite has enabled significant reduction in unplanned machine downtime resulting in higher plant efficiency,” said Bryce Poland, Advanced Manufacturing Brilliant Factory Leader, GE Transportation. “As part of our digital thread strategy, we will increase our machines and materials visibility by 400% in 2016.”

What did we learn from BT’s 2016 CIO Report?

BT has recently released its 2016 CIO Report, dissecting the challenges and opportunities available to enterprise organizations, and the CIO, following the mainstream adoption of disruptive digital technologies.

The 2015 edition of the report highlighted that the CIO’s role was shifting away from that of a technologist and operations guru towards a strategic, creative and consultative one. As organizations are still identifying what digital means for their own business, the CIO is becoming ever more central in the boardroom as each enterprise continues on the path to understanding how technology adoption and integration could ultimately define its success or failure.

Here, we’ve detailed a few of the lessons learnt from the 2016 report:

Security is now being dealt with

Cloud and/or cyber security has been a topic of interest throughout the industry, though there has been difficulty in addressing the challenge, as few have identified a means to do so. It would appear that, in the absence of a concise (or even complicated) answer to the security conundrum, conversations have simply been swept under the carpet.

Through conversations BCN has had at recent events, we understand security is still a major challenge, though discussions around how to become more secure are less taboo. In general, CIOs seem to have accepted that being 100% secure is never possible, and that this is okay: you have to continuously evolve your security strategy to adapt to a dynamic threat environment.

The report highlights that 33% of respondents believe the transition to cloud computing will act as a catalyst to improve security throughout the organization. It would appear the implementation of cloud is forcing enterprises to deal with security: it is no longer a subject which can be put off for another day.

Cloud is no longer a choice

65% of respondents stated their current infrastructures are struggling to deal with the rapid adoption of digital technologies. There are still challenges to adopting a cloud model (security, legacy systems, time constraints and budget), though the CIOs in question realise cloud is no longer an option for becoming more successful, but a necessity for remaining relevant.

The CIO role has changed and there’s no going back

Traditionally the role of the IT department has been to ‘keep the lights on’ and to ensure the business does not grind to a halt. It’s operational, it’s in the backroom and it’s all about keeping things running. Not anymore.

The operational role of IT will never disappear, but its decision-making capability and influence on business strategy have increased. In fact, 72% of respondents believe the CIO’s standing in the boardroom has improved, 73% believe the board’s expectations of the CIO have increased, and 70% believe the board is now looking for a creative CIO, not just someone to keep everything ticking along.

A successful CIO will be able to bridge the gap between IT and the rest of the business, becoming more of a businessperson as opposed to a technologist. The disruptive nature of digital technologies means CIOs now have to be driven by flexibility, adaptive to new ideas, understanding of agile models and more receptive to alternative trends. This could be seen as quite a shift from the current perception of a CIO.


Why OpenStack success depends on simplicity


Is it any wonder that OpenStack has become so popular? It accelerates an organisation’s ability to innovate and compete by deploying applications faster and increases IT teams’ operational efficiency, according to the latest OpenStack user survey.

OpenStack is an open-source cloud operating system that enables businesses to manage compute, storage and networking resources via a self-service portal and APIs at massive scale – significantly propelling cloud services forward. And with its growing popularity, the demand for OpenStack expertise is so high that employers are willing to pay top dollar for it and do everything they can to retain talent.
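As an illustration of that self-service model, OpenStack client tools are typically pointed at a cloud through a `clouds.yaml` file; a minimal sketch is below, where the endpoint, project and credential values are all placeholders rather than details from any real deployment:

```yaml
# clouds.yaml -- hypothetical cloud entry; auth_url, project and
# credential values are placeholders, not a real deployment.
clouds:
  mycloud:
    auth:
      auth_url: https://keystone.example.com:5000/v3
      project_name: demo-project
      username: demo-user
      password: demo-password
      user_domain_name: Default
      project_domain_name: Default
    region_name: RegionOne
```

With an entry like this in place, CLI and SDK clients can select the cloud by name (e.g. `mycloud`) instead of repeating credentials per command, which is part of what makes the API-driven, self-service workflow practical.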

While this is a great thing for those with the expertise, it also hints at a significant roadblock when it comes to OpenStack implementations. While OpenStack offers tremendous benefits, it is not simple. In fact, implementations are notoriously complex, resulting in skyrocketing demand for skilled specialists.

Removing complexity and focusing on foundations

Despite its complexity, the secret to OpenStack’s success is – perhaps ironically – in taking a simpler approach. For a successful implementation, removing unnecessary complexities is an essential first step.

For instance, you start with the basics by focusing on storage, compute power and infrastructure before adding further features. A solid and simple storage backend translates into a strong foundation.

This approach is the opposite of the bolt-on method, in which OpenStack features are added to existing architectures. That essentially treats OpenStack as an afterthought, ultimately resulting in added challenges further down the line and militating against the full potential of OpenStack.

Focusing on fundamental infrastructure considerations may be more resource-intensive and demanding in the beginning than the ‘bolt-on’ approach, but it makes all the difference between an OpenStack platform that organisations can use to its full potential and one that is bloated by unnecessary complexity.

Businesses need to take several considerations and processes into account when implementing OpenStack. After all, it isn’t just a lower-cost alternative to a traditional virtualised environment; it’s a fundamental shift in how applications are deployed and how infrastructure is used.

For instance, you need to consider how the hardware components come together to drive cloud-focused goals and strategy, which is precisely why the initial focus should be on the underlying infrastructure foundation.

Orchestration

Keep in mind that a virtualised infrastructure is about data centre automation, whereas an OpenStack-powered cloud is about orchestrating the entire data centre. Orchestration builds on data centre automation, so understanding where you currently are in the orchestration process should also inform your approach.
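To make the distinction concrete: automation executes individual provisioning steps, while orchestration sequences those steps according to the dependencies between resources. The toy sketch below illustrates that idea; the resource names and dependency graph are invented for illustration and are not part of any OpenStack API:

```python
# Toy illustration: orchestration as dependency-ordered automation.
# Resource names and their dependencies are invented for illustration.
from graphlib import TopologicalSorter

# Each resource maps to the set of resources it depends on, roughly
# the way an orchestration template declares relationships.
resources = {
    "network": set(),
    "subnet": {"network"},
    "security_group": set(),
    "server": {"subnet", "security_group"},
    "floating_ip": {"server"},
}

def provision_order(deps):
    """Return a valid creation order: dependencies always come first."""
    return list(TopologicalSorter(deps).static_order())

order = provision_order(resources)
print(order)
```

An orchestrator effectively computes an ordering like this and then invokes the underlying automation (create network, boot server, and so on) in that order, which is why orchestration builds on, rather than replaces, data centre automation.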

When considering infrastructure components, you also need to think about what specific applications or workloads are driving your project and their individual requirements. This process of due diligence will serve you well; it will ultimately lead to a simpler and easy-to-use cloud.

Once the foundation is in place, this simplicity will translate into easier and cost-effective operations and maintenance, and the ability to easily and quickly customise your environment through APIs. It will also provide a clear path to scale and grow, while avoiding disruptive migrations or the rebalancing of resources.

Transforming infrastructure services

An OpenStack-powered cloud creates a rich platform for building and deploying applications, and it goes hand-in-hand with an automated and software-defined data centre. It manages the infrastructure below and orchestrates applications above, creating fully automated workflows that enable infrastructure to be deployed on demand, and for processes to become consistent and repeatable.
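In OpenStack, that kind of repeatable, on-demand deployment is commonly expressed through the Heat orchestration service. A minimal sketch of a Heat (HOT) template follows; the image, flavor and network names are placeholders, not details from the article:

```yaml
# Minimal Heat (HOT) template sketch; image, flavor and network
# values below are placeholders for illustration only.
heat_template_version: 2018-08-31

description: Deploy a single application server on demand

resources:
  app_server:
    type: OS::Nova::Server
    properties:
      image: ubuntu-20.04          # placeholder image name
      flavor: m1.small             # placeholder flavor
      networks:
        - network: private-net     # placeholder network

outputs:
  server_ip:
    description: First address of the deployed server
    value: { get_attr: [app_server, first_address] }
```

Because the template is declarative, launching it repeatedly produces the same stack each time, which is what makes deployments consistent and repeatable rather than dependent on manual steps.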

It not only lowers the operational cost of deployments and upgrades, but also provides a flexible platform for offering innovative new services to customers. And when it is combined with other technologies such as big data and virtualised network functions, it not only drives data centre transformation but also changes the very definition of infrastructure services.

That said, an OpenStack deployment can be a significant expenditure, which is why deployment and management must be simplified to ensure a successful project, and one which truly leverages the full potential of this exciting technology.

Read more: OpenStack cloud adoption continues to rise but challenges remain

Accelerating #DigitalTransformation By @FormationDS | @CloudExpo #IoT

Organizations planning enterprise data center consolidation and modernization projects are faced with a challenging, costly reality. Requirements to deploy modern, cloud-native applications simultaneously with traditional client/server applications are almost impossible to achieve with hardware-centric enterprise infrastructure. Compute and network infrastructure are fast moving down a software-defined path, but storage has been a laggard. Until now.


The Agile Accelerator | @CloudExpo @IsomorphicHQ #DigitalTransformation

Between the mockups and specs produced by analysts, and resulting applications built by developers, there exists a gulf where projects fail, costs spiral, and applications disappoint. Methodologies like Agile attempt to address this with intensified communication, with partial success but many limitations.
In his session at 18th Cloud Expo, Charles Kendrick, CTO and Chief Architect at Isomorphic Software, will present a revolutionary model enabled by new technologies. Learn how business and development users can collaborate – each using tools appropriate to their expertise – to build mockups and enhance them all the way through functional prototypes to final deployed applications. This approach helps you improve usability, exceed end-user expectations, and still hit project milestones.


IoT Sales: Enabling Adopters | @ThingsExpo #IoT #IIoT #DigitalTransformation

Most people love new technology. It can make us more productive. It can lower our costs. It can be very “cool.” So, if it’s true and most people love new technology, why do we tend to adopt new technology on a “curve?” Why do innovators and early adopters jump in early while others become late majority or laggards?
One answer is the effect of salespeople on the adoption curve. With innovators, salespeople probably make no difference. Innovators are going to buy new technology early and take a risk. It’s important that they are the first to utilize and implement new technology. They probably know more about the technology than the salespeople do anyway. They have likely been researching the new stuff for months and probably were involved in beta-testing the product for the manufacturer.
