Know Who’s on Your Network and What They Do | @CloudExpo #Cloud

Maintaining network security has never been more challenging than it is right now. Traditional network perimeters are beginning to blur in the face of consumerization, the rise of mobility, migration to the cloud, and the Internet of Things. The pursuit of business agility has driven these trends, and they offer tangible benefits, but in the rush to adopt them, information security has been left behind.
According to the PricewaterhouseCoopers Global State of Information Security Survey 2015, the number of detected incidents reached 42.8 million last year. That’s an increase of 48% over 2013, and total financial losses attributed to those security breaches were up 34% on the year before. Ever more stringent regulatory guidelines and compliance standards are also putting businesses at risk of legal liability in the event of a successful cyberattack.
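
To put the survey's growth figures in context, a quick back-of-the-envelope calculation recovers the implied 2013 baseline (a sketch in Python; the survey itself reports the exact figures):

    # Back-of-the-envelope check of the PwC survey's growth figures.
    incidents_2014 = 42.8e6   # detected incidents reported for the year
    growth = 0.48             # 48% increase over 2013

    # Implied 2013 baseline: 42.8M / 1.48, roughly 28.9M incidents
    incidents_2013 = incidents_2014 / (1 + growth)
    print(f"Implied 2013 incidents: {incidents_2013 / 1e6:.1f} million")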

OPNFV announces first major software release

OPNFV has launched the first version of its NFV platform

OPNFV, the Linux Foundation-backed open source NFV organisation, has announced the availability of the first version of its software, which it is calling Arno, reports Telecoms.com.

OPNFV was formed just eight months ago by a group of NFV veterans, including chairman Prodip Sen, formerly of Verizon and now also head of NFV at HP. Its aim is to develop an open platform for NFV, which in turn should accelerate the growth of the technology and shorten the time to market for NFV solutions.

Arno, the first release, begins a sequence of river-themed names (the second of which will presumably begin with B) and is aimed at those exploring NFV deployments. It provides an initial build of the NFV Infrastructure (NFVI) and Virtual Infrastructure Manager (VIM) components of the ETSI NFV architecture.

“Only eight months after its formation, OPNFV has met one of its major goals by creating an integrated build, deployment and testing environment that accelerates NFV implementation and interoperability,” said Sen. “With Arno, we now have a solid foundation for testing some of the key resource orchestration and network control components for NFV. This is a great testament to the power of an open source collaborative model and the strength of the NFV ecosystem.”

Here’s a more detailed breakdown of what Arno (which is available to download here) brings to the table, according to the OPNFV announcement:

  • Availability of baseline platform: Arno enables continuous integration, automated deployment and testing of components from upstream projects such as Ceph, KVM, OpenDaylight, OpenStack and Open vSwitch. It allows developers and users to automatically install and explore the platform.
  • Ability to deploy and test various VNFs: End users and developers can deploy their own or third party VNFs on Arno to test its functionality and performance in various traffic scenarios and use cases.
  • Availability of test infrastructure in community-hosted labs: Agile testing plays a crucial role in the OPNFV platform. With Arno, the project is unveiling a community test labs infrastructure where users can test the platform in different environments and on different hardware. This test labs infrastructure enables the platform to be exercised in different NFV scenarios to ensure that the various open source components come together to meet vendor and end user needs.
  • Allows automatic continuous integration of specific components: As upstream projects are developed independently, they require testing against various OPNFV use cases to ensure seamless integration and interworking within the platform. OPNFV’s automated toolchain allows continuous automatic builds and verification.
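
Since Arno's VIM is built on OpenStack, deploying a test VNF amounts to booting a VM image through the OpenStack API. Below is a minimal sketch using the openstacksdk Python library; the cloud name and the image, flavor and network names are hypothetical placeholders, and Arno documents its own deployment tooling:

    # Minimal sketch: boot a test VNF as an instance on an Arno-based
    # OpenStack VIM via openstacksdk. The cloud name "arno-lab" and the
    # image/flavor/network names are hypothetical placeholders.
    import openstack

    conn = openstack.connect(cloud="arno-lab")  # credentials from clouds.yaml

    server = conn.create_server(
        name="test-vnf",
        image="vnf-image",        # VNF disk image previously uploaded to Glance
        flavor="m1.small",        # instance size
        network="vnf-mgmt-net",   # management network for the VNF
        wait=True,                # block until the instance is ACTIVE
    )
    print(f"VNF instance {server.name} is {server.status}")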

nGenx Selects @IndependenceIT Cloud Workspace Suite | @CloudExpo #Cloud

IndependenceIT has been selected by nGenx to power Windows-based DaaS and application delivery on Google Compute Engine to support the delivery of GoldMine Cloud software. For independent software vendors (ISVs) like GoldMine, this expands the theater of operations to increase revenue opportunities while reducing software management and maintenance liabilities.
IndependenceIT was selected by application and desktop pioneer nGenx to deliver its “Bring Your Own Cloud” strategy to GoldMine and other ISVs. GoldMine will now benefit from its selection of nGenx and Google Compute Engine to cloud-enable its popular contact management software by leveraging the power of IndependenceIT’s Cloud Workspace® Suite. By combining the automation and workflow of Cloud Workspace Suite with Google Compute Engine’s performance, scale and support for Windows-based workloads, GoldMine is able to create new profit centers from its existing product line. Google Compute Engine’s support for Windows-based cloud workspaces allows software developers like GoldMine to run business-critical workloads at high performance and scale.

SingleHop to Present at @CloudExpo 2015 New York | @SingleHop #Cloud

SingleHop will present at Cloud Expo 2015 in New York on Thursday, June 11. The presentation, “Why Public Clouds Are A Step Backwards in Efficiency,” will examine the virtualization efficiencies that have been forgotten, and offer tips to avoid the common mistakes that as many as 90% of companies using public clouds are making.
Recent estimates from Gartner predict that global IaaS spending will reach approximately $16.5 billion in 2015, growing by a projected 29.1 percent annually between now and 2019. It is clear organizations must understand how to maximize efficiencies in their IT infrastructure before committing to resources that may not provide the most value for them.
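
As a rough illustration of what 29.1 percent compound annual growth implies for that $16.5 billion base (a sketch only; Gartner's own model may differ):

    # Project global IaaS spending forward from Gartner's 2015 estimate
    # at a 29.1% compound annual growth rate (CAGR).
    base_spending = 16.5   # USD billions, 2015
    cagr = 0.291

    for year in range(2015, 2020):
        projected = base_spending * (1 + cagr) ** (year - 2015)
        print(f"{year}: ${projected:.1f}B")   # 2019 comes out near $45.8B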

DIY Enterprise DevOps | @DevOpsSummit @Datical #DevOps #Microservices

I read an insightful article this morning from Bernard Golden on DZone discussing the DevOps conundrum facing many enterprises today – is it better to build your own DevOps tools or go commercial? For Golden, the question arose from his observations at a number of DevOps Days events he has attended, where typically the audience is composed of startup professionals:
“I have to say, though, that a typical feature of most presentations is a recitation of the various open source products and components and how they integrated them to implement their solution. In a word, how they created their home-grown solution. Given that many of these speakers hail from startups with small teams and a focus on conserving cash, this approach makes sense. Moreover, given that these are typically small teams working at companies following the Lean Startup approach, using open source that allows rapid change as circumstances dictate makes sense as well. And, in any case, startups need to solve problems today because who knows what the future will bring?”

Is Your Data Encryption Kosher? @SafeLogic CEO at @CloudExpo NY | #Cloud

SafeLogic is proud to promote CEO Ray Potter’s speaking engagement on the agenda of Cloud Expo New York City.
Cloud Expo will be hosted at the Javits Center in Manhattan, New York, on June 9 – 11, 2015, alongside SYS-CON Media’s Big Data Expo, Internet of Things Expo, SDDC (Software-Defined Datacenter) Expo, DevOps Summit, and WebRTC Summit, to form one massive event focused on today’s hottest technologies. The need for reliable and secure data storage and processing is a common thread across each sub-event.

Simple Steps for Your DevOps Teams | @DevOpsSummit @Ruxit #DevOps #Microservices

Software development can be an arduous endeavor, with numerous steps spanning anywhere from a few programmers to a couple hundred developers. And keeping that software running smoothly can be even more costly and time-consuming as features are added, bug fixes applied, or third-party tools integrated. A recent post from Gartner estimated the cost of downtime could be as high as $5,600 per minute, which translates to over $300K per hour. Of course this figure varies based on the industry, the size of the organization, and other factors, but no organization can afford extended downtime in today’s competitive marketplace.
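
The per-hour figure follows directly from the per-minute estimate:

    # Convert Gartner's downtime estimate from per-minute to per-hour.
    cost_per_minute = 5_600                 # USD
    cost_per_hour = cost_per_minute * 60
    print(f"${cost_per_hour:,} per hour")   # $336,000 per hour, i.e. over $300K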

Your DevOps team needs the right application monitoring tools, processes, and culture to avoid downtime and support aggressive revenue and customer retention targets.

So how can you simplify your development process to deliver tangible savings and minimize downtime? Here are 15 simple steps you can implement to save time and money for your DevOps teams.

IDC: Cloud high on the list for utilities sector, but skills shortage persists

The utilities sector is struggling with an ageing workforce and lacks critical cloud skills

About three quarters of utilities see moving their on-premises apps and workloads into the public cloud as a dominant component of their IT strategies, according to a recent IDC survey. But Gaia Gallotti, research manager at IDC Energy Insights, told BCN that firms must first overcome a significant skills gap and an ageing workforce if they are to modernise their systems.

According to the survey, which polled 38 senior utility executives internationally, the vast majority of respondents are sold on the benefits cloud could bring to their IT strategies. About 87 per cent said cloud services provide better business continuity and disaster recovery than traditional technology, and 74 per cent said public cloud migration will be dominant within their broader IT strategy.

Interestingly, while 76 per cent of respondents believe cloud providers can offer better security and data privacy controls than their own IT organisation, 63 per cent said ceding control to a cloud provider is a barrier to their organisation’s adoption of cloud services.

“The utilities industry can no longer afford to deny the advantages of ‘going into the cloud.’ As security concerns are further debunked, utilities expect to see a significant push in their cloud strategy maturity in the next 24 months, so much so that they expect to make up lost ground and even supersede other industries,” Gallotti said.

But most also believe internal IT skillsets are not up to speed with new cloud standards, methodologies, and topologies. Some 74 per cent said they will need a third-party professional services firm to help develop a public cloud strategy.

“This is a huge problem the industry is facing, but not exclusively for cloud services. Utilities are struggling to attract talent in all IT domains, especially for the ‘third platform’, as they compete with companies in the IT world that attract ‘Generation Y’ talent more easily,” Gallotti explained.

“The utilities industry also has an issue with an ageing workforce outside of IT and across its other business units. In the short term, we expect utilities to rely more on their service providers to fill skills gaps that emerge, in the hope of more easily attracting the right talent as the industry transforms and becomes more appealing to Gen Y.”

SIOS Technical Evangelist to Present at @CloudExpo | @SIOSTech #Cloud

SIOS Technology Corp. has announced that SIOS Senior Technical Evangelist and Microsoft Clustering MVP David Bermingham will present a session on protecting business-critical applications in cloud environments at Cloud Expo, taking place next week in New York, NY.
Titled “Protecting Mission Critical Applications in Cloud Environments,” David’s session is scheduled for Thursday, June 11 at 4:50 pm at the Javits Center, New York, NY. For more information about Cloud Expo or to register, visit here.
Gartner predicts that the bulk of new IT spending by 2016 will be for cloud platforms and applications, and that nearly half of large enterprises will have cloud deployments by the end of 2017. The benefits of the cloud may be clear for applications that can tolerate brief periods of downtime, but for critical applications like SQL Server, Oracle and SAP, companies need a strategy for HA and DR protection. By adding SANless clustering software as an ingredient to Windows Server Failover Clustering, companies can provide HA and DR protection in clouds where traditional shared-storage clusters may be impractical or impossible. Attendees at David’s session will learn the truths and myths of HA and DR in cloud deployments, which can dramatically reduce data center costs and risks.
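
The SANless clustering software itself is commercial, but the heartbeat idea at the core of any failover cluster can be shown with a toy sketch; the host name, port and thresholds below are hypothetical, and WSFC and SIOS use their own quorum and replication protocols:

    # Toy illustration of a failover-cluster heartbeat: a standby node
    # polls the active node and initiates failover after several missed
    # checks. Hosts, port and thresholds are hypothetical; real WSFC or
    # SANless clusters use their own quorum and replication protocols.
    import socket
    import time

    ACTIVE_NODE = ("sql-primary.example.com", 1433)  # hypothetical SQL Server node
    CHECK_INTERVAL = 5   # seconds between heartbeats
    MAX_FAILURES = 3     # consecutive misses before failing over

    def node_is_alive(addr, timeout=2.0):
        """Return True if a TCP connection to the node succeeds."""
        try:
            with socket.create_connection(addr, timeout=timeout):
                return True
        except OSError:
            return False

    failures = 0
    while failures < MAX_FAILURES:
        failures = 0 if node_is_alive(ACTIVE_NODE) else failures + 1
        time.sleep(CHECK_INTERVAL)

    print("Active node unreachable; promoting standby (failover starts here)")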

IBM, UK gov ink £313m deal to promote big data, cognitive compute research

IBM and the UK government are pouring £313m into big data and cognitive computing R&D

The UK government has signed a deal with IBM that will see the two parties fund a series of initiatives aimed at expanding cognitive computing and big data research.

The £313m partnership will see the UK government commit £113m to expand the Hartree Centre at Daresbury, a publicly funded facility geared towards reducing the cost and improving the efficiency and user-friendliness of high performance computing and big data for research and development purposes.

IBM said it will further support the project with technology and onsite expertise worth up to £200m, including access to the company’s cognitive computing platform Watson. The company will also place 24 IBM researchers at the Centre to help commercialise any promising innovations developed there.

The organisations will also explore how to leverage OpenPower-based systems for high performance computing.

“We live in an information economy – from the smart devices we use every day to the super-computers that helped find the Higgs Boson, the power of advanced computing means we now have access to vast amounts of data,” said UK Minister for Universities and Science Jo Johnson.

“This partnership with IBM, which builds on our £113 million investment to expand the Hartree Centre, will help businesses make the best use of big data to develop better products and services that will boost productivity, drive growth and create jobs.”

David Stokes, chief executive for IBM in the UK and Ireland, said: “We’re at the dawn of a new era of cognitive computing, during which advanced data-centric computing models and open innovation approaches will allow technology to greatly augment decision-making capabilities for business and government.”

“The expansion of our collaboration with STFC builds upon Hartree’s successful engagement with industry and its record in commercialising technological developments, and provides a world-class environment using Watson and OpenPower technologies to extend the boundaries of Big Data and cognitive computing,” he added.