All Network Topology Maps Are Not Created Equal By @MJannery | @CloudExpo #SDN

Many network management systems can discover the topology of a network. As with inventory, some do it once (i.e., "get and forget"); better ones continually monitor for topology changes. Lower-end tools, however, keep the topology separate from the management system: they generate a map for the sake of having a map, but they never use it.
All maps are not created equal.
Some are extremely basic. Others contain useful information about the connections and the devices displayed. Topology discovery covers both device interconnections and the connections between access switches and the servers and end-user hosts. Providing both forms of topology discovery should be a fundamental capability of an advanced NMS, not an afterthought provided by a secondary product.
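
A minimal sketch of the "monitor continually" pattern, in Python. The get_lldp_neighbors() helper is a hypothetical stand-in for whatever a real NMS uses to read a device's neighbor table (typically an SNMP walk of the LLDP-MIB); the point is the loop: rediscover on an interval, diff against the last known map, and feed the changes back into the management system instead of filing the map away.

    import time

    def get_lldp_neighbors(device):
        """Hypothetical: return {(local_port, remote_device, remote_port)}."""
        # A real NMS would poll the LLDP-MIB over SNMP or a device API here.
        return {("ge-0/0/1", "core-sw-1", "xe-1/0/3")}

    def discover(devices):
        return {d: get_lldp_neighbors(d) for d in devices}

    def watch(devices, interval=300):
        baseline = discover(devices)              # "get"...
        while True:                               # ...but don't "forget"
            time.sleep(interval)
            current = discover(devices)
            for dev in devices:
                added = current[dev] - baseline[dev]
                removed = baseline[dev] - current[dev]
                if added or removed:              # the topology changed
                    print(f"{dev}: links added {added}, removed {removed}")
            baseline = current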


A Look at Hybrid Cloud Catalysts By @DinkoEror | @CloudExpo #Cloud

Where's my data? Not in the 'where did I leave my glasses' sort of way, but rather in the sense of data resilience, a key concern for many businesses putting their data and applications in the cloud. It's especially crucial in light of the recent EMC Global Data Protection Index, which showed that almost two thirds of companies surveyed suffered disruption due to data loss and unplanned downtime, losing as much as $1.7 trillion in total. That's a staggering number, larger than the GDP of many countries.

That's why, when assessing their operational problems, whether caused by people (e.g., users demanding more from their applications), process (e.g., new data protection or e-discovery requirements), or technology (an application underperforming, failing to scale, or costing too much to maintain), many organizations are also looking for a solution to the disaster recovery issues they face: a continuous availability solution that combines production, high availability, disaster recovery, and continuity of operations in a single package. This is something we have been looking to address with EMC Federation Hybrid Cloud disaster recovery, and why we've spent a lot of time making sure we get it right.


State of Application Delivery 2015 By @LMacVittie | @CloudExpo #SDN #Cloud

For some, SDN was about operational efficiency: wringing more stability and consistency out of the processes that push applications through the deployment pipeline into production. For others, it was about financial efficiency, the drive to lower capital expenditures. And for yet others it was about efficient use of time, getting apps to market faster.
All had at their root a common theme: efficiency. IT, and indeed business as a whole, is experiencing rapid and sometimes unexpected growth, driven by demand for mobile applications and by the introduction of "things" (the Internet of Things) into the equation. All agree that IT is under incredible pressure to step up and become faster, more scalable, and more efficient in order to deliver the apps on which business now relies in this new economy. SDN, like DevOps, is one of the ways organizations are looking to operationalize their networks, using programmability to automate and orchestrate the processes that govern the production pipeline.
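
To make "programmability" concrete, here is a minimal sketch of pushing a policy change through a controller's northbound REST API rather than logging into devices one by one. The host, endpoint, and payload are invented for illustration and do not correspond to any particular controller's API.

    import json
    import urllib.request

    def push_policy(controller, app, policy):
        """One API call replaces a box-by-box CLI session."""
        req = urllib.request.Request(
            url=f"https://{controller}/api/v1/apps/{app}/policy",
            data=json.dumps(policy).encode(),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status

    # e.g. push_policy("sdn-ctrl.example.com", "web",
    #                  {"qos": "gold", "max_latency_ms": 20})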


Future of Information Storage with ISS SuperCore and Ceph | @CloudExpo #Cloud

The time is ripe for high-speed, resilient, software-defined storage with unlimited scalability. ISS has been working with the leading open source projects and has developed a commercial, high-performance solution designed to keep growing without hitting performance limits.
In his session at Cloud Expo, Alex Gorbachev, President of Intelligent Systems Services Inc., shared the foundation principles of the Ceph architecture, as well as the design used to deliver this storage to traditional SAN storage consumers.
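
For readers new to Ceph, the sketch below shows the simplest way an application talks to a cluster: through librados, Ceph's native object interface (here via the python-rados bindings). The pool name and config path are assumptions; serving traditional SAN consumers, as the session described, is typically handled a layer up, for example by exporting RBD block images through iSCSI gateways.

    import rados

    # Assumes a reachable cluster and an existing pool named "data".
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
    cluster.connect()
    try:
        ioctx = cluster.open_ioctx("data")
        ioctx.write_full("greeting", b"hello ceph")   # store an object
        print(ioctx.read("greeting"))                 # b'hello ceph'
        ioctx.close()
    finally:
        cluster.shutdown()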


The Software Paradox By @JnanDash | @CloudExpo #Cloud

I just read a new booklet from O’Reilly called The Software Paradox by Stephen O’Grady. You can access it here.
Here is a direct quote:
“This is the Software Paradox: the most powerful disruptor we have ever seen and the creator of multibillion-dollar net new markets is being commercially devalued, daily. Just as the technology industry was firmly convinced in 1981 that the money was in hardware, not software, the industry today is largely built on the assumption that the real revenue is in software. The evidence, however, suggests that software is less valuable—in the commercial sense—than many are aware, and becoming less so by the day. And that trend is, in all likelihood, not reversible. The question facing an entire industry, then, is what next?”


Even bigger data: Getting the most out of your implementation


By David Belanger, Senior Research Fellow in the Business Intelligence and Analysis Program at Stevens Institute of Technology and co-leader of the IEEE Big Data Initiative

Over the last week or so, I’ve had the opportunity to view a collection of talks and articles about new network technologies related to how data is gathered, moved, and distributed. A common element in all of these discussions is the presence of big data. 

New network technology is often the driver of quantum leaps in the amount of data available for analysis. In chronological order, think of the Internet, the web, 3G/4G mobility with smartphones and 24×7 access, and the Internet of Things. Each of these technologies has made dramatic changes in the way networks gather, carry, and store data; taken together, they will generate and facilitate far more data for us to analyse and use. The challenge will be extracting a more than proportional increase in value from that increase in data.

We have already been through network technologies that dramatically increased the number of hours a day people could generate and consume data, and that fundamentally changed our relationship with information. The half-life of the usefulness of information has dropped, since we can now get many types of information in seconds, at nearly any time, using an army of "apps"; and the "inconvenience index", the amount of trouble we must go through to obtain information, is now measured in inches instead of feet or yards. The emergence of a vast array of devices connected to the Internet, measuring everything from human movement to health to the physical and commercial worlds, is starting to create an even larger flow of data.

This increase in the volume, volatility, and variety of data will be larger than any of its predecessors. The challenge is: will it create a proportionally large increase in the amount of information? Will it create a proportionally large increase in the value of that information? Or, will it create a deluge in which we can drown?

Fight or flight

Leaving aside the fact that much of the increase in "data" in flight over networks today is in the form of entertainment (the latest number I have heard is about two thirds), there is no question that the current flood of data has generated information of significant value. This is certainly true in the financial industry, not least in algorithmic trading, which not only uses big data to make better decisions, but automates the execution of those decisions.

In consumer marketing, the use of big data has fundamentally changed the approach to targeting: away from segmentation and aggregates, toward targeting individuals, or even personas of individuals, by their behaviours. This has enabled much more customisation in applications ranging from recommendations to churn prediction. Much of the management of current communications networks is completely dependent on big data for functions such as reliability, recovery, and security. The same is clearly true in many branches of science, and is becoming true in the delivery of health care. Leaving aside potential issues of privacy, surveillance cameras have changed the nature of policing; as video data mining matures, cameras will challenge entertainment for volume of network traffic, and provide another opportunity for value generation.

We typically think of the analytics associated with these data as leading to more accurate decision making, followed by more effective actions.

Size is not always important 

The answer to the questions above depends, in part, on how broadly based the skill set for effectively using this data becomes. Big data is not only a function of the volume (size), velocity (speed), and variety (text, speech, image, video) of the data available. At least as important are the tools that allow a broad variety of people to take advantage of that data, the availability of people with the necessary skills, and the new types of applications that evolve.

Over much of the last two decades, big data was the province of organisations that had access to lots of data, that had the scientific and engineering skills to build tools to manage and analyse it, and, in some cases, that had the imagination to create business models to take advantage of it. That has changed dramatically over the last several years: a set of powerful, usable tools has emerged, both commercially and as open source.

Companies' understanding of how to obtain access to data beyond their own operationally generated data is also evolving quickly, and leaders in many industries are inventing new business models to generate revenue. Finally, and perhaps most importantly, there is a large and growing body of applications, in areas such as customer experience, targeted marketing, recommendation systems, and operational transparency, that are important to nearly every business and are the basis for competition over the next several years. The skills needed to take advantage of this new data are within the reach of more companies than a few years ago; they include not only data scientists, but also a variety of engineers and technicians to produce hardened systems.

Conclusion

So, how do we think about this? First, the newly generated data will be much more open than traditional operational data. It will be worthwhile for those who think about an organisation's data to look very seriously at augmenting their operational data with exogenously created data.

Second, you need to think creatively about integrating various forms of data to create previously unavailable information. For example, in telecommunications it is now fairly standard to integrate network, service, customer, and social network data to understand both customers and networks.
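
A minimal sketch of that integration step, using pandas; the tables, keys, and column names are invented for the example.

    import pandas as pd

    network = pd.DataFrame({"circuit_id": [1, 2], "packet_loss": [0.001, 0.04]})
    service = pd.DataFrame({"circuit_id": [1, 2], "customer_id": [10, 11],
                            "sla": ["gold", "silver"]})
    customer = pd.DataFrame({"customer_id": [10, 11], "churn_risk": [0.05, 0.60]})

    # Join on shared keys to produce a view no single source carries:
    # a lossy circuit linked to a silver-SLA, high-churn-risk customer.
    view = (network.merge(service, on="circuit_id")
                   .merge(customer, on="customer_id"))
    print(view)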

Third, skill sets must be updated now. You will need data scientists and data miners, but also data technicians, to run a production-level, information-based decision and automation capability. You will need people skilled in data governance, spanning policy, process, and practices, to manage the risks associated with big data use.

It is time to start building this capability.

About the author:

Dr. David Belanger is currently a Senior Research Fellow in the Business Intelligence and Analysis Program at Stevens Institute of Technology.  He is also co-leader of the IEEE Big Data Initiative. He retired in 2012 after many years as Chief Scientist and V.P. of Information, Software, and Systems Research at AT&T Labs.

Dropbox announces user base exceeds 400 million, with eight million business users


Dropbox has announced its number of users has exceeded 400 million, with more than eight million business customers also on board.

The figures, disclosed on the occasion of the cloud storage provider's eighth birthday, also reveal impressive growth in the UK market, which the company entered six months ago. The company has grown from three to 10 offices over the past year.

According to the figures, 4,000 edits to documents are made each second on Dropbox, with users syncing 1.2 billion files each day and creating more than 100,000 new shared folders and links each hour. The number of business users has doubled in just 19 months.
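
As a quick sanity check on the scale of those figures, simple arithmetic on the numbers quoted above:

    edits_per_day = 4_000 * 86_400             # 4,000 edits/s is ~345.6 million edits/day
    files_per_second = 1_200_000_000 / 86_400  # 1.2 billion files/day is ~13,900 files/s
    print(edits_per_day, round(files_per_second))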

Other figures announced, such as the Dropbox for Business product having more than 100,000 users, were previously disclosed. For the UK, Dropbox predicts more than five million UK businesses will be using at least one cloud service within a year, and the company is hoping its name is on the ticket.

“We are now the world’s largest collaboration network,” proclaimed Dropbox UK country manager Mark van der Linden. “Consumers know Dropbox is intuitive to use, and businesses are discovering we can boost productivity and creativity, keep data secure and make collaboration simple and efficient,” he added.

It's the middle point that has often been the stumbling block. On the enterprise security side, traditionally consumer-oriented storage solutions such as Dropbox were not considered enterprise class, and it was not at all surprising two years ago when Fiberlink reported that Dropbox was the number one banned app for iOS and Android devices. It's worth noting, however, that the cloud storage provider also appeared in the top 10 whitelisted iOS apps.

The mood has changed since then, however. For enterprise mobility management providers, the threat of shadow IT is now turning into a positive: as employees continue to use the likes of Dropbox despite repeated warnings, the result has been rethought strategies and a firmer blueprint for enabling greater employee productivity through personal technology use.

Dropbox has not been slow in beefing up its business credentials either, introducing two-step verification and tiered administrative controls, as well as achieving certification against the emerging ISO/IEC 27018 privacy standard. Even when the mask has slipped, as with the disclosure of a major vulnerability in the Dropbox Android SDK, the company has been praised for its response, in that instance by IBM Security, which disclosed the vulnerability only after it had been fixed.
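
Two-step verification of the kind Dropbox introduced is usually built on time-based one-time passwords (TOTP, RFC 6238): an authenticator app and the server derive the same short-lived code from a shared secret. A minimal sketch of the standard derivation, using only the Python standard library, is below; it illustrates the RFC, not Dropbox's internal implementation.

    import base64, hmac, struct, time

    def totp(secret_b32, digits=6, step=30):
        key = base64.b32decode(secret_b32)
        counter = struct.pack(">Q", int(time.time()) // step)  # 30-second window
        mac = hmac.new(key, counter, "sha1").digest()
        offset = mac[-1] & 0x0F                                # dynamic truncation
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # demo secret; output changes every 30 s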

“Dropbox is experiencing incredible growth in the UK and around the world,” van der Linden added. “It’s down to the fact Dropbox is a service people want to use and that businesses trust.”

North State Communications to Acquire Stalwart

North State Communications, a leading fiber optic network, data center, and cloud services provider, has announced plans to purchase Stalwart, an IT security integration firm. The acquisition will complement DataChambers, North State Communications' data center and cloud computing subsidiary. North State Communications said it expects to close the deal in the third quarter; no financial terms were disclosed.


Royster Tucker III, CEO of North State, said: "North State is extremely pleased to be gaining such a highly qualified and well-rounded IT security firm as Stalwart. Their professional integrity and mastery of advanced threat protection are ideal counterparts for DataChambers' data center and cloud offerings. Bringing Stalwart onboard further fuels our strategic growth and helps round out our ability to effectively address some of the greatest IT challenges facing businesses today." Tucker also said that the deal originated from North State Communications' search for ways to add value for business customers.

Tucker added: "As businesses begin to move into the cloud and IT infrastructure becomes distributed, we wanted to build that business, and Stalwart has real expertise in IT infrastructure and security. Today more than ever, you have to wrap that in an envelope of security, and that's what Stalwart brings to the table."

Bill Cooper, CEO of Stalwart, also shared his opinion of the deal: "North State is a strategic acquirer who shares our core virtues and beliefs. This, more than anything, will continue to make Stalwart unique and better. It is exciting to think of the myriad ways our team will now be able to create additional value for our coveted and growing customer base." Cooper will continue to lead Stalwart as it joins North State.
