Choosing the right cloud for your workloads is a balancing act that can cost your organization time, money and aggravation – unless you get it right the first time.
Economics, speed, performance, accessibility, administrative needs and security all play a vital role in dictating your approach to the cloud. Without knowing the right questions to ask, you could wind up paying for capacity you’ll never need or underestimating the resources required to run your applications.
What is the promise of big data? Computers will be better than humans
As a concept, big data has in fact been around longer than computer technology, which may surprise a number of people.
Back in 1944, Wesleyan University librarian Fremont Rider wrote a paper estimating that American university libraries were doubling in size every sixteen years, meaning the Yale Library would occupy over 6,000 miles of shelves by 2040. This is not big data as most people would know it, but the rapid increase in the quantity and variety of information in the Yale library rests on the same principle.
The concept was not known as big data back then, but technologists today face the same challenge of handling a vast amount of information. The difficulty is not necessarily how to store it, but how to make use of it. The promise of big data, and of data analytics more generally, is to provide intelligence, insight and predictability, but only now are we reaching a stage where technology is advanced enough to capitalise on the vast amount of information available to us.
Back in 2003, Google began publishing papers on the Google File System and MapReduce, which are widely credited as the origin of the Apache Hadoop platform. At that point, few people could have anticipated the explosion of technology we have witnessed since. Cloudera Chairman and CSO Mike Olson is one of those few, and he now leads a company regularly cited as one of the go-to organizations for the Apache Hadoop platform.
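For readers unfamiliar with the model those papers describe, the core idea (a map step that emits key/value pairs, and a reduce step that folds each key's values together) can be sketched as a toy word count. This is an illustrative, single-machine sketch, not Google's or Hadoop's actual implementation:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle/reduce: group pairs by key, then sum each key's values
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(values) for key, values in grouped.items()}

docs = ["big data is big", "data drives decisions"]
counts = reduce_phase(map_phase(docs))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

The real systems distribute the map and reduce steps across thousands of machines and handle the shuffle over the network; the programming model, though, is exactly this simple.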
“We’re seeing innovation in CPUs, in optical networking all the way to the chip, in solid state, highly affordable, high performance memory systems, we’re seeing dramatic changes in storage capabilities generally. Those changes are going to force us to adapt the software and change the way it operates,” said Olson, speaking at the Strata + Hadoop event in London. “Apache Hadoop has come a long way in 10 years; the road in front of it is exciting but is going to require an awful lot of work.”
Analytics was previously seen as an opportunity for companies to look back at their performance over a defined period and draw lessons on how future performance could be improved. Today, advanced analytics improves performance in real time: a company can shift the focus of a marketing campaign as it runs, or alter a production line to improve the outcome. The promise of big data and IoT is predictability and data-defined decision making, which can shift a business from a reactive position to a predictive one. Understanding trends can create proactive business models that advise decision makers on how to steer a company. But what comes next?
For Olson, machine learning and artificial intelligence are where the industry is heading. We are at a stage where big data and analytics can be used to automate processes and replace humans for simple tasks. In a short period of time, we have seen some significant advances in the applications of the technology, most notably Google's AlphaGo beating world Go champion Lee Se-dol and Facebook's use of AI in picture recognition.
Computers taking on humans in games of strategy is not a new PR stunt (IBM's Deep Blue defeated chess world champion Garry Kasparov in 1997), but this is a very different proposition. While chess relies on calculable strategy, Go is another beast. Due to the vast number of permutations available, strategies within the game rely on intuition and feel, a complex thing for the Google team to model. The fact that AlphaGo won the match demonstrates how far researchers have progressed in making machine learning and artificial intelligence a reality.
“In narrow but very interesting domains, computers have become better than humans at vision and we’re going to see that piece of innovation absolutely continue,” said Olson. “Big Data is going to drive innovation here.”
This may be difficult for a number of people to comprehend, but big data has entered the business world; true AI and automated, data-driven decision making may not be far behind. Data is driving the direction of businesses by building a better understanding of the customer, increasing the security of an organization and giving a clearer picture of the risk associated with any business decision. Big data is no longer a theory, but an established business strategy.
Olson is not saying computers will replace humans, but the number and variety of processes which can be handed to machines is certainly growing, and growing faster every day.
Box financial results look good on the surface, but analysts unconvinced
It has been a week of good and bad news for enterprise cloud storage provider Box with the announcement of first quarter financial results. The company announced record first quarter revenue of $90.2 million (£62m), an increase of 37% from this time last year – but because renewals were down, analysts came out of the quarter feeling somewhat unconvinced.
“We had a solid start to fiscal 2017,” said Box CEO Aaron Levie in the firm’s earnings call, as transcribed by Seeking Alpha. “We had strong customer momentum adding more than 5000 new customers in Q1, our largest number of new customers in a quarter. In addition, we continue to improve our already best in class customer retention with our customer churn rate improving to just below 3%.
“These metrics showcase how valuable and essential Box is to our growing global customer base,” he added.
Other results included Q1 2017 billings of $75.9m, up 9% year over year, and a GAAP operating loss of $38.6m, representing 43% of revenue compared to 71% a year earlier. Customer wins and extensions cited by Box included Airbnb, Brooks Brothers, and The Whirlpool Corporation.
Levie cited partnerships as key to Box’s strength going forward, also revealing the company will shortly be hiring a chief marketing officer with ’20 years of enterprise technology experience’. “As we’re becoming a more strategic investment for our customers, larger transactions are shifting towards later in the year,” Levie said. “Looking ahead, underlying demand for Box remains very strong and our competitive position in the market has never been better.
“Coming off of Box World Tour where we engaged with thousands of customers and prospective clients, we created record sales pipeline in the quarter with several seven figure deals in the mix,” he added. “This has been driven by the growing demand for a modern approach to enterprise content management, our differentiated product offerings and our maturing partnerships that are becoming an integral part of our go-to-market strategy.”
All good news, one would assume. Yet yesterday, as reported by Bloomberg, JPMorgan Chase & Co analyst Mark Murphy downgraded the cloud software vendor’s stock to neutral. “Short seller’s wet dreams coming true as Box gets hammered for a beat,” as Diginomica put it. “Box turned in a decent quarter with a broadly in-line outlook,” wrote Den Howlett. “The problem is that on the call, the Box team didn’t seem as assured as analysts were likely requiring in order to maintain the market’s already fragile view of momentum company stocks.”
“We are confident in our growth opportunity, driven by our product differentiation and expanding market, and we remain committed to achieving positive free cash flow in the fourth quarter of this fiscal year,” said Box CFO Dylan Smith.
[session] How Cloud Is Revolutionizing the Middle Market By @VAIsoftware | @CloudExpo #Cloud
The middle market often experiences challenges adopting technology, lacking the agility of startups and the monetary investment of massive corporations. The cloud can help: cloud technology enables mid-market companies to streamline data and automate their processes. By utilizing cloud technology, employees can communicate and collaborate seamlessly from any device, whether a smartphone, tablet or desktop, improving the user experience.
In his session at 18th Cloud Expo, Kevin Beasley, CIO at VAI, will spotlight successes of mid-market companies that are finally able to leverage ERP solutions, thanks to the economics of the cloud.
IBM makes software defined infrastructure smarter
IBM has expanded its portfolio of software-defined infrastructure solutions, adding cognitive features which the company claims speed up data analysis, integrate Apache Spark and help accelerate research and design.
The new offering, called IBM Spectrum Computing, is designed to help companies extract full value from their data by adding scheduling capabilities to the infrastructure layer. The product offers workload and resource management features to research scientists for high-performance research, design and simulation applications. The new proposition focuses on three areas.
Firstly, Spectrum Computing works with cloud applications and open source frameworks to assist in sharing resources between the programmes to speed up analysis. Secondly, the company believes it makes the adoption of Apache Spark simpler. And finally, the ability to share resources will accelerate research and design by up to 150 times, IBM claims.
By incorporating the cognitive computing capabilities into the software-defined infrastructure products, IBM believes the concept on the whole will become more ‘intelligent’. The scheduling competencies of the software will increase compute resource utilization and predictability across multiple workloads.
The software-defined data centre market has been growing steadily and is forecast to continue its healthy growth over the coming years. Research has suggested the market could be worth in the region of $77.18 billion by 2020, growing at a CAGR of 28.8% from 2015 to 2020. The concept is primarily driven by the attractive features of simplified scalability and interoperability. North America and Asia are expected to hold the biggest market share worldwide, though Europe is expected to grow at a faster rate.
“Data is being generated at tremendous rates unlike ever before, and its explosive growth is outstripping human capacity to understand it, and mine it for business insights,” said Bernard Spang, VP for IBM Software Defined Infrastructure. “At the core of the cognitive infrastructure is the need for high performance analytics of both structured and unstructured data. IBM Spectrum Computing is helping organizations more rapidly adopt new technologies and achieve greater, more predictable performance.”
Hybrid cloud and software defined data centres: How companies can have it both ways
Amazon Web Services (AWS) is the fastest growing enterprise company that the world has ever seen – a testament to the fact that huge numbers of businesses are moving their data to the cloud not only to save on costs but to be able to analyse their data more effectively. Netflix, Airbnb, the CIA and other high-profile organisations now run large portions of their businesses on AWS. Yet with lingering concerns about a ‘big bang’ move to cloud, many businesses are adopting a hybrid cloud approach to data storage.
Hybrid cloud is the middle ground: private storage with lower infrastructure overheads plus a superfast, low-latency connection to the public cloud. Companies increasingly have their own mix of cloud storage systems reflecting their legacy IT infrastructure, current budgets and current and future operational requirements. As a result, many CIOs are having to think about how data moves back and forth between various on-premise systems and cloud environments. This can be challenging when the data is transactional and the data set changes frequently. To move this data without interruption, active-active replication technology such as WANdisco Fusion is required, so the data can be moved yet still fully utilised while business operates as usual.
Software defined data centres (SDDC) – with virtualisation, automation, disaster recovery and applications and operations management – are making it easier for businesses to build, operate and manage a hybrid cloud infrastructure. Such a system enables businesses to move assets wherever they need to whilst maintaining security and availability.
As a result, according to an IBM study, “Growing up Hybrid: Accelerating digital transformation”, many organisations that currently leverage hybrid cloud and use it to manage their IT environment in “an integrated, comprehensive fashion for high visibility and control” say they have already gained a competitive advantage from it. In many cases software defined data centres with hybrid cloud are accelerating the digital transformation of organisations as well as making it easier for companies to use cognitive computing such as predictive intelligence and machine learning.
Although hybrid cloud is rapidly becoming the option of choice for reducing IT costs and increasing efficiency, the shift is creating new concerns, as CIOs must ensure a seamless technology experience regardless of where the company’s IT infrastructure resides. Whilst businesses are increasingly comfortable transitioning business-critical computing processes and data between different environments, it is vital that on-premise infrastructure, private clouds and public clouds are all monitored, since changes can occur in any of these segments without notice.
In January this year, HSBC suffered a server failure that left UK online banking customers unable to log in to their accounts for nine hours. It took more than a day to identify the cause of the issue; customers vented their anger on social media and the case made national headlines. Failures that cannot be quickly identified have the potential to cause huge financial losses as well as significant reputational damage. Businesses must have real-time visibility into a hybrid cloud environment so they can head off or respond to issues as they occur.
With companies needing to maintain legacy IT infrastructure indefinitely, a software defined data centre supporting a hybrid cloud can give a business the best of both worlds: the cost effectiveness and elastic expansion and contraction of public cloud computing, and the security of a private cloud. If you want to future-proof your business and remain at the cutting edge of innovation in your sector, hybrid cloud and software defined data centres are what you need to access public cloud resources, test new capabilities quickly and get to market faster without huge upfront costs.
IoT-Enabled Services | @ThingsExpo #IoT #M2M #API #InternetOfThings
In addition to all the benefits, IoT is also bringing new kinds of customer experience challenges: cars that unlock themselves, thermostats turning houses into saunas and baby video monitors broadcasting over the internet. The list can only grow, because while IoT services should be intuitive and simple to use, the delivery ecosystem is a myriad of potential problems as IoT multiplies complexity. Finding a performance issue is like finding the proverbial needle in the haystack.
Wipro open sources big data offering
Wipro has announced it has open sourced its big data solution, Big Data Ready Enterprise (BDRE), partnering with California-based Hortonworks to push the initiative forward.
The company claims the BDRE offering addresses the complete lifecycle of managing data across enterprise data lakes, allowing customers to ingest, organize, enrich, process, analyse, govern and extract data at a faster pace. BDRE is released under the Apache License 2.0 and hosted on GitHub. Teaming up with Hortonworks will also give the company additional clout in the market, as Hortonworks is generally considered one of the top three Hadoop distribution vendors.
“Wipro takes pride in being a significant contributor to the open source community, and the release of BDRE reinforces our commitment towards this ecosystem,” said Bhanumurthy BM, COO at Wipro. “BDRE will not only make big data technology adoption simpler and effective, it will also open opportunities across industry verticals that organizations can successfully leverage. Being at the forefront of innovation in big data, we are able to guide organizations that seek to benefit from the strategic, financial, organizational and technological benefits of adopting open source technologies.”
Companies open sourcing their own technologies has become something of a trend in recent months, as product owners appear to be moving towards a service model rather than a traditional vendor one. According to ‘The Open Source Era’, an Oxford Economics study commissioned by Wipro, 64% of respondents believe open source will drive big data efforts in the next three years.
The report also claims open source has become a foundation stone of many businesses’ technology roadmaps: 75% of respondents believe integration between legacy and open source systems is one of the main challenges, and 52% said open source is already supporting the development of new products and services.
Viewing the image mapped to a loop device
If there is a loop device on the system, we may want to know which image that device corresponds to. Let's look at two ways of finding out:
In recent versions of losetup we can use the --show option, which will display the image, in this case /root/ejemplo:
# losetup --show /dev/loop0
/dev/loop0: [fc00]:798589 (/root/ejemplo)
But we can also do it via /sys: inside the entry for the loop device in question, we cat the loop/backing_file file. For example, for /dev/loop0 we would run:
# cat /sys/block/loop0/loop/backing_file
/root/ejemplo
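The sysfs method lends itself to a small script. The sketch below (illustrative only; it assumes a Linux system with sysfs mounted at /sys) enumerates every attached loop device and prints its backing file:

```shell
#!/bin/sh
# List every attached loop device alongside its backing file.
# Detached loop devices have no loop/backing_file entry, so they are skipped.
for dev in /sys/block/loop*; do
    backing="$dev/loop/backing_file"
    if [ -r "$backing" ]; then
        printf '%s: %s\n' "/dev/${dev##*/}" "$(cat "$backing")"
    fi
done
```

Recent versions of losetup can do the same in one go with `losetup -a`, which lists all configured loop devices.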
Apple experiences outage in North America
Apple has restored services to customers around the world after many of its cloud-based offerings and other services faced outages of up to seven hours.
The outages, which were reported from users mainly in North America, are yet to be explained by the company but impacted numerous products including the App Store, iCloud, Apple TV, photos and iMovies as well as a host of others.
The issues appear to have begun at around 9pm (GMT) on June 2, and all services were restored by 4:55am (GMT). Apple spokespeople have been directing journalists to the company’s support page, where it posted insightful comments such as “Users experienced a problem with the service above” and “Users may have experienced slower than usual performance when using iCloud Drive, Backup, iCloud Notes, iWork for iCloud and Photos. Users may have experienced slowness with multiple services at iCloud.com”.
The services side of Apple’s business has been reporting healthy numbers in recent months, seemingly offsetting the drop in iPhone sales. During its Q2 earnings call the company ended its long run of year-over-year revenue growth, reporting a decline for the first time in 13 years, according to Telecoms.com. iPhone shipments were down 16% and Mac sales fell from $5.61 billion to $5.1 billion; its services business, however, grew 20% to almost $6 billion.