Your DevOps Flavor | @DevOpsSummit #DevOps #ITIL #Microservices

If there is anything we have learned by now, it is that every business paves its own unique path for releasing software: every pipeline, implementation and practice is a bit different, and DevOps comes in all shapes and sizes. Software delivery practices are often composed of several complementary (or even competing) methodologies – such as Agile, DevOps and even a mix of ITIL – combined in whatever way is most suitable for your organization and maximizes your business value. Our top 10 industry posts this week cover all the bases of software development and delivery, digging deeper into the roles of Agile development teams, ITIL in FinServ, and the emergence of microservices and containers as buzzwords.

As always, stay tuned to all the news coming from @ElectricCloud on DevOps and Continuous Delivery throughout the week and retweet/like to get your top news items featured in our weekly recap!

In Remembrance of Jeremy Geelan

I’m going to cry tonight, long and hard. I’ll do so as I remember and mourn Jeremy Geelan.
I knew Jeremy for almost 20 years, from the time he showed up one day at Cloud Expo’s headquarters in Bergen County, New Jersey, and went to work. The show wasn’t called Cloud Expo then, of course – it was still known as Java Edge, a pioneering event that grabbed developers, architects, and enterprise IT users alike for twice-yearly confabs.
Jeremy didn’t have a job there, or even a job offer. He was simply a fan of the technology and the show. So when his wife, a Danish diplomat, was relocated to the UN in New York, Jeremy sprang into action and made himself at home at the show’s office.

Cloud Is Now Mainstream ‘Power Panel’ at @CloudExpo | #Cloud

The initial debate is over: Any enterprise with a serious commitment to IT is migrating to the cloud. But things are not so simple. There is a complex mix of on-premises, colocated, and public-cloud deployments.
In this power panel at 18th Cloud Expo, moderated by Conference Chair Roger Strukhoff, panelists will look at the present state of cloud from the C-level view, and how great companies and rock star executives can use cloud computing to meet their most ambitious and disruptive business goals.

How IT is struggling to cope with the weight of SaaS applications

Corporate IT departments are drowning in the deluge of SaaS app requests, according to a report from BetterCloud.

The data, the first in a two-part set, reveals how IT departments are “overworked and overwhelmed”, and that a simple lack of time is preventing further cloud software from being integrated into companies. Almost three quarters (73%) of IT pros surveyed feel guilty about this, saying they believe their end users lack SaaS apps that would increase productivity; by contrast, 45% of end users say the same. The number of cloud app deployments has gone up by half over the past year, with IT departments supporting 12 apps on average as of March.

The natural consequence is that users, frustrated by IT’s lack of time, bring their own applications into the organisation without approval. When asked whether they seek IT approval, only a quarter (26%) of users said yes, 44% said no, and 30% said sometimes.

Cloud storage and collaboration appears to be the biggest sticking point as far as IT is concerned, with more than 40% of respondents saying those apps are the most difficult to secure and manage. It is a similar story with users: 29% say their storage and collaboration apps are in the biggest need of an upgrade, while messaging (18%), project management (17%), human resources (14%) and ticketing (14%) also drew votes.

One of the more curious findings of the research – which again shows the disparity between IT and end users on cloud apps – is that while end users believe they take three months to become ‘proficient’ at SaaS applications, IT believes it is nearer six months. BetterCloud content marketing manager Scott Solomon explains: “IT professionals should understand just how long it takes before end users consider themselves proficient.

“It’s not a stretch to think that offering continuous training sessions or materials could cut an end user’s time to proficiency in half,” he adds. “If end users feel proficient faster, IT will get fewer support requests and organisations will see an ROI more quickly.”

You can read the report findings here.

Met Office launches weather app on hybrid cloud platform

The Met Office has launched its latest app on its new hybrid cloud platform, Weather Cloud, in an effort to increase the speed of delivery and accuracy of its weather data to customers.

The platform enables the company to process meteorological data for mobile, at scale, across all Met Office platforms, ensuring the team can deliver information to the public during extreme weather events. In designing the app, the team took a DevOps-orientated approach, releasing a Minimum Viable Product (MVP) in the first instance and monitoring customer feedback to refine the proposition.

“We know that more and more people are choosing mobile devices to access their weather information from the Met Office and it’s vital we continue to address this changing behaviour so we can deliver our world-class weather service,” said Owen Tribe, Head of Digital at the Met Office. “The new app technology will enable us to evolve our digital presence and the ways in which people want to access their weather information in the future.”

During Storm Katie in March, the Met Office saw a 200% increase in traffic, with over 8 million visits over the course of the weekend. The team claims the new Weather Cloud platform will better enable it to deal with increased traffic and to plan for short-term weather events. The company also highlighted the ability to scale down in times of lower demand, reducing the public funds spent on the platform.

Weather Cloud was implemented on AWS with assistance from CloudReach, and the DevOps journey has continued as the team makes updates to the app based on customer feedback.

“The Met Office now has AWS Cloud infrastructure supporting its services, which can respond to changes in demand quickly, is highly resilient in case of any failures and supports stringent security requirements,” said James Monico, Founder at CloudReach. “Using AWS means that the Met Office does not have to maintain hardware that would otherwise be unused for large parts of the year, but it can instead add and remove resources quickly and dynamically as demand fluctuates.”
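
Elastic capacity of this kind is usually implemented with scaling policies that track demand. As a minimal sketch of the idea – assuming an existing EC2 Auto Scaling group, with the group name and CPU target below purely illustrative rather than the Met Office’s actual configuration – a target-tracking policy can be attached in Python with boto3:

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Hypothetical Auto Scaling group: scale on average CPU so capacity
# grows during a storm-driven traffic spike and shrinks again afterwards.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="weather-cloud-web",  # illustrative name
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,  # add instances above ~60% average CPU
    },
)
```

A policy along these lines is what lets infrastructure sit small for most of the year and expand automatically when a named storm drives traffic up – the cost behaviour Monico describes.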

New EU data regulations receive warm reception from industry

The European Union has finally rubber-stamped a refresh of the General Data Protection Regulation (GDPR) that offers greater protection for individuals, but at the cost of a greater burden on businesses, reports Telecoms.com.

In customary EU fashion this is the culmination of four years of to-ing and fro-ing since the refresh was first proposed. Even the final sign-off took four months to complete, with the text having been agreed last December. Furthermore, the new regulations won’t come into force until May 2018, giving all businesses that keep data on European citizens – which must include pretty much every multinational – two years to comply.

“The new rules will give users back the right to decide on their own private data,” said Green MEP Jan Philipp Albrecht, who led the drafting process. “Businesses that have accessed users’ data for a specific purpose would generally not be allowed to collect the data without the user being asked. Users will have to give clear consent for their data to be used. Crucially, firms contravening these rules will face fines of up to 4% of worldwide annual turnover, which could mean billions of euros for the major global online corporations.

“The new rules will give businesses legal certainty by creating one unified data protection standard across Europe. This implies less bureaucracy and creates a level playing field for all business on the European market. Under the new rules, businesses would also have to appoint a data protection officer if they are handling significant amounts of sensitive data or monitoring the behaviour of many consumers.”

Industry reaction has been broadly positive, but with caveats mainly concerning how easy it will be to comply and some concern about the high ceiling for potential fines. Compounding this is a requirement for companies to disclose data breaches within 72 hours of them happening, which is a pretty small window.

“This will be a technical challenge for those businesses unaccustomed to such stringent measures,” said David Mount of MicroFocus. “They will need to identify the breach itself and the information assets likely to have been affected so they can give an accurate assessment of the risks to the authorities and consumers.

“While this may seem like a positive step towards improved data protection, the US example shows that in reality there can be an unintended consequence of ‘data breach fatigue’. Consumers become accustomed to receiving frequent data breach notifications for even very minor breaches, and as a result it can be hard for them to distinguish serious breaches requiring action from minor events which can be safely ignored. The effect is that sometimes consumers can’t see the wood for the trees, and may start to ignore all warnings – which somewhat negates the point of the measure.”

“It is now up to European data privacy regulators to work together to ensure that the GDPR rules are implemented in a way that supports economic growth and improved competitiveness,” said John Giusti, Chief Regulatory Officer of the GSMA. “Regulators will need to exercise particular care in interpreting GDPR requirements – around consent, profiling, pseudonymous data, privacy impact assessments and transfers of data to third countries – to avoid stifling innovation in the digital and mobile sectors.

“All eyes are now on the review of the e-Privacy Directive. The right balance needs to be struck between protecting confidentiality of communications and fostering a market where innovation and investment will flourish. To this end, the GSMA calls on legislators to address the inconsistencies between the existing e-Privacy Directive 2002/58/EC and the GDPR.”

The e-Privacy Directive covers things like tracking and cookies, and focuses specifically on how telecoms companies process personal data. For the telecoms sector, then, this refresh could be even more important than the GDPR. The European Commission initiated a consultation on ePrivacy earlier this week and will conclude it on 5 July this year.

William Long, a partner at Sidley Austin, warned that individual countries may view the new GDPR differently. “There are still a number of issues where some member states have fought successfully to implement their own national law requirements, for instance in the area of health data, and this will no doubt lead to certain complexities and inconsistencies,” he said.

“However, organisations should be under no doubt that now is the time to start the process of ensuring privacy compliance with the Regulation. The penalties for non-compliance are significant – up to 4% of annual worldwide turnover or 20 million euros, whichever is the greater. Importantly, companies outside of Europe, such as those in the US that offer goods and services to Europeans, will fall under the scope of this legislation and will face the same penalties for non-compliance.”

“Our own research shows that globally, 52% of the information organisations are storing and hoarding is completely unknown – even to them. We call this ‘Dark Data’,” said David Mosely of Veritas. “Furthermore, 40% of stored data hasn’t even been looked at in more than three years. How can companies know they’re compliant if they don’t even know what they’re storing? This is why GDPR represents such a potentially massive task, and businesses need to start tackling it now.”

“In order for data to remain secure, there are three core components that are now vital for EU businesses,” said Nikki Parker of Covata. “Firstly, encryption is no longer an optional extra. It provides the last line of defence against would-be snoopers and companies must encrypt all personally identifiable information (PII).

“The second component is identity. True data control involves knowing exactly who has access to it and this can be achieved through encryption key management. Enabling businesses to see who has requested and used which keys ensures a comprehensive audit trail, a requirement of the new regulation.

“Finally, businesses must set internal policies that specifically outline how data can be used, for example, whether data is allowed to leave the EU or whether it can be downloaded. Applying policies to each piece of data means access can be revoked at any moment if the company feels it is in violation of the ruling.”
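
To make the first two of those components concrete, here is a minimal sketch in Python using the cryptography library’s Fernet recipe. It is illustrative only – the field value, key identifier and audit log are hypothetical assumptions, not a description of Covata’s product – but it shows PII encrypted at rest, with every key use logged for the audit trail the regulation expects:

```python
import logging
from cryptography.fernet import Fernet

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("key-audit")

# A single symmetric key; in practice keys would live in a
# key-management service rather than in process memory.
key = Fernet.generate_key()
KEY_ID = "pii-key-001"  # hypothetical key identifier for the audit trail
fernet = Fernet(key)

def encrypt_pii(value: str, requester: str) -> bytes:
    """Encrypt one piece of personally identifiable information."""
    audit.info("key %s used by %s for encrypt", KEY_ID, requester)
    return fernet.encrypt(value.encode("utf-8"))

def decrypt_pii(token: bytes, requester: str) -> str:
    """Decrypt PII, recording who used the key."""
    audit.info("key %s used by %s for decrypt", KEY_ID, requester)
    return fernet.decrypt(token).decode("utf-8")

token = encrypt_pii("jane.doe@example.com", requester="crm-service")
assert decrypt_pii(token, requester="crm-service") == "jane.doe@example.com"
```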

All this is happening in parallel with the overhaul of the rules governing data transfer between Europe and the US, known as the Privacy Shield. By the time the GDPR comes into force pretty much all companies are going to have to tread a lot more carefully in the way they handle their customers’ data and it will be interesting to see how the first major transgression is handled.

Parallels Remote Application Server (RAS) Quick Installation Guide

Throughout this blog post we will show the installation and configuration process for Parallels Remote Application Server in its easiest and simplest form, to actually deliver applications to the user’s device. Starting from the left-hand side of the diagram, the device is connected to the internet via WAN, which then passes through the firewall […]

Microsoft files lawsuit against US government over secret snooping orders

Microsoft has filed a new lawsuit in federal court against the United States government, arguing that customers should have the right to know when the state accesses their emails or records.

Under current law, the government has the right to demand access to customer information, while also issuing orders to companies such as Microsoft to keep these legal demands secret. Microsoft claims such orders have become far too commonplace; secrecy should be the exception, not the rule.

“We believe that with rare exceptions consumers and businesses have a right to know when the government accesses their emails or records,” said Brad Smith, President and Chief Legal Officer at Microsoft on the company blog. “Yet it’s becoming routine for the U.S. government to issue orders that require email providers to keep these types of legal demands secret. We believe that this goes too far and we are asking the courts to address the situation.

“Cloud computing has spurred a profound change in the storage of private information. Today, individuals increasingly keep their emails and documents on remote servers in data centres – in short, in the cloud. But the transition to the cloud does not alter people’s expectations of privacy and should not alter the fundamental constitutional requirement that the government must – with few exceptions – give notice when it searches and seizes private information or communications.”

While the company recognizes there are certain circumstances where secrecy would be required, it would appear the US government is using these legal demands with secrecy as a default setting. Microsoft claims the demands violate the company’s First Amendment right to free speech, as well as its customers’ Fourth Amendment rights, which give people and businesses the right to know if the government searches or seizes their property.

“Over the past 18 months, the U.S. government has required that we maintain secrecy regarding 2,576 legal demands, effectively silencing Microsoft from speaking to customers about warrants or other legal process seeking their data,” said Smith. “Notably and even surprisingly, 1,752 of these secrecy orders, or 68% of the total, contained no fixed end date at all. This means that we effectively are prohibited forever from telling our customers that the government has obtained their data.”

Microsoft’s case is built on the argument that the Electronic Communications Privacy Act is being abused by US officials, and that the act itself is dated and no longer relevant. The act, which is seemingly unpopular with technology firms, has been in place since 1986. Microsoft argues that legislation written well before the widespread use of the internet cannot remain relevant to today’s world.

“While today’s lawsuit is important, we believe there’s an opportunity for the Department of Justice to adopt a new policy that sets reasonable limitations on the use of these types of secrecy orders,” said Smith. “Congress also has a role to play in finding and passing solutions that both protect people’s rights and meet law enforcement’s needs. If the Department of Justice doesn’t act, then we hope that Congress will amend the Electronic Communications Privacy Act to implement reasonable rules.”

The company believes the act should be updated in three areas. First, from a transparency perspective, the government should be held accountable when it searches customer data, and in the majority of cases the customer should be informed. Second, there should be a focus on digital neutrality: customers should not receive less notice of government activities simply because their emails are stored in the cloud. Finally, there should be a necessity clause limiting what the government can keep secret; outside that clause, Microsoft wants the right to tell its customers what has been accessed.

Big data applications for the enterprise: Have they finally come of age?

A recent survey by consulting firm NewVantage Partners reveals that the proportion of U.S. companies using big data has jumped from 5% to 63% in the past three years. Of those companies, 70% now say that big data is critically important to their business, up from 21% in 2012.

Big data has revolutionised research methodology by making it possible to measure all of the data. Whether predicting earthquakes, providing real-time weather alerts or just analysing the best traffic patterns, big data is changing our lives and society. But how will big data transform business results? And what are the burning big data applications for the enterprise?

Big data for big problems

Surveys show the number one challenge CIOs face today is data growth. The amount of digital information created and shared in the world increased ninefold in just five years. Big data was at almost two zettabytes by the end of 2011, and by 2015 it had quadrupled to nearly eight zettabytes.

CIOs are challenged because with more data comes increased cost, complexity and risk. All costs go up as data grows including CPU, storage, network and data centre expenses. End users suffer as screen response times slow, and IT teams scramble – and fail – to complete critical jobs on time. Data growth reduces system availability and extends outages, since more data requires more time to manage. Governance and compliance concerns grow by the terabyte as well, because more data means more risk.

Leading organisations run their enterprise applications on high-end, multiprocessor servers that provide in-memory database processing on solid-state arrays. These systems deliver ultra-high performance for relational database applications, and businesses need them to meet critical objectives; but as the amount of data climbs, these production systems face performance challenges, and the cost to upgrade is sky high.

One solution is to run only current data on Tier 1 infrastructure and move the rest to Apache Hadoop. As data ages it becomes less active and less valuable; recent studies have shown that up to 80% of a typical organisation’s data is inactive. By moving inactive data to commodity platforms, businesses may achieve significant payback – as noted in the conclusion below, Hadoop can store data at up to 55 times less cost than Tier 1 infrastructure.

A common data platform

So, just as current data should run on Tier 1 infrastructure for optimised performance, less current data should run on a big data platform optimised for cost. Big data platforms, and in particular Apache Hadoop, are ideal common data platforms (CDPs) for older data as they offer uniform data collection, low-cost data storage and reporting across the enterprise.

Apache Hadoop ingests both structured and unstructured data, leverages low-cost commodity infrastructure, and delivers massive scalability. Using the MapReduce programming model to process large datasets across distributed compute nodes in parallel, Hadoop can process any workload and store data from any source at the lowest possible cost.
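
As a concrete, if canonical, illustration of that MapReduce model, here is the classic word-count job written as a mapper/reducer pair for Hadoop Streaming in Python (the paths and jar location in the submission command below are placeholders):

```python
#!/usr/bin/env python3
# mapper.py -- emit "word<TAB>1" for every word read from stdin
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- input arrives sorted by key, so per-word counts
# can be summed in a single streaming pass
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, n = line.rstrip("\n").split("\t")
    if word == current_word:
        count += int(n)
    else:
        if current_word is not None:
            print(f"{current_word}\t{count}")
        current_word, count = word, int(n)
if current_word is not None:
    print(f"{current_word}\t{count}")
```

Submitted with something like `hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /data/text -output /data/wordcounts`, Hadoop fans the map tasks out across the cluster, shuffles and sorts the intermediate pairs, and runs the reducers in parallel – the distribution the paragraph above describes.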

Information lifecycle management

Information lifecycle management (ILM) is a best practice for managing data throughout its lifecycle. ILM solutions improve application performance and optimise infrastructure utilisation to reduce costs. ILM also establishes a governance framework based on retention policies to establish compliance controls for enterprise data.

ILM classifies data at the time of creation based on security, access control and retention management. Business rules like “legal hold” ensure proper data governance, and retention policies are optimised for infrastructure utilisation. For instance, policies may be created to keep “current” data on Tier 1 infrastructure and move all other, “not current” data to low-cost Hadoop, as sketched below.
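
As a hedged sketch of what such a retention policy could look like in code – the tier names and the one-year “current” window are assumptions for illustration, not a standard – classification reduces to a small, auditable rule:

```python
from datetime import datetime, timedelta, timezone

def classify_for_tier(created_at: datetime,
                      legal_hold: bool = False,
                      current_window: timedelta = timedelta(days=365)) -> str:
    """Illustrative ILM rule: 'current' data stays on Tier 1,
    everything else moves to the Hadoop archive."""
    if legal_hold:
        return "tier1"  # a legal hold overrides the age-based policy
    age = datetime.now(timezone.utc) - created_at
    return "tier1" if age <= current_window else "hadoop-archive"

# Example: an 18-month-old record with no legal hold is archived
old_record = datetime.now(timezone.utc) - timedelta(days=540)
assert classify_for_tier(old_record) == "hadoop-archive"
```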

With as much as 80% of data inactive, the ROI to implement ILM is compelling; but, for many organisations, ILM simply provides the essential risk and compliance governance framework to manage data throughout its lifecycle.

Big data applications for the enterprise

Big data establishes a new enterprise blueprint on a petabyte scale, and big data applications are emerging to leverage the opportunity. Enterprise archiving and enterprise data lake are two of the most popular big data applications that have emerged because they reduce infrastructure costs, improve application performance, strengthen data governance and transform business results with advanced business intelligence.

Enterprise data lake and advanced analytics

Apache Hadoop represents a significant opportunity for enterprise data warehouse (EDW) and advanced analytics applications. Data warehouse users continually seek ways to describe data better, and EDW platforms sometimes struggle to deliver more specific views of data. Downstream analytics and NoSQL applications are also challenged by the canonical, top-down data approach delivered by traditional EDW systems.

Enterprise data lake applications store copies of production data “as is”, eliminating the need for heavy extract, transform and load (ETL) processes during data ingestion. Once stored within the Hadoop Distributed File System (HDFS), enterprise data may be more easily distilled by analytics applications and mined for critical insights.
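
As a minimal sketch of this “store as is, transform later” pattern – assuming PySpark, with purely illustrative paths – ingestion can be a single read-and-write with no transformation step:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("raw-ingest").getOrCreate()

# Land production extracts in the lake exactly as they arrive;
# schema work and cleansing happen later, at read time.
raw = spark.read.option("header", True).csv("/staging/orders/*.csv")
raw.write.mode("append").parquet("hdfs:///datalake/raw/orders")
```

Analytics jobs then read and distill the raw copies on demand, which is what makes the lightweight-ETL approach workable.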

Enterprise data lake leverages ILM to establish data governance controls and allow businesses to meet compliance objectives. ILM classifies data before ingestion based on security, retention management and access control policy.

Big data enhances traditional EDW strategies because Apache Hadoop stores and processes structured and unstructured enterprise data in bulk and at a very low cost. Lightweight ETL processes, massively scalable performance, low cost and flexible data-handling make enterprise data lake a powerful and efficient advanced analytics platform.

Enterprise archiving

Organisations continually demand improved performance from their mission-critical online applications, but the cost of ultra-high-performance infrastructure is often prohibitive – and just how high it runs depends on how much data must be processed online using Tier 1 compute nodes with all-flash memory arrays.

Mission-critical enterprise applications perform better when inactive data is moved from production databases onto low-cost, bulk data storage platforms. Enterprise archiving with Apache Hadoop uses ILM retention policies to move older, inactive data from online systems to a nearline HDFS repository for easy access by end users.
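
A hedged sketch of such an archiving job, assuming a PostgreSQL production database, the HdfsCLI Python client and a 365-day retention rule – the connection strings, table and paths are hypothetical:

```python
import pandas as pd
from hdfs import InsecureClient
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://app@db-host/prod")  # hypothetical DSN
namenode = InsecureClient("http://namenode:9870")        # hypothetical NameNode

PREDICATE = "created_at < now() - interval '365 days'"   # ILM retention rule

# 1. Copy inactive rows out of the production database...
inactive = pd.read_sql(f"SELECT * FROM orders WHERE {PREDICATE}", engine)

# 2. ...land them in the nearline HDFS repository, still accessible...
with namenode.write("/archive/orders/batch.csv", encoding="utf-8") as writer:
    inactive.to_csv(writer, index=False)

# 3. ...and only after a successful write, purge them from production.
with engine.begin() as conn:
    conn.execute(text(f"DELETE FROM orders WHERE {PREDICATE}"))
```

Shrinking the online dataset this way is what delivers the faster response times and higher availability described in the next paragraph.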

When online datasets are reduced, enterprise applications run faster and with higher availability, and dramatic infrastructure savings are possible. Enterprise archiving uses ILM to establish a governance, risk and compliance framework for the data – from creation to deletion – where all data is classified and properly accessible at all times.

In ‘Market Overview for Big Data Archiving’, Forrester Research vice president Noel Yuhanna comments: “With growing data volume, increasing compliance pressure, and the revolution of big data, enterprise architect (EA) professionals should review their strategies, leveraging new technologies and approaches. In the era of big data, archiving is a no-brainer investment.”

Conclusion

Gartner reports that data growth is the biggest data centre and hardware infrastructure challenge, and is also “particularly associated with increased costs relative to hardware, software, associated maintenance, administration and services.” As more and more data is processed and stored, system performance deteriorates, costs increase, and compliance objectives become harder to meet.

At the same time, demand has never been higher for improved access to data through data warehouse and enterprise analytics applications. Organisations are seeking competitiveness and new ways to gain value by mining enterprise data.

Solutions for big data are ideal common data platforms for enterprise data management, and big data applications that transform business results are available now. Apache Hadoop stores structured and unstructured data in a single repository, accessible by text search, at up to 55 times less cost than Tier 1 infrastructure. With ILM, organisations utilise infrastructure far more efficiently and improve governance, risk and compliance at the same time.

The wait is over. Big data applications for the enterprise have finally arrived.

IBM Partners with Reliance Communications

Emerging cloud giant IBM has announced a new partnership with Reliance Communications. The partnership expands IBM’s current list of cloud services: IBM aims to provide “highly secure and scalable infrastructure as a service (IaaS)” on the IBM Cloud. Reliance Communications will be able to offer its client base, consisting of upwards of 39,000 large enterprises, the robustness of IBM’s cloud platform, which will securely run these enterprises’ business applications.

Reliance Communications has also launched new cloud service offerings that are designed to provide “end-to-end integrated e-commerce services” for India’s small and medium businesses; all of these services will be run on the IBM cloud.

Indian enterprises are increasingly leveraging cloud services and thus require more secure and refined levels of support as they develop their growing e-commerce businesses. Research conducted by IBM indicates that many communication service providers have recognized this growing interest in cloud services among their customers. Reliance Communications joins a long list of communication service providers that IBM has partnered with, including Telstra, AT&T, and Verizon.

About Reliance Communications:

Reliance Communications is an India-based telecommunications company headquartered in Navi Mumbai. Founded in 2002, Reliance Communications is one of India’s foremost integrated telecommunications service providers, with a customer base of over 118 million. The company owns and operates the world’s largest next-generation IP-enabled connectivity infrastructure, and aims to become a world-class communication service provider.

Comments:

Braham Singh, SVP of Global Product Management, Reliance Communications and Global Cloud Xchange: “Through our partnership with IBM, customers will instantly benefit from the added flexibility and global reach to be more competitive, especially as we look at new opportunities from the ‘Digital India’ program.”

Vivek Malhotra, Cloud Leader, IBM India / South Asia: “We are delighted to partner with Reliance Communications and support their efforts to offer our cloud services in the Indian market. With a broad cloud portfolio, deep expertise and data privacy, the IBM Cloud offers businesses the ability to optimize their resources and investments to drive growth. With this collaboration, we will be able to address the requirements of organizations who have limited access to enterprise-grade cloud solutions.”
