Call it DevOps or not, if you are concerned with releasing more code faster and at higher quality, the resulting software delivery chain and process will look and smell like DevOps. But for existing development teams, whatever the velocity objective, getting from here to there cannot be done without a plan.
Moving your release cadence from months to weeks is not just about learning Agile practices and getting some automation tools. It involves people, tooling and a transition plan. I will discuss some of the benefits and approaches to getting there.
A Doctrine for #DigitalTransformation | @ThingsExpo #IoT #M2M #BigData
In my 30 years in the high-tech industry I have often heard the business maxim, “Develop a business strategy first, and then find the technology to support it.” In this age of digital transformation, I have come to believe that this well-worn doctrine is wrong, and it is time to repudiate it.
Let me support my argument by first asking a few questions. Which came first: e-commerce or the Internet? Which came first: mobile banking or wireless mobile communications?
Are You Ready to Transition to Mobile and Cloud Technologies? | @CloudExpo #Cloud
In recent years, quality assurance teams have become accustomed to a variety of new technologies. It has been a big change for the many quality assurance analysts who came of age at a time when waterfall development methodologies and PC-centric applications were their front and center concerns.
Now they have to adjust to the realities of cloud computing and the ever-expanding mobile device ecosystem. Infrastructure, development platforms and applications are being shifted off-premises to public and private clouds, while the device mix at many enterprises grows more diverse.
Making the Case for Monitoring Behind the Firewall By @Catchpoint | @CloudExpo #Cloud
If you read our blog regularly then you know we’re pretty bullish about our OnPrem Agent product and its behind-the-firewall user-experience monitoring capabilities.
What does it mean to monitor behind the firewall? Essentially, you’re bringing Catchpoint’s Synthetic Monitoring capabilities to the online applications that run inside your firewall, as opposed to your customer-facing website. OnPrem Agent becomes your own synthetic monitoring node: it scripts interactions with these applications and tests them constantly to make sure response times stay fast and the applications remain available.
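To make the idea concrete, here is a minimal sketch in Python of what one such synthetic check does. This is illustrative only, not Catchpoint’s actual agent or API; the internal URL, latency threshold and check interval are hypothetical placeholders.

```python
# Conceptual sketch of a behind-the-firewall synthetic check
# (not Catchpoint's actual agent or API): periodically exercise
# an internal application endpoint and flag slow or failed responses.
import time
import requests

INTERNAL_APP_URL = "http://intranet.example.local/app/health"  # hypothetical
RESPONSE_TIME_SLO_SECONDS = 2.0   # assumed latency objective
CHECK_INTERVAL_SECONDS = 60       # assumed polling interval

def run_check(url: str) -> None:
    """Issue one scripted request and report availability and latency."""
    start = time.monotonic()
    try:
        response = requests.get(url, timeout=10)
        elapsed = time.monotonic() - start
        if response.status_code != 200:
            print(f"FAIL  status={response.status_code} after {elapsed:.2f}s")
        elif elapsed > RESPONSE_TIME_SLO_SECONDS:
            print(f"SLOW  {elapsed:.2f}s exceeds {RESPONSE_TIME_SLO_SECONDS}s target")
        else:
            print(f"OK    {elapsed:.2f}s")
    except requests.RequestException as exc:
        print(f"DOWN  {exc}")

if __name__ == "__main__":
    while True:
        run_check(INTERNAL_APP_URL)
        time.sleep(CHECK_INTERVAL_SECONDS)
```

A production synthetic monitoring node would run many such scripted transactions (logins, searches, form submissions) and feed the timings into alerting and dashboards rather than printing them.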
Visy Leverages the Cloud for Data Backup with @Riverbed SteelFusion | @CloudExpo #Cloud
Riverbed Technology has announced that Australia-based packaging company Visy is the first organization to deploy a Riverbed and Microsoft joint solution designed to eliminate the headaches of branch office IT infrastructure, improve business continuity and minimize business disruption in the event of an incident. Visy selected Riverbed’s SteelFusion™ hyper-converged edge solution to virtualize and consolidate its islands of remote and branch office infrastructure (including server, storage and network components) in its data center. Microsoft’s hybrid cloud storage environment, based on Microsoft Azure and Azure StorSimple, was used to consolidate all branch office data for centralized storage, backup and recovery.
[session] Building Your Hybrid Data Warehouse Solution with dashDB By @IBM | @CloudExpo #Cloud
As you respond to increasing requests for new analytics, you need fast and flexible technology in your arsenal so that you can deploy the right workload to the right platform for the need at hand. Do you need self-service and fast time to value? Do you have data and application control and privacy needs, along with strict SLAs to meet? IBM dashDB™ is data warehouse technology powered by in-memory computing and in-database analytics, designed for fast results and scalability.
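For readers who want a feel for what working with a dashDB warehouse looks like, below is a minimal sketch of running an in-database aggregate from Python using the ibm_db driver (dashDB speaks the standard DB2 connection protocol). The hostname, credentials and SALES table are hypothetical placeholders, not details from the session.

```python
# Minimal sketch of querying a dashDB warehouse from Python with the
# ibm_db driver (pip install ibm_db). Host, credentials, and the SALES
# table are hypothetical placeholders for illustration.
import ibm_db

# dashDB accepts a standard DB2 connection string; values here are assumed.
conn_str = (
    "DATABASE=BLUDB;"
    "HOSTNAME=dashdb.example.com;"  # hypothetical host
    "PORT=50000;"
    "PROTOCOL=TCPIP;"
    "UID=dbuser;"                   # hypothetical credentials
    "PWD=secret;"
)

conn = ibm_db.connect(conn_str, "", "")

# Run the aggregation in-database rather than pulling raw rows to the client.
stmt = ibm_db.exec_immediate(
    conn,
    "SELECT REGION, SUM(REVENUE) AS TOTAL FROM SALES GROUP BY REGION"
)

row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["REGION"], row["TOTAL"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```

Pushing the SUM and GROUP BY into the warehouse is the point of in-database analytics: the heavy lifting happens where the data lives, and only the small result set crosses the wire.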
Google cloud team launches damage control mission
Google will offer service credits to all customers affected by the Google Compute Engine outage, in what appears to be a damage control exercise as the company looks to gain ground on AWS and Microsoft Azure in the public cloud market segment.
On Monday, 11 April, Google Compute Engine instances in all regions lost external connectivity for a total of 18 minutes. The outage has been blamed on two separate bugs, neither of which would have caused major problems on its own, but which in combination took the service down. Although the incident has seemingly caused embarrassment for the company, it did not affect more visible consumer services such as Google Maps or Gmail.
“We recognize the severity of this outage, and we apologize to all of our customers for allowing it to occur,” said Benjamin Treynor Sloss, VP of Engineering at Google, in a statement on the company’s blog. “As of this writing, the root cause of the outage is fully understood and GCE is not at risk of a recurrence. Additionally, our engineering teams will be working over the next several weeks on a broad array of prevention, detection and mitigation systems intended to add additional defence in depth to our existing production safeguards.
“We take all outages seriously, but we are particularly concerned with outages which affect multiple zones simultaneously because it is difficult for our customers to mitigate the effect of such outages. It is our hope that, by being transparent and providing considerable detail, we both help you to build more reliable services and we demonstrate our ongoing commitment to offering you a reliable Google Cloud platform.”
While the outage does not appear to have caused any major damage to the company, competitors in the space may secretly be pleased with the level of publicity the incident has received. Google has been ramping up efforts in recent months to bolster its cloud computing capabilities for the public cloud market segment: hiring industry hard-hitters such as Diane Greene, pursuing rumoured acquisitions, and announcing plans to open 12 new data centres by the end of 2017.
The company currently sits in third place in the public cloud market segment, behind AWS and Microsoft Azure, though it had been demonstrating healthy growth in the months prior to the outage.
Only 13% trust public cloud with sensitive data – Intel survey
A survey from Intel has highlighted that companies are becoming more trusting of cloud propositions, though public cloud platforms are still not trusted with sensitive data.
The Blue Skies Ahead? The State of Cloud Adoption report states that 77% of respondents believe their company trusts cloud platforms more than it did 12 months ago, yet only 13% would use public offerings for sensitive data, and 72% point to compliance as the biggest concern with cloud adoption.
“This is a new era for cloud providers,” said Raj Samani, CTO at Intel Security EMEA. “We are at the tipping point of investment and adoption, expanding rapidly as trust in cloud computing and cloud providers grows. As we enter a phase of wide-scale adoption of cloud computing to support critical applications and services, the question of trust within the cloud becomes imperative. This will become integral into realising the benefits cloud computing can truly offer.”
One area of the survey which could be perceived as a concern is that only 35% of respondents believe C-level executives and senior management understand the security risks of the cloud. Industry insiders have told BCN that executives are almost using cloud security as a sound-bite, demonstrating to investors that the board prioritizes technology as a means of driving business innovation, though few could be considered technology orientated or competent.
“The key to secure cloud adoption is ensuring sufficient security controls are integrated from the start so the business can maintain their trust in the cloud,” said Samani. “There is a growing awareness amongst the C-suite of the potential consequences of a data breach. Yet IT must take steps to educate senior management further on the enabling capabilities of the cloud, underlining the importance of always keeping security considerations front of mind.”
“Securing the cloud is a top-down process but getting every employee to follow best practice and behave in a secure manner requires company-wide participation. For example, when faced with many of the cloud threats defined by the Cloud Security Alliance (CSA), IT will absolutely require employee support to ensure data remains secure.”
From an investment perspective, Infrastructure-as-a-Service (IaaS) continues to lead the way, with 81% of respondents saying their organization plans to invest in this area. Security-as-a-Service followed closely at 79%, while Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) accounted for 69% and 60% respectively. Respondents also expect 80% of their IT budgets to be dedicated to cloud computing services within the next 16 months.
While the increased trust in cloud platforms is a positive, in some circumstances it would appear to be a case of blind trust. More than a fifth of IT decision makers are not sure whether unauthorized cloud services are being used within their organization, and 13% cannot account for what is currently stored in the cloud. Shadow IT continues to trouble IT departments throughout the industry, and the most popular means of dealing with it appears to be database activity monitoring, cited by 49% of respondents.
Shadow IT may be a concern for the vast majority of companies on the journey to cloud security, but it raises the question of whether conquering shadow IT is possible, and whether 100% security can ever be a realistic goal. “Faced with a rapidly expanding threat landscape, IT should never consider their infrastructure to be 100% secure,” said Samani. “Attack methods are constantly updated: there is no room for complacency. IT departments must ensure they regularly update and check their security measures, undertaking their due diligence to ensure corporate data remains secure.”
The concept of secure IT appears to be a growing conversation throughout the enterprise, though the concrete understanding and commitment behind the executive sound-bites remain unclear. 100% security may well be an unattainable goal; however, until the concept is appreciated completely throughout the organization, from top to bottom and bottom to top, companies are unlikely to utilise cloud platforms for any sensitive data.
IBM DevOps Track at @CloudExpo | @DevOpsSummit @IBMDevOps #Cloud #DevOps
Join IBM June 8 at 18th Cloud Expo at the Javits Center in New York City, NY, and learn how to innovate like a startup and scale for the enterprise.
You need to deliver quality applications faster and cheaper, attract and retain customers with an engaging experience across devices, and seamlessly integrate your enterprise systems. And you can’t take 12 months to do it.
[whitepaper] Scale Like an Enterprise | @DevOpsSummit @IBMBluemix #DevOps
Across nearly every industry, innovative entrants are disrupting traditional markets and displacing long-established players. This IBM Point of View white paper describes the IBM Bluemix Garage Method that was created to support the October 19, 2015 launch.