The old monolithic style of building enterprise applications just isn’t cutting it any more. It results in both applications and teams that are complex, inefficient, and inflexible, with considerable communication overhead and long change cycles.
Microservices architectures, while they’ve been around for a while, are now gaining serious traction with software organizations, and for good reason: they enable small, targeted teams, rapid continuous deployment, independent updates, truly polyglot languages and persistence layers, and a host of other benefits.
But truly adopting a microservices architecture requires dramatic changes across the entire organization, and a DevOps culture is absolutely essential.
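To make the contrast concrete, here is a minimal sketch of what one such service might look like, using only Python’s standard library. The service name, endpoint, and hard-coded response are hypothetical stand-ins for a real, datastore-backed implementation.

```python
# Minimal sketch of a single, independently deployable microservice
# (illustrative only; the service name and endpoint are hypothetical).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class OrderStatusHandler(BaseHTTPRequestHandler):
    """Answers one narrowly scoped question: the status of an order."""

    def do_GET(self):
        # A real service would look this up in its own datastore;
        # each microservice owns its data rather than sharing a monolith's schema.
        body = json.dumps({"order_id": self.path.strip("/"), "status": "shipped"})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    # Runs standalone, so it can be deployed, scaled, and updated independently.
    HTTPServer(("0.0.0.0", 8080), OrderStatusHandler).serve_forever()
```

Because each service is this small and self-contained, one team can rewrite it in another language or swap its datastore without touching the rest of the system, which is where the polyglot and independent-update benefits come from.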
The evolving role of savvy business technology leaders
As the chief executive at your company, if you discovered that you had some major financial standards compliance issues within your organization, would you be concerned about the risks associated with that exposure?
If your designated outside auditor had bypassed your internal finance department and chosen instead to work directly with your individual Line of Business leaders, would you want to know why? Moreover, would you intervene?
Yes, it’s a series of rhetorical questions. And I think we all know the answers.
Did you know there’s a movement already in progress that could impact your company’s provision and consumption of IT services, with a corresponding potential compliance exposure?
Have you heard about the Shadow IT phenomenon? According to the assessment of several leading IT market analysts, it’s a trend that’s already quite pervasive across a broad cross-section of industries. It’s also been a hotly debated topic.
Who is championing this cause from a horizontal organization perspective? According to market research from last year, it’s primarily the marketing leadership.
The Need for Greater Speed and Deep Knowledge
Granted, some of the inherent friction within the ongoing quest for strategic business technology competitiveness can be uncomfortable for some people. But in the grand scheme of things, it’s all good. This is the path to progress. Let’s consider the upside opportunities.
IT used to be primarily about operations, cost reduction, and management controls. That’s no longer the scenario at forward-thinking companies. Today, corporate IT departments are being asked to plan, build or procure, manage, and continuously improve upon their organization’s business technology infrastructure and associated processes.
As if that weren’t enough, you will likely also expect your re-energized IT leadership team to help your organization innovate, grow, and deliver unique customer experiences. If you haven’t yet reached that pivotal point, don’t fret…you will soon.
Besides, you’re most likely highly motivated to navigate this key market transition. As an informed executive, you’ve made it a personal goal to ensure that you surround yourself with the best available talent.
While business technology deployments can’t provide you a permanent competitive advantage, the timely deployment of new IT systems can enable you to build a strong competitive position that will stand the test of time.
Moreover, as cloud service adoption moves towards ubiquity in the global marketplace, the ability to craft a unique and powerful business application development environment is still somewhat scarce. Again, that’s an opportunity, not a problem.
IT Talent with New Skills and Big Aspirations
Granted, we’ve already reached an inflection point where software-as-a-service offerings have made it easier for any Line of Business leader to deploy an IT solution, just-in-time. However, attaining the much broader business agility benefits that are truly possible with cloud computing requires a comprehensive strategic plan.
Now’s the time to encourage your business unit leaders and open-minded IT managers to collaborate on mutually beneficial projects. If you need help bringing this essential project orchestration together, then consider seeking out an accomplished pioneer.
Companies with an evolved ecosystem of integration channel partners, such as Cisco, have the depth and breadth of information, education, and guidance resources that today’s forward-thinking leaders need to execute their multifaceted cloud-enabled strategies.
Why Cloud Service Brokering Really Matters
By increasing their inherent sourcing flexibility, your current business technology managers can more successfully assume the role of a broker of IT services, thereby increasing transparency and better aligning your business and IT agendas into a cohesive plan.
A valued and trusted broker, in this context, is an individual or group within your own IT team that acts as a mediator between the Line of Business end-user of a managed cloud service capability and the providers of that service offering.
But how does a top performing IT department gain the trust to carry out this evolving role?
What’s needed is a comprehensive, consistent cloud strategy that will offer an enlightened perspective, provide compelling direction, and build confidence in your IT team’s enabling platform procurement decisions.
As an example, when acting as a credible service broker, just imagine how your IT team can now take full advantage of multiple sourcing options and become a value-added intermediary of cloud services for their Line of Business internal customers. Now, that’s the makings of a solid foundation for a meaningful partnership and substantive progress towards your bold goal.
So, are you ready to enhance your business flexibility with a choice of consumption models in the world of many clouds? Are you fully prepared to embrace the emerging hybrid cloud era?
VMworld 2014 Recap: SDDC, EUC & Hybrid Cloud
By Chris Ward, CTO. Another year, another VMworld in the books. It was a great event this year with some key announcements and updates. First, some interesting stats: the top 3 strategic priorities for VMware remain unchanged (Software Defined Datacenter/Enterprise, End User Computing, and Hybrid Cloud). Some interesting numbers presented included on-premises infrastructure…
Infrastructure-as-a-Service POC Proves to Be Informative [#Cloud]
Regardless of whether you’ve migrated multiple applications or this is your first migration to a public Infrastructure-as-a-Service (IaaS) platform, you will want to run a small proof-of-concept to make sure that the basic elements of data flow operate as expected and your components will run in the IaaS environment. This week I spent some time experimenting with the three top IaaS offerings: Amazon AWS, Google Compute Engine, and Microsoft Azure. The architecture was relatively simple: three Docker containers, one hosting a LAMP (Linux, Apache, MySQL & PHP) stack running WordPress, one hosting a Postfix mail server forwarding all mail, and one hosting CVS. The results of the testing were informative.
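The post doesn’t include the actual commands, but a POC along these lines could be scripted with the Docker SDK for Python. The image names below are placeholders, not the images the author actually used.

```python
# Rough sketch of spinning up the three-container POC described above using the
# Docker SDK for Python (pip install docker). Image names are placeholders.
import docker

client = docker.from_env()

# 1. LAMP stack (Linux, Apache, MySQL, PHP) running WordPress
client.containers.run(
    "example/lamp-wordpress",   # placeholder image
    name="poc-wordpress",
    ports={"80/tcp": 8080},
    detach=True,
)

# 2. Postfix relay forwarding all outbound mail
client.containers.run(
    "example/postfix-relay",    # placeholder image
    name="poc-mail",
    ports={"25/tcp": 2525},
    detach=True,
)

# 3. CVS server for source control
client.containers.run(
    "example/cvs-server",       # placeholder image
    name="poc-cvs",
    ports={"2401/tcp": 2401},
    detach=True,
)

# The script only talks to the local Docker daemon, so the same three containers
# come up unchanged on a VM in AWS, Google, or Azure, which is what makes a POC
# like this a quick way to compare IaaS providers.
```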
@ATT Launches #Cloud Storage Offer for Federal Government Agencies
Against the backdrop of a new survey demonstrating significant cost savings that can be attained by Federal Government adoption of cloud computing, AT&T on Tuesday announced a highly secure cloud-based storage solution designed specifically to meet the robust security requirements of Federal Government agencies.
The offer, AT&T Synaptic Storage as a Service for Government, is a multi-tenant, community cloud offer. It has the same features, policies, capabilities, and EMC Atmos technology as AT&T’s commercial cloud storage offer.
Cloud Clinique & GovCloud Network Deliver On Affordable Cloud Certification
In August, GovCloud Network and Tech Equity Ltd promised to deliver cloud education and training on a global basis. Today the companies are making good on that promise by announcing a Cloud Certification Best Practices Program for the European and North American markets.
Building upon the Cloud Clinique database of over 7,500 cloud best practice concepts across 18 separate domains, certified cloud practitioners and trainers will now be delivering high-quality cloud computing education in an affordable, pay-as-you-go, modular format. IT practitioners can attend any or all of the 16 topical webinar modules for only $25/€20 each. Attendees will also get free access to the database of best practice concepts on cloudclinique.com for 3 months!
@ThingsExpo | Wearables Market to Hit $8 Billion by 2018 (#IoT)
The global wearable electronics market is expected to cross $8 billion in 2018, growing at a healthy CAGR of 17.7% from 2013 to 2018. Total unit shipments are expected to cross 130 million units globally. In terms of products, wrist-wear accounted for the largest market revenue in 2012, with the combined revenue of the most established wearable electronic products, wrist-watches and wrist-bands, crossing $850 million. The potential of the industry can be gauged by the fact that both big, established players and small start-ups have upped the ante in the wearable electronics market.
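As a quick back-of-envelope check on those figures, an $8 billion market in 2018 growing at a 17.7% CAGR from 2013 implies a 2013 base of roughly $3.5 billion:

```python
# Back-of-envelope check of the quoted figures: what 2013 base market size is
# implied by an $8B 2018 market growing at a 17.7% CAGR over five years?
cagr = 0.177
years = 5                      # 2013 -> 2018
market_2018 = 8.0              # billions of USD

implied_2013 = market_2018 / (1 + cagr) ** years
print(f"Implied 2013 market size: ${implied_2013:.2f}B")   # ~ $3.5B

# And the forward projection from that base lands back on the headline number.
print(f"Projected 2018 market:    ${implied_2013 * (1 + cagr) ** years:.2f}B")
```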
@ThingsExpo | Next Generation @ARMEmbedded Processors (#IoT)
ARM is at the heart of the world’s most advanced digital products. Our technology enables the creation of new markets and the transformation of industries and society. We design scalable, energy-efficient processors and related technologies to deliver the intelligence in applications ranging from sensors to servers, including smartphones, tablets, enterprise infrastructure, and the Internet of Things.
100 ideas that changed the Web: Big data
© Ryoji Ikeda, data.tron [8k enhanced version], audiovisual installation, 2008-09. Photo by Liz Hingley
Every minute, 2 million searches are made, half a billion links are shared and 48 hours of footage are uploaded. That is a lot of data. And yet, in terms of how much is being produced worldwide, it barely scratches the surface. That is Big Data.
Big Data is the term used to describe data sets that are so large and complex that it takes a phenomenal amount of processing power to interrogate them. So why do it?
Fourteen seconds before the 2011 earthquake in Japan, every bullet train and every factory came to a halt. Many lives were saved thanks to the Quake-Catcher Network.
This network is made up of thousands of laptops with free software running in the background. The software makes use of the built-in sensors designed to protect the hard drive if the laptop is dropped. If there is an earthquake, all the sensors go off at the same time. By continuously aggregating and processing the data produced by all the sensors, it is possible to brace for impact before the earthquake strikes.
Fourteen seconds before, as it turns out.
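A toy sketch of that aggregation idea: one accelerometer trigger usually just means a dropped laptop, but many distinct machines triggering inside the same short window suggests the ground itself is moving. The threshold, window, and data format below are invented for illustration; the real network’s detection is far more sophisticated.

```python
# Toy illustration of the aggregation idea behind the Quake-Catcher Network:
# a single trigger is noise, but many distinct sensors agreeing within a short
# window is treated as a suspected quake. (Parameters are invented.)
from collections import deque
import time

WINDOW_SECONDS = 2.0    # how close together triggers must be
QUORUM = 50             # how many distinct sensors must agree

recent = deque()        # (timestamp, sensor_id) of recent triggers

def report_trigger(sensor_id: str, timestamp: float) -> bool:
    """Record one sensor trigger; return True if a quake is suspected."""
    recent.append((timestamp, sensor_id))
    # Drop triggers that have fallen out of the time window.
    while recent and timestamp - recent[0][0] > WINDOW_SECONDS:
        recent.popleft()
    distinct_sensors = {sid for _, sid in recent}
    return len(distinct_sensors) >= QUORUM

# Example: 60 different laptops all trigger within a fraction of a second.
now = time.time()
alerts = [report_trigger(f"laptop-{i}", now + i * 0.01) for i in range(60)]
print("Quake suspected:", any(alerts))
```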
In an increasingly connected world, our ability to capture and store data is staggering. We have sensors in everything, from running shoes to mobile phones. We are divulging more and more personal information to social networks. We supply more and more customer data to retailers, on and offline. Around 90 per cent of the data in the world today has been created in the last two years alone. Thanks to the Web, we have gone from information scarcity to information overload in two decades.
Big Data needs big computers to process it. The algorithms that crunch Big Data require thousands of servers running in parallel. Currently, only governments and web giants like Google and Amazon have the necessary resources.
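That parallel crunching boils down to a split-apply-combine pattern: shard the data, process each shard independently, then merge the partial results. Here is a miniature, purely illustrative version of the pattern; the “cluster” is just a local process pool, and it is the structure, not this script, that scales across thousands of servers.

```python
# Miniature map/reduce word count illustrating the split-apply-combine pattern
# that lets Big Data jobs spread across many machines.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk: str) -> Counter:
    """Map step: count words in one shard of the data."""
    return Counter(chunk.split())

def reduce_counts(partials) -> Counter:
    """Reduce step: merge the per-shard counts into one result."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    shards = ["big data needs big computers",
              "big data big algorithms",
              "thousands of servers running in parallel"]
    with Pool() as pool:
        partials = pool.map(map_count, shards)   # shards processed in parallel
    print(reduce_counts(partials).most_common(3))
```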
Barack Obama got elected off the back of it. Twice. By unifying vast commercial, political and social databases, his team was able to understand and influence individual swing voters. Google uses it to predict flu outbreaks, identify human trafficking hot spots and sell advertising.
When the Web was first conceived, it was intended to be more than an interconnected repository of information. The ultimate aim was the Semantic Web, a Web that drew meaning from information. Big Data is half the equation.
Extracted from 100 Ideas That Changed the Web by Jim Boulton, published by Laurence King Publishing, £9.95.
4chan user reportedly hacks iCloud with nude celeb pics – as Google strengthens security
An anonymous hacker on the 4chan site has published a series of naked photos of more than 100 celebrities, including Jennifer Lawrence, Kate Upton and Mary Elizabeth Winstead, after reportedly hacking into the users’ iCloud accounts.
Even though the photos appeared to originate from iCloud devices, and even though 4chan users claimed as much, it has not been confirmed that Apple’s cloud storage system was the source of the leak. Other theories are being bandied about as to how the photos were accessed, with one report pointing the finger of blame at Dropbox’s door.
The explicit pictures – not all of which have been verified at the time of writing – were posted on 4chan before circulating on Reddit and Twitter.
Lawrence’s agent confirmed the veracity of the pictures, and threatened legal action against those responsible.
“This is a flagrant violation of privacy,” the spokesperson told the BBC. “The authorities have been contacted and will prosecute anyone who posts the stolen photos of Jennifer Lawrence.”
Winstead, posting on Twitter, also confirmed the photos were of her, adding that because of their age she could “only imagine the creepy effort that went into this.”
Elsewhere, while iCloud accounts were being hacked, Google announced its Cloud Platform had received an upgraded set of security certificates.
The new documentation includes an updated ISO 27001 certificate, as well as SOC 2 and SOC 3 Type II audit reports.
“Keeping your data safe is at the core of what we do,” wrote Google Apps director of security Eran Feigenbaum in a blog post.
“These certifications, along with our existing offerings…help assure our customers and their regulators that we’re committed to keeping their data and that of their users secure, private and compliant,” he added.
In 2012, Christopher Chaney was jailed for 10 years after leaking nude pictures of actress Scarlett Johansson.
What do you make of this leak? Does this make you more concerned about the security of your cloud data?