Financial services organizations were among the earliest enterprise adopters of cloud computing. The ability to leverage massive compute, storage, and networking resources via RESTful APIs and automation tools like Chef and Puppet made it possible for their high-horsepower IT users to develop a whole new array of applications. Companies like Wells Fargo, Fidelity, and BBVA are visible, vocal, and engaged supporters of the OpenStack community, running production clouds for workloads ranging from dev/test and backup/DR to customer-facing applications, and gaining a competitive edge by rolling out new and improved software products faster than their peers.
DevOps Interview: Ryan Frantz By @VictorOps | @DevOpsSummit [#DevOps]
As I travel around the country and meet folks who are interested in what VictorOps is all about, I’m often pressed with the question, “How are you different from other alerting services?” My responses vary but more or less echo the same sentiment: the initial alert is just step one in the incident management lifecycle.
The responsibility of being on-call is a much larger role than simply “ACK’ing” an alert pushed to your mobile device. Now that you’ve received the page, it’s time to get busy.
Would you prefer to have a ton of valuable context delivered to you automatically with the alert, or would you rather go hunting for that information on your own? If you’ve ever been woken up in the middle of the night, responsible for diagnosing and resolving a huge problem, I think you know the answer.
Getting Started with @Pivotal Cloud Foundry & @AppDynamics | @CloudExpo [#Cloud]
The primary objective of a platform should be to provide a high level of automation, enabling easy management of applications and services while delivering consistent, error-free deployment. While this automation provides a critical foundation, additional specialized services can be layered on to make the applications deployed on the platform more manageable. To assist operators in this pursuit, the Pivotal Cloud Foundry platform provides a number of integrated services out of the box, including AppDynamics, New Relic, and CloudBees Jenkins. This post focuses on the out-of-the-box integration between Pivotal Cloud Foundry and AppDynamics.
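When a service such as AppDynamics is provisioned from the platform's marketplace and bound to an application, Cloud Foundry injects the service's credentials into the app's VCAP_SERVICES environment variable. Below is a minimal sketch of how an application might look up such a binding; the "appdynamics" label and the credential key names are assumptions for illustration, not details taken from the actual tile.

```python
import json
import os

def find_service_credentials(label_fragment):
    """Return the credentials dict of the first Cloud Foundry service
    binding whose label contains label_fragment, or None."""
    services = json.loads(os.environ.get("VCAP_SERVICES", "{}"))
    for label, instances in services.items():
        if label_fragment in label.lower() and instances:
            return instances[0].get("credentials", {})
    return None

# "appdynamics" is an assumed label; the real one depends on how the
# service broker registers itself in your marketplace.
creds = find_service_credentials("appdynamics")
if creds:
    # The credential key name here is hypothetical.
    print("AppDynamics controller:", creds.get("host-name"))
```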
WebRTC Archiving By @TokBox | @ThingsExpo [#IoT #WebRTC]
Developers working with WebRTC can now build native applications for Android, archive and play back live video communications within their applications, and take advantage of a range of enterprise-grade quality-enhancing features, all thanks to four new developments TokBox launched today.
OpenTok Archiving & Playback, released today into beta, allows developers to simply add video stream recording for their live video chat applications, save the conversation into a single H.264/AAC MP4 file, and download or stream it through the player of their choice.
The Android SDK for WebRTC allows developers, for the first time, to build WebRTC video chat functionality into native Android apps using the OpenTok Platform.
IoT Bootcamp Coming to New York By @IoT2040 | @ThingsExpo [#IoT]
The IoT Bootcamp is coming to Cloud Expo | @ThingsExpo on June 9-10 at the Javits Center in New York. Registration is now available at http://iotbootcamp.sys-con.com/

Instructor Janakiram MSV previously taught the famously successful Multi-Cloud Bootcamp at Cloud Expo | @ThingsExpo in November in Santa Clara. Now he is expanding the focus to the Internet of Things.
Janakiram is the founder and CTO of Get Cloud Ready Consulting, a niche cloud migration and cloud operations firm recently acquired by Aditi Technologies. He is a Microsoft Regional Director for Hyderabad, India, and one of the first few Microsoft Certified Azure Professionals in India.

Janakiram is also one of the first few professionals to hold the Amazon Certified Solution Architect, Amazon Certified Developer, and Amazon Certified SysOps Administrator credentials, and is recognized by Google as a Google Developer Expert (GDE) for his contributions to the cloud community.

Janakiram is also guest faculty at the International Institute of Information Technology (IIIT-H), where he teaches cloud computing, containers, big data, and DevOps to students enrolled in the Masters course.
I recently interviewed him about the upcoming IoT Bootcamp, and this is what he had to say:
Please describe the outline of the upcoming IoT Bootcamp.
The IoT Bootcamp is the first of its kind, covering the fundamentals of devices, cloud, and analytics.
During the first part, attendees will get to see Arduino, Raspberry Pi, and Intel Edison in action.
The second part is all about connecting the devices to the cloud. I will show how to use Azure, Amazon, and Google Cloud to stream and store sensor data.
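As one illustration of that second part, here is a minimal sketch that streams simulated sensor readings into Azure Event Hubs (one of the streaming services named later in this interview) using the azure-eventhub package; the connection string, hub name, and temperature payload are placeholders invented for the example:

```python
import json
import random
import time

from azure.eventhub import EventData, EventHubProducerClient

# Connection string and hub name are placeholders for an existing hub.
producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>",
    eventhub_name="sensor-readings",
)

with producer:
    for _ in range(10):
        # Simulated reading from a hypothetical Raspberry Pi device.
        reading = {"device": "rpi-01",
                   "temperature": round(random.uniform(18, 26), 2)}
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(reading)))
        producer.send_batch(batch)
        time.sleep(1)
```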
The third and final part is about logic and analytics. We will use advanced querying and visualization to make sense of the data. The bonus session is designing a smart home with all of the above; we will show how to connect the dots in designing a real-world home automation platform.
How does this build on the Cloud Bootcamp that you’ve previously taught at Cloud Expo?
The Cloud Bootcamp focused on the infrastructure aspects of AWS, Azure, and Google Cloud and was meant for IT professionals and DevOps teams. The IoT Bootcamp is all about utilizing the cloud to design solutions.
The IoT Bootcamp will start with IoT 101 and go all the way up to the advanced concepts of streaming and analytics. One common factor between both bootcamps is the multi-cloud aspect: we show how to perform common tasks on the top cloud platforms.
Who should attend?
The Bootcamp is meant for both technology and business decision makers. Like the Cloud Bootcamp, it strikes the right balance between technology and its application. Developers and IT pros can learn how to get started with development and deployment, while business professionals will be able to demystify the concepts involved with the Internet of Things.
How is it different from other bootcamps and workshops?
The methodology we follow to deliver the Bootcamp is quite different from others. The emphasis is on the application of technology rather than on PowerPoint slides.

There will be no sessions without a demo! Each session is designed to cover a key aspect of IoT delivered through a real-world demonstration. For many attendees, this will be their first workshop with exposure to such a wide range of devices, sensors, tools, and technologies.
What specific technologies do you cover?
The IoT Bootcamp offers comprehensive coverage of devices, cloud, and analytics. We cover the following technologies:
Devices: Arduino, Raspberry Pi, BeagleBone, Spark, Intel Galileo and Edison
Development Tools: Arduino, Cloud9, and Node-RED
Streaming: cloud-based streaming tools such as Azure Event Hubs
Storage: NoSQL storage such as Amazon DynamoDB (see the sketch after this list)
Querying & Analytics: real-time analytics through Google BigQuery and other tools
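To make the storage row concrete, here is a minimal sketch of persisting one sensor reading in Amazon DynamoDB with boto3; the table name, key schema, and attribute names are assumptions for illustration (the table would need a "device_id" hash key and a "ts" range key):

```python
import time
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("SensorReadings")  # hypothetical table

table.put_item(
    Item={
        "device_id": "rpi-01",          # hash key
        "ts": int(time.time()),         # range key (epoch seconds)
        "temperature": Decimal("22.5"), # DynamoDB numbers use Decimal
    }
)
```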
What will attendees take away from the IoT Bootcamp?
Attendees will walk away with the big picture of IoT. They will be able to demystify the concepts of sensors, protocols, streaming, storage and analytics.
The goal of the Bootcamp is to provide a 360-degree view of the IoT landscape. It is an ideal forum for attendees to explore what IoT means to their organizations and how to apply the concepts in their businesses.
How will attendees be able to apply this knowledge?
Attendees can jumpstart their IoT development right after the Bootcamp. They will be able to choose the right development board, tools, and cloud platform for their IoT solution.
7 Awesome Instagram Accounts Tech Geeks Should Follow
No selfies or food photos to be found on these awesome accounts. Instagram is undeniably a fun and quirky social network that allows you to share your life with friends through pictures (and more recently, video). But your friends and family aren’t the only ones worth following on Instagram—believe it or not, a lot of brands […]
Operationalizing the Network on Target By @LMacVittie | @DevOpsSummit [#DevOps]
Operationalizing the network continues to be a driving force behind DevOps and SDN. The ability to solve real problems by using programmability to automate and orchestrate infrastructure provisioning and configuration across the application release process remains the hope of many interested in one or the other, and oftentimes both.

A recent Avaya-sponsored Dynamic Markets survey (reg required) dove deep into the demesne of SDN and found that many of the problems companies have, and expect SDN to solve, are directly related to provisioning, configuration, and downtime concerns around services and applications across the network.
Sinopec taps Alibaba for cloud, analytics services
Aliyun, Alibaba’s cloud services division, is working with China Petroleum & Chemical Corporation (Sinopec) to roll out a set of cloud-based services and big data technologies to help the firm improve its exploration and production operations.
In a statement to BCN the companies said they will work together to roll out a “shared platform for building-based business systems, big data analytics” and other IT services tailored to the petroleum industry.
“We hope to be able to use Alibaba’s technology and experience in dealing with large-scale system architecture, multi-service data sharing, data applications in the large-scale petrochemical, oil and chemical industry operations,” Sinopec said.
The two companies also plan to explore the role of cloud and big data in connected vehicles.
Just last month Aliyun opened its first overseas datacentre in Silicon Valley, a move the Chinese e-commerce giant said will bolster its appeal to Chinese multinational companies.
The company has already firmed up partnerships with large multinationals including PayPal and Dutch electronics giant Philips. The company has five datacentres in China.
It would seem a number of large oil and gas firms have begun to warm to the cloud of late. Earlier this week Anadarko Petroleum Corporation announced it had signed a five-year deal that would see the firm roll out PetroDE’s cloud-based oil and gas field evaluation analytics service.
Equinix announces sixth London datacentre
Datacentre giant Equinix has announced the launch of its sixth London-based International Business Exchange (IBX) datacentre.
Equinix said the datacentre, LD6, will offer customers the ability to leverage its cloud interconnection service – which lets users create private network links to Microsoft Azure, Amazon Web Services (AWS) and Google Cloud services among others.
The company said the $79m facility, located in Slough, is extremely energy efficient (LEED Gold-accredited) and utilizes mass air cooling technology with indirect heat exchange and 100 percent natural ventilation.
It measures 86,000 square feet (8,000 square meters) and has capacity for 1,385 cabinets, with the ability to add another 1,385 cabinets in phase two of the facility’s development. Once phase two is complete, the Equinix London Slough campus will provide more than 388,000 square feet (36,000 square meters) of colocation space interconnected by more than a thousand dark fiber links.
“LD6 is one of the most technically advanced datacentres in the UK. It has been designed to ensure that we can continue to provide state-of-the-art colocation for our current and future customers,” said Russell Poole, managing director, Equinix UK. “This latest addition to our thriving London campus sets new standards in efficiency and sustainability.”
The facility is among five new datacentres announced last month. Equinix announced plans in March to roll out new state-of-the-art datacentres in New York, Singapore, Melbourne and Toronto.
Google boosts cloud-based big data services
Google announced a series of big data service updates to its cloud platform this week in a bid to strengthen its growing portfolio of data services.
The company announced the beta launch of Google Cloud Dataflow, a Java-based service that lets users build, deploy, and run data processing pipelines for tasks such as ETL, analytics, real-time computation, and process orchestration, while abstracting away infrastructure concerns like cluster management.
The service is integrated with Google’s monitoring tools and the company said it’s built from the ground up for fault-tolerance.
“We’ve been tackling challenging big data problems for more than a decade and are well aware of the difference that simple yet powerful data processing tools make. We have translated our experience from MapReduce, FlumeJava, and MillWheel into a single product, Google Cloud Dataflow,” the company explained in a recent blog post.
“It’s designed to reduce operational overhead and make programming and data analysis your only job, whether you’re a data scientist, data analyst or data-centric software developer. Along with other Google Cloud Platform big data services, Cloud Dataflow embodies the kind of highly productive and fully managed services designed to use big data, the cloud way.”
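Dataflow launched with a Java SDK, but the same programming model is available today through the Apache Beam Python SDK, which can target the Dataflow service. A minimal word-count sketch of the pipeline style it encourages; the bucket paths are placeholders, and by default the pipeline runs locally on the DirectRunner rather than on Dataflow itself:

```python
import apache_beam as beam

# Build and run a small batch pipeline: read lines, split into words,
# count occurrences per word, and write the formatted results.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read"  >> beam.io.ReadFromText("gs://my-bucket/input.txt")
        | "Words" >> beam.FlatMap(lambda line: line.split())
        | "Pair"  >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Fmt"   >> beam.Map(lambda kv: f"{kv[0]}: {kv[1]}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/counts")
    )
```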
The company also added a number of features to BigQuery, Google’s cloud SQL service, including row-level permissions for data protection, higher ingestion performance (the streaming limit was raised to 100,000 rows per second), and availability in Europe.
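For a sense of that streaming ingestion path, here is a minimal sketch using the modern google-cloud-bigquery client (which postdates this announcement); the project, dataset, table, and row schema are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

# Hypothetical rows matching a table with device_id and temperature columns.
rows = [
    {"device_id": "rpi-01", "temperature": 22.5},
    {"device_id": "rpi-02", "temperature": 19.1},
]

# Streaming insert into "project.dataset.table"; returns per-row errors.
errors = client.insert_rows_json("my-project.telemetry.readings", rows)
if errors:
    print("Failed rows:", errors)
```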
Google has largely focused its attention on other areas of the stack of late. The company has been driving its container scheduling and deployment project Kubernetes quite hard, along with its hybrid cloud partnerships (Mirantis, VMware). It also recently introduced a log analysis service for Google Cloud and App Engine users.