Today we can collect lots and lots of performance data. We build beautiful dashboards and even have fancy query languages to access and transform the data. Still, performance data remains a secret language that only a handful of people understand. As more of the business becomes digital, more stakeholders are interested in this data and in how it relates to the business, and some of them have never used a monitoring tool before. They have a question on their mind, like “How is my application doing?”, but no idea how to get a proper answer.
Monthly Archives: November 2016
Top #M2M Brand | @ThingsExpo #IoT #AI #ML #DL #DigitalTransformation
@GonzalezCarmen has been ranked the Number One Influencer and @ThingsExpo has been named the Number One Brand in the “M2M 2016: Top 100 Influencers and Brands” by Onalytica.
Onalytica analyzed tweets over the last 6 months mentioning the keywords M2M OR “Machine to Machine.” They then identified the top 100 most influential brands and individuals leading the discussion on Twitter.
How to Sponsor at @WebRTCSummit | #IoT #RTC #Java #UCaaS #WebRTC
WebRTC is the future of browser-to-browser communications, and it continues to make inroads into the traditional, plugin-heavy world of web communications. The 6th WebRTC Summit continues our tradition of delivering the latest and greatest presentations within the world of WebRTC. Topics include voice calling, video chat, P2P file sharing, and use cases that have already leveraged the power and convenience of WebRTC.
The Communication Challenge | @CloudExpo #API #Cloud #BigData
The question in a recent #CIOChat tweet chat (Weekend Edition) was, “What are the hardest tech skills to find for your IT organization? Do you groom them or recruit them?” A recurring theme that quickly appeared in the responses was communication skills. This was not surprising, as it is a frequent theme and topic of discussion. The ability of technologists to communicate well with the business is critical to the overall success of an organization. We all speak the same language, so what makes this such a difficult challenge? Surprisingly, I came across a great movie this weekend that helped me focus on that question.
[slides] Config as Code and Immutable Patterns | @CloudExpo @Azure #CD #APM #DevOps #Microservices
In his session at 19th Cloud Expo, Claude Remillard, Principal Program Manager in the Developer Division at Microsoft, contrasted how his team used config as code and immutable patterns for continuous delivery of microservices and apps to the cloud. He showed how immutable patterns help developers do away with much of the complexity of config as code, enabling scenarios such as rollback and zero-downtime upgrades with far greater simplicity. He also demoed building immutable pipelines in the cloud using both containers and VMs.
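The idea behind the immutable approach can be sketched in a few lines of Python. This is not Remillard's demo, just a toy model under the assumption that every deploy produces a versioned, never-modified release record, so rollback becomes a pointer swap rather than a config edit; all names here are illustrative.

from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)              # frozen: a release record is never modified
class Release:
    version: str                     # e.g. a git SHA or image tag
    image: str                       # fully qualified, versioned container image

class Pipeline:
    """Toy pipeline: each deploy appends a new immutable release;
    rollback re-points to an older one instead of editing config."""

    def __init__(self) -> None:
        self.history: List[Release] = []
        self.current: Optional[Release] = None

    def deploy(self, version: str, image: str) -> Release:
        release = Release(version, image)
        self.history.append(release)
        self.current = release
        return release

    def rollback(self, version: str) -> Release:
        release = next(r for r in self.history if r.version == version)
        self.current = release       # nothing is mutated, just switch the pointer
        return release

pipe = Pipeline()
pipe.deploy("1.0.0", "registry.example.com/app:1.0.0")
pipe.deploy("1.1.0", "registry.example.com/app:1.1.0")
pipe.rollback("1.0.0")               # rollback to a known-good release
print(pipe.current)

Because every release is a complete, versioned artifact, "going back" never requires untangling which settings were changed in place; the older release is simply promoted again.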
[session] #MachineLearning – It’s All About the Data | @CloudExpo #BigData
Data is the fuel that drives the machine learning algorithmic engines and ultimately provides the business value.
In his session at 20th Cloud Expo, Ed Featherston, director/senior enterprise architect at Collaborative Consulting, will discuss the key considerations around quality, volume, timeliness, and pedigree that must be dealt with in order to properly fuel that engine.
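As a rough illustration of what those considerations look like in practice, here is a small Python sketch that audits a batch of records for completeness (quality), staleness (timeliness), volume, and distinct sources (pedigree). The record format, field names, and thresholds are assumptions made for illustration, not anything from the session.

from datetime import datetime, timedelta

# Hypothetical record format; fields and dates are made up for illustration.
records = [
    {"value": 42.0, "source": "sensor-a", "ts": datetime(2016, 11, 20)},
    {"value": None, "source": "sensor-b", "ts": datetime(2016, 11, 1)},
    {"value": 17.5, "source": "sensor-a", "ts": datetime(2016, 11, 19)},
]

def audit(records, now, max_age_days=7):
    """Rough readiness checks before feeding data to a model:
    quality, volume, timeliness, and pedigree."""
    complete = [r for r in records if r["value"] is not None]        # quality
    fresh = [r for r in complete
             if now - r["ts"] <= timedelta(days=max_age_days)]       # timeliness
    sources = {r["source"] for r in records}                         # pedigree
    return {
        "volume": len(records),
        "complete_ratio": len(complete) / len(records),
        "fresh_ratio": len(fresh) / len(records),
        "distinct_sources": len(sources),
    }

print(audit(records, now=datetime(2016, 11, 21)))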
Hostway to Provide Computing Infrastructure for Core dna Platform | @CloudExpo #API #Cloud
Hostway has announced an agreement with Core dna to provide the computing infrastructure for Core dna’s software-as-a-service (SaaS) platform that is used to build marketing, e-commerce, intranets and community portal solutions.
In addition, Hostway is making available for the first time the option to deploy Core dna on a dedicated private cloud for enterprise customers that require high levels of security for regulatory compliance. These custom configurations are available now and are priced individually.
Migrating from Desktop to Cloud-Based Software | @CloudExpo #API #Cloud #Security
Seeing how a chorus of business leaders and tech innovators has sung praises to the cloud for quite some time, you’re probably aware that cloud adoption is flourishing. According to research from cloud solutions provider RightScale, roughly 93% of businesses today are using cloud technology in some form or another. If you’re not that familiar with the technology, you might be wondering: why is the cloud so popular? Furthermore, you’re probably also asking whether cloud services are right for your organization.
Google Unveils Machine Learning APIs
In its latest move to take on competitors in the cloud market, Google has unveiled a set of machine learning tools, APIs, and services. The announcement also signifies a big push into the machine learning and artificial intelligence market for the search-engine giant.
A slew of announcements were made in this regard. Here’s a look at each:
Google Cloud Jobs API
This API will make it easier for companies to find candidates for open positions, as it uses machine learning to better match candidates’ skills to existing openings. The API is available only in the US and Canada for now, though Google plans to expand it to other countries soon.
As soon as the announcement was made, two leading job sites, CareerBuilder.com and Dice.com, adopted the API and built prototypes within just 48 hours. Currently, the API is available in limited alpha on both sites.
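The alpha API itself is a black box, but the matching problem it addresses is easy to picture. The Python sketch below uses a naive skill-set overlap score as a stand-in for Google’s learned model; the jobs, skills, and scoring here are invented purely for illustration.

def overlap(candidate_skills, job_skills):
    """Jaccard similarity between two skill sets (0 = no overlap, 1 = identical)."""
    a, b = set(candidate_skills), set(job_skills)
    return len(a & b) / len(a | b) if a | b else 0.0

# Made-up openings and candidate profile.
jobs = {
    "backend-engineer": {"python", "sql", "docker"},
    "data-analyst": {"sql", "excel", "tableau"},
}
candidate = {"python", "sql", "git"}

ranked = sorted(jobs, key=lambda j: overlap(candidate, jobs[j]), reverse=True)
print(ranked)   # backend-engineer ranks above data-analyst for this candidate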
Cloud Vision API
The next announcement pertains to Google’s popular Cloud Vision API. The company has decided to reduce the price of this service by almost 80 percent, thanks to the significant savings that come from running it on custom TPU chips, which accelerate machine learning workloads and greatly improve performance and efficiency. That makes the API a far more compelling product for customers.
In addition, Google has enhanced the API’s image recognition and added the ability to identify landmarks, logos, and other entities.
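For a sense of what calling the API looks like, here is a minimal Python sketch against the public REST endpoint, requesting label, landmark, and logo detection for a single image. The API key and image path are placeholders, and pricing and quota details are outside the scope of this sketch.

import base64
import requests

API_KEY = "YOUR_API_KEY"     # placeholder; create a key in the Google Cloud Console
ENDPOINT = "https://vision.googleapis.com/v1/images:annotate?key=" + API_KEY

with open("photo.jpg", "rb") as f:                       # placeholder image file
    content = base64.b64encode(f.read()).decode("utf-8")

body = {
    "requests": [{
        "image": {"content": content},
        "features": [
            {"type": "LABEL_DETECTION", "maxResults": 5},
            {"type": "LANDMARK_DETECTION"},              # landmark recognition
            {"type": "LOGO_DETECTION"},                  # logo recognition
        ],
    }]
}

response = requests.post(ENDPOINT, json=body)
print(response.json())       # labels, landmarks, and logos found in the image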
Cloud Translation API
Google is also planning to release a premium version of another popular API, the Cloud Translation API. The premium tier is based on the Google Neural Machine Translation system, which uses machine learning to improve translation quality. It can reduce translation errors by anywhere between 55 and 85 percent, a significant jump compared to existing services. This is especially valuable in the travel industry, where getting the best and closest translation can make a substantial difference.
Currently, the premium API supports translation between English and eight other languages: Chinese, French, German, Japanese, Korean, Portuguese, Spanish, and Turkish. The company is expected to add more languages soon.
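Calling the standard v2 REST endpoint looks roughly like the Python sketch below. The API key is a placeholder, and the premium/NMT tier may require additional parameters (such as a model selector), so treat the details as assumptions rather than a definitive reference.

import requests

API_KEY = "YOUR_API_KEY"    # placeholder
URL = "https://translation.googleapis.com/language/translate/v2"

response = requests.post(URL, params={"key": API_KEY}, json={
    "q": "Where is the nearest train station?",
    "source": "en",
    "target": "ja",         # Japanese is among the supported languages
})
translation = response.json()["data"]["translations"][0]["translatedText"]
print(translation)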
Cloud Natural Language API
Google has also made its Cloud Natural Language API generally available. Also based on machine learning, this API lets users analyze the structure, sentiment, and meaning of text. In a sample demonstration, Google showed how news stories from the New York Times can be analyzed for sentiment. The same idea extends to digital marketing, where marketers can use the API’s sentiment analysis to monitor online product reviews and customer service interactions. The latest version also adds advanced features such as more granular sentiment analysis and better syntax analysis.
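A minimal sentiment call against the REST endpoint might look like the Python sketch below; the API key is a placeholder and the sample review text is invented, so this is an illustration rather than the exact demo Google showed.

import requests

API_KEY = "YOUR_API_KEY"    # placeholder
URL = ("https://language.googleapis.com/v1/documents:analyzeSentiment"
       "?key=" + API_KEY)

review = "The checkout flow is confusing, but support resolved my issue quickly."
response = requests.post(URL, json={
    "document": {"type": "PLAIN_TEXT", "content": review},
    "encodingType": "UTF8",
})
sentiment = response.json()["documentSentiment"]
print(sentiment["score"], sentiment["magnitude"])   # polarity and its strength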
Cloud Machine Learning Group
Lastly, Google is creating a Cloud Machine Learning Group headed by Fei-Fei Li, former director of Stanford’s AI Lab, and Jia Li, former research head of Snapchat. This group is expected to conduct many more experiments in AI and machine learning.
With these announcements, Google is cementing its place in the competitive cloud market.
How SimpliVity Gave Me Back My Weekend
At GreenPages, we have a well-outfitted lab environment that is used for customer-facing demos and as a sandbox for our technical team to learn, experiment, and test various solutions in the market. We’ve been in the process of refreshing the lab for a couple of months but have kept a skeleton environment up and running for simple administrative remote access. As part of the refresh, we had been cleaning up old VMs, systems, storage, etc. to reduce our footprint, and as part of the cleanup we moved several management VMs from an aging HP blade environment over to a small 2+1 SimpliVity OmniStack environment. I really didn’t think much about it at the time, as I just needed a place to put these VMs that had no tie to the older systems being decommissioned. The OmniStack also made sense because it had plenty of capacity and performance self-contained, eliminating any reliance on external storage and older compute environments.
I recently came back from a West Coast trip. While I was there, I needed to re-configure something so that a colleague could do some other configuration work. I brought up my RDP client to log in to the jump box terminal server we use to administer the lab, and I got an error saying my profile wouldn’t load. So I VPNed in to check out the VM, logged in as the local administrator, and quickly discovered the box had been pwned with ransomware and a good majority of the data files (my profile included) were encrypted. After saying a few choice words to myself, I investigated and determined that an old lab account with a less-than-secure password had been used to access the system. I got the account disabled and started thinking about how long it was going to take me to either attempt to ‘clean’ the box and get the files decrypted (assuming I could even find a tool to do it) or to just trash and rebuild the box. I figured that was going to take up most of my weekend, but then I remembered that we had moved all of the management VMs over to the SimpliVity boxes.
For those who may not be aware, SimpliVity’s core value proposition is all around data protection via integrated backup, replication, and DR capabilities. I knew we had not configured any specific protection policies for those management VMs; we had simply dumped them into a newly created resource pool, but I figured it was worth a look. I logged into the vSphere client, took a look at the SimpliVity plugin for that terminal server VM and, lo and behold, it had been backed up and replicated on a regular basis from the moment it was put into the environment. From there, I simply went back a couple of days in the snap-in, right-clicked, and restored the VM. Within about half a second the VM had been restored; I powered it up, and within another five minutes I was logging into it via my RDP session from the West Coast. Bottom line: SimpliVity took a four- to six-hour process and transformed it into something that takes less than six minutes, so I suggest you check it out. Thank you SimpliVity, for being kind enough to donate some gear to our lab and for giving me some family time back this weekend!
By Chris Ward, CTO, GreenPages Technology Solutions
If you would like to discuss how SimpliVity could fit into your IT strategy, reach out to us here.