All posts by Lavanya

GE Cloud to Power India’s IIoT

The Industrial Internet of Things (IIoT) is seen as the next big wave of cloud applications, as more industries around the world look to tap into its benefits. The trend is visible not just in developed countries but also in emerging markets like India. To serve this sector, GE Cloud, in partnership with EY, is set to make a foray into the Indian market.

The partnership between EY and GE Cloud was launched globally in May to collect data from industrial machines and provide analytics for asset performance management (APM) and the optimization of business operations. The underlying software, Predix, is offered as a cloud-based Platform as a Service (PaaS) for industrial applications.

Built on the open source Cloud Foundry platform, Predix creates a detailed model that encompasses the operations of the entire organization and, from that model, delivers analytics that help improve overall operational efficiency.
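
To make the idea concrete, here is a toy sketch of the kind of rule an asset performance platform might run: flag a machine whose sensor readings drift beyond a rolling baseline. This is generic Python, not Predix code; all names and thresholds are illustrative.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=20, n_sigma=3.0):
    """Flag sensor samples that drift beyond a rolling baseline.

    A toy stand-in for the asset-performance analytics an IIoT
    platform runs over machine telemetry; thresholds are illustrative.
    """
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(readings[i] - mu) > n_sigma * sigma:
            alerts.append((i, readings[i]))  # (sample index, anomalous value)
    return alerts

# Example: a vibration sensor that spikes at sample 25
vibration = [1.0 + 0.01 * (i % 5) for i in range(25)] + [4.2]
print(flag_anomalies(vibration))  # -> [(25, 4.2)]
```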

GE’s foray into the Indian market augurs well for the company as well as its Indian customers. The capital goods market in India is estimated to be worth around $42 billion. However, the last few years have seen the sector lag, with total production growing by only 1.1%. The slow growth stems not only from lowered domestic demand but also from inefficiencies plaguing the industries. To improve efficiency and reduce operating costs, technology is the way forward, and IIoT can play a crucial role in bridging this gap. In that sense, the entry of GE’s Predix may be the silver lining the Indian market needs to improve its efficiency and its production.

For GE too, the Indian market is important simply because it’s too large to miss. Firstly, manufacturing is still a staple part of the Indian economy, unlike the US and other markets that have moved towards a service economy. Secondly, India is known for a young and educated population and a booming economy – the perfect recipe for an explosive increase in the demand for goods and services. As industries gear up to meet this demand, they need advanced analytics, and this is where GE’s Predix fits in. As the demand for industrial products grows, so will the need for predictive software, which means GE is possibly on the verge of stepping into a gold mine.

Despite this optimism and the benefits for both parties, there are still some roadblocks. First off, much of the Indian market is fragmented, with many industries dominated by small players. Reaching these smaller companies may be a difficult task for GE, as their funds for technological innovation may be limited. Secondly, technology adoption and the mindset to break away from traditional manufacturing practices may take some time, and GE should be ready for this waiting period. Thirdly, there is still a substantial amount of rigid bureaucracy and corruption in India, which can complicate implementation.

Even with these roadblocks, the entry of Predix bodes well for everyone involved, though the benefits may take some time to materialize.


Google Cloud Platform Extends Support for More Tools

Google Cloud Platform has extended support for a range of tools to make it easier for developers working across different platforms to integrate Google Cloud into their applications. One of the most prominent changes is extended support for applications built on ASP.NET – the open source web application framework created by Microsoft. With this change, related tools such as Visual Studio, C#, Microsoft SQL Server, and PowerShell also gain support on Google Cloud. Best of all, these APIs are open source and freely available on GitHub.

Let’s briefly go through each of these APIs.

C# Bindings

Google has long used innovative technologies internally, but its internal APIs did not directly benefit end users. That is why Google Cloud Platform exposes public APIs for capabilities such as machine learning and logging. The obvious advantage is that you can add any of these capabilities to your applications without ever having to worry about the complex infrastructure that executes them for you.

Clients for these public APIs exist across many platforms, including C#. In fact, the Google Cloud Platform services millions of requests from C# clients around the world every day. What’s new is support for APIs that require advanced features such as bidirectional streaming. To serve these, Google supports running them over gRPC, a high-performance universal RPC framework, instead of the regular HTTP/REST protocol.
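
The announcement is about C#, but the same public endpoints are reachable from any supported language. As a rough Python illustration, Cloud Pub/Sub’s streaming pull is one of the APIs that runs over a bidirectional gRPC stream rather than plain HTTP/REST; the project and subscription names below are placeholders.

```python
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

# Streaming pull is a bidirectional gRPC stream under the hood --
# the kind of API the new bindings target. Names are placeholders.
subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "my-subscription")

def callback(message):
    print("received:", message.data)
    message.ack()

streaming_pull = subscriber.subscribe(subscription, callback=callback)
try:
    streaming_pull.result(timeout=30)  # block while messages stream in
except TimeoutError:
    streaming_pull.cancel()
```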

PowerShell

Google Cloud’s tools for Windows PowerShell help you manage the different resources of the Google Cloud Platform with ease. These tools include:

  • Google Cloud DNS – This tool lets you manage Google Cloud DNS to publish your zone and record information without the burden of running your own DNS server.
  • Google Cloud SQL – This tool makes it convenient to set up, manage, and maintain your own MySQL database on the Google Cloud Platform.
  • Google Cloud Storage – This tool makes it easy to store and retrieve any amount of data at any time (see the Python sketch after this list). That flexibility covers a wide range of scenarios, such as serving web content, archiving data, and distributing large objects.
  • Google Compute Engine – This tool helps you create and run virtual machines on the Google Cloud Platform. Compute Engine offers many advanced capabilities that make it easy to launch large compute clusters, and it is known for the speed and consistency of performance that is sure to add value to your application.
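
The cmdlet names differ, but the operations map onto the regular client libraries. As a hedged sketch of the Cloud Storage store-and-retrieve flow, here is the equivalent with the google-cloud-storage Python client (bucket and file names are placeholders):

```python
from google.cloud import storage

client = storage.Client()                     # uses application default credentials
bucket = client.bucket("example-bucket")      # placeholder bucket name
blob = bucket.blob("reports/q3.csv")

blob.upload_from_filename("q3.csv")           # store an object
blob.download_to_filename("q3-restored.csv")  # retrieve it later
print([b.name for b in bucket.list_blobs()])  # list what the bucket holds
```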

Visual Studio

Cloud Tools for Visual Studio is a powerful way to build .NET applications and deploy them directly to the Cloud Platform. You can run and test your application locally, then deploy it to the cloud, all from within Visual Studio.

With these tools and support, Google plans to reach out to users across different platforms.


Dyn’s DDoS Attack – What it Means for the Cloud

Prominent websites like Twitter, Netflix, Airbnb, and Spotify suffered sporadic outages on Friday, thanks to a Distributed Denial of Service (DDoS) attack on Dyn’s servers. Dyn is one of the largest managed DNS providers in the world, so an attack on its servers took down a significant chunk of the DNS (the Internet’s address directory). DNS works much like a phone book, translating names into addresses so that users can connect to different websites and applications. When the DNS servers were attacked, users could not reach the corresponding IP addresses.
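
The phone-book lookup is easy to see from code. A resolver such as Dyn’s answers exactly this kind of query; a minimal Python sketch:

```python
import socket

# Ask the configured DNS resolver to translate a name into an address --
# the lookup that failed for many users during the Dyn outage.
print(socket.gethostbyname("twitter.com"))  # prints one of the site's IP addresses
```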

In most DDoS attacks, the underlying information is intact but temporarily unavailable. In this case, though, Dyn’s core DNS infrastructure was knocked out, so any organization that depends on Dyn directly, or on a service provider that uses Dyn’s servers, was affected.

Besides websites, a whole range of Internet of Things (IoT) devices hooked up to the Internet were also affected; cameras, baby monitors, and home routers were among them. Corporate applications used to perform critical business operations went down too, racking up huge losses for many companies.

Of these companies, the worst affected were the ones that rely on SaaS for critical business operations. The attack, in many ways, exposes a vulnerability of cloud computing: the consequences of depending on third-party servers for the most critical of operations. Had these companies used multiple DNS providers, or kept their most critical business applications on local servers, the impact of such an attack would have been greatly reduced.
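
The real mitigation is serving a zone from two or more independent DNS providers, but the failover idea is easy to sketch from the client side. A minimal Python example, assuming the third-party dnspython package (the addresses shown are the public Google and Cloudflare resolvers, standing in for two independent providers):

```python
import dns.resolver  # third-party package: dnspython

def resolve_with_fallback(name, providers=("8.8.8.8", "1.1.1.1")):
    """Try independent resolvers in turn, so one provider's outage
    does not take the lookup down with it."""
    for ip in providers:
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [ip]
        try:
            return [r.to_text() for r in resolver.resolve(name, "A")]
        except Exception:
            continue  # this provider is unreachable; try the next one
    raise RuntimeError(f"all resolvers failed for {name}")

print(resolve_with_fallback("example.com"))
```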

Going forward, what does it mean for businesses that depend on the cloud?

First off, this was a complex attack, believed to have been carried out by a large group of hackers. The nature and source of the attack are still under investigation, so at this point it’s hard to tell who was behind it. But attacks this complex can’t happen every day, as they require enormous amounts of planning and coordination. That said, Verisign recently published a report showing a 75 percent increase in such attacks between April and June. How much of that translated into losses for companies? Only a minuscule amount compared to the direct security attacks companies face.

Secondly, we’ve come too far with the cloud to imagine a world without it. SaaS, PaaS, and IaaS have become integral to businesses, and the benefits they bring are enormous. Giving up those benefits to guard against a rare attack is not a sound decision.

From the above, we can say this DDoS attack is not going to change the cloud market overnight. It will, however, make users more aware of the cloud’s vulnerabilities, so they will be better prepared to handle such situations in the future. It is also a learning experience for companies like Dyn, which will look for ways to beef up their security arrangements.

In short, though the DDoS attack was dangerous and widespread, its impact on the cloud is likely to be minimal: the benefits of the cloud are huge, and such attacks remain rare compared with direct attacks on large companies.


Microsoft’s Earnings get a Big Boost from its Cloud Business

Microsoft’s cloud computing business went into acceleration mode during the last quarter, and this is evident from the latest earnings reported by the company on Thursday. According to a press release, Microsoft’s commercial cloud business, including the sale of cloud-based Office 365, soared by 49 percent from the previous quarter.

Though the cloud business is not a separate reporting segment, the segment covering online and on-premises versions of software such as Office 365 and Dynamics saw a six percent increase, with sales climbing to $6.7 billion. Revenue from Azure grew by 116 percent, and the company reports that usage of Azure’s compute service doubled over the same period.

These numbers signal several things for the company and for the industry as a whole. Firstly, a digital wave is reshaping the operations of many mainstream businesses, and Microsoft is making the most of it. The company’s overall strategy and long-term vision positioned it to ride this wave, and those efforts have clearly paid off.

Satya Nadella, the CEO of Microsoft, opines that digital technology is not just for startups in Silicon Valley; rather, it is becoming an integral part of the operations of every company. In this sense, almost every company is becoming a digital one, and Microsoft has positioned itself to serve these companies well.

The second significant aspect of these earnings is the shift in Microsoft’s business. Licensing software that customers install on their own computers was once a lucrative business for the company. Now, declines in that line of business are being offset by cloud sales, which shows that the cloud is overtaking software licensing as a revenue generator. Going forward, Microsoft may well increase its focus on the cloud business over its licensing segment, though there is no official confirmation of this.

Thirdly, these results confirm that Microsoft is one of the leading cloud providers today. Together with Amazon’s AWS, it holds a substantial share of the market, much of which can be attributed to Microsoft’s early investments in the cloud. Machine learning capabilities and a deep understanding of business have also contributed to this stellar growth.

The company is optimistic about the cloud market over the next year as well, and expects sales from its cloud-based platforms to keep growing. That is good news not just for Microsoft but for the tech industry as a whole, as many segments are still reeling from the economic slowdown.

All these factors pushed Microsoft’s shares to an all-time high of $60.75 in after-hours trading on Thursday. If that buoyancy carries into regular trading, the shares will set a new record; the highest price so far, $59.56, dates back to the dot-com bubble of 1999, and breaking that record would bring much cheer to investors.


Latest Changes in Google Cloud Platform – A Peek

Google is the latest cloud storage provider to announce changes to its service. At the GCP Next event in London, the company announced the launch of Coldline, a new cold storage service for archival data. It offers a cheap rate for data that customers expect to access less than once a year: 0.7 cents per gigabyte per month.

The announcement comes as something of a surprise, because Google’s rates are already among the lowest in the market. A closely related service, Nearline, already serves users who access their data less than once a month, at one cent per gigabyte per month, so a further price cut and a new product along the same lines were unexpected.

Besides Coldline, Google made a few other changes to its cloud storage services. Firstly, it cut the price of regular single-region storage by 23 percent, to two cents per gigabyte per month, starting November 1st. In addition, calls to its Application Programming Interfaces (APIs) will cost only half a cent per thousand operations – a whopping 50 percent price cut. The rate applies to both regional and multi-regional storage for what are known as Class A API calls.
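
Using the prices quoted above, a quick back-of-the-envelope comparison in Python for a 10 TB archive (taking 1 TB as roughly 1,000 GB for simplicity):

```python
gb = 10_000  # a 10 TB archive, at roughly 1,000 GB per TB

monthly_cost = {
    "standard (single-region)": gb * 0.020,  # 2.0 cents/GB/month after the cut
    "nearline":                 gb * 0.010,  # 1.0 cent/GB/month
    "coldline":                 gb * 0.007,  # 0.7 cents/GB/month
}
for tier, dollars in monthly_cost.items():
    print(f"{tier:>25}: ${dollars:,.2f}/month")
# Coldline holds the same archive for roughly a third of the standard rate.
```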

If Coldline is one end of the spectrum, the Multi-Regional Cloud Storage service is the other. It is aimed at customers who require incredibly high levels of data availability: Google replicates their data across cloud storage centers in different regions, keeping latency low so customers can access data quickly from any location.

Both offerings are designed to capture a larger share of a highly competitive cloud storage industry by widening Google’s reach to more types of customers with varying storage needs.

Another interesting change is that Google now allows customers to move their data from one tier to another at any time, regardless of the bucket in which the data is stored. This is a significant change, and one corporate customers have been requesting for some time, as it helps them make the most of economical IT resources without compromising on user needs or regulatory stipulations.
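
With the Python client library, such a tier move reduces to a single rewrite call (a sketch assuming the google-cloud-storage package; the bucket and object names are placeholders):

```python
from google.cloud import storage

client = storage.Client()
blob = client.bucket("example-bucket").blob("logs/2016-10.tar.gz")

# Rewrite the object in place under the cheaper archival class --
# the tier move that no longer depends on which bucket holds the data.
blob.update_storage_class("COLDLINE")
print(blob.storage_class)  # now 'COLDLINE'
```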

These announcements come at a time when Amazon AWS and Microsoft have been making headlines with their cloud businesses, especially the new partnerships and offerings they have clinched in the recent past. For Google, the changes represent a significant shift in its cloud business as it gears up to take on AWS and Microsoft. Capturing a larger market share begins with excellent products at affordable rates, and Google is right on target. The next few months are sure to be interesting for Google, and for the cloud storage market as a whole.


Microsoft Rolls Out Custom Versions of Azure for the US DoD

Microsoft is known for providing customized products for government agencies at the federal, state, and local levels, to ensure that its products meet the requirements and certifications laid down for those agencies. Adding a feather to its cap, Microsoft announced on Tuesday that it would create custom versions of its cloud platforms, Azure and Office 365, to meet the Impact Level 5 requirements laid down by the US Department of Defense (DoD). These versions are expected to be available by the end of 2016, according to a press release.

In this version, Azure and Office 365 will be physically isolated in two new regions dedicated to them. According to the company, one region will be located in Arizona and the other in Texas, though the exact cities were not made public. To connect to these centers, other DoD systems can use Microsoft ExpressRoute – Microsoft’s private connection offering higher security and lower latency. The setup is expected to add an extra layer of security for data transmission, especially for access to information considered critical to national security.

With this setup, Microsoft can meet the next level of security requirements, namely the Impact Level 5 controls laid down by the DoD. The addition is significant for Microsoft, as it means Azure cloud products will handle National Security System data and other mission-critical information. In fact, Microsoft will be the only cloud provider offering a cloud that meets these stringent requirements, giving it an edge over its competitors in a crowded market. Currently, Amazon’s AWS is Level 4 compliant, while no such certifications are known for Google.

Earlier, Microsoft’s cloud was certified to handle data up to the DoD’s Impact Level 4, which covers controlled but unclassified data such as privacy information and protected health information. Impact Level 5 data is also unclassified, but includes data critical to national security.

With this new addition, the total number of regions for Azure Government services will go up to six: Virginia, Iowa, two unnamed data centers, and the two new ones. Microsoft says its Azure services are used by more than 70,000 customers in the government sector, with six million end users accessing its various cloud products.


How does Cloud Impact the Job Market?

Gone are the days when people worked in an office from 9 AM to 5 PM before heading home. Today, in a connected world, you can work at any time and from any place of your choice. Much of this convenience can be attributed to rapid advances in connectivity and the emergence of the cloud as a platform that brings workers together.

In general, if you need only a computer to do your work, you can do it from anywhere. Naturally, more people are taking to remote working because it gives them the freedom to balance the different aspects of their lives. It also reduces the need to take breaks from work: a young mother, for example, can keep working from home while caring for her infant, staying focused on her career without giving up her priorities at home. Such conveniences go a long way towards bringing more people into the economy, generating greater wealth for individuals, companies, and economies at large. Workers are also no longer restricted to a specific geographical area when looking for their dream job; the entire world is open to them.

For companies too, this is a convenient option, as they can cut back on overheads. They no longer need huge plush offices air-conditioned day and night, which brings operating expenses down substantially. Nor are they restricted when hiring talented people: they can hire anyone, located anywhere in the world, so they can always have the best talent.

Thanks to these conveniences for both employers and employees, more people are looking at this option. In fact, it is estimated that more than three million Americans already work on cloud-based platforms like Upwork, CrowdFlower, and Amazon Mechanical Turk, and it won’t be long before more people take this route. A report by London Business School projects that more than one-third of the workforce will be working remotely by 2020.

Much of this remote working has been made possible by the cloud. Since the technology lets users store and access their files on virtual servers rather than on a particular computer’s hard drive, they can reach those files from any device and any location. Many of the applications they use also live on cloud servers, giving them the flexibility to access those apps from anywhere. Cloud tools like SugarSync allow real-time collaboration, which means workers in different parts of the world can work on the same document at the same time.

As the cloud becomes more sophisticated, more jobs are likely to become doable remotely. If you work in data entry, programming, content creation, design, or customer service, you may already be working remotely. Soon, teachers, lawyers, psychologists, counselors, researchers, nurses, paralegals, and others will join the bandwagon.


IBM Bags A Cloud Contract from the US Army

IBM has been awarded a contract to run a pilot program that could lay the groundwork for the company to build, own, and operate data centers on behalf of the US Army. The contract, worth $62 million, is called the Army Private Cloud Enterprise, and it is the first step the US Army has taken to tap the expertise of the commercial IT industry to run a large-scale data center on its behalf.

The contract document itself was not released, so the full scope of the project is unknown. Press releases indicate that IBM will get one base year and four option years to build a data center and manage it for the Army. The new data center would start off as a migration point for the systems and applications currently hosted at various government data centers at Redstone Arsenal in Huntsville, Alabama. Other Army systems, spanning all its operations, are expected to move to this center within the next five years, provided there are no setbacks along the way.

Though this award had been in the offing for some time, it’s still a surprise, as the Army deals with large amounts of classified data, including secret-level data that is hugely sensitive and can have immediate ramifications for national security. Despite that level of confidentiality, the Army has chosen a private company to run data centers on its behalf. Why?

Cloud computing offers benefits that are hard for any organization to ignore, and the Army is no exception. The award, in many ways, represents the first step in implementing the Army’s cloud computing strategy, which aims to create an excellent user experience, improve mission command, and reduce IT costs as well as the Army’s overall fiscal footprint.

Also, Redstone Arsenal is considered a secure site, so it makes an ideal location to try out the idea of a private cloud for the Army within the gates of its own military establishment. In addition, the Army plans to implement the secret-level controls needed to handle such highly sensitive data.

The contract is sure to have a substantial positive impact for the Army, most importantly the chance to retire inefficient data centers run by various government agencies. The Army currently runs anywhere between 200 and 1,200 data centers, most of them operated under the guidance of the Office of Management and Budget (OMB). With the contract in place, it plans to close at least 350 of these data centers over the next two years. At Redstone Arsenal alone, it owns 11 of the 24 data centers operating there, and over the next couple of years it wants to consolidate all its information and applications within those 11. Such a move is sure to save the government a great deal of taxpayer money, which can be redirected towards social, welfare, and economic programs.


VMware’s Software on Amazon Cloud – Surprise!

For many years, VMware and Amazon have stood on opposite sides of the infrastructure world. While VMware asked customers to run their businesses on their own servers, Amazon always encouraged companies to move to the cloud. Now, the two companies have teamed up to integrate their views and services.

Beginning next year, VMware’s software will run on the Amazon cloud, letting VMware customers use their existing tools to manage their servers – except that those servers will be located in the cloud. Alongside, users can also make use of Amazon’s database and storage tools and services.

You can already run VMware’s virtual machines on Amazon’s cloud, and even use VMware’s management tool, vCenter, to manage them. So what’s different about this partnership? To start with, the two companies will create a new version of the Amazon cloud that allows VMware’s virtual machines to run directly on the cloud, without Amazon software in between. The partnership also gives users the flexibility to run their software both in the cloud and in their existing data centers.

In many ways, this partnership reiterates that the cloud is the present and the future, and no business can afford to ignore it. It is also a significant milestone in the world of cloud computing, as VMware has gone from seeing Amazon as a rival to admitting that its products are the future.

In fact, VMware made a brief foray into the cloud with its own product, vCloud Air, but it never really took off. As a strategic move, the company has decided to focus on its core business of running virtual machines while expanding into capabilities that let those virtual machines run in the cloud. This way, VMware can cater both to businesses that want to stay in their own data centers and to businesses that want to move to the cloud, which is likely to expand its market reach and customer base as it copes with a changing digital environment.

The partnership is significant for Amazon too, as it is an opportunity to reach customers who haven’t yet migrated to the cloud. It would give Amazon a better foothold among conservative businesses that still keep data on local servers, for reasons ranging from lack of knowledge to security concerns. In fact, the partnership with VMware can help Amazon reach these customers before its rivals do, increasing its share of a competitive market.

According to International Data Corporation (IDC), the deal will matter most to VMware in the short term, but in the longer term Amazon will be the big beneficiary, as it can bring more corporate customers under the AWS umbrella. The biggest winners, of course, are customers, who can now run their operations both in their own VMware-equipped data centers and on the Amazon cloud.

 


A Look Into IBM’s Cloud Object Storage

IBM has set a high standard for cloud storage with its new service, Cloud Object Storage. The service allows organizations to store any amount of unstructured data on their own servers, in the cloud, or in any combination, at one of the lowest rates in the cloud storage market today. It will be available from October 13, 2016 in the US, and from April 1, 2017 in Europe.

The service is built on a technology called SecureSlice, which combines encryption, erasure coding, and geographic dispersal of data. Encryption encodes messages so that only authorized parties can view them. Erasure coding is another way of securing data: it breaks the data into segments, expands them, and encodes them with redundant data pieces. The fragments are then stored across different devices or geographical locations. The method is particularly useful for reconstructing corrupted data, and in this sense it is a strong replacement for RAID systems, since the time and overhead needed to rebuild data are greatly reduced. Lastly, geographic dispersal spreads data across different locations for greater security and redundancy. IBM acquired the SecureSlice technology when it bought a company called CleverSafe last year for $1.3 billion.
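
Erasure coding itself is easy to demonstrate in miniature. The toy Python sketch below is nothing like SecureSlice’s real coding – it is the simplest possible 2+1 scheme – but it shows why losing a fragment is survivable: data is split into two fragments plus an XOR parity fragment, and any two of the three can rebuild the third.

```python
def split_with_parity(data: bytes):
    """Split data into two fragments plus an XOR parity fragment.

    A 2+1 toy analogue of erasure coding: any one of the three
    fragments may be lost and later rebuilt from the other two.
    """
    half = (len(data) + 1) // 2
    a, b = data[:half], data[half:].ljust(half, b"\0")
    parity = bytes(x ^ y for x, y in zip(a, b))
    return a, b, parity

def rebuild_from(x: bytes, y: bytes) -> bytes:
    """XOR any two surviving fragments to reconstruct the third."""
    return bytes(p ^ q for p, q in zip(x, y))

a, b, parity = split_with_parity(b"mission-critical records")
assert rebuild_from(b, parity) == a  # fragment 'a' was lost
assert rebuild_from(a, parity) == b  # fragment 'b' was lost
```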

Users have multiple storage options. One, called the Cross Regional Service, lets users spread their encoded data across three separate cloud regions in different parts of the world, while another, the Regional Service, stores it across multiple data centers within the same region. Whichever option customers choose, their data is secured and made redundant with the SecureSlice technology.

With this service and its options, IBM has extended the SecureSlice technology to hybrid clouds as well, giving customers more flexibility and scalability without compromising their control over in-house infrastructure. The product comes at a time when IDC has predicted that more than 80% of enterprises will adopt hybrid cloud architectures by 2018. By acquiring CleverSafe and extending its technology to hybrid cloud environments, IBM has made a strategic move to tap into this growing market.

In terms of cost, too, the service is likely to be a good deal for customers. IBM claims it costs 25 percent less than Amazon Web Services’ S3 storage service, and it believes that many customers already using IBM’s cloud services will be willing to adopt the technology. According to Russ Kennedy, VP of Product Strategy, users who run apps on Amazon Web Services can also use the service to store their data, as it supports the S3 interface. The same applies to OpenStack Swift customers, since Cloud Object Storage supports that API as well.
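
Because the service speaks the S3 API, an existing app can in principle be repointed by swapping the endpoint. A hedged sketch with Python’s boto3 library – the endpoint URL and credentials below are placeholders, not IBM’s published values:

```python
import boto3

# Point a standard S3 client at an S3-compatible endpoint instead of AWS.
# The endpoint and credentials are placeholders, not IBM's published values.
cos = boto3.client(
    "s3",
    endpoint_url="https://cos.example-region.objectstorage.example.com",
    aws_access_key_id="PLACEHOLDER_KEY",
    aws_secret_access_key="PLACEHOLDER_SECRET",
)

cos.put_object(Bucket="example-bucket", Key="backup.tar", Body=b"...")
response = cos.get_object(Bucket="example-bucket", Key="backup.tar")
print(response["ContentLength"])  # size of the object we just stored
```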

This service has already been deployed at a few early-adopter companies, and many more are expected to adopt it in the next few months.
