Category Archive: News

Oracle’s Profits Soar on Cloud Optimism

Oracle’s third quarter earnings soared past analysts’ expectations, as the company reported revenue of $9.3 billion against a consensus estimate of $9.25 billion.

Oracle’s revenue rose by three percent when compared to last year, and much of this can be attributed to Oracle’s transition into a cloud-based services provider. Three years ago, Oracle began making a big push towards the cloud market, and the efforts are visible in its revenue.

According to the company, revenue from its software-as-a-service and platform-as-a-service businesses jumped 85 percent to $1.1 billion, while total cloud revenue exceeded $1.2 billion, a 62 percent increase overall.

Its traditional software licensing business, on the other hand, fell by 16 percent. This shows that the company is focusing more on the transition to its cloud segment than on its traditional lines of business, which is no surprise considering the huge potential the cloud offers for service providers. In fact, CEO Safra Catz said that the hyper-growth happening in the cloud market has driven the company’s SaaS and PaaS business, and she hopes to capitalize on this trend over the coming years.

Overall, this has been an impressive financial performance from Oracle. It reported earnings of 69 cents per share, almost eight percent more than a year earlier and well above analysts’ expectations of 62 cents per share. The company also raised its dividend to 19 cents, up from 15 cents a year earlier.

Such a positive result sent the company’s share price up five percent to $45.18 in after-hours trading, which would make it the stock’s second-highest price ever, behind the $46.70 record set in December 2014.

With these results, the most painful part of the transition appears to be behind Oracle, and the company is set to take on competition from giants like AWS, Microsoft and Google.

In an earnings call with analysts, Larry Ellison, Oracle’s executive chairman, said that Oracle has a huge technology lead over AWS and Microsoft. He has repeatedly boasted about the company’s services, claiming they are cheaper and better than those of AWS or Microsoft.

To put this in perspective, Microsoft’s revenue increased by eight percent while Amazon’s rose by 47 percent over the same period. Given these numbers, Ellison’s comments look grandiose at best. Though Oracle may have the potential to overtake Microsoft or AWS in the future, AWS currently remains the undisputed leader in this market.

That said, the progress Oracle has made over the last three years is remarkable, and at this pace of growth it can close much of the gap with its competitors. What is impressive is how quickly the company has moved into the cloud market, with relatively little negative impact on its revenues along the way.

In all, Oracle is making rapid strides, but still has a long way to go to catch up with its competitors.


Societe Generale Leads the Way in Cloud Adoption

Societe Generale, the Paris-based bank, is looking to leverage the cloud to lower its costs and provide better services to its customers. Eventually, it wants to become the largest European bank to adopt cloud computing for a significant share of its operations.

To this end, it has entered into agreements with Microsoft and Amazon. In fact, Societe Generale’s developers and engineers have been running pilot programs on both Azure and AWS for more than a year to test the security and reliability of these public cloud platforms. More importantly, the pilots looked into the feasibility of using the public cloud for banking transactions, where confidential information about users and processes is stored in giant third-party data centers, often far away.

So far, the tests have been satisfactory, and Societe Generale wants to begin using public cloud services by June. Initially, it plans to start with non-client, non-sensitive content such as financial research and marketing data. Depending on how these changes go, the bank eventually plans to have 80 percent of its infrastructure and data on internal and external cloud systems.

One of the challenges that Societe Generale, or for that matter any European bank, will face is the regulatory framework laid down by the ECB. Currently, the ECB restricts banks to using the cloud only for non-sensitive data and operations such as product development. However, these regulations are expected to ease in the near future, because banks are under greater pressure than ever to reduce costs and improve efficiency.

According to IBM, moving to the cloud can save banks about ten percent of the budget allocated to information technology and operations, to start with. Continued use of cloud can allow banks to save almost 40 percent of costs because they can do away with systems that are not needed anymore. At the same time, their investment in capital infrastructure is also greatly reduced when they have their operations in the cloud.

Intense competition between banks is another factor that can push them toward the cloud. Competition has made profit margins hard to come by, so more banks are looking to move their operations to the cloud to take advantage of its lower costs. On top of that, the millennial generation wants a digital banking experience in which everything is customized to their specific needs. To cater to this demand, banks are forced to embrace advanced technologies, and again they want to leverage the power of the cloud to run them.

A case in point is big data, which banks can use to better understand their customers and their expectations. These insights allow them to provide more customized service, which in turn goes a long way toward retaining existing customers and attracting new ones.

Because of these advantages, some financial institutions have already started moving to the cloud. HSBC Holdings has partnered with Google, while Capital One Financial Corp has partnered with AWS to move its operations to the cloud. The trend has now reached Europe as well, with Societe Generale leading the way.


Maryland is all set to move its services to the cloud

Maryland is leading the way in moving its services to the cloud, as it has started building a cloud-based platform to upgrade its customer processes, specifically its human services technology infrastructure.

Maryland is creating a unique product called the Total Human Services Information Network, dubbed MD THINK. This is a cloud-based repository that will offer integrated data to all the programs and projects handled by the departments of Human Services, Licensing, Regulation, Juvenile Services, and Health and Mental Hygiene.

The obvious advantage of such a system is that users can get the information they need from a single repository. In the past, data was siloed in different departments, making it difficult for one department to access information held by another. As a result, no department ever had a complete picture of an individual, decisions could not be made in full context, and the success rate of many programs suffered, not to mention the hardship this caused for the residents of Maryland.

MD THINK was conceived to overcome this problem. Gov. Larry Hogan and his administration are working to make the project a reality, investing $14 million from the budget allocated for fiscal year 2017.

According to experts, MD THINK will use a scalable, cloud-based platform that streamlines these processes and consolidates the data into a single repository for storage and retrieval. The first phase will focus on children and families, so that social workers can better cater to their needs.

In addition, as part of MD THINK, case workers will be given tablets to use in the field, allowing them to record information right away and look up any pertinent information that helps them offer better service.

There are many advantages to such a program. First off, Maryland is likely to save a great deal of time and money, as it can streamline its resources and use them more productively. More importantly, it will be able to gather analytics that provide deep insights into program beneficiaries, how the programs help them, and what more can be done to improve the quality of life of Maryland citizens.

According to Gov. Hogan, this project will transform the way the state of Maryland delivers human services to its residents and will finally bring the delivery of government services into the 21st century.

This cloud-based project, in many ways, reflects the power of technology and how it can be used to make our society a better place to live. Though the benefits of the cloud are well understood, not all government services are ready to move to it, due to a combination of factors such as existing legacy systems, the cost of migrating data and applications, budget constraints, the mindset of policymakers, and more.

Hopefully, this step by Maryland paves the way for other states to follow suit and embrace the cloud in a big way.


SUSE is HPE’s Main Linux Provider

SUSE, a major Linux provider, has recently entered into an agreement with HPE under which the two companies will tap into each other’s assets.

Under the terms of this partnership, SUSE will acquire HPE’s cloud assets, including HPE OpenStack and HPE Stackato. SUSE plans to use these assets to expand its own OpenStack infrastructure-as-a-service (IaaS) solution, which in turn will accelerate its entry into the Cloud Foundry platform-as-a-service (PaaS) market.

According to a release from the company, the OpenStack assets from HPE will be integrated into SUSE’s own OpenStack cloud, helping SUSE bring to market a certified, enterprise-ready solution for clients and customers in the SUSE ecosystem.

Beyond the asset sale, HPE has also named SUSE its preferred open-source partner for Linux, OpenStack and Cloud Foundry solutions.

While it may sound like SUSE is the main beneficiary of this partnership, in reality it’s a win-win for both companies. Under the partnership, HPE will use SUSE’s OpenStack cloud and Cloud Foundry solutions as the foundation for its Helion Stackato and Helion OpenStack products. HPE believes that by partnering with SUSE, it can provide best-in-class PaaS solutions that are simple to deploy in its customers’ multi-cloud environments.

Under these terms, HPE and SUSE will hire programmers and do the cloud development work together, while HPE will sell, deploy and support the resulting services. Of course, this is not an exclusive partnership, and SUSE remains open to finding other partners in the future.

The two companies also have a non-exclusive agreement under which HPE may use SUSE’s OpenStack IaaS and Cloud Foundry PaaS technology in developing its own Stackato and OpenStack platforms.

This agreement reflects the long and complex relationship between the two companies. HPE previously merged its non-core software assets with Micro Focus, the company that owns SUSE, and SUSE has long worked with HPE on the Linux side. This additional partnership will further cement that relationship in the long run.

So, how is this beneficial to everyone involved?

For SUSE, the partnership marks its evolution from a Linux provider into a full-fledged cloud software company, putting it in a better position to take on competition from the likes of Red Hat and Canonical. It signals a strong entry into the cloud, and unlike those two rivals, SUSE now has a major hardware maker in HPE as a partner.

As for HPE, the joint development effort can greatly cut the time and resources needed to create applications. In today’s competitive market, getting products to market quickly is key, and sharing the work with SUSE can help HPE speed up its development process.

Due to the enormous benefits for both these companies and for the cloud industry as a whole, this partnership can be a significant one for everyone involved.


eBay is Planning its Own Public Cloud-Based Platform

eBay, the world’s largest marketplace, has launched its own public cloud-based platform that will run parallel to its existing on-premise system. This is the first step taken by the company to fully migrate its platform from an on-premise one to the cloud. The company claims that it made this transition within a period of six months, which is no ordinary feat, considering that at any given moment, eBay has at least one billion listings spread across the 200 countries in which it operates.

If you’re wondering why it transitioned to the cloud, the answer is simple. The cloud offers a ton of benefits for businesses, which is partly why almost every major company in the world now runs some or all of its major applications in the cloud. eBay wanted to follow suit and tap into the many advantages that come with it.

The next obvious question is why it chose to build a public cloud-based platform from scratch instead of using one of the prominent platforms like AWS or Azure. There are several reasons.

First off, eBay wants to cater to a growing digital customer base, as new generations of tech-savvy buyers and sellers emerge. With its own platform, it has the flexibility to customize and add on all the features it wants.

Secondly, eBay has a higher level of control when it owns the cloud than when it stores all its data in a system owned by another company. Say it kept all its important applications on AWS: what happens if Amazon decides to expand a business that competes directly with eBay? Technically, this shouldn’t affect eBay’s operations, but it can still create doubts in the minds of its customers. Moreover, when eBay runs its own cloud, it has greater visibility into what’s going on inside its operations.

The third factor is cost. In general, the cloud is cheaper than investing in capital items like servers and data centers. However, this holds only while your usage is small. Remember, most cloud providers use a pay-as-you-go model, which means that as your usage grows, so does your bill.

As your business grows, there comes a point where the subscription fees you pay for a cloud service exceed what the equivalent capital expenditure would have been, and by then it may be too late to turn back. To avoid that situation, eBay decided to build its own platform. This is a sensible strategy given that eBay already has more than one billion listings, a number that is only likely to grow over the next few years.
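
To make the break-even argument concrete, here is a minimal sketch in Python with entirely hypothetical numbers (the article gives no actual figures): it compares a pay-as-you-go bill that scales with usage against the largely fixed cost of owned infrastructure, and shows where the crossover happens.

```python
# Hypothetical cost model: at what usage level does owning
# infrastructure become cheaper than pay-as-you-go cloud?
# All figures below are illustrative assumptions, not eBay's numbers.

def cloud_cost(tb_stored: float, price_per_tb_month: float = 23.0) -> float:
    """Monthly pay-as-you-go bill: grows linearly with usage."""
    return tb_stored * price_per_tb_month

def owned_cost(tb_stored: float,
               capex_per_month: float = 40_000.0,  # amortized servers/data center
               opex_per_tb_month: float = 3.0) -> float:
    """Monthly cost of owned infrastructure: big fixed cost, small variable cost."""
    return capex_per_month + tb_stored * opex_per_tb_month

if __name__ == "__main__":
    for tb in (100, 500, 1_000, 2_000, 5_000):
        c, o = cloud_cost(tb), owned_cost(tb)
        cheaper = "cloud" if c < o else "owned"
        print(f"{tb:>6} TB  cloud=${c:>10,.0f}  owned=${o:>10,.0f}  -> {cheaper}")
```

With these made-up rates, pay-as-you-go wins at small scale, but owning the infrastructure wins once usage passes roughly 2,000 TB; that crossover is the shape of the argument being made for eBay.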

Lastly, the preferences of buyers and sellers are changing rapidly, so eBay wants to be in a position to react well to these dynamic changes. This requires complete control over its operations, including its data management, and it is another compelling reason for eBay to launch its own cloud platform.

Overall, this is a great decision by the company, and hopefully it works well in the coming years.


The World’s First Commercial Quantum Computing Service is Here

Quantum computing is the technology of building computational systems based on quantum theory. Broadly speaking, it taps into quantum-mechanical effects at the atomic scale to store and process information. The obvious advantage of this technology is speed: for certain problems, a quantum computer can perform calculations far faster than any device we know today.

More specifically, quantum computing exploits the properties of subatomic physics, where units of information called quantum bits, or qubits, can exist in multiple states simultaneously. Unlike classical bits, which must be either 0 or 1, a qubit can be in a superposition of both states at once, and this flexibility opens up a world of possibilities for computing.
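
As a rough illustration of what superposition means in practice, here is a minimal sketch in Python with NumPy (purely illustrative and not tied to any vendor’s hardware): a qubit is described by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

# A qubit state is a 2-component complex vector (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1.  |alpha|^2 is the probability of measuring 0,
# |beta|^2 the probability of measuring 1.
zero = np.array([1.0, 0.0], dtype=complex)   # the classical "0" state

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                 # now (1/sqrt(2), 1/sqrt(2))
probs = np.abs(state) ** 2       # [0.5, 0.5]

# Simulate repeated measurements: roughly half 0s, half 1s.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)
print("P(0) =", probs[0].round(2), " P(1) =", probs[1].round(2))
print("empirical frequency of 1:", samples.mean())
```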

While the above is the theory, the practical side has been far more challenging. Getting enough qubits to work together reliably to run any algorithm is difficult, to say the least. Two main approaches have emerged to address this: the first traps individual ions in a vacuum using magnetic or electrical fields, while the second encodes qubits in microscopic superconducting circuits.

IBM has been one of the pioneers in quantum computing, and it has relied heavily on the second approach to build its systems. Recently, it announced that the world’s first commercial quantum computing service will be available later this year. Called IBM Q, the service will be accessible over the internet for a fee.

So, what can we expect from this system? A supercomputer that will outperform every computer in existence today? Not really.

What we can expect is a system that will play a crucial role in developing other quantum machines that can perform complex tasks, especially those that have been impossible with our current computing technologies.

IBM Q builds on the knowledge and research behind IBM’s existing cloud-based quantum ecosystem, Quantum Experience, which anyone can now access for free. Available to the public since May 2016, and recently given an upgraded interface, Quantum Experience lets thousands of researchers worldwide build the quantum algorithms they want without having to build their own quantum computer.
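
The article doesn’t say which toolkit researchers use, but to give a flavor of what building a quantum algorithm without owning a quantum computer looks like, here is a minimal sketch using IBM’s open-source Qiskit SDK (treat the specific library and calls as an assumption rather than something the article confirms): it constructs a small two-qubit entangling circuit of the kind that could then be submitted to a cloud backend.

```python
# pip install qiskit
from qiskit import QuantumCircuit

# A two-qubit "Bell state" circuit: a Hadamard gate on qubit 0 followed
# by a CNOT entangles the two qubits, so measurement should return
# 00 or 11 with roughly equal probability.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

print(qc.draw())  # inspect the circuit locally; actually running it
                  # requires a simulator or a cloud backend
```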

So far, IBM has not been forthcoming with details about IBM Q. It has not given a specific release date, and hasn’t said how powerful the system will be or how much access will cost. The only other information we have is that IBM has lined up its first set of clients, though it hasn’t named them.

Though we don’t have much information yet, the fact that the first commercial quantum computing service is on its way is big news in itself. It can pave the way for future developments in this industry and even propel it to great heights in the near future. In many ways, this is an exciting development simply because quantum computing can do things that were once believed to be impossible.


2nd Watch Gets a New Round of Funding

2nd Watch, a managed cloud services provider, has raised $19 million in Series D funding. The round was led by Delta-v Capital, with participation from Madrona Venture Group, Columbia Capital and Top Tier Capital Partners.

The Seattle-based company is likely to use the money to scale its cloud operations, add a managed cloud operations center in North Carolina, and hire people in sales, software, operations and client management. A good chunk of the money is expected to go toward serving its East Coast clients through a dedicated center in Raleigh.

2nd Watch is an AWS Premier Partner that provides a range of managed cloud services in the AWS ecosystem. It was founded in 2010 with a clear plan to design, build and manage public clouds in areas such as big data analytics, digital marketing and cloud analytics. Specifically, the company helps clients modernize their IT infrastructure and services in the cloud.

In 2010, it was one of the first companies to join the AWS partner network. It was among the first audited and approved AWS managed service partners, and it has the distinction of being the only cloud-native partner to earn SOC 2 compliance with a perfect score.

Over the years, it has added many prestigious clients such as Conde Nast, Coca-Cola, Lenovo, Motorola and Yamaha. To keep pace with this rapid growth, it has been adding more people to its rolls, with 20 hires in February alone. Overall, there are about 160 employees so far, and the company plans to grow to 200 people by the second half of 2017 to meet the demands of its growing client base.

In terms of its cloud presence, 2nd Watch claims that it has 400 enterprise workloads under management and more than 200,000 instances in its managed cloud services.

The success of this company once again highlights the growing cloud market and the many opportunities it presents for small and medium companies to carve out a niche for themselves. There are hundreds of companies today that offer specialized services, making the cloud a more attractive and feasible option for many clients around the world.

The company’s track record has helped it raise $56 million so far, and going forward it is expected to win more business and a larger client base. According to 2nd Watch CEO Doug Schneider, the firm doubled its revenue in 2016, much of which can be attributed to the growing interest of companies across different sectors in moving to the cloud. Almost every company today understands the power of the cloud and is expected, sooner or later, to move to it.

Schneider noted that to meet the growing needs of its current and future clients, the company needs more investment. Considering the astronomical growth it has seen over the last year, funding should never be an issue, as long as the money is channeled toward things that further propel growth.

It’s sure going to be an exciting ride for 2nd Watch.


GitHub Offers New Business Option

GitHub has now become accessible to clients who want to host their complete projects in the cloud. The company recently released a plan called the “Business” package, which gives customers the same features as those available on GitHub.com while their code is hosted on GitHub’s own servers.

According to GitHub CEO Chris Wanstrath, the package gives customers the choice to host their code online, away from their own servers. He said that clients want to host their code in the cloud because of the many benefits that come with it, and GitHub is simply making this possible.

This is a strategic move by the company, given that GitHub is the world’s most popular repository for storing programming code. An estimated 20 million-plus developers across more than one million teams use GitHub to store their code, there are roughly 52 million projects on GitHub, and almost 1,500 teams join daily. In fact, 22 of the 50 largest companies in the US use the service as their code repository.

These numbers show the enormous reach GitHub has in the programming community, and now it wants to further monetize that popularity. It already has an Enterprise offering under which companies can host their teams’ code in a private cloud. That service has been available since 2012 and costs $21 per month, and premium customers such as Walmart, Ford, IBM, John Deere, the government of the United Kingdom and NASA use it.

The new “Business” package is also priced at $21 per month, and with it the company wants to establish itself as a software company, not just a startup.

While these packages work well for large corporations, the company has other plans for small and medium businesses. A package called “Team”, priced at $9 per month, is aimed at small businesses, while the most basic package, “Developer”, costs $7 per month and is meant for individual users.

With these pricing options, GitHub is confident it will bring more customers into its fold. The cloud is already becoming the preferred option for individuals and companies to store their code because it is easy to manage. The fact that all the code is stored in a remote location also means greater data redundancy and better protection against natural calamities and disruptions. Since the cloud supports business continuity, more companies prefer to switch to it.

In addition, storing code in the cloud offers a high level of mobility and flexibility for employees. They are no longer restricted to office devices or the office network, and can work from practically anywhere, on any device. Since the current generation wants to strike the right balance between personal and work life, such an option can help attract top talent.

These are the advantages GitHub wants to tap into. With a promise of 99.99% uptime and attractive pricing, GitHub looks set to expand its reach.


Storj Labs Raises $3 Million

This is a great time to be a tech startup, as many angel investors and venture capitalists are looking for innovative products and ideas that can push technology to new heights. Over the last year, many startups have successfully raised seed capital and follow-on funding. The latest on this list is Storj Labs, which raised $3 million in funding from investors including Google Ventures, Qualcomm Ventures and Techstars.

Storj (pronounced “storage”) is a distributed cloud-storage provider that has created a decentralized, peer-to-peer solution. The service works by organizing users who are willing to rent out their spare hard drive space and bandwidth, and matching them with customers who need it. All of these users are connected through a peer-to-peer network.

In the past, one of the biggest challenges of having a peer-to-peer cloud sharing technology was security. How can users know that their private data will not be accessed by other users in the same network? In fact, this concern was one of the reasons why this idea never took off in the first place.

Storj claims to have found a secure solution to this problem. Its system lets users store data across this decentralized network securely, using blockchain features such as a transaction ledger, public/private key encryption and cryptographic hash functions. In this sense, Storj applies blockchain technology to ensure that files are safe and cannot be accessed by unauthorized users.

So, what’s the advantage of choosing Storj over the leading providers like AWS and Microsoft?

Cost. AWS and Microsoft need large data centers to store data, not to mention the expensive physical space and electricity needed to power them. That cost is passed on to developers and users. In addition, there is always the possibility that physical servers and networks will have problems, which can lead to data loss.

In stark contrast, the services offered by Storj are cost-effective because the money goes to the users who rent out their space. Since Storj does not have to pay for physical space or electricity itself, the cost per unit of consumption is low. Beyond cost, the decentralized model leaves users in control of their own devices and data.

There are no central servers to compromise, so the data is safer. In addition, the Storj system uses client-side encryption, which means only the end users hold the keys needed to decrypt their files. Security, in other words, is Storj’s biggest selling point.
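
To illustrate the general idea behind client-side encryption, here is a minimal sketch in Python using the widely available cryptography package (an illustration of the concept, not Storj’s actual protocol, and the upload step is a hypothetical placeholder): the file is encrypted locally before it ever leaves the user’s machine, so whoever stores the ciphertext never sees the key or the plaintext.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# 1. The key is generated and kept on the client; it is never uploaded.
key = Fernet.generate_key()
cipher = Fernet(key)

# 2. Encrypt locally before upload: the storage network only ever
#    sees an opaque ciphertext blob.
plaintext = b"contents of some private file ..."
ciphertext = cipher.encrypt(plaintext)

# 3. Whoever hosts the blob (a peer renting out disk space, in a
#    Storj-like model) cannot read it without the key.
# upload(ciphertext)   # hypothetical network call, not a real API

# 4. Only the key holder can decrypt after downloading it again.
assert cipher.decrypt(ciphertext) == plaintext
print("round trip OK, ciphertext length:", len(ciphertext))
```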

This unique model has seen a good response from both renters and customers. Currently, Storj has about 7,500 users who rent out their hard drive space, and about 15,000 API users worldwide who consume that storage.

Storj has also entered into an agreement with Heroku, a cloud-based platform-as-a-service (PaaS). This partnership helps developers to build and run their applications completely in the cloud, as Storj offers them a distributed object storage solution complete with advanced features such as encryption.

Little wonder then that Storj Labs has raised a funding of $3 million. It could be the first of many more to come.


Adobe Introduces New Cloud-Based Digital Signatures

Adobe has transformed the way we read, share and sign digital documents. The latest in this category is cloud-based digital signatures.

Now, Adobe Sign will offer cloud-based digital signatures on any device that uses the Adobe Document Cloud. This is a major step for Adobe and a significant one for the cloud community as a whole, as there have been many challenges associated with implementing digital signatures in the cloud.

Digital signatures differ substantially from simple e-signatures: they rely on certificates and a detailed verification process that can be time-consuming and labor-intensive. In addition, existing digital signature solutions are fragmented, with most of them proprietary.

To overcome this challenge, Adobe decided to build cloud-based digital signatures that are open to multiple certificate providers. The obvious advantage of such open standards is that they allow interoperability between solutions and help drive widespread adoption. In this sense, this could be the beginning of exciting times for the digital signature industry.
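
For readers unfamiliar with how a certificate-based digital signature differs from a drawn e-signature, here is a minimal sketch in Python using the cryptography package (an illustrative example of the underlying mechanism, not Adobe Sign’s or the consortium’s actual implementation): the document is hashed and signed with a private key, and anyone holding the corresponding public key, normally distributed in a certificate, can verify that the document has not been altered.

```python
# pip install cryptography
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In a real deployment the key pair is tied to a certificate from a
# trusted provider; here we generate one locally for illustration.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"Mortgage application, final draft"

# Sign: hash the document and sign the hash with the private key.
signature = private_key.sign(
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Verify: raises InvalidSignature if the document or signature changed.
public_key.verify(
    signature,
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("signature verified: document is intact and was signed by the key holder")
```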

Adobe is all set to embrace this trend. A press release from the company says its digital signatures are among the most advanced and secure types of electronic signature available today, and can be used for signing important documents like mortgage applications and healthcare forms.

Besides digital signatures, Adobe has added a host of other features that are sure to appeal to large teams, including document routing and integration with Microsoft SharePoint for easy signing and tracking.

All of these changes are based on recommendations from the Cloud Signature Consortium, a global network of industry experts working together to create standardized specifications for online document sharing and digital signatures. For some time now, this group has advocated broad, open standards, which can help build secure digital signature functionality across all devices and applications. Adobe helped form the consortium in 2016 and is now leading the way with the world’s first open, cloud-based digital signatures.

Alongside digital signatures, Adobe has added new functionality to its Adobe Sign technology that streamlines the flow of documents and tasks across different teams, and even across the organization as a whole. This way, documents can be better integrated with the organization’s digital processes and systems.

As part of these updates, Adobe has enhanced its mobile app with a new technology called Adobe Sensei, which uses artificial intelligence, machine learning and deep learning to make smarter predictions. The mobile app also makes it easy for users to convert any paper document to a digital version with just a smartphone scan.

On top of these changes, the digital signing process complies with standards such as eIDAS, which must be followed in the EU.

With such sweeping changes, Adobe is set to take the digital signature industry by storm. It also wants to tap into the growing demand for digital signatures, as workforces become increasingly mobile and prefer digital copies of documents.
