All entries by Lavanya

Edge Computing vs Cloud Computing

Cloud computing is now an established technology that is transforming the world in more ways than we can imagine. But technology and its reach are growing too, which raises the question: can cloud computing manage all that data?

Let’s take an example. The Internet of Things (IoT) is real and will soon become an integral part of our lives. It will ease our everyday operations and become a handy way to get more out of life. At the same time, IoT will generate large amounts of data, far more than we can imagine today. A report from Cisco states that cloud computing will grow four-fold before 2020, reaching a whopping 14.1ZB (zettabytes) of data. For perspective, 2015 saw a mere 3.9ZB.

This explosive growth brings up the question of whether cloud computing can handle the workload. Is the existing infrastructure capable of processing such vast amounts of data? Well, the simple answer is we don’t know. Based on some estimates it may be possible, but even then the infrastructure would be overworked.

To ease this pressure on cloud computing, we can use what’s called edge computing, a technology that moves computation to the edges of the network, distributing the workload among different nodes. In contrast, cloud computing collects and processes all the data in a centralized location. Obviously, moving computation to the edges can make processing faster and data more accessible, since much of the work happens near the data source.
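To make the idea concrete, here is a minimal illustrative sketch (not any specific product): an edge node reduces a stream of raw sensor readings to a small summary locally, so only a handful of numbers cross the network to the central cloud. The sensor values and the alert threshold are hypothetical.

```python
# Hypothetical temperature readings collected at an edge node.
readings = [21.3, 21.4, 21.2, 35.0, 21.3, 21.5]

def edge_summarize(batch, alert_threshold=30.0):
    """Runs at the network edge: reduces a raw batch to a compact summary."""
    return {
        "count": len(batch),
        "mean": sum(batch) / len(batch),
        # Forward only anomalous readings, not the whole stream.
        "alerts": [x for x in batch if x > alert_threshold],
    }

summary = edge_summarize(readings)
# Only `summary` is sent to the cloud, instead of every raw reading.
```

In a pure cloud model, all six readings would travel to the data center; in the edge model only the summary does, which is where the bandwidth and latency savings described above come from.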

These benefits have triggered the big question – can edge computing replace the cloud?

No, because cloud and edge computing are complementary technologies that can work in tandem. There’s no need for one to replace the other; rather, both centralized and edge processing can happen. This way, more data can be processed without putting excess pressure on the underlying infrastructure, and at the same time customers get superior service because the workload is distributed.

Additionally, end devices can be made smaller and cheaper because they don’t need much computing capability of their own. With edge computing, a fair amount of processing happens at the edge nodes and the rest is handled by centralized cloud infrastructure. Offloading processing from end devices could be a huge saving for customers, both in size and in cost.

Another benefit of using both the technologies is better management of applications and more resilience. With cloud computing, we’re relying heavily on the network to move our data to a central location and this is going to cause network congestion sooner or later. When we combine edge computing, we can obviously avoid this network congestion and improve the overall performance of our applications.

This integration of both edge and cloud computing opens new possibilities for a connected world because we’re no longer restrained by network and infrastructure limitations. Exciting times are ahead of us for sure!

The post Edge Computing vs Cloud Computing appeared first on Cloud News Daily.

Microsoft Wants to Add DNA Storage

How cool would it be if all our data were stored in DNA instead of on tapes and drives? Imagine the amount of data that could be stored if you used DNA instead of tapes and other relatively bulky forms of storage.

Well, that will soon be a reality, going by what Microsoft is planning. Computer architects and researchers in this company are formalizing the goal of having a DNA system for storage inside large data centers by the end of this decade. They are working towards having a working prototype for using DNA to store the ever-growing volume of data.

Researchers primarily want to use DNA to replace the tape drives currently used for archiving information. The obvious advantage of this form of storage is the space and money savings that accrue to a company. DNA can hold vastly more data than tape because it is minuscule by comparison.

Also, the cost per square foot of storage goes down because you can store petabytes of data in a small space, another advantage for companies.

Already, there’s much buzz surrounding the announcement made by Microsoft. Researchers, IT administrators and tech geeks are looking forward to seeing how the idea will shape up over the next few years.

It also reflects the commitment of companies like Microsoft to taking innovation to the next level. A wild, weird idea, saving valuable documents, photos, videos and more in the tiny molecules that genes are made of, is gaining traction. In fact, the company has allocated money and resources to further this abstract idea in the hope that something will come of it in the future.

That’s the spirit of innovation, and this spirit is what takes companies and ideas to new heights. History is filled with successful implementations of such ideas. When two Ph.D. students, Larry Page and Sergey Brin, were working on a search engine, little did they imagine that Google would one day become such a large corporation. The same goes for Bill Gates and millions of other entrepreneurs and scientists who have changed the way we live and communicate.

Considering these past examples, we can’t dismiss Microsoft’s idea as absurd or impractical. For all you know, it could be on the threshold of a great discovery that can set it apart from other companies in the tech world.

Right now, efforts to shrink memory are reaching physical limits. There’s only so far you can go in terms of size, so the next option is to look for something that is already small and capable of holding data. What better than DNA, which fits the description well, both in size and in storage capacity?

Exciting days are surely ahead for mankind.

The post Microsoft Wants to Add DNA Storage appeared first on Cloud News Daily.

Google Brings Startups Under its Fold

Startups are the drivers of future business. Regardless of their geographic location, startups tend to bring innovation and new products to society. But it’s definitely not an easy road, especially in terms of financial capital.

Almost every startup has budget constraints, so they’re forced to make certain cutbacks. A lot of these come in the areas of sales and marketing, and some of it in technology.

Google wants to change that. While it can’t provide much help in marketing and sales, it is definitely reaching out to companies to give them the solid technological platform needed to execute their ideas. In fact, one of Google’s strategies has been to bring many startups onto its platform.

Startups such as Planet Labs and Oden Technologies, operating in industries ranging from space satellites to climate change and smart cities, want to make use of this offer from Google. For example, Planet Labs is a startup that wants to image the earth every day to highlight global change. The company has switched to Google Cloud to host its images and process its data.

Planet Labs is not the only one. Hundreds of other startups are switching to Google Cloud because Google offers a plan that combines a ton of storage with computing, at affordable rates. These companies get the best computing power within their budgets, so why not tap into this technological powerhouse to further their own research and development?

In another interesting case study, Google Cloud partnered with the Indian Space Research Organisation (ISRO) to launch 88 Dove satellites into space. This is the largest satellite constellation ever to reach space, and the organization behind it uses Google Cloud.

A London-based startup called Improbable is also tapping into the computing power of Google Cloud. The company aims to provide the technologies that will lay the foundation for smart cities. It currently uses Google Cloud to simulate entire cities and to show lawmakers and the public the impact of every decision, from planning to garbage disposal.

One reason all these startups choose Google is the mutual understanding that the startups have to succeed for Google to get a strong foothold in this market, and vice versa. This mutual benefit forges strong partnerships that eventually augur well for everyone involved.

For Google too, this is a potent strategy that can bring rich rewards in the future. When these startups grow, they’ll continue to use Google Cloud platform for their needs. With more usage comes more revenue, so Google will eventually make its money too.

It is also in a position to empower local communities and maybe even help them bring to the world some life-changing products.

Besides, Google can establish a good rapport with these startups and maybe even get a boost to its brand image, all of which can lead to more customers and larger revenue.

In all, this is a good strategy that can help Google catch up with competition from Amazon Web Services (AWS), Microsoft and IBM.

The post Google Brings Startups Under its Fold appeared first on Cloud News Daily.

Microsoft Enters Africa

Africa is catching on with technology as many countries are adopting computers and the cloud in a big way. The widespread work of private organizations, philanthropists and nongovernmental organizations has helped fight poverty and illiteracy in many communities, and a new generation of educated people is turning to technology to improve their lives and those of others around them.

With such a trend, it’s only natural for major companies to make a beeline for the continent to tap into the new and growing opportunities it presents. Microsoft has announced that it will open two data centers there by next year to serve customers of its Azure cloud platform. These will be among the largest data centers on the continent, located in Johannesburg and Cape Town, both in South Africa.

The choice of location is a little disappointing from a technology adoption point of view. South Africa is the most developed country in the African continent and has one of the highest rates of literacy. Also, there’s much economic development happening there already. Microsoft could have chosen a developing economy to boost their presence and impact on the local markets.

In its defense, though, Microsoft would argue that this is the best choice from an economic standpoint. South Africa is a stable nation with a large educated workforce, so the data centers are more likely to be safe and secure there. In addition, cloud growth is happening faster in this country than in other parts of Africa.

According to the renowned research firm International Data Corp, cloud revenue in South Africa was a mere $243 million last year but is expected to grow at an annual rate of 20% through 2021. Considering these numbers, it’s only fair that Microsoft cater to this growing market before expanding to other regions of the continent.

Like Microsoft, Alphabet and Amazon have been working to capture this potential market, but neither has so far opened a dedicated operational center on the continent. In this sense, Microsoft has a lead over the other two, though we can expect both to follow suit soon.

Reports show that these three companies combined spent $31.54 billion in 2016 alone on capital expenses and leases, a whopping 22% more than they spent in 2015. Though not every penny went into cloud infrastructure or data centers, a substantial part of this investment went into their cloud computing lines of business.

This entry of Microsoft marks a new beginning for Africa and hopefully it can act as a catalyst for this continent to develop more rapidly. We can hope that these companies will soon move beyond lucrative countries like South Africa and spread their operations in poorer countries too, in order to give every community a chance at development.

Overall, this is a positive move that could transform Africa, like it did for many Asian countries that were grappling with many social and developmental issues.

The post Microsoft Enters Africa appeared first on Cloud News Daily.

Google’s Next Frontier – a New AI Chip

Artificial intelligence, AI for short, is fast becoming the preferred technology for all kinds of operations. This is not a surprising trend, because AI represents the next generation of machines, ones that can learn and carry out certain behaviors without constant human intervention. AI is becoming a reality sooner than we may have guessed, and Google wants to lead the way.

During the company’s annual conference, its CEO, Sundar Pichai, announced that the company will release a new computer chip that can perform advanced machine learning and AI tasks. Called the Cloud Tensor Processing Unit, the chip is named after Google’s open-source machine learning framework, TensorFlow.

Machine learning, deep learning, supercomputers and AI technologies have been transforming tech companies and their clients over the last few years, and this announcement from Google makes the trend official. In many ways, AI is transforming Google’s own operations too, so it is only natural that the company wants to set the trend and capture the AI market while it is still in its nascent stages.

So, what’s this new chip all about? What’s new in it? Well, lots!

This chip is being dubbed the first that works at blistering speed not only in execution but also in its ability to learn. In other words, the chip can be trained incredibly fast, a vital difference that can set it apart from others in its category.

Let’s take a simple example. You want a machine to distinguish a pizza from other foods such as hot dogs, salads and burgers. To do this, you’ll have to feed in hundreds of images of pizzas along with other foods to help the system learn. The sheer number of calculations needed to train such a model is mind-boggling, so it can take days or even weeks for a system to learn to identify a pizza.

Google wants to simplify this process so that learning time shrinks greatly. To do this, the company plans to build machine learning supercomputers called Cloud TPU pods: many Cloud TPUs wired together with a high-speed data connection. Training is split across the different TPUs and happens in parallel, leading to a much shorter learning cycle.
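The splitting idea described above is data parallelism. A minimal NumPy sketch, standing in for real TPU hardware: the training batch is sharded across four simulated devices, each computes its own gradient, and averaging the shard gradients reproduces the full-batch gradient, so the work can genuinely run in parallel. The model, data and shard count here are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))   # one batch of inputs
y = rng.normal(size=64)        # targets
w = np.zeros(3)                # linear-model weights

def grad(w, X, y):
    """Gradient of the mean squared error 0.5*mean((Xw - y)^2) w.r.t. w."""
    return X.T @ (X @ w - y) / len(y)

# Shard the batch across 4 simulated devices and average their gradients,
# as a pod would do over its high-speed interconnect.
shards = zip(np.array_split(X, 4), np.array_split(y, 4))
parallel_grad = np.mean([grad(w, Xs, ys) for Xs, ys in shards], axis=0)

# With equal-sized shards, the averaged gradient matches the
# single-device full-batch gradient exactly.
assert np.allclose(parallel_grad, grad(w, X, y))
```

Since each device only touches a quarter of the batch, the per-step compute drops roughly in proportion to the number of devices, which is the source of the shorter training times claimed for TPU pods.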

According to Pichai, Google wants to create thousands of TPUs and make them available over the Internet so that it can benefit researchers and developers in a big way. To start with, Google announced that it will share 1,000 TPUs with artificial intelligence researchers to help them in their studies.

In addition to this AI chip, Google plans to create algorithms that fine-tune other machine learning algorithms, and it is developing AI tools for advanced studies such as genome analysis, molecular discovery and more.

In all, this is a great initiative from Google and one that could potentially transform our society in a big way. In return, Google can emerge as the leader of the AI chip market, right from its very beginning.

The post Google’s Next Frontier – a New AI Chip appeared first on Cloud News Daily.

Supercomputing as a Service is Available Now

We’ve been using Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) for a few years now. We added something called Identity as a Service (IDaaS) a few years back, and now it’s time for Supercomputing as a Service.

Cray, a supercomputer vendor, has made its high-performance computers, technologies and servers available to users through the cloud. It recently partnered with a company called Markley to offer supercomputing technologies as a hosted service.

Under the terms of agreement, both these companies will come together to offer supercomputing to clients in the life sciences industry, to start with. Later, they’ll expand to other verticals and have even hinted that they’ll look at developing some industry-specific software and solutions in the future.

Overall, this is a good move that augurs well not only for these two companies but for the technology industry as a whole. Currently, not every organization runs supercomputers, simply because they take a certain amount of time, money and resources, and many organizations may not be up to it. They may prefer to focus on their core operations, which is understandable. At the same time, they have been missing out on the power that comes with supercomputers.

Now, with supercomputing offered as a service, it should be easier than ever to use it to improve your business, without spending considerable time and effort getting supercomputers ready. In this sense, the announcement is a big leap in technology adoption and will hopefully transform many businesses in the future.

From Cray and Markley’s perspective too, this is a good move, as it can create a niche area for both companies. Markley is already well known for its multi-tenant, mission-critical infrastructure and data center facility, so offering supercomputers as a service will be another feather in its cap. In fact, this Boston-based company has never experienced a primary power outage in its 15 years of operation, a remarkable feat in itself. Add supercomputers to the list, and it only gets better.

With this offering, Cray plans to expand its reach into the commercial and enterprise data spaces as well. This direction was announced by its CEO, Peter Ungaro, during the most recent quarterly report earlier this month. In that announcement, Ungaro acknowledged that the high-performance computing (HPC) market is slowing down and that it’s time for the company to look into adjacent markets.

While the HPC market may rebound soon, it still makes sense for the company to expand into commercial areas, as these could turn out to be a more lucrative and lasting market for Cray.

In all, supercomputing as a service is a fantastic piece of news that has already started sending exciting ripples through the world of technology. Let’s see how it shapes up over the next few months.

The post Supercomputing as a Service is Available Now appeared first on Cloud News Daily.

How is Snap Faring on its Cloud Agreements?

Snap is a popular social media network that’s caught on in a big way, especially among teenagers. It’s estimated that more young people prefer to use Snap when compared to other social media platforms.

That’s not its only distinction. It’s also one of the few social media companies that doesn’t spend tens of millions of dollars storing all its content on its own servers. To avoid that capital-intensive route, Snap taps into the infrastructure of companies that specialize in the cloud; in other words, it uses the cloud to store all its data and to gain the many advantages that come with it.

In fact, Snap’s biggest expenditure is the cost it pays to cloud companies for hosting its content.

Over the last year, Snap entered into agreements with both Alphabet Inc and Amazon Web Services, though the platform was originally built on the Google Cloud platform. The agreements signed with these companies offer good discounts to Snap, which has committed to spend almost $2 billion on Google Cloud and another $1 billion with Amazon.

If you look at the two agreements, the contract with Google is fairly straightforward: Snap will spend $400 million per year for the next five years, for a total of $2 billion, and in any given year it can defer only 15% of that $400 million to the following year. The Amazon deal is a little more complex: spending starts at $50 million this year and slowly increases to $350 million by 2021.
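A quick back-of-the-envelope check of these commitments, in code. Note one assumption: the article gives only the Amazon deal’s endpoints ($50M in 2017, $350M in 2021), so the straight-line ramp between them is a guess on our part, not something the contract states.

```python
# Google deal: $400M per year for five years (figures in $M).
google_total = 400 * 5
assert google_total == 2_000  # matches the $2B commitment

# Amazon deal: ramps from $50M (2017) to $350M (2021).
# ASSUMPTION: a straight-line ramp; only the endpoints are public.
amazon = [50 + 75 * i for i in range(5)]  # 50, 125, 200, 275, 350
assert sum(amazon) == 1_000   # a linear ramp sums to the $1B commitment
```

It’s a tidy coincidence that the linear ramp lands exactly on the reported $1 billion total, which makes the assumption at least plausible.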

With this deal in place, how is Snap faring?

In 2017, Snap will spend $390 million on the cloud, including the 15% deferred allowance from Google. Snap already spent $99 million on hosting in the first quarter, almost $14 million less than it spent in the fourth quarter of the previous year.

If you’re wondering why the drop, it all comes down to user engagement. During the first quarter, Snap added eight million daily users, and each user is believed to have spent an average of 30 minutes on the app every day. Even as more users spend more time on Snap, its hosting costs went down, and this is how Snap was able to cut its bill by $14 million. These numbers are only expected to improve, meaning lower costs relative to its growing user base.

In addition, Snap plans to move some of its tasks from Google Cloud to Amazon in a bid to save money. Its agreement with Amazon is more robust and flexible than its Google deal, which explains the shift.

All this means Snap is doing really well, both on its commitments as well as in its business operations. But the big question is if this pattern is sustainable. If user growth slows or user engagement falls, then Snap is in trouble as it has committed to almost $3 billion.

In all, Snap’s deals are working great for now; let’s hope they don’t come back to bite it later.

The post How is Snap Faring on its Cloud Agreements? appeared first on Cloud News Daily.

Do we Still Need Enterprise-Owned Data Centers?

With cloud technology booming and more companies moving their data and applications to the cloud, this is the right time to analyze if we still need enterprise-owned data centers.

Well, the answer is a surprising yes, according to research conducted by the Uptime Institute.

The study asked 1,000 IT professionals around the world about their companies’ preferred way to store data. Two-thirds of the respondents, almost 68 percent, said they deploy their IT assets in enterprise-owned data centers. Of the rest, only 13 percent deploy in the cloud, while 22 percent use multi-tenant data center providers.

The study is surprising in many ways, because the general belief was that companies were moving their assets to the cloud and leaving data centers behind. The findings suggest otherwise: only 13 percent of companies park their data in the cloud.

In fact, a significant finding is that the data center has remained the central component around which developments such as improved processor performance, the expansion of server virtualization and the adoption of cloud computing have happened.

This obviously raises the question: why?

Well, there’s no single answer to this question. First off, security continues to be a major concern for many companies, as management is unsure whether critical resources should be moved to the cloud at all. Some analysts point out that companies prefer to keep their assets close and to exercise a high degree of control over usage and access.

Though that’s a common fear, cloud security has come a long way from its nascent stages. The number of attacks on cloud networks has dropped drastically, making data more resilient and safer than before. However, the myths and misconceptions surrounding cloud security remain, so it’s up to cloud companies to dispel these fears by providing the right tools and information to potential customers.

The second aspect that could act as a deterrent to moving to the cloud is the presence of legacy systems. The architecture and IT systems of many companies continue to be based on n-tier and legacy architectures that are not conducive to the cloud. Moving such applications can bring more harm than good, so it’s wise to continue to use data centers for them.

By doing so, though, companies miss out on the many benefits of the cloud. The best approach in this case is to adopt the cloud when it’s time to decommission the existing legacy systems. Alternatively, they can slowly start building applications that work well in the cloud, because that, after all, is the future.

In this process, cloud companies should also take an active role in developing tools and techniques that help companies migrate data from their legacy applications to the cloud.

Overall, data centers continue to dominate when it comes to IT. This should change and for that, a concerted effort is needed from all parties involved so everyone can reap the many benefits that come from cloud.

The post Do we Still Need Enterprise-Owned Data Centers? appeared first on Cloud News Daily.

President Trump Signs An Order for Cloud Cybersecurity

President Trump signed an executive order to put an end to the federal government’s cybersecurity woes.

The order mandates a single set of rules for all departments and makes each department head accountable for that agency’s security. The obvious idea behind this accountability is to stop department heads from passing the blame to the IT department every time there’s a breach; in the past, agency heads were caught passing the buck without taking any steps to correct the problems.

To put an end to this blame game, the order was drafted quite some time ago and was supposed to have been signed on January 31 of this year, but it was postponed without explanation. It has finally been signed and comes into force right away.

An unexpected surprise that came to light in this order was an initiative to move as much of the government’s cyber-defense programs to the cloud. This means, the government wants departments to move their data and applications to the cloud to leverage its many benefits.

This is a significant and sensible move considering that there are roughly 190 departments today. If each department tries to develop its own defenses, the result will simply be duplicated effort and wasted resources. A better way is to share services and resources, so that efforts are more streamlined and departments follow similar standards.

The Trump administration believes that, historically, a lot of time and effort went into creating and protecting legacy IT systems in federal departments, and that this spending did not yield results. To top it off, there are a number of security vulnerabilities, as is evident from the many hacking incidents of the last few years.

This is critical considering that U.S. systems store a lot of sensitive information about residents, such as Social Security numbers and dates of birth, which can cause a lot of problems when it falls into the wrong hands. Many criminals already prowl cyberspace for such sensitive information, which they can sell on the black market for substantial sums.

When government systems don’t follow security best practices and run on legacy systems, they make it that much easier for these criminals to get the information they want. So, the move to the cloud could enforce better security standards and, hopefully, keep the criminals at bay.

Besides these important strategies, the order signed by President Trump also gives directives to review the general vulnerabilities of U.S. government systems, to zero in on the main adversaries of U.S. cybersecurity, and to train the next generation of cybersecurity professionals so they can handle the country’s future cyber needs.

In all, this is an excellent and much-awaited move, needed to beef up U.S. cybersecurity and to protect the assets of its citizens.

The post President Trump Signs An Order for Cloud Cybersecurity appeared first on Cloud News Daily.

Is Middle East Moving to the Cloud?

When we think of the cloud, we often think of advanced countries like the U.S., Canada, Australia, Germany and the Scandinavian nations, or of emerging economies like India and China. We rarely associate cloud technology with the Middle East, and that’s probably what needs to change.

A report by Gartner shows that companies in the Middle East are set to spend more than two billion dollars over the next three years moving their data and applications to the cloud. This represents 22 percent growth from the $956 million recorded in 2016.

So, what’s driving the cloud here?

The report further states that platform as a service (PaaS) is recording the highest growth rate at 28.8 percent, closely followed by software as a service (SaaS) at 28.5 percent. Growth in both areas indicates that companies are looking to migrate their applications and workloads from on-premises data centers to the cloud.

As in the rest of the world, companies operating in the Middle East understand the benefits of the cloud and want to make the most of it. Contrary to popular opinion, the Middle East is no longer only about oil: falling oil prices and the growing demand for alternative sources of energy have forced oil-producing countries like Saudi Arabia to develop other industries.

Dubai and Abu Dhabi are already leading the way as two of the best cities in the world for travel and living. Many companies have a presence in these cities, thanks to their advanced infrastructure and friendly corporate policies. Other cities, such as Cairo, are looking to follow suit, and it won’t be long before these countries become attractive destinations for companies of all sizes and sectors.

With such a trend, it’s no surprise that the cloud will boom here too, as many companies will depend on it one way or another. Keeping this in mind, companies like Alibaba have already started setting up cloud data centers in the region.

This is a smart move considering that data center traffic is projected to reach 366 exabytes per year, up from 68 exabytes in 2013. Such explosive growth needs a ton of facilities, and this is exactly what the major cloud companies are vying to set up.

In addition to data center traffic, consumer adoption of cloud storage is also expected to grow astronomically: it is expected to represent 61 percent of total cloud transactions in 2018, up from 26 percent in 2013, according to a report by Cisco. Cisco also predicts that the region will rank second in the world for growth, just behind Asia Pacific.

All these numbers and trends point yet again to the growing might of cloud and its overarching reach to almost every part of the world. Let’s hope that soon African and Asian countries also join the bandwagon, so every individual in every part of the world can leverage the benefits of cloud technology.

The post Is Middle East Moving to the Cloud? appeared first on Cloud News Daily.