Category Archives: Amazon AWS

FINRA Leads the Way in Cloud Adoption

If you thought cloud storage was only for private companies, you couldn’t be further from the truth. In fact, many government agencies, including data-sensitive organizations like the US Army and the FBI, are moving their operations to the cloud, mainly because of the many benefits that come with it. Some financial organizations have also started looking into the cloud, thanks to the Financial Industry Regulatory Authority (FINRA)’s move to adopt the cloud in a big way.

At Amazon’s recently concluded re:Invent conference, Steve Randich, the executive VP of FINRA, argued that cybersecurity is better in the cloud than in private data centers, which is why it makes sense for financial companies to move their sensitive and confidential data there. This stance has helped allay the fears of many financial institutions that see data security as one of the biggest problems with cloud storage. A few years back, cloud security was in its nascent stages, so these fears made sense. Today, however, cloud security has improved by leaps and bounds, and these advancements make the cloud one of the safest places to park your data.

These reasons are exactly why FINRA chose Amazon Web Services (AWS) as its service provider. Randich trusts the cloud so much that the organization moved its primary, mission-critical applications to AWS right away; there was no pilot phase of moving smaller applications first. FINRA’s surveillance application alone processes around 75 billion events a day, more than credit card companies like Visa and MasterCard process over a period of six months. Such is the magnitude of the data handled by FINRA, and it’s heartening to see that all of it is now done in the cloud!

In addition, FINRA has to search through this data and run advanced queries to identify insider trading or other wrongful trading, and AWS makes this possible with advanced query tools behind a simple interface. As a result, the search process is greatly simplified while the results remain accurate – something that is a must for FINRA’s operations.

This successful adoption of the cloud by FINRA has boosted the confidence of other financial providers too. Randich says he gets queries from many financial organizations about cloud adoption, and he is confident that more companies will take to the cloud in the near future. CapitalOne has already partnered with AWS to move its applications to the cloud, and it won’t be long before other financial companies follow suit.

Such a move augurs well not just for AWS, but for the cloud industry as a whole. Traditionally, financial companies have been the slowest to adopt the cloud, and it looks like the cloud has now broken this final frontier too. Given this scenario, it’s no surprise that many cloud providers, including Google and Microsoft, are creating new products and entering into partnerships with other service providers to offer a top-of-the-line cloud experience for their customers.

The post FINRA Leads the Way in Cloud Adoption appeared first on Cloud News Daily.

A Look into the Announcements at AWS’ re:Invent

It’s become the trend for every major company to make significant announcements at its annual conference, and AWS is no different. Its annual conference, re:Invent, took place this year from November 27 to December 1 in Las Vegas. Here’s a look at some of the major announcements made by the company over these five days.

Athena

Athena is a serverless query service from Amazon that works directly against its Simple Storage Service, popularly known as S3. This is an important tool because it lets users run standard SQL queries over data stored in S3 without loading it anywhere else. The service costs $5 per TB of data scanned by each query.
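
To make that concrete, here is a rough sketch (not taken from the announcement itself) of what running an Athena query from Python with the boto3 library looks like; the database, table, and output bucket names are placeholders.

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Start a standard SQL query against data already sitting in S3.
# "sales_db", "orders", and the results bucket are placeholder names.
response = athena.start_query_execution(
    QueryString="SELECT region, SUM(amount) AS total FROM orders GROUP BY region",
    QueryExecutionContext={"Database": "sales_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/queries/"},
)

print("Query started:", response["QueryExecutionId"])
# Billing is based on data scanned: at $5 per TB, a query that scans
# about 200 GB costs roughly $1.
```

Once the query completes, the results land in the S3 output location and can be fetched with the client’s get_query_results call.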

Aurora

Aurora is another in-house service from Amazon. This cloud-based relational database is already compatible with the popular open-source MySQL database. Soon, PostgreSQL will also be supported by Aurora, according to the announcement made at re:Invent – an announcement that drew much applause from attendees.
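
Because Aurora speaks the MySQL wire protocol, any standard MySQL client library can talk to it unchanged; the sketch below uses the PyMySQL driver with a hypothetical cluster endpoint and credentials.

```python
import pymysql

# The endpoint, user, and password are placeholders; an Aurora
# MySQL-compatible cluster accepts any ordinary MySQL client.
conn = pymysql.connect(
    host="my-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",
    user="admin",
    password="example-password",
    database="appdb",
    connect_timeout=5,
)

with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")  # reports a MySQL-compatible version string
    print(cur.fetchone())

conn.close()
```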

AWS Batch

This new service coordinates all the resources needed to run batch jobs on Amazon’s cloud. It is ideal for institutions or projects that want to run large-scale computing workloads that would otherwise entail setting up many virtual machines. With this tool, organizations won’t have to worry about that setup, as it is handled by Batch.
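
As a hedged illustration of how little infrastructure work is left to the user, the boto3 sketch below submits a job to a queue and job definition that are assumed to already exist; every name in it is a placeholder.

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")

# The queue and job definition are assumed to have been created
# beforehand; Batch provisions and tears down the compute itself.
response = batch.submit_job(
    jobName="nightly-simulation-042",
    jobQueue="high-memory-queue",
    jobDefinition="simulation-job:3",
    containerOverrides={
        "command": ["python", "run_simulation.py", "--trials", "10000"],
        "vcpus": 4,
        "memory": 16384,
    },
)

print("Submitted job:", response["jobId"])
```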

CodeBuild

CodeBuild, as the name suggests, can automatically build and test code. This is particularly useful for developers who are iteratively building apps on AWS, as it helps ensure that changes behave as the developers intended. As you may have guessed, it is a new addition to the family of existing development tools, namely CodeCommit, CodeDeploy, and CodePipeline.
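
A build is normally described by a buildspec file checked into the repository; triggering one programmatically might look like the boto3 sketch below, where the project name is hypothetical and is assumed to already point at a source repository.

```python
import time

import boto3

codebuild = boto3.client("codebuild", region_name="us-east-1")

# "web-app-build" is a placeholder project; its buildspec defines
# the actual compile and test commands.
build = codebuild.start_build(projectName="web-app-build")
build_id = build["build"]["id"]

# Poll until the build finishes, then report its status.
while True:
    status = codebuild.batch_get_builds(ids=[build_id])["builds"][0]["buildStatus"]
    if status != "IN_PROGRESS":
        break
    time.sleep(10)

print("Build", build_id, "finished with status", status)
```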

GPUs

Amazon introduced Elastic GPUs (graphics processing units), which can be attached to any EC2 instance when needed. This is sure to give gaming enthusiasts and CAD customers much to cheer about. Currently, the product is in private preview, and more details about its pricing and availability will be disclosed over the next few months.
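
Since the product was still in private preview, exact details weren’t public, but the general pattern is to request an elastic GPU alongside a normal EC2 launch; the boto3 sketch below is an assumption of that shape, and the AMI, key pair, and GPU size are all placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# All identifiers below are placeholders; GPU sizes and pricing were
# not yet announced while the feature was in private preview.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="m4.xlarge",
    KeyName="my-key-pair",
    MinCount=1,
    MaxCount=1,
    ElasticGpuSpecification=[{"Type": "eg1.medium"}],
)

print("Launched:", response["Instances"][0]["InstanceId"])
```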

Greengrass

Amazon gave a big fillip to its IoT arm as well with the announcement of AWS Greengrass. This product provides the software components needed to run local compute and caching on connected devices. It can be particularly useful for IoT devices that have only an intermittent Internet connection. In addition, it helps devices upload their data to the cloud once a connection becomes available.
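
Under the hood, Greengrass runs Lambda functions locally on the device. A minimal sketch of such a function is below; the MQTT topic is a placeholder, and the greengrasssdk import assumes the Greengrass Core SDK is packaged alongside the function.

```python
import json

import greengrasssdk

# Client for publishing messages from the device toward AWS IoT.
iot_client = greengrasssdk.client("iot-data")

def handler(event, context):
    # Process the reading locally, even while the Internet is down.
    reading = {"device": event.get("device_id"), "temp_c": event.get("temperature")}

    # Publish toward the cloud; Greengrass queues the message and
    # forwards it once connectivity is available again.
    iot_client.publish(topic="sensors/temperature", payload=json.dumps(reading))
    return {"status": "processed"}
```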

Glue

AWS Glue is a managed data-preparation service that supports advanced analytics on information stored in Amazon’s cloud. Primarily, it identifies where a given piece of data lives, extracts it, converts it into a specified format, prepares it for analysis, and moves it to the platform where it will be analyzed.
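
Glue was only previewed at re:Invent, so the sketch below does not reflect anything shown at the event; it is a rough, assumed picture of the workflow using the boto3 glue client as the service later shipped it, with placeholder crawler and job names.

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# A crawler (created beforehand) scans the raw data in S3 and records
# its location and schema in the Glue Data Catalog.
glue.start_crawler(Name="raw-orders-crawler")

# A Glue job (also created beforehand) then extracts that data,
# converts it to an analysis-friendly format, and writes it out.
run = glue.start_job_run(
    JobName="orders-to-parquet",
    Arguments={"--output_path": "s3://analytics-bucket/orders/"},
)

print("Started ETL run:", run["JobRunId"])
```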

Lex

Lex is a machine learning product that allows programmers to create interactive, conversational applications. This technology is, in fact, at the heart of Amazon’s Alexa platform, and it is now available separately for developers building applications on the AWS cloud.
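
A hedged sketch of calling a Lex bot from Python is below; the bot name, alias, and sample utterance are placeholders for a bot built in the Lex console.

```python
import boto3

lex = boto3.client("lex-runtime", region_name="us-east-1")

# "OrderFlowers" and "prod" are a placeholder bot name and alias.
response = lex.post_text(
    botName="OrderFlowers",
    botAlias="prod",
    userId="demo-user-1",
    inputText="I would like to order a dozen roses",
)

# Lex returns the recognized intent, any extracted slot values, and
# the bot's next conversational prompt.
print(response.get("intentName"), response.get("slots"), response.get("message"))
```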

These announcements mark the beginning of exciting times for both AWS and its customers.

The post A Look into the Announcements at AWS’ re:Invent appeared first on Cloud News Daily.

CapitalOne Teams With AWS for its Tech Transition

Technology is already ubiquitous, and it is only going to become more integral in the future. Almost every company sees this trend, which is why many are taking major steps to embrace it. One such company that wants to infuse technology into its operations in a big way, with an aim to meet the growing demands of its digital customers, is CapitalOne. This financial services provider believes the cloud is an important technology it needs to adopt in order to move forward, grow, and continue to reach more customers. To achieve this long-term goal, it has partnered with the leader in cloud, Amazon Web Services (AWS).


Under the terms of the agreement, AWS will be the major cloud partner for CapitalOne. Though the company already uses providers like Google and Salesforce for some of its smaller applications, it has announced that AWS will handle all of its legacy migrations.

Some researchers believe this could be a disadvantage for CapitalOne, because it will not have the flexibility to switch between providers to tap into their pricing or features. Perhaps CapitalOne anticipated this, which is why it has announced that AWS will be its major partner, not an exclusive one. Still, much of its migration to the cloud is going to be handled by AWS.

This decision to move to the cloud comes as a surprise because financial companies are generally slower to embrace new technology, partly because of the security concerns that come with it. CapitalOne, though, wants to change this trend. It wants to cater to its growing base of tech-savvy customers and to bring out new tech-based products and services that will impress them.

CapitalOne’s move to the cloud began in 2013, when it hired people to develop and test cloud-based applications in its innovation lab. The many experiments necessitated the use of the cloud, which led the company to tap providers on a small scale. By 2015, it became clear that the company had to make a bigger foray into the cloud to keep up with the rapidly developing projects in its lab. In addition, the company understood that the cloud offers many benefits in terms of scalability, flexibility, and a better user experience for its customers. Because of these factors, the company has taken the big step of partnering with AWS to move all its applications to the cloud. CapitalOne also has many mainframe applications, and these too are likely to be moved to the public cloud soon. Though no timeframe has been mentioned by either company, the transition is expected to begin shortly.

Besides migrating its existing applications, CapitalOne also plans to develop new products, especially for the mobile platform. Its mobile app is currently one of its most used customer-facing applications, and it was moved to the AWS cloud last month on a trial basis. The success of this transition has prompted the company to move all its applications to the cloud.

Overall, this is a strategic move by CapitalOne: it plans to use technology to get ahead of its competitors and to live up to its customers’ expectations.

The post CapitalOne Teams With AWS for its Tech Transition appeared first on Cloud News Daily.

What is Snowball Edge?

Snowball Edge is a hardware appliance released by Amazon Web Services that combines computing and storage power to help customers run tasks and store data locally. The product was announced at re:Invent, AWS’ annual event, which took place in Las Vegas last week.

Snowball Edge is an extension of an existing product called Snowball – a petabyte-scale data transport appliance that helps transfer large amounts of data to and from the AWS cloud over a secure connection. The main challenges in moving large datasets today are high network costs and security, and Snowball was created to address both. When you create a Snowball job with AWS, the device is automatically shipped to you. Once it arrives, you simply attach it to your network and run the client to establish a secure connection. Then you’re all set to transfer large amounts of data at roughly one-fifth of the usual network cost.

Now that you know what Snowball is, it’s easy to understand what its extension does. Snowball Edge expands the scope of Snowball with more connectivity options and higher storage. It improves horizontal scalability through clustering and provides new storage endpoints for S3 and NFS clients. Its Lambda-powered local processing also makes it a handy tool for heavy computational tasks. The device can store about 100 TB of data, an upgrade from Snowball’s maximum capacity of 80 TB.
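
Because the appliance exposes an S3-compatible endpoint on the local network, ordinary S3 tooling can write straight to it. The boto3 sketch below is only illustrative: the IP address, port, bucket, and credentials are placeholders for values obtained when the device is unlocked with the Snowball client.

```python
import boto3

# Endpoint and credentials are placeholders; in practice they come
# from unlocking the appliance with the Snowball client.
snowball_s3 = boto3.client(
    "s3",
    endpoint_url="http://192.168.1.50:8080",
    aws_access_key_id="LOCAL-ACCESS-KEY",
    aws_secret_access_key="LOCAL-SECRET-KEY",
)

# Copy data onto the appliance exactly as you would to S3 in the cloud;
# AWS imports it into the real bucket once the device is shipped back.
snowball_s3.upload_file("survey_results.csv", "my-import-bucket", "raw/survey_results.csv")

for obj in snowball_s3.list_objects_v2(Bucket="my-import-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])
```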

In addition, Snowball Edge comes with a rugged design that can withstand plenty of wear and tear. It can be used in industrial, agricultural, and military environments, which widens the scope of its usage. The hardy design also helps with rack mounting, especially when you want to make use of the product’s clustering feature.

In terms of connectivity, Edge offers many options. Data can be transferred to it over cellular or Wi-Fi connections from IoT-enabled devices, and there is a PCIe expansion port for additional data transfer. You can also use a wide range of wired network options, such as 10GBase-T, 10/25 Gb SFP28, and 40 Gb QSFP+. With such options, you can transfer about 100 TB of data in just 19 hours, whereas on other devices the same process can take a week.
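
As a rough sanity check on that figure (ignoring protocol overhead and assuming a single sustained link), moving 100 TB in 19 hours implies an average of roughly 12 Gbps, which is plausible for the 10–40 Gb interfaces listed above.

```python
# Back-of-envelope: what sustained rate does "100 TB in 19 hours" imply?
terabytes = 100
hours = 19

bits = terabytes * 1e12 * 8        # 100 TB expressed in bits
seconds = hours * 3600
avg_gbps = bits / seconds / 1e9

print(f"Average throughput: {avg_gbps:.1f} Gbps")  # ~11.7 Gbps
```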

Further, you can use clustering to gain the benefits of horizontal scaling. For example, you can configure two or more Edge appliances into a cluster, so the entire setup can have greater durability and higher capacity. Such an option is sure to make it more convenient for you to store and handle large amounts of data.  Also, you can remove devices when you want to shrink the storage and computational power of your system. This scalability and flexibility is truly where Snowball Edge scores over other devices.

In all, this is yet another innovative product from AWS, one that is sure to bring computation and cloud storage a lot closer to its users.

The post What is Snowball Edge? appeared first on Cloud News Daily.

Google Acquires Qwiklabs

Google has scored over its arch-rival Amazon by acquiring Qwiklabs, a company that provides hands-on training for AWS developers. The terms of the deal were not disclosed.

Qwiklabs was founded in 2012 to focus on teaching developers to create and run applications on the AWS platform. Though the original idea was to create a learning tool for cloud-based platforms, the focus quickly turned to AWS because of its dominance in the cloud market. AWS also started using Qwiklabs as its go-to service for providing self-paced labs for different developers.

All that is set to change with this acquisition. Clearly, Google acquired Qwiklabs with the intent of taking on its competition head-on. Since the cloud platforms of Google and AWS are fundamentally different, they require different development approaches. Given this difference, it won’t be a surprise if Qwiklabs transforms from an AWS-focused service into a Google-focused one, where Google’s own cloud tools and services are showcased to potential customers. It remains to be seen whether Google will still allow AWS-based courses on Qwiklabs. Some experts, though, think Qwiklabs will continue to offer AWS courses, as Google’s cloud head, Diane Greene, is a supporter of multi-cloud deployments.

At this point, Google has announced no major changes to the operations of Qwiklabs, which means Qwiklabs will continue to offer subscriptions and labs for AWS developers. However, we may not see new AWS courses on the platform, and there is no news yet on when Google’s own programs will be added. Google is expected to use Qwiklabs to help people better understand the Google Cloud Platform and the G Suite productivity services, so that more apps can be built on them.

This acquisition comes as a surprise for several reasons. Firstly, it’s not clear why AWS did not acquire the company itself, considering that Qwiklabs sold only AWS-based courses; in fact, Qwiklabs says that more than half a million developers have used its platform and have spent over five million hours learning about AWS. Secondly, it reflects the multi-pronged strategy Google has pursued over the last few months to get a firm hold on the cloud market. Thirdly, the acquisition can act as the perfect springboard for Google to reach more customers.

As for AWS, it has to find a new education partner to fill the gap left by Qwiklabs. In September, AWS announced that it would give its Enterprise Support customers free Qwiklabs credits – something that may now have to change.

This acquisition shows how quickly the ecosystem of supporting tools can shift. In the past, one company had complete control over all the tools and services related to its products, so there was greater certainty. Today, the market is far more fragmented, and this can be attributed to the nature of both the technology and the market itself. Though AWS and Google offer the platforms, a host of third-party services and tools are needed to make the most of them, so acquisitions and mergers can significantly alter market share.

It will be interesting to see the impact of this acquisition on both Google and AWS.

The post Google Acquires Qwiklabs appeared first on Cloud News Daily.

Does the FBI need cloud?

Cloud has become a ubiquitous term today, and it’s not restricted to the economic side of our lives. Rather, it encompasses all areas of our society, including policing and vigilance. After all, organizations in charge of security can also leverage the power of the cloud to protect their own digital assets and monitor other critical aspects of national security. So it’s no surprise that the Federal Bureau of Investigation (FBI) uses the cloud extensively for its operations.

At the 2016 Structure Conference, the FBI explained how it uses the cloud to manage security. One of the main challenges the FBI faces now is information leaks. The post-Snowden era and the prevalence of sites like WikiLeaks have made it that much more difficult for the organization to manage its security. It has to lock down confidential information while, at the same time, making some information available to other law enforcement agencies to help prevent terrorist attacks.

In addition, the FBI has to protect itself from insider attacks. Like any other business, it has to protect its data, intellectual property, and other assets from being stolen by its own employees. There have been many cases of spying and espionage that have proved costly for the FBI. To prevent insider attacks, the FBI has to stay on top of its data at all times, with a clear understanding of the possible ways it could be compromised.

Protecting itself from both internal and external threats is quite a challenge for the FBI, which is why it has to choose tools that address both sides of the coin. In this sense, the FBI has the same requirements as the private sector, just at a different level. There are other unique aspects to the FBI’s security as well.

Firstly, it doesn’t allow employees to bring their own devices, as that would be too much to monitor. With no BYOD, that’s one less task for the IT team. Secondly, the FBI’s website is not a mission-critical asset, unlike those of private companies, because the site simply provides information to the public; it therefore does not need the highest level of protection. Thirdly, availability is a top priority, as the organization has to be reachable by local and national law enforcement 24/7. Fourthly, risk and loss are not measured in money. Rather, a breach can affect national security or lead to catastrophic events like the 9/11 attacks. In this sense, data breaches can be extremely costly for the country as a whole.

With such unique considerations, there’s no doubt that the FBI needs a customized cloud offering. It turned to the market leader, Amazon, and this has resulted in the creation of AWS GovCloud. Many of the FBI’s security concerns and requirements are being addressed by GovCloud, and the organization plans to move its legacy systems to the cloud too.

Once again, GovCloud reflects the fact that cloud is a central part of our lives, regardless of the sector or organization in which it is used.

The post Does the FBI need cloud? appeared first on Cloud News Daily.

Powering Cloud with Renewable Energy

A common criticism of cloud technology is its energy consumption. The many data centers that make cloud storage seamless and convenient are also major consumers of power. According to the Natural Resources Defense Council, US data centers consumed a whopping 91 billion kilowatt-hours of electricity in 2013, roughly equal to the annual output of 34 large coal-fired power plants. At this rate of consumption, data centers are estimated to need 140 billion kilowatt-hours of electricity by 2020.

Sounds unsustainable? You’re absolutely right!

The amount of coal needed to power those plants, coupled with the resulting emissions, would only add to climate change and environmental degradation.

To lower pollution levels, many top cloud providers are beginning to look at using renewable energy to power their data centers. Wind and solar are expected to be the biggest sources of renewable power for cloud technology within the next few years.

Recently, Amazon Web Services (AWS) announced that it would build a 189-megawatt wind farm in Hardin County, Ohio. This wind farm alone is expected to generate 530,000 megawatt-hours a year, beginning at the end of December 2017. It will be AWS’ fifth renewable energy farm, with three already operational and the fourth expected to come online in May 2017. The existing three wind farms are located in North Carolina, Virginia, and Indiana. With these farms, about 40 percent of AWS’ operations are already powered by renewable energy, and the company plans to increase that to 50 percent by the end of 2017. Eventually, AWS plans to source 100 percent of its energy from renewables.
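
Those two figures imply a capacity factor of roughly 32 percent, which is typical for onshore wind; a quick back-of-envelope check, using the standard definition of capacity factor, is below.

```python
# Sanity check: 189 MW of capacity producing 530,000 MWh per year.
capacity_mw = 189
annual_output_mwh = 530_000

hours_per_year = 24 * 365                        # 8,760 hours
max_possible_mwh = capacity_mw * hours_per_year  # ~1.66 million MWh at full output

capacity_factor = annual_output_mwh / max_possible_mwh
print(f"Implied capacity factor: {capacity_factor:.0%}")  # ~32%
```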

Besides AWS, other top cloud providers are also taking the renewable route, though their plans are not as ambitious as that of the market leader. Microsoft uses wind, solar, and hydro power plants to power its data centers, and estimates show that about 44 percent of its energy consumption comes from these sources. The company aims to reach the half-way mark by 2018, and keep improving from there on. In addition, Microsoft has been carbon neutral since 2012.

Google, another major player in the cloud market, takes a different approach to the problem of power consumption. The company believes it’s not practical to build large renewable energy farms wherever its data centers happen to be located; rather, these farms should be built in the places most conducive to generation. For example, solar farms belong in sunny states like Arizona and California, not in Minnesota or Wisconsin, even if that is where the data centers are.

This company thinks a more pragmatic approach would be to buy renewable energy to power its data centers, instead of harnessing it directly from farms. This way, the company may have more flexibility in terms of the provider.

Regardless of the approach, it’s heartening to see these top companies lead the way in making this planet a more sustainable and livable place for future generations.

The post Powering Cloud with Renewable Energy appeared first on Cloud News Daily.

Amazon Cloud Posts Yet Another Stellar Quarter Results

Amazon Cloud posted yet another set of impressive numbers for the last quarter, signaling the continued strength of this line of Amazon’s business. AWS reported sales of $3.2 billion, almost 55 percent higher than the $2.08 billion it posted during the same period a year ago. Operating income was $1.02 billion, up nearly 96 percent from $521 million a year earlier.

These numbers clearly show that AWS is growing at an incredible pace, despite intense competition from deep-pocketed companies such as Microsoft and Google. Much of this success can be attributed to a simple, clear strategy of helping businesses leverage the power of AWS to improve their performance. When AWS launched ten years ago, it allowed firms to rent computing capacity, meaning they paid only for what they used. Such a model made the cloud accessible to all companies, including startups with limited budgets.

Another important strategy that AWS followed was to make strategic partnerships at the right times. Recently, AWS announced a partnership with VMware, under which VMware’s cloud software will run on AWS. Other similar partnerships have helped AWS to gain a strong foothold in the cloud market, and this in turn, has helped it to stay ahead of its competitors.

Even during the earnings call, AWS reiterated that it will focus on helping more businesses move to AWS from both on-premises and hybrid environments. To this end, it has launched a new tool called Server Migration Service, which eases the process of moving legacy applications to the cloud. The tool helps IT teams create incremental replications of virtual machines from their on-premises infrastructure to AWS, with the aim of helping them reap the many benefits of the public cloud. This is an important move because moving legacy applications to the cloud is a painful process, to say the least, which is largely why many companies opt for a hybrid environment – and, as a result, miss out on the flexibility and cost savings a public cloud offers. With this tool, companies now have the option of moving their operations entirely to the cloud and making the most of the benefits it offers.
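
A hedged sketch of driving that replication with boto3 is below. It assumes the on-premises SMS connector has already been installed and has uploaded the VM inventory; the server ID is a placeholder for a value returned by the server catalog.

```python
from datetime import datetime, timedelta

import boto3

sms = boto3.client("sms", region_name="us-east-1")

# Refresh and list the catalog of on-premises VMs discovered by the
# SMS connector (assumed to be installed and configured already).
sms.import_server_catalog()
servers = sms.get_servers().get("serverList", [])
print("Discovered servers:", [s.get("serverId") for s in servers])

# Replicate one VM to AWS as an AMI, refreshing the image every 12 hours.
job = sms.create_replication_job(
    serverId="s-0123456789abcdef0",          # placeholder server ID
    seedReplicationTime=datetime.utcnow() + timedelta(minutes=30),
    frequency=12,
)
print("Replication job:", job["replicationJobId"])
```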

Besides this tool, Amazon Cloud has announced that it will add data center facilities in many new geographic regions. This move is in line with the trend of keeping data as close to customers as possible, so they experience lower latency and faster access speeds. Some countries, such as Germany, even mandate that certain data be kept within their sovereign borders, so the new facilities are also being set up to comply with such regulations. These strategies are likely to bring more benefits to AWS and its customers in the future.

Despite all this positive data, shares of Amazon fell almost six percent in after-hours trading. The drop came because the parent company’s profits were lower than expected. In that sense, Amazon Cloud may be the silver lining for the company.

The post Amazon Cloud Posts Yet Another Stellar Quarter Results appeared first on Cloud News Daily.

VMware’s Software on Amazon Cloud – Surprise!

For many years, VMware and Amazon have been on opposite sides of the infrastructure world. While VMware asked customers to run their businesses on their own servers, Amazon always encouraged companies to move to the cloud. Now, the two companies have teamed up to integrate their views and services.

Beginning next year, VMware’s software will run on Amazon’s cloud, letting VMware customers use their existing tools to manage their servers – except that those servers will now be located in the cloud. Alongside this, users can also make use of Amazon’s database and storage tools and services.

You can already run VMware’s virtual machines on Amazon’s cloud, and you can even use VMware’s management tool, vCenter, to manage them. So what’s different about this partnership? To start with, the two companies have decided to create a new version of Amazon’s cloud that will allow VMware’s virtual machines to run directly on it, without an Amazon software layer in between. The partnership will also give users the flexibility to run their software both in the cloud and in their existing data centers.

In many ways, this partnership has reiterated the fact that cloud is the present and the future, and no business can afford to ignore it. In addition, it’s also a significant milestone in the world of cloud computing, as VMware has gone from seeing Amazon as a rival, to admitting that its products are the future.

In fact, VMware made a brief foray into the cloud with its own product, vCloud Air, but it never really took off. As a strategic move, the company has decided to focus on its core business of running virtual machines while expanding into capabilities that allow those virtual machines to run in the cloud. This way, VMware can cater both to businesses that want to stay in their own data centers and to businesses that want to move to the cloud – a move likely to expand VMware’s market reach and customer base as it looks for ways to cope with the changing digital environment.

The partnership is significant for Amazon too, as it is an opportunity to reach customers who still haven’t migrated to the cloud. It would give Amazon a better foothold among conservative businesses that want to keep data on their local servers for reasons ranging from lack of familiarity to security concerns. In fact, the partnership with VMware can help Amazon reach these customers before its rivals do, increasing its share of a competitive market.

According to International Data Corporation (IDC), the deal will be significant in the short term for VMware, but in the longer term Amazon will be the bigger beneficiary, as it can bring more corporate customers under the AWS umbrella. The biggest winners, of course, are the customers, who can now choose to run their operations in their own VMware-equipped data centers, on Amazon Cloud, or both.


The post VMware’s Software on Amazon Cloud – Surprise! appeared first on Cloud News Daily.

Amazon Web Services Will Expand to India

Amazon Web Services has decided to enter India after having localized services in China. The decision was influenced by the fact that India is expected to see substantial growth and has produced a multitude of startups, including Flipkart and Snapdeal, payment services such as the Alibaba-backed Paytm, and Uber’s main rival, Ola. Gartner research vice president Ed Anderson has said, “Organizations in India seeking IT outsourcing services are increasingly turning to public cloud services as an alternative to traditional ITO offerings. In fact, cloud services are not only being used for low-value or transient workloads but also increasingly for production workloads, including some mission-critical initiatives.”

More traditional businesses are also part of Amazon Web Services. Companies like automobile giant Tata Motors, media firm NDTV, and national flower retail chain Ferns N Petals will be among the initial launch partners.

While Amazon Web Services is already available in India, its expansion of its cloud computing platform will improve service for existing customers while potentially drawing in new ones.


Andy Jassy, senior vice president for Amazon Web Services, said in a statement: “Tens of thousands of customers in India are using AWS from one of AWS’s eleven global infrastructure regions outside of India. Several of these customers, along with many prospective new customers, have asked us to locate infrastructure in India so they can enjoy even lower latency to their end users in India and satisfy any data sovereignty requirements they may have.”

The post Amazon Web Services Will Expand to India appeared first on Cloud News Daily.