Societe Generale Leads the Way in Cloud Adoption

Societe Generale, the Paris-based bank, is looking to leverage the cloud to lower its costs and provide better services to its customers. Eventually, it wants to become the largest European bank to adopt cloud computing for a substantial share of its operations.

To this end, it has entered into agreements with Microsoft and Amazon. In fact, Societe Generale’s developers and engineers have been running pilot programs on both Azure and AWS for more than a year now to assess the security and reliability of these public cloud platforms. More importantly, the pilots examined the feasibility of using the public cloud for banking transactions, where confidential information about users and processes is stored in giant third-party data centers far away.

So far, the tests have been satisfactory, and Societe Generale wants to begin using public cloud services by June. Initially, it plans to start with non-client, non-sensitive content such as financial research and marketing data. Depending on how these changes fare, the bank eventually plans to have 80 percent of its infrastructure and data on internal and external cloud systems.

One of the challenges that Societe Generale, or for that matter any bank in Europe, will face is the regulatory regime laid down by the ECB. Currently, the ECB restricts banks to using the cloud only for non-sensitive data and operations such as product development. However, these regulations are expected to ease in the near future, as banks are under greater pressure than ever before to reduce costs and improve efficiency.

According to IBM, moving to the cloud can initially save banks about ten percent of the budget allocated to information technology and operations. Continued use of the cloud can allow banks to save almost 40 percent of costs, because they can retire systems that are no longer needed. At the same time, their investment in capital infrastructure is greatly reduced when their operations run in the cloud.

Intense competition between banks is another factor that can propel them toward the cloud. Competition has made profit margins hard to come by, so more banks are looking to move their operations to the cloud to leverage its lower costs. On top of that, the millennial generation wants a digital banking experience in which everything is customized to meet their specific needs. To cater to this demand, banks are forced to embrace advanced technologies, and again they want to leverage the power of the cloud to run them.

A case in point is Big Data, which banks can use to better understand their customers and their expectations. With these insights, they can provide more customized service to their customers, which in turn can go a long way toward retaining existing customers and attracting new ones.

Due to these many advantages, some financial institutions have already started moving to the cloud. HSBC Holdings has partnered with Google, while Capital One Financial Corp has partnered with AWS to move its operations to the cloud. The trend has started in Europe too, with Societe Generale leading the way.

The post Societe Generale Leads the Way in Cloud Adoption appeared first on Cloud News Daily.

What Is a Cloud Broker, Anyway? | @CloudExpo #API #Cloud #WebPerf

We’ve all had that feeling before: The feeling that you’re missing something that everyone else is in on. For today’s IT leaders, that feeling might come up when you hear talk about cloud brokers.
Meanwhile, you head back into your office and deal with your ever-growing shadow IT problem. But the cloud-broker whispers and your shadow IT issues are linked.

If you’re wondering “what the heck is a cloud broker?” we’ve got you covered.

read more

More of the same: Challenges with DevOps evident but potential vast in new report

Quali, a company that specialises in providing cloud and DevOps automation software, went to a variety of US industry events during 2016 to ask attendees about their DevOps practices and the challenges they face. The verdict: almost half of applications in traditional environments were considered too complex for the cloud, while the vendor ecosystem remains open-ended, with no main player winning out.

This makes for something of a refreshing change compared to the majority of such studies which hit this reporter’s inbox; attendees at event x are most interested in technology y, which just so happens to be linked to event x, and so on. Quali put together responses from more than 2,000 IT executives at Cisco Live, VMworld, AWS re:Invent and more, finding that plenty of integration challenges remain.

The goal is clear. Quali cites a Forrester Research report, Master DevOps For Faster Delivery Of Software Innovation, in which the author, Diego Lo Giudice, explained: “For many companies, staying ahead of disruption means not only delivering new innovations but also modernising current software and systems. The underlying cultural shifts, process improvements, and automation of DevOps build the foundation for development teams to mature to the next generation of modern software development.”

Yet more than half (54%) of respondents said they had no access to self-service infrastructure, impacting productivity and increasing time to market, while more than a third said it takes up to a month to deliver infrastructure. Jenkins was the most frequently cited tool among those polled with 21% of the vote, ahead of Docker (16%), Puppet (14%) and Chef (13%).  

When it came to the challenges of integrating DevOps environments, existing company culture (14%) won it by a nose, ahead of testing automation (13%), legacy systems (12%), application complexity (11%) and budget constraints (11%). Regarding hybrid cloud environments, those who have gone down that path are running only 23% of their apps on a hybrid cloud platform on average. Almost two thirds (65%) are running fewer than 24 applications in hybrid environments.

In the context of other research reports, these figures make sense. In January, B2B research firm Clutch set out to get a definitive answer to what DevOps means for organisations and how it is being implemented and, well, didn’t: 95% of the 250 organisations polled either already use or plan to use DevOps methodologies, but nobody could agree on a proper definition.

The cultural aspect has also been mused upon by this publication. Writing last year, David Auslander noted the importance of the tenet that ‘sometimes the greatest change in the enterprise comes from non-technical places’, adding: “While we are implementing cloud environments, common code repositories, agile development practices and infrastructure as code, we need to keep in mind that the cultural aspects of DevOps implementation are just as important, if not more so.”

“What stood out most to us were some of the barriers around DevOps including culture, test automation and integration of legacy investments,” said Shashi Kiran, Quali chief marketing officer. “These issues are consistent with patterns we’re seeing every day.”

Read more: If you can’t take your lab to the cloud – bring your cloud to the lab

Producing an Effective IT Strategy | @CloudExpo #API #Cloud #Agile

A strategy is a planning document that sets a direction for future work to ensure that you end up where you want to be. A strategy allows you to see the wood, despite the trees. A strategy is often used as a management tool for securing the resources needed to get there.
IT is now part of the business, and forward-looking organisations will have senior IT people responsible for helping devise the corporate business strategy.

read more

Get to know your cloud storage bill: How to choose the best options

When it comes to using a public cloud, there are incredible advantages – for a price. But what are you really paying for?

While the flat rate you see advertised may be appealing, there are several details that can raise or lower your cloud storage bill. Could you be paying for something you don’t need, and how can you lower your expenses by factoring in certain specifics before you choose a provider? Get a closer look at your cloud storage bill:

Itemising your bill

1. Price per GB: Most cloud providers price based on the number of gigabytes used. Providers like Amazon reduce their per-GB rate if you require a massive amount of storage, while others keep it steady regardless of your data volume. In either case, the rate is affected by how redundant or active your data is – in other words, whether it just sits there or is frequently interacted with. The idea is to reward businesses that use one cloud as their primary point of storage rather than branching out over several.

2. Storage actions: Put simply, storage actions are all the changes, adjustments, and deletions of the data within your cloud storage. If you move something to a new folder, delete it entirely, or upload it, these all count as actions that your cloud provider will track, tally up, and charge for. Some providers don’t charge for storage actions, which can catch businesses off guard when they move to a provider that does.

3. Transfer costs: Working within the cloud, whether public or hybrid, is free. However, some providers charge a fee for moving data out of their storage. While most will allow companies to transfer data in at no cost, migrating to a separate cloud, pulling data out for edits and then replacing it, or sharing data across multiple clouds can incur significant expense.
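The three line items above lend themselves to a simple back-of-the-envelope estimator. The sketch below is illustrative only: the function name and the per-GB, per-action, and egress rates are made-up placeholders, not any provider’s actual pricing.

```python
# Hypothetical itemised cloud storage bill estimator.
# All rates are example placeholders, not real provider pricing.

def estimate_monthly_bill(gb_stored, storage_actions, gb_transferred_out,
                          price_per_gb=0.023, price_per_1k_actions=0.005,
                          egress_per_gb=0.09):
    """Return an itemised estimate of a monthly cloud storage bill."""
    storage_cost = gb_stored * price_per_gb            # item 1: price per GB
    action_cost = (storage_actions / 1000) * price_per_1k_actions  # item 2
    transfer_cost = gb_transferred_out * egress_per_gb  # item 3: egress fees
    return {
        "storage": round(storage_cost, 2),
        "actions": round(action_cost, 2),
        "transfer": round(transfer_cost, 2),
        "total": round(storage_cost + action_cost + transfer_cost, 2),
    }

# Example: 500 GB stored, 200,000 actions, 50 GB transferred out.
print(estimate_monthly_bill(500, 200_000, 50))
```

Plugging in a provider’s real rate card in place of the defaults makes it easy to compare options side by side before committing.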

How to choose the best option: Consider employee fluency

Ultimately, it’s your employees who will be interacting with the cloud. How easy that interaction is will affect the hours you pay for, the amount of tech support required to help them navigate the new platform, and how efficiently the data is used – which affects your profits.

If your employees aren’t fluent with the cloud, they may resort to trial and error when managing data and drive up the price of your storage actions. That makes an option with lower action fees – or none at all – the best choice. On the flip side, while it may seem more cost-effective to choose the cheapest option, shelling out for a platform with particular features could help your employees work more efficiently, boosting profits in the end.

How redundant or active is your storage?

For businesses that don’t work mainly online, cloud storage can be a way to safeguard data that isn’t used often – redundant data. For online companies or larger corporations, however, data sharing and online collaboration are a chief part of the work – active data. A cloud option that prices according to how often the data is interacted with is key to cutting out extra expenses you may be incurring without knowing it. Consider how much interaction your data will get on a regular basis, then check for options that accommodate your active or redundant data.

How important is storage transfer?

If you’re not sure you’ll stick with your current cloud option, need to transfer data in and out regularly, or like to spread your data across many clouds for better accessibility, choosing an option with the lowest transfer fee – or none at all – is crucial to shaving off expenses. However, if you intend to stay put and work within one cloud, you can save money over other options by taking advantage of a provider’s added features instead.

The cloud provider you choose and the actions you take with your storage all hinge on fine details that few businesses know about their real cloud bill. To save money and improve your data storage, keep these in mind.

[session] Docker and VM | @DevOpsSummit #VM #AI #DevOps #Serverless

Virtualization has, over the past few years, become a key strategy for IT to achieve multi-tenancy, increase utilization, develop elasticity and improve security. Virtual machines (VMs) are quickly becoming a main vehicle for developing and deploying applications. The introduction of containers brings another, perhaps overlapping, solution for achieving the same benefits. Are a container and a virtual machine fundamentally the same or different? And how? Is one technically superior to the other? What about performance and security? Does IT need either one, or both?

read more

Four Best Practices for Agile API Load Testing | @CloudExpo #API #Cloud #Agile

One important organizational point of Agile API delivery is the concept of a team producing a usable version – which should result in improved alignment of development, QA, and technical operations teams. Perhaps an even larger benefit of Agile is that it allows teams to effectively manage change in both external requirements and software functionality, while simultaneously accelerating the development of quality software. Part of the challenge with managing this change, however, is providing quick feedback on when a change in the software happens, and its resulting impact.

read more

[session] #DeepLearning, Trading & #FinTech | @CloudExpo #BigData #AI #ML

Deep learning has been very successful in the social sciences, especially in areas where there is a lot of data. Trading is another field that can be viewed as a social science with a lot of data. With the advent of deep learning and Big Data technologies for efficient computation, we are finally able to use the same methods in investment management as we would in face recognition or in building chatbots. In his session at 20th Cloud Expo, Gaurav Chakravorty, co-founder and Head of Strategy Development at qplum, will discuss the transformational impact of artificial intelligence and deep learning in making trading a scientific process. This focus on learning a hierarchical set of concepts is truly making investing a scientific process, a utility.

read more

Peak 10 Completes Data Center Expansions | @CloudExpo @Peak_Ten #Cloud #DataCenter

Peak 10 has announced that it has completed a 20,000 square foot expansion of its Cincinnati-area data center, a 6,000 square foot expansion of its data center campus in Charlotte’s University Research Park, and added a pair of seasoned executives to its leadership team. This further propels the company on its aggressive growth trajectory to meet the rising demand for flexible hybrid IT strategies and solutions across its enterprise customer base.
Cincinnati is home to companies like Kroger, Procter & Gamble, and GE. The expansion of Peak 10’s facility, located in West Chester, OH, was driven by the increasing demand for its IT infrastructure solutions. Since first entering the Cincinnati market in 2008, Peak 10 has seen steady customer growth, primarily in the healthcare, manufacturing and technology industries. The company serves hundreds of area customers, including Gorilla Glue, CareStar and Aero Fulfillment Services.

read more