Category archive: Data center

Huawei Cloud pushes partners’ global business expansion at Go-Global Summit 2025

Huawei Cloud announced several solutions and technologies at the Huawei Cloud Go-Global Summit 2025 (Chongqing, China, March 24-25) designed to support Chinese companies expanding their businesses into overseas markets. Jacqueline Shi, President of Huawei Cloud’s Global Marketing and Sales Service, said the key to expansion was a change in practice, […]


Oracle’s $5bn UK cloud investment

Oracle has announced plans to invest US$5 billion to expand its cloud infrastructure in the UK over the next five years. The March 17 announcement aims to support the UK Government’s priorities for what it terms an AI-driven future, and to meet global demand for Oracle’s cloud computing services. The investment will expand Oracle Cloud […]


Ten year liquid cooling market analysis and forecast

The ‘Data Centre Liquid Cooling Market – A Global and Regional Analysis: Focus on Product, Application, and Country Analysis – Analysis and Forecast, 2024-2034’ report predicts a strong CAGR of 23.96% for the market between 2024 and 2034. According to the forecast, the global data centre liquid cooling market is expected to reach $48.42 billion in […]


DCIM software market to reach $3.63B by 2029

The global Data Center Infrastructure Management (DCIM) software market is on track for significant expansion, with projections showing growth from $2.02 billion in 2023 to $3.63 billion by 2029, according to a new report from ResearchAndMarkets.com. This robust growth trajectory, marked by a compound annual growth rate (CAGR) of 10.1%, reflects the increasing importance of DCIM solutions […]


Europe’s ‘AI factory’ builds smart cloud ambitions

European cloud company evroc plans to boost Europe’s standing in artificial intelligence by building a 96 MW data centre in Mougins, France, tailor-made for AI workloads. Construction is expected to wrap up in 2025, and the facility’s capacity will expand in phases over time. What sets an AI factory apart? evroc terms what it’s building, […]


Gartner Data Center Conference: Success in the Cloud & Software Defined Technologies

I just returned from the Gartner Data Center conference in Vegas and wanted to convey some of the highlights of the event. This was my first time attending a Gartner conference, and I found it pretty refreshing, as they take an agnostic approach to all of their sessions, unlike a typical vendor-sponsored event such as VMworld, EMC World, Cisco Live, etc. Most of the sessions I attended were around cloud and software-defined technologies. Below, I’ll bullet out what I consider to be highlights from a few of the sessions.

Building Successful Private/Hybrid Clouds –

 

  • Gartner sees the majority of private cloud deployments as unsuccessful. Here are some common reasons for that…
    • Focusing on the wrong benefits. It’s not all about cost in dollars; in the cloud, true ROI is measured in agility rather than dollars and cents.
    • Doing too little. A virtualized environment does not equal a private cloud. You must have automation, self-service, monitoring/management, and metering in place at a minimum.
    • Doing too much. Putting applications/workloads in the private cloud that don’t make sense to live there. Not everything is a fit, nor can everything take full advantage of what cloud offers.
    • Failure to change operational models. It’s like being trained to drive an 18-wheeler, then getting behind the wheel of a Ferrari and wondering why you ran into that tree.
    • Failure to change the funding model. You must, at a minimum, have a showback mechanism so the business understands the costs; otherwise they’ll just throw the kitchen sink into the cloud (see the showback sketch after this list).
    • Using the wrong technologies. Make sure you understand the requirements of your cloud and choose the proper vendors/technologies. Incumbents may not necessarily be the right choice in all situations.
  • Three common use cases for building out a private cloud include outsourcing commodity functions, renovating infrastructure and operations, and innovation/experimentation…but you have to have a good understanding of each of these to be successful (see above).
  • There is a big difference between doing cloud to drive bottom-line (cost) savings vs top-line (innovation) revenue expansion. Know ‘why’ you are doing cloud!
  • On the hybrid front, it is very rare today to see fully automated environments that span private and public clouds, as the technology still has some catching up to do. That said, it will be reality within 24 months without a doubt.
  • In most situations, only 20-50% of all applications/workloads will (or should) live in cloud infrastructure (private or public), with the rest living in traditional frameworks. Again, not everything can benefit from the goodness that cloud can bring.
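
On the funding-model point, here is a minimal showback sketch in Python, assuming hypothetical usage records and unit rates; a real deployment would pull both from the metering component of the cloud management platform rather than hard-coding them.

```python
from collections import defaultdict

# Hypothetical unit rates; real rates would come from the metering system.
RATES = {"vcpu_hours": 0.04, "gb_ram_hours": 0.01, "gb_storage_days": 0.002}

# Hypothetical per-department usage records for one month.
usage = [
    {"dept": "finance",   "vcpu_hours": 1200, "gb_ram_hours": 4800, "gb_storage_days": 900},
    {"dept": "marketing", "vcpu_hours": 300,  "gb_ram_hours": 1200, "gb_storage_days": 5000},
]

def showback(records):
    """Aggregate resource consumption into a per-department cost view."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["dept"]] += sum(rec[metric] * rate for metric, rate in RATES.items())
    return dict(totals)

if __name__ == "__main__":
    for dept, cost in showback(usage).items():
        print(f"{dept}: ${cost:,.2f}")
```

The point isn’t the arithmetic; it’s that once consumption is visible per department, the business can weigh what it puts in the cloud instead of treating capacity as free.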

Open Source Management Tools (Free or Flee) –

 

  • Organizations with fewer than 2,500 employees typically look at open source tools to save on cost, while larger organizations are interested in competitive advantage and improved security.
  • The largest adoption is in the areas of monitoring and server configuration, while cloud management platforms (e.g. OpenStack), networking (e.g. OpenDaylight), and containers (e.g. Docker) are gaining momentum.
  • When considering one of these tools, it is very important to look at how active the community is, to ensure the tool stays relevant.
  • Where is open source being used in the enterprise today? Almost half (46%) of deployments are departmental, while only about 12% of deployments are considered strategic to the overall organization.
  • The best slide I saw at the event, which pretty much sums up open source…

 

[Slide: Gartner Data Center Conference]

 

If this makes you excited, then maybe open source is for you.  If not, then perhaps you should run away!

3 Questions to Ask Your SDN Vendor –

  • First, a statistic: organizations that fail to properly integrate their virtualization and networking teams will see a 3x longer MTR (mean time to resolution) for issues versus those that do properly integrate the teams.
  • There are approximately 500 true production SDN deployments in the world today.
  • The questions to ask…
    • How do you prevent network congestion caused by dynamic workload placement?
    • How do you connect to bare-metal (non-virtualized) servers?
    • How do you integrate management and visibility between the underlay and overlay?
  • There are numerous vendors in this space; it’s not just VMware and Cisco.
  • Like private cloud, you really have to do SDN for the right reasons to be successful.
  • Last year at this conference, zero attendees indicated they had investigated or deployed SDN. This year, 14% of attendees responded positively.

 

If you’re interested in a deeper discussion around what I heard at the conference, let me know and I’ll be happy to continue the dialogue.

 

By Chris Ward, CTO. Follow Chris on Twitter @ChrisWardTech. You can also download his latest whitepaper on data center transformation.

 

 

Riding on the Cloud – The Business Side of New Technologies

For the last couple of years “The Cloud” has been a buzzword all over the business and IT world.

What is The Cloud? Basically, it is the ability to use remote servers to handle your processing, storage and other IT needs. In the olden days you only had the resources that you physically had on your computer; these days that’s not the case. You can “outsource” resources from another computer in a remote location and use them anywhere. This has opened so many doors for the world of business and has helped bring new companies onto the internet.

Why? Because of how much it reduces the cost of being on the internet. A server is a costly piece of equipment and not everybody can afford it. Between the initial cost and upkeep of the hardware, you could easily spend a few thousand pounds every year.

The cloud has brought on the Virtual Private Server, which gives you all the benefits of an actual server without the hefty price tag. A hosting company will rent out a piece of their processing capabilities to your company and create a server environment for you. You only pay for what you use and you don’t have to worry about things like hardware failure, power costs or having room for a couple of huge server racks.

But what if your business grows? One of the biggest advantages of the cloud is that it can grow along with your business and your needs. It’s highly scalable and flexible, so if you ever need some extra storage or extra bandwidth, it’s a really easy fix that does not require you to purchase new equipment.

Since your own personal business cloud is by definition a remote solution, this means that you can access it from anywhere and everywhere as long as you have an internet connection. Want to make changes to your server? You can probably do it without leaving your house, even from the comfort of your own bed.

The same applies to your staff. If anyone ever needs to work from home or from another machine that’s not their work computer, all of the important files and resources they could possibly need can be hosted in the cloud, making those files accessible from anywhere. If someone’s office computer breaks there’s a backup and no data is lost.

The Cloud also makes sharing files between members of your staff a lot easier. Since none of the files are hosted on a local machine, everybody has access to the files they require. Files update in real time, applications are shared and you can create a business environment that’s far more effective.

Of course, the cloud still offers security and access control so you can keep track of who can see which files. A good cloud services provider also provides protection against malware and other security risks, to make sure that no pesky interlopers get into your files.

If your business is growing and so are your IT needs, then the cloud is an option worth exploring. Embrace the future, adopt new technologies and take your business to the next level.

Amazon, Google: a Battle to Dominate the Cloud

The cloud is just a vast mass of computers connected to the internet, on which people or companies can rent processing power or data storage as they need it.

All the warehouses of servers that run the whole of the internet, all the software used by companies the world over, and all the other IT services companies hire others to provide, or which they provide internally, will be worth some $1.4 trillion in 2014, according to Gartner Research—some six times Google and Amazon’s combined annual revenue last year.

Eventually, all the world’s business IT needs will be delivered as a service, like electricity; you won’t much care where it is generated, as long as the supply is reliable.

Way back in 2006, Amazon had the foresight to start renting out portions of its own, already substantial cloud—the data centers on which it was running Amazon.com—to startups that wanted to pay for servers by the hour, instead of renting them individually, as was typical at the time. Because Amazon was so early, and so aggressive—it has lowered prices for its cloud services 42 times since first unveiling them, according to the company—it first defined and then swallowed whole the market for cloud computing and storage.

Even though Amazon’s external cloud business is much bigger than Google’s, Google still has the biggest total cloud infrastructure—the most servers and data centers. Tests of Amazon’s and Google’s clouds show that by one measure at least—how fast data is transferred from one virtual computer to another inside the cloud—Google’s cloud is seven to nine times faster than Amazon’s.
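
For readers curious how such intra-cloud numbers are gathered, below is a minimal sketch of a VM-to-VM throughput test; the peer address and transfer size are hypothetical, and a real benchmark would typically use a dedicated tool such as iperf rather than hand-rolled sockets.

```python
import socket
import time

CHUNK = b"x" * (1024 * 1024)   # 1 MiB of dummy data per send
TOTAL_MIB = 256                # total amount to transfer

def serve(port: int = 5001) -> None:
    """Run on the receiving VM: accept one connection and drain it."""
    with socket.create_server(("0.0.0.0", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            while conn.recv(65536):
                pass

def measure(host: str, port: int = 5001) -> float:
    """Run on the sending VM: push TOTAL_MIB and return throughput in MiB/s."""
    start = time.monotonic()
    with socket.create_connection((host, port)) as sock:
        for _ in range(TOTAL_MIB):
            sock.sendall(CHUNK)
    return TOTAL_MIB / (time.monotonic() - start)

if __name__ == "__main__":
    # Hypothetical private address of the peer VM inside the same cloud.
    print(f"{measure('10.0.0.2'):.1f} MiB/s")
```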

The question is, is Amazon’s lead insurmountable?

 

Take a Photo Tour of Facebook’s Amazing Cold Storage Datacenter

There’s a fascinating photo tour of Facebook’s Oregon data center on ReadWrite today.

Facebook (arguably) owns more data than God.

But how to store a cache of user data collected at the scale of omniscience? If you’re Facebook, just build another custom-crafted server storage locker roughly the size of the USS Abraham Lincoln on top of a breezy plateau in the Oregon high desert. The company’s new Prineville, Ore., data center employs an ultra-green “cold storage” plan designed from the ground up to meet its unique—and uniquely huge—needs.

The piece also includes useful links on the tech behind the data center, shingled drive tech, and the Open Compute project that led to the innovations on display here.

What’s Missing from Today’s Hybrid Cloud Management – Leveraging Brokerage and Governance

By John Dixon, Consulting Architect, LogicsOne

Recently GreenPages and our partner Gravitant hosted a webinar on Cloud Service Broker technology. Senior Analyst Dave Bartoletti gave a preface to the webinar with Forrester’s view on cloud computing and emerging technology. In this post we’ll give some perspective on highlights from the webinar. In case you missed it, you can also watch a replay of the webinar here: http://bit.ly/12yKJrI

Ben Tao, Director of Marketing for Gravitant, kicks off the discussion by describing the traditional data center sourcing model. Two key points here:

  1. Sourcing decisions, largely based on hardware selection, are separated by years
  2. In a cloud world, sourcing decisions can be separated by months or even weeks

 

The end result is that cloud computing can drive the benefit of a multi-sourcing model for IT, where sourcing decisions are made in close proximity to the use of services. This has the potential to enable organizations to adjust their sourcing decisions more often, to best suit the needs of their applications.

Next, Dave Bartoletti describes the state of cloud computing and the requirements for hybrid cloud management. The core of Dave’s message is that the use of cloud computing is on the rise, and that cloud is being leveraged for more and more complex applications – including those with sensitive data.

Dave’s presentation is based on the statement, “what IT must do to deliver on the hybrid cloud promise…”

Some key points here:

  • Cloud is about IT services first, infrastructure second.
  • You won’t own the infrastructure, but you’ll own the service definitions; take control of your own service catalog (see the sketch after this list).
  • The cloud broker sits at the center of the SaaS provider, cloud VAR, and cloud integrator.
  • Cloud brokers can accelerate the cloud application lifecycle.
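
To illustrate the second point, here is a minimal sketch of what “owning your own service definitions” might look like in practice: the catalog entry describes the service, and the sourcing decision (which provider fulfils it) is kept separate and swappable. All names, sizes, and providers below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One service definition owned by IT, independent of any provider."""
    name: str
    description: str
    sizes: dict = field(default_factory=dict)      # t-shirt size -> resource spec
    providers: list = field(default_factory=list)  # candidate sourcing options

# Hypothetical entry: the service is defined once; sourcing can change later.
web_tier = CatalogEntry(
    name="standard-web-server",
    description="Hardened Linux web server with monitoring agent installed",
    sizes={"small": {"vcpu": 2, "ram_gb": 4}, "large": {"vcpu": 8, "ram_gb": 16}},
    providers=["internal-private-cloud", "public-cloud-a", "public-cloud-b"],
)

if __name__ == "__main__":
    print(web_tier.name, "->", list(web_tier.sizes))
```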

 

Dave does an excellent job of explaining the things that IT must do in order to deliver on the hybrid cloud promise. Often, conversations on cloud computing are purely about technology, but I think there’s much more at stake. For example, Dave’s first two points above really resonate with me. You can also read “cloud computing” as ITIL-style sourcing. Cloud computing puts service management back in focus. “Cloud is about IT services first, infrastructure second,” and “You won’t own the infrastructure […]” also suggest that cloud computing may influence a shift in the makeup of corporate IT departments – fewer core technologists and more “T-shaped” individuals. So-called T-shaped individuals have knowledge and experience with a broad set of technologies (the top of the “T”), but have depth in one or more areas like programming, Linux, or storage area networking. My prediction is that there will still be a need for core technologists, but that some of them may move into roles where they do things like define customer-facing IT services. For this reason, our CMaaS product also includes optional services to address this type of workforce transformation. This is an example of a non-technical decision that must be made when considering cloud computing. Do you agree? Do you have other non-technical considerations for cloud computing?

Chris Ward, CTO of LogicsOne, then dives into the functionality of the Cloud Management as a Service (CMaaS) offering. The GreenPages CMaaS product implements some key features that can be used to help customers advance to the lofty points that Dave suggests in his presentation. CMaaS includes a cloud brokerage component and a multi-cloud monitoring and management component. Chris details some main features of the brokerage tool, which are designed to address the key points that Dave brought up:

  • Collaborative Design
  • Customizable Service Catalog
  • Consistent Access for Monitoring and Management
  • Consolidated Billing Amongst Providers (illustrated in the sketch after this list)
  • Reporting and Decision Support
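
As a rough illustration of the consolidated-billing idea, here is a minimal sketch that normalises line items from two hypothetical providers into a single report; it is not how the CMaaS tool itself is implemented, just the general pattern a broker follows.

```python
def normalise(provider: str, item: dict) -> dict:
    """Map a provider-specific line item onto a common schema."""
    if provider == "provider-a":
        return {"service": item["sku"], "usd": item["cost"]}
    if provider == "provider-b":
        return {"service": item["product"], "usd": item["amount_cents"] / 100}
    raise ValueError(f"unknown provider: {provider}")

# Hypothetical invoices, each in its provider's native shape.
invoices = {
    "provider-a": [{"sku": "compute", "cost": 412.50}, {"sku": "storage", "cost": 88.10}],
    "provider-b": [{"product": "compute", "amount_cents": 30125}],
}

if __name__ == "__main__":
    total = 0.0
    for provider, items in invoices.items():
        for item in items:
            line = normalise(provider, item)
            total += line["usd"]
            print(f"{provider:12s} {line['service']:10s} ${line['usd']:9.2f}")
    print(f"{'TOTAL':23s} ${total:9.2f}")
```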

Chris then gives an example from the State of Texas and the benefits that they realized from using cloud through a broker. Essentially, with the growing popularity of e-voting and the use of the internet as an information resource on candidates and issues, the state knew the demand for IT resources would skyrocket on election day. Instead of throwing away money to buy extra infrastructure to satisfy a temporary surge in demand, Texas utilized cloud brokerage to seamlessly provision IT resources in real time from multiple public cloud sources to meet the variability in demand.

All in all, the 60-minute webinar is time well spent and gives clients some guidance to think about cloud computing in the context of a service broker.

To view this webinar in its entirety, click here, or download this free whitepaper to learn more about hybrid cloud management.