Category archive: Big Data

New Updates For HP’s Big Data Platform Haven

HP has updated its big data platform Haven to include new analytics and predictive capabilities. The platform is geared towards enterprises with large volumes of data of many types, and the update expands the types of data that can be analyzed through a new connector framework. A new Knowledge Graphing feature will be introduced, along with improved speech recognition and language identification features.

 

The Haven big data platform is made up of analytics, hardware and services, some of which are available on demand. HP launched the platform in 2013, with Haven serving as the umbrella for various technologies. The update brings together analytics for structured and unstructured data by combining the context-aware, unstructured-data analytics of HP IDOL with the SQL-based capabilities of HP Vertica.

 


 

Data sources supported through the new connector framework include Microsoft Exchange, SharePoint, Oracle and SAP enterprise applications, as well as cloud services such as Box, Salesforce and Google Drive.

 

The knowledge-graphing feature mentioned above can analyze connections in data, enabling advanced, contextually aware research across assorted data sources. The update's enhanced speech and language capabilities work with 20 languages; this part of Haven is powered by deep neural network technology trained on thousands of hours of audio samples.

 

Other enhancements include a targeted query response capability and an IDOL search optimizer. Targeted query response helps customize and improve search results based on specific criteria, while the IDOL search optimizer is used to understand the types of searches users are running and to gauge the quality of the results.

 

The goal of HP’s Haven platform is to let large companies benefit from big data computing across almost any data type without relying on specialized data scientists or costly, complex integration projects.


Business Cloud News magazine Issue 2 | March/April 2015

Business Cloud News is proud to announce that the second issue of BCN is now available online.

In this issue we focus on the two interrelated trends – which create equally entangled issues and questions – and the cloud’s role therein: Big Data and the Internet of Things.

We asked over 700 senior IT decision makers globally about their big data rollout plans in order to get a better sense of their views on where the challenges and bottlenecks lie, and crucially, what components of their data systems will move to the cloud, when, and why.

Also in this issue, we look at the role open source cloud technologies are playing in the rapid transformation of the film and TV industries; the IT strategy driving the innovative vehicle manufacturer Nissan; and the growing presence of cloud computing in what is often thought to be a technologically conservative industry – the financial services sector.

Elsewhere in this issue we look at the shifts in the core capabilities that underpin and to some extent enable cloud computing: the emergence and impact of software-defined networking, and computational heterogeneity in the cloud.

We hope you enjoy issue #2 of BCN!

 

Cloud-based data management provider Reltio scores $10m

Reltio scored $10m, which will be used to expand its sales and marketing efforts

Reltio, a startup founded by Informatica veterans, has secured $10m in its first round of funding and announced the launch of its cloud-based data management platform.

Much like the integration element Informatica specialises in, Reltio is pitching its services at those that don’t necessarily want to acquire and set up all of the front-end and back-end big data tools in a piecemeal, siloed fashion, but instead want an integrated platform that can query, analyse and display multiple data types.

The company said its data management platform is designed for those accustomed to using services like Facebook or Linkedin, but within traditionally data-intense industries like healthcare and life sciences, oil and gas, retail and distribution.

“Data is the new natural resource, but it’s truly valuable only when it’s effectively mined, related and transformed into insight with business actions that can be taken within the context of day-to-day operations,” said Manish Sood, founder and chief executive officer of Reltio.

“With Reltio, data is collated and analysed for actionable intelligence with the speed needed to support innovation and spark new revenue streams. IT gets a modern data management platform while business users get easy to use data-driven applications to address their everyday needs,” Sood said.

The company was founded largely by Informatica data management specialists: Sood led product strategy for master data management at Informatica; Anastasia Zamyshlyaeva, chief architect for Reltio, helped design the core components of Informatica’s MDM offering; Curt Pearlman, vice president of solutions, previously held positions in sales consulting with Informatica, as did Bob More, Reltio’s senior vice president of sales.

Reltio is throwing its hat into an increasingly competitive but lucrative ring. Analyst firm IDC estimates spending on big data and analytics will reach $125bn in 2015, with Database-as-a-Service growing in importance as cloud and commercial vendors open up their data sets.

Every little helps: How Tesco is bringing the online food retail experience back in-store

Tesco is in the midst of overhauling its connectivity and IT services

Food retailers in the UK have for years spent millions of pounds on going digital and cultivating a web presence, which includes the digitisation of product catalogues and all of the other necessary tools on the backend to support online shopping, customer service and food delivery. But Tomas Kadlec, group infrastructure IT director at Tesco, tells BCN more emphasis is now being placed on bringing the online experience back into physical stores, which is forcing the company to completely rethink how it structures and handles data.

Kadlec, who is responsible for Tesco’s IT infrastructure strategy globally, has spent the better part of the past few years building a private cloud deployment model the company could easily drop into regional datacentres that power its European operations and beyond. This has largely been to improve the services it can provide to clients and colleagues within the company’s brick and mortar shops, and support a growing range of internal applications.

“If you look at what food retailers have been doing for the past few years it was all about building out an online extension to the store. But that trend is reversing, and there’s now a kind of ‘back to store’ movement brewing,” Kadlec says.

“If we have 30,000 to 50,000 SKUs in one store at any given time, how do you handle all of that data in a way that can contribute digital feature-rich services for customers? And how do you offer digital services to customers in Tesco stores that cater to the nuances in how people act in both environments?  For instance, people like to browse more in-store, sometimes calling a friend or colleague to ask for advice on what to get or recipes; in a digital environment people are usually just in a rush to head for the checkout. These are all fairly big, critical questions.”

Some of the digital services envisioned are fairly ambitious and include being able to queue up tons of product information – recipes, related products and so forth – on mobile devices by scanning items with built-in cameras, and even, down the line, paying for items on those devices. But the food retail sector is one of the most competitive in the world, and it’s possible these kinds of services could be a competitive differentiator for the firm.

“You should be able to create a shopping list on your phone and reach all of those items in-store easily,” he says. “When you’re online you have plenty of information about those products at your fingertips, but far less when you’re in a physical store. So for instance, if you have special dietary requirement we should be able to illuminate and guide the store experience on these mobile platforms with this in mind.”

“The problem is that in food retail the app economy doesn’t really exist yet. It exists everywhere else, and in food retail the app economy will come – it’s just that we as an industry have failed to make the data accessible so applications aren’t being developed.”

To achieve this vision, Tesco had to drastically change its approach to data and how it’s deployed across the organisation. The company originally started down the path of building its own API and offering internal users a platform-as-a-service to enable more agile app development, but Kadlec says the project quickly morphed into something much larger.

“It’s one thing to provide an elastic compute environment and a platform for development and APIs – something we can solve in a fairly straightforward way. It’s another thing entirely to expose the information you need for these services to work effectively in such a scalable system.”

Tesco’s systems handle and structure data the way many traditional enterprises within and outside food retail do – segmenting it by department, by function, and in alignment with the specific questions the data needs to answer. But the company is trying to move closer to a ‘store and stream now, ask questions later’ type of data model, which isn’t particularly straightforward.

“Data used to be purpose-built; it had a clearly defined consumer, like ERP data for example. But now the services we want to develop require us to mash up Tesco data and open data in more compelling ways, which forces us to completely re-think the way we store, categorise and stream data,” he explains. “It’s simply not appropriate to just drag and drop our databases into a cloud platform – which is why we’re dropping some of our data systems vendors and starting from scratch.”
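To make the contrast Kadlec describes more concrete, here is a minimal sketch of purpose-built, schema-first storage versus a ‘store and stream now, ask questions later’ raw event log. The table layout, event fields and file names are illustrative assumptions, not Tesco’s actual systems.

```python
import json
import sqlite3
import time

# Purpose-built model: data lands in a schema designed around one known
# question (an ERP-style stock table, segmented by department).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stock (sku TEXT, department TEXT, on_hand INTEGER)")
conn.execute("INSERT INTO stock VALUES ('SKU-001', 'Bakery', 42)")

# 'Store and stream now, ask questions later': keep the raw event as-is
# (schema-on-read), so future services can mash it up with other data
# without a schema redesign.
event = {
    "ts": time.time(),
    "type": "scan",                             # illustrative event type
    "sku": "SKU-001",
    "store": "store-001",                       # hypothetical store id
    "payload": {"device": "customer-phone"},    # whatever else was captured
}
with open("events.jsonl", "a") as log:          # append-only raw event log
    log.write(json.dumps(event) + "\n")
```

The first model answers its intended question quickly but resists new uses; the second defers structure to query time, which is closer to the mashed-up, multi-source services described above.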

Kadlec says the debate now centres on how the company can effectively democratise data while keeping critical kinds of information – like consumers’ personal information – secure and private: “There should only be two types of data. Data that should be open, and we should make sure we make that accessible, and then there’s the type of data that’s so private people get fired for having made it accessible – and setting up very specific architectural guidelines along with this.”

The company hasn’t yet had the security discussion with its customers, which is why Kadlec says the systems Tesco puts in place initially will likely focus on improving internal efficiency and productivity – “so we don’t have to get into the privacy data nightmare”.

The company also wants to improve connectivity to its stores to better service both employees and customers. Over the next 18 months the company will implement a complete overhaul of store connectivity and infrastructure, which will centre on delivering low-latency bandwidth for in-store wifi and quadrupling the number of access points. It also plans to install 4G signal booster cells in its stores to improve GSM-based connectivity. Making sure that infrastructure is secure so that customer data isn’t leaked is a top priority, he says.

Tesco is among a number of retailers to make headlines as of late – though not because of datacentre security or customer data loss, but because the company, having significantly inflated its profits by roughly £250m, is in serious financial trouble. But Kadlec says what many may see as a challenge is in fact an opportunity for the company.

One of the things the company is doing is piloting OmniTrail’s indoor location awareness technology to improve how Tesco employees are deployed in stores and optimise how they respond to changes in demand.

“If anything this is an opportunity for IT. If you look at the costs within the store today, there are great opportunities to automate stuff in-store and make colleagues within our stores more focused on customer services. If for instance we’re looking at using location-based services in the store, why do you expect people to clock in and clock out? We still use paper ledgers for holidays – why can’t we move this to the cloud? The opportunities we have in Tesco to optimise efficiency are immense.”

“This will inevitably come back to profits and margins, and the way we do this is to look at how we run operations and save using automation,” he says.

Tomas is speaking at the Telco Cloud Forum in London April 27-29, 2015. To register click here.

CIO Focus Interview: David Chou

This is the fourth installment of our CIO Focus Interview series. This time, I spoke with David Chou, the CIO of a large academic medical center. A recognized thought leader, David is on the Huffington Post’s 2015 list of the top 100 most social CIOs on Twitter, and I would definitely recommend following him. Enjoy!

 

Ben: Could you give us some background on your IT experience?

David: I was fortunate to be put on the IT fast track. I was your typical college student getting a BA in Computer Science, and somehow I landed an analyst job at a small community hospital in LA. That gave me the opportunity to really understand the health care industry from an operational standpoint. From there, I focused on understanding operations and then finding the right technologies to fit in. I took the opposite approach to most IT professionals: I dug deep into the operations model and then figured out which technologies worked well and matched them. That approach got me exposure up the food chain and opened some doors for me. One thing I realized when talking to my counterparts who are successful is that you have to grasp opportunities, even if it means disrupting other aspects of your life.

 

Ben: What is your job like now?

David: Currently, I work at a large academic medical center. In bigger medical centers, there are typically CIOs across all three verticals – healthcare, research, and higher education. Oftentimes, this causes tension and barriers in terms of adoption. In my position, I have control over all three, which is a pretty unique model to have. In addition, we are a public center which also makes us unique in how we operate.

 

Ben: What are your main responsibilities?

David: Today, I manage day-to-day operations and an $82 million budget. Early in my career, the CIO handled transactional work – data entry, maintaining mainframes, and so on. Now it’s a lot more strategic. Technology should be at the core of every organization, and the CIO has to be involved strategically. This means being a part of the executive team and having a seat at the table.

{Follow David on Twitter @dchou1107}

Ben: What areas of IT do you think are having the biggest impact on the industry?

David: Right now the focus is on the “4 pillars” of cloud, mobile, social and big data. Any executive that doesn’t have that vision is not going to be well off in the future. These are extremely important and strategic to me. I am trying to get the organization to adopt the cloud. Organizational culture plays a big role in this. Cloud can be an uncomfortable topic so that’s a barrier. I’m challenging that traditional mindset.

Mobile is also very big for us. Consumers in healthcare want personalized medicine. They want to shop for healthcare the same way they shop on Amazon. That’s where I believe healthcare is moving – towards a retail model. Whoever successfully pulls that off first is going to cause a huge disruption. We’re all trying to figure out how to utilize it. We want to be able to predict outcomes and provide the best customer experience possible.

I really believe in the importance of social media and the value of capturing consumer engagement and behavior. In my vertical, it has not been widely adopted yet. The big focus has been on cloud, mobile and big data.

 

Ben: How are you incorporating those technologies in your organization?

David: We’re in the process of incorporating a hybrid cloud model in our environment. From a budgetary and contractual perspective we’re all ready to go; we’re just getting the organization’s terms and conditions aligned with the cloud providers’. It’s a challenge for us to get public cloud providers to agree to our terms and conditions.

Our Electronic Medical Record system went live a year ago. Four years ago we had disparate systems that took a lot of manual upkeep. The first step to remedying this was moving from manual to digital. Now that we have that new format, we can take a controlled approach. We’ll look into some consumer-friendly products that allow users to have access to data along with self-service and provisioning capabilities. After this is implemented for a year, my goal is to take another look. We’ll have what we need to solve 80% of problems, so the question will be whether that extra 20% is worth a full-blown BI platform for analytics.

 

Ben: What advice do you have for other CIOs starting out in the healthcare industry?

David: Take the time to build that relationship with the business. Learn the terms and lingo. Talking tech won’t work with most business executives so you need to adapt. Ultimately, you need to focus on understanding the needs of the customer and solving those needs.

 

Are you interested in winning a GoPro? Subscribe to our blog by 2/12/2015 for your chance to win!

 

By Ben Stephenson, Emerging Media Specialist

Top 25 Findings from Gigaom’s 4th Annual “Future of Cloud Computing” Survey

By Ben Stephenson, Journey to the Cloud

 

Gigaom Research and North Bridge Partners recently released their 4th annual “Future of Cloud Computing” study. There was some great data gathered from the 1,358 respondents surveyed. In case you don’t have time to click through the entire 124-slide SlideShare deck, I’ve pulled out what I think are the 25 most interesting statistics from the study. Here’s the complete deck if you would like to review it in more detail.

 

  • 49% are using the cloud for revenue-generating or product development activities (Slide 9)
  • 80% of IT budget is used to maintain current systems (Slide 20) <–> GreenPages actually held a webinar recently explaining how organizations can avoid spending the majority of their IT budgets on “keeping the lights on”
  • For IT across all functions tested in the survey, 60-85% of respondents will move some or significant processing to the cloud in the next 12-24 months (Slide 21)
  • Shifting CapEx to OpEx is more important for companies with over 5,000 employees (Slide 27)
  • For respondents moving workloads to the cloud today, 27% said they are motivated to do so because they believe using a cloud platform service will help them lower their capital expenditures (Slide 28)
  • Top inhibitor: security remains the biggest concern; despite declining slightly last year, it rose again as an issue in 2014 and was cited by 49% of respondents (Slide 55)
  • Privacy is of growing importance. As an inhibitor, Privacy grew from 25% in 2011 to 31% (Slide 57)
  • Over 1/3 see regulatory/compliance as an inhibitor to moving to the cloud (Slide 60)
  • Interoperability concerns dropped by 45%, relatively, over the past two years…but 29% are still concerned about lock in (Slide 62)
  • Nearly ¼ of respondents still think network bandwidth is an inhibitor (Slide 64)
  • Reliability concerns dropped by half since 2011 (Slide 66)
  • Amazon S3 holds trillions of objects and regularly peaks at 1.5 million requests per second (Slide 71)
  • 90% of world’s data was created in past two years…80% of it is unstructured (Slide 73) <–> Here’s a video blog where Journey to the Cloud blogger Randy Weis talks about big data in more detail
  • Approximately 66% of data is in the cloud today (Slide 74)
  • The number above is expected to grow 73% in two years (Slide 75)
  • 50% of enterprise customers will purchase as much storage in 2014 as they have accumulated in their ENTIRE history (slide 77)
  • IaaS use has jumped from 11% in 2011 to 56% in 2014 & SaaS has increased from 13% in 2011 to 72% in 2014 (Slide 81)
  • Applications Development growing 50% (Slide 84) <–> with the growth of app dev, we’re also seeing the growth of shadow IT. Check out this on-demand webinar “The Rise of Unauthorized AWS Use. How to Address Risks Created by Shadow IT.”
  • PaaS approaching the tipping point! PaaS has increased from 7% in 2011 to 41% in 2014. (Slide 85) <–> See what one of our bloggers, John Dixon, predicted in regards to the rise of PaaS at the beginning of the year.
  • Database as a Service expected to nearly double, from 23% to 44% among users (Slide 86)
  • By 2017, nearly 2/3rds of all workloads will be processed in cloud data centers. Growth of workloads in cloud data centers is expected to be five times the growth in traditional workloads between 2012 and 2017. (Slide 87)
  • SDN usage will grow among business users almost threefold…from 11% to 30%  (Slide 89) <–> Check out this video blog where Nick Phelps talks about the business drivers behind SDN.
  • 42% use hybrid cloud now (Slide 93)
  • That 42% will grow to 55% in 2 years (Slide 94) <–> This whitepaper gives a nice breakdown of the future of hybrid cloud management.
  • “This second cloud front will be an order of magnitude bigger than the first cloud front.” (Slide 117). <–> hmmm, where have I heard this one before? Oh, that’s right, GreenPages’ CEO Ron Dupler has been saying it for about two years now.

Definitely some pretty interesting takeaways from this study. What are your thoughts? Did certain findings surprise you?

 

 

 

A Guide to Successful Big Data Adoption

By Randy Weis, Practice Manager, Data Management & Virtualization

In this video, storage expert Randy Weis talks about the impact big data is having on organizations and provides an outline for the correct approach companies should be taking in regards to big data analytics.

http://www.youtube.com/watch?v=jZ3V2ynOD44

What is your organization doing in regards to big data? Email us at socialmedia@greenpages.com if you would like to talk to Randy in more depth about big data, data management, storage, and more.

Grading the Internet’s 2014 Tech Predictions

 

The time is here for bloggers across the internet to make their tech predictions for 2014 and beyond (we have made some ourselves around storage and cloud). In this post, a couple of our authors have weighed in to grade predictions made by others across the web.

Prioritizing Management Tool Consolidation vs. New Acquisitions

Enterprise customers will want to invest in new tools only when necessary. They should look for solutions that can address several of their needs so that they do not have to acquire multiple tools and integrate them. The ability to cover multiple areas of management (performance, configuration and availability) to support multiple technologies (e.g., application tiers) and to operate across multiple platforms (Unix, Windows, virtual) will be important criteria for enterprises to assess what management tools will work for them.  (eweek)

Agree – I have been saying this for a while. If you want a new tool, get rid of five, consolidate and use what you have now, or get one that really works. (Randy Becker)

 

Bigger big data spending

IDC predicts spending of more than $14 billion on big data technologies and services or 30% growth year-over-year, “as demand for big data analytics skills continues to outstrip supply.” The cloud will play a bigger role with IDC predicting a race to develop cloud-based platforms capable of streaming data in real time. There will be increased use by enterprises of externally-sourced data and applications and “data brokers will proliferate.” IDC predicts explosive growth in big data analytics services, with the number of providers to triple in three years. 2014 spending on these services will exceed $4.5 billion, growing by 21%. (Forbes)

Absolutely agree with this.  Companies of all sizes are constantly looking to garner more intelligence from the data they have.  Even here at GreenPages we have our own big data issues and will continue to invest in these solutions to solve our own internal business needs. (Chris Ward)

 

Enterprises Will Shift From Silo to Collaborative Management

 In 2014, IT organizations will continue to feel increased pressure from their lines of business. Collaborative management will be a key theme, and organizations will be looking to provide a greater degree of performance visibility across their individual silo tiers to the help desk, so it is easier and faster to troubleshoot problems and identify the tier that is responsible for a problem. (eweek)

Agree – cross domain technology experts are key!  (Randy Becker)

 

New IT Will Create New Opportunities

Mobility, bring-your-own device (BYOD) and virtual desktops will all continue to gain a foothold in the enterprise. The success of these new technologies will be closely tied to the performance that users can experience when using these technologies. Performance management will grow in importance in these areas, providing scope for innovation and new solutions in the areas of mobility management, VDI management and so on. (eweek)

Disagree – This is backwards. The business is driving change and accountability.  It is not IT that creates new opportunities – it is the business demanding apps that work and perform for the people using them. (Randy Becker)

 

Here comes the Internet of Things

By 2020, the Internet of Things will generate 30 billion autonomously connected end points and $8.9 trillion in revenues. IDC predicts that in 2014 we will see new partnerships among IT vendors, service providers, and semiconductor vendors that will address this market. Again, China will be a key player:  The average Chinese home in 2030 will have 40–50 intelligent devices/sensors, generating 200TB of data annually. (Forbes)

Totally agree with this one.  Everything and everybody is eventually going to be connected.  I wish I were building a new home right now because there are so many cool things you can do by having numerous household items connected.  I also love it because I know that in 10 years when my daughter turns 16 that I’ll no doubt know in real-time where she is and what she is doing.  However, I doubt she’ll appreciate the ‘coolness’ of that.  Although very cool, this concept does introduce some very real challenges around management of all of these devices.  Think about 30 billion devices connected to the net….  We might actually have to start learning about IPv6 soon… (Chris Ward)

 

Cloud service providers will increasingly drive the IT market

As cloud-dedicated datacenters grow in number and importance, the market for server, storage, and networking components “will increasingly be driven by cloud service providers, who have traditionally favored highly componentized and commoditized designs.” The incumbent IT hardware vendors will be forced to adopt a “cloud-first” strategy, IDC predicts. 25–30% of server shipments will go to datacenters managed by service providers, growing to 43% by 2017. (Forbes)

Not sure I agree with this one for 2014 but I do agree with it in the longer term.  As more and more applications/systems get migrated to public cloud providers, that means less and less hardware/software purchased directly from end user customers and thus more consolidation at the cloud providers.  This could be a catch 22 for a lot of the traditional IT vendors like HP and Dell.  When’s the last time you walked into an Amazon or Google datacenter and saw racks and racks of HP or Dell gear?  Probably not too recently as these providers tend to ‘roll their own’ from a hardware perspective.  One thing is for sure…this will get very interesting over the next 24 to 36 months… (Chris Ward)

 

End-User Experience Will Determine Success

Businesses will expect IT to find problems before their users do, pinpoint the root cause of the problem and solve the problem as early as possible. IT organizations will seek solutions that will allow them to provide great user experience and productivity. (eweek)

Agree – 100% on this one. Need a good POC and Pilot that is well managed with clear goals and objectives. (Randy Becker)

 

Amazon (and possibly Google) to take on traditional IT suppliers

Amazon Web Services’ “avalanche of platform-as-a-service offerings for developers and higher value services for businesses” will force traditional IT suppliers to “urgently reconfigure themselves.” Google, IDC predicts, will join in the fight, as it realizes “it is at risk of being boxed out of a market where it should be vying for leadership.” (Forbes)

I agree with this one to an extent.  Amazon has certainly captured a good share of the market in two categories, developers and large scale-out applications and I see them continuing to have dominance in these 2 spaces.  However, anyone who thinks that customers are forklift moving traditional production business applications from the datacenter to the public cloud/Amazon should really get out in the field and talk to CIOs and IT admins as this simply isn’t happening.  I’ve had numerous conversations with our own customers around this topic, and when you do the math it just doesn’t make sense in most cases – assuming the customer has an existing investment in hardware/software and some form of datacenter to house it.  That said, where I have seen an uptake of Amazon and other public cloud providers is from startups or companies that are being spun out of a larger parent. Bottom line, Amazon and others will absolutely compete with traditional IT suppliers, just not in a ubiquitous manner. (Chris Ward)

 

The digitization of all industries

By 2018, 1/3 of share leaders in virtually all industries will be “Amazoned” by new and incumbent players. “A key to competing in these disrupted and reinvented industries,” IDC says, “will be to create industry-focused innovation platforms (like GE’s Predix) that attract and enable large communities of innovators – dozens to hundreds will emerge in the next several years.” Concomitant with this digitization of everything trend, “the IT buyer profile continues to shift to business executives. In 2014, and through 2017, IT spending by groups outside of IT departments will grow at more than 6% per year.” (Forbes)

I would have to agree with this one as well.  The underlying message here is that IT spending decisions continue to shift away from IT and into the hands of the business.  I have seen this happening more and more over the past couple of years and can’t help but believe it will continue in that direction at a rapid pace. (Chris Ward)

What do you think about these predictions? What about Chris and Randy’s take on them?

Download this free eBook about the evolution of the corporate IT department.

 

 

IBM Acquires Aspera for Fast Big Data Transfer

IBM today announced it has entered into a definitive agreement to acquire Aspera, a privately held company based in Emeryville, California. This provides IBM with new and complementary capabilities to better enable companies to move Big Data, on premise or in the cloud, at global distances with the speed required by today’s business.

Aspera’s patented extreme file transfer technology accelerates the secure transfer of large files and large collections of files by up to 99.9 percent – reducing a 26-hour transmission of a 24 gigabyte file, sent halfway around the world, down to just 30 seconds. This speed is powered by Aspera’s patented “fasp” protocol, which breaks the bottlenecks inherent in broadband networks to achieve high performance, efficiency and security in the most difficult WAN environments. Recently awarded an Emmy for engineering, Aspera is used at virtually every major Hollywood studio, cable provider and pharmaceutical company, with leading brands such as Netflix, PBS and Universal Studios among its customers.
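For a sense of what that speed-up implies, here is a quick back-of-the-envelope check using only the rounded figures quoted above; the decimal-gigabyte assumption and exact percentage are illustrative, not Aspera’s published numbers.

```python
# Back-of-the-envelope check of the quoted transfer figures:
# a 24 GB file, 26 hours conventionally vs. 30 seconds accelerated.
FILE_SIZE_GB = 24
BASELINE_SECONDS = 26 * 3600       # 26-hour conventional transfer
ACCELERATED_SECONDS = 30           # 30-second accelerated transfer

def throughput_mbps(size_gb: float, seconds: float) -> float:
    """Effective throughput in megabits per second (decimal GB assumed)."""
    return size_gb * 8 * 1000 / seconds

baseline = throughput_mbps(FILE_SIZE_GB, BASELINE_SECONDS)        # ~2 Mbps
accelerated = throughput_mbps(FILE_SIZE_GB, ACCELERATED_SECONDS)  # ~6,400 Mbps
reduction = 1 - ACCELERATED_SECONDS / BASELINE_SECONDS            # ~99.97%

print(f"Baseline:       {baseline:,.1f} Mbps")
print(f"Accelerated:    {accelerated:,.1f} Mbps")
print(f"Time reduction: {reduction:.2%}")
```

On those rounded numbers, the baseline link is doing roughly 2 Mbps while the accelerated transfer sustains several gigabits per second, consistent with the “up to 99.9 percent” reduction the company claims.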

Take a Photo Tour of Facebook’s Amazing Cold Storage Datacenter

There’s a fascinating photo tour of Facebook’s Oregon data center on readwrite today.

Facebook (arguably) owns more data than God.

But how to store a cache of user data collected at the scale of omniscience? If you’re Facebook, just build another custom-crafted server storage locker roughly the size of the USS Abraham Lincoln on top of a breezy plateau in the Oregon high desert. The company’s new Prineville, Ore., data center employs an ultra-green “cold storage” plan designed from the ground up to meet its unique—and uniquely huge—needs.

The piece also includes useful links on the tech behind the data center, shingled drive tech, and the Open Compute project that led to the innovations on display here.