Book Your 10×10 @CloudEXPO Booth for $2,000 By January 31 | #Cloud #CIO #IoT #DevOps #APM #Monitoring #Blockchain #ArtificialIntelligence

At CloudEXPO Silicon Valley, June 24-26, 2019, Digital Transformation (DX) is a major focus, with expanded DevOpsSUMMIT and FinTechEXPO programs within the DXWorldEXPO agenda. Successful transformation requires a laser focus on being data-driven and on using every available tool that enables transformation if organisations plan to survive over the long term. A total of 88% of Fortune 500 companies from a generation ago are now out of business; only 12% survive. Similar percentages are found throughout enterprises of all sizes.


The Cloudera-Hortonworks $5.2bn merger analysed: Challenges, competition and opportunities

When Cloudera and Hortonworks, two of the biggest big data behemoths, first announced they were coming together in a $5.2 billion blockbuster merger in October, the questions were almost infinite. Yet the most important two seemed to be: why now? And what does the future hold?

Now, after the transaction officially closed earlier this month, the answers can be a little more candid. Perhaps surprisingly, the new company admits the move came about as a result of feeling the heat from its traditional competitor base as well as the public cloud giants.

Here’s what we know about the deal:

  • Conversations between Cloudera and Hortonworks around M&A activity had started as far back as 2015 with various factors meaning last summer was ‘the right time for both businesses’ to merge
  • The new company will be called Cloudera going forward, and the resulting common platform will be called Cloudera Data Platform, a nod to the Hortonworks Data Platform suite
  • The first technical kick-off of the combined entity took place earlier this month in Scottsdale, Arizona. Cloudera says ‘80%-85%’ of the most burning issues around overlapping product sets were resolved there

It’s usually better to face up to pressure and admit it than to remain in denial. And it’s even better if, like Cloudera, you have an action plan to resolve the situation.

Chief marketing officer Mick Hollison notes that, despite the arrows coming in from two different fronts, the challenges are broadly similar. The company’s software, as well as its strategy, is based around what it calls the ‘enterprise data cloud’. The strategy rests on three strands: supporting every possible cloud implementation, from hybrid to public to multi-cloud; supporting a wide range of analytic capabilities; and going the extra mile on an open philosophy, from open storage, to compute, to integration.

Getting this mix right, Cloudera hopes, will satisfy even the largest and most demanding enterprise customers – and put them one step ahead of the competition in the process.

“If I look at the public cloud providers, they’re inherently never going to be multi-cloud,” Hollison tells CloudTech. “You’ll continue to see [them] dipping toes more into hybrid – [it’s] future state but something they plan to get into. Multi is something they’re not likely to get into.

“The second part public cloud vendors will struggle with a bit is that security and governance and common metadata layer,” Hollison adds. “That doesn’t exist for those vendors today. If you buy EMR from Amazon and you also buy RedShift from Amazon, you get a different security, governance and metadata stack with each of those offerings. We offer commonality at that layer.”

Looking at the more traditional big data companies – the word traditional being used loosely – Hollison again pulls no punches. “If I look across the way, with the more purpose-built data warehouse cloud [vendors], those companies and those offerings are very compelling for their one function that they offer. They’re not at a point where they’re building out their technology into a platform they can offer a set of shared services across.”

It’s worth noting here that Cloudera freely admits it hasn’t got all the pieces in place yet. Yet one element which all sides can agree on is that customer expectations have skyrocketed in recent years. “The expectations are seemingly infinite,” says Hollison. “The raw scale and quantity of data consumption by our largest customers is just orders of magnitude beyond what any of us could have ever imagined not terribly long ago.

“The other dimension is that customers have high demands around cloud,” Hollison adds. “Many of our large enterprise customers have a bit of a concern around being locked in to any one cloud vendor. Regardless of partnership, they don’t want to take the public cloud on as a new version of an IBM or Oracle lock-in.”

Part of this heightened sense of expectation is around the promise of artificial intelligence (AI) and machine learning (ML), a necessary strategic point for the vendors. A report from venture capital firm Work-Bench in August predicted that all modern business intelligence vendors would either release an automated machine learning product or buy a startup by the end of 2019.

Cloudera was ahead of this curve, buying data science platform Sense.io in 2016, with the acquired technology forming the backbone of the company’s Data Science Workbench product. “It’s a very logical step from my point of view,” Hollison says of fusing AI and ML with big data. “The term we’ve been using is to ‘industrialise AI’, to make it more like a factory.

“Most of the ML and AI that has been done in enterprises to date has been pretty bespoke. It hasn’t necessarily been done against well secured and governed data sets supported by IT,” adds Hollison. “It’s often been scraped onto a laptop by a data scientist, putting that data at risk. When you combine data security and management capabilities that Cloudera offers, with an easy to use workbench that allows them to continue to use the languages and frameworks that they like, it’s a pretty good combination that makes both the data scientists and IT happy that data is being used in an intelligent way.”

Going forward, Hollison promises a lot of hard work on integrating and hardening the sales operations, as well as more go-to-market pieces. Yet in other areas progress has been more seamless than one might expect. The companies found that approximately two thirds of their code bases overlapped, while the engineering teams proved easier to keep happy than expected.

“You might have thought we’d have more challenges on that front, but what I think people forget is a lot of these engineers have been working together for upwards of 10 years in the open source community,” he says. “Even though the big corporate push might have been very competitive, at a code-writing, engineering level, there’s a lot of mutual respect between the teams.”

While there’s still a lot to do, the building blocks are in place. “We knew we would be a much stronger, more formidable competitor together than we would be continuing to take shots at one another,” adds Hollison.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Privacy activist slams Apple, Amazon, YouTube and Spotify for GDPR violations


Clare Hopping

22 Jan, 2019

European citizens have complained en masse that companies such as Amazon, Apple, YouTube, and Spotify are misusing their data.

The claims were brought to light by Austrian privacy group noyb, led by activist Max Schrems, which says the companies highlighted are not adhering to the General Data Protection Regulation (GDPR).

Schrems said in the complaint that he asked eight streaming companies, including Amazon, Apple, Spotify, Netflix, Soundcloud, YouTube and UK-based sports streaming service DAZN, to provide the information they hold about customers.

However, Soundcloud and DAZN didn’t even respond to the request, while the others failed to provide adequate data about how their customers’ details were used.

“Spotify takes data privacy and our obligations to users extremely seriously,” the company said in a statement. “We are committed to complying with all relevant national and international laws and regulations, including GDPR, with which we believe we are fully compliant.”

Schrems said that most companies his group sent requests to simply set up automated replies but this isn’t compliant with GDPR. Companies must provide details about the data they collect, how the data is used and stored and who it’s shared with in order to comply with the law.

“In most cases, users only got the raw data, but, for example, no information about who this data was shared with,” Schrems added.

The companies could face fines of up to €20 million or 4% of their global turnover, whichever is higher, if they’re found to be in breach of the GDPR.

Other companies such as Facebook and Google have come under fire for not adhering to the law in Europe following the introduction of the GDPR in May last year.

Dropbox Business Advanced review: First-rate filesharing


Dave Mitchell

22 Jan, 2019

A great all-round business service with unlimited storage, smart collaboration tools and tight security

Price 
£15 exc VAT

It’s almost inevitable that most people will have encountered Dropbox at some stage, as it’s one of the most popular cloud file-sharing platforms. For good reason, too: Dropbox offers a great set of features, with its Business plans adding essential collaboration tools and enhanced security.

We’re reviewing the Business Advanced plan which augments the Standard version with unlimited cloud storage, tiered admin roles and file event tracking. You can sign up for a 5-user, 30-day trial and Dropbox only wants your card details if you decide to keep it.

Adding new members to your team is a breeze as invitations are emailed directly from the admin console. After clicking on the link, users enter their name, choose a password and download the Dropbox app.

The whole process takes 5-6 minutes and each user is provided with a personal Dropbox folder and direct access to all shared folders they have permission to see. If permitted, they can also decide whether to save on hard disk space and have files only available online.

Their local Dropbox folder shows all files stored in the cloud which can then be downloaded for editing. The local option still stores the files in the cloud but also downloads them to their Dropbox folder for instant access.

File sharing features are excellent with the contents of any team folder initially available to all members. You can fine-tune this by deciding who is allowed to access and edit top level team folders and set sharing permissions right down to individual folders and files.

Link settings protect shared folders and files so you can decide who can see them, what editing privileges they have and whether they can post comments. They can be password protected and an expiry date set so the link then becomes unavailable.
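
To give a sense of how these link settings can be driven programmatically, here is a minimal sketch using the official Dropbox Python SDK; the access token, file path and password below are illustrative, and password-protected, expiring links require a paid plan such as Business Advanced.

```python
# Hedged sketch: create a password-protected, expiring shared link with the
# official Dropbox Python SDK. Token, path and password are illustrative.
from datetime import datetime, timedelta

import dropbox
from dropbox.sharing import RequestedVisibility, SharedLinkSettings

dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")  # placeholder access token

settings = SharedLinkSettings(
    requested_visibility=RequestedVisibility.password,  # viewers must enter a password
    link_password="correct-horse-battery",               # illustrative password
    expires=datetime.utcnow() + timedelta(days=7),       # link stops working after a week
)

link = dbx.sharing_create_shared_link_with_settings("/Team Folder/report.pdf", settings)
print(link.url)
```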

Admin tasks can be delegated by adding privileges to selected members. User admins can add and remove team members and groups while support admins handle account password management and deleted file recovery requests.

File requests are a handy feature as you can ask anyone, including those without a Dropbox account, to send you a file. Just create an email request and the recipient chooses a local file and sends it to the Dropbox folder you specified in the request.

Dropbox Paper delivers great online document editing tools. It allows team members to create documents that can be viewed, shared and edited directly from the Dropbox web portal and exported in Word, PDF and Markdown formats.

Dropbox Badge supports Microsoft Office files, allows you to see who else is viewing and editing shared files on their desktop and lets you update your version with any changes they’ve made. It requires the Windows or Mac Dropbox agent to be installed and adds an icon at the side of the document window showing who else is accessing the document.

General access security is tight – admins can enable global password controls and choose the ‘moderate’ or ‘very strong’ setting. Dropbox compares their passwords with a pattern database and will stop weak or easily guessed passwords being used.

You can enable this feature at any time, while regular password changes can be enforced from the admin console with one click. There’s more: the plan also includes two-factor authentication (2FA) which uses unique six-digit security codes, while single sign-on (SSO) allows you to redirect user logins to identity providers that support SAML 2.0.
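
The six-digit codes used for 2FA follow the standard TOTP scheme (RFC 6238) rather than anything Dropbox-specific. As a rough illustration of how such codes are generated and checked, here is a small sketch using the pyotp library; the secret below is generated on the fly purely for demonstration and says nothing about Dropbox’s internal implementation.

```python
# Illustrative TOTP (RFC 6238) sketch with pyotp: the general mechanism
# behind six-digit 2FA codes, not Dropbox's internal implementation.
import pyotp

secret = pyotp.random_base32()   # shared secret, normally enrolled via a QR code
totp = pyotp.TOTP(secret)        # 6-digit codes, 30-second time step by default

code = totp.now()                # code the authenticator app would display
print(code, totp.verify(code))   # server-side check returns True while the code is valid
```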

Businesses that want plenty of cloud collaboration tools and no limits on storage will love Dropbox Business Advanced. File sharing features are beyond reproach, it’s easy to manage and account security doesn’t get much tougher either.

Paessler PRTG Network Monitor 18.4 review: A comprehensive monitoring suite


Dave Mitchell

22 Jan, 2019

Features galore and great value make this one of the best network monitoring tools on the market

Price 
£3,939 exc VAT

Paessler’s PRTG Network Monitor is perfect for SMEs that want to know about everything on their network, as it offers over 250 sensor types. Even better, these are all included as standard so you don’t have to worry about any future costs for optional features.

PRTG’s licensing is based on the number of monitored elements, which can be anything from a CPU core or network switch port to a web URL or Exchange mail queue. The sensor count can get eaten up very quickly on large networks but you can delete those you don’t want so they’re returned to the licensing pool for other devices.

We’ve been running PRTG on a Windows 7 desktop in the lab for over six years, and Paessler’s continuous rollout model has faithfully kept it updated to the latest version. Our host system is fine for monitoring our lab network of up to 50 devices and 500 sensors but for larger networks, Paessler would prefer you to use Windows Server 2012 R2 or later.

New users are guided by a network discovery wizard which does all the hard work for them. Once it has identified each device, it assigns the most appropriate set of sensors along with pre-set threshold triggers, so it can start issuing alerts immediately.

PRTG’s web console is a slick affair and its home page provides quick readout graphs showing the status of all sensors and alarms. The latest v18.4 has been updated so that all the status icons next to the donut charts are now active and clicking on one takes you directly to a filtered view.

The console presents a wealth of valuable information with systems neatly organised into hierarchical groupings. You can move systems to other groups as required, where they inherit settings such as discovery schedules and login credentials from the parent group – or they can have their own settings.

It’s easy to pinpoint trouble areas as the colour-coded sensors show clearly whether they are up, down, paused or in a warning state. Hover the mouse over any sensor and it’ll pop up a window with graphs of live data and any relevant warning messages.

PRTG offers ten notification methods including email, SMS, Syslog and Slack. Events can be tied in with actions and the pre-set values provided by the wizard can be changed at any level of the group hierarchy.

Extra sensors can be added to the web console and PRTG scores highly for its cloud service sensors. The price includes ones to monitor Amazon CloudWatch services, Google Drive, Microsoft OneDrive and Dropbox while the Common SaaS sensor keeps an eye on Office 365, SalesForce and Google Apps.

To help interpret sensor data, PRTG provides pre-set views such as the top ten sensors for uptime, downtime or CPU usage but it’s easy enough to create your own custom dashboards. Paessler also delivers unbeatable mobile support and we used the iOS app on our iPad to remotely access the main PRTG server, view the same content as the web console and receive alerts via push notifications.
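
PRTG also exposes an HTTP API, so the same sensor data behind these views can be pulled into your own dashboards or scripts. The sketch below, assuming API access is enabled for the account, queries the table.json endpoint for sensors currently in a Down state; the host, username and passhash are placeholders.

```python
# Hedged sketch: list sensors in a Down state via PRTG's table.json API.
# Host, username and passhash are placeholders; adjust columns/filters as needed.
import requests

PRTG_HOST = "https://prtg.example.com"     # your PRTG web server
params = {
    "content": "sensors",
    "columns": "objid,device,sensor,status,message,lastvalue",
    "filter_status": 5,                    # 5 = Down in PRTG's status numbering
    "username": "apiuser",
    "passhash": "0000000000",              # per-user passhash from PRTG account settings
}

resp = requests.get(f"{PRTG_HOST}/api/table.json", params=params, timeout=30)
resp.raise_for_status()
for sensor in resp.json().get("sensors", []):
    print(sensor["device"], sensor["sensor"], sensor["message"])
```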

The new eHealth sensor even puts hospitals on Paessler’s radar, as it monitors the HL7 (Health Level 7) and DICOM (Digital Imaging and Communications in Medicine) protocols. It has no access to patient data and is designed to monitor these systems and provide early warnings of problems.

Paessler’s PRTG Network Monitor 18.4 presents a well-designed console that tells you everything you need to know about your network. The huge range of sensors makes it a compelling choice for businesses of all sizes and the all-inclusive price means there are no hidden costs to worry about.

Predicting the future of next-gen access and Zero Trust Security in 2019: Challenges ahead

Bottom line:  The most valuable catalyst all digital businesses need to continue growing in 2019 is a Zero Trust Security (ZTS) strategy based on Next-Gen Access (NGA) that scales to protect every access point to corporate data, recognising that identities are the new security perimeter.

The faster any digital business grows, the more identities, devices and network endpoints proliferate. The most successful businesses of 2019 and beyond are actively creating entirely new digital business models today. They’re recruiting and onboarding the experts they need regardless of geographic location, and exploring new sourcing and patent ideas with R&D partners globally. Businesses are digitally transforming themselves at a faster rate than ever before.

Statista projects businesses will spend $190B on digital transformation in 2019, soaring to $490B by 2025, attaining a 14.4% Compound Annual Growth Rate (CAGR) in six years.

Security perimeters make or break a growing business

80% of IT security breaches involve privileged credential access according to a recent Forrester study. The Verizon Mobile Security Index 2018 Report found that 89% of organisations are relying on just a single security strategy to keep their mobile networks safe. A typical data breach cost the average company $3.86M in 2018, up 6.4% from $3.62M in 2017 according to IBM Security’s latest  2018 Cost of a Data Breach Study.

The hard reality for any digital business is realising that its greatest growth asset is how well it protects the constantly expanding perimeter of the business. Legacy approaches to securing infrastructure that rely on trusted and untrusted domains can’t scale to protect every identity and device that comprises a company’s rapidly changing new security perimeter. All these factors and more are why Zero Trust Security (ZTS) enabled by Next-Gen Access (NGA) is as essential to digital businesses’ growth as their product roadmaps, pricing strategies and services, with Idaptive being an early leader in the market. To learn more about Identity-as-a-Service, see the Forrester report, The Forrester Wave: Identity-As-A-Service, Q4 2017 (client access required).

Predicting the future of next-gen access and Zero Trust Security

The following are predictions of how Next-Gen Access (NGA) powered by Zero Trust Security (ZTS) will evolve in 2019:

  • Behaviour-based scoring algorithms will improve markedly in 2019, improving the user experience by calculating risk scores with greater precision than before. Thwarting attacks starts with a series of behaviour-based algorithms that calculate a risk score from a wide variety of variables, including past access attempts, device security posture, operating system, location, time of day and many other measurable factors (a minimal illustrative sketch of such a score follows this list). Expect these algorithms, and the risk scores they generate using machine learning techniques, to improve from an accuracy and contextual intelligence standpoint in 2019. Leading companies in the field, including Idaptive, are actively investing in machine learning technologies to accomplish this today.
     
  • Multi-factor authentication (MFA) adoption soars as digital businesses seek to protect new R&D projects, patents in progress, roadmaps, and product plans. State-sponsored hacking organisations and organised crime see the intellectual property in fast-growing digital businesses as among the most valuable assets they can exfiltrate and sell on the Dark Web. MFA, one of the most effective single defenses against compromised passwords, will be adopted by the most successful businesses in AI, aerospace & defense, chip design for cellular and IoT devices, e-commerce, enterprise software and more.
     
  • Smart, connected products without adequate security designed in will proliferate in 2019, further challenging the security perimeters of digital businesses. The era of smart, connected products is here, with Capgemini estimating the connected products market will be worth $519B to $685B by 2020. Manufacturers expect close to 50% of their products to be smart, connected products by 2020, according to Capgemini’s Digital Engineering: The new growth engine for discrete manufacturers. The study is downloadable here (PDF, 40 pp., no opt-in). With every smart, connected device creating a new threat surface for a company, expect to see at least one device manufacturer elevate Zero Trust Security (ZTS) support to board level to increase its sales into enterprises by reducing the threat of a breach starting from its devices.
     
  • Looking for greater track and traceability, healthcare and medical products supply chains will adopt Zero Trust Security (ZTS): What’s going to make this an urgent issue in healthcare and medical products are the combined effects of greater regulatory reporting and compliance, combined with the pressure to improve time-to-market for new products and delivery accuracy for current customers. The pillars of ZTS are a perfect fit for healthcare and medical supply chains’ need for track and traceability. These pillars are real-time user verification, device validation, and intelligently limiting access, while also learning and adapting to verified user behaviours.
     
  • Real-time security analytics services are going to thrive in 2019 as digital businesses seek insights into how they can fine-tune their ZTS strategies across every threat surface, and as machine learning algorithms improve. Many enterprises are in for an epiphany in 2019 when they see just how many potential breaches they’ve stopped using a combination of security strategies including single sign-on (SSO) and multi-factor authentication (MFA). Machine learning algorithms will continue to improve using behaviour-based scoring, further improving the user experience. Leaders in the field include Idaptive, which is setting a rapid pace of innovation in real-time security analytics services.
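
To make the first prediction concrete, the sketch below shows one naive way a behaviour-based risk score might be computed from signals such as failed logins, device posture, location and time of day; the weights and thresholds are invented for illustration and bear no relation to any vendor’s actual model.

```python
# Naive, illustrative behaviour-based risk score; weights and thresholds are
# invented for demonstration and are not any vendor's actual scoring model.
from dataclasses import dataclass

@dataclass
class AccessAttempt:
    failed_logins_last_24h: int
    device_managed: bool        # device enrolled/compliant with endpoint management
    known_location: bool        # matches the user's usual countries/cities
    off_hours: bool             # outside the user's normal working hours

def risk_score(a: AccessAttempt) -> int:
    """Return 0-100; higher means riskier, so more authentication friction."""
    score = 0
    score += min(a.failed_logins_last_24h, 5) * 10
    score += 0 if a.device_managed else 25
    score += 0 if a.known_location else 20
    score += 10 if a.off_hours else 0
    return min(score, 100)

attempt = AccessAttempt(failed_logins_last_24h=2, device_managed=False,
                        known_location=True, off_hours=True)
score = risk_score(attempt)
action = "allow" if score < 30 else "step-up MFA" if score < 60 else "block"
print(score, action)
```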

Conclusion

Security is at an inflection point today. Long-standing methods of protecting IT systems and a business’s assets can’t scale to protect every new identity, device or threat surface. When every identity is a new security perimeter, a new approach to securing any digital business is needed. The pillars of ZTS, real-time user verification, device validation and intelligently limiting access while learning and adapting to verified user behaviours, are proving effective at thwarting breaches and securing companies’ digital assets of all kinds. It’s time for more digital businesses to see security as the growth catalyst it is and take action now to ensure their operations continue to flourish.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Top 10 DevOps Influencers | @CloudEXPO #DevOps #Serverless #CloudNative #Docker #Kubernetes #Microservices

The graph represents a network of 1,329 Twitter users whose recent tweets contained “#DevOps”, or who were replied to or mentioned in those tweets, taken from a data set limited to a maximum of 18,000 tweets. The network was obtained from Twitter on Thursday, 10 January 2019 at 23:50 UTC.

The tweets in the network were tweeted over the 7-hour, 6-minute period from Thursday, 10 January 2019 at 16:29 UTC to Thursday, 10 January 2019 at 23:36 UTC.

Additional tweets that were mentioned in this data set were also collected from prior time periods. These tweets may expand the complete time period of the data.


IBM secures $325 million deal to help Juniper Networks develop cloud-native landscape

IBM has brought Juniper Networks on board in a $325 million (£252m) deal which will see the former assist the latter in enhancing its cloud journey.

The seven-year agreement – which has overtones of Microsoft’s recent deal inked with Walgreens in terms of price and length – will see IBM use its autonomously managing IT platform IBM Services Platform with Watson to help manage Juniper’s infrastructure, from help desks and support systems to data centres.

By utilising IBM Services, Juniper will also aim to create an agile IT environment. Again with automation – IBM describes its Services Platform with Watson as a product which ‘partners humans with cognitive technology’ – the goal is for efficiency, cost saving and helping Juniper create a cloud-native landscape. IBM calls this the ‘factory development’ concept.

The move can be seen as yet another step of a major enterprise – Juniper ranks just outside the Fortune 500 – heading towards a multi-cloud strategy. “Our work with thousands of enterprises globally has led us to the firm belief that a ‘one-cloud-fits-all’ approach doesn’t work and companies are choosing multiple cloud environments to best meet their needs,” said Martin Jetter, SVP of IBM global technology in a statement.

“Working with Juniper, we are integrating cloud solutions with their existing IT investments via the IBM Service Platform with Watson,” Jetter added. “This gives them the opportunity to generate more value from existing infrastructure, along with helping them manage strategic services that are critical to their business.”

It has been a busy start to the year for IBM on the alliance front. Last week the company announced a strategic commercial agreement with operator Vodafone. The venture focused broadly on digital transformation initiatives and the next wave of cloud services in the shape of artificial intelligence (AI), 5G, edge computing and software defined networking (SDN). On a more practical level, it would ensure Vodafone Business customers would have immediate access to IBM’s entire cloud portfolio.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Our 5-minute guide to virtual private cloud


Cloud Pro
Esther Kezia Thorpe

21 Jan, 2019

Most IT professionals are familiar with the three primary cloud options: public cloud, private cloud and hybrid cloud. Businesses all around the world are using the cloud in different ways to streamline processes, improve collaboration and communication, and save money on expensive infrastructure, while still getting the benefits of cutting-edge technology.

But as the cloud and the services around it evolve, so do the ways it can be tailored to individual businesses’ needs. Awareness is now growing of the potential of virtual private clouds, so here we explain what a virtual private cloud is, why organisations are using them, and what the advantages and disadvantages are.

What is a virtual private cloud?

A virtual private cloud is a private network running within shared public cloud infrastructure, using virtualisation to isolate the resources being used specifically for each user. It ensures the same levels of security that private cloud offers, but virtualisation enables users to run applications, services and a variety of workloads on shared resources, allowing users to benefit from the flexibility of a public cloud. It also offers organisations and their IT staff the ability to isolate select applications and services within the cloud even if they are sharing the same underlying physical hardware as other apps and workloads.


Implementations of virtual private cloud vary, but they typically behave like a private cloud running on dedicated hardware while actually running inside a multi-tenant environment.

In most ways, virtual private clouds operate very much like private clouds, where businesses pay depending on what level of performance they require. But virtualisation allows delivery of virtual, software-defined infrastructure within a shared hardware infrastructure, before making that available to customers through the internet.

With a virtual private cloud, teams can spin up their own virtual machines, test apps and innovate. Businesses can pool their resources to get a better understanding of benefits and costs. Virtual private clouds also improve availability by making workloads portable and adding more options to scale out or recover in a crisis.
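
The mechanics differ by provider, but the idea of carving an isolated network out of shared infrastructure can be seen in AWS’s VPC offering. The hedged sketch below uses boto3 to create a VPC and a subnet; it assumes AWS credentials are already configured, and the region and CIDR ranges are purely illustrative.

```python
# Hedged sketch: create an isolated virtual network (AWS VPC) inside shared
# public cloud infrastructure using boto3. Region and CIDR ranges are illustrative.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# The VPC is the private address space; nothing outside it can reach these ranges
# unless you explicitly attach gateways or peering connections.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# A subnet carves out part of that space for workloads such as VMs or containers.
subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

print("VPC:", vpc_id, "Subnet:", subnet["Subnet"]["SubnetId"])
```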

Why use a virtual private cloud?

Not all companies will need a virtual private cloud, and for some businesses, public clouds offer everything they need at a reasonable cost. But for others, either a private cloud or a hybrid cloud system is the best one for their organisation.

Common reasons for using a virtual private cloud include:

  • Needing a consistently high level of performance on business-critical applications. Whereas the public cloud can handle heavy workloads, private cloud offers more control over compute and storage performance for applications where speed is essential.

  • Minimising downtime on the business. With more control and allocated resources, private cloud can help ensure availability, and many private cloud services have service level agreements and uptime guarantees as well as more visibility into any issues that occur.

  • More control over security. A private cloud isn’t necessarily more secure than a public one, since much depends on where the physical servers are located and the steps taken to secure them, but businesses have much more control over access and can build defences around data. Some private cloud services also allow more granular control over network security and patching.

  • Being bound by regulatory frameworks around where and how data is processed and stored. GDPR has had an impact in this area, not necessarily because public clouds aren’t compliant, but because private clouds can make compliance that bit easier.

Pros and cons of virtual private cloud

When compared to public cloud, virtual private cloud gives users more control, and often more consistent performance. There’s no contention for compute, storage or network resources, and a reduced risk of ‘noisy neighbour’ syndrome, where another user on the same host hardware hogs processor or storage bandwidth.


There are also advantages to operating in a virtual rather than a physical environment, as it becomes much easier to scale compute and storage up or down with business demand. If more or fewer CPU cores, RAM or storage resources are needed, they can easily be added or removed.

Because of the way virtual private clouds work, the workloads themselves are still hosted outside the organisation’s own data centre. This can potentially cause issues in industries which have strict regulations around where and how data is processed, and which applications can be run in a virtual private cloud.

Virtual private clouds often work out as being more cost-effective than investing in servers and infrastructure, especially when operating and maintenance costs are factored in. When compared to public cloud however, virtual private cloud is often a more expensive option, but is usually more affordable than a fully private cloud.

CFP Deadline at @CloudEXPO January 31 | #Cloud #CIO #DevOps #IoT #Blockchain #MachineLearning #ArtificialIntelligence

CloudEXPO has been the M&A capital for cloud companies for more than a decade, with memorable acquisition news stories coming off the CloudEXPO expo floor. DevOpsSUMMIT New York faculty member Greg Bledsoe shared his views on IBM’s Red Hat acquisition live from the NASDAQ floor. Acquisition news was announced during CloudEXPO New York, which took place November 12-13, 2018 in New York City.

Our Silicon Valley 2019 schedule will showcase 200 keynotes, sessions, general sessions, power panels and hands-on tutorials presented by 150 rockstar speakers in the 10 hottest conference tracks of 2019:
