You’ve got 99 problems when it comes to public cloud compliance – but cryptojacking may not be one

If cloud is part of the conversation around organisational digital change, then cloud security is never far behind. According to new research from Unit 42, compliance needs to be stepped up, yet cryptojacking may be on the decline.

The company – the threat intelligence arm of Palo Alto Networks – put together analysis based on existing threats to cloud security over the second half of 2018, focusing on Amazon Web Services (AWS), Microsoft Azure and Google Cloud environments.

The majority of findings predictably belied the spirit of the festive season. Almost one in three (29%) organisations assessed had potential account compromises, while more than two in five (41%) access keys had not been rotated at all over the past three months.
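
Stale credentials of the kind the report flags are easy to audit programmatically. As a minimal sketch – assuming boto3 is installed and AWS credentials with IAM read access are already configured – the following lists access keys older than 90 days:

```python
from datetime import datetime, timedelta, timezone

import boto3  # AWS SDK for Python; assumes credentials are configured

iam = boto3.client("iam")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

# Walk every IAM user and flag access keys that predate the cutoff
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        keys = iam.list_access_keys(UserName=user["UserName"])
        for key in keys["AccessKeyMetadata"]:
            if key["CreateDate"] < cutoff:
                print(f"{user['UserName']}: key {key['AccessKeyId']} "
                      f"created {key['CreateDate']:%Y-%m-%d} - rotate it")
```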

It only takes a brief glance at recent industry headlines and trends to come to the conclusion that any organisation could be at risk. Indeed, while the concept of shared responsibility for cloud security must again be emphasised – as the very first page of the Unit 42 report illustrates – it must be noted vendors are trying to help ease the burden.

For AWS environments, companies inadvertently setting ‘world-read’ permissions on their data repositories is the classic recipe for disaster. In 2017 the vendor spruced up its dashboard design, giving bright orange warning indicators for buckets which were publicly accessible. Feeling even this wasn’t enough, last month saw the launch of Amazon S3 Block Public Access, which aims to simplify the process by offering configuration at the account level, on individual buckets, or on buckets created in future.
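
For illustration, here is a hedged sketch of what enabling the new control looks like with boto3; the bucket name is a placeholder, and the account-wide variant uses the analogous S3 Control API call:

```python
import boto3  # AWS SDK for Python

s3 = boto3.client("s3")

# Turn on all four Block Public Access settings for one bucket.
# "example-bucket" is a placeholder; account-level configuration
# uses the analogous put_public_access_block call on the
# s3control client instead.
s3.put_public_access_block(
    Bucket="example-bucket",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject new public ACLs
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject public bucket policies
        "RestrictPublicBuckets": True,  # restrict cross-account public access
    },
)
```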

Those who do walk through this open door can therefore get up to any nefarious scheme they choose. In the case of Tesla at the start of this year, for example, hackers got into unsecured S3 buckets to mine cryptocurrencies. Yet while more than one in 10 (11%) organisations experienced cryptojacking activity in their environments, it represented a significant decrease from 25% in May. Unit 42 puts this shift down to a combination of better detection tactics and weakening crypto value.

More good news, albeit unsurprising, came in the shape of container adoption. According to the research, one in three organisations analysed use native or managed Kubernetes orchestration, with a quarter utilising managed services in the cloud. AWS, Google and Microsoft all have products in this space – and for the former, as re:Invent showed last month, breadth of portfolio, from containers tightly integrated with AWS, to managed services, to more ad hoc approaches, is the key differentiator on the vendor side.

Yet the report warns that basic security hygiene is not being observed for container services, leaving Kubernetes pods vulnerable to attack: 46% of organisations polled had not applied ‘appropriate network policies’ for their managed Kubernetes services. Nor should organisations rely on basic authentication here, the researchers argue, since it invites brute-force attacks; they recommend IAM roles instead.
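
What counts as an ‘appropriate network policy’ varies, but a common baseline is a default-deny policy from which pods are then explicitly opened up. A minimal sketch using the official Kubernetes Python client – the namespace and policy name are illustrative:

```python
from kubernetes import client, config  # official Kubernetes Python client

config.load_kube_config()  # or load_incluster_config() inside a pod

# A default-deny policy: it selects every pod in the namespace and,
# because no ingress/egress rules are listed, blocks all traffic
# until more specific allow policies are added.
policy = client.V1NetworkPolicy(
    metadata=client.V1ObjectMeta(name="default-deny-all"),
    spec=client.V1NetworkPolicySpec(
        pod_selector=client.V1LabelSelector(),  # empty selector = all pods
        policy_types=["Ingress", "Egress"],
    ),
)

client.NetworkingV1Api().create_namespaced_network_policy(
    namespace="production", body=policy
)
```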

As far as compliance goes, however, the figures were ‘undeniable’, in the report’s words. One in three (32%) organisations publicly exposed at least one cloud storage service, while half (49%) of databases checked were not encrypted. While Unit 42 notes that exposed public cloud storage is a slowing trend, data encryption has the potential to become a much more serious issue with GDPR top of mind. “Clearly, organisations have a long way to go before they can claim compliance in their public cloud environments,” the report warns.
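
Both findings are straightforward to self-audit. As a hedged example, this boto3 sketch lists RDS instances without encryption at rest; the same pattern extends to other database services:

```python
import boto3  # AWS SDK for Python

rds = boto3.client("rds")

# Page through every RDS instance and report those created
# without encryption at rest enabled.
for page in rds.get_paginator("describe_db_instances").paginate():
    for db in page["DBInstances"]:
        if not db["StorageEncrypted"]:
            print(f"{db['DBInstanceIdentifier']} is not encrypted at rest")
```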

You can read the full analysis here (pdf, email required).


Microsoft announces behaviour visualisation tool Clarity


Clare Hopping

14 Dec, 2018

Microsoft has unveiled its Clarity analytics tool, designed to run A/B testing of user experiences on a website. It’s going head to head with market leaders such as Optimizely, Google Optimize and Visual Website Optimizer, which have been helping businesses test the effectiveness of user experiences.

The suite will not only identify which experiences are the most profitable for an organisation, but it will also help companies identify errors and usability issues, making for a very effective tool in a marketing arsenal.

It doesn’t just tell you how users navigated through your site, but it can accurately replay exactly what potential customers clicked, hovered over, and tapped with their finger if using a touchscreen device. When Clarity is deployed on a website, it “listens” to browser events, network requests and user interactions. This data is then uploaded and stored on the Clarity server running on Microsoft Azure.

During testing, Clarity engineers were also able to uncover browser exploits, such as malware running in the Bing search engine.

Clarity highlighting evidence of malware in a Bing browser window

“Thanks to Clarity, Bing’s engineers were able to see what the user actually saw – which wasn’t at all what they expected,” the company explained in a blog post launching Clarity. “To diagnose the cause of the strange content, they used Clarity to determine that the content came from malware installed on the end user’s machine, which was hijacking the page and modifying the content. As a result, people did not engage with Bing content and page load performance was also impacted.”

Clarity can operate on any HTML web page and is deployed using JavaScript on the page, although it’s only available on some domains at the moment, including those with two- and three-letter TLDs such as .com and .uk.

If you’re planning a major move to the cloud – ask these 13 questions first

As the calendar turns to 2019, the rapid pace at which businesses are migrating their on-premises networks to the cloud continues unabated, with line items for cloud-based applications and cloud infrastructure now ranking among the three highest IT spending priorities, according to figures from Computer Economics.

Overall, the percentage of companies investing heavily in cloud-based software doubled in 2018, the research firm reports. And that momentum figures to continue, as only 20% of companies have converted at least half of their business applications to the cloud. “The appeal of SaaS is pretty clear,” says David Wagner, vice president of research at Computer Economics. “The increased flexibility, reduced infrastructure burden, and ease of upgrades should have made SaaS a slam dunk.”

One reason moving to the cloud hasn’t been a slam dunk to this point is the measured approach many businesses appear to be taking toward digital transformation. And in many respects, it’s wise to be circumspect about transitioning an enterprise network to the cloud, with nothing less than the company brand, the customer experience and the security of data — customers’ and your own — at stake.

Still, the widespread migration to SaaS and cloud-based network solutions such as SD-WAN is expected to continue gaining strength. “We expect that increased use of cloud infrastructure will mean a continued effort to move as many applications and processes out of the data centre as possible,” Computer Economics posits in its IT Spending & Staffing Benchmarks 2018/2019 report. “The wait is over, and IT organizations are finally jumping into the cloud in a more substantial way.”

Before jumping in, though, it’s critical that organizations and their IT decision-makers assess their readiness for a migration to the cloud. Here’s a list of questions they should be asking themselves as they lay the groundwork for such a move:

What is driving your cloud strategy? In other words, what are your reasons for moving to the cloud and what benefits do you expect to realise from the move?

This question is designed to get at the fundamental strategic motivations for a cloud initiative. As such, the answer needs to come collectively from all the key stakeholders within the business — operations, IT, finance, sales/marketing, HR, and so on.

Describe the overall digital experience you envision, both customer-facing and back-office, once the transition is complete.

What’s the end game? What’s on each stakeholder’s wish list? The answers to these questions should help define the parameters of a cloud initiative.

What’s your vision for how the process of transitioning to the cloud will unfold?

Will it be a wholesale migration or a phased-in approach? What’s the timeframe? How will responsibilities for managing the process be divided?

Do you have more demands for delivery of business innovation than you have staff/time to support?

This question is designed to assess how well your organisation’s human resources align with the organisational drive to innovate — and it requires a real gut check.

Has IT conducted a needs assessment/business case around a potential move to the cloud to determine which customer-facing and back-office workloads to move to the cloud and why?

If a needs assessment has been conducted, what were the key findings? And if not, is there a plan to conduct one in advance of the cloud initiative? This is stating the obvious, but it’s always a good idea to conduct a thorough needs assessment before mobilising on a major initiative.

How does your network align to support your investment in your customer experience initiatives?

Here’s another question aimed at assessing strategic alignment, one that goes hand in hand with the preceding question. While moving to the cloud means a smaller IT footprint in-house, it will drive demand for additional network capacity.

What are the biggest strengths and weaknesses of your network today, back office and customer-facing?

Consider how and where to build on your strengths and shore up your weaknesses. Also consider what happens if the network goes down when your POS resides in the cloud.

What does your organisation’s application roadmap look like?

How might your digital user experience evolve and grow over time?

How do you envision your security needs changing when you move to the cloud?

As strategically vital as cloud-based digital services are to differentiating a brand, moving network infrastructure and apps to the cloud also raises the specter of significant new cyber vulnerabilities, including DDoS attacks, ransomware incursions, uninvited data exfiltration and other potentially devastating forms of breach. What types of security measures might you need to incorporate into your network to protect users, their data and your network assets?

How will you know you have enough bandwidth to support new and/or existing apps?

This assesses whether your network capacity can keep pace with your aspirations.

Could your company benefit from having a single provider manage your network, security and voice services as you focus on your digital transformation?

Working with a single entity can make life a lot simpler for an IT department. The economics of bundling also may benefit the bottom line.

To what extent has your organisation investigated SD-WAN as a network solution?

Businesses are embracing the software-defined wide-area network for its scalability, elasticity and cost-effectiveness compared to hardware-focused network approaches.

Which internal stakeholders and departments should be engaged in the cloud initiative?

To put the initiative in position to succeed, be sure the right people have a seat at the table from the outset of the transition process.


Oracle files lawsuit over $10bn Pentagon cloud contract


Bobby Hellard

13 Dec, 2018

Oracle has again launched legal proceedings over the US Pentagon’s single-vendor cloud contract, filing a suit against the Department of Defence in the US Court of Federal Claims.

The legacy database business has already had legal action dismissed by the Government Accountability Office. However, a redacted version of the company’s latest complaint, published this week, shows the company is not backing down.

However, the GAO maintains that the single vendor approach does not violate any laws and that for issues of national security the process is in the government’s best interests.

The contract, known as the Joint Enterprise Defense Infrastructure (JEDI) cloud, involves the migration of defence department data to a commercially operated cloud system. However, because the contract is only on offer to a single winning bidder, Oracle says it’s illegal and out of sync with the industry.

The company’s senior VP Ken Glueck told TechCrunch: “The technology industry is innovating around next-generation cloud at an unprecedented pace and JEDI as currently envisioned virtually assures DoD will be locked into legacy cloud for a decade or more. The single-award approach is contrary to well-established procurement requirements and is out of sync with the industry’s multi-cloud strategy, which promotes constant competition, fosters rapid innovation and lowers prices.”

The $10bn contract, which could run for 10 years, involves migrating almost 80% of the Department of Defence’s IT systems to the cloud. Other cloud providers such as IBM and Google have taken issue with it being made available to just one vendor, with the latter having pulled out of the process entirely.

“We are not bidding on the JEDI contract because first, we couldn’t be assured that it would align with our AI Principles,” a Google spokesman said in a statement. “And second, we determined that there were portions of the contract that were out of scope with our current government certifications.”

IBM issued a similar legal challenge against the single-vendor process, which it said violates procurement regulations, but that case was also blocked.

Amazon Web Services is thought to be the front-runner for the contract, an outcome that Oracle will not like given the current mudslinging between the two. During his re:Invent keynote last month, AWS CEO Andy Jassy announced that the company would be off Oracle databases by the end of 2019.

How machine learning can be used to find employees who can scale with your business

  • Eightfold’s analysis of hiring data has found the half-life of technical, marketable skills is 5 to 7 years, making the ability to unlearn and learn new concepts essential for career survival.
  • Applicant Tracking Systems (ATS) don’t capture applicants’ drive and intensity to unlearn and learn or their innate capabilities for growth.
  • Artificial intelligence (AI) and machine learning are proving adept at discovering candidates’ innate capabilities to unlearn, learn and reinvent themselves throughout their careers.

Hiring managers in search of qualified job candidates who can scale with and contribute to their growing businesses are facing a crisis today. They’re not finding the right candidates – or in many cases any candidates at all – using resumes alone, Applicant Tracking Systems (ATS), or online job recruitment sites designed for employers’ convenience first and candidates’ last.

These outmoded approaches to recruiting aren’t designed to find the candidates with the strongest capabilities. Add to this the fact that machine learning is making resumes obsolete, both by enabling employers to find candidates with precisely the right balance of capabilities and because its unbiased, data-driven approach to selecting candidates works. Resumes, job recruitment sites and ATS platforms force hiring managers to bet on the probability that they are making a great hire, instead of being certain they are by basing their decisions on solid data.

Playing the probability hiring game versus making data-driven decisions

Many hiring managers and HR recruiters are playing the probability hiring game. It’s betting that the new hire chosen using imprecise methods will work out. And like any bet, it gets expensive quickly when a wrong choice is made. There’s a 30% chance the new hire will make it through one year, and if they don’t, it will cost at least 1.5 times their salary to replace them.

With the median salary for a cloud computing professional at $146,350, and a best case of 46 days to find one, the cost and time lost when just one recruited cloud computing professional leaves can derail a project for months. It will cost at least $219,000 to replace that one engineer. And with the average engineering team at ten people, a 30% retention rate means only three will remain after 12 months. These are the high costs of playing the probability hiring game, fueled by unconscious and conscious biases and by systems that game recruiters into believing they are making progress when they’re automating mediocre or worse decisions.
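
The arithmetic behind those figures is simple to sanity-check. Here is a quick sketch using the article’s own numbers (the 1.5x replacement multiplier and 30% one-year retention rate are the report’s assumptions):

```python
# Worked example using the figures quoted above
median_salary = 146_350          # median cloud professional salary ($)
replacement_multiplier = 1.5     # cost to replace, as a multiple of salary
one_year_retention = 0.30        # chance a new hire lasts 12 months
team_size = 10

replacement_cost = median_salary * replacement_multiplier
survivors = team_size * one_year_retention

print(f"Cost to replace one engineer: ${replacement_cost:,.0f}")  # ~$219,525
print(f"Engineers left after a year:  {survivors:.0f} of {team_size}")
```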

Hiring managers will have better luck betting in Las Vegas or playing the lottery than hiring the best possible candidate if they rely on systems that only deliver a marginal probability of success at best.

Betting on solid data and personalization at scale, on the other hand, delivers real results. Real data slices through the probabilities and is the best equalizer there is at eradicating conscious and unconscious biases from hiring decisions. Hiring managers, HR recruiters, directors and Chief Human Resource Officers (CHROs) vow they are strong believers in diversity. Many are abandoning the probability hiring game for AI- and machine learning-based approaches to talent management that strip away any extraneous data that could lead to bias-driven hiring decisions. Now candidates get evaluated on their capabilities and innate strengths and how strong a match they are to ideal candidates for specific roles.

A data-driven approach to finding employees who can scale

Personalization at scale is more than just a recruiting strategy; it’s a talent management strategy intended to flex across the longevity of every employee’s tenure. Attaining personalization at scale is essential if any growing business is going to succeed in attracting, acquiring and growing talent that can support its growth goals and strategies.

Eightfold’s approach makes it possible to scale personalized responses to specific candidates in a company’s candidate community while defining the ideal candidate for each open position. Personalization at scale has succeeded in helping companies match the right person to the right role at the right time and, for the first time, personalize every phase of recruitment, retention and talent management at scale.

Eightfold is pioneering the use of a self-updating corporate candidate database. Profiles in the system are now continually updated using external data gathering, without applicants reapplying or submitting updated profiles. The taxonomies supported in the corporate candidate database make it possible for hiring managers to define the optimal set of capabilities, innate skills, and strengths they need to fill open positions.

Lessons learned at PARC

Russell Williams, former Vice President of Human Resources at PARC, says the best strategy he has found is to define the ideal attributes of high performers and look to match those profiles with potential candidates. “We’re finding that there are many more attributes that define a successful employee in our most in-demand positions, including data scientist, than are evident from just reviewing a resume – and with AI, I want to do it at scale,” Williams said.

Ashutosh Garg, Eightfold founder, added: “That’s one of the greatest paradoxes that HR departments face: the need to know the contextual intelligence of a given candidate far beyond what a resume and existing recruiting systems can provide.” One of the most valuable lessons learned from PARC is that it’s possible to find the candidates who excel at unlearning, learning, and defining and diligently pursuing the learning roadmaps that lead to reinventing their skills, strengths and marketability.

Conclusion

Machine learning algorithms capable of completing millions of pattern-matching comparisons per second provide valuable new insights, enabling companies to find those who excel at reinventing themselves. The most valuable employees – the ones who can scale with any business – see themselves as learning entrepreneurs and have an inner drive to master new knowledge and skills. That select group of candidates is the catalyst most often responsible for making the greatest contributions to a company’s growth.


Does tape still have a place in my backup strategy?


Nik Rawlinson

20 Dec, 2018

“I know what some of you are thinking: how could this happen if we have multiple copies of your data in multiple data centres?” wrote Ben Treynor, Google’s VP of engineering. It was 2011, and Treynor was using a Google blog to address an embarrassing glitch. In the process of updating its storage software, the company had wiped 0.02% of its users’ email. That doesn’t sound like much – until you consider how many inboxes Gmail hosts.

The solution was more complex than merely switching to another data centre. As Treynor explained, “to protect your information from these unusual bugs, we also back it up to tape. Since the tapes are offline, they’re protected from such software bugs, but restoring data from them takes longer than transferring your requests to another data centre, which is why it’s taken us hours to get the email back instead of milliseconds.” All told, the restore operation took the best part of four days.

Seven years later, tape remains a mainstay of even the biggest cloud providers’ contingency plans. Should it also be part of yours?

Tape vs cloud

“I’m not saying that tape is always the right place to store all data all the time, [but] it’s the right place to store large amounts of data that you don’t need all the time for long periods of time,” explained Overland-Tandberg’s tape product line director, Peri Grover. “Mission-critical transactional stuff that you need back in a matter of seconds – of course you want that on disk or some other solution, and a lot of people are talking about the cloud.”

She readily agrees that there are advantages to cloud storage, but points to the impracticality of moving large amounts of data between sites. “Amazon Web Services may look inexpensive on the surface,” said Grover, “but when people start checking into it and find out that if they’ve got a 10TB file they need a 40Gbit [connection] to get it across in seconds, [they realise] that’s going to cost them hundreds of thousands of dollars.”
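
Grover’s wider point is easy to verify with back-of-the-envelope arithmetic, since transfer time is just data size divided by usable bandwidth. A quick sketch – the link speeds are illustrative, and real-world throughput will fall below line rate:

```python
# Rough transfer times for a 10TB dataset at various link speeds.
# Assumes ideal throughput; congestion and protocol overhead add more.
data_bits = 10 * 10**12 * 8  # 10 TB expressed in bits

for label, bits_per_second in [("100 Mbit/s", 100e6),
                               ("1 Gbit/s", 1e9),
                               ("10 Gbit/s", 10e9),
                               ("40 Gbit/s", 40e9)]:
    hours = data_bits / bits_per_second / 3600
    print(f"{label:>10}: {hours:8.1f} hours")
```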

The transfer rates you get with tape could mean a faster recovery

Eric Bassier, senior director for product management at Quantum, also cites cloud’s relatively low speed as a reason for sticking with tape. With a service such as Amazon Glacier, he said, “its published SLA is hours, and that doesn’t count the time to transmit the data from the data centre to the local application. Your SLA from tape is minutes, versus days from the cloud. If you have tens or hundreds of terabytes of data to retrieve in a true disaster recovery scenario, it can be weeks.”

This is enough to make one of cloud’s key benefits – global availability, 24/7 – a moot point. “With tape, you have got an impressive transfer rate once you’ve got to the first byte of data,” said Carlos Sandoval Castro, worldwide offering manager for tape storage at IBM. “For many companies, it’s much easier to just ship a bunch of cartridges to whatever location so they can get their data back, rather than wait to retrieve it from the cloud.”

Cost analysis

It’s a common assumption that cloud is cheap, with providers routinely demanding mere cents for each gigabyte. Surely tape can’t compete.

Actually, it can, said Bassier. “We’re seeing companies starting to bring datasets and workloads back from the cloud because they’re realising it’s too expensive. When they look at tape or cloud as alternatives for long-term storage, a key factor is the cost of retrieval, which is free using on-premise tape, but can be expensive from the cloud.”
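
A hedged sketch of that retrieval-cost comparison follows; the per-gigabyte egress price and archive size are illustrative assumptions, not quoted figures, and actual cloud pricing varies by provider and tier:

```python
# Compare the cost of retrieving an archive from cloud vs on-premises tape.
# The egress price is an assumed illustrative figure, not a quote.
archive_tb = 100
egress_per_gb = 0.09       # assumed $/GB cloud data-transfer-out price
retrievals_per_year = 4

cloud_cost = archive_tb * 1000 * egress_per_gb * retrievals_per_year
tape_cost = 0              # retrieval from owned on-premises tape is free

print(f"Cloud retrieval cost/year: ${cloud_cost:,.0f}")
print(f"Tape retrieval cost/year:  ${tape_cost:,.0f} (hardware owned up front)")
```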

The same is true of disk. “Clifford Group compared the cost of tape-only and disk-only archiving solutions,” Grover explained. “When it took into account cooling and footprint, the cost of the hardware, maintenance, media… the disk solution was six times more expensive than tape, at $15 million versus just $2.4 million.”

Storing your storage media

While it’s starting to approach its limits where areal density is concerned, tape still has some way to go – and a multi-year roadmap to prove it. New standards, currently under development, are touching capacities of 185TB per cartridge. They’re not yet commercially available, but they have existed, in the lab, since 2014.

Increasing tape’s areal density has other benefits. It’s not only the price per gigabyte that’s falling as more data is written to each cartridge; the cost of storing the media is falling in sync.

“An LTO 1 cartridge held 200GB compressed,” said Laura Loredo, senior product manager for Hewlett Packard Enterprise, “but today’s cartridges are 30TB compressed, so people migrate for the benefits of consolidating their data.” She cites freeing up slots in their libraries as one such benefit.

Still worried about the longevity of tape? Each cartridge has a stated working life of three decades

This makes the cartridges’ stated working life – three decades apiece – somewhat moot but, as Grover explained, that metric is more a means of quantifying their durability than an indication of how they’re used in the real world.

“Tape technology originally developed from consumer cassettes, which had a reputation for not being very robust,” Grover said. “People would leave them on the dash of their car and they’d melt. So, that’s where the 30-year claim came from. The industry had a longevity robustness perception problem to overcome and once LTO came along, all that baggage associated with tape was logically gone but not emotionally gone in some of the old-timer IT guys’ mindsets.”

Security built-in

The removable nature of tape means it’s uniquely protected against security threats that afflict otherwise live storage media. As Grover explains, “it isn’t subject to things like malware, viruses or ransomware the way disk is, simply by nature of the media’s physicality”.

Hewlett Packard’s Loredo agrees:

“Tape provides an air gap because it’s offline. If your primary disks get attacked with ransomware, you can go to your tapes and recover your data from there without paying the ransom. It’s the only media that provides proper protection against malware and ransomware.”

But not paying the ransom isn’t the only saving you’d make in such a situation – or in a natural or man-made disaster scenario.

“Downtime can cost as much as $3 million an hour if you’re a credit card company,” said Grover. “You can’t risk that. All you have to do is talk to an IT guy about being down or not being able to get your data back – that’s a hard thing to put a number around. As an IT guy, you don’t lose your job for not backing up the data; you lose it for not being able to get it back.”

At this point, the discussion closes a loop. Keeping the data close to home makes it immediately accessible, which reduces the duration – and thus cost – of any downtime while minimising ongoing overheads.

However, there are other reasons why you might favour tape over online storage.

“Once you start putting your data into a provider’s service, you’re locked in,” said Loredo. “You’ll have to get it all out again before you can move [to another provider]. And when you send your data to a third party you’ve lost control of it. Do you know how secure it is there? With GDPR, you should be concerned about that.”

Grover has similar sentiments. “[Data is] your company’s most valuable asset, and you want to trust that to somebody else? If you want to do it in parallel, that’s great, but they have no interest in your company, don’t reside on any of your campuses… no matter how much you pay them, they’re the ones who have control over when you get your data back and how it gets stored. Are you really good with that?”

Is the cloud the next thing for long-term data retention? Looking at the key vendors in the space

For any organisation in this era, there is a realisation of how critical data is to business needs and operations.

An enormous amount of data has already been produced since cloud computing disrupted organisations of all types, be it in education, finance, healthcare or manufacturing. Today, organisations are particularly concerned about the data built up over the last 15 to 20 years amid the surge in IT infrastructure.

These data and applications are probably not in active use, but they matter to organisations because they contain critical information with compliance requirements around it. Securing old data (unstructured content, applications, virtual machines) is becoming crucial, so there has to be a cost-effective, reliable archiving solution that stores and secures data while offering rapid access when needed.

In the past, IT management saved data to tape drives or on-premises data centres without any filtering. But data demands have drastically changed.

Even more data will be produced in the next five to seven years as more digitally connected devices become part of business operations. Data will be fuel for any business, as companies extract analytical insight from it to get ahead of the competition or stay aligned with consumer demands. This digital transformation is not just about acquiring new technology, but about saving CAPEX and OPEX every time the data centre moves ahead in innovation.

As data grows, edge computing architectures will bring data centre systems closer to digital devices for information processing (machine learning/analysis), with only a small subset of information pushed to the cloud or a private data centre.

How will organisations deal with past data when real-time data will also need to be archived for reference? How will they handle data in a hybrid or multi-cloud model, where private and public clouds are used for different data processing purposes? Will automation be available to constantly sync data according to the archival methods integrated into an archival strategy? And what about security against external breaches or physical damage to archival systems?

Various vendors have developed solutions to address these needs, so organisations can select one that fits their requirements and can be customised to budget. In this post, I take a look at data archival solutions from leading vendors Rubrik, Cohesity and Zerto. Let’s evaluate their solutions.

Cohesity: Enterprise-grade long-term retention and archival

Cohesity’s solution lets you leverage both cloud and tape to archive data based on the organisation’s requirements. In what the company calls its cloud-native approach, archival is possible – apart from tape – to public clouds, private clouds, Amazon S3-compatible devices and QStar-managed tape libraries. The solution enables IT management to define workflow policies for automated backup and archival, and consists of two Cohesity products: Cloud Archive and Data Protect.

Cloud Archive allows organisations to leverage the public cloud for long-term data retention, while Data Protect helps reduce long-term retention and archival costs with its pay-as-you-go pricing model.

Rubrik: Data archival

Rubrik’s solution provides support to organisations for data management on hybrid cloud environments. Organisations can choose their storage and architecture containing:

  • Archive to Google Cloud Storage
  • VMware vSphere, Nutanix AHV, and Microsoft Hyper-V
  • Microsoft SQL Server
  • Oracle, Linux, Windows, UNIX, and NAS
  • Remote and branch offices

The client uses real-time predictive global search to access archived data: files appear directly from the archive as you type in the search box, drastically reducing access time. It is also possible to instantiate VMs in the cloud itself with Rubrik’s solution.

Data deduplication further reduces transfer and storage costs, and all data is encrypted before being sent from physical devices to the target storage infrastructure. Users are presented with a simple, responsive HTML5 interface to set up policy-driven automation and archival targets.

Zerto: Zerto virtual replication

Zerto offers a different solution for data archival compared with Rubrik and Cohesity: it archives data using an ad-hoc feature in its main software, Zerto Virtual Replication. With this feature, it is possible to take daily, weekly and monthly backups of data to be archived, and the archival target can be tape, a network share in a third location, a dedicated disk-based backup device, or even cheap S3 or Blob storage in AWS or Azure.

The latest release supports continuous data protection (CDP), replication, automated orchestration and long-term retention with offsite backup. A Journal File Level Recovery mechanism is used to restore backed-up data quickly.

Conclusion

Beyond Rubrik, Cohesity and Zerto, other vendors offer different types of solutions for different workloads and diverse requirements. But these three can serve most new-age workloads, such as data generated by IoT devices, machine learning analysis data and unstructured big data lakes.

As organisations evaluate new technologies to deal with data, a proper archival or long-term retention solution will help them get the most out of past data while allowing them to focus on newly generated data. From this evaluation, it is clear that most vendors are focused on utilising public or hybrid cloud environments to archive long-term data. Using a hybrid cloud means a private cloud can store the data bound by the compliance and security norms critical to the organisation. Ultimately, though, it is up to each organisation which solution to go with, as there are good options available.




Fears mount around Russian influence over Pentagon cloud data contract


Connor Jones

12 Dec, 2018

AWS is leading the bidding for the Pentagon’s JEDI (Joint Enterprise Defense Infrastructure) contract to store sensitive military data in a commercial cloud, and it has been linked to a bidding partner with connections to a sanctioned Russian oligarch.

The BBC reported that AWS is being helped by cyber investment firm C5 Group to secure the multi-billion dollar contract, which could see data such as nuclear codes being stored in the cloud.

C5 Group is linked with Viktor Vekselberg, an oligarch who has recently been sanctioned by the US for having close ties with the Kremlin.

Vekselberg "poses a risk to the US", said Michael Carpenter, a former Pentagon official: "Any oligarch in Russia, when called upon by the Kremlin to do their bidding, will do so, and that is the condition on which they keep their wealth."

Vekselberg’s former right-hand man Vladimir Kuznetsov is a major shareholder in a C5 subsidiary, C5 Razor Bdico, but apparently became one of his own volition, using his own money and without instruction from Vekselberg.

Vekselberg was sanctioned by the US, and soon afterwards he was stopped before boarding a flight in New York on suspicion of involvement in Russia’s interference in the 2016 presidential election. His electronic devices were seized, but he denied any wrongdoing.

It emerged earlier this year that Columbus Nova, a company affiliated with Renova Group, had paid £500,000 to Michael Cohen, Donald Trump’s lawyer at the time. Renova Group is a Russian conglomerate which, until April 2018, had Vladimir Kuznetsov on its board.

The fears over Vekselberg’s connection to a leading bidder stem from the fact that he could have influence over a company that could hold as much as 80% of all sensitive US military data, including nuclear launch codes and military personnel locations, some reports suggest. It is logical to fear what Russia could do with access to military data, especially considering how successful its election tampering proved.

Disclosure of bidding companies is prohibited, so the Pentagon has declined to comment, while both AWS and C5 Group have said the pair are not involved together in the bidding process at all, contrary to the BBC’s reports.

The JEDI contract was devised to help the US compete with China and Russia. US Major General David Krumm, who helped draft the contract, said it would help the US win wars.

Speaking at the contract's launch, he said: "The information has to be available to an army platoon that a friendly unit is just around the block and will not open fire.

"It's got to be available to a platoon of Marines who are about to breach a door that an IED has been found."

There are fears that if the Pentagon’s IT systems are not updated soon, the US will lose a future war. As it stands, the data due to be moved into the cloud is stored on smaller servers in different departments around the Pentagon; having it in one place would, in theory, make data sharing much more efficient.

Other criticisms of the contract come from a more commercial standpoint, with Oracle earlier in the year voicing its concerns that Amazon has an unfair advantage in securing the contract and that the deal was tailor-made for it.

<a href="https://www.cloudpro.co.uk/saas/7752/microsoft-explains-why-it-still-sel… target="_blank">Microsoft</a> also joined Oracle in the Department of Defence's complaints pile, claiming that limiting the contract to one vendor means the Pentagon would miss out on emerging technology from other cloud companies.

Symantec and Fortinet team up to deliver comprehensive cloud security service


Daniel Todd

12 Dec, 2018

Cyber security firms Symantec and Fortinet have announced an expansive partnership that aims to deliver comprehensive security services across endpoint, network and cloud environments.

As part of the deal, Fortinet’s FortiGate Next-Generation Firewall (NGFW) will be integrated into Symantec’s cloud-delivered Web Security Service (WSS), resulting in “the most comprehensive set of cloud-delivered threat prevention capabilities in a single service on the market”, according to the companies.

Symantec’s endpoint protection will also be combined with the Fortinet Security Fabric platform to provide customers with real-time, actionable threat intelligence and automated response to tackle exploit-driven attacks and advanced malware.

“As the first step in this technology partnership, we plan to deliver best-of-breed security through the combination of enterprise-class advanced firewall controls to Symantec’s industry-leading network security service,” said Art Gilliland, EVP and GM of enterprise products at Symantec.

“Through this partnership, we hope to provide joint customers the power of Symantec’s Integrated Cyber Defense Platform bolstered by Fortinet’s leading NGFW in an integrated solution that’s easy to use and deploy.”

Fortinet’s SD-WAN technology will also be certified to interoperate with Symantec’s WSS through the latter’s Technology Integration Partner Program (TIPP), while the agreement will also see both firms engage in joint go-to-market activities.

“Upon completion of the integration, Symantec cloud web gateway customers will be able to benefit from Fortinet’s enterprise-class advanced firewall controls, and for the first time ever, Fortinet customers will be able to purchase the industry-leading FortiGate Next-Generation Firewall via FWaaS,” said John Maddison, SVP of products and solutions at Fortinet.

“With the addition of Symantec as a Fortinet Fabric-Ready Partner, Symantec’s endpoint security solution will be validated to seamlessly integrate with the Fortinet Security Fabric platform to provide more consistent and effective protection for joint customers.”

Some elements of the Fortinet Security Fabric have already been integrated with Symantec Endpoint Protection and the companies are planning to explore further possibilities in the future. The cloud firewall service and WSS integration is expected to be available during the first half of 2019.

Best tech podcasts for 2019


Steve Clark

27 Dec, 2018

Podcasts are more popular than ever, but where should you turn your ears for the best news, views, insights and chat?

Our list of 2019’s best podcasts pulls together everything from insightful analysis and security breach breakdowns to general tips and chat shows about the latest goings on in the world of technology.

There’s really something for anyone involved in the tech industry, so why not have a listen?

How to listen to podcasts

To get started, download a podcasting app. We rate the cross-platform Castbox (castbox.fm), but iTunes, Spotify and Google Podcasts are also great options. If you’re strictly into BBC podcasts, Auntie recently unveiled BBC Sounds for instant access. Next, start searching.

Most podcasts are available on every platform, but switch to another app if you can’t find your show. And don’t forget to hit the download button when you’re connected to Wi-Fi – streaming podcasts chews through your mobile data.

Tech thoughts & tips

PC Pro

A shameless plug on behalf of our sister title. Hosted by the magazine’s editor-in-chief Tim Danton and featuring Web User columnist Barry Collins, the PC Pro podcast is like an hour-long fireside chat, as tech experts demystify technology’s latest trends and topics. With new episodes debuting fortnightly, you’re never short of opinions and views.

Clockwise

Clockwise’s setup is simple: “Four people. Four tech topics. Thirty minutes”. Affable hosts Mikah Sargent and Dan Moren take the lead, joined by a rotating panel to discuss everything from the state of social media to smart toilets. You can catch up with Clockwise every Wednesday.

Chips with Everything

The Guardian’s Chips with Everything is a snack-sized show that’s deliciously filling. Part interview, part documentary, episodes run between 20 and 30 minutes – just enough time for host Jordan Erica Webber to take a sideways glance at today’s technology trends and their world-altering impact.

Internet of Things

If you’re after some breezy chat, tech journalist Stacey Higginbotham’s Internet of Things podcast is not for you. This weekly show is geared towards a tech-smart audience interested in consumer and business technology, platforms, privacy and politics. Expert guests also offer tips and advice in this hour-long podcast.

Internet culture

IRL

“Online life is real life” is the strapline of In Real Life – a bi-monthly podcast produced by Mozilla (of Firefox fame). Unlike most tech podcasts, IRL isn’t a talking shop. You’ll hear real-life stories, like the community who built a better internet or the girl paid to write messages on Tinder.

Reply All

Reply All is investigative journalism for the internet, exploring online phenomena such as the ‘Instagram for Doctors’ app, Tinder weirdos and message-board mysteries. As one testimonial puts it, “it’s a podcast that tells gorgeous, painfully human stories that happen to have bits of technology sprinkled in”.

This is Only a Test

This is Only a Test is the official podcast for Tested.com – the tech, science and geek culture site run by TV’s former Mythbuster Adam Savage. TIOAT is a rambling 90-minute show that, like its parent site, covers “anything that’s awesome”. You can also watch it on YouTube.

Deeper insights

Twenty Thousand Hertz

Twenty Thousand Hertz studies the history of world-famous audio – from jingles to sound effects and startup noises – and why they’re so effective. Produced by an award-winning sound-design studio (so they certainly know their onions), the podcasts range between 20 and 30 minutes, and feature in-depth interviews that cut through the noise.

Bloomberg’s Decrypted

As you’d expect from the business-orientated Bloomberg, Decrypted offers serious, in-depth reports. These are ‘peek-behind-the-curtain’ podcasts that shine a light on varied global technology topics like how Facebook’s ads really work or why bitcoin still matters. Experts are interviewed, data extrapolated and secrets uncovered, all in just 30 minutes.

TechStuff

For 10 years HowStuffWorks.com’s TechStuff, hosted by Jonathan Strickland, has served up fascinating insights into technology old and new (one week it’s AI, the next it’s DARPA). Where topics span beyond the 40-minute run-time, they’re chopped into multiple episodes. The show occasionally reruns TechStuff Classic podcasts, so newcomers don’t miss out.

TED Talks

TED Talks is to podcasts what Stephen Fry is to Twitter – it’s the iconic ‘brand’ that everyone follows. With its vast range of topics and experts, enjoy eye-opening content that changes the way you see the world. If you’re pushed for time, check out the bite-sized Ted Talks Daily.

Topical tech chat

BBC Click

The BBC has loads of science and technology podcasts, but start out with the World Service’s Click. This weekly podcast (effectively a 40-minute radio show), covers global technology news. As with most Beeb content, podcasts are only available for 30 days before they’re wiped.

TheVergeCast

The Verge team gather every Friday night to discuss the stories behind the week’s news. It’s a chilled-out weekly news round-up: just a few friends gabbing about technology. Interspersed throughout the week are reviews and event coverage. Episodes range from 30 minutes up to a whopping 90.

This Week in Google

Google’s products and services dominate the internet. And the TWiG team – Leo, Jeff, and the IoT podcast’s Stacey Higginbotham – are giving the tech firm the hard-eye. Don’t expect fanboyism. This Week in Google is fair-minded, unafraid of calling out Google’s missteps or praising their successes.

The Two Techies

Each Saturday, the Two Techies, Aaron and Jamie, make sense of events in the tech world. Given that it’s just two mates having a chat, it’s a remarkably well-polished production. Each ep lasts “around an hour or less”, and sometimes features well-known ‘techie’ guests – most notably Apple guru Steve ‘Woz’ Wozniak.

Future technology

Wall Street Journal’s Future of Everything

Using technology news as a launchpad (think quantum computing, cryptocurrencies and cyber-attacks), the WSJ’s Future of Everything takes a 20-minute gander at where our digital future lies. For instance, if voting machines can be hacked, could we one day vote on our smartphones?

Future Tense

The Australian Broadcasting Corporation’s Future Tense is a podcast for those right on the bleeding edge of technology. Discover how the very latest tech is transforming our culture and our lives. Given its broadcasting background, Future Tense has that serious news vibe, focusing on interviews with leading experts and insiders.

Security breach

Latest Hacking News

Clocking in at six minutes apiece, Latest Hacking News is the quickest way to catch up on the day’s cyber-security news. It’s aimed at IT professionals (they use terms that’ll send you scurrying for Google), but that gives it absolute authority. Check out the website for additional news coverage.

Hackable

Hackable is brought to you by the godfathers of antivirus, McAfee. Over the course of 30 minutes, the cybercrime podcast reveals the many ways we’re vulnerable to hackers. Let’s put it this way: after listening to Hackable, you’ll never go near another virtual reality unit.

Smashing Security

Winner of the ‘Best Security Podcast 2018’, Smashing Security claims it’s “not your typical cybersecurity podcast”. It’s a light-hearted round-table chat about hacking, cybercrime and online privacy that brings some much-needed levity to an otherwise serious subject.