Tech News Recap for the Week of 09/22/17

If you had a busy week and need to catch up, here’s a tech news recap of articles you may have missed for the week of 09/22/2017!

Microsoft is adding a potent security feature to Windows 10. HP announces 360 Secure Fabric analytics security solution. The 6 phases of adopting cloud security practices. A Locky ransomware variant continues to spread, plus more top news from this week that you may have missed! Remember, to stay up-to-date on the latest tech news throughout the week, follow @GreenPagesIT on Twitter.

Tech News Recap

Featured

  • VMworld 2017: NSX Cloud, AppDefense + VMware’s New Direction

IT Operations

[Interested in learning more about SD-WAN? Download What to Look For When Considering an SD-WAN Solution.]

Microsoft

  • Microsoft shows VMware and Oracle how to get real about open source
  • Microsoft is adding a potent security feature to Windows 10
  • Office 365 phishing attacks create a sustained insider nightmare for IT

Cisco

HPE

Cloud

Security


By Jake Cryan, Digital Marketing Specialist

MongoDB sets up to go public, reveals $101m yearly revenues in filing

Database provider MongoDB has filed to go public, confirming reports from a month previously and potentially becoming the third major cloud IPO of 2017 after Okta and Cloudera.

The SEC filing, available to view here, shows MongoDB made $101.4 million in total revenue in the year ending January 31, 2017, up 55% from the previous year’s $65.3m, which in turn was up 60% from 2014-15’s $40.8m. Despite a total gross profit of $71.4m, total operating expenses of $157.4m put the overall net loss at $86.7m.

The company has raised more than $300m over six funding rounds – not including two undisclosed ventures – and was valued at $1.2 billion in 2013.

Continued losses are of course nothing new in this space. Yet, pointing to IDC figures that put the global database market at $61.3bn, MongoDB cites a ‘highly differentiated business model that combines the developer mindshare and adoption benefits of open source with the economic benefits of a proprietary software subscription business model’ as its primary route to taking that market.

The company’s freemium product, Community Server, has been downloaded more than 30 million times in eight years with a third of that coming in the past 12 months, while 90% of MongoDB’s total revenue comes through subscriptions.

MongoDB names legacy relational database software providers – IBM, Microsoft, Oracle ‘and other similar companies’, per the document – as competitors, as well as non-relational database providers and cloud providers, with Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure cited.

No non-relational database providers were named in the SEC filing; for now, however, it appears MongoDB has beaten Couchbase to the punch in going public. In March last year, the Mountain View firm secured a $30 million series F funding round, with CEO Bob Wiederhold telling this reporter it would give the company a ‘runway we need to have what we think will be a very successful IPO in the not too distant future.’

Fast forward 15 months, however, and Wiederhold has stepped aside from the CEO chair – although remaining executive chairman – to be replaced by former Veritas president Matt Cain, with no word of a public offering forthcoming.

Okta and Cloudera, who filed within weeks of each other earlier this year, set out solid financial results in their first quarter as public companies. Okta saw total revenues of $61m in the quarter, an increase of 62.9% year on year, while Cloudera hit $89.8m, up 39% from the year before.

What are the challenges of a hybrid cloud strategy?

When cloud first emerged as a potent technology, the reaction from organizations was mixed. Some jumped on the bandwagon right away, while others held back until their concerns, especially around security, were addressed. Within a few years, AWS and Microsoft emerged as the leaders of the cloud market, and their rise shaped the industry and helped it mature.

Today, organizations that want to embrace cloud prefer a hybrid cloud strategy, where they use the services of multiple cloud vendors and also retain their own data centers. The obvious advantage of this strategy is that they can tap into the benefits of different cloud providers while still leveraging the power of their own data centers.

In fact, research by Gartner shows that by 2020, 90 percent of organizations will adopt a hybrid cloud strategy because of the flexibility and cost savings that come with it.

That said, implementing a hybrid cloud strategy is anything but easy because of the complexities that come with it. For example, say you have application A running on Azure, application B running on AWS, and application C in your own data center. How will you control access to all three applications? Will you use single sign-on for straightforward user identity management, or will you opt for a more complex user management system?

The above scenario should give you a glimpse into the challenges that go with a hybrid cloud strategy. Let’s look at a few more now.

On-premise or cloud

One major challenge in hybrid cloud implementation is deciding which services should be managed on-premises and which in the cloud. This complication arises from the growing sophistication and falling prices of computing resources, i.e. the CPU and RAM that power systems. Vendors like Dell and HP are providing increased capabilities, thereby giving greater value for money.

Such a scenario makes it difficult to determine whether you should deploy workloads on your on-premise infrastructure or on an off-premise system, given the continuous consolidation of server environments.

Security concerns

Despite all the advancements made in cloud security, concerns still abound. There is a lot of skepticism about whether critical data will be safe in the cloud. In many cases, such fears lead to prohibitive costs, especially if you have a large and complex IT infrastructure.

Planning

If you don’t plan well, you could be losing thousands of dollars every month to ineffective IT systems. You need to stay on top of developments in the cloud industry as well as hardware advancements in the data center industry to maximize the benefits of both.

Overall, a hybrid cloud strategy is being preferred over other forms of cloud implementation, and rightly so, because it offers many benefits. At the same time, it also comes with challenges. So make sure you plan for and address these challenges to truly leverage the power of hybrid systems.

The post What are the challenges of a hybrid cloud strategy? appeared first on Cloud News Daily.

Multi-cloud strategies ‘urgent’ for European organisations, says IDC

A quarter of IT firms in Europe are already operating tiered applications across on-premise and off-premise environments, yet only a few companies have long-term strategies in place, according to the latest note from IDC.

The study, which polled more than 800 IT and line of business decision makers in 11 European countries, found that 31% prefer to hook up their systems to apps on a public cloud with back-end systems on-premise, while 8% opted for ‘bursting’ capacity into off-premise environments. 40% of respondents segregated their on- and off-premise environments, and 20% did not venture into public cloud at all.

Only one in five line of business respondents said they would find standardising on one or two infrastructure (IaaS), platform (PaaS), or software as a service (SaaS) vendors acceptable, a number which goes up to one in three for the IT side.

IDC pointed to companies such as ING Bank and Siemens, whose models consume cloud services from several locations, as well as the ‘challenges’ this presents to CIOs. As a result, the analyst firm advocates a multi-cloud strategy “based on hiring staff with negotiation skills, expanding investments in automation software, and revising cross-country connectivity options”.

“Connecting cloud environments with ad hoc bridges in a hybrid fashion won’t be enough in 2018… nor will standardising on one external provider, at least for large or innovative companies,” said Giorgio Nebuloni, IDC’s European infrastructure group research director. “Developers and line of business require ‘best of breed’, and the purchasing department wants to avoid being locked in.”

The growing need for multi-cloud was covered by this publication earlier this week, when Thor Culverhouse, CEO of Skytap, argued that the wider war around the cloud – which is not simply a two-horse race between Amazon Web Services (AWS) and Microsoft – had not started yet. “CIOs are starting to wake up to the fact that there’s going to be multiple clouds to address multiple workloads and applications,” he told CloudTech.

Half of companies fail to meet PCI DSS compliance standards: Is your infrastructure up to it?

Only 55.4% of companies meet all PCI DSS compliance standards, according to a new report released by Verizon. While this number is up 7% from 2015, it still means that nearly half of retailers, IT services companies, payment software providers and hospitality organisations do not adequately protect credit cardholder information.

Companies had the greatest difficulty meeting the following requirements, many of which are related to infrastructure compliance and policies:

  • Requirement 3 – Protect stored cardholder data. Requirement 3 also saw the second highest use of compensating controls globally.
  • Requirement 6 – Develop and maintain secure systems, covering the security of applications, and particularly change management.
  • Requirement 11 – Test security systems and processes, including vulnerability scanning, penetration testing, file integrity monitoring, and intrusion detection.
  • Requirement 12 – Maintain information security policies. Control 12.8 (Manage service providers with whom cardholder data is shared) was the weakest of the Requirement 12 controls.

Additionally, 44.6% of companies fall out of PCI DSS compliance within nine months of validation.

At a time when 51% of compliance officers in financial services firms report a skills shortage in compliance, it is perhaps no wonder that many companies have fallen behind. Rather than hire more staff, 67% of IT leaders would prefer an automated approach to infrastructure compliance, which is usually a cloud-based solution. One of the many reasons cloud solutions are appealing is that the cloud platform (such as AWS or Azure) takes care of most physical security controls, reducing the overall cost and effort of building a compliant system. According to Gartner, more than 50 percent of new enterprise North American application adoptions in 2017 will be composed of SaaS, PaaS, or IaaS solutions.

Infrastructure compliance automation on the public cloud (Amazon Web Services, Azure), often referred to as continuous compliance or DevSecOps, has received increased attention in 2017. In basic terms, infrastructure compliance automation consists of several cloud-based tools, such as configuration management and infrastructure templates, that allow engineers to easily spin up compliant infrastructure, track configuration changes, and reduce manual compliance work.
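As a rough illustration of the idea (not tied to any particular compliance product), here is a minimal Python sketch of an automated check, assuming the boto3 library is installed and AWS credentials are already configured: it flags S3 buckets that lack default server-side encryption, the kind of control a continuous-compliance pipeline would run on a schedule rather than verify by hand.

```python
# Minimal sketch of an automated infrastructure compliance check (illustrative only).
# Assumes boto3 is installed and AWS credentials are configured in the environment.
import boto3
from botocore.exceptions import ClientError


def unencrypted_buckets():
    """Return the names of S3 buckets with no default server-side encryption."""
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_bucket_encryption(Bucket=name)
        except ClientError as err:
            if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
                flagged.append(name)  # no default encryption configured
            else:
                raise
    return flagged


if __name__ == "__main__":
    for name in unencrypted_buckets():
        print(f"Non-compliant bucket (no default encryption): {name}")
```

A real pipeline would run dozens of such checks automatically and feed the results into reporting, rather than relying on periodic manual audits.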

The complexity of meeting infrastructure compliance requirements is growing, especially for companies that host large amounts of sensitive financial data. As companies explore cloud options, expect to see a shift in compliance management away from manual compliance work and towards cloud automation.

The post Nearly Half of Companies Fail to Meet PCI DSS Compliance Standards appeared first on Logicworks.

How to Identify Potential Malicious Attacks on Firewalls

Firewalls are an integral part of a network’s safety and security, which is why they are often considered the pillar of any cyber security program. Firewalls constantly face a barrage of attacks from different sources, ranging from automated programs to experienced hackers. Though the firewall is constantly blocking unauthorized attacks from unknown sources, it is nevertheless important for network managers to stay on top of their firewall’s performance. This helps them identify and mitigate the effects of potential malicious attacks.

Logs Have All the Information

The first place to look in the case of an attack is the logs. Every login and activity performed on the network is recorded in the log files, so look for the source of the problem there. This will give you more insight into the nature of the attack, and from this information it is possible to know which parts of the network may have been compromised. Further, these log files will contain information about the source of the attack, which can help to identify the perpetrator as well.
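As a simple illustration, here is a minimal Python sketch of that kind of log review. It assumes a hypothetical space-delimited firewall log in which blocked connections are marked “DENY” with an “SRC=” field; adjust the pattern to match your firewall’s actual log format.

```python
# Minimal sketch: tally denied connections per source IP in a firewall log.
# The log format ("DENY" keyword plus an "SRC=<ip>" field) is a hypothetical
# example; adapt the parsing to your firewall's actual log layout.
import re
from collections import Counter

DENY_LINE = re.compile(r"DENY\b.*\bSRC=(?P<src>\d{1,3}(?:\.\d{1,3}){3})")


def denied_sources(log_path):
    """Count denied connection attempts per source IP address."""
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            match = DENY_LINE.search(line)
            if match:
                counts[match.group("src")] += 1
    return counts


if __name__ == "__main__":
    for ip, hits in denied_sources("firewall.log").most_common(10):
        print(f"{ip}: {hits} denied attempts")
```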

Port Scanning

Hackers use port scanners to identify open ports on the firewall through which they can attack the network. If you suspect a malicious attack, scan the log files for requests that came from the same IP to multiple ports. The firewall is designed to block repeated requests from the same IP, and information about these requests will be available in the log files. This gives you more insight into the source of the attack and, more importantly, lets you block that IP from accessing any part of your network in the future.
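A minimal sketch of that heuristic follows, again assuming a hypothetical log format with “SRC=” and “DPT=” fields; the threshold of 20 distinct ports is an arbitrary placeholder to tune for your own environment.

```python
# Minimal sketch: flag source IPs that hit many distinct destination ports,
# a common signature of a port scan. The "SRC=" and "DPT=" fields are a
# hypothetical log format; adapt them to your firewall's logs.
import re
from collections import defaultdict

ENTRY = re.compile(r"SRC=(?P<src>\S+).*\bDPT=(?P<dport>\d+)")


def likely_scanners(log_path, port_threshold=20):
    """Return {source_ip: ports_probed} for sources touching many distinct ports."""
    ports_by_src = defaultdict(set)
    with open(log_path) as log:
        for line in log:
            match = ENTRY.search(line)
            if match:
                ports_by_src[match.group("src")].add(int(match.group("dport")))
    return {src: ports for src, ports in ports_by_src.items() if len(ports) >= port_threshold}


if __name__ == "__main__":
    for src, ports in likely_scanners("firewall.log").items():
        print(f"Possible port scan from {src}: {len(ports)} distinct ports probed")
```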

Understand Traffic Patterns

It is easy to identify whether the network is compromised when you know its traffic patterns. Knowing the regular bandwidth usage and the number of connections or packets transmitted per second gives a good idea of the normal traffic on the network. When these rates are much higher than normal, it is time to examine the network for a possible security breach. To get more information about the nature and source of the attack, you can go back to the log files again.
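As an illustration, here is a minimal Python sketch that compares current rates against a baseline you have measured yourself. The baseline figures and the 2x threshold are placeholders, not recommendations; real monitoring tools use far more sophisticated baselining.

```python
# Minimal sketch: compare current traffic rates against a known-normal baseline
# and flag large deviations. The baseline numbers and the factor are placeholders;
# use figures measured on your own network.
BASELINE = {
    "mbps": 120.0,               # typical bandwidth usage
    "packets_per_sec": 9_000,    # typical packet rate
    "connections_per_sec": 350,  # typical new-connection rate
}


def traffic_anomalies(current, baseline=BASELINE, factor=2.0):
    """Return metrics whose current value exceeds the baseline by the given factor."""
    return {
        metric: value
        for metric, value in current.items()
        if metric in baseline and value > baseline[metric] * factor
    }


if __name__ == "__main__":
    sample = {"mbps": 310.0, "packets_per_sec": 8_500, "connections_per_sec": 900}
    for metric, value in traffic_anomalies(sample).items():
        print(f"Unusual {metric}: {value} (baseline {BASELINE[metric]})")
```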

Intrusion Detection Systems

An intrusion detection system constantly monitors the network and alerts you when there is a breach. It monitors user and system activity, assesses the integrity of critical information, analyzes traffic patterns, audits the system to identify configuration problems and the vulnerabilities they introduce, and provides statistical analysis of patterns that match previous attacks. It raises an alert immediately after it detects an attack. However, it does not block traffic, even if the request is from an unauthorized source.

The post How to Identify Potential Malicious Attacks on Firewalls appeared first on Cloud News Daily.

N3N to Exhibit at @ThingsExpo | @N3N_IoT #IoT #IIoT #Sensors #SmartCities

SYS-CON Events announced today that N3N will exhibit at SYS-CON’s @ThingsExpo, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
N3N’s solutions increase the effectiveness of operations and control centers, increase the value of IoT investments, and facilitate real-time operational decision making. N3N enables operations teams with a four-dimensional digital “big board” that consolidates real-time live video feeds alongside IoT sensor data and analytics insights onto a single, holistic display, focusing attention on what matters, when it matters.


[session] The Migration of Legacy Solutions to the Cloud | @CloudExpo @Metavine #BI #DX #Cloud

Today companies are looking to achieve cloud-first digital agility to reduce time-to-market, optimize utilization of resources, and rapidly deliver disruptive business solutions. However, leveraging the benefits of cloud deployments can be complicated for companies with extensive legacy computing environments.
In his session at 21st Cloud Expo, Craig Sproule, founder and CEO of Metavine, will outline the challenges enterprises face in migrating legacy solutions to the cloud. He will also present an approach to address those challenges, accelerate the process, and minimize the risk of the migration.


More shared responsibility confusion among cloud hoppers, Barracuda notes

Another day, another research study which reveals the benefits of the public cloud tinged with security concerns.

This time, a report from Barracuda Networks has shown that while respondents – 300 IT decision makers from companies across the US – expect the percentage of their infrastructure in the public cloud to almost double in five years, three quarters (74%) say security concerns restrict their organisations’ migration.

Yet perhaps the most worrying aspect of the research was the confusion around the shared responsibility model of cloud computing. More than three quarters (77%) of those polled say public cloud providers are responsible for securing customer data in the cloud, while 68% believe vendors are responsible for securing customer applications as well.

This is not the first time this publication – or indeed Barracuda – has noted the disparity. Back in July, a report from the company found similar misconceptions. It is worth repeating what Amazon Web Services (AWS) and Microsoft, the two leading companies in the space, have to say on the subject.

AWS describes the split as the vendor being responsible for security ‘of’ the cloud – compute, storage, networking, and so forth – and the customer for security ‘in’ the cloud, covering customer data, applications, and identity and access management. For Microsoft, it is a question of differentiating between software, infrastructure, and platform as a service: SaaS places the most responsibility on the provider, decreasing through PaaS and IaaS down to on-premise, which is of course entirely the customer’s responsibility – though data classification is always the responsibility of the user.

Here, the issues do not appear to have changed, and Barracuda makes a series of recommendations to organisations: partner with third-party security vendors that support a wide range of ecosystems for multi-cloud scenarios – a point Skytap, whom this publication recently featured, also affirmed – and look for vendors that provide a common management scheme. Naturally, Barracuda is adept at each of these scenarios.

“This survey confirms what we are hearing from customers and partners – security remains a key concern for organisations evaluating public cloud, and there’s confusion over where their part of the shared responsibility model begins and ends,” said Tim Jefferson, Barracuda vice president of public cloud, in a statement.

“Many organisations realise that cloud deployments can be inherently more secure than on-premises deployments because cloud providers are collectively investing more into security controls than they could on their own,” added Jefferson. “However, the organisations benefiting most from public cloud are those that understand that their public cloud provider is not responsible for securing data or applications and are augmenting security with support from third-party vendors.”

Organisations losing revenue due to lack of cloud expertise, Rackspace warns

Two in three UK IT decision makers polled in a new study say their organisation is losing out on revenue as they lack specific cloud expertise.

The report, put together by Rackspace and the London School of Economics, polled 950 IT decision makers and 950 IT pros and found 67% of the latter believed they could bring greater innovation to their organisation with the ‘right cloud insight’. 85% said greater expertise within their organisation would help them recoup the return on their cloud investment.

Almost half (46%) of IT decision makers say they find it hard to recruit the best talent to manage their organisation’s clouds, with migration project management, native cloud app development, and cloud security among the skills companies are struggling to hire for. Similarly, competition for talent, an inability to offer competitive salaries, and difficulties in providing sufficient career progression and training were also cited as barriers to recruitment.

With this in mind, almost three quarters (72%) of respondents said they were looking to increase their firm’s cloud usage in the coming five years. The research evidently points to a friction point: 84% of IT decision makers said it took ‘a number of weeks or more’ to train new hires, with 37% opting for ‘months’.

Will Venters, assistant professor of information systems at LSE, said cloud technology had been a ‘victim of its own success’. “As the technology has become ubiquitous among large organisations – and helped them to wrestle back control of sprawling physical IT estates – it has also opened up a huge number of development and innovation opportunities.

“However, to fully realise these opportunities, organisations need to not only have the right expertise in place now, but also have a cloud skills development strategy to ensure they are constantly evolving their IT workforce and training procedures in parallel with the constantly evolving demands of cloud.

“Failure to do so will severely impede the future aspirations of businesses in an increasingly competitive digital market.”

According to Firebrand Training, an IT training and project management course provider, the top five cloud skills organisations – and employees – need to get on board with this year include database and big data, application security, and containers.

The Rackspace research polled executives from the UK, US, Germany, Benelux, Switzerland, Mexico, Singapore, Australia and Hong Kong.