The nature of technology


Cloud Pro

29 Jul, 2019

Changes to the planet are happening at an alarming rate, and it’s no secret that we’re partly to blame. But one way in which we are making positive changes to the Earth is through developing technology that can help us protect our planet. And how best to do this? By understanding exactly what is happening now, so we can help predict and influence future events.

The size and complexity of the ecosystem is one of the biggest challenges when monitoring the planet. While plenty of data has been collected over the years, until recently, it has been almost impossible to get the insights needed to truly understand what’s going on.

This is where cutting edge technologies come in. From large scale global mapping to tracking individual species, big data analysis and deep learning are now being used to analyse the Earth and its life.

Macro scale – mapping and modelling the Earth

Forests and oceans cover the majority of the planet. Their health is crucial to the stability of the ecosystem, but monitoring them is no easy feat. Enter digital maps and models. By using big data analytics to create high-resolution maps of the planet, many of which are updated in real time, scientists can literally get the bigger picture.

Global Forest Watch is a UN sponsored web application that uses live satellite images to assess the health of all forests globally. It has provided the tools and data to support projects such as the Amazon Conservation Association, which works to protect biodiversity in the Amazon, and Forest Atlas, which helps manage forest resources in Liberia.

Rezatec, an analytics company, has developed similar forest intelligence, applying data analysis techniques and machine learning algorithms to produce crucial insights. In 2017, Rezatec partnered with the Forestry Corporation of New South Wales to add value to its existing data, combining multiple datasets to create more detailed and useful maps. This gave forest managers the tools to spot pests, diseases and environmental changes early.

Similar projects are underway to understand our oceans. Ocean Data Viewer is another resource supported by the UN that collects and curates a wealth of data on coastal and ocean biodiversity from a variety of trusted scientific research facilities. Multiple global data sets and maps can then be viewed and downloaded for research and analysis.

71% of our planet’s surface is covered by water, but we have better maps of Mars than we have of the ocean floor. Now, thanks to data gathered from ships, buoys and satellites, oceanographers have plenty of information at their disposal. From analysing sediment samples with AI to employing cutting-edge laser techniques, they can now map out features on the seabed such as rivers and underwater volcanoes and detect changes as and when they happen. Advanced analytics, powered by Intel, are helping to make these developments faster and more efficient.

Micro scale – monitoring individual animal species

Decline in an individual species is often an early indicator that something is wrong, as many animals are essential to the Earth’s balance. By analysing animal behaviour, scientists can give governments, business leaders and the global community the tools they need to encourage species to thrive.

From counting whales from space to tagging elephants with AI trackers, there are plenty of new and exciting technological developments giving scientists more insight into our wildlife than ever before. However, it’s the smaller, more elusive creatures that can be the trickiest to understand. Luckily, technologies are now being developed to provide solutions for these species.

Take bees, for example. Through the pollination of crops, they perform an essential role in the global economy. However, their numbers are in worrying decline. In 2016, Intel teamed up with Australia’s Commonwealth Scientific and Industrial Research Organization on a groundbreaking bee-tracking project. Tiny RFID ‘backpacks’ were attached to 10,000 Tasmanian bees to monitor their every move. Meanwhile, their hives were fitted with an Intel Edison board to collect data on internal conditions and honey production. From the data gathered, scientists deduced that the use of pesticides, climate change and the loss of wildflower habitats were all contributing factors to the species’ decline.

Bats are another crucial, but elusive species. They are a great indicator of biodiversity, the loss of which has been likened to “burning the library of life”. As bats only thrive when insect species are in abundance, understanding their behaviour and numbers can help scientists assess the health of the local environment.

In an Intel and UCL project, automatic smart ‘Echo Box’ detectors were fitted around the Queen Elizabeth Olympic Park in London. These boxes picked up ultrasonic bat calls, allowing scientists to acoustically track bats in the area. The Intel edge processor in each box then processed and converted the sound files into visual representations of the calls and deep learning algorithms analysed and logged each pattern. On some nights, over 20,000 calls were detected, indicating a reassuringly high level of bat activity.
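The audio pipeline described above – ultrasonic recordings turned into visual representations for deep learning models to classify – boils down to computing a spectrogram. A minimal sketch in Python using only NumPy, with a synthetic chirp standing in for a real bat call:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a windowed short-time Fourier transform."""
    window = np.hanning(frame_len)
    frames = [signal[i:i + frame_len] * window
              for i in range(0, len(signal) - frame_len + 1, hop)]
    # One row of FFT magnitudes per frame; transpose so rows are frequency bins
    return np.abs(np.fft.rfft(np.stack(frames), axis=1)).T

# Synthetic stand-in for an ultrasonic call: a rising-frequency chirp
t = np.linspace(0, 1, 8000, endpoint=False)
chirp = np.sin(2 * np.pi * (500 + 1500 * t) * t)

spec = spectrogram(chirp)
print(spec.shape)  # (frequency bins, time frames)
```

Each column of the resulting array is the frequency content of one short time slice; stacked together, the columns form the image that a deep learning model would then classify.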

All of these projects highlight the amazing capabilities and flexibilities of today’s technology. From big data analysis to deep learning, the same groundbreaking tools used to empower businesses globally are being applied to make a real difference to the health of the planet.


Gartner argues Amazon holds almost half of cloud infrastructure market in latest analysis

As the largest players have published their most recent financial results, Gartner has weighed up the situation and looked into its crystal ball – and has given Amazon a greater advantage than before.

The analyst firm has put together its latest forecast around the global infrastructure as a service (IaaS) market and found growth of 31.3% in 2018. The worldwide IaaS market was rated at $32.4 billion (£26.2bn) last year, up from $24.7bn in 2017.

Amazon Web Services (AWS) was pegged by Gartner at having almost half (47.8%) of total market share, significantly ahead of Microsoft with 15.5%. Alibaba takes the bronze medal on 7.7%, with Google (4.0%) and IBM (1.8%) rounding off the top five. It's worth noting however that Gartner's 2018 IaaS Magic Quadrant put Google in as a leader, alongside AWS and Microsoft, for the first time.

Not surprisingly, Amazon shows the lowest growth of the four largest players between 2017 and 2018, with Alibaba (92.6% growth) taking the honours. Microsoft (60.9%) and Google (60.2%) grew at comparable rates, with Amazon (26.8%) trailing.

This makes for interesting comparison with another analyst, Synergy Research. In its most recently published figures, the company pins AWS as holding a third of the market, with Microsoft and Google at 16% and 8% share respectively.

Nevertheless, both analyst firms see a similar theme when it comes to the wider market. “Despite strong growth across the board, the cloud market’s consolidation favours the large and dominant providers, with smaller and niche providers losing share,” said Sid Nag, Gartner research vice president. “Only those providers who invest capital expenditure in building out data centres at scale across multiple regions will succeed and continue to capture market share.

“Google’s cloud offering is something to keep an eye on with its new leadership focus on customers and shift toward becoming a more enterprise-geared offering,” added Nag.

Synergy posits that the law of large numbers has taken effect in recent quarters; the growth among the largest vendors, while still strong, simply could not continue at the same pace. Yet as this publication put it when Amazon and Google’s results were released, it is not so much big numbers but big expectations. Amazon dipped on its $8bn quarter for AWS and got relatively pilloried, while Google Cloud got praise for its first $8bn run rate.

Gartner recommends that for those still looking to take a slice of the cloud infrastructure pie, particularly managed service providers, vertical industries and partnerships should be a key focus.


AWS and Google Cloud earnings beget laws of large numbers – and expectations – for cloud revenue

It is not only the law of large numbers which applies in cloud revenues, but also the law of large expectations. Amazon Web Services (AWS) reported revenues of $8.38 billion (£6.7bn) for its most recent quarter, although a slight dip in growth drew pessimism from analysts. Google Cloud, meanwhile, disclosed an annual run rate of $8 billion in its results, to a much friendlier reception.

Amazon’s downturn with AWS amounted to 37% growth year on year, compared with 49% the previous year. AWS profit had still risen almost 30% year over year, with the cloud infrastructure arm now comprising more than 13% of Amazon’s total revenues.

Google, meanwhile, saw its ‘other revenues’ – of which Google Cloud is a part – climb to $6.18bn (£4.97bn) for the most recent quarter, up almost 40% year on year. Google does not disclose specific revenues around its cloud suite – for the second successive quarter, an analyst asked this very question – yet the noises were all good from Alphabet’s senior management.

“Customers are choosing Google Cloud for a variety of reasons,” Google CEO Sundar Pichai told analysts in what was described as ‘another strong quarter’ for the company. Pichai cited reliability and uptime, flexibility, scalable data management, and artificial intelligence and machine learning as key areas of differentiation.

Industry watchers will not be particularly surprised by these features, as it has been Google’s message for a while. In February Google Cloud chief exec Thomas Kurian – in his first major speaking gig since taking over the role – cited five ways Google differentiated from its hyperscale rivals, which essentially amounted to the above four alongside hybrid and multi-cloud capability.

Of the 42 bullet points which represented Amazon’s highlights over the quarter, AWS commanded 11 of them. Particular developments of note included the general release of Amazon’s managed blockchain service, as well as a major new customer in the shape of NASCAR. The quarter was more product-heavy from an AWS perspective than usual, including the launch of managed machine learning services Personalize and Textract, as well as AWS Security Hub, which gives customers a central place to manage security and compliance across AWS environments.

From Google’s side, key highlights included two acquisitions – storage provider Elastifile and business intelligence platform provider Looker – as well as the launch of a new data centre region in Osaka.

Ruth Porat, chief financial officer at Alphabet, added a little more context to the reporting. “[We’re] pleased with the performance of both [Google Cloud Platform] and G Suite,” she said. “Growth in GCP was led by strong customer demand for our compute and data analytics products and G Suite continues to deliver strong growth. Overall, GCP remains one of the fastest growing businesses in Alphabet, and we’re really pleased with how the team is executing on both.”

Amazon SVP and chief financial officer Brian Olsavsky told analysts it was a ‘really strong quarter’ for AWS and batted away concerns of a slightly lower performance. “We’ve been pretty transparent with our AWS revenue and income numbers – we’ve been breaking it out [from] 2015 and we’re very happy with the growth in absolute dollar terms,” said Olsavsky. “We’re seeing a pick-up from customers and their usage, their increased pace of enterprise migration, [and] increased adoption of our services, especially machine learning services.

“Continually again, AWS is being chosen as a partner to many companies because of our leadership position both in technology, our vibrant partner ecosystem, and also the stronger security that we offer,” Olsavsky added.

As far as this translates to the wider industry, the slight fall in yearly growth may have been expected. Indeed, analyst firm Synergy Research has been citing the law of large numbers in recent quarters; more than 100% growth across the hyperscale cloud providers simply could not go on forever. In terms of market share, Synergy notes AWS remains at a third (33%) of the market, with Microsoft at 16% and Google 8% respectively.

You can read the full Amazon earnings release here and the full Alphabet report here.


AWS and Azure take up half of the cloud market


Bobby Hellard

26 Jul, 2019

Spending on cloud services has grown almost 40% in the second quarter of 2019, with AWS and Microsoft Azure claiming half of the market.

Overall, the cloud market is heading for a worldwide revenue run rate of $100 billion per year, according to Synergy Research Group, with AWS taking 33% of that and Microsoft some way off in second with 16%.

AWS actually posted a slight slip in growth, with net sales up 37% compared to the same period last year. Despite climbing from $6.105 billion to $8.381 billion against the second quarter of 2018, it’s the first time in over five years that its growth rate has dropped below 40%.

However, not only is AWS still a large chunk of Amazon’s overall revenue (13%), it’s still bigger than the next four providers, Microsoft, Google, Alibaba and IBM, combined.

“When quarterly spend on cloud services is mapped out for the last twelve quarters, we are pretty much looking at a steep, straight-line growth profile,” said John Dinsdale, a chief analyst at Synergy.

“Amazon is maintaining its leadership position in the market, though growth at Microsoft is also noteworthy. In early 2016 Microsoft was less than a quarter the size of Amazon in this market, while today it is getting close to being half the size. These two cloud providers alone account for half of all money spent on cloud infrastructure services, which is impressive for such a high-growth, strategically important market.”

Microsoft’s public cloud computing platform, Azure, has firmly established itself as the second-place cloud provider, recently posting revenue growth of 63%.

Google Cloud, which holds 8% of the overall market, is generating $8 billion a year in run-rate according to parent company Alphabet’s latest earnings. The company also plans to invest in its sales force as it looks to close the gap on Microsoft and AWS.

IBM recently reported a drop in revenue, partly attributed to its acquisition of Red Hat, but there is a suggestion that the open source specialist is a big part of IBM’s cloud strategy.

Alibaba, Salesforce, Oracle, Tencent and Rackspace made up the remaining market share with a combined 14%.

A comprehensive guide to selecting SaaS project monitoring tools

Project monitoring is one of the most important aspects of the SaaS development process. Unfortunately, it’s also one of the most neglected aspects. Given how complex and problem-prone this process can be, proceeding without careful scrutiny can create a lot of liabilities.

When monitoring is a priority in a SaaS project, system operation engineers are able to identify and address problems before end users even notice. Engineers can conduct workload and performance testing, estimate infrastructure performance, and gauge system accessibility by using monitoring software. All of this allows them to accommodate larger workloads from new users so system performance isn’t compromised. Additionally, project monitoring tools are necessary for calculating infrastructure costs when using a public cloud.

Evaluating public cloud monitoring tools

Two of the most widely used public cloud platforms are Amazon Web Services and Microsoft Azure, each with its own native monitoring tooling. Both have strengths and weaknesses, meaning it’s up to each user to choose the software that best fits their needs.

When working with an AWS infrastructure, the only tool necessary is Amazon CloudWatch, which works like an archive for metrics created by AWS. It persistently keeps track of the resources and applications running on AWS, and it gathers data points on things like disk operations and CPU usage. If specific metrics reach preassigned limits, users receive a notification and have the option to automatically initiate Auto Scaling. Users can also create their own custom metrics and track them accordingly, as well as access CloudWatch through several convenient interfaces.

There’s a lot to like about this monitoring tool, but it has notable limitations. For instance, it can’t aggregate data from some locations or send alarms for more than five different metrics. Users also can’t manually remove metrics, and in some cases, obtaining them can be a lengthy process.

Despite this, however, CloudWatch is a popular monitoring tool because it excels at collecting data. The tool is designed to track and document what matters most to administrators, and those records can be kept for years to identify long-term patterns. CloudWatch handles monitoring, but more importantly, it provides insights.
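CloudWatch’s notification model works on consecutive breaches: an alarm fires only when a metric stays beyond its threshold for a set number of evaluation periods. The sketch below illustrates that logic in plain Python – it is a conceptual stand-in, not a call to the real AWS API:

```python
def evaluate_alarm(datapoints, threshold, periods_to_alarm):
    """Return 'ALARM' only if the last `periods_to_alarm` datapoints all
    exceed the threshold, mirroring the consecutive-breach rule."""
    if len(datapoints) < periods_to_alarm:
        return "INSUFFICIENT_DATA"
    recent = datapoints[-periods_to_alarm:]
    return "ALARM" if all(v > threshold for v in recent) else "OK"

# CPU utilisation samples (%): a brief spike, then a sustained breach
cpu = [42, 55, 91, 60, 88, 92, 95]
print(evaluate_alarm(cpu, threshold=80, periods_to_alarm=3))  # ALARM
```

Requiring several consecutive breaches is what stops a single momentary spike – like the lone 91% reading above – from paging an engineer at night.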

Microsoft Azure is similar. It provides infrastructure metrics as well as Azure service logs. Data can be collected from application logs, activity logs, and performance counters. Users can then engage with the data in various ways. It can be sent to a third party for analysis or stored for 90 days in an archive (or routed to long-term storage). It can also be routed to an application like Application Insights or Power BI for in-depth analysis.

Much like CloudWatch, Azure Monitor can also be set up to send alerts when metrics reach certain thresholds. Additionally, there are multiple ways to access Azure Monitor’s tools. Users may not find the solution perfect, but it provides the functionality necessary to make project monitoring productive.

Evaluating private cloud monitoring tools

Applications that rely on a private cloud for infrastructure are harder to monitor, but it’s not impossible. There are several specialised solutions on the market, such as Nagios Core and Zabbix.

Nagios Core is a free, open-source product. It offers a number of tools that let users monitor services as well as Windows and Linux hosts. Users have a fair amount of control over which metrics they want to follow, and the system can also be extended using common languages like C++ and PHP. Like all monitoring tools, this one also sends alerts.

The only real drawback of Nagios Core is that it doesn’t offer a graphical user interface for adjusting system settings to customers’ needs. Instead, the configuration files have to be edited manually. Most users are able to look past this flaw, given that Nagios Core supports a wide range of free and commercial plug-ins and, overall, is a highly customisable solution.
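Part of that customisability comes from Nagios Core’s plugin contract: a check is any executable that prints a single status line and exits with 0 (OK), 1 (WARNING), 2 (CRITICAL) or 3 (UNKNOWN). A hypothetical check sketched in Python – the labels and thresholds here are illustrative:

```python
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def check_threshold(label, value, warn, crit, unit="%"):
    """Produce a Nagios-style check result: (exit_code, status_line)."""
    if value >= crit:
        return CRITICAL, f"{label} CRITICAL - {value}{unit}"
    if value >= warn:
        return WARNING, f"{label} WARNING - {value}{unit}"
    return OK, f"{label} OK - {value}{unit}"

# A real plugin would measure something (disk usage, load, a TCP port),
# print the status line and pass the code to sys.exit() for Nagios to act on.
code, line = check_threshold("DISK", 85, warn=80, crit=90)
print(code, line)  # 1 DISK WARNING - 85%
```

Because the contract is just stdout plus an exit code, checks can be written in any language, which is why the plug-in ecosystem is so broad.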

Zabbix is also worthwhile monitoring software. It consists of a monitoring server that collects data, runs analyses, and sends alerts. It also includes a database, a web interface, and daemon agents that can run in either passive or active modes.

Unlike Nagios Core, Zabbix excels in terms of available customisations and the ease of implementing them. Thanks to intuitive templates, the initial configuration time is minimal, and additional templates can be applied later to further adapt the system.

Choosing the right monitoring tool

Ultimately, different users need different things from their monitoring tools. Use these tips to make the right choice:

  • Take a trial of any commercial software to explore whether it has any problematic restrictions or limitations
  • Search for tools that track the most important metrics, not the largest number of possible metrics
  • Consider the total cost of tool ownership, including the licence and ongoing system maintenance

After the system is up and running, administrators need to manage it effectively. That includes testing alert functionality and responding promptly. It’s also important to apply the same metrics to different environments, including production, staging, and testing.
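One way to keep metrics consistent across production, staging and testing is to define them once and let each environment override only its thresholds. A hypothetical sketch of that pattern (the metric names and limits are made up):

```python
# Shared metric definitions; per-environment overrides change only thresholds
BASE_METRICS = {
    "cpu_percent": {"warn": 70, "crit": 90},
    "error_rate": {"warn": 0.01, "crit": 0.05},
    "p95_latency_ms": {"warn": 300, "crit": 800},
}

ENV_OVERRIDES = {
    "production": {},  # strictest: use the base thresholds as-is
    "staging": {"p95_latency_ms": {"warn": 500, "crit": 1200}},
    "testing": {"error_rate": {"warn": 0.05, "crit": 0.20}},
}

def metrics_for(env):
    """Merge the base definitions with an environment's overrides."""
    merged = {name: dict(limits) for name, limits in BASE_METRICS.items()}
    merged.update(ENV_OVERRIDES.get(env, {}))
    return merged

# Every environment tracks the same metric set, so results stay comparable
assert metrics_for("staging").keys() == metrics_for("production").keys()
```

Because every environment watches the same metric set, a regression that appears in staging can be compared directly against the production baseline.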

As helpful as monitoring software may be, its success depends on the people behind the technology. They must evaluate public and third-party solutions carefully, implement them correctly, and utilise them effectively. The right tool in the right hands makes project monitoring relatively easy, and a lot more valuable in the process.


Sistema Plastics uses Epicor to iron out inventory woes


Keumars Afifi-Sabet

25 Jul, 2019

For manufacturing companies specialising in fast-moving-consumer-goods (FMCG), the need for reliable enterprise resource planning (ERP) software is paramount. Firms look to these systems to handle many aspects of day-to-day operations, from staying on top of inventory to managing the sales process.

Sistema Plastics runs a single manufacturing site in Auckland, New Zealand, but ships worldwide through a series of third-party retailers, including Amazon and Asda. Indeed, if you peer into any kitchen cabinet you’ll likely find something manufactured by the firm, from microwavable containers to lunchboxes to reusable water bottles.

The company has grown rapidly over the last decade – a single manufacturing run is now in the region of 30,000 to 40,000 units. As a result of this rapid growth, inventory management had started to spiral out of control to the point it was being stored “almost anywhere” with no real way of tracking it, Sistema’s CTO, Greg Heeley, tells Cloud Pro.

Four years ago, the company brought in Epicor’s flagship ERP platform to handle a major transformation in its manufacturing processes.

The system can only do what you tell it to

The challenge Sistema faced at the time, Heeley says, was finding a product that could support the way inventory was configured, as well as finding and managing stock. This highly pressing issue was born of Sistema’s rapid growth coupled with severely restricted physical floor space. Problems deepened when Sistema outgrew its first plant and began opening up several smaller sites. Being spread across multiple locations meant workers would regularly move parts needed at one plant from another, and vice versa.

To compound these issues, employees neglected to feed accurate information into Epicor ERP, such as where the stock was kept and whether it had been moved, making it even more difficult to use the software.

“Putting stock somewhere and telling the system is one thing, but if you then move it and don’t tell the system – that’s something else,” Heeley explains. In light of this, the firm devised procedures around how the stock was recorded and got people trained up to follow the new system.

“There are some areas like that we struggled with; more personnel than system-driven,” he adds. “The system can’t do what you don’t tell it – and we sometimes didn’t tell it what to do.”

It was only in 2016 that a single site large enough to handle the scale of manufacturing operations was found and things finally began to click with the software. This step forward involved putting in automatic inventory systems, among other measures, to ensure all inventory problems were consigned to the past.

The skills shortage bites hard

Looking ahead, Sistema is shifting its focus to grow as a company now that the software underpinning its operations has been tamed. But the problems the firm now faces are unique to a company that both manufactures plastic goods and is based in “the middle of technically nowhere”.

For a company that keeps a close eye on its carbon footprint, being based in New Zealand has proved a massive hindrance. Sistema’s environmental ranking scores must take into account shipping materials in and out, as well as the products manufactured. This is all offset against its external energy consumption, which is proving a battle. Plastic itself, meanwhile, has become stigmatised due to the effects of discarded materials on the environment.

Sistema’s need to hire more high-skilled staff, however, is chief among the firm’s concerns, and this isn’t helped by the company’s location either. New Zealand’s economy is based predominantly on agriculture and tourism, not manufacturing or engineering. Attracting people to work in “the middle of technically nowhere”, therefore, is something the firm will have to look at addressing in the coming years.

There’s scope for installing human resources (HR) modules in ERP software to assist in talent acquisition. More often than not, this involves automating processes like payroll and benefits to give hiring managers more time to focus on finding talented workers. ERP can also make a difference to a firm’s sustainability goals, with greater visibility over stock allowing Sistema to gain full control over the products ordered, consumed and re-used.

Looking ahead, Sistema is seeking to further buy into document management and automate a host of processes by implementing Epicor’s DocStar enterprise content management platform. While the firm has adopted Electronic Data Interchange (EDI), a digital exchange of business documents in a standardised format, many of its customers haven’t. This means the manufacturer often receives orders that are 50 to 60 times longer than they should be, which must then be manually entered into its systems. Epicor’s software, Heeley claims, can help the company get around this.

Automating accounts payable (AP) document scanning, as well as document record handling, are also on the horizon. Meanwhile, Sistema has ambitions to digitise standard operating procedures for the shop floor and to index visually-compelling videos that teach employees how to perform tasks. Heeley also touts robotics, predictive analytics, and edge computing as areas he’d be keen to explore, but what he does next will largely depend on what makes the most sense for the company.

“It does become a challenge of we haven’t got infinite resources, and there aren’t infinite resources in the country either,” he continues.

“So we definitely have to be selective about the projects that we take on, and [we need to] know which one’s going to produce the best results in the short term. We will get to all of them eventually, but it’s just about prioritising.”

BT bets on Ubuntu OpenStack to deliver 5G pledge


Bobby Hellard

25 Jul, 2019

BT has announced a partnership with Canonical to develop and deploy its next-generation 5G core network.

The deal will see Canonical offer up its open-source virtual infrastructure manager (VIM) platform so that BT can run network applications as code and transition away from a hardware-based network to one that’s virtualised.

This open source, cloud-based approach will help BT to quickly deploy new services and allow it to stay ahead of the demand for 5G and Fibre to the Premises (FTTP), the company said.

Canonical operates its own distribution of OpenStack, a bundle of separate open source projects connected through APIs. When applied to BT’s own infrastructure, this will enable the separation of network hardware and software, turning core components into software applications so they can be updated faster and continuously integrated.

The result is that different network applications can share the same hardware across datacentres for a stronger and more resilient network. For BT, this means the speed at which its software can be updated could potentially lead to a new way of developing 5G services where it can build and deploy within a matter of weeks.

“BT has recognised the efficiency, flexibility and innovation afforded by an open architecture, and realises the value of such an approach in enabling its delivery of new 5G services,” said Mark Shuttleworth, CEO of Canonical. “We’re delighted to be working with them to deliver the foundation to this approach, which will underpin BT’s 5G strategy.”

In May, BT-owned EE switched on its 5G services, making London, Birmingham, Cardiff, Manchester, Edinburgh and Belfast the first places in the UK to experience the benefits of the next generation of mobile connectivity.

BT’s cloud-based full 5G core will be introduced from 2022 with its cloud-based architecture enabling ultra-reliable low latency communications and multi-gigabit-per-second speeds. The company said this phase of 5G will enable critical applications like the real-time traffic management of autonomous vehicles, massive sensor networks with millions of devices measuring air quality across the entire country and a “tactile internet”, where a sense of touch can be added to remote real-time interactions.

The rise of Office 365 phishing scams: How one compromised account can cost millions

More businesses than ever are now choosing to use cloud services. Despite the many benefits of doing so, such as reduced maintenance costs and improved scalability, there are a number of cyber security risks that organisations need to be aware of – including sophisticated phishing scams that intentionally target SaaS applications.

With more than 155 million active commercial monthly users, Office 365 is among the most widely used cloud-hosted email and productivity suites – and consequently a prime target for cybercriminals. As threats intensify, organisations must ensure that their cloud security is suitably hardened.

Beware of BEC attacks

A business email compromise, or BEC for short, is a type of advanced phishing scam that has become prevalent in recent years. Such attacks involve cybercriminals posing as an employee, usually a C-level executive, in order to trick an associate of that person into wiring payment for goods or services into a substitute bank account.

To conduct a BEC attack, criminals will seek to either compromise a user’s account through a traditional credential harvesting phishing attack, or spoof the person’s email address so that it appears almost identical to the targeted account. Attempts will then be made to leverage the person’s identity to send phishing emails to others.
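Spoofed addresses that look “almost identical” to the real one can often be caught mechanically, by flagging sender domains within a small edit distance of the organisation’s genuine domain. A simplified sketch – the domains are invented, and a production filter would combine this with allow-lists and other signals:

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def is_lookalike(sender_domain, real_domain, max_distance=2):
    """Flag domains close to, but not equal to, the genuine one."""
    d = edit_distance(sender_domain, real_domain)
    return 0 < d <= max_distance

print(is_lookalike("examp1e-corp.com", "example-corp.com"))  # True
print(is_lookalike("example-corp.com", "example-corp.com"))  # False
```

Note the classic swap of the letter “l” for the digit “1” – exactly one edit away, which is why a distance threshold of one or two catches most cosmetic forgeries.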

Italian football club Lazio is one of the most high-profile victims of a BEC attack, having inadvertently wired a £2 million player transfer fee to a cybercriminal instead of the player’s former club. However, the fact that BEC attacks are reported only a fraction of the time – largely due to the reputational damage they can inflict – means that they are often left to thrive in anonymity.

A smarter breed of phish

To dupe their targets, BEC fraudsters conduct meticulous research. This entails monitoring company news, investigating supply chains and technology usage, plus undertaking reconnaissance – with any information gathered used to design more creative, custom and elaborate campaigns that are difficult for recipients to identify as malevolent.

Gone are the days of brazen scams with email subject lines such as ‘Congratulations you’re a winner’. To compromise Office 365 users, for example, criminals now use a range of fake requests and notifications, such as security alerts, non-delivery reports and meeting appointments. To instigate BEC attacks, hackers will seek to build victims’ trust by sending a sequence of emails.

It’s not just suspicious links that people need to look out for. Techniques to distribute malicious attachments are also becoming more sophisticated. One recently identified technique involves the use of PowerShell to inject malware when users preview files in Outlook – now it’s not always necessary to open a document to trigger an infection.

To evade firewalls and email gateways, hackers are hosting content on SharePoint and other trusted platforms. This makes unknown content harder to blacklist. Another attack technique, known as ‘NoRelationship’, exploits a weakness in Microsoft’s file filtering technology.

How to protect your employees

If your business subscribes to or is thinking of subscribing to Office 365 or another cloud service, then protecting against BEC and other phishing threats should be an important security consideration. Employee education is an important step but shouldn’t be relied upon to fully mitigate risks. People continue to be a weak link in the security chain.

Enforcing multi-factor authentication across all user accounts is highly recommended, providing an extra layer of protection should password credentials be cracked or stolen. Email authentication standards such as SPF and DMARC can also be implemented to help reduce the receipt of spoofed emails.
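For illustration, SPF and DMARC policies are published as DNS TXT records. The zone entries below are a hedged sketch using a placeholder domain (example.com) and reporting address; the SPF `include` shown is the one commonly used for Office 365 tenants:

```
; SPF: authorise only these servers to send mail for example.com
example.com.        IN TXT "v=spf1 include:spf.protection.outlook.com -all"

; DMARC: ask receivers to quarantine mail failing SPF/DKIM alignment,
; and send aggregate reports to the (placeholder) address below
_dmarc.example.com. IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

A cautious rollout typically starts with `p=none` (monitor only) and tightens to `quarantine` or `reject` once the reports confirm legitimate senders are covered.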

Key to securing Office 365 and other cloud environments is greater threat visibility – after all, businesses cannot secure what they cannot see. Next-gen SIEM (Security Information and Event Management) and Endpoint Detection and Response (EDR) tools that support centralised log monitoring, behavioural analysis and incident response can significantly reduce the time it takes to identify suspicious events, and help contain and shut down attacks before they spread.

Examples of behaviours that could indicate something anomalous include: account and infrastructure changes, unauthorised network connections, privilege escalation, and automatic forwarding of emails to other accounts.
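The last of those – auto-forwarding rules, a classic sign of a compromised mailbox – lends itself to a simple detection rule. The sketch below shows the kind of check a SIEM might apply; the event structure is a simplified, hypothetical rendering of an audit-log entry, not an exact Office 365 schema, and the domain is a placeholder:

```python
# Minimal sketch of a SIEM-style rule: flag audit-log events where a
# newly created inbox rule forwards mail outside the organisation.
# Event fields below are simplified, hypothetical examples.

INTERNAL_DOMAIN = "example.com"  # placeholder organisation domain

def is_suspicious_forwarding(event: dict) -> bool:
    """True if the event creates a rule forwarding to an external address."""
    if event.get("Operation") != "New-InboxRule":
        return False
    forward_to = event.get("Parameters", {}).get("ForwardTo", "")
    return bool(forward_to) and not forward_to.endswith("@" + INTERNAL_DOMAIN)

events = [
    {"Operation": "New-InboxRule",
     "Parameters": {"ForwardTo": "attacker@evil.example.net"}},   # external: alert
    {"Operation": "New-InboxRule",
     "Parameters": {"ForwardTo": "colleague@example.com"}},       # internal: fine
    {"Operation": "MailItemsAccessed", "Parameters": {}},         # unrelated event
]

alerts = [e for e in events if is_suspicious_forwarding(e)]
print(len(alerts))  # 1
```

In practice such a rule would be one of many correlated signals – combined with sign-in location, privilege changes and the other behaviours listed above – rather than an alert on its own.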

Embracing the cloud with confidence

While it’s impossible to completely eliminate the possibility of employees falling victim to BEC scams or opening malicious attachments, a multi-layered approach to cyber security incorporating prevention, detection and response will certainly go a long way towards reducing the risk. By implementing just a few extra controls and procedures, your business will be better placed to avoid operational disruption, as well as serious financial and reputational damage, should an incident occur.


Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

CircleCI secures $56 million series D funding to further expand CI/CD focus

CircleCI has secured $56 million (£44.8m) in series D funding to further its presence and product – with one investor arguing the company is the ‘DevOps standard’.

The San Francisco-based provider of continuous integration and delivery software has taken its total funding to $115.5 million, having most recently raised $31m for its series C in January last year.

CloudTech explored CircleCI’s proposition at its series C round, around automating build, test and deploy processes across organisations’ development cycles, particularly its combination of ‘speed and solidity’.

Since then, the company has grown considerably: monthly job count has more than quadrupled, 75 new employees were added in 2019, and a first international office has opened in Tokyo. Hubs have been added in Boston and Denver, while more than 900 ‘orbs’ – reusable packages that condense commands and jobs into single lines of configuration – have been created.
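To illustrate what an orb looks like in practice, here is a minimal CircleCI config that pulls in a published orb; the orb name and version are shown for illustration and may differ from current releases:

```yaml
version: 2.1

orbs:
  # Import a published orb from the registry (namespace/name@version)
  node: circleci/node@4.7

workflows:
  build:
    jobs:
      # One reusable line replaces a hand-written install-and-test job
      - node/test
```

The appeal is that the install, cache and test steps live inside the orb, so each project’s config stays to a few lines.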

Various research in recent months has explored both the technical and cultural aspect of DevOps implementations. According to Trend Micro, in a study released at the start of this month, more than three quarters (77%) of those polled said developers, security and operations teams needed to be in closer contact.

As this publication has variously covered, as architectures and cloud technologies have improved, customer expectations have risen with them. “Every company today relies on software to continuously improve its products and business in order to keep up with consumers’ evolving needs and expectations,” said Rob Zuber, CircleCI chief technology officer. “We believe humans should never have to wait on machines. This new round of funding will better allow us to provide developers flexibility and control to power their workflows seamlessly.”

The funding round was led by Owl Rock Capital Partners and NextEquity Partners, alongside existing investors Scale Venture Partners, Top Tier Capital, Threshold Ventures, Baseline Ventures, Industry Ventures, Heavybit, and Harrison Metal Capital.


LinkedIn begins ‘multi-year’ migration to Microsoft Azure

LinkedIn has announced the launch of a ‘multi-year migration’ project to Microsoft Azure.

The migration will see LinkedIn shift its infrastructure across, having already used Azure for a variety of services, including video delivery and machine translation across the social network’s news feed.

“Today’s technology landscape makes the need for constant reinvention paramount, especially as we look to scale our infrastructure to drive the next stage of LinkedIn’s growth,” wrote Mohak Shroff, LinkedIn SVP engineering, in a blog post announcing the move. “With the incredible member and business growth we’re seeing, we’ve decided to begin a multi-year migration of all LinkedIn workloads to the public cloud.”

One would have expected, given Microsoft bought LinkedIn for $26.2 billion three years ago, that migrating to Azure would have been part of the plan. Yet Shroff told VentureBeat that this was not quite the case. Shroff noted that Microsoft allowed LinkedIn ‘independence in decision making’, adding that Azure wasn’t particularly pitched to them.

This is implied in the blog post, which notes that the success of Azure, ‘coupled with the opportunity to leverage the relationship we’ve built with Microsoft, made Azure the obvious choice.’ “Moving to Azure will give us access to a wide array of hardware and software innovations, and unprecedented global scale,” added Shroff. “This will position us to focus on areas where we can deliver unique value to our members and customers.

“The cloud holds the future for us and we are confident that Azure is the right platform to build on for years to come.”

Microsoft released its fourth quarter results earlier this month to almost universal praise; the company saw total revenues go up 12% to $33.7 billion (£26.9bn). While revenue for Azure is not specifically disclosed, Microsoft said it climbed 64% compared with the previous year.
