All posts by Connor Jones

Cloud database management set to soar in coming years


Connor Jones

1 Jul, 2019

Driven by the use of databases for analytics and by the ever-popular software as a service (SaaS) model, 75% of all databases will be deployed on or migrated to a cloud platform by 2022, according to Gartner's latest predictions.

The IT analyst house also said that just 5% of these will ever be considered for repatriation to on-premise infrastructure, as businesses continue to realise the benefits of widespread cloud adoption.

"According to inquiries with Gartner clients, organisations are developing and deploying new applications in the cloud and moving existing assets at an increasing rate, and we believe this will continue to increase," said Donald Feinberg, distinguished research vice president at Gartner.

"We also believe this begins with systems for data management solutions for analytics (DMSA) use cases – such as data warehousing, data lakes and other use cases where data is used for analytics, artificial intelligence (AI) and machine learning (ML).

"Increasingly, operational systems are also moving to the cloud, especially with conversion to the SaaS application model."

Research from Gartner shows that worldwide revenue from database management systems grew a significant 18.4% to $46.1 billion, with cloud database management systems accounting for 68% of that growth.

The company also notes that Microsoft and AWS account for more than 75% of the total market growth, indicating a trend towards cloud service providers (CSPs) becoming the new data management platform.

On-premise infrastructure rarely offers built-in support for cloud integration, which is why its growth isn't as vibrant as that of its cloud counterparts. The segment is still growing, but at a much slower rate – and not because of new on-premise deployments, but because of price increases and forced upgrades.

"Ultimately what this shows is that the prominence of the CSP infrastructure, its native offerings, and the third-party offerings that run on them is assured," said Feinberg. "A recent Gartner cloud adoption survey showed that of those on the public cloud, 81% were using more than one CSP.

"The cloud ecosystem is expanding beyond the scope of a single CSP – to multiple CSPs – for most cloud consumers," he added.

The UK is adopting the cloud faster than most of the EU, according to figures from Eurostat published late last year.

A sixth-place ranking among EU countries for cloud adoption is primarily due to the high rate of British enterprises using some form of cloud service.

British businesses beat the EU average in this regard by a significant margin, with 41.9% using at least one cloud service compared to the average of 26.2% – a figure beaten only by a handful of Nordic nations, Denmark, Sweden and Finland among them.

Microsoft warns of remote execution exploit in Excel


Connor Jones

27 Jun, 2019

A new vulnerability in a Microsoft Excel business intelligence tool has been found to give attackers an opportunity to remotely launch malware and take over a user’s system.

Researchers at Mimecast discovered a vulnerability in Power Query (PQ), a powerful and scalable business intelligence tool in Microsoft Excel that allows users to integrate spreadsheets with other areas of their business, such as external databases, text documents and web pages.

The vulnerability is based on Dynamic Data Exchange (DDE), a method of data communication between applications used across the Microsoft Office suite. DDE attacks are nothing new – many successful malware campaigns have used the method to compromise documents – but this particular attack grants perpetrators significant admin privileges.

"In an email attack scenario, an attacker could leverage the DDE protocol by sending a specially crafted file to the user and then convincing the user to open the file, typically by way of an enticement in an email," said Microsoft. "The attacker would have to convince the user to disable Protected Mode and click through one or more additional prompts."

Using the exploit, attackers can fingerprint individual systems belonging to victims, allowing them to deliver harmful code that appears harmless to both sandboxes and other security software the victim may be running.

Mimecast researcher Ofir Shlomo also said that the Power Query exploit could be used to launch sophisticated, difficult-to-detect attacks that combine several attack surfaces.

"Using Power Query, attackers could embed malicious content in a separate data source, and then load the content into the spreadsheet when it is opened," said Shlomo in a research blog shared with IT Pro. "The malicious code could be used to drop and execute malware that can compromise the user's machine."

DDE attacks are infamous for targeting enterprises due to their widespread reliance on Microsoft Office software in workplaces around the world.

APT28 and APT37, Russian and North Korean-linked hacking groups respectively, have both used the technique to good effect in recent years, with other groups utilising malformed Word documents for use in spear phishing campaigns.

"Such attacks are usually hard to detect and give threat actors more chances to compromise the victim's host," said Shlomo. "Using the potential weakness in Power Query, attackers could potentially embed any malicious payload that, by design, won't be saved inside the document itself but downloaded from the web when the document is opened."

Mimecast discovered the issue and disclosed it to Microsoft under the company's Coordinated Vulnerability Disclosure process. While Microsoft has yet to offer a fix, it did share a workaround.

Microsoft published an advisory document (advisory 4053440) offering guidance on how to secure applications when they process DDE fields, including instructions on creating custom registry entries for Office and other methods, each with its benefits and drawbacks listed.
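
For Excel specifically, those mitigations come down to registry values that stop workbooks from acting on DDE links automatically. As a rough sketch only – the key path and value names below are assumptions based on advisory 4053440 and vary by Office version, so verify them against the advisory itself – the change could be scripted in Python:

    import winreg

    # Illustrative: Office 2016 stores Excel's options under version key 16.0.
    EXCEL_OPTIONS = r"Software\Microsoft\Office\16.0\Excel\Options"

    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, EXCEL_OPTIONS, 0,
                            winreg.KEY_SET_VALUE) as key:
        # Don't update workbook links (including DDE links) automatically on open.
        winreg.SetValueEx(key, "DontUpdateLinks", 0, winreg.REG_DWORD, 1)
        # Disallow launching DDE servers from Excel.
        winreg.SetValueEx(key, "DDEAllowed", 0, winreg.REG_DWORD, 0)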

"Attackers are looking to subvert the detections that victims have," said Shlomo. "While there is a chance that this kind of attack may be detected over time as threat intelligence is shared between various security experts and information sharing platforms, Mimecast strongly recommends all Microsoft Excel customers implement the workarounds suggested by Microsoft, as the potential threat to these Microsoft users is real and the exploit could be damaging."

NASCAR revs up its video business with AWS


Connor Jones

5 Jun, 2019

The National Association for Stock Car Auto Racing (NASCAR) has partnered with AWS to utilise the cloud giant's artificial intelligence and machine learning tools to automate the database categorisation of 70 years' worth of video.

In the run-up to the airing of its online series 'This Moment in NASCAR History', the sport that packs out deafening stadiums has 18 petabytes of video to migrate to an AWS archive, where the processing will take place.

"Speed and efficiency are key in racing and business, which is why we chose AWS – the cloud with unmatched performance, the most comprehensive set of services, and the fastest pace of innovation – to accelerate our migration to the cloud," said Craig Neeb, executive vice president of innovation and development at NASCAR.

"Leveraging AWS to power our new video series gives our highly engaged fans a historical look at our sport while providing a sneak peek at the initial results of this exciting collaboration," he added.

Using Amazon Rekognition, the platform’s AI-driven image and video analysis tool, NASCAR hopes to automate the tagging of video metadata for its huge catalogue of multimedia to save time searching for specific clips.

Metadata attached to stored multimedia files makes them easier to find in a database. For a given race video, the metadata might include the race date, the competition, the drivers involved, the location and any other information that differentiates the clip from others.
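
To make that concrete, here is a hedged sketch of how an automated tagging job might be started with Amazon Rekognition Video via boto3 – the bucket, file name and confidence threshold are invented for illustration, not taken from NASCAR's setup:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Start an asynchronous label-detection job on an archived clip in S3.
    job = rekognition.start_label_detection(
        Video={"S3Object": {"Bucket": "example-race-archive",
                            "Name": "clips/daytona-finish.mp4"}},
        MinConfidence=80,
    )

    # A real pipeline would wait for the job-completion notification; polling
    # is elided here. Each detected label carries a timestamp, ready to be
    # written into the clip's metadata record.
    result = rekognition.get_label_detection(JobId=job["JobId"])
    for label in result["Labels"]:
        print(label["Timestamp"], label["Label"]["Name"])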

Without such tagging, making a series that joins clips of races from across the years would mean manually searching through petabytes of video – a hugely time-consuming task.

"By using AWS's services, NASCAR expects to save thousands of hours of manual search time each year, and will be able to easily surface flashbacks like Dale Earnhardt Sr.'s 1987 'Pass in the Grass' or Denny Hamlin's 2016 Daytona 500 photo finish, and quickly deliver these to fans via video clips on NASCAR.com and social media channels," read an AWS statement.

NASCAR also plans to use Amazon SageMaker to train deep learning models against its footage spanning decades to enhance the metadata tagging and video analytics capabilities.

The sport will also use Amazon Transcribe, an automatic speech recognition service, to caption and timestamp every word of speech in the archived videos, further improving searchability.
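
A minimal sketch of that step with boto3 follows – the job name, bucket and media format are placeholders:

    import boto3

    transcribe = boto3.client("transcribe", region_name="us-east-1")

    # Kick off an asynchronous transcription job for an archived race clip.
    transcribe.start_transcription_job(
        TranscriptionJobName="race-commentary-0001",  # hypothetical job name
        Media={"MediaFileUri": "s3://example-race-archive/clips/daytona-finish.mp4"},
        MediaFormat="mp4",
        LanguageCode="en-US",
    )
    # The finished transcript JSON includes per-word start and end times,
    # which is what makes every spoken word searchable by timestamp.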

"AWS's unmatched portfolio of cloud services gives NASCAR the most flexible and powerful tools to bring new elements of the sport to live broadcasts of races," said Mike Clayville, vice president, worldwide commercial sales at AWS.

BT partners with Juniper on unified cloud network platform


Connor Jones

3 Jun, 2019

BT has partnered with Juniper Networks, which will provide the core infrastructure underpinning the rollout of BT's upcoming unified cloud networking platform.

The platform will unify BT’s networks including 5G, Wi-Fi and fixed-line into one virtualised service which will enable more efficient infrastructure management and deployment.

The new unified platform will supposedly allow BT to "create new and exciting converged services bringing mobile, Wi-Fi, and fixed network services together".

The platform's infrastructure will be built to a common framework, allowing it to be shared across BT's offices nationally and globally.

The platform will be used across a range of BT's arms, including voice, mobile core, radio/access, ISP, TV and IT services; deploying it company-wide will cut costs and streamline operations.

"This move to a single cloud-driven network infrastructure will enable BT to offer a wider range of services, faster and more efficiently to customers in the UK and around the world," said Neil McRae, chief architect, BT. "We chose Juniper to be our trusted partner to underpin this Network Cloud infrastructure based on the ability to deliver a proven solution immediately, so we can hit the ground running."

«Being able to integrate seamlessly with other partners and solutions and aligning with our roadmap to an automated and programmable network is also important,» he added.

We're told the project will facilitate new applications and workloads for the telecoms giant and evolve existing ones, including converged fixed and mobile services and faster time-to-market for internet access delivery.

"By leveraging the 'beach-front property' it has in central offices around the globe, BT can optimise the business value that 5G's bandwidth and connectivity brings," said Bikash Koley, chief technology officer, Juniper Networks.

"The move to an integrated telco cloud platform brings always-on reliability, along with enhanced automation capabilities, to help improve business continuity and increase time-to-market while doing so in a cost-effective manner," he added.

BT has undergone a change in leadership this year and faces challenges in almost all areas of its business, according to its annual financial overview.

EE has been carrying the telco; it's the only arm of the company posting profits in an "unfavourable telecoms market". BT's revenue slip for the year has been attributed to the decline in traditional landline calls amid a seemingly unrelenting shift to voice over IP.

BT admits that greater investment is needed to capitalise on new business angles such as IoT, cloud and SD-WAN; this will most likely hinder its short-term revenue targets, but it could pay off in the long term.

"Our aim is to deliver the best converged network and be the leader in fixed ultrafast and mobile 5G networks," said BT chief executive Philip Jansen. "We are increasingly confident in the environment for investment in the UK."

EE launched its 5G network last week, becoming the first telecoms company in the UK to do so. It's available in six major cities, and speeds of 1Gbps are promised "for some users".

Google Cloud scores FA digital transformation partnership


Connor Jones

31 May, 2019

The English Football Association (The FA) has partnered with Google Cloud to digitally transform its St. George’s Park national training centre used by 28 national teams.

Google Cloud is now the official cloud and data analytics partner to The FA, and over the multi-year partnership it aims to put G Suite at the heart of everything, shifting the coaches of all the teams from siloed working to a more collaborative approach.

"The first step in our transformation at St. George's Park was to unify the way our coaches train and develop our 28 national teams to increase productivity," said Craig Donald, CIO at The FA. "We needed the ability to collaborate and share across the coaches and team managers. G Suite allowed us to do that and was the first part of our Google Cloud partnership."

The FA has terabytes of data stored in Google Cloud, collected from tracking player activity, and its analysis team will use Google Cloud Platform tools such as smart analytics, machine learning, AI and BigQuery to unearth new insights from it.
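
To give a flavour of that kind of analysis – the project, dataset and column names below are invented for illustration – a BigQuery query over player-tracking data can be run from Python like so:

    from google.cloud import bigquery

    client = bigquery.Client()  # credentials are picked up from the environment

    # Hypothetical table of tracking data collected from training sessions.
    query = """
        SELECT player_id, AVG(top_speed_kmh) AS avg_top_speed
        FROM `example-project.player_tracking.training_sessions`
        WHERE session_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
        GROUP BY player_id
        ORDER BY avg_top_speed DESC
    """

    for row in client.query(query).result():
        print(row.player_id, row.avg_top_speed)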

The organisation’s next step will be to build out its Player Profile System (PPS), a proprietary tool built on the platform, to measure performance, fitness, training and form of players at all levels.

The goal is to automate near real-time data analysis which will give the pitchside coaches a better indication as to how the players are performing in training, which could influence decisions such as player selection for matches.

The PPS will be further enhanced by Google Cloud smart analytics, data management systems and machine learning capabilities to analyse even more player data signals.

"Smart analytics and data management play a critical part in our PPS," said Nick Sewell, The FA's head of application development. "Everything we do at St George's Park for this workload is built on Google Cloud."

Over the multi-year partnership The FA aims to tackle three key areas:

  • Success: Preparing both men’s and women’s senior teams for the next World Cups.
  • Diversity: Doubling female participation in the game.
  • Inclusivity: Making football more inclusive and open to all.

«We believe technology is a key area of potential competitive advantage for our 28 teams and everything we do at St George’s Park,» said Dave Reddin, The FA’s head of team strategy and performance.

«We have progressively built a systematic approach to developing winning England teams and through the support of Google Cloud technology we wish to accelerate our ability to translate insight and learning into performance improvements.»

AWS launches Textract tool capable of reading millions of files in a few hours


Connor Jones

30 May, 2019

AWS has said that its Textract tool, designed to automatically extract text and data from scanned documents, is now generally available to all customers.

The tool, which is a machine learning-driven feature of its cloud platform, lets customers autonomously extract data from documents and accurately convert it into a usable format, such as exporting contractual data into database forms.

The fully managed tool requires no machine learning knowledge to use and works with virtually any document. Industries that deal with specific file types, such as financial services, insurance and healthcare, will also be able to plug these into the tool.

Textract aims to expedite the data entry process, which is laborious and often inaccurate when done with other third-party software. Amazon claims it can accurately analyse millions of documents in "just a few hours".

"Many companies extract text and data from files such as contracts, expense reports, mortgage guarantees, fund prospectuses, tax documents, hospital claims, and patient forms through manual data entry or simple OCR software," the company said.

"This is a time-consuming and often inaccurate process that produces an output requiring extensive post-processing before it can be put in a format that is usable by other applications," it added.

Textract takes scanned files stored in Amazon S3 buckets, reads them and returns the data as JSON annotated with page numbers, sections, form labels and data types.
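
As a hedged illustration of that flow – the bucket and document names are placeholders – a synchronous form-and-table analysis via boto3 looks roughly like this:

    import boto3

    textract = boto3.client("textract", region_name="us-east-1")

    # Analyse a scanned form stored in S3, asking for form fields and tables.
    response = textract.analyze_document(
        Document={"S3Object": {"Bucket": "example-docs", "Name": "claim-form.png"}},
        FeatureTypes=["FORMS", "TABLES"],
    )

    # The JSON response is a list of "blocks" (pages, lines, words, key-value
    # pairs, table cells), each annotated with geometry and a confidence score.
    for block in response["Blocks"]:
        if block["BlockType"] == "KEY_VALUE_SET":
            print(block.get("EntityTypes"), block.get("Confidence"))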

PwC is already using the tool for its pharmaceutical clients, an industry whose processes commonly involve Food and Drug Administration (FDA) forms that would otherwise require hours to complete, according to Siddhartha Bhattacharya, director lead, healthcare AI at PwC.

"Previously, people would manually review, edit, and process these forms, each one taking hours," he said. "Amazon Textract has proven to be the most efficient and accurate OCR solution available for these forms, extracting all of the relevant information for review and processing, and reducing time spent from hours down to minutes."

The Met Office is another organisation that plans to implement Textract, making use of old weather records.

"We hope to use Amazon Textract to digitise millions of historical weather observations from document archives," said Philip Brohan, climate scientist at the Met Office. "Making these observations available to science will improve our understanding of climate variability and change."

Uncrackable passwords introduced to Microsoft Azure


Connor Jones

17 May, 2019

Microsoft has increased the character limit for passwords in Azure Active Directory from 16 to a massive 256 characters, making brute-force attacks much more difficult.

The limit had become a sore point for Azure customers, who have repeatedly reminded Microsoft that a 16-character cap on passwords is unsatisfyingly small.

"Many of you have been reminding us that we still have a 16-character password limit for accounts created in Azure AD," said Microsoft's Alex Simons. "While our on-premises Windows AD allows longer passwords and passphrases, we previously didn't have support for this for cloud user accounts in Azure AD.

"Today, I am pleased to announce that we have changed this limit, allowing you to set a password with up to 256 characters, including spaces," he added.

Passwords must still meet three out of the four essential criteria set out in Microsoft's policy documentation (a check sketched in code below the list):

  • Lowercase characters
  • Uppercase characters
  • Numbers (0-9)
  • Symbols (@ # $ % ^ & * - _ ! + = [ ] { } | \ : ' , . ? / ` ~ " ( ) ;)
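
A minimal sketch of that three-out-of-four check in Python – the symbol set is copied from the list above, and the authoritative rules remain Microsoft's policy documentation:

    SYMBOLS = set("@#$%^&*-_!+=[]{}|\\:',.?/`~\"();")

    def meets_azure_ad_policy(password: str) -> bool:
        # Length limits described in the article: 8 minimum, 256 maximum.
        if not 8 <= len(password) <= 256:
            return False
        categories = [
            any(c.islower() for c in password),   # lowercase characters
            any(c.isupper() for c in password),   # uppercase characters
            any(c.isdigit() for c in password),   # numbers 0-9
            any(c in SYMBOLS for c in password),  # symbols
        ]
        return sum(categories) >= 3

    print(meets_azure_ad_policy("Jazzily1"))  # True: lower + upper + digit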

While account and password security are of paramount importance to IT users, Microsoft still won't force you to create an iron-clad password, keeping the minimum at a mere eight characters.

The difference between an eight-character password and a 256-character one is huge, according to howsecureismypassword.net, a website that estimates how long a given password would take to brute-force.

We took three passwords of varying lengths to see how long each would take to crack. First up was ‘Jazzily1’, an eight-character password that meets three of Azure's four essential criteria; this would take just one month to crack, according to the website.

A middle ground 137-character password would take 29,511,750,324 octogintillion years (quite a lot) to crack, and the 253-character password we used at the upper limit of Azure’s allowance would take ‘forever’.
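
The arithmetic behind such estimates is straightforward: the keyspace is the charset size raised to the password length, divided by an assumed guess rate. The ten-billion-guesses-per-second figure below is our assumption, not the website's, and logarithms keep the numbers printable:

    import math

    GUESSES_PER_SECOND = 10_000_000_000   # assumed attacker throughput
    CHARSET = 26 + 26 + 10 + 33           # lower + upper + digits + symbols

    def log10_years_to_exhaust(length: int) -> float:
        # log10 of (CHARSET ** length / GUESSES_PER_SECOND / seconds per year);
        # working in logs avoids overflowing a float for very long passwords.
        seconds_per_year = 3600 * 24 * 365
        return (length * math.log10(CHARSET)
                - math.log10(GUESSES_PER_SECOND)
                - math.log10(seconds_per_year))

    for n in (8, 137, 253):
        print(f"{n} chars: ~10^{log10_years_to_exhaust(n):.0f} years")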

Another way to look at hyper-secure passwords is Professor Bill Buchanan's estimate for 128-bit AES keys: breaking just one, he said, would take the energy required to boil every single one of Earth's oceans 16,384 times.

In related news, Microsoft recently gained FIDO certification for Windows Hello, its Windows 10 authenticator, in the upcoming May 2019 update – seemingly a first step towards a passwordless Windows.

Windows Hello will bring facial recognition, fingerprint scanning and a secure PIN to more than 800 million Windows 10 devices starting next month – a service that is cross-compatible with other Microsoft services such as Office 365 and OneDrive.

"Our work with FIDO Alliance, W3C and contributions to FIDO2 standards have been a critical piece of Microsoft's commitment to a world without passwords," said Yogesh Mehta, principal group program manager at Microsoft.

"No one likes passwords (except hackers)," he added. "People don't like passwords because we have to remember them. As a result, we often create passwords that are easy to guess – which makes them the first target for hackers trying to access your computer or network at work."

In the same May update, Microsoft will also stop enforcing its password expiration policies which prompt users to change their passwords every few months.

The company's logic is that users forced to change passwords frequently are more inclined to make only small changes, or even to start writing passwords down – a big security no-no.

Security flaw found in Google's "most secure" account authenticator


Connor Jones

16 May, 2019

A misconfigured Bluetooth pairing protocol in Google’s Titan security keys could allow attackers to bypass encryption and hijack user accounts, the firm has revealed.

Google has said it will start offering replacements for what it once called the "strongest, most phishing-resistant method of two-step verification (2SV) on the market today", following the discovery of a flaw that exposes account information to those within Bluetooth range.

The company has assured customers that the keys, the technology for which first launched in 2017, will still do their job and provide multi-factor authentication built to a FIDO standard that's stronger than regular 2SV, but said the $50 cost would be waived for anyone who wants a replacement unit.

"This bug affects Bluetooth pairing only, so non-Bluetooth security keys are not affected," said Christiaan Brand, product manager, Google Cloud. "Current users of Bluetooth Titan Security Keys should continue to use their existing keys while waiting for a replacement, since security keys provide the strongest protection against phishing."

When attempting an account sign-in, a Titan user is required to press a button on the Bluetooth key to authenticate the log-in attempt. It was discovered that immediately after this button press, attackers have a narrow window in which to connect their own device to the security key, potentially allowing them to log into the user's account from their device – provided they already had the user's email address and password.

Titan keys work by acting as another authentication step and are linked with a user’s device, such as a phone or laptop, via a Bluetooth connection. A flaw in this connection means that an attacker could trick the phone or laptop into thinking the attacker’s own device is the security key. If this is achieved, the attacker could bypass the authentication process and start to make changes to the user’s device by mimicking an external keyboard and mouse.

It could be argued that a situation where an attacker has your account credentials, knows you use a Titan key and is within 30 feet of your location would be unlikely to occur, but it's still serious enough to prompt Google into replacing all affected keys. Others are less sceptical, though.

"The fact you must be within 30 feet of the security key isn't an issue, especially when you consider how fast compiled and scripted software can run," said Mark Miller, director of enterprise security support at Venafi. "In addition, lots of people conduct business in public places like coffee shops and airports, so connecting a dongle to a device isn't that far-fetched."

"From a technology perspective, these keys are amazing; they make security a lot easier to consume," he added. "However, there is no such thing as perfect technology, so I'm glad Google is taking the initiative and recalling these keys."

Most recently, Google announced that a new form of its Titan Security keys would be made available to all Android phones running Android 7.0 or later, with its line of Pixel phones getting a slightly more secure version too.

The phone as a security key (PaaSK) standard was announced at Google Cloud Next 2019; instead of having an external Titan Security Key to hand, all that's required is to unlock your Google account-linked Android device and press a button to approve the log-in in real time.

The Titan key was originally introduced to combat phishing attempts that exploited vulnerable 2SV methods such as confirmation codes delivered by texts – a method of communication that can be hijacked with relative ease.

In other Google news, a privacy flaw was found in Google Pay's settings on Wednesday. Optional settings governing a user's ability to share their creditworthiness, personal information or Google Pay account information were hidden behind a special URL rather than appearing directly on the Google Pay account settings page.

Google has since attributed this error to a fault left over from an update and has fixed it, so the three privacy settings now appear as normal.

NHS Digital cuts costs by using VMware Cloud on AWS


Connor Jones

10 May, 2019

NHS Digital is migrating VMware vSphere workloads to the AWS cloud platform to reduce costs and improve the operational efficiency of digitally-enabled healthcare.

The move is part of NHS Digital’s long-term intention to migrate the majority of its services from its current on-premise infrastructure to its AWS and Azure multi-cloud environment, building on its cloud-first approach to delivering healthcare.

NHS Digital has worked with VMware Cloud on AWS to create a new commercial model that provides better economies of scale and will supposedly streamline its remaining cloud migrations.

"The uptake of digital services in the NHS is accelerating, so the NHS and social care's IT backbone must be up to the job," said Rob Shaw, deputy chief executive at NHS Digital. "With VMware Cloud on AWS, we're providing a resilient platform to support digitally-enabled care today and in the future.

"We now have a commercial framework in place to enable NHS and public-sector organisations to confidently use the cloud," he added. "Together we can benefit from the economies of scale and cost efficiencies of this model."

NHS Digital, the IT arm of the UK's health service, is currently attempting one of the most complex digital transformation projects in recent times.

The organisation is notorious for operating on legacy infrastructure and outdated technology, which is why Health Secretary Matt Hancock announced a £487 million fund dedicated to digital projects in his inaugural speech last July.

The organisation coordinates and maintains the mission-critical IT infrastructure that underpins the NHS and social care, serving 1.4 million NHS staff and 1.5 million social care staff.

"We choose the right cloud for each workload, and VMware Cloud on AWS is the absolute best option for running our vSphere-based environments in the cloud," said Michael Flintoft, associate director of platforms and infrastructure at NHS Digital. "It's easy to move solutions across the different environments and it's easy to run and manage.

"We built a virtual data centre in the AWS cloud in less than three hours," he added. "That speed and agility is just what we need to harness innovation and make the best digital services available for the NHS and social care sector."

Its cloud-first initiative forms just part of the organisation's overall digital transformation. AI and robotics are also expected to be harnessed in the near future, and the current thinking is that within 20 years, 90% of jobs in the NHS will require digital skills.

NHS Digital has been entangled in its fair share of controversy over the past few years. It was slammed heavily for agreeing to share patient health records with Google-owned DeepMind and its data sharing practices again came under fire during the care.data fiasco.

Microsoft announces Azure updates for IoT, databases and more


Connor Jones

7 May, 2019

Microsoft has kicked off its annual developer conference Build in Seattle with CEO Satya Nadella using his pre-event keynote to announce a slew of updates to its Azure services, including IoT, databases, analytics and storage.

Internet of Things (IoT)

The first update comes to Azure IoT Edge, Microsoft's cloud-based IoT monitoring and deployment platform used by businesses to coordinate their IoT devices. Nearly two years after its release, IoT Edge now supports Kubernetes integration, in preview.

Microsoft's fully managed IoT SaaS, Azure IoT Central, now sports enhanced rules for processing data and sending it to analytics services, as well as cosmetic upgrades to its dashboards and data visualisation.

Databases

Microsoft has released an update to Azure SQL Database serverless that will appeal to anyone who resents being billed for unused compute. The update introduces a new compute tier for databases with intermittent usage, with the goal of improving the service's cost-efficiency: compute scales with the workload and pauses when the database is inactive. It's now available in preview for single databases.

Developers will also now be able to scale compute, storage, and memory resources in a new Hyperscale service tier in Azure Database for PostgreSQL. It’s now in public preview as Hyperscale (Citus) for PostgreSQL and in general availability as Azure SQL Database Hyperscale.

Storage

Azure Managed Disks will now support direct uploads at sizes from 32GB to 32TiB – an update now in public preview in all regions. Azure Managed Disks are virtual hard disks that act like a physical disk in an on-premise server.

Analytics

Azure Data Factory's two new components, announced earlier this year, enter preview this week. Mapping Data Flows lets cloud users transform data at scale by creating data transformation jobs without any coding knowledge. Also on the codeless theme, Wrangling Data Flows lets users ‘explore and wrangle data at scale’, visualising data en masse and making it easier to understand. The components are in public preview and private preview respectively.

Azure Data Warehouse (ADW) also gains support for semi-structured data, available in preview. It can now analyse both structured and semi-structured data, such as JSON, directly inside the service, yielding faster insights.




In other news

Microsoft treated us to many of its announcements before the event officially opened: AI and blockchain services were unveiled last week as appetisers for this week's event.

Azure Cognitive Services got smarter with Decision and Personaliser, two new components designed to provide users with smarter, more specific recommendations for better decision-making.

AI has been added to Azure Search, which will interact with Cognitive Services to unearth better insights from structured and unstructured content. Machine learning services have also been tweaked to enable codeless model creation, plus a new deployment service with drag-and-drop capabilities.