Category Archives: IoT

Jaap van Vliet, Ambit Software: Navigating the intricacies of digital transformation

In an interview at this year’s Digital Transformation Week Europe, Ambit Software Managing Director Jaap van Vliet delved into the intricacies of digital transformation and how the company utilises innovative strategies to assist clients globally. “We try to help to accelerate our customers to improve their business,” says van Vliet. “We do that from multi-angles:… Read more »


Full ‘stream’ ahead for Scottish Water smart monitoring roll-out

An intelligent monitoring system has been successfully introduced by Scottish Water across rural locations in the Highlands and Islands, using Internet of Things (IoT) technology to gather essential data that helps keep the water network in Scotland safe. Using a device developed by CENSIS for M2M Cloud – Scotland’s innovation centre for sensing, imaging, and… Read more »


Eight Crucial Strategies for Strengthening Network Security


Strengthening network security is vital to your organization. Check out the tips below to ensure you are well protected.

Leave no host forgotten, know your hosts (all of them)

Any and every device with an IP address and wired or wireless access should be known in your environment. This goes beyond desktops, laptops, servers, printers, IP phones, and mobile devices. The “Internet of Things” presents a much larger potential footprint of hosts, including environmental monitoring and control devices, security cameras, and even things like vending machines. IoT devices all run operating systems that can be compromised by hackers and used as a platform for reconnaissance of your network in search of more valuable assets. Ensure inventory lists stay valid by performing routine network scans to identify unknown devices.
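As a rough illustration of that last point, here is a minimal sketch in Python, assuming a Linux-style ping command and a hypothetical one-IP-per-line inventory file named known_hosts.txt: it sweeps a /24 and reports any responding address that is missing from the inventory.

```python
# Minimal sketch: ping-sweep a subnet and flag hosts missing from inventory.
# Assumes a Linux-style `ping` and a one-IP-per-line file "known_hosts.txt".
import subprocess
from concurrent.futures import ThreadPoolExecutor

SUBNET = "192.168.1."          # hypothetical management subnet
INVENTORY_FILE = "known_hosts.txt"

def is_alive(ip: str) -> bool:
    """Return True if the host answers a single ICMP echo within 1 second."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def main() -> None:
    with open(INVENTORY_FILE) as f:
        known = {line.strip() for line in f if line.strip()}

    candidates = [f"{SUBNET}{host}" for host in range(1, 255)]
    with ThreadPoolExecutor(max_workers=64) as pool:
        alive = [ip for ip, up in zip(candidates, pool.map(is_alive, candidates)) if up]

    for ip in sorted(set(alive) - known):
        print(f"Unknown device responding at {ip} - investigate and add to inventory")

if __name__ == "__main__":
    main()
```

An ICMP sweep will miss hosts that drop echo requests, so in practice it would be combined with ARP tables, DHCP leases, and switch MAC tables.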

Understand your users’ behavior

Knowing the culture and habits of users, like when and where they work, is important for establishing baseline behavior patterns. Also, the types of work they do online such as researching, downloading software, and uploading files will vary greatly by industry. For example, users at a law firm are not going to have the same internet usage behavior as users at a software development company. Even within an organization, there will be differences between administrative and technical engineering user behavior. Knowing the behavior of your users will make it easier to identify what is normal versus abnormal network traffic.
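To make the idea of a baseline concrete, the sketch below (Python; the file names and the user,timestamp CSV layout are assumptions for illustration) learns the hours at which each user has historically logged in and flags new logins that fall outside that window.

```python
# Minimal sketch: learn each user's normal login hours from history,
# then flag new logins at hours that user has never been seen before.
# Assumes two CSVs with columns user,timestamp (ISO 8601):
#   auth_log.csv   - historical logins used to build the baseline
#   new_logins.csv - recent logins to check against the baseline
import csv
from collections import defaultdict
from datetime import datetime

def load_events(path):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield row["user"], datetime.fromisoformat(row["timestamp"])

def build_baseline(events):
    """Map each user to the set of hours (0-23) observed historically."""
    hours = defaultdict(set)
    for user, ts in events:
        hours[user].add(ts.hour)
    return hours

def flag_anomalies(baseline, events):
    for user, ts in events:
        if ts.hour not in baseline.get(user, set()):
            print(f"Unusual login: {user} at {ts.isoformat()} (hour {ts.hour} not seen before)")

if __name__ == "__main__":
    baseline = build_baseline(load_events("auth_log.csv"))
    flag_anomalies(baseline, load_events("new_logins.csv"))
```

A real deployment would use far richer features (location, device, data volumes), but the principle is the same: know what normal looks like before you hunt for abnormal.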

Understand what talks to what and why

The network traffic patterns in your organization should represent the usage of critical business applications that users need to do their job. Understanding these traffic flows is critical to building effective security policies for ACLs, stateful firewall policies, and deep packet inspection rules on network security devices. This applies to traffic within your internal private networks, what is allowed in from the outside, and especially the type of traffic allowed to leave your organization.
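One lightweight way to keep that understanding honest is to compare observed flows against a documented allowlist. In the hypothetical Python sketch below, the flows.csv export, its src/dst/dport columns, and the allowlist entries are all assumptions; the point is simply that anything not covered by the written policy gets surfaced for review.

```python
# Minimal sketch: compare observed flows against an allowlist of expected
# (source network, destination network, destination port) tuples.
# Flow records are assumed to be in "flows.csv" with columns: src,dst,dport.
import csv
from ipaddress import ip_address, ip_network

# Hypothetical policy: which networks may talk to which, and on what port.
ALLOWED_FLOWS = [
    (ip_network("10.10.0.0/16"), ip_network("10.20.5.0/24"), 443),   # users -> web tier
    (ip_network("10.20.5.0/24"), ip_network("10.30.1.0/24"), 5432),  # web tier -> database
]

def is_allowed(src: str, dst: str, dport: int) -> bool:
    return any(
        ip_address(src) in src_net and ip_address(dst) in dst_net and dport == port
        for src_net, dst_net, port in ALLOWED_FLOWS
    )

with open("flows.csv", newline="") as f:
    for row in csv.DictReader(f):
        if not is_allowed(row["src"], row["dst"], int(row["dport"])):
            print(f"Unexpected flow: {row['src']} -> {row['dst']}:{row['dport']}")
```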

Control what is running on your hosts

The more applications and services running on a host, the greater the potential exposure to software vulnerabilities. Software updates are important for bug fixes and new features, but security-related fixes to applications are critical. Limit the types of applications users may install to reputable software vendors that take security updates seriously. Staying current with operating system security updates is even more important. Situations where legacy applications require older, end-of-life (EOL) operating systems on your network should be monitored very closely and, if possible, segmented onto dedicated VLANs.
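A quick way to see what a host is actually exposing is to enumerate listening sockets and the processes behind them. The sketch below assumes the third-party psutil package is installed and will usually need elevated privileges to resolve other users' processes.

```python
# Minimal sketch: list listening TCP sockets and the process behind each one,
# so unexpected services stand out. Requires the third-party psutil package
# and usually elevated privileges to see every process.
import psutil

def listening_services():
    for conn in psutil.net_connections(kind="inet"):
        if conn.status != psutil.CONN_LISTEN:
            continue
        try:
            name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
        except psutil.NoSuchProcess:
            name = "exited"
        yield conn.laddr.port, conn.laddr.ip, name

if __name__ == "__main__":
    for port, ip, name in sorted(listening_services()):
        print(f"{ip}:{port:<6} {name}")
```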

Know your data & control your data

Understand the data that is critical to your business and classify that data into different levels of sensitivity. Ensure encryption is used when transmitting highly sensitive data across the network, and limit access to sensitive data to only those who require it. Implement effective logging on all devices that store and transmit sensitive data, and perform routine checks of your backup solutions to ensure the integrity of critical data backups.
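As a simple illustration of protecting sensitive data in transit or at rest, the sketch below uses the Fernet recipe from the third-party cryptography package for authenticated symmetric encryption; the key handling shown is deliberately naive, and a real deployment would rely on TLS for transport plus a proper key management system.

```python
# Minimal sketch: symmetric, authenticated encryption of a sensitive record
# using the third-party `cryptography` package (Fernet recipe).
# Key handling here is deliberately naive; use a key management system in practice.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in reality: fetched from a KMS/secret store
f = Fernet(key)

record = b"patient_id=1234;diagnosis=confidential"
token = f.encrypt(record)          # safe to transmit or store
print(token)

assert f.decrypt(token) == record  # only holders of the key can read it
```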

Monitor and control your perimeter (egress too!!)

The network perimeter of your organization includes Internet and WAN connections, but also wireless access points. All three of these perimeter pathways need to be protected with the highest levels of access restriction. Next-generation security appliances should be deployed on all perimeter segments to provide deep packet inspection, content filtering, and malicious URL inspection. Centralized logging of network and security devices into a security information and event management (SIEM) solution is vital for analysis and correlation of logging data.
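Most SIEM products will ingest syslog, so even home-grown tooling can feed the central log store. A minimal sketch using only the Python standard library is below; the collector hostname siem.example.com and UDP port 514 are placeholders.

```python
# Minimal sketch: forward application/security events to a central collector
# over syslog, which most SIEM products can ingest.
# "siem.example.com" and UDP/514 are placeholder values.
import logging
from logging.handlers import SysLogHandler

logger = logging.getLogger("perimeter")
logger.setLevel(logging.INFO)

handler = SysLogHandler(address=("siem.example.com", 514))
handler.setFormatter(logging.Formatter("%(name)s: %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.warning("Blocked outbound connection to known-bad URL from 10.10.4.17")
```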

Train your users: they are your weakest link and your best defense

Deliver routine end-user security awareness training to keep users up to date on ways to recognize suspicious email content and websites. Perform routine simulated phishing campaigns to determine how well users are able to identify suspicious emails. Review policies with users on how to handle sensitive data. Make sure users are aware of non-technical methods used by hackers, such as social engineering tactics, to extract information about your organization.

Implement strong authentication controls

Use multifactor authentication for wireless and VPN remote access whenever possible. Restrict the use of local user accounts and require complex passwords that must be changed regularly. Implement 802.1X security on wireless LANs as well as on wired network connections accessible from common areas in your facility.
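To illustrate the multifactor piece, the following sketch uses the third-party pyotp package (an assumption, not a recommendation of any particular product) to enrol a user with a time-based one-time password and verify a code at login. Production systems would also bind the secret to the user record, rate-limit attempts, and offer backup codes.

```python
# Minimal sketch: time-based one-time passwords (TOTP) as a second factor,
# using the third-party pyotp package. Secret handling is simplified.
import pyotp

# Enrolment: generate a per-user secret and share it via a QR provisioning URI.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

# Login: the user types the 6-digit code from their authenticator app.
code = totp.now()                  # stand-in for the user-supplied code
print("Second factor accepted" if totp.verify(code) else "Second factor rejected")
```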

Utilizing the tips above can go a long way toward strengthening network security. Reach out to your account manager or contact us to find out more about strategies to strengthen your network.

By Kevin Dresser, Solutions Architect

Intel digs deep into wallet to buy its way into AI game

Virtual reality may well have been capturing the imagination of the industry in recent months, but Intel’s $400 million acquisition of AI start-up Nervana highlights it’s not all fun and games, reports Telecoms.com.

Having established its position as a leader in the data centre market and then largely missed out on the smartphone revolution, Intel appears determined not to miss out on the burgeoning IoT segment, with the Nervana purchase adding more firepower to the company’s efforts. The acquisition also highlights the importance of artificial intelligence to the development of the technology industry.

“Intel is a company that powers the cloud and billions of smart, connected computing devices,” said Diane Bryant, GM of the Data Center Group at Intel. “Thanks to the pervasive reach of cloud computing, the ever decreasing cost of compute enabled by Moore’s Law, and the increasing availability of connectivity, these connected devices are generating millions of terabytes of data every single day. The ability to analyse and derive value from that data is one of the most exciting opportunities for us all. Central to that opportunity is artificial intelligence.”

The IoT revolution is coming whether we like it or not, and with it will come vast amounts of data. Due to the sheer volume, it will be beyond human comprehension to develop insight from the information unaided. Current data analytics tools and processes could be described (at best) as adequate, and that is before the surge in connected devices. Statista estimates the number of connected devices will grow from 18.2 billion in 2015 to 50.2 billion in 2020. The devices themselves will also improve, increasing the amount of information each can collect, which will lead to a tidal wave of data to be analysed.
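For context, those Statista figures imply a compound annual growth rate of roughly 22-23%; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the implied growth rate in the Statista figures.
devices_2015 = 18.2e9
devices_2020 = 50.2e9
years = 5

cagr = (devices_2020 / devices_2015) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")   # ~22.5%
```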

If it is immensely difficult, or more likely impossible, to analyse this data and turn it into actionable insight, what is the point in collecting it in the first place? This is the justification for artificial intelligence. Using such technologies to handle the more rudimentary decisions arising from data analysis, or to present insight on more complex decisions to business leaders, is where the value of artificial intelligence will be felt. If cloud computing enables the IoT revolution, artificial intelligence will make sure it’s not a waste of time or money.

For a notable proportion of the population, AI is synonymous with Terminator or other such doomsday stories. But as Bryant notes below, the applications of AI will stretch throughout the life of the consumer and, perhaps more importantly, across the business, manufacturing, and services worlds.

“While artificial intelligence is often equated with great science fiction, it isn’t relegated to novels and movies,” said Bryant. “AI is all around us, from the commonplace (talk-to-text, photo tagging, and fraud detection) to the cutting edge (precision medicine, injury prediction, autonomous cars). Encompassing compute methods like advanced data analytics, computer vision, natural language processing and machine learning, artificial intelligence is transforming the way businesses operate and how people engage with the world.”

The acquisition does answer a question raised by Telecoms.com a couple of weeks ago. In early July, Intel announced a new initiative with BMW and Mobileye to drive forward the development of autonomous vehicles. The initiative showed potential, but with BMW supplying the cars, Intel the chips, and Mobileye the detection capabilities, the partnership had the body, the muscles, and the eyes, yet not the brain to bring it all together. The Nervana acquisition in theory completes the circle and provides the intelligence aspect of the car.

Artificial intelligence has the potential to shape the technology industry moving forward, and it would appear this view is shared by the major players. Google has acquired nine AI firms, including DeepMind for $625 million; Twitter has made four major acquisitions, most recently Magic Pony for $150 million; Salesforce has acquired two AI start-ups already this year; and Apple reportedly bought Turi for $200 million. The money being spent to gain the upper hand in this sub-sector is beginning to rival the early days of cloud computing.

Smartphones help Huawei to 40% revenue growth over H1


Huawei has released financials for the first half of 2016 demonstrating a 40% revenue boost to $37 billion, partly owing to a healthy performance in the consumer business unit.

Although operating margin for the period declined from 18% to 12%, the company posted stronger revenue growth than a year earlier, slightly offsetting the decline; during the first six months of 2015, revenues grew 30%.

“We achieved steady growth across all three of our business groups, thanks to a well-balanced global presence and an unwavering focus on our pipe strategy,” said Sabrina Meng, Huawei’s CFO. “We are confident that Huawei will maintain its current momentum, and round out the full year in a positive financial position backed by sound ongoing operations.”

The decrease in operating margin reflects broader trends in the smartphone industry, as well as increasing competition worldwide. Huawei currently sits in third place in global smartphone market share, though it has been investing heavily to penetrate western markets in recent months. Samsung and Apple are currently defending their positions as the top two, though Huawei’s efforts to chase the mid-range market are seemingly paying off.

Set against a backdrop of declining smartphone shipments, Huawei has held onto its strong position in the Chinese market, increasing its shipments from 11.2 million to 16.6 million in Q1 2016, compared to the same period in 2015. The move increased its market share from 10.2% to 15.8% taking it to the top of the Chinese leader board, while Apple lost ground dropping from 12.3% to 11%.

While this may be seen as unsurprising in some quarters of the industry, success in international markets is becoming more apparent. According to research from Gartner, sales of smartphones to end users totalled 349 million units in the first quarter of 2016, a 3.9% increase over the same period in 2015. Samsung accounted for roughly 23% of the market, whereas Apple was just under 15%. Huawei increased its share from 5.4% to 8.3%, taking it to third in the global market share tables. The company is expected to continue to ramp up its R&D focus over the foreseeable future.

The company did not detail figures for the enterprise business unit, though these are likely to be outlined in the coming weeks. The enterprise business, which includes cloud computing, storage and SDN products, as well as Safe City and Electric Power IoT solutions, did announce healthy growth of 44% to $4.5 billion during its annual Global Analyst Summit in April.

In the carrier business, the role of 5G and IoT was reaffirmed, and the team will be focusing on four areas within the telco industry: business, operations, architecture, and networks. While the carrier business has demonstrated strong growth throughout the world, it has struggled in the US after its technology was effectively banned over concerns it would be used by Chinese authorities to spy on the US. While Huawei has continually denied the allegations, it has struggled to rebound and reassert itself in the market.

Elsewhere in the industry, competitor Ericsson has been experiencing slightly different fortunes after CEO Hans Vestberg resigned following another difficult quarter for the company. Last week, the company reported an 11% annual decline in net sales with pressure continuing to build against Vestberg.

Smart watch market enters decline for first time

Research firm IDC says shipments of the Apple Watch have dropped by 55% resulting in the first year-on-year quarterly shipment decline for the smart watch sector, reports Telecoms.com.

Preliminary data from IDC’s Worldwide Quarterly Wearable Device Tracker estimates vendors shipped 3.5 million units, down from 5.1 million in the same period of 2015. Apple, which dominates smart watch market share, saw its shipments decrease from 3.6 million in Q2 2015 to 1.6 million this year. While it is a substantial drop, it also demonstrates Apple’s stranglehold on the market: every other vendor in the top five increased shipments, yet Apple still controls 47% of the market.

“Consumers have held off on smart watch purchases since early 2016 in anticipation of a hardware refresh, and improvements in WatchOS are not expected until later this year, effectively stalling existing Apple Watch sales,” said Jitesh Ubrani, Senior Research Analyst for IDC Mobile Device Trackers. “Apple still maintains a significant lead in the market and unfortunately a decline for Apple leads to a decline in the entire market. Every vendor faces similar challenges related to fashion and functionality, and though we expect improvements next year, growth in the remainder of 2016 will likely be muted.”

A recent report from Ericsson indicated the wearables market is not performing in line with consumer expectations, as the general consensus is that the technology is not yet advanced enough. A common cause of dissatisfaction is that customers feel tethered to their smartphone, as the wearable device lacks standalone features. Respondents to the survey also highlighted price as a barrier to entry, though this may be down to the fact that a smart watch cannot currently be used as a standalone device; for now, it is an add-on.

“What will bear close observation is how the smart watch market evolves from here,” said Ramon Llamas, research manager for IDC’s Wearables team. “Continued platform development, cellular connectivity, and an increasing number of applications all point to a smartwatch market that will be constantly changing. These will appeal to a broader market, ultimately leading to a growing market.”

This is not the first warning sign for the smart watch subsector: Strategy Analytics recently released a forecast estimating shipments would decline by 12% over the course of 2016. There has been a growing consensus that shipments of the Apple Watch may have peaked following a blockbuster launch in Q2 last year, though the research from IDC could imply the decline is moving faster than previously anticipated. IDC also stated it does not expect the market to return to growth in 2017.

Although smart watches have yet to penetrate the mainstream market, the entry of traditional watch brands could give the devices a lift. Casio, Fossil, and Tag Heuer have launched their own models, and the credibility associated with these brands could give the segment a much-needed boost.

Top Five Smartwatch Vendors, Shipments, Market Share and Year-Over-Year Growth, 2Q 2016 (Units in Millions)

Vendor              2Q16 Unit Shipments   2Q16 Market Share   2Q15 Unit Shipments   2Q15 Market Share   Year-over-Year Growth
1. Apple                    1.6                  47%                  3.6                  72%                 -55%
2. Samsung                  0.6                  16%                  0.4                   7%                  51%
3. Lenovo                   0.3                   9%                  0.2                   3%                  75%
4. LG Electronics           0.3                   8%                  0.2                   4%                  26%
5. Garmin                   0.1                   4%                  0.1                   2%                  25%
Others                      0.6                  16%                  0.6                  11%                  -1%
Total                       3.5                 100%                  5.1                 100%                 -32%

Source: IDC Worldwide Quarterly Wearable Device Tracker, July 21, 2016

Intel grows despite the PC continuing its slow decline

Intel has reported 3% growth, including a 5% boost in its data centre business, though the client computing unit continues its slow decline, reports Telecoms.com.

The company’s efforts to redefine itself are seemingly beginning to pay dividends, as a 3% year-on-year decline to $7.3 billion in the client computing business unit was offset by healthy performances elsewhere in the organization. The data centre unit brought in $4 billion in revenues, up 5%, whereas IoT accounted for $572 million, an increase of 2%, and the security portfolio grew 10% to $554 million for the quarter. The Programmable Solutions group also saw a 30% boost to $465 million. Overall quarterly revenues grew 3% to $13.5 billion.

“Our top line results for the quarter came in right in line with outlook, and profitability this quarter exceeded our expectations,” said Brian Krzanich, Intel CEO. “Year-over-year growth this quarter was 3% overall, as we transform Intel into a company that powers the cloud and billions of smart connected devices. We continue to focus on growth in line with this transformation, as evidenced by results in the data centre, IoT, and Programmable Solutions business this quarter.”

Looking forward, the team is forecasting Q3 revenues of roughly $14.9 billion, which would represent 3% year-on-year growth. Client computing is expected to continue declining in the high single digits, while double-digit growth is anticipated in the data centre business, driven by cloud players in the second half of the year. CFO Stacy Smith believes growth in the IoT, data centre and memory businesses will counteract any negative impact from client computing.

While the data centre business continues to demonstrate growth for Intel, the share price declined by 3% in overnight trading following the earnings announcement. Investors were anticipating higher growth for the data centre group, as Intel had previously forecast double-digit growth.

Intel’s efforts to redefine the focus and perception of the business have been ongoing for some time, as the personal computing segment, Intel’s traditional cash cow, has continued to erode. Back in April, Krzanich outlined the company’s future focus on the company blog, split into five areas: cloud technology, IoT, memory and programmable solutions, 5G, and developing new technologies under the concept of Moore’s Law.

“Our strategy itself is about transforming Intel from a PC company to a company that powers the cloud and billions of smart, connected computing devices,” said Krzanich in the blog entry. “But what does that future look like? I want to outline how I see the future unfolding and how Intel will continue to lead and win as we power the next generation of technologies.

“There is a clear virtuous cycle here – the cloud and data centre, the Internet of Things, memory and FPGA’s are all bound together by connectivity and enhanced by the economics of Moore’s Law. This virtuous cycle fuels our business, and we are aligning every segment of our business to it.”

While the IoT business only grew 2% year-on-year, it is worth noting this came off the back of a healthy Q1 in which the unit grew 22%. Krzanich attributed the Q2 performance, which was below the team’s expectations, to an inventory burn following the strong first quarter. The team now anticipates double-digit growth through the remainder of 2016.

This was also the second consecutive quarter in which the security portfolio was listed as a separate business unit, previously being incorporated into the software and services unit. The group itself has demonstrated healthy growth over the course of 2016, but has been the topic of speculation surrounding a sale.

Only last month Intel was rumoured to be considering a sale of its security business, which was created following the $7.6 billion acquisition of antivirus specialist McAfee in 2010. Although security is one of the larger sections of the Intel business, it was not specifically mentioned as a focus point for the future strategy in Krzanich’s April blog entry. While the prospective sale has not been confirmed by Intel, separating the unit in the financials could indicate the company is attempting to provide a greater level of transparency for potential buyers.

Are cyber attacks covering up server inadequacies at Pokémon Go?

Pokémon Go users have continued to struggle as the app’s developer Niantic Labs recovers from hacker attacks and unprecedented demand for the game, reports Telecoms.com.

Claimed attacks from various hacker groups appear to have covered up server inadequacies at Niantic Labs, as the team seemingly struggles to meet capacity demands following the game’s launch in 27 countries worldwide.

Over the course of the weekend, various hacker groups including PoodleCorp and OurMine claimed responsibility for distributed denial of service (DDoS) attacks, causing a slow and clunky experience for many players around the world. Although the Niantic Labs team has played down the incidents, disruptions have continued into Monday morning, with the Telecoms.com editorial team unable to access the game effectively. Whether this can be attributed to the claimed attacks or a lack of server capacity is unclear for the moment.

The hacker saga appears to have started over the weekend, with OurMine stating on its website, “Today We will attack “Pokemon Go” Login Servers! so no one will be able to play this game till Pokemon Go contact us on our website to teach them how to protect it! We will attack it after 3-4 hours! Be ready! We will update you!” This was followed by another statement declaring the servers were down. PoodleCorp had claimed the day before (July 16) that it had caused an outage, though it also said to expect a larger attack in the near future.

While both of these attacks have attracted headlines, they also appear to have covered up shortcomings in the company’s infrastructure and its ability to deal with high demand. The launch of Pokémon Go has been well documented over the last few weeks, with numerous sources lauding it as the biggest mobile game in US history. Even before its official release in the UK, EE announced it had seen 350,000 unique users of Pokémon GO on its network.

“This is the fastest take up of an app or game we’ve ever seen – and that’s before it’s officially launched! People across the country are going to be relying on a mobile data network that’s everywhere they go,” said Matt Stagg, EE head of video and content strategy.

Despite claims the server problems have been addressed, complaints have continued to be voiced. Server status tracking website Downdetector stated 39,013 complaints were registered at 22.00 (EST) on July 17. The Niantic Labs team are seemingly underestimating demand for Pokémon Go with each launch, which would be a nice problem to have.

While Telecoms.com was unable to identify Niantic Labs’ specific cloud set-up, other reports have identified Google as the chosen platform. Although there are no specific announcements linking the two organizations, Niantic was spun out of Google in October last year and is currently led by John Hanke, previously VP of Product Management for Google’s Geo division, which includes Google Earth, Google Maps and StreetView. A job vacancy on the company’s website also asks for experience with Google Cloud or AWS.

Although AWS is listed in the job vacancy, it would be fair to assume it is not currently involved, as AWS CTO Werner Vogels couldn’t resist making a joke about the affair, stating “Dear cool folks at @NianticLabs please let us know if there is anything we can do to help!” on his Twitter account. This could imply some insider knowledge from Vogels, as the company would most likely be taking a swipe at its closest rivals in the public cloud market, namely Google or Microsoft Azure.

The claims of DDoS attacks would appear to have come at an opportune time, as they have taken the heat off the cloud infrastructure inadequacies. According to Business Insider, Hanke said the international roll-out of the game would be “paused until we’re comfortable” in relation to the server capacity issues. It would seem the company is prepared to ride the wave of demand, as well as complaints, and fix the server problem later, as launches and server issues continued following that interview.

What did we learn from EMC’s data protection report?

EMC has recently released its Global Data Protection Index 2016 where it claims only 2% of the world would be considered ‘leaders’ in protecting their own assets, reports Telecoms.com.

Data has dominated the headlines in recent months as breaches have made customers question how well enterprise organizations can manage and protect data. Combined with transatlantic disagreements over Safe Harbour and law enforcement agencies’ access to personal data, the ability to remain secure and credible is now more of a priority for decision makers.

“Our customers are facing a rapidly evolving data protection landscape on a number of fronts, whether it’s to protect modern cloud computing environments or to shield against devastating cyber-attacks,” said David Goulden, CEO of EMC Information Infrastructure. “Our research shows that many businesses are unaware of the potential impact and are failing to plan for them, which is a threat in itself.”

EMC’s report outlined a number of challenges and statistics suggesting the majority of the industry is not where it should be with regard to data protection. While only 2% of the industry would be considered leaders in the data protection category, 52% are still evaluating the options available to them. Overall, 13% more businesses suffered data loss in the last twelve months compared to the preceding period.

But what are the over-arching lessons we learned from the report?

Vendors: Less is more

A fair assumption for most people would be that the more protection you take on, the more protected you are. This just seems logical. However, the study shows that the more vendors you count in your stable, the more data you will leak.

The average data loss incident costs a company 2.36TB of data, which would be considered substantial, though it could be worse. The study showed organizations using one vendor lost on average 0.83TB per incident, two vendors 2.04TB, and three vendors 2.58TB. For those using four or more vendors, an average of 5.47TB of data was lost per incident.

Common sense would dictate that the more layers of security you have, the more secure you will be; however, this is only the case if the systems are compatible with each other. It should be highlighted that those who lost the larger data sets are likely to be the larger companies, with more data to lose, though the study does seem to suggest there needs to be a more co-ordinated approach to data protection.

And they are expensive…

Using the same concept as before, the average cost of lost data was $900,000. For those with one vendor, the cost was $636,361; with two, $789,193; and with three vendors the cost was just above the average at $911,030. When companies bring in four or more vendors, the average cost of data loss rises to $1.767 million.

China and Mexico are the best

While it may be surprising, considering many of the latest breakthroughs in the data world have come from Silicon Valley or Israel, China and Mexico are the two countries which would be considered furthest ahead of the trend for data protection.

EMC graded each country on how effective it was at implementing the right technologies and culture to prevent data loss within organizations. Seventeen countries featured ahead of the curve, including the usual suspects of the UK (13.5% ahead of the curve), the US (8%), Japan (1%) and South Korea (9%), however China and Mexico led the charge at 20% and 17% ahead respectively.

While it may not be considered that unusual for China to have a strong handle on data within its own borders, Mexico is a little more surprising (at least to us at Telecoms.com). The country has gone through something of a technology revolution in recent years, growing over the last 20 years from a country where only 10% of people had a mobile phone to 68% this year, 70% of which are smartphones. Mexico is now the 11th largest economy in terms of purchasing power, with millennials the largest demographic. With the population becoming more affluent, and no longer constrained by the faults of the pre-internet world, the trend should continue. Keep up the good work, Mexico.

Human error is still a talking point

When looking at the causes of data loss, the results were widespread, though the causes which cannot be controlled were at the top of the list. Hardware failure, power loss and software failure accounted for 45%, 35% and 34% respectively.

That said, the industry does now appear to be taking responsibility for the data itself: the study showed only 10% of data loss incidents were blamed on the vendor. A couple of weeks ago we spoke to Intel Security CTO Raj Samani, who highlighted that the attitude towards security (not just data protection) needs to shift, as there is no means to outsource risk. Minimizing risk is achievable, but irrespective of what agreements are undertaken with vendors, the risk still remains with you. As fewer people are blaming the vendors, it would appear this responsibility is being realized.

Human error is another area which remains high on the agenda, as the study showed it accounts for 20% of all instances of data loss. While some of these instances can be blamed on leaving a laptop in the pub or losing a phone on the train, there are examples where simple mistakes in the workplace are to blame. These will never be eliminated entirely, as numerous day-to-day decisions are made on intuition and gut feel, which remain necessary for certain aspects of the business.

One area which could be seen as a potential danger is artificial intelligence. The more AI advances, the more human-like these systems will become, and thus the more capable of making decisions based on intuition. If this is taken as the ambition, surely an intuitive decision-making machine would present a security weakness in the same way a human would. Admittedly the risk per decision would be substantially smaller, but then the machine would be making many times more decisions than the human.

All in all, the report raises more questions than it answers. While security has been pushed to the top of the agenda for numerous organizations, receiving additional investment and attention, it does not appear those same organizations are getting any better at protecting themselves. The fact that 13% more organizations suffered data loss in the last 12 months suggests it could be getting worse.

To finish, the study asked whether an individual felt their organization was well enough protected. Only 18% believe they are.

Telstra adds IoT and big data offering to Network Applications and Services biz unit

Australian telco Telstra has continued efforts to bolster its Network Applications and Services (NAS) business unit through acquiring Readify, reports Telecoms.com.

The company has been vocal about its aims for the NAS business unit as it has sought to expand through numerous acquisitions in recent years. Aside from the Readify deal, the company has also incorporated O2 Networks, Bridge Point Communications, Kloud and North Shore Connections, as well as numerous partnerships including with cloud security start-up vArmour.

“This arm of the business (NAS) has been a strong growth area for Telstra, achieving double-digit growth in revenue driven by business momentum in Asia, as well as advances in technology in the cloud computing space,” said a statement on the company website. “We are well equipped to continue to capitalise on this growth and ensure our focus on NAS continues to drive revenue.”

Readify, which currently offers enterprise cloud application solutions as well as big data and IoT services, will provide an additional platform for Telstra to drive digital transformation for its enterprise customers in domestic and global markets. The offering builds on the January acquisition of Kloud, which offers cloud migration services, and follows the acquisitions of unified communications and contact centre provider North Shore Connections in 2013, network integration services provider O2 Networks in 2014, and security, networking, and data management provider Bridgepoint, also in 2014.

“Readify will provide application development and data analytics services, nicely complementing Kloud’s existing services,” said Telstra Executive Director Global Enterprise and Services, Michelle Bendschneider. “It will enable Telstra to add incremental value to customers in enterprise cloud applications, API-based customisation and extensions as well as business technology advisory services.”

Back in April, the company announced a business multi-cloud connecting solution, which supports numerous hybrid cloud offerings including Azure, AWS, VMware, and IBM. The one-to-many “gateway” model will enable Australian customers to connect to Microsoft Azure, Office 365, AWS, IBM SoftLayer, and VMware vCloud Air, while international customers can only connect to AWS and IBM SoftLayer for the moment.

The cloud and enterprise services market has been a long-held ambition of the company, though it got off to a slow start. Back in 2014, national rival Optus Business stole a march on Telstra by acquiring Ensyst, winner of Australian Country Partner of the Year at the Microsoft Worldwide Partner Awards that same year, as it looked to grow its own cloud proposition. It would appear Telstra is making up for lost time through an accelerated program of product releases and acquisitions.