FinTechs use the cloud to operate at the speed and scale of digital financial activity, but are often hindered by the complexity of managing security and compliance in the cloud. In his session at 20th Cloud Expo, Sesh Murthy, co-founder and CTO of Cloud Raxak, showed how proactive and automated cloud security enables FinTechs to leverage the cloud to achieve their business goals. Through business-driven cloud security, FinTechs can speed time-to-market, diminish risk and costs, maintain continuous compliance, and set themselves up for success.
[session] Getting Public Cloud Benefits | @CloudExpo @Cloudistics #API #Cloud #Serverless
You know you need the cloud, but you’re hesitant to simply dump everything at Amazon since you know that not all workloads are suitable for cloud. You know that you want the kind of ease of use and scalability that you get with public cloud, but your applications are architected in a way that makes the public cloud a non-starter. You’re looking at private cloud solutions based on hyperconverged infrastructure, but you’re concerned with the limits inherent in those technologies.
How companies can boost their website in China’s clouded market
Company name translated into Chinese? Tick. Chinese social media accounts up and running? Tick. A Chinese-language website? Tick.
The IT team managing your Chinese website clicks ‘deploy’, and you’re now on the way to conquering the lucrative Chinese market.
The euphoria of global expansion endures for a few days, but then the teething problems start stacking up. Website availability is patchy. No one is coming to your site — except for spammers and existing customers who complain about slow load speed. Moreover, your website is lost eight pages deep on the Chinese equivalent of Google.
Overcoming Internet speed challenges
Website load speed is crucial anywhere in the world and especially in a mobile-centric market such as China. Hosting your website outside of China causes slower response times due to limitations on international bandwidth into China and high latency. This problem is especially acute for companies hosting their website in distant locations such as Europe and North America.
The most effective way to overcome these issues is to host your website in China. Hosting a website in China reduces site load time, minimizes latency, and is likely to improve search engine visibility in China over the long-term.
The other option is to deploy your website on a China-based Content Delivery Network (CDN). A CDN will cache your website on a distributed network of nodes. When a user in China requests access to your site, the CDN will serve a copy of your website from the closest node to the end-user. This dramatically reduces latency and is an ideal approach for companies that do not wish to migrate their origin server to a new location.
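The caching behaviour described above can be sketched in a few lines. This is an illustrative model, not any CDN vendor's real API: an edge node serves repeat requests from its local copy and only pays the slow round-trip to the origin on a cache miss.

```python
# Minimal sketch of CDN edge caching: origin content is fetched once per
# node, then repeat requests are served locally with low latency.
# All names here are illustrative stand-ins, not a real CDN API.

ORIGIN = {"/index.html": "<html>hello</html>"}  # stand-in for the origin server

class EdgeNode:
    def __init__(self):
        self.cache = {}
        self.origin_fetches = 0  # counts slow round-trips to the origin

    def serve(self, path):
        if path not in self.cache:           # cache miss: one slow origin fetch
            self.origin_fetches += 1
            self.cache[path] = ORIGIN[path]
        return self.cache[path]              # cache hit: served from the edge

edge = EdgeNode()
edge.serve("/index.html")   # first request: fetched from the origin
edge.serve("/index.html")   # repeat request: served from the edge cache
```

In practice the "closest node" selection happens at the DNS or anycast layer, but the latency win comes from exactly this hit/miss logic running near the end user.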
To deploy your website on a hosting server or CDN in China, you will first need to obtain an ICP (Internet Content Provider) license from the Ministry of Industry and Information Technology (MIIT). If your business is not eligible for an ICP license, an alternative is to deploy your website on a server or CDN node located in Hong Kong.
Why cloud hosting is perfect for China
The scalability and built-in elasticity of the cloud are purpose-built for large online markets such as China, and businesses are realizing this advantage. Bain & Company predicts that cloud computing sales will swell to 20% of China’s total IT market by 2020, up from a mere 3% in 2013.
Given the massive number of online users in China dispersed across distant geographic locations, cloud hosting offers businesses the ability to maximize coverage and respond in real-time to sudden changes in traffic. This includes adding new deployment regions and availability zones and the option to release resources when traffic subsides after a promotional event or unexpected spike in activity.
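The scale-up/scale-down decision at the heart of that elasticity can be reduced to a simple rule. The function and capacity figures below are illustrative assumptions, not any cloud provider's actual autoscaling policy:

```python
# Toy autoscaling rule: provision enough instances for current traffic,
# release them when traffic subsides, but keep a minimum floor for
# availability. capacity_per_instance is an assumed figure.

def desired_instances(requests_per_sec, capacity_per_instance=1000, minimum=2):
    """Return how many instances the current load requires."""
    needed = -(-requests_per_sec // capacity_per_instance)  # ceiling division
    return max(minimum, needed)

spike = desired_instances(175_000)   # promotional-event traffic
quiet = desired_instances(500)       # traffic after the spike subsides
```

Real autoscalers add cooldown windows and smoothing so brief blips don't thrash capacity, but the release-when-idle economics are exactly what makes the cloud fit bursty events like a shopping festival.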
Alibaba, for instance, has broken records during promotional periods by leveraging the cloud to process up to 175,000 orders in just one second.
Safeguarding your website
China’s tech-savvy population is leading the way in adopting mobile payments, online-to-offline (O2O) services, and mobile gaming, and in designing their lives around their smartphones. While Android is the leading mobile operating system in China, with approximately 74.4% of the market as of February 2017, it can be susceptible to external attacks. To address mobile vulnerabilities and data security, companies need to carefully assess website and mobile security.
Cloud hosting provides access to a range of security products to protect your website from malicious attacks, including free services such as anti-DDoS protection and real-time monitoring. Advanced security products are also a must for commercial websites that integrate online payments. Cloud-based security products including server guard, mobile security, and web application firewall (WAF) can be integrated into your cloud architecture to protect against high volume DDoS attacks and other cyber intrusions.
To mitigate the threat of attacks, it is also vital to regularly update and backup your website. If your website is deployed on WordPress, this also means upgrading your WordPress theme and plugins to eliminate potential loopholes that hackers can exploit.
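A regular backup can be as simple as a timestamped archive of the web root. This is a minimal sketch with hypothetical paths; a real WordPress backup would also dump the database, and the temp directories here only stand in for the site's files:

```python
# Minimal backup sketch: archive the site's files with a timestamp so a
# clean copy can be restored after a compromise. Paths are illustrative.

import os
import shutil
import tempfile
from datetime import datetime

def backup_site(site_dir, backup_dir):
    """Create a timestamped zip archive of site_dir inside backup_dir."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    base = os.path.join(backup_dir, f"site-backup-{stamp}")
    return shutil.make_archive(base, "zip", site_dir)

# Demonstrate on a throwaway directory standing in for the web root.
site = tempfile.mkdtemp()
dest = tempfile.mkdtemp()
with open(os.path.join(site, "index.php"), "w") as f:
    f.write("<?php // placeholder ?>")
archive_path = backup_site(site, dest)
```

Scheduling this via cron (or your host's backup service) and copying the archive off the server closes the loop: an exploited plugin then costs you a restore, not the site.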
China is a highly competitive market, and consumers expect a smooth and secure online experience. The flexibility, scalability, and security offered by the cloud provides an optimal solution to boost your website in China’s competitive online space.
Getting the balance right in microservices development
Choices, choices, choices. User requirements and non-functional requirements are just the beginning of the balancing act of services development. New development paradigms usually take a few years before their practitioners get a handle on the factors they need to balance.
In the case of microservices, this balancing act comes down to three things: granularity, data consistency, and performance. The most usable and best-performing services built on the microservices architecture will find a balance of these three factors that work for the developers, users, and the business. Let’s start by defining these three factors in the context of microservices architecture.
Granularity
To be more specific: how many microservices should there be, and how fine-grained should each service’s functionality be? The ultimate goal is to have the most granularity possible. Microservices architecture calls for the creation of a set of distinct functions that run on demand and then shut down.
The purpose of granularity is twofold. Firstly, granularity enables rapid deployment of fixes and updates without lengthy and expensive testing cycles. Secondly, by executing on demand in a ‘serverless’ construct, the right level of granularity can help reduce cloud-based infrastructure costs.
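The idea can be made concrete with a sketch. Each piece of functionality is its own small function that is invoked on demand and returns; the dispatcher below is only a stand-in for a FaaS platform's invocation layer, and all names are illustrative:

```python
# Sketch of granular, on-demand functions: instead of one monolithic
# handler, each distinct function is a small unit that runs and returns.
# In practice these would be separately deployable functions on a
# serverless platform, not entries in a dict.

def validate_order(order):
    return bool(order.get("item")) and order.get("qty", 0) > 0

def price_order(order):
    return order["qty"] * 9.99

# Tiny dispatcher standing in for the platform's on-demand invocation.
FUNCTIONS = {"validate": validate_order, "price": price_order}

def invoke(name, payload):
    return FUNCTIONS[name](payload)   # start, run, return: no resident process

ok = invoke("validate", {"item": "widget", "qty": 3})
total = invoke("price", {"item": "widget", "qty": 3})
```

Because each function is deployed independently, a fix to pricing ships without re-testing validation, which is exactly the rapid-deployment benefit described above.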
Data consistency
Since the microservices architecture calls for a discrete set of small functions defining an application or service, the question arises as to how data will be passed between, and acted upon by, multiple functions while remaining consistent. There is also the question of how multiple services built on microservices will access common data.
Think about a logistics system in which multiple services (booking, confirmation, insurance, tracking, billing, and so on) each consist of multiple microservices functions and also need to pass information between them. Although the intent of this article is not to explore persistence technologies, some examples are DBMSs, journals, and enterprise beans. The function of all of these, and many others, is to have data outlive the process that created it.
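The consistency question above can be sketched with the logistics example. A plain dict stands in for the persistence layer (a DBMS or journal); the point is that the order record must outlive each short-lived function that touches it. All names here are hypothetical:

```python
# Sketch of shared persistence between microservice functions: booking and
# billing run separately, so the order record lives in a shared store
# (a dict standing in for a DBMS or journal) rather than in either process.

STORE = {}  # stand-in for the shared persistence layer

def booking_service(order_id, item):
    STORE[order_id] = {"item": item, "status": "booked"}

def billing_service(order_id, amount):
    record = STORE[order_id]          # reads state written by another service
    record["amount"] = amount
    record["status"] = "billed"

booking_service("A1", "freight")
billing_service("A1", 120.0)
```

The finer the granularity, the more functions read and write this shared state, which is why consistency pressure grows as services are split further.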
Performance
The starting and stopping of a process, even a microservices process, takes time and processing power. That may seem like an obvious statement, but it requires some thought. If a process starts once and remains idle waiting for input, there is no start-up delay beyond the original startup, or the startup of additional processes to handle load.
In a microservices scenario, each time an input is received, a process is started to handle the request; once the request has been handled, the process shuts down. This is an area where pre-provisioning of microservices servers (as in Microsoft Azure’s container-based microservices solution) could be of benefit.
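A deterministic toy model makes the trade-off visible. The millisecond figures below are illustrative assumptions, not measurements of any platform:

```python
# Toy cost model of the cold-start trade-off: a warm, long-lived process
# pays startup once, while a per-request ("cold") microservice pays it on
# every invocation. STARTUP_MS and WORK_MS are assumed figures.

STARTUP_MS = 120.0   # assumed process start/stop overhead per launch
WORK_MS = 5.0        # assumed time to handle one request

def warm_total(requests):
    return STARTUP_MS + requests * WORK_MS        # start once, stay resident

def cold_total(requests):
    return requests * (STARTUP_MS + WORK_MS)      # start and stop every time

warm = warm_total(100)
cold = cold_total(100)
```

Under these assumptions, 100 requests cost 620 ms warm versus 12,500 ms cold, which is why pre-provisioned or pooled containers matter once granularity drives up invocation counts.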
Balancing act
Granularity, as defined above, is the ultimate point of balance in a microservices architecture. If a team chasing maximum granularity makes its microservices too granular, problems arise in the data consistency/persistence and performance realms. If the services aren’t granular enough, you might get performance gains, but you lose resiliency, flexibility, and scalability.
If a service is built in too granular a form, data consistency can become difficult simply because of the rise in the number of data connections and elements that need to be kept consistent. Performance suffers, in an overly granular scenario, due to the processing requirements of startup and shutdown, as described above.
The balance between the various functional requirements, non-functional requirements, and the will to utilise new paradigm capabilities can lead to many different balancing scenarios. Realistically, creating a solid microservices-based environment comes down to basic application architecture principles: understanding the needs and architecting the solution based on all the functional and non-functional requirements. When developing in a microservices architecture, we need to resist the temptation to go too granular while still delivering the scalability and flexibility that the architecture is known for.
A Look Into Cognizant’s Results
Cognizant Technology Solutions, a major IT provider headquartered in Teaneck, New Jersey, has posted better-than-expected results for the second quarter of this year.
Its net income increased by 86.5 percent to $470 million, though analysts attribute much of this rise to the lower income tax the company paid last year. Last year, Cognizant’s Indian subsidiary repurchased shares valued at $2.8 billion from shareholders, which led the company to book a $190 million expense.
Even without this, the company earned 93 cents per share, beating analysts’ prediction of 90 cents per share. Overall revenue rose to $3.67 billion, slightly above analysts’ expectations of $3.66 billion and a nine percent increase over last year.
In addition, the company said that revenue for the next quarter will be between $3.73 billion and $3.78 billion, fairly close to analysts’ prediction of $3.76 billion.
Despite such positive results, the company’s shares rose only slightly in the stock market. This is mainly because Cognizant lowered its 2017 revenue forecast to between $14.7 billion and $14.84 billion. Though analysts were expecting $14.76 billion, the fact that the company lowered its forecast is a cause of concern for investors.
This brings up the next question: why did Cognizant lower its forecast when it has posted such impressive results?
The culprit is the U.S. healthcare industry. Uncertainty surrounding Trump’s policies and the possible scrapping of Obamacare has put a lot of pressure on this industry, and it has cut back spending in a big way. As a result, IT service providers that offer software and maintenance support to these healthcare companies are affected.
Since Cognizant gets a major chunk of its revenue from healthcare and financial services companies, it is forced to lower its forecast as it expects healthcare companies to put in tighter controls for spending.
For this quarter, though, revenue from the healthcare sector grew by 9.5 percent to around $1.05 billion, but this growth rate may not be sustainable over the coming year given the uncertainty around healthcare policy.
Otherwise, Cognizant seems to be doing well and is poised to take a big share in newer technologies like IoT.
Interoute deploys Cloudian for new storage service
Interoute has announced it has rolled out a cloud-based storage service based on Cloudian’s HyperStore object storage technology.
The new service, which is part of Interoute’s Virtual Data Centre (VDC) platform, aims to provide customers with ‘fast, reliable and highly durable cloud-based storage for unstructured data, backups and archives at very low cost’, in the company’s own words. It will be available across the entire Interoute platform of 17 virtual data centre zones around the world.
Explaining the rollout, Mark Lewis, Interoute EVP of products and development, cited customers’ GDPR concerns, with organisations ‘revisiting the legacy world of physical backup and archiving and demanding a simple, controlled, auditable cloud service’.
“With Cloudian, Interoute is offering its customers choice in limitlessly scalable and cost-effective storage, on a foundation that is proven in some of the world’s largest unstructured data stores,” said Jon Toor, chief marketing officer at Cloudian in a statement.
Cloudian’s mission is to provide what it calls ‘a clear vision to revolutionise object storage’ – storage for retaining unstructured data, such as photos, music, and files from collaboration services – by enabling 100% native AWS S3-compatible object storage in users’ own data centres.
The company secured $41 million in financing in October last year, as this publication reported, adding it aimed to use the capital to help expand its sales and marketing, as well as grow international operations.
McKinsey argues how the current wave of AI is ‘poised to finally break through’
Editor’s note: Read more around artificial intelligence, deep learning and machine learning at AI News.
- Tech giants including Baidu and Google spent between $20B and $30B on AI in 2016, with 90% of this spent on R&D and deployment, and 10% on AI acquisitions.
- Artificial Intelligence (AI) investment has turned into a race for patents and intellectual property (IP) among the world’s leading tech companies.
- U.S.-based companies absorbed 66% of all AI investments in 2016. China was second with 17% and growing fast.
- By providing better search results, Netflix estimates that it is avoiding canceled subscriptions that would reduce its revenue by $1B annually.
These and other findings are from the McKinsey Global Institute Study, and discussion paper, Artificial Intelligence, The Next Digital Frontier (80 pp., PDF, free, no opt-in) published last month. McKinsey Global Institute published an article summarizing the findings titled How Artificial Intelligence Can Deliver Real Value To Companies. McKinsey interviewed more than 3,000 senior executives on the use of AI technologies, their companies’ prospects for further deployment, and AI’s impact on markets, governments, and individuals. McKinsey Analytics was also utilized in the development of this study and discussion paper.
Key takeaways from the study include the following:
Tech giants including Baidu and Google spent between $20B and $30B on AI in 2016, with 90% of this spent on R&D and deployment, and 10% on AI acquisitions
External investment in AI has roughly tripled since 2013. McKinsey found that 20% of AI-aware firms are early adopters, concentrated in the high-tech/telecom, automotive/assembly, and financial services industries. The graphic below illustrates the trends the study team found during their analysis.
AI is turning into a race for patents and intellectual property (IP) among the world’s leading tech companies
McKinsey found that only a small percentage (up to 9%) of AI investment came from Venture Capital (VC), Private Equity (PE), and other external funding. Of all categories with publicly available data, M&A grew the fastest between 2013 and 2016 (85%). The report cites many examples of internal development, including Amazon’s investments in robotics and speech recognition, and Salesforce’s in virtual agents and machine learning. BMW, Tesla, and Toyota lead auto manufacturers in their investments in robotics and machine learning for use in driverless cars. Toyota is planning to invest $1B to establish a new research institute devoted to AI for robotics and driverless vehicles.
McKinsey estimates that total annual external investment in AI was between $8B and $12B in 2016, with machine learning attracting nearly 60% of that investment
Robotics and speech recognition are two of the most popular investment areas. Investors favor machine learning startups because code-based start-ups can scale up and add new features quickly. Software-based machine learning startups are preferred over their more cost-intensive, machine-based robotics counterparts, which often cannot scale as fast as their software counterparts. As a result of these factors and more, corporate M&A is soaring in this area, with the Compound Annual Growth Rate (CAGR) reaching approximately 80% from 2013 to 2016. The following graphic illustrates the distribution of external investments by category from the study.
High tech, telecom, and financial services are the leading early adopters of machine learning and AI
These industries are known for their willingness to invest in new technologies to gain competitive advantage and internal process efficiencies. Many start-ups have also gotten their start by concentrating on the digital challenges of these industries. The MGI Digitization Index is a GDP-weighted average of Europe and the United States. See Appendix B of the study for a full list of metrics and an explanation of the methodology. McKinsey also created an overall AI index, shown in the first column below, that compares key performance indicators (KPIs) across assets, usage, and labor where AI could contribute. The following is a heat map showing the relative level of AI adoption by industry and by asset, usage, and labor category.
McKinsey predicts High Tech, Communications, and Financial Services will be the leading industries to adopt AI in the next three years
The competition for patents and intellectual property (IP) in these three industries is accelerating. The devices, products, and services available now and on the roadmaps of leading tech companies will, over time, reveal the level of innovative activity going on in their R&D labs today. In financial services, for example, there are clear benefits from improved accuracy and speed in AI-optimized fraud-detection systems, forecast to be a $3B market in 2020. The following graphic provides an overview of the sectors leading in AI adoption today and those which intend to grow their investments the most in the next three years.
Healthcare, financial services, and professional services are seeing the greatest increase in their profit margins as a result of AI adoption
McKinsey found that companies combining senior management support for AI initiatives with investment in supporting infrastructure and clear business goals achieve profit margins 3 to 15 percentage points higher. Of the more than 3,000 business leaders interviewed as part of the survey, the majority expect margins to increase by up to 5 percentage points in the next year.
Amazon has achieved impressive results from its $775 million acquisition of Kiva, a robotics company that automates picking and packing, according to the McKinsey study
“Click to ship” cycle time, which ranged from 60 to 75 minutes with humans, fell to 15 minutes with Kiva, while inventory capacity increased by 50%. Operating costs fell an estimated 20%, giving a return of close to 40% on the original investment.
Netflix has also achieved impressive results from the algorithm it uses to personalize recommendations to its 100 million subscribers worldwide
Netflix found that customers, on average, give up 90 seconds after searching for a movie. By improving search results, Netflix projects that it has avoided canceled subscriptions that would reduce its revenue by $1B annually.
Gartner changes EFSS Magic Quadrant to content collaboration with Box and Microsoft leading
Box and Microsoft lead the way on vision and execution in Gartner’s recently released Magic Quadrant for content collaboration platforms – but it doesn’t quite tell the full story.
If you are unfamiliar with the name of the report, there is a good reason. The Quadrant was previously known as EFSS (enterprise file sync and share), with the analyst firm changing the definitions this year to reflect a shift in the market.
As Box puts it, content collaboration platforms ‘go beyond EFSS to also facilitate team collaboration and content workflows’; a point backed up by Tom Grave, SVP of marketing at CTERA Networks, which found a place as a niche player in the report. Grave said that with ‘content’ lining up with ‘file’ and ‘collaboration’ lining up with ‘sharing’, the report still gave an accurate view of the market, but with room to expand.
“It certainly makes sense for us,” he told CloudTech. “I think that file sharing is an accurate term, and content collaboration can be a little more specific and also more specifically aligned with what you’re trying to do. [Employees are] not just sharing files for the sake of sharing them, they’re collaborating.
“Increasingly, organisations are distributed – we’ve got mobile workers all over the globe, systems for collaborating with their peers,” he added. “We definitely support the name and it makes sense – it’s a term that resonates.”
Of the 13 vendors that made the cut, seven made the top-right leaders zone: Axway – essentially Syncplicity, which Axway bought in February – Box, Citrix, Dropbox, Egnyte, Google, and Microsoft.
Saying that it was a leader in all content markets, Box said it agreed with Gartner’s analysis ‘that the realities of business and technology today are forcing a change in the way organisations think about content.’ “With higher than ever customer expectations and increasing pressure on IT to deliver, content, collaboration and security need to be central to overall IT and business strategy,” Joely Urton, VP outbound marketing at Box, wrote in a company blog post.
Dropbox pointed to its most recent product releases, including the introduction of Dropbox Paper, a teamwork and collaboration tool, and Smart Sync, as an indication of both its success and the changing shape of the market. “By connecting the creation, feedback, organisation, and distribution steps that happen across different tools today, Dropbox is reducing the frustration and miscommunication that can slow teams down,” wrote Rob Baesman, senior director of product management.
As is frequently the case with these reports however – and as Gartner always insists – the top right axis is not the be all and end all. For certain workloads and organisations, each member of the Quadrant has its own strengths.
Like Egnyte, CTERA, which last made the report in 2015, focuses on the enterprise market. The company offers two primary products: CTERA Drive, the app, and, crucially, CTERA Gateway, a physical appliance that enables shared drives but is also connected back to the cloud. Gartner says the company is ‘a good fit for organisations with highly distributed users and offices, and priorities on data privacy or data sovereignty.’
As a result, Grave prefers to use the term ‘focus’ instead of ‘niche’. “That’s the key,” he said. “Any two-person shop can go to Dropbox, or Box, or Google with a credit card and start using their service automatically, and that’s just not our position in the market. We’re not trying to be universal for any individual or very small customer.”
CTERA – which also launched its 6.0 iteration last week – says it hangs its hat on security as well as ‘cloud choice’, or ‘infrastructure choice’, and this differentiation gives customers with a particular focus on security more options compared with software as a service (SaaS) vendors. “Depending on the profile of the customer, the more important [security] is, and especially when it’s not just broad security, but some of the specifics within security and privacy that different customers care differently to us, they align themselves to us,” said Grave.
“Wherever data sovereignty is involved, where IT has to very specifically know the location of all the data and being able for IT to access it themselves, and giving IT not only the ability to encrypt end to end, but always control encryption keys so there’s no third party they’re delegating or deferring to for managing encryption – those two factors are critical for a certain class of customer, and that would eliminate a lot of software as a service vendors who run in the top right,” Grave added.
You can read a copy of the report from CTERA’s page here (registration required).
The Rules of #DigitalTransformation for Start-Ups | @ThingsExpo #BigData #AI #DX #IoT #SmartCities
I have had the opportunity to work for and around a good many start-ups during the course of my career. Often the start-up founders would simply define a problem, develop a solution, and launch a company. The marketing department would then do their very best to identify the individuals in each target company who experienced the problem and had a budget to fix it. This was always a challenging task, and it has become even harder today.
[slides] How Do You Eat a Whale? | @DevOpsSummit @Skytap #CloudNative #DevOps #Serverless
In his session at @DevOpsSummit at 20th Cloud Expo, Kelly Looney, director of DevOps consulting for Skytap, showed how an incremental approach to introducing containers into complex, distributed applications results in modernization with less risk and more reward. He also shared the story of how Skytap used Docker to get out of the business of managing infrastructure, and into the business of delivering innovation and business value. Attendees learned how up-front planning allows for a clean separation between infrastructure, platform, and service concerns.