Alibaba and SoftBank launch SB Cloud for Japanese market

Alibaba and SoftBank have announced the establishment of SB Cloud Corporation, a new joint venture to offer cloud computing services in Japan.

The demand for public cloud in Japan and surrounding countries has been growing in recent years, with Japan leading the way as the most advanced nation. A report from Gartner last year estimated the total public cloud services spending in the mature APJ region will rise to $11.5 billion by 2018. Alibaba has targeted the region to grow its already healthy cloud business unit.

“I’ve really enjoyed working with the Alibaba Cloud team on the joint venture over the past few months,” said Eric Gan, the new CEO of SB Cloud and EVP of SoftBank. “During the business planning discussions, I quickly felt that we were all working very much as one team with one goal. I believe the JV team can develop the most advanced cloud platform for Japanese customers, as well as for multinational customers who want to use the resources we have available in Japan.”

SB Cloud will enable Alibaba to increase its presence in the market, where it already offers services to SoftBank’s business customer base in Japan, which primarily comprises global organizations. SB Cloud will open a new data centre in the country, where it will serve customers beyond the established SoftBank customer base, offering data storage and processing services, enterprise-level middleware, as well as cloud security services.

A recent report from the US Department of Commerce highlighted that the Japanese market is one of the most competitive worldwide, though five of the six major vendors are American: Amazon Web Services, Google, IBM, Microsoft and Salesforce. Domestic companies, such as Fujitsu, have announced aggressive expansion plans. Fujitsu claims to be investing $2 billion between 2014 and 2017 to capture an increased share of the cloud computing market, primarily focused on the growing IoT sub-sector.

While Alibaba’s traditional business has been in the Chinese market, the company has been making efforts over the last 12-18 months to diversify its reach. Last year, the company launched new data centres in Singapore and Silicon Valley. It also launched what it claims is China’s first cloud AI platform, DT PAI, last August. Its purpose-built algorithms and machine learning technologies are designed to help users generate predictive intelligence insights, and Alibaba claims the service features “drag and drop” capabilities that let users easily connect different services and set parameters, seemingly following IBM’s lead in designing a more accessible offering for the industry.

Obtaining JAVA_HOME using jrunscript

To automate certain tasks, it can be useful to obtain the JAVA_HOME of the Java version we have installed. Let’s see how we can do it using jrunscript:

jrunscript lets us run JavaScript, so we simply need to read the value of java.home and print it:

# jrunscript -e 'java.lang.System.out.println(java.lang.System.getProperty("java.home"));'
/usr/lib/jvm/java-7-openjdk-amd64/jre
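
Note that java.home points to the bundled JRE directory, so on OpenJDK the usual JAVA_HOME is one level up. A minimal shell sketch (the variable names are just illustrative; the dirname step mirrors the Puppet example below):

    #!/bin/bash
    # Ask the JVM itself where it lives (this prints the JRE directory).
    JRE_DIR=$(jrunscript -e 'java.lang.System.out.println(java.lang.System.getProperty("java.home"));')

    # On OpenJDK packages the JDK root sits one directory above java.home.
    export JAVA_HOME=$(dirname "${JRE_DIR}")

    echo "JAVA_HOME=${JAVA_HOME}"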

We can see a usage example in the eyp-tomcat Puppet module, where it is used to install the Tomcat native library:

    exec { "configure native library ${srcdir}":
      command => 'bash -c "./configure --with-apr=/usr/bin/apr-1-config --with-java-home=$(dirname $(jrunscript -e \'java.lang.System.out.println(java.lang.System.getProperty("java.home"));\'))"',
      require => [ Package[$tomcat::params::develpkg], Exec["tar xzf native library ${srcdir}"] ],
      cwd     => "${srcdir}/tomcat-native-library/jni/native",
      creates => "${srcdir}/tomcat-native-library/jni/native/Makefile",
    }


IoT and Fog Computing Architecture | @ThingsExpo #IoT #DigitalTransformation

Cloud computing changed data analytics for good. It enabled companies to drastically reduce the resources and architecture previously assigned to business intelligence departments. It also enabled laymen to run advanced business analytics. Cloud was also the architecture of choice for storing and processing big data.
Data piles up continuously, and the volume is set to explode as the Internet of Things takes hold. Developers have found an answer to this issue in a new concept called fog computing. As opposed to the cloud, a fog computing architecture is capable of conducting all required computations and analytics directly at the data source. This way, a single network administrator is able to control the work of thousands (or even millions) of data-generating devices through real-time and predictive analytics, without overloading the network with huge piles of data going back and forth.
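
As a loose illustration of the idea (the log path and endpoint URL below are hypothetical), an edge node could summarise raw sensor readings locally with standard tools and ship only the aggregate upstream:

    # Aggregate raw temperature readings on the edge node itself...
    awk -F, '{ sum += $2; n++ } END { if (n) printf "%.2f\n", sum / n }' /var/log/sensors/temperature.csv > /tmp/temperature-avg.txt

    # ...and send only the small summary to the central service, instead of
    # streaming every individual reading over the network.
    curl -X POST --data-binary @/tmp/temperature-avg.txt https://example.com/ingest/temperature-avg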


Digital Oil and Wireless #IoT | @ThingsExpo #IIoT #DigitalTransformation

Production demands of the 21st century change at an extraordinary pace. Industrial markets such as energy and oil & gas face challenges going forward, including the reliable monitoring of assets in the field, dealing with 24×7 production demands, and the high cost, in both time and resources, of managing assets in remote locations. These market forces have naturally led to the emergence of the industrial internet of things (IIoT) and wireless communications technology.


Agile, DevOps and CD | @DevOpsSummit #DevOps #ContinuousDelivery

Our CTO, Anders Wallgren, recently sat down to take part in the “B2B Nation: IT” podcast — the series dedicated to serving the IT professional community with expert opinions and advice on the world of information technology.

Listen to the great conversation, where Anders shares his thoughts on DevOps lessons from large enterprises, the growth of microservices and containers, and more.


Microservices and Containers | @DevOpsSummit #DevOps #Microservices

Last week I had the pleasure of speaking on a panel at Sapphire Ventures’ Next-Gen Tech Stack Forum in San Francisco.
Obviously, I was excited to join the discussion, but as a participant the event not only crystallized where the larger software development market is relative to microservices, container technologies (like Docker), and continuous integration and deployment; it also provided insight into where DevOps is heading in the coming years.


The top three cloud security myths: BUSTED

The rise in global cyber-attacks and the subsequent high-profile press coverage understandably make businesses question the security of the cloud. After all, the dangers of hosting anything in an environment where data loss or system failure events are attributed to an outside source are magnified. As a result, many CIOs are also still struggling to identify and implement the cloud services most suitable for their business. In fact, research shows over three quarters (79%) of CIOs find it a challenge to balance the productivity needs of employees against potential security threats. Moreover, 84% of CIOs worry that cloud causes them to lose control over IT.

But is cloud really more vulnerable than any other infrastructure? And how can organisations mitigate any risk they encounter? The reality is that all systems have vulnerabilities that can be exploited, whether on-premise, in the cloud or a hybrid of the two. It’s safe to say that people fear what they don’t understand – and with cloud becoming increasingly complex, it’s not surprising that there are so many myths attached to it. It’s time to clear up some of these myths.

Myth 1: Cloud technology is still in its infancy and therefore inherently insecure

Cloud has been around for much longer than we often think and can be traced as far back as the 1970s. The rapid pace of cloud development, coupled with an awakening realisation of what cloud can do for businesses, has thrust it into the limelight in recent years.

The biggest issue CIOs have with cloud is their increasing distance from the physical technology involved. Indeed, many CIOs feel that if they cannot walk into a data centre and see comforting lights flashing on the hardware, then it is beyond their reach. As a result, many organisations overlook instrumentation in the cloud, so don’t look at the data or systems they put there in the same way they would if it were on a physical machine. Organisations then forget to apply their own security standards, as they would in their own environment, and it is this complacency that gives rise to risk and exposure.

Myth 2: Physical security keeps data safe

It is a common misconception that having data stored on premise and on your own servers is the best form of protection. However, the location of data is not the only factor to consider. The greatest form of defence you can deploy with cloud is a combination of strict access rights, diligent data stewardship and strong governance.

Common security mistakes include not performing full due diligence on the cloud provider and assuming that the provider will be taking care of all security issues. In addition, it is still common for organisations to not take into account the physical location of a cloud environment and the legal ramifications of storing data in a different country. Indeed, a recent European Court of Justice ruling found the Safe Harbour accord was invalid as it failed to adequately protect EU data from US government surveillance. Cloud providers rushed to assure customers they were dealing with the situation, but the main takeaway from this is to not believe that a cloud provider will write security policy for you – organisations need to take ownership.

Myth 3: Cloud security is the provider’s responsibility

All of the major public clouds have multiple certifications (ISO27001, ISO27018, ENISA IAF, FIPS140-2, HIPAA, PCI-DSS) attained by proving they have controls to ensure data integrity.

The real risk comes when organisations blindly park data, thinking that security is just implicit. Unless the data is protected with encryption, firewalls, access lists etc., organisations remain vulnerable. The majority of cloud exposures can in fact be traced back to a failure in policy or controls not being applied correctly – look at the TalkTalk hack, for example, and consider the alternate outcome had the database been encrypted.
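
As a rough sketch of taking that ownership (the archive name and bucket below are hypothetical, and gpg plus the AWS CLI are assumed to be available), data can be encrypted client-side before it is ever parked in cloud storage:

    # Encrypt the archive locally with a symmetric AES-256 passphrase before upload.
    gpg --symmetric --cipher-algo AES256 --output backup.tar.gz.gpg backup.tar.gz

    # Only the ciphertext leaves the premises; the provider never holds the key.
    aws s3 cp backup.tar.gz.gpg s3://example-backups/backup.tar.gz.gpg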

Education and ownership are the future

The speed at which cloud is evolving can understandably cause a few teething problems. But it is the responsibility of providers and clients alike to take ownership of their own elements and apply security policies which are right for their business, their risk profile and the data which they hold. As with any technological change, many interested parties quickly jumped on the cloud bandwagon. But the allure of a technology can inhibit critical thinking, and the broader view of choosing the right application at the right cost, with appropriate security to mitigate risk, is lost. Remember, the cloud is not inherently secure and, given that it stands to underpin enterprise operations for years to come, it’s worth approaching it not as a bandwagon but as an important part of enterprise infrastructure.

Written by Mark Ebden, Strategic Consultant, Trustmarque

ZeroStack launches first global partner program that benefits customers and partners


Earlier in 2016, ZeroStack announced general availability of its ZeroStack Cloud Platform for the simple installation and operation of a scale-out private cloud. Now, in order to bolster adoption, the company has rolled out a global Partner Program to aid partner sales.

The ZeroStack Cloud Platform is based on OpenStack and a unique feature of the offering is that management and infrastructure functionality are on premises while operations and user workflows are in the cloud. The cloud model is made available through hyper-converged ZeroStack Cloud Building Blocks that combine compute, clustered storage, software-defined networking and management software in a control plane. The on-premises infrastructure is managed by ZeroStack Cloud Portal, which provides analytics-based insights and optimizations for application deployment.

Sean Cardenas, VP of Sales and Operations at ZeroStack, said: “Whether or not we really emphasize the OpenStack depends on who we are talking to. OpenStack is really attractive to the technology crowd looking for open solutions, but there are plenty of people who just want to see a solution that works and solves their problems, and there we don’t emphasize the OpenStack aspect of it.”

Cardenas adds: “More often than not, masking complexity for the end user has a heavy burden for the partners. ZeroStack is not this at all. We refer to it as Zero Touch. It’s extremely easy to deploy within 30 minutes, as we advertise. Partners don’t necessarily have OpenStack expertise either. They know its value proposition, but most don’t have OpenStack SMEs. With this, they can literally point in the IP addresses and generate a private cloud. That’s a phenomenal win for them.”

What are your thoughts about ZeroStack’s cloud platform? Let us know in the comments.

Can your analytics tools meet the demands of the big data era?

Speaking at Telco Cloud, Actian’s CTO Michael Hoskins outlined the impact big data is having on the business world, and the challenges being faced by those who are not keeping up with the explosion of data now available to decision makers.

The growth of IoT and the subsequent increase in data has been widely reported. Last year, Gartner predicted the number of connected ‘things’ would exceed 6.4 billion by the end of 2016 (an increase of 22% from 2015), and continue to grow beyond 20.8 billion by 2020. While IoT is a lucrative industry, businesses now face the task of not only managing the data, but also gaining insight from such a vast pool of unstructured information.

“Getting a greater understanding of your business is the promise of big data,” said Hoskins. “You can see things which you never were able to before, and it’s taking business opportunities to the next generation. The cloud is really changing the way in which we think about business models – it enables not only for you to understand what you are doing within your business, but the industry on the whole. You gain insight into areas which you never perceived before.”

Actian is one of a number of companies who are seemingly capitalizing on not only the growth of IoT and big data, but also the fact it has been rationalized by decision makers within enterprise as a means to develop new opportunities. The company has been building its presence in the big data arena for five years, and has invested more than $300m in growing organically, as well as acquiring new technology capabilities and expertise externally. As Hoskins highlighted to the audience, big data is big business for Actian.

Actian’s CTO Michael Hoskins

But what are the challenges which the industry is now facing? According to Hoskins, the majority of us don’t have the right tools to fully realize the potential of big data as a business influencer.

“The data explosion which is hitting us is so violent, it’s disrupting the industry. It’s like two continents splitting apart,” said Hoskins. “On one continent we have the traditional tools, and on the other we have the new breed of advanced analytics software. The new tools are drifting away from the traditional, and the companies who are using the traditional are being left behind.”

Data analytics as a business practice is by no means a new concept, but the sheer volume, variety and speed at which data is being collected mean traditional technologies for analysing this data are being made redundant. Hoskins highlighted that they’re too slow (they can’t keep up with the velocity of collection), too rigid (they can’t comprehend the variety of data sets), and too cumbersome (they can’t manage the sheer volume of data). In short, these tools are straining under the swell.

The next challenge is scaling current technologies to meet these demands, which in most cases is a very difficult proposition. It is often too short-term and too expensive, and the necessary skills aren’t abundant enough. Hoskins believes the time-cost-value proposition simply does not make sense.

“The journey of modernization goes from traditional, linear tools, through to business intelligence and discovery, this is where we are now, through to decision science,” said Hoskins. “Traditional tools enable us to look back at what we’ve done and make reactive decisions, but businesses now want to have a forward looking analytics model, drawing out new insights to inform decision making. But this cannot be done with traditional tools.

“This is the promise of advanced analytics. The final stage is where we can use data analytics to inform business decisions; this is where data becomes intelligence.”

24% of businesses expect a cyberattack within the next 90 days

Research from VMware has highlighted that 24% of office workers and IT decision makers believe their organization will be the victim of a cyberattack within the next 90 days, mainly due to the belief that threats are advancing at a faster pace than a company’s defences.

Although the statistics imply the event of a cyberattack is becoming normalized within the industry, the findings also suggest that investment from enterprise organizations is not keeping pace with the security threat: 39% of respondents believe one of their organisation’s greatest vulnerabilities to a cyberattack is threats moving faster than their defences.

“The issue around accountability is symptomatic of the underlying challenge faced as organisations seek to push boundaries, transform and differentiate, as well as secure the business against ever-changing threats”, commented Joe Baguley, CTO of VMware in EMEA. “Today’s most successful organisations can move and respond at speed as well as safeguard their brand and customer trust. With applications and user data on more devices in more locations than ever before, these companies have moved beyond the traditional IT security approach which may not protect the digital businesses of today.”

While security has been something of a sound-bite for board-level execs in recent months, the importance of spreading cybersecurity awareness and responsibility throughout the organization has been made clear by the IT department. Of the IT decision makers surveyed as part of the research, 22% said the board should be most aware of the necessary actions to take following a significant data breach, while 40% said the CEO should be that person.

Industry insiders have commented to BCN in recent weeks that execs’ public statements highlighting the importance of cybersecurity have been an effort to appease customers and stakeholders, with little follow-through in terms of investment in new technologies. Research from the Economist Intelligence Unit backs up these comments, as its own survey found only 5% of UK corporate leaders consider cyber security a priority for their business, contradicting comments made by execs in the press.

Shadow IT was another area which featured in the report, as unauthorized devices and software are seemingly still plaguing IT decision makers throughout the industry. 55% of the IT decision makers surveyed believe their own employees are the greatest security threat a company faces, backed up by the findings that 26% would use their personal device to access corporate data and 16% would risk being in breach of the organisation’s security to carry out their job effectively.

“Security is not just about technology. As the research shows, the decisions and behaviours of people will impact the integrity of a business,” said Baguley. “However, this can’t be about lock-down or creating a culture of fear. Smart organisations are enabling, not restricting, their employees – allowing them to thrive, adapt processes and transform operations to succeed.”