Dr. Max Opening Keynote at @CloudExpo NY | @IBMcloud @IBMWatson #AI #Bluemix #CloudFoundry

In his opening keynote at the 20th Cloud Expo, Michael Maximilien, Research Scientist, Architect, and Engineer at IBM, will explain why realizing the full potential of the cloud and social data requires artificial intelligence. By combining Cloud Foundry with the rich set of Watson services, IBM’s Bluemix is, he argues, the best cloud operating system for enterprises today, providing rapid development and deployment of applications that can draw on the catalog of Watson services to drive insights from the vast trove of private and public data available to enterprises.


WineSOFT to Exhibit at @CloudExpo New York | #SDN #SDS #AI #DataCenter

SYS-CON Events announced today that WineSOFT will exhibit at SYS-CON’s 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY. Based in Seoul and Irvine, WineSOFT is an innovative software house focused on internet infrastructure solutions. The venture began as a bootstrapped start-up in 2010 with the goal of making the internet faster and more powerful. WineSOFT’s expertise spans TCP/IP, VPN, SSL, peer-to-peer, mobile browsers, and live streaming solutions.


Hitachi to Exhibit at @CloudExpo | @HDScorp #IoT #IIoT #AI #DX #SmartCities

SYS-CON Events announced today that Hitachi Data Systems, a wholly owned subsidiary of Hitachi, Ltd., will exhibit at SYS-CON’s 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City. Hitachi Data Systems (HDS) will be featuring the Hitachi Content Platform (HCP) portfolio. This is the industry’s only offering that allows organizations to bring together object storage, file sync and share, cloud storage gateways, and sophisticated search and analytics to create a tightly integrated, simple and smart cloud-storage solution. HCP provides massive scale, multiple storage tiers, powerful security, cloud capabilities, and multitenancy with configurable attributes for each tenant, all backed by legendary Hitachi reliability. Designed to preserve data for long durations, HCP carries built-in data protection mechanisms and evolves smoothly as storage technologies change.


[session] The Right #Microservices | @CloudExpo @IBMDevOps #AI #DevOps

We all know that end users experience the internet primarily through mobile devices. From an app development perspective, successfully responding to the needs of mobile customers depends on rapid DevOps – failing fast, in short, until the right solution evolves in your customers’ relationship to your business. Whether you’re decomposing an SOA monolith or developing a new application cloud-natively, it’s not a question of whether to use microservices – not doing so is a path to eventual business failure. The real and more difficult question in developing microservices-based applications is this: what’s the best combination of cloud services and tools to get the right results in the specific business situation in which you need to deliver what your end users want? Considering that new streams of IoT data are already raising the stakes on what end users expect in their mobile experiences, the versatility and power of cloud services will become the key to innovation that’s meaningful in the market.


Cloud security best practice: Security as a service or cloud security tooling?

A recent LinkedIn survey on cloud security and cloud adoption found that the single biggest impediment to moving to the public cloud was continued concern around security.

While there has been tremendous progress in the area of cloud security in recent years, another important finding of the survey was that legacy tools, reconfigured for use in the public cloud, just don’t work. This is mostly due to the nature of the cloud computing environment, especially its dynamic networking and workload agility.

The two major methodologies that have grown up to deal with these concerns are the development of security tools targeted specifically at cloud environments and the development of security as a service (SECaaS). In both cases a number of players have entered the fray, including legacy security appliance manufacturers and cloud management platform developers.

On the tooling side, a number of legacy security tools have been reborn as cloud security virtual appliances, including firewalls, anti-virus and identity management tools. New cloud-purposed tools are also being rolled out, such as web application firewalls, network segmentation, and compliance checking. The SECaaS methodology instead calls for comprehensive security services delivered from a separate grid, and again a number of vendors are seeking footholds in this space.

The biggest selling points of tooling for cloud security are the ability to control your own environment and to roll out tools that, while they work differently than their legacy counterparts, are conceptually familiar. Among the reborn legacy tools, a virtual perimeter firewall looks and feels much like the physical firewall appliances that were rolled out in the data centre. With tooling, the security of the environment relies solely on the team configuring the appliances. Virtual security appliance vendors include Barracuda, Fortinet, Blue Coat and Cisco.
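
To make “conceptually familiar” concrete, the fragment below sketches what rolling out one virtual perimeter rule can look like. It uses an AWS security group via boto3 purely as an illustration – the group ID and CIDR range are hypothetical placeholders, and each of the vendors named above exposes its own interface – but the shape of the rule (protocol, port, source range) is the same shape a data-centre firewall administrator has always worked with.

```python
import boto3  # AWS SDK for Python

# Hypothetical IDs: substitute your own security group and office network.
ec2 = boto3.client("ec2", region_name="us-east-1")
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",           # placeholder virtual firewall
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,                        # allow HTTPS only
        "IpRanges": [{"CidrIp": "203.0.113.0/24",
                      "Description": "office network"}],
    }],
)
```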

With “cloud-born” tools such as network micro-segmentation, threat identification and compliance checking, the emphasis is no longer on securing the environment as a whole but on securing individual workloads. This is not familiar territory for the legacy security professional, but in many cases it is much more effective. Vendors in this space include VMware, Threat Stack and Alert Logic. Many of the major infrastructure vendors offer programs to assess and secure the environment based on tooling as part of a migration to the cloud, including IBM Cognitive Security and HP Enterprise Secure Cloud.
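
As a rough illustration of that workload-centric mindset, consider the toy policy check below. Every workload carries labels, traffic is denied by default, and a flow is allowed only when a policy explicitly matches the source labels, destination labels and port. This is a simplified sketch of the idea behind micro-segmentation, not any particular vendor’s implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Workload:
    name: str
    labels: frozenset          # e.g. frozenset({"tier:web"})

@dataclass(frozen=True)
class Policy:
    src: frozenset             # labels the source must carry
    dst: frozenset             # labels the destination must carry
    port: int

def is_allowed(policies, src, dst, port):
    """Default-deny: a flow passes only if some policy matches it."""
    return any(p.src <= src.labels and p.dst <= dst.labels and p.port == port
               for p in policies)

web = Workload("web-1", frozenset({"tier:web"}))
db = Workload("db-1", frozenset({"tier:db"}))
policies = [Policy(frozenset({"tier:web"}), frozenset({"tier:db"}), 5432)]

assert is_allowed(policies, web, db, 5432)      # web tier may reach the database
assert not is_allowed(policies, db, web, 443)   # anything unmatched is denied
```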

The major difference with SECaaS is the ability to offload the backend processing to a separate provider and run only a lightweight agent on each VM. This provides agility in securing workloads whether they move to different physical hardware or different data centres, or change in number. The agent serves as a translator between the backend service and the workload, executing the appropriate policies locally. SECaaS can provide all of the functions that appliances can, including segmentation, anti-virus, threat identification and compliance checking.
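
A rough sketch of that division of labour is below: the agent stays thin, pulling policy from the provider’s backend, enforcing it locally, and shipping events back for the heavy analysis. The endpoint, VM identity and policy format are all hypothetical, and a real agent would use an authenticated channel rather than naive polling.

```python
import json
import time
import urllib.request

BACKEND = "https://secaas.example.com/api"   # hypothetical SECaaS backend
VM_ID = "vm-42"                              # placeholder workload identity

def fetch_policies():
    """Pull the current policy set for this VM from the backend."""
    with urllib.request.urlopen(f"{BACKEND}/policies?vm={VM_ID}") as resp:
        return json.load(resp)

def enforce(policy):
    """Translate a backend policy into a local action (stubbed out here)."""
    print(f"enforcing {policy['type']} rule: {policy['rule']}")

def report(events):
    """Ship local observations back for server-side analysis."""
    body = json.dumps({"vm": VM_ID, "events": events}).encode()
    req = urllib.request.Request(f"{BACKEND}/events", data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

while True:                     # the agent itself stays lightweight:
    for policy in fetch_policies():
        enforce(policy)         # apply whatever the backend decides,
    report([])                  # hand observations back, and
    time.sleep(60)              # sleep until the next cycle
```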

Another benefit of SECaaS products is metered licensing. Much like the public cloud itself, payment for services is based on usage. The open questions around SECaaS – at least in my mind – revolve around an individual product’s ability to secure serverless or microservices-based applications, since these paradigms produce application execution environments that are constantly in flux.

Examples of SECaaS providers are Bitglass, AlienVault, Okta, Trend Micro, CloudPassage and Palerra (a division of Oracle). Most SECaaS providers focus on slices of the security pie such as IAM, encryption, anti-virus or compliance, but a few multi-faceted SECaaS solutions have recently begun to emerge (for instance CloudPassage Halo), and this is where the paradigm becomes really interesting. Still, adoption of SECaaS may face challenges similar to those of cloud adoption itself, because, in general, security professionals operate based on what they trust.

Security remains the most critical piece of architecting and implementing any computing environment. There is an increasing number of ways to secure public and hybrid cloud environments, which should lead to increased cloud adoption as enterprises become more comfortable. Whether through tooling or SECaaS, the key is planning for the security solution, or set of solutions, that best fits the enterprise and the services that enterprise will present.

One-year GDPR countdown is a final warning for organisations to sort out compliance

May 25, 2018 will see the General Data Protection Regulation (GDPR) come into effect.

Organisations will by now be more than aware of the penalties – 4% of annual turnover or €20 million (£17.3m), whichever is greater – and if not, take this as your final warning. But how are companies reacting to it?

Keyrus is a data intelligence and master data management (MDM) provider. The company has been putting its message out there at various events – including at the Information Builders Summit in London this week, with another at IBM at the end of the month – on how organisations need to protect themselves and what they can do about it.

Santiago Castro is head of business analytics at Keyrus’ UK practice. He explains that while each company is different in its requirements, there are other issues at play.

“You would like to have a one size fits all type solution, but one of the main points of GDPR is to understand what the purpose is of holding data and processing data,” he says. “It also depends on what your contractual situation is with your customers; some customers agree or allow organisations to hold data for these purposes while others don’t have that agreement.”

Naturally, with the one-year mark looming, some organisations have put together a few best practice ideas of their own. Skyhigh Networks, for example, issued a new eBook earlier this week titled ‘The GDPR: An Action Guide for IT’. The cloud access security broker (CASB) offers companies the chance to assess their GDPR ‘risk’ rating, as well as advanced encryption for structured and unstructured data.

Keyrus has put together a seven-step methodology for customers: first understand what is needed, assess the gaps that must be filled for compliance, then look at the risks, plan what needs to happen first, and move on from there. It’s ‘awareness to assessment to prioritising to planning to implementation’, as Castro puts it.

Sheila Fitzpatrick, chief privacy officer at NetApp, puts it this way. “Companies of all sizes need to take an active look at what data they hold, what they use it for, and where it’s stored,” she said. “They can then use this insight to conduct a comprehensive review of data privacy policies, consents, processes and so on to ensure they are meeting the minimum legal requirements.” Castro adds that in some cases, consent from customers will suffice.

The key aspect, however, is to treat GDPR not as a potential disaster looming on the horizon, but as an opportunity. “I often try to see it more as an opportunity than as a pain or cost,” says Castro, “because if you actually understand data assets, they can get more valuable, so this is an investment to do something with the data you hold.”

This is backed up by Rogelio Aguilar, senior consultant at Sungard Availability Services. “Businesses should approach the next year as a great opportunity to drive increased value,” he said. “A correct GDPR implementation will help businesses manage data privacy risk, implement good record management practices, streamline business processes, increase resilience as well as benefit from cost savings and ultimately a more competitive market position.

“To take advantage of these opportunities and mitigate risk, senior management must champion GDPR as a strategic initiative.”

Read more: Why you need to understand GDPR now – and what you need to do from here

How cloud operators can help mitigate the onerous tasks of GDPR responsibilities

With the onset of GDPR (General Data Protection Regulation) in May 2018, data protection requirements will become more stringent. The responsibilities placed on an organisation relating to the data it holds will be two-fold:

  • As a data controller (where the organisation enters and maintains personal data), the organisation must comply with rules concerning consent, access and transferability
  • As a data processor (where the organisation holds data on its own servers), it must follow the regulation by ensuring strong cyber security, physical hardware security, strict backup regimes, firewalls and auditing. For example, a data processor is responsible for monitoring access to the physical equipment on which the data sits, and the route the data takes to be processed. A good way of doing this is to produce an access control policy, which clearly sets out the roles and rights of staff members, allowing only staff with sufficient rights to access systems (a minimal sketch of this idea follows below)
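
To give a feel for what such an access control policy might look like when enforced in software, here is a minimal role-based sketch. The roles, rights and names are purely hypothetical, and a real policy would also cover the physical access and auditing obligations described above.

```python
# Illustrative roles and rights only; default-deny throughout.
ROLE_RIGHTS = {
    "dba":       {"read_personal_data", "modify_schema"},
    "support":   {"read_personal_data"},
    "developer": set(),        # no access to production personal data
}

STAFF_ROLES = {
    "alice": "dba",
    "bob": "developer",
}

def can_access(user: str, right: str) -> bool:
    """A member of staff gets a right only via an explicitly granted role."""
    role = STAFF_ROLES.get(user)
    return right in ROLE_RIGHTS.get(role, set())

assert can_access("alice", "read_personal_data")       # sufficient rights
assert not can_access("bob", "read_personal_data")     # denied by default
```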

What’s an organisation to do? The answer is to either remain a full data processor – with the responsibilities that come with that – or to outsource all its IT.  An example of the latter is outsourcing to a hosted desktop provider that is accredited under ISO 27001, as it will already have policies and procedures in place which will cover the requirements of a data processor under GDPR.

Security tools previously affordable only for large organisations can now be deployed by SMEs – affordable because the costs are shared among the users of the outsourcing company’s secure data centre. Services include robust firewalls, enterprise-quality antivirus and web filtering, optional encryption of sent emails, and management of all access devices (smartphones, tablets, laptops, desktops or thin clients) used by staff.

Outsourcing the storage, backups, security and processing of data to a company that complies with strict data protection regulations will ease the processing responsibility; “ease” because the organisation will still need to make sure that paper copies aren’t left lying around and that staff are given adequate authorisation to manage access to the data. However, the bulk of an organisation’s responsibility under GDPR’s data processor requirements can be safely left in the hands of the professionals at the outsourcing company.

Hybrid solutions, whereby an external IT company manages in-house equipment, can also work, but in such instances one needs to be particularly careful to use a very reputable IT company. For a hybrid IT solution, using the wrong kind of support company may hinder rather than help.

Let’s consider the following two scenarios: (i) the data storage is remote but the processing is local (i.e. on the organisation’s own servers) – in this case, the organisation will still be considered a processor; (ii) the organisation brings in an IT provider to manage the servers, but the servers are owned by the organisation – in this case, the organisation will still carry the responsibilities of a processor. IT providers cannot typically take responsibility for ensuring that the personal data customers hold is GDPR compliant, and therefore the organisation must ensure that the data held complies with the rules.

However, when it comes to processing responsibilities, the burden of compliance will fall somewhere between the organisation and its IT provider. What an organisation must ensure is that it is working in perfect synergy with its IT provider in setting out the GDPR processing responsibilities. They need joint access policies, joint security policies and so on.

In summary, outsourcing all of the IT can greatly simplify the GDPR management process. A hybrid solution can also be GDPR compliant, but the organisation must be extremely diligent about which IT vendor it chooses as a partner, to ensure that nothing falls through the proverbial cracks of GDPR’s processes and procedures.

Certifications for Cloud Professionals

Cloud companies are growing at a rapid pace, which means they’re constantly going to need cloud professionals to manage their infrastructure and client projects. This translates into more opportunities for cloud professionals. In fact, this profession is likely to be the future of IT.

Given these opportunities, it makes sense for IT professionals to move to the cloud. The best way to do that is by getting hands-on knowledge about cloud applications, infrastructure, management, deployment and more.

One way to acquire this knowledge is through self-study, but you also need to show potential employers that you’re proficient in this area. A good way to do both is through certifications: you learn new things and, at the same time, you have proof of your expertise on the subject.

Given these advantages, many companies now offer certification courses. Let’s look at some of them. The certifications below are listed in alphabetical order, so the choice of certification simply depends on what you want to do and the niche you want to carve for yourself.

Atlassian Certified JIRA Administrator

This is a fairly tough certification that demonstrates your ability to administer the popular JIRA platform. It is perfect for anyone who wants to become the administrator of mission-critical applications based on JIRA. According to Indeed, average salaries for JIRA administrators range from $70,000 to $95,000 per year.

AWS Certified DevOps Engineer

DevOps is a thriving field, as it combines development with operations. In other words, you not only code and develop applications, but you also manage them. With this skill set, you act as a bridge between the development and administration teams, putting your knowledge to good use to streamline and accelerate deployment.

To get into this interesting line of work, the AWS Certified DevOps Engineer certification is a good fit. The average salary for people with this certification can be upwards of $100,000 a year.

CompTIA Cloud+

If you’re looking for broad competence in cloud technologies, then this is the one for you. It’s a vendor-neutral certification that tests your general knowledge and understanding of the cloud, without going into specific niche areas.

This makes the certification a good starting point for technology-agnostic, general cloud training. It’s great if you’re just starting out and want to learn as much as you can about cloud technologies in general before deciding on the specific area you want to take up.

Oracle Database Cloud Administrator

An Oracle implementation specialist credential is a prerequisite for this certification. As the name suggests, it helps you learn more about cloud databases and how to manage them efficiently. The average salary is around $87,000, according to the 2017 IT Skills and Salary Survey.

In short, if you want to become a cloud professional, choose one or more of these certifications, as they’re sure to give you an edge over your competitors.


Online Demo and Webinars for Parallels Mac Management for Microsoft SCCM

Reserve your seat today for a Parallels Mac Management for Microsoft SCCM demo or a one-hour deep dive held by our Sales Engineer Danny Knox. In a demo, we will give you a comprehensive overview of all the essential details in 30 minutes and answer all your questions. In a deep dive session, we will show you […]


Announcing @Loom_Systems to Exhibit at @CloudExpo NY | #AI #DX #DevOps

SYS-CON Events announced today that Loom Systems will exhibit at SYS-CON’s 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY. Founded in 2015, Loom Systems delivers an advanced AI solution to predict and prevent problems in the digital business. Loom stands alone in the industry as an AI analysis platform requiring no prior math knowledge from operators, enabling existing staff to succeed in the digital era. With offices in San Francisco and Tel Aviv, Loom Systems works with customers across industries around the world.
