How Quantum Computing with DNA Storage Will Affect Your Health | @CloudExpo #Cloud #Storage

Moore’s Law, which holds that processing power will double every two years as more and more silicon transistors are crammed onto chips, has been faltering since the early 2000s, when it began to run up against fundamental limits imposed by the laws of thermodynamics. While the chip industry, with Intel leading the charge, has found ways to sidestep those limits until now, many are saying that despite the industry’s best efforts, the stunning gains in processor speeds will not be seen again through the simple application of Moore’s Law. In fact, there is evidence that we are reaching a plateau in the number of transistors that fit on a single chip; Intel has even suggested silicon transistors can only keep getting smaller for another five years. As a result, Intel has turned to other techniques to improve processing speeds, such as adding multiple processing cores. These methods are only a temporary solution, however, because programs can benefit from multi-processor systems only up to a certain point.
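
That ceiling is the classic Amdahl’s law argument: the serial fraction of a program bounds the speedup that any number of cores can deliver. A minimal sketch, assuming a workload that is 90% parallelizable (an illustrative figure, not a measurement):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup for parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a program that is 90% parallelizable tops out at a 10x speedup,
# no matter how many cores you add.
for cores in (2, 4, 8, 16, 64, 1024):
    print(f"{cores:>5} cores -> {amdahl_speedup(0.9, cores):.2f}x speedup")
```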

read more

Cloud Management Tools Still Elusive | @CloudExpo #API #AWS #Cloud

Google’s Cloud team this week discussed multicloud integration between their own platform and AWS using Google Cloud Endpoints and AWS Lambda. Cloud Endpoints lets you develop open APIs, which you can then call from Lambda. This may be the way forward for businesses eager to integrate their multiple public cloud providers. We’re interested in the network dependency that’s now been introduced as the two providers connect. We’ll see what other integration or multicloud management tools emerge as this trend grows.
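
As a rough sketch of the pattern: an API published through Cloud Endpoints is, from Lambda’s point of view, just an HTTPS service. The handler below is hypothetical; the endpoint URL, project name, and API key are placeholders, and passing the key as a query parameter is only one common Endpoints configuration:

```python
# Hypothetical AWS Lambda handler calling an API exposed via Google Cloud
# Endpoints. The URL and key below are placeholders, not real resources.
import json
import os
import urllib.request

ENDPOINT_URL = "https://my-api.endpoints.my-project.cloud.goog/v1/status"  # placeholder
API_KEY = os.environ.get("ENDPOINTS_API_KEY", "")  # supplied via Lambda env config

def lambda_handler(event, context):
    # Cloud Endpoints commonly accepts an API key as a query parameter.
    req = urllib.request.Request(f"{ENDPOINT_URL}?key={API_KEY}")
    with urllib.request.urlopen(req, timeout=5) as resp:
        body = json.loads(resp.read())
    # Hand the cross-cloud result back to whatever invoked the Lambda.
    return {"statusCode": 200, "body": json.dumps(body)}
```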

read more

Dell EMC Announce Azure Stack Hybrid Cloud Solution | @CloudExpo @DellEMC #Cloud #Azure

Dell EMC have announced their Microsoft Azure Stack hybrid cloud platform solutions. This announcement builds upon earlier statements of support and intention by Dell EMC to be part of the Microsoft Azure Stack community. For those of you who are not familiar, Azure Stack is an on-premises extension of the Microsoft Azure public cloud.
What this means is that you can essentially have the Microsoft Azure experience (or a subset of it) in your own data center or data infrastructure, enabling cloud experiences and capabilities at your own pace, your own way, and with control. Learn more about Microsoft Azure Stack, including my experiences with installing Technical Preview 3 (TP3), here.

read more

AT&T moving databases and application workloads to Oracle’s cloud

Oracle and AT&T have announced a strategic agreement whereby the telco will move thousands of its large-scale internal databases to Oracle’s infrastructure as a service (IaaS) and platform as a service (PaaS).

AT&T will gain global access to Oracle’s cloud portfolio, both in the public cloud and on AT&T’s Integrated Cloud, including Oracle’s IaaS, PaaS, database as a service, and software as a service. This will help ‘increase productivity, reduce IT costs and enable AT&T to gain new flexibility in how it implements SaaS applications across its global enterprise’, Oracle added.

“This is an historic agreement,” said Mark Hurd, CEO of Oracle in a statement. “The Oracle Cloud will enable AT&T to use Oracle technology more efficiently across every layer of the technology stack. This includes AT&T’s massive redeployment of Oracle databases, which will be provisioned entirely from the Oracle Cloud Platform including our highly cost effective Exadata as a Service.”

AT&T partners with other cloud companies outside of Oracle. CEO Randall Stephenson took to the stage at IBM’s InterConnect event in Las Vegas back in March, arguing the importance of the Armonk firm’s ‘enterprise strong’ cloud message. “I don’t believe we’re more than three or four years away from being indistinguishable from the ‘data cloud’ to the ‘network cloud’…we’re riding off what you guys are doing,” he said.

The company also renewed its vows with Amazon Web Services (AWS) last October to ‘help both existing and new customers more efficiently migrate to and utilise the AWS cloud with the AT&T network’.

This announcement can also be analysed in the context of Verizon selling its cloud and managed hosting services arm to IBM earlier this week. While the companies described it as ‘a unique cooperation between two tech leaders’, others saw it differently, with John Dinsdale, chief analyst at Synergy Research, noting last year, when Verizon shut down part of its cloud service, that “telcos generally are having to take a back seat on cloud.”

AT&T and Oracle Enter Into an Agreement

It’s raining agreements in the cloud sector, and the latest is the one Oracle has entered into with AT&T. Under the terms of this agreement, AT&T will move thousands of its large-scale internal databases to Oracle’s infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) offerings.

The deal is a win-win for both companies. For AT&T, it brings access to Oracle’s cloud services portfolio and all the tools that come with it. Specifically, this agreement will allow AT&T to optimize the scheduling and dispatch of its field technicians.

Currently, AT&T employs more than 70,000 field technicians and wants to make the most of their services. To make the best use of these technicians’ skills and availability, it plans to combine its own machine learning and big data capabilities with Oracle’s cloud technology. Through this combination, AT&T aims to increase the overall productivity and efficiency of its workers and ensure that technicians arrive on time for service requests.

Right now, that’s one of its customers’ chief complaints, as the company only gives a two-hour window for a technician’s arrival. That can be too large a gap; customers could plan their day better if they knew the exact arrival time. This is why AT&T wants to provide accurate time slots and ensure that its technicians arrive at the given time.

This accuracy depends on the work duration of each job. For example, suppose a technician starts his first service call at 9:00 AM and takes half an hour to finish it. Add another 15 minutes of travel time, and the next call can begin no earlier than 9:45 AM. If the system inaccurately predicts that the technician can finish the job in 20 minutes, he will not be able to keep the next appointment. This is why both the duration of each job and the overall schedule of the technicians have to be considered, and AT&T is doing just that with advanced technology.
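
The arithmetic is easy to make concrete. A toy sketch, where the durations and the helper are illustrative assumptions rather than AT&T’s actual scheduling model:

```python
# Toy model of the dispatch arithmetic described above.
from datetime import datetime, timedelta

def next_slot(start: datetime, job_minutes: int, travel_minutes: int) -> datetime:
    """Earliest start time for the following job."""
    return start + timedelta(minutes=job_minutes + travel_minutes)

first_job = datetime(2017, 6, 1, 9, 0)    # technician starts at 9:00 AM
actual = next_slot(first_job, 30, 15)     # real duration: 30 min job + 15 min travel
predicted = next_slot(first_job, 20, 15)  # model underestimates the job at 20 min

print(actual.strftime("%I:%M %p"))        # 09:45 AM -- the feasible next slot
print(predicted.strftime("%I:%M %p"))     # 09:35 AM -- a slot the technician cannot make
```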

This is a significant agreement for Oracle too, as it looks to expand its cloud-based offerings. Such deep collaborations with existing clients provide more opportunities for Oracle to extend those offerings. Interestingly, Oracle entered into agreements with VMware and Equinix on the same day. These three agreements could greatly boost Oracle’s revenue and, more importantly, give it a firmer hold in the competitive cloud market.

One significant trend we’ve been noticing is the flexibility that companies like AT&T have when entering into agreements with cloud providers. AT&T, for example, has previously deepened its commitments with both Amazon Web Services and IBM to handle cloud services for networking, mobility, security, analytics and the Internet of Things. This is a heartening trend, as companies can pick and choose the provider they want for each division or service, without having to rely on a single company to provide it all.

Overall, this agreement is expected to deliver rich benefits to both Oracle and AT&T, and will hopefully improve service and offerings for AT&T’s end customers.

The post AT&T and Oracle Enter Into an Agreement appeared first on Cloud News Daily.

Supermicro to Exhibit at @CloudExpo NY & CA | #DataCenter #SDN #SDDC #AI

SYS-CON Events announced today that Super Micro Computer, Inc., a global leader in compute, storage and networking technologies, will exhibit at SYS-CON’s 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY.
Supermicro (NASDAQ: SMCI), the leading innovator in high-performance, high-efficiency server technology, is a premier provider of advanced server Building Block Solutions® for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, HPC and Embedded Systems worldwide. Supermicro is committed to protecting the environment through its “We Keep IT Green®” initiative and provides customers with the most energy-efficient, environmentally friendly solutions available on the market.

read more

How hyperconvergence is driving today’s hyper-marketer

Every IT industry news website, whitepaper, analyst report, and e-shot seems to reference ‘hyperconvergence’. As the public cloud continues to grow, so does the negative press whenever any form of security breach occurs on its network. Many business leaders, at firms big and small, are now looking to systems that offer the commercial and technical agility of the public cloud whilst ensuring any data that is core to their business remains on-premise, with anything on the periphery being pushed out to the public cloud.

Hybrid cloud environments have been in operation for many years, but the difference with hyperconverged infrastructure, such as Nutanix, is that it takes a much more agnostic approach to server, storage, networking and virtualisation, with more effort focused on a software-defined architecture for deployment and integration. If deployed effectively, hyperconverged networks can also be very competitive on cost compared to their public cloud rivals.

Another significant and disruptive topic gathering momentum across the industry is GDPR. Once the General Data Protection Regulation (GDPR) comes into effect on 25th May 2018, the rules of engagement will change dramatically, and as a consequence so will the way in which data needs to be handled, particularly for marketing departments and agencies. Obtaining consent from the person whose data a business holds is going to be a major factor: there will be no more ‘opt out’ of a database, as businesses will have to systematically ask people if they want to ‘opt in’. They’ll then have to provide the necessary evidence that a clear ‘yes’ has been communicated by the data owner, rather than an ambiguous ‘well, they didn’t say no’ approach.
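
To make that evidence requirement concrete, an auditable opt-in record might capture something like the fields below. This is a hypothetical schema sketched for illustration; GDPR mandates the outcome (a demonstrable, explicit ‘yes’), not any particular set of fields:

```python
# One possible shape for an auditable opt-in consent record (illustrative only).
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    subject_email: str
    purpose: str           # e.g. "monthly product newsletter"
    opted_in: bool         # must be an explicit True; absence of a 'no' is not consent
    source: str            # where the 'yes' was given, e.g. "signup form v2"
    recorded_at: datetime  # when the consent was captured

record = ConsentRecord(
    subject_email="jane@example.com",
    purpose="monthly product newsletter",
    opted_in=True,
    source="signup form v2",
    recorded_at=datetime.now(timezone.utc),
)
```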

Companies that specialise in selling contact databases for marketing campaigns will probably no longer exist in their current form as a result of the GDPR changes. This change will be immediate from May 2018 in the B2C world, but once all the criteria are defined it will also play a major part in the B2B market, so companies need to act now.

This is just skimming the surface of a very complex area, but any business breaching the new rules will be fined heavily for breaking the law: up to 4% of global revenues or €20 million, whichever is greater. A fine of this magnitude could put many firms out of business. Under current legislation these fines are capped at £500k, so when TalkTalk was in the press for a security breach involving the theft of over 150,000 customer records last year, it was reportedly fined around £400k. Under the new rules that figure could have been closer to £70m, similar to the £80m the company lost in the first quarter following the breach through an 11% fall in share price and the loss of over 100,000 customers.
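
A quick back-of-the-envelope check of those figures; the revenue and exchange-rate numbers below are rough assumptions for illustration, not official filings:

```python
# Old UK cap vs. the GDPR regime: the greater of 4% of global revenue or EUR 20m.
def gdpr_max_fine(global_revenue: float, eur_20m_equivalent: float) -> float:
    """GDPR maximum fine: the greater of 4% of global revenue or the EUR 20m floor."""
    return max(0.04 * global_revenue, eur_20m_equivalent)

OLD_UK_CAP_GBP = 500_000        # pre-GDPR cap under the Data Protection Act
TALKTALK_REVENUE_GBP = 1.8e9    # roughly GBP 1.8bn annual revenue (assumption)
EUR_20M_IN_GBP = 17.5e6         # rough currency conversion (assumption)

print(f"Old cap:   GBP {OLD_UK_CAP_GBP:,.0f}")
print(f"GDPR max:  GBP {gdpr_max_fine(TALKTALK_REVENUE_GBP, EUR_20M_IN_GBP):,.0f}")
# 4% of ~1.8bn is ~72m, consistent with the 'closer to GBP 70m' figure above.
```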

Born-in-the-cloud marketing automation platforms are also driving disruption as businesses look to automate and orchestrate the way they engage with their target audiences. As with businesses’ view of hyperconverged infrastructure, marketers don’t really care what technology a platform sits on. What matters is how it can shorten the buyer journey, build buyer personas, and create actionable reporting that lets marketing departments focus on the campaigns delivering the greatest return. These platforms also give businesses the agility of multiple agencies’ resources, delivered through an internal team of marketers or sales enablers. That leaves more time for defining strategy based on qualitative, actionable intelligence, and therefore decreases the cost of customer acquisition.

The omnichannel world of social media, e-shots, web, automation, mobile, CRM and the like is also fusing into a hyperconverged, software-centric architecture. Marketers need to work with IT, finance and the broader business to aggregate operations into a single interface where actionable, informed information becomes the output for driving business change and growth. It is important to bring the whole business along on that journey by informing it of what your customers’ needs are. If everyone in the business understands the customer, the experience throughout the whole path to purchase will be much more positive, which in turn leads to increased loyalty.

Changes to GDPR and marketing automation, balanced with the rise of storytelling, are defining the future direction of inbound marketing. Companies are reviewing consumer search terms, downloads, page visits, social conversation, heatmaps and satisfaction surveys to really understand what customers want. Essentially, the success of inbound marketing is the result of much more targeted and relevant content being shared with those who have a specific need, which is why marketing teams are creating buyer personas to ensure the right information is shared with the right person at the right time. Next year’s GDPR changes will really fan the inbound marketing flames, as companies will need to push the boundaries of inbound techniques and ‘own’ their data rather than rely on the traditional outbound approach that leaned so heavily on generic data lists.

Over the years marketers have always had to adapt to change and embrace technology advances in order to succeed. With digital engagement, customer experience and value shaping the future of marketing, change is happening at a rapid rate. As a result, a new breed of collaborative hyper-marketer is emerging, one that will need to accelerate that pace of change in today’s hyperconverged workplace.

How to Improve Excel Performance in a VM

Microsoft® Office for Windows® is the gold standard for a productivity suite, and the spreadsheet in that suite, Excel, is the gold standard of spreadsheet applications. So, it is not surprising that many Parallels Desktop users turn to Excel® for their spreadsheet needs, and getting the best performance out of Excel is important to them. […]

The post How to Improve Excel Performance in a VM appeared first on Parallels Blog.

DDN Storage “Bronze Sponsor” of @CloudExpo | #SDN #AI #DX #DataCenter

As cloud adoption continues to transform business, today’s global enterprises are challenged with managing a growing amount of information living outside the data center. The rapid adoption of IoT and an increasingly mobile workforce are exacerbating the problem. Ensuring secure data sharing and efficient backup raises capacity and bandwidth considerations, as well as policy and regulatory compliance issues.

read more
