Beacon Essentials You Must Quickly Learn | @ThingsExpo #IoT #M2M #Microservices

Our resident Cognizant digital/mobile expert, Peter Rogers, asked me to recommend a digital strategies topic to share, and I suggested Beacons for this week. I confess to reading about them daily without knowing much about them, so I want to thank Peter for this article! Enjoy!
Beacons do not push out notifications. They broadcast an advertisement of themselves (traditionally their UUID, major and minor values) and can be detected by Bluetooth Low Energy (BLE) devices.
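
For the curious, here is what that advertisement actually contains. In the common iBeacon format, the UUID, major and minor values are packed into the manufacturer-specific data of a BLE advertising packet, and a listening device simply decodes those bytes. The following Python sketch parses such a payload; the example values are made up for illustration.

```python
import struct
import uuid

def parse_ibeacon(manufacturer_data: bytes):
    """Parse the 23-byte iBeacon payload that follows Apple's company ID (0x004C):
    type (1), length (1), proximity UUID (16), major (2), minor (2), tx power (1)."""
    if len(manufacturer_data) < 23:
        return None
    beacon_type, length = manufacturer_data[0], manufacturer_data[1]
    if beacon_type != 0x02 or length != 0x15:
        return None  # not an iBeacon advertisement
    proximity_uuid = uuid.UUID(bytes=manufacturer_data[2:18])
    major, minor = struct.unpack(">HH", manufacturer_data[18:22])
    tx_power = struct.unpack(">b", manufacturer_data[22:23])[0]  # signed dBm at 1 m
    return {"uuid": proximity_uuid, "major": major, "minor": minor, "tx_power": tx_power}

# Example payload (hypothetical values) as a BLE listener might receive it:
payload = bytes.fromhex("0215" + "f7826da64fa24e988024bc5b71e0893e" + "0001" + "0002" + "c5")
print(parse_ibeacon(payload))
```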

read more

Adaptive Two-Factor Authentication: Is It All It’s Cracked up to Be? | @CloudExpo #Cloud

It’s a given that employee access to corporate systems should be both as secure and as simple as possible. Until recently, however, time-strapped CIOs, under pressure from demanding staff and challenged with authenticating users all over the world on multiple devices, have been torn between the fatally flawed password and hard-token two-factor authentication (2FA) to keep their systems secure.
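
For readers unfamiliar with what the hard (or soft) token actually does, most of them implement a time-based one-time password (TOTP, RFC 6238): an HMAC over a shared secret and the current 30-second time window, truncated to a short numeric code. The sketch below shows the computation; the secret is illustrative, not a real provisioning key.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password: HMAC-SHA1 over the current
    30-second counter, dynamically truncated to a short numeric code."""
    key = base64.b32decode(shared_secret_b32, casefold=True)
    counter = int(time.time()) // interval
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Illustrative only: the secret below is made up, not a real provisioning key.
print(totp("JBSWY3DPEHPK3PXP"))
```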

read more

Autodesk launches $100m PaaS developer programme

CAD software maker Autodesk has put $100m on the table and challenged developers to create new cloud-friendly design automation systems on its Forge development platform.

In an official statement on its website, Autodesk, famous for its AutoCAD computer-aided design (CAD) system, explains that it wants the cloud-based Forge system to catalyse a change in the way products are designed, made and used. The Forge scheme was announced at Autodesk University, the company’s conference in Las Vegas.

The initiative consists of three major components: a platform-as-a-service (PaaS) offering, a developer programme and a $100 million investment fund.

The Forge Platform is a set of cloud services that span early-stage design, engineering, visualization, collaboration, production and operations. It offers open application programming interfaces (APIs) and software development kits (SDKs) for developers wanting to build cloud-powered apps and services. The Forge Developer Program provides training, resources and support, and will host an inaugural Forge Developer Conference in June 2016. In addition to financial support for companies that qualify for the developer fund, Autodesk will provide business and technical support.

The logic of the scheme is that the industrial production methods used to design, make and use products are changing, and new technologies disrupt every aspect of the product lifecycle. This can make production risky: investments are poured into the creation of a new product line, only for the market to be destroyed by some other invention before the manufacturer can launch its products.

Cloud computing, by creating more flexible, fluid economies of design and manufacture, could help CAD systems adapt to the new market conditions, according to Amar Hanspal, senior VP of Products at Autodesk. “Autodesk is launching Forge to help developers build new businesses in the changing manufacturing landscape,” said Hanspal. “We are inviting innovators to take advantage of Autodesk’s cloud platform to build services that turn today’s disconnected technologies into highly connected, personalised experiences.”

Autodesk itself has evolved as manufacturing has changed. It started by creating computer-aided design and manufacturing (CAD/CAM) software, which engineers used to create parts on screen before manufacturing them. In later years it has moved into product lifecycle management (PLM) systems and offers services such as simulation and modelling. It has also taken on a stronger mobile and cloud focus with offerings such as AutoCAD 360, a mobile companion to AutoCAD that engineers can use to call up blueprints while away from their desks.

The end of the artisan world of IT computing

We are all working toward an era of autonomics ‒ a time when machines not only automate key processes and tasks, but truly begin to analyse and make decisions for themselves. We are on the cusp of a golden age for our ability to utilise the capacity of the machines that we create.

There is a lot of research into autonomic cloud computing, and therefore a lot of definitions of what it is. The definition from Webopedia probably does the best job of describing autonomic computing.

It is, it says: “A type of computing model in which the system is self-healing, self-configured, self-protected and self-managed. Designed to mimic the human body’s nervous system in that the autonomic nervous system acts and reacts to stimuli independent of the individual’s conscious input.

“An autonomic computing environment functions with a high level of artificial intelligence while remaining invisible to the users. Just as the human body acts and responds without the individual controlling functions (e.g., internal temperature rises and falls, breathing rate fluctuates, glands secrete hormones in response to stimulus), the autonomic computing environment operates organically in response to the input it collects.”
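
In practice, the self-healing behaviour in that definition boils down to a monitor-analyse-act control loop that the system runs against itself. The sketch below illustrates the pattern; check_health and restart are hypothetical stand-ins for real probes and orchestrator calls.

```python
import random
import time

def check_health(service: str) -> bool:
    """Hypothetical probe: a real system would hit a health endpoint or read
    metrics; here it simply simulates occasional failures."""
    return random.random() > 0.2

def restart(service: str) -> None:
    """Hypothetical remediation action standing in for an orchestrator call."""
    print(f"[autonomic] restarting {service}")

def control_loop(services, interval_seconds=10, max_cycles=3):
    """Monitor-analyse-act loop: the system observes its own state and reacts
    without operator input, which is the essence of 'self-healing'."""
    for _ in range(max_cycles):
        for service in services:
            if not check_health(service):
                restart(service)
        time.sleep(interval_seconds)

control_loop(["billing-api", "orders-db"], interval_seconds=1)
```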

Some of the features of autonomic computing are available today for organisations that have completed – or at least partly completed – their journey to the cloud. The more information that machines can interpret, the more opportunity they have to understand the world around them.

It spells the death of the artisan IT worker – a person working exclusively with one company, maintaining the servers and systems that keep it running. Today, the ‘cloud’ has turned computing on its head. Companies can access computing services and storage at the click of a button, with the scalability, agility and control to meet their exact needs. They pay for what they use and can scale up or down instantly. What’s more, they don’t need an army of IT artisans to keep the operation running.

This, of course, assumes that the applications leveraging the cloud have been developed to be cloud-native, using a model like the twelve-factor methodology developed by Adam Wiggins, who co-founded Heroku. However, many existing applications and the software stacks that support them can also use the cloud successfully.

More and more companies are beginning to realise the benefits that cloud can provide, whether private, public or hybrid. For start-ups, the decision is easy. They are ‘cloud first’ businesses with no overheads or legacy IT infrastructure to slow them down. For CIOs of larger organisations, it’s a different picture. They need to move from a complex, heterogeneous IT infrastructure to a highly orchestrated and automated – and ultimately highly scalable and autonomic – homogeneous new world.

CIOs are looking for companies with deep domain expertise as well as infrastructure at scale. In the switch to cloud services, the provision of managed services remains essential. To ensure a smooth and successful journey to the cloud, enterprises need a company that can bridge the gap between the heterogeneous and homogeneous infrastructure.

Using a trusted service provider to bridge that gap is vital to maintain a consistent service level for the business users who consume the hosted application. But a cloud user has many more choices to make in the provision of their services. Companies can take a ‘do it myself’ approach, where they are willing to outsource their web platform but keep control of testing and development. Alternatively, they can take a ‘do it with me’ approach, working closely with a provider in areas such as managed security and managed application services. This spreads responsibility between customer and provider in a split that can be decided at the outset of the contract.

In the final ‘do it for me’ scenario, trust in the service provider is absolute. It allows the enterprise customer to focus fully on business outcomes. As more services are brought into the automation layer, delivery picks up speed, which in turn means quick, predictable and high-quality service.

Hybrid cloud offers the ‘best of both worlds’. Companies are secure in the knowledge that their most valuable data assets remain either on-premises on the company’s own private servers or within a trusted hosting facility using isolated services. At the same time, they can rely on the flexibility of the cloud to provide computing services that can be scaled up or down at will, at a much better price point than would otherwise be the case.

Companies that learn to trust their service provider will get the best user experience. In essence, the provider must become an extension of the customer’s business, not operate on the fringes as a vendor.

People, processes and technology all go together to create an IT solution. But they need to be integrated across the company and the service provider as part of a cohesive solution that meets the company’s needs. The solution needs to be relevant today but able to evolve as business priorities change. Only then can we work toward a future where autonomics plays a much bigger part in our working lives.

Eventually, autonomic computing can evolve almost naturally, much like human intelligence has over the millennia. The only difference is that with cloud computing the advances will be made in years, not thousands of years. We are not there yet, but watch this space. In your lifetime, we are more than likely to make that breakthrough to lead us into a brave new world of cloud computing.

 

Written by Jamie Tyler, CenturyLink’s Director of Solutions Engineering, EMEA

Is Adobe axing Flash under cover of Creative Cloud?

As an official Adobe blog hailed a ‘new era’ for Flash Professional, the software company seems to be sidelining its creation.

Apple boss Steve Jobs once famously dismissed Flash as proprietary software from the PC age. Now Adobe appears to be admitting it doesn’t have a role in the age of the cloud. While updating readers on developments in its Creative Cloud, Adobe reveals that Flash Professional CC is to be re-branded as Adobe Animate CC in order to “more accurately reflect the content-formats produced by this tool.”

Flash has long been heavily criticised because its proprietary nature made it unsuitable for the web. Jobs said Apple would never consider Flash for any of its phones or tablets because “we know from painful experience that letting a third party layer of software come between the platform and the developer ultimately results in sub-standard apps.”

More recently, the heavy power demands Flash places on devices would make it unsuitable for the cloud era, while its lack of openness would, in Jobs’ words, “hinder the progress of the platform.”

Adobe explains in its blog that “open web standards and HTML5 have become the dominant standard” and that the “Flash Professional CC product team has embraced this movement by rewriting the tool from the ground up”. Adding native support for HTML5 Canvas and WebGL, in addition to supporting output to any format, was such a ‘huge hit’ with Adobe customers that, in a short space of time, a third of all content produced in Flash Professional CC is now HTML5-based, reaching over 1 billion devices worldwide.

The name change reflects the downgrading of Flash’s role in Creative Cloud. However, in another official blog post the vendor explains that Adobe Animate CC will continue to support Flash (SWF) and AIR formats ‘as first-class citizens’, as well as other formats like broadcast-quality video. “We will continue improving Animate CC’s HTML5 capabilities over time, while optimizing its core animation and authoring feature set,” said Rich Lee, senior product marketing manager for Creative Cloud web products.

Beyond openness, it was the lack of stability and security that dissuaded Apple from using Flash.

Flash was highlighted by Symantec for having one of the worst security records in 2009, and Steve Jobs said he knew first-hand that Flash was the top reason for Apple device crashes. “We don’t want to reduce the reliability and security of our iPhones, iPods and iPads by adding Flash,” Jobs once said. Now, it seems, Adobe has accepted that Flash isn’t right for the cloud.

HPE launches Edgeline systems and Aruba Sensors for IoT

Hewlett Packard Enterprise (HPE) has announced new systems that allow Internet of Things (IoT) deployments to decentralise their processing and devolve decision-making to the network edge. It has also unveiled plans to simplify the roll-out of IoT services with the Meridian cloud service.

HPE’s new Edgeline IoT Systems 10 and 20 are designed to sit at the network edge so that customers can aggregate and analyze data in real time, more quickly and securely, and control devices and things, it says. By keeping data localised, the systems can also alleviate traffic across the network.
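
The pattern HPE is describing is straightforward: reduce a window of raw readings to a compact summary at the edge, act locally where possible, and send only the summary upstream. A minimal sketch of that idea, with a hypothetical uplink step, looks like this:

```python
import statistics
from typing import Iterable

def summarise_readings(readings: Iterable[float]) -> dict:
    """Reduce a window of raw sensor samples to a compact summary so only a
    few numbers, rather than every sample, travel over the WAN."""
    values = list(readings)
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "max": max(values),
        "min": min(values),
    }

def publish_to_cloud(summary: dict) -> None:
    """Hypothetical uplink: a real gateway might POST this to an IoT hub."""
    print("uplink:", summary)

# One window of raw temperature samples collected locally at the edge:
window = [21.4, 21.6, 21.5, 29.8, 21.7]
summary = summarise_readings(window)
if summary["max"] > 25:          # local decision, no round trip to the cloud
    print("local alert: threshold exceeded")
publish_to_cloud(summary)
```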

The systems come in ruggedized, mobile and rack-mounted versions and are aimed at a variety of industry sectors, with HPE specifically naming the logistics, transport, health, government and retail sectors. The new systems are certified to work with Microsoft’s Azure IoT Suite.

The Edgeline range is the fruit of a partnership with Intel to create open systems for the IoT market. Models include the HPE IoT System EL10, a rugged, entry-level edge gateway designed with long-lifecycle components, and the pricier but more powerful HPE IoT System EL20. The Edgeline range is built on HPE’s Moonshot system architecture, which is designed to use less energy and space than typical servers.

Meanwhile, Aruba, an HPE company, has unveiled a cloud-based beacon management system for multivendor Wi-Fi networks.

The new enterprise-grade IoT Aruba Sensor is the next wave of Aruba’s Mobile Engagement strategy, it says. The sensors combine a small Wi-Fi client and a BLE radio, so organizations can remotely monitor and manage Aruba Beacons across existing multivendor Wi-Fi networks from a central location using the Meridian cloud service.

The upshot is that any company can introduce location-based services more easily, using Aruba Beacons and Sensors at the edge and the Meridian cloud service to interface with business and analytics applications.
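
Conceptually, each sensor periodically reports the beacons it can hear to the central service, which can then flag missing or misplaced beacons across every site. The sketch below shows what such a report might look like; the endpoint and payload shape are hypothetical and are not Aruba's actual Meridian API.

```python
import json
import time

# Hypothetical endpoint name, purely illustrative; not Aruba's real Meridian API.
MANAGEMENT_URL = "https://beacon-management.example.com/api/sightings"

def build_sighting_report(sensor_id: str, sightings: list) -> str:
    """Package the beacons this sensor can currently hear so a central cloud
    service can spot missing, moved or low-battery beacons across many sites."""
    return json.dumps({
        "sensor_id": sensor_id,
        "timestamp": int(time.time()),
        "sightings": sightings,   # each entry: UUID/major/minor plus signal strength
    })

report = build_sighting_report("sensor-lobby-01", [
    {"uuid": "f7826da6-4fa2-4e98-8024-bc5b71e0893e", "major": 1, "minor": 2, "rssi": -67},
])
print("would POST to", MANAGEMENT_URL)
print(report)
```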

HPE’s goal is to simplify IoT for customers and drive more business, according to Antonio Neri, HPE’s general manager. “The new solutions today are important elements of our strategy to deliver more connectivity and computing power at the edge, help customers maximize the value and minimize the risks from IoT at the speed of business,” said Neri.

Water, Data and Storage Analogy By @StorageIO | @CloudExpo #Cloud

Recently I did a piece over at InfoStor titled “Water, Data and Storage Analogy”. Besides being taken for granted, and all of us being dependent on them, several other similarities exist between water, data and storage. In addition to linking to that piece, this is a companion post with some different images to help show the similarities between water, data and storage, if for no other reason than to have a few moments of fun. Read the entire piece here.

read more

Customer Story: A PC Guy Switches to Mac with Parallels Desktop

The following post is a customer story submitted to our Advocacy program by Andy Cohen. We are incredibly thankful to Andy for sharing his story with us and allowing us to share it with you. Read on for Andy’s experience choosing and using Parallels Desktop. Name: Andy Cohen. Geography: Vienna, Virginia, USA. Industry: Software manufacturer. Role: Business Development Consultant, Anytrax. Meet […]

The post Customer Story: A PC Guy Switches to Mac with Parallels Desktop appeared first on Parallels Blog.

Puppies in Your Datacenter? | @CloudExpo #DevOps #BigData #Microservices

You may have heard about the pets vs. cattle discussion – a reference to the way application servers are deployed in the cloud-native world. If an application server goes down, it can simply be dropped from the mix and a new server added in its place. The practice so far has mostly been applied to application deployments.
Management software, on the other hand, is treated in a very special manner. Dedicated resources are set aside to run the management software components, and several alerting systems are deployed to watch the health of those components. Administrators spend hours each day managing the management infrastructure.
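
The ‘cattle’ approach can be summed up as reconciliation: compare the pool against a desired count, discard anything unhealthy and provision identical replacements rather than repairing individual servers. A minimal sketch, with hypothetical provision and health-check helpers:

```python
import uuid

def is_healthy(instance_id: str) -> bool:
    """Hypothetical probe; a real deployment would ask a load balancer or
    orchestrator rather than infer health from a naming convention."""
    return not instance_id.endswith("-bad")

def provision() -> str:
    """Hypothetical stand-in for an API call that boots a fresh, identical server."""
    return f"app-{uuid.uuid4().hex[:8]}"

def reconcile(instances: list[str], desired: int) -> list[str]:
    """Cattle, not pets: drop anything unhealthy and top the pool back up to
    the desired count instead of nursing individual servers back to health."""
    survivors = [i for i in instances if is_healthy(i)]
    while len(survivors) < desired:
        survivors.append(provision())
    return survivors

pool = ["app-1a2b3c4d", "app-5e6f7a8b-bad", "app-9c0d1e2f"]
print(reconcile(pool, desired=3))
```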

read more

Presidio Acquires Sequoia

Recently, Presidio, a managed service provider, bought Sequoia Worldwide, a consulting, integration, and devices firm, for an undisclosed amount.

Sequoia specializes in private and hybrid cloud implementations and will operate as Presidio’s cloud business unit under the name Sequoia Cloud Solutions. Presidio will use Sequoia’s expertise to expand its cloud computing business.

Bob Cagnazzi, CEO of Presidio, commented in a statement: “The acquisition of Sequoia continues our leadership in advising, architecting and implementing solutions that enable our customers to derive maximum value from their cloud investments. Sequoia’s cloud expertise enhances Presidio’s capability to advise customers on the optimal way to leverage both private and public cloud resources to develop the optimal hybrid model that will deliver scalable, secure, high performing service delivery models for our clients.”


While Presidio said it will draw on Sequoia’s experience with cloud technologies such as Cisco, Microsoft, and OpenStack, it also stated it will use Sequoia’s knowledge of cloud management and brokerage technology to help customers manage public cloud services.

Presidio is not the only one benefiting from this deal; Sequoia will gain access to resources through Presidio. The symbiotic relationship between the two companies allows them to establish an innovative cloud solutions practice and catalyze cloud adoption. Steve Hanney, co-founder and managing partner of Sequoia, said: “We are very excited about the opportunity to be part of Presidio. The addition of Sequoia will help Presidio drive innovation and improve business outcomes of its customers and business partners through innovative cloud services that accelerate time to market and create a competitive advantage.”

About Presidio

Presidio aims to provide an individualized cloud for different business models. It utilizes hybrid cloud services to accelerate the business benefits available through the cloud. Presidio’s partners include VMware, Cisco, and EMC.

About Sequoia

Founded in 1972, Sequoia has built an extensive cloud portfolio, making it extremely valuable to a company such as Presidio. According to the company website, it has partnered with the founders of companies that now have an aggregate public market value of over $1.4 trillion.

The post Presidio Acquires Sequoia appeared first on Cloud News Daily.