Technology is advancing all around us, from the TVs in our homes to the smartphones we use so mindlessly every day, and these devices are becoming ever more connected to each other. Technological advancement and connectivity are driving forces in businesses large and small, minimizing cost and effort while maximizing productivity. Here are just a few of the newest ways in which technology is advancing business.
Monthly Archives: June 2015
AgilData Sneak Peek By @CodeFutures at @CloudExpo | #IoT #API #DevOps #Microservices
The web app is Agile. The REST API is Agile. The testing and planning are Agile. But alas, data infrastructures certainly are not. Once an application matures, changing the shape or indexing scheme of its data forces, at best, a top-down planning exercise and, at worst, schema changes that force downtime. The time has come for a new approach that fundamentally advances the agility of distributed data infrastructures.
Come learn about a new solution to the problems faced by software organizations as their products become successful and the data grows beyond the initial scope of the database, whether it starts out as relational SQL, NoSQL, or somewhere in between.
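The expand/contract (dual-write) pattern is one common way to evolve a schema without the downtime described above. The sketch below illustrates the three phases over in-memory records; the field names and helper functions are hypothetical, not AgilData's implementation:

```python
# A minimal sketch of the expand/contract migration pattern, using
# in-memory dicts as stand-in "rows". All names here are invented
# for illustration; a real migration runs against a live database.

def expand(rows):
    # Phase 1: add the new field alongside the old ones (additive, safe,
    # no downtime: old readers and writers are unaffected).
    for row in rows:
        row.setdefault("full_name", f"{row['first']} {row['last']}")
    return rows

def dual_write(row, first, last):
    # Phase 2: writers populate both old and new shapes while readers
    # are gradually migrated to the new field.
    row["first"], row["last"] = first, last
    row["full_name"] = f"{first} {last}"
    return row

def contract(rows):
    # Phase 3: once nothing reads the old fields, drop them.
    for row in rows:
        row.pop("first", None)
        row.pop("last", None)
    return rows
```

The key point is that each phase is backward compatible on its own, so the application stays up throughout the change.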
Oracle Q4 cloud revenues grow 29%, down 5% overall
Oracle Corporation has announced its fiscal 2015 Q4 earnings, unveiling impressive growth for its PaaS and SaaS business, up 29% on last year. Overall revenue, however, came in at $10.7 billion, down 5% year on year.
After a bullish announcement of its Q3 results in March, when Oracle boss Larry Ellison publicly called out rival Salesforce, the software giant posted Software and Cloud business revenues of $8.4bn, down 6% year on year, while its SaaS and PaaS revenues came in at $416m.
Announcing the decline in revenues, Oracle was quick to point the finger at the strengthening of the US dollar against international currencies: it claimed that at constant currency, total revenues would have been up 3%, software and cloud revenues up 2%, and SaaS and PaaS growth 35% rather than 29% year on year.
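For readers unfamiliar with the accounting, a constant-currency figure restates current revenue at the prior year's exchange rate, so that currency swings don't mask underlying business performance. A minimal sketch with invented numbers (not Oracle's actual figures):

```python
# Hypothetical illustration of constant-currency growth. The revenue
# and FX numbers below are invented for the example.

def constant_currency_growth(rev_now_usd, rev_prior_usd, fx_now, fx_prior):
    """Growth rate with current revenue restated at last year's FX rate.

    fx_now / fx_prior are units of foreign currency per USD; a rising
    value means a stronger dollar, which deflates reported USD revenue.
    """
    restated = rev_now_usd * (fx_now / fx_prior)
    return (restated - rev_prior_usd) / rev_prior_usd

# Example: revenue of $129m vs $100m a year ago is 29% as reported,
# but about 35% once a ~4.65% dollar appreciation is stripped out.
reported = (129.0 - 100.0) / 100.0                          # 0.29
adjusted = constant_currency_growth(129.0, 100.0, 1.0465, 1.0)
```

This is why the gap between Oracle's reported and constant-currency figures widens as the dollar strengthens.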
Oracle CEO Safra Catz is expecting the growth of its SaaS and PaaS revenues to kick up a notch in fiscal year 2016.
“We sold an astonishing $426 million of new SaaS and PaaS annually recurring cloud subscription revenue in Q4,” she said. “We expect our rapidly increasing cloud sales to quickly translate into significantly more revenue and profits for Oracle Corporation. For example, SaaS and PaaS revenues grew at a 34% constant currency rate in our just-completed Q4, but we expect that revenue growth rate to jump to around 60% in constant currency this new fiscal year.”
In highlighting his firm’s ambition for the coming fiscal year, Ellison again took the chance to name-check one of Oracle’s main competitors.
“We expect to book between $1.5 and $2.0 billion of new SaaS and PaaS business this fiscal year,” he said. “That means Oracle would sell more new SaaS and PaaS business than salesforce.com plans to sell in their current fiscal year – the only remaining question is how much more. Oracle’s planned SaaS and PaaS revenue growth is around 60% in constant currency; salesforce.com has a planned growth rate of around 20%. When you contrast those growth rates it becomes clear that Oracle is on its way to becoming the world’s largest enterprise cloud company.”
IT Transforming into the Cloud Enabler by @CiscoCloud | @CloudExpo #IoT #API #DevOps
Business as usual for IT is evolving into a service-by-service “make or buy” conversation with input from the lines of business (LOBs). How does your organization move forward with cloud?
In his general session at 16th Cloud Expo, Paul Maravei, Regional Sales Manager, Hybrid Cloud and Managed Services at Cisco, discussed how Cisco and its partners offer a market-leading portfolio and ecosystem of cloud infrastructure and application services that allow you to uniquely and securely combine cloud business applications and services across multiple cloud delivery models.
Allaying Your Fears with Cloud Hosting | @CloudExpo #Cloud
As more and more businesses choose to depend on the cloud, it is not unreasonable to be cynical and doubt the evolution. Are businesses risking it all by moving their data to the cloud? Or is this a totally thought-out move?
The days of hosting websites and applications on your own in-house server are numbered. The recent RightScale State of the Cloud report found that close to 88% of businesses today make use of the public cloud, while 63% are on a private cloud. If you are a business that is yet to make the move and still has unanswered questions, this article will attempt to allay those fears.
Google Discloses Cloud Information
Recently, Google has been talking more about the companies that use its cloud business, as well as revealing information about its computing resources, which may be the largest on the planet, surpassing even Amazon Web Services. This information includes Google's ultra-fast fiber network, its big data resources, and the computers and software it has built for itself.
The aim of these disclosures is to present Google as better equipped to handle the biggest computational workloads than a company such as Amazon Web Services. This follows earlier moves by Google Cloud Platform to show off its data analysis capabilities.
Details like the ability to pass information between Europe and the United States in less than 100 milliseconds, and a practice of fully backing up user data in nine different locations, make Google seem innovative and cutting-edge.
At an event on Tuesday, Google Cloud Platform will announce HTC as a customer. The company has utilized Google’s services to construct computing architecture that enables smartphone apps to update data fast and reliably to many devices at once, and appear efficient even when the phones are in areas of poor reception.
On Wednesday, a Google executive is expected to present a look at the overall network design, including key tools that enable large-scale management of computing devices around the globe. John Song, senior director of engineering at HTC, says: “We are managing two million to three million smartphones in this network. Google is the only player in cloud that owns lots of fiber-optic cable worldwide, and it replicates its users' data in nine different places at all times.”
While Song did consider other companies like Amazon and Microsoft, Google's technical dedication made it stand out. Google crunches large amounts of data and has already begun to branch out into specialties such as genomics. Google Cloud Platform currently has 90 points of presence worldwide (locations where a company's computers get direct access to the Internet and a local telecommunications service provider), compared with Amazon's 53.
The post Google Discloses Cloud Information appeared first on Cloud News Daily.
Parallels RAS Growing Into a Virtualization Powerhouse
Cost-effectiveness, easy deployment and auto-configuration of powerful virtualization features are the three important aspects that have made Parallels RAS a leading virtualization solution in recent times. The recent acquisition of 2X RAS by Parallels has significantly empowered the product in terms of brand value, market reach, and performance, transforming it into a virtualization powerhouse. Growing […]
The post Parallels RAS Growing Into a Virtualization Powerhouse appeared first on Parallels Blog.
How EU legislation impacts data processing in the cloud
(c)iStock.com/Ramberg
Last week, the European Union agreed on proposed Data Protection Regulations that potentially impact all organisations that either use or process the personal data of EU citizens. There will now be further consultations before these become statute, but for the first time these will be regulations rather than directives, which means that individual EU member states will have little room for interpretation in how they are applied.
This has implications for IT service providers, SaaS providers and cloud providers, and for their customers. Under the current directives, third-party organisations that store data on behalf of others have limited responsibilities as “processors” rather than “controllers” of data. But under the new proposals, individuals will be able to seek legal redress against any organisation they believe has misused their data, and against any third party that processed that data. In addition, the EU may be able to fine those who breach the regulations, with a maximum potential fine of two percent of global turnover.
In practice it will mean that the safeguarding of personal data will become even more important, and that organisations will have to extend their diligence into investigating the controls and processes deployed by any third party they trust to process data on their behalf. Businesses must now implement “privacy by design”. How this will work in practice is still being debated, but with increasing amounts of sensitive data available online, companies will be expected to be more aware of privacy and better able to build it into their IT platforms and into any outsourcing relationships.
Larger processors of data will need to appoint a Data Protection Officer and they will need to evidence transparent processes that deal with:
- Controls to mitigate risks
- Data security breach and communication around such incidents
- Impact and risk assessment around the use of personal data
- Staff training and awareness
- The deletion of personal data or “Right to be Forgotten”
In turn this means that businesses engaging with service providers should ascertain that these partners have:
- Appropriate tools to ensure the physical and logical security of their data; ranging from secure data centres with appropriate access controls, through to logical controls like firewalls, web application firewalls, intrusion detection or file integrity monitoring
- Processes that control access to and management of data; for example secured logical access to networks or devices, or best practices around server image hardening and patching
- Processes and tools that facilitate audit and investigation; for example the review and storage of device logging data; transparent monitoring and reporting; or the willingness to allow a 3rd party audit of systems and processes
- Processes and tools for the identification and erasure of records, including secure destruction of storage and backup media
- A demonstrable commitment to staff training and culture of data security.
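One of the controls named above, file integrity monitoring, is easy to illustrate: a baseline of cryptographic digests is recorded, and later scans flag any file whose digest has changed. The sketch below is a minimal illustration of the idea, not any particular vendor's product; real tools add scheduling, alerting, and tamper-resistant storage of the baseline:

```python
# A minimal file integrity monitoring sketch: record SHA-256 digests,
# then compare a later scan against that baseline.
import hashlib
from pathlib import Path

def snapshot(paths):
    """Map each file path to the SHA-256 digest of its contents."""
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in paths}

def changed_files(baseline, current):
    """Paths in the current scan whose digest differs from the baseline
    (or that were absent from the baseline entirely)."""
    return sorted(p for p, digest in current.items()
                  if baseline.get(p) != digest)
```

An auditor asking for evidence of this control would expect to see the baseline protected from the same accounts that can modify the monitored files.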
Microsoft and HP top data centre infrastructure market, claims research
(c)iStock.com/gogo_b
Over the last four quarters, spend on data centre infrastructure hit $114 billion – and Microsoft and Hewlett Packard are the vendors at the top of the tree.
That’s the verdict from industry analysts Synergy Research. Microsoft is particularly dominant in the software market, with a 72% share compared to VMware's 17%. Yet software comprises less than a quarter (23%) of the overall data centre infrastructure market.
On the hardware side, HP (19%) is the largest vendor, followed by Cisco (12%), Dell (12%) and IBM (11%). Synergy describes HP’s lead in data centre hardware as “strong”, and notes other leading players in the market are EMC, Lenovo, NetApp, Oracle, Fujitsu and Hitachi.
Picture credit: Synergy Research
It’s worth noting that this examines the overall market, both traditional on-prem and cloud, the former of which Synergy describes as “huge”. Synergy's definition of the market covers servers, server OS, storage, networking, network security and virtualisation applications.
Despite the behemoth-like size of the traditional data centre market, cloud is catching up, according to Synergy founder and chief analyst Jeremy Duke.
“Clearly the single biggest driver of spend on data centre infrastructure is the boom in cloud computing,” he said. “The shift in computing workloads to public and private clouds is driving huge investments in both service provider and enterprise data centres.” John Dinsdale, a chief analyst and research director at Synergy, argued that the industry's focus on cloud obscures the size of the traditional data centre market. “That part of the data centre market remains enormous and will remain a prime source of revenue for vendors for many years to come,” he said.
Synergy’s other research has predominantly focused on the infrastructure and collaboration markets; the most recent report on the cloud infrastructure services market saw Amazon Web Services hit a five year high, while analysis of the cloud infrastructure equipment market saw Cisco and HP leading the way.
Earlier this week, a report from Zenium Technology Partners found that half of the organisations polled do not operate a data centre that could continue to function after a natural disaster.
Cisco Live 2015 Recap: IoT, Digital Age, Wireless Updates & More!
The GreenPages/LogicsOne Team landed at Cisco Live last week and spent the days soaking up new tech, new trends, and developing a sense of where the market is headed with everything Cisco.
Digital Age Keynote
John Chambers gave an incredible keynote (and also took a picture with my colleague Nick Phelps! See below). He's a very commanding speaker with a great vision. He highlighted that 90% of companies believe they should become digital, yet only 7% have a plan for how to do so. That is our market in a bottle. It's estimated that in 10 years, 40% of enterprise companies won't exist anymore. In 1950 the average company lifespan was 45 years; by 2010 it was only 10. The reason? People keep doing what they have been doing, for doing's sake. It's time to step up and make change, to disrupt, or run the risk of being disrupted.
IoE/IoT
The Internet of Everything and the Internet of Things were once again a big hit with people at Cisco Live. Cisco estimates that of the 7 billion people on Earth, 4 billion have cell phones while only 3.5 billion have toothbrushes; that's how badly people want apps, app-based lifestyles, and apps with sensors. On average, 50,000 new apps launch every week. The IoT sessions emphasized different ways to apply the concept of universal connectivity to solving modern problems: demos ranged from a train configured to detect and change a signal to prevent a hypothetical crash, to a recently developed walking stick that lets the blind sense their surroundings by announcing crosswalks, traffic-light status, and the number of stairs ahead.
Meraki
Meraki is getting some serious development and is growing like crazy. Cisco is continuing to offer a two-week (extendable to six-week) proof-of-concept demo, risk- and cost-free for any size deal, from a single access point to an entire site design of 50 devices including APs, switches, and firewalls. In 75% of these try-and-buy engagements, customers keep the gear and often buy more.
- The MX/Firewall appliance has had limitations with VPN support in the past, but has been updated to support 3rd party VPN connections, a visual dashboard with VPN traffic usage visibility, and a topology mode. GreenPages can enable the customer to manage and rapidly deploy this multisite VPN firewall solution out to hundreds of locations.
- Cisco is applying its iWAN portfolio to the Meraki MX firewalls! Cisco Intelligent WAN (iWAN) is a collection of Cisco technologies that provide redundancy similar to an MPLS network without much of the cost. Meraki will soon support dual-active path support for VPN, and with PfR (Performance Routing) and PBR (Policy-Based Routing), a customer with two circuits can run VPN over both circuits at once without a load balancer, while allowing intelligent link selection based on factors like policy, latency, or loss.
- Sourcefire's AMP (Advanced Malware Protection) is coming to the MX firewall as well! This anti-malware protection, centralized at the network firewall, gives great visibility into which files, both malicious and benign, are passing through.
- Cisco ISE (Identity Services Engine) is now compatible with all Meraki devices, in addition to the traditional Cisco product line of switches, routers, and access points. ISE lets a customer centrally apply a profile-detecting policy that rivals Microsoft's RADIUS offering for port-level wired, wireless, and VPN security access. Hundreds to thousands of access points, site core switches, and remote-site firewalls in an enterprise environment can be updated from a single dashboard for agility and dynamic security.
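The intelligent link selection that PfR enables can be sketched in a few lines: given live latency and loss measurements for each VPN path, pick the one that satisfies a per-application policy. The code below is a conceptual illustration only (the path names, thresholds, and fallback rule are invented), not Meraki's or Cisco's actual algorithm:

```python
# A hypothetical sketch of policy-driven WAN path selection.
# Each path is a dict like {"name": ..., "latency_ms": ..., "loss_pct": ...}.

def pick_path(paths, max_latency_ms, max_loss_pct):
    """Return the first path meeting the policy; if none qualifies,
    fall back to the path with the least loss (then lowest latency)."""
    for p in paths:
        if p["latency_ms"] <= max_latency_ms and p["loss_pct"] <= max_loss_pct:
            return p["name"]
    return min(paths, key=lambda p: (p["loss_pct"], p["latency_ms"]))["name"]

paths = [
    {"name": "mpls",      "latency_ms": 80, "loss_pct": 0.0},
    {"name": "broadband", "latency_ms": 25, "loss_pct": 2.5},
]
```

A voice policy (tight loss budget) and a bulk-transfer policy (tight latency budget) can thus steer traffic onto different circuits at the same time, which is the point of running VPN over both links at once.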
Wireless
- Cisco is soon introducing full Wave 2 AC wireless. The upcoming 1902i and 2902i access points bring a max speed of 2.3Gbps and, more notably, MU-MIMO wireless technology.
- 2.3Gbps is a big deal. Think about it: 90% of customer client machines connect over existing 1Gb cabling or the latest 1.3Gb wireless. With the new wireless nearly twice as fast, it can make more sense to go wireless instead of cabling clients at all.
- MU-MIMO stands for Multi-User MIMO (Multiple Input, Multiple Output). Wireless clients currently have to “share the air”, transmitting one at a time across channels. This can lead to bottlenecks, complex configurations, and having to choose between coverage and capacity. MU-MIMO allows multiple wireless clients to communicate over wireless channels at once, so the entire wireless spectrum can be consumed constantly, giving all those packets much more highway. Combine that with the increased transmission speed, and I feel confident saying that wireless could disrupt physical cabling and introduce a wave of the “all-wireless office”.
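A quick back-of-the-envelope check of the speed claim above: the arithmetic below compares line-rate transfer times for a hypothetical 10 GB file over 1 Gbps wired versus 2.3 Gbps Wave 2 wireless (real-world throughput is lower on both, so treat these as best-case figures):

```python
# Line-rate transfer time comparison; figures are idealized and ignore
# protocol overhead, contention, and signal conditions.

def transfer_seconds(size_gigabytes, link_gbps):
    """Seconds to move size_gigabytes over a link_gbps link
    (multiply by 8 to convert gigabytes to gigabits)."""
    return size_gigabytes * 8 / link_gbps

wired = transfer_seconds(10, 1.0)   # 80 seconds at 1 Gbps
wave2 = transfer_seconds(10, 2.3)   # roughly 35 seconds at 2.3 Gbps
```

Even as an idealized comparison, it shows why a 2.3 Gbps radio changes the wired-versus-wireless calculus for client access.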
NBASE-T
- With wireless APs capable of up to 2.3Gbps comes the need for faster cabling, but no one wants to spend the time or money recabling. Let's face it: Ethernet is the last cabling we're going to pull. Enter NBASE-T, two additional Ethernet speeds of 2.5Gbps and 5Gbps that run on the existing copper Ethernet cabling customers already have. This has the potential to be huge, allowing high-density wireless with very limited cabling and complementing the new APs' high-density capabilities.
- Also, think big picture here. Think how the market will respond to this. Manufacturers will want to build network cards for client workstations that use the same Ethernet cabling at 2.5x or 5x the speed. We could see a shift away from a static 1Gbps wired speed to the client, toward an auto-detecting spectrum of 100Mbps, 1Gbps, 2.5Gbps, 5Gbps, and 10Gbps, all over existing cabling! This will let us keep up with the high bandwidth demands of our applications, both internal and external. There are some cabling distance limitations; a chart showing that info is below.
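The auto-detecting spectrum described above can be sketched as a simple negotiation: both ends advertise the rates they support, and the link comes up at the highest rate common to both. This is a conceptual illustration only; real auto-negotiation (per IEEE 802.3) also accounts for cable quality and distance:

```python
# A hypothetical sketch of speed negotiation across the NBASE-T ladder.
# Rates are in Gbps; 2.5 and 5.0 are the new NBASE-T additions.

NBASE_T_LADDER = [0.1, 1.0, 2.5, 5.0, 10.0]

def negotiate(switch_rates_gbps, nic_rates_gbps):
    """Highest rate supported by both ends, or None if none match."""
    common = set(switch_rates_gbps) & set(nic_rates_gbps)
    return max(common) if common else None
```

For example, a multigigabit switch port paired with a NIC that tops out at 2.5Gbps would settle at 2.5Gbps over the same old copper run.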
Overall, it was a great event. If you’d like to talk in more detail about news that came out of the event or how you can take advantage of any of them in your environment, reach out!
By Dan Allen, Architect