Explores three ways to help development teams bear the burden of security: use pen test results to harden the application, leverage service virtualization for security scenarios, and adopt policy-driven development to help engineers understand and satisfy management’s security expectations.
The move to the cloud brings a number of new security challenges, but the application remains your last line of defense. Engineers are well positioned to perform the tasks critical to securing the application, provided that certain key obstacles are overcome.
Monthly Archive: December 2013
SAP Releases New SAP Business ByDesign Solution
SAP AG on Wednesday announced new capabilities for the SAP Business ByDesign® solution, reaffirming its commitment to providing independent companies and subsidiaries worldwide with a flexible, cost-effective mid-market cloud enterprise resource planning (ERP) suite. The updates leverage SAP HANA®, mobile and cloud technology, and enable businesses to rapidly adapt processes to capitalize on ever-changing market dynamics.
“We continue to honor our commitment to SAP Business ByDesign customers and leverage the strength of the SAP Cloud portfolio and SAP HANA to deliver innovations that help them run efficiently and effectively their entire business in the cloud,” said David Sweetman, senior director, Global Product Marketing, SAP Business ByDesign. “The new capabilities in the latest release help companies manage more data and transactions in more places with greater speed and simplicity.”
Three Ways to Sell Desktop-as-a-Service
Cloud service providers are up against several key challenges when it comes to selling desktop virtualization technology. Few prospective buyers understand the benefits of desktop-as-a-service (DaaS), and licensing concerns – and their rising costs – are not going away any time soon. Knowing the dos and don’ts of selling desktop-as-a-service before diving in can solve problems on the front end while ensuring DaaS adoption in companies of all shapes and sizes.
1. Turn IT administrators into allies, not opponents.
Among the myriad reasons enterprises cite for turning to desktop-as-a-service (DaaS) technology, a few prove instrumental in driving adoption. When weighing the decision drivers behind DaaS, it is important to consider what actually motivates a switch away from traditional desktop computing, and who must be motivated. While some of the most progressive executives at small and mid-sized businesses may themselves drive such a switch, they are not typically the individuals interacting with DaaS providers. Those who head a company’s IT department, on the other hand, have many potential reasons to spearhead a move from traditional desktops to virtualized, hosted Windows desktops. The time, energy and frustration DaaS can spare those who actually administer end-user computing for organizations with dozens – or hundreds – of desktops can be significant. If these individuals understand the benefits they stand to gain from more efficient DaaS technology, it follows that heads of IT would be among the strongest advocates for the shift.
VaultLogix Invests in Customers by Partnering with EMC
VaultLogix on Wednesday announced that it has purchased and is implementing EMC’s Isilon scale-out NAS storage products to ensure that its services for customers are the fastest, most reliable, and most secure in the industry.
EMC’s Isilon offers massive scalability with what EMC bills as the world’s fastest-performing NAS. In addition, it provides resilient data protection for a highly available environment, along with robust security and data encryption options.
VaultLogix will be using Isilon as part of its services that help clients receive:
Fully automated backup solutions that move data to an offsite data center for rapid file restoration.
New BMC Marketplace Accelerates Enterprise App Revolution
BMC Software on Tuesday introduced BMC Marketplace, a cloud-based app store for companies that want to deploy their own “Amazon-like” branded marketplace to market and sell private-label mobile, cloud, custom and desktop applications – twice as fast, BMC claims, as any custom alternative.
App stores are very well known in the consumer world, and now their popularity is rapidly increasing in enterprise environments. Gartner recently predicted that 25 percent of large companies will deploy an enterprise app store by 2017, partially in response to the bring-your-own-device (BYOD) movement that is sweeping the workplace.
How to Put Your Storage Cloud to Work
Cloud storage has become a preferred choice for organizations that want top-tier infrastructure: capacity is effectively limitless, and providers invest heavily in security. Those qualities alone, however, are not why businesses adopt it. More and more organizations are turning to cloud storage because it offers flexibility and keeps pace with constantly changing business demands. The fact that it is economical and easy to use is also a decisive factor, which is perhaps why individuals, too, are turning to cloud storage and finding innovative uses for it.
Because the cloud can store enormous amounts of data, people are using it for personal purposes, backing up memories such as photos, videos, voice recordings and even family histories. Millions find cloud storage a boon for storing email attachments, and while free plans offer plenty of features, many are willing to pay for premium plans to keep their data safe and to get customized storage. For professionals who need to organize and store data for research, surveys and calculations, it is an ideal solution.
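As a rough illustration of the personal backup use case described above, here is a minimal sketch that copies a local folder of photos into an S3-compatible object store. It assumes modern boto3 tooling with credentials already configured; the bucket name and folder path are placeholders invented for this example, not part of any product mentioned here.

import os

import boto3  # assumes an S3-compatible object store and already-configured credentials

def backup_folder(local_dir, bucket, prefix="photo-backup"):
    # Upload every file under local_dir to the bucket, preserving relative paths.
    s3 = boto3.client("s3")
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            key = prefix + "/" + os.path.relpath(path, local_dir)
            s3.upload_file(path, bucket, key)  # boto3's managed upload call
            print("uploaded", path, "->", "s3://" + bucket + "/" + key)

if __name__ == "__main__":
    # Hypothetical folder and bucket; substitute your own.
    backup_folder(os.path.expanduser("~/Pictures"), "my-backup-bucket")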
The 2013 Tech Industry – A Year in Review
By Chris Ward, CTO, LogicsOne
As 2013 comes to a close and we begin to look forward to what 2014 will bring, I wanted to take a few minutes to reflect back on the past year. We’ve been talking a lot about that evil word ‘cloud’ for the past 3 to 4 years, but this year put a couple of other terms up in lights, including Software Defined X (Datacenter, Networking, Storage, etc.) and Big Data. Like ‘cloud,’ these two newer terms can mean different things to different people, but in simple terms, and in my opinion, there are generic definitions that apply in almost all cases. Software Defined X is essentially the concept of taking any ties to specific vendor hardware out of the equation and providing a central, vendor-agnostic point of configuration – vendor-agnostic except, of course, for the vendor providing the Software Defined solution. I define Big Data simply as the ability to find a very specific and small needle of data in an incredibly large haystack within a reasonably short amount of time. I see both of these technologies becoming more widely adopted in short order, with Big Data technologies already well on the way.
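To make that needle-in-a-haystack definition concrete, here is a minimal, hypothetical sketch of the pattern big data engines optimize: scanning a very large, line-oriented data set in parallel for the few records that match a predicate. The file name and the “needle” string are invented for illustration; real tools such as Hadoop or Spark do the same thing distributed across a cluster rather than across local processes.

from multiprocessing import Pool

NEEDLE = "ORDER-7731991"  # the one record we care about (made up for illustration)

def find_needles(lines):
    # Scan one chunk of the haystack and return only the matching records.
    return [line for line in lines if NEEDLE in line]

def chunked(path, size=100000):
    # Yield the file in fixed-size chunks of lines so workers can scan in parallel.
    chunk = []
    with open(path) as f:
        for line in f:
            chunk.append(line)
            if len(chunk) == size:
                yield chunk
                chunk = []
    if chunk:
        yield chunk

if __name__ == "__main__":
    with Pool() as pool:
        for matches in pool.imap_unordered(find_needles, chunked("transactions.log")):
            for match in matches:
                print(match.strip())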
As for our friend ‘the cloud,’ 2013 did see a good amount of growth in consumption of cloud services, specifically in the areas of Software as a Service (SaaS) and Infrastructure as a Service (IaaS). IT has adopted a ‘virtualization first’ strategy over the past 3 to 4 years when it comes to bringing any new workloads into the datacenter, and I anticipate we’ll see a ‘SaaS first’ approach adopted in short order, if it is not out there already. However, I can’t necessarily say the same for ‘IaaS first.’ While IaaS is a great solution for elastic computing, I still see most usage confined to application development or very large scale-out applications (think Netflix). The mass adoption of IaaS for simply forklifting existing workloads out of the private datacenter and into the public cloud simply hasn’t happened. Why? My opinion is that for traditional applications, neither the cost nor the operational model makes sense – yet.
In relation to ‘cloud,’ I did see a lot of adoption of advanced automation, orchestration, and management tools, and thus an uptick in ‘private clouds.’ There are some fantastic tools now available, both commercial and open source, and I absolutely expect this adoption trend to continue, especially in the Enterprise space. Datacenters with a high rate of change, whether in production or test/dev, can benefit greatly from these solutions. However, this comes with a word of caution – just because you can doesn’t mean you should. I say this because I have seen several instances where customers wanted to automate literally everything in their environments. While that may sound good on the surface, I don’t believe it’s always the right thing to do. There are still times when a human touch remains the best way to go.
As always, there were some big-time announcements from major players in the industry. Here are some posts we did with news and update summaries from VMworld, VMware Partner Exchange, EMC World, Cisco Live and Citrix Synergy. Here’s an additional video from September where Lou Rossi, our VP, Technical Services, explains some new Cisco product announcements. We also hosted a webinar (which you can download here) about VMware’s Horizon Suite, as well as a webinar on our own Cloud Management as a Service offering.
The past few years have seen various predictions about the unsustainability of Moore’s Law, which holds that processors will double in computing power every 18-24 months, and 2013 was no exception. The latest prediction is that by 2020 we’ll reach the 7nm mark and that exponential scaling will end. The interesting part is that this prediction is not based on technical limitations but rather economic ones: getting below the 7nm mark will be extremely expensive from a manufacturing perspective – and, hey, 640K of RAM is all anyone will ever need, right?
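For context, the arithmetic behind that 2020 estimate is simple to reproduce. The sketch below assumes (purely for illustration) a 22nm leading-edge process in 2013 and one density doubling every two years, which shrinks linear feature size by roughly a factor of 1/sqrt(2) per generation.

import math

# Back-of-the-envelope check on the "7nm around 2020" prediction.
start_year, start_nm, target_nm = 2013, 22.0, 7.0   # assumed starting node and target
shrink_per_generation = 1 / math.sqrt(2)            # ~0.707x linear shrink per density doubling
years_per_generation = 2                            # assumed Moore's Law cadence

generations = math.log(target_nm / start_nm) / math.log(shrink_per_generation)
year = start_year + generations * years_per_generation
print("about %.1f generations, landing around %d" % (generations, round(year)))
# -> roughly 3.3 generations, i.e. the 2019-2020 timeframe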
Probably the biggest news of 2013 was the revelation that the National Security Agency (NSA) had undertaken a massive surveillance program and seemed to be capturing every packet of data coming in or out of the US across the Internet. I won’t get into any political discussion here, but suffice it to say this is probably the largest example of ‘big data’ in existence today. It also has large potential ramifications for public cloud adoption: security and data integrity have been two of the major roadblocks to adoption, so it certainly doesn’t help that customers may now be concerned about the NSA eavesdropping on everything going on within public datacenters. It is estimated that public cloud providers may lose as much as $22-35B over the next 3 years as customers slow adoption in response. The only good news, at least for now, is that it’s very doubtful the NSA or anyone else on the planet has the means to actually mine anywhere close to 100% of the data being captured. However, like anything else, it’s probably only a matter of time.
What do you think the biggest news/advancements of 2013 were? I would be interested in your thoughts as well.
Register for our upcoming webinar on December 19th to learn how you can free up your IT team to be working on more strategic projects (while cutting costs!).
Oracle announces sponsorship of OpenStack Foundation
Tech giant Oracle has announced corporate sponsorship of the OpenStack Foundation, clearing the path for integration between the open source software and its various clouds.
The Redwood Shores-based firm plans to integrate a broad set of its products with OpenStack technology, including Oracle’s virtual machine software, its infrastructure-as-a-service offering, its Exalogic elastic cloud, and its storage and compute clouds.
Edward Screven, chief corporate architect at Oracle, said in a statement: “Oracle is pleased to join the OpenStack Foundation and plans to integrate OpenStack capabilities into a broad set of Oracle products and cloud services.
“Our goal is to give customers greater choice and flexibility in how they use Oracle products and services in public and private clouds,” he added.
Mark Collier, OpenStack Foundation chief operating officer, said: “We welcome Oracle to the OpenStack community, and look forward to innovative contributions from their many domain experts …
Red Hat Pathway to IT Modernization
Over time, every IT portfolio gets bogged down in the chaos of servers, platforms, and software that make up the current IT landscape. Legacy systems, disparate architectures, and aging technologies slowly eat away at returns, reduce your ability to respond to shifting demands, and limit how quickly you can scale to meet new market opportunities. IT modernization lets you reboot your portfolio for greater agility, reduced cost, and operational excellence. This frees up your time and money to focus on innovation, not implementation.
The Changing Face of Disaster Recovery in the Age of Cloud Computing
Disaster Recovery (DR) has typically been reserved for applications deemed mission critical, because organizations didn’t want to incur the expense of DR for less important applications. Today, thanks to cloud computing, many organizations are considering DR for as many applications as the business deems essential.
DR in the cloud is still a relatively new concept and, like many technology trends we’ve seen so far, it comes with a lot of hype and misinterpretation. Multiple schools of thought exist on whether or not to implement DR in the cloud.