Secure-24 Gets Investment from Pamlico Capital

Secure-24, Inc. (“Secure-24”) and Pamlico Capital announced today that a Pamlico Capital-led investment group has made a growth equity investment in Secure-24. Founded in 2001, Secure-24 is an emerging leader in business-critical application hosting and cloud services, with a focus on ERP applications from SAP® and Oracle®. The co-founders, Volker Straub and Matthias Horch, will retain significant ownership of Secure-24 and will continue to manage the growth of the company in partnership with Pamlico Capital. The investment group included HarbourVest Partners. Detailed terms of the transaction have not been disclosed.

Secure-24 also announced that it has recruited industry veteran Michael Jennings to join the Secure-24 management team. Jennings was co-founder and CTO of Appshop, Inc., a leader in Oracle ERP hosting.

“We are extremely happy to partner with Pamlico because of its track record of supporting companies and entrepreneurs as a growth capital provider,” said Matthias Horch, Secure-24 CEO. “Secure-24 has seen tremendous demand for IT infrastructure and application outsourcing services and is poised to continue growing as companies build strategic partnerships to reduce costs and improve the operation of advanced, complex IT systems. The expertise and resources Pamlico brings to our team will be critical assets in accelerating innovation, growth and market leadership.”

Pamlico Partner Scott Stevens said, “We focus on making equity investments in growing, profitable, industry-leading technology companies. Secure-24 has demonstrated all the characteristics we look for: an experienced and highly capable management team, a strong competitive position, an industry-leading technology platform, and a superior service delivery model. Secure-24’s focus on helping companies increase business value through their IT investments positions the company for impressive growth in the years ahead.”


F5 Extends Dynamic Networking to Windows Server-Based Virtual Network Environments

F5 Networks, Inc. today announced the F5 Network Virtualization Solution for Microsoft Windows Server 2012 Hyper-V. The solution gives F5 customers the flexibility to use the BIG-IP platform to deploy network services in cloud-driven data centers that are built on Windows Server 2012 Hyper-V. This announcement underscores F5’s commitment to deliver a dynamic, efficient data center that will ensure scalability, security, and manageability across an organization’s IT environments and systems.

With this solution, the same network-based services that the BIG-IP platform provides—such as local and global load balancing, advanced traffic steering, access control, and application security and acceleration—can now also be used to deliver applications in the Microsoft cloud and virtualized network environments. The solution is enabled by F5 BIG-IP Local Traffic Manager (LTM®) Virtual Edition (VE) running on Windows Server 2012 Hyper-V.
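
To make the layer 4-7 behavior concrete, below is a minimal Python sketch of the kind of decision an application delivery controller such as BIG-IP LTM makes: inspect an HTTP request at layer 7, steer it to a pool, and load balance across pool members. The Pool class, pool names, rules, and addresses here are hypothetical illustrations, not F5’s configuration or API.

    # Minimal sketch of layer-7 steering plus round-robin load
    # balancing. Pool names, members, and the steer() rule are
    # hypothetical; F5's actual configuration (iRules, etc.) differs.
    from itertools import cycle

    class Pool:
        def __init__(self, name, members):
            self.name = name
            self._rr = cycle(members)   # round-robin over the members

        def next_member(self):
            return next(self._rr)

    # Hypothetical pools: e.g. one physical, one in a Hyper-V virtual
    # network reachable under the same addresses and policies.
    pools = {
        "app":    Pool("app",    ["10.0.0.1:80", "10.0.0.2:80"]),
        "static": Pool("static", ["10.0.1.1:80"]),
    }

    def steer(path):
        # Layer-7 decision: route on the URL path, then load balance.
        pool = pools["static"] if path.startswith("/static/") else pools["app"]
        return pool.next_member()

    print(steer("/static/logo.png"))   # 10.0.1.1:80
    print(steer("/checkout"))          # 10.0.0.1:80, then .2, alternating

Because decisions like these are made in the controller itself, traffic can be steered between physical and virtual networks without the large flat layer 2 topologies mentioned in the list below.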

Organizations that use Hyper-V network virtualization to realize cost savings and operational efficiencies stand to gain many additional benefits from the F5 solution, including:

  • Improved Flexibility – Working in conjunction with Hyper-V
    network virtualization, the F5 solution supports seamless, low-cost
    migration to the cloud by allowing organizations to use the same
    policies and IP addresses in the cloud that they currently use in the
    physical network.
  • Cost Savings – The F5 solution accelerates data center
    consolidation by connecting hybrid cloud environments, enabling
    organizations to cut costs while extending their applications and
    services.
  • Efficient Network Management – The F5 solution can
    intelligently manage network traffic at layers 4-7 (as sketched
    above), reducing the need for organizations to build and manage
    large layer 2 networks.
  • Streamlined ADN Services – The F5 solution runs on Windows
    Server 2012 Hyper-V, and all services are applied in BIG-IP LTM VE, so
    no software upgrades or special code is required on the physical
    network.


DeviceAnywhere’s Automated Mobile Testing Platform to Be Shown at StarWest Conference 2012

Keynote® Systems today announced that Bryan Segale, Director of Solutions for Keynote’s DeviceAnywhere platform, will conduct a technical presentation at Software Testing Analysis & Review West (StarWest) titled “Learn About Testing on Real Devices With All the Ease of Browser-based Testing,” on Thursday, October 4 at 1:30 p.m. at the Disneyland Hotel in Anaheim, Calif.

Segale will highlight the advantages of using automated testing tools for mobile devices, drawing on first-hand experience working with enterprises of all types in the mobile space and helping them design and implement manual and automated testing strategies that bring higher-quality applications to market faster. His technical presentation will focus on the benefits of browser-based testing with actual, live devices. “Learn About Testing on Real Devices With All the Ease of Browser-based Testing” will examine what it means to be able to remotely test devices using industry-leading tools, including how to:

  • Easily create scripts that operate across multiple mobile device
    models and operating systems (see the sketch after this list)
  • Efficiently automate tests, leveraging the latest in HTML5
  • Improve ROI for your mobile initiatives
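
As a rough illustration of the first point, here is a minimal Python sketch of one test script reused unchanged across several device profiles. The DeviceSession class and the device list are hypothetical stand-ins; DeviceAnywhere’s own scripting interface is not shown here.

    # Hypothetical sketch: one script, many device profiles. The
    # DeviceSession class stands in for a real remote-device API
    # (DeviceAnywhere's actual interface is not shown).
    DEVICES = [
        {"model": "iPhone 4S", "os": "iOS 5"},
        {"model": "Galaxy S III", "os": "Android 4.0"},
    ]

    class DeviceSession:
        """Stand-in for a session with a real, live handset."""
        def __init__(self, profile):
            self.profile = profile

        def open_url(self, url):
            print(f"[{self.profile['model']}] opening {url}")

        def assert_element(self, selector):
            # A real API would inspect the live device's screen.
            print(f"[{self.profile['model']}] found {selector!r}")

    def test_login_page(session):
        # The same test logic runs on every profile unchanged.
        session.open_url("https://example.com/login")
        session.assert_element("#username")
        session.assert_element("#password")

    for profile in DEVICES:
        test_login_page(DeviceSession(profile))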

WHAT: Technical Presentation: “Learn About Testing on Real Devices With All the Ease of Browser-based Testing”

WHEN: Thursday, October 4, 2012 from 1:30 – 2:30 p.m. PDT

WHERE: Disneyland Hotel – Booth 3

For more information about the StarWest event, visit: www.sqe.com/StarWest. If you are unable to attend the event and wish to view what we are showcasing, you can join us via StarWest Virtual Conference on October 3-4. To attend the event right from your desktop, register here: https://vts.inxpo.com/scripts/Server.nxp.


Cloudcuity™: Thought Leadership Translated to Operational Excellence

As my long-time readers have certainly noticed, the gaps between my posts have lengthened over the past few months. First, I would like to apologize for being remiss in that department, but I have had some extenuating circumstances.

For over three years I have been closely observing and actively participating in the government’s transition to cloud computing. Many of my observations, experiences and insights have been shared through “Cloud Musings”, “Cloud Musings on Forbes” and my first book, “GovCloud: Cloud Computing for the Business of Government”. In parallel, I have been designing strategies, building processes and creating transition tools, all intended to reduce the risk and increase the value of cloud transformation projects. The culmination of my personal efforts and those of the entire team here at NJVC has now been folded into our new brand for cloud computing operational excellence – Cloudcuity™.

read more

Cloud Computing: AWS Direct Connect Available to Telx Clients

Telx on Wednesday announced the availability of Amazon Web Services Direct Connect offerings for Telx customers in Northern California, as well as extended availability through Telx Cloud Connection Centers in adjacent markets.
“Telx continues to expand the portfolio of options available to its clients to build next-generation distributed infrastructures. Rather than just a single solution, clients can leverage a range of public and private network transport and interconnection options to deploy optimal hybrid architectures, across a range of multilayer network services, utilizing a mix of colocation, private, hosted private, and public cloud services,” said Joe Weinman, senior vice president of cloud services and strategy for Telx. “We are excited to offer our clients direct access to Amazon Web Services, a leading provider and innovator in the cloud.”

read more

Cloud Expo Silicon Valley: Performance Management in Big Data Applications

NoSQL and Hadoop environments pose a new challenge for performance engineers. NoSQL solutions push more logic into the application and depend on accurate access patterns to perform well. Ensuring MapReduce performance and scalability is a Big Data problem in itself.
In his session at the 11th International Cloud Expo, Michael Kopp, a technology strategist in the Compuware APM center of excellence, will show how to tackle both challenges with the help of modern Application Performance Management.
Michael Kopp is a technology strategist in the Compuware APM center of excellence and has more than 10 years of experience as an architect and developer in the Java/JEE space. Additionally, Kopp specializes in architecture and performance of large-scale production deployments with a special focus on virtualization and cloud.
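
For readers new to the pattern at issue, the following is a minimal, single-process Python sketch of MapReduce as a word count. Real Hadoop jobs run the same map, shuffle, and reduce phases across a cluster, which is where skewed keys and poor access patterns turn into the performance problems the session addresses.

    # Minimal single-process MapReduce sketch (word count).
    from collections import defaultdict

    def map_phase(document):
        # Emit one (word, 1) pair per token.
        for word in document.split():
            yield word.lower(), 1

    def shuffle(pairs):
        # Group values by key, as the framework does between phases.
        grouped = defaultdict(list)
        for key, value in pairs:
            grouped[key].append(value)
        return grouped

    def reduce_phase(grouped):
        # Aggregate each key's values into a final count.
        return {key: sum(values) for key, values in grouped.items()}

    docs = ["big data is big", "data scales out"]
    pairs = (p for doc in docs for p in map_phase(doc))
    print(reduce_phase(shuffle(pairs)))
    # {'big': 2, 'data': 2, 'is': 1, 'scales': 1, 'out': 1}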

read more

The Next Big Thing: WeeData

‘Big Data’ has a problem, and that problem is its name.
Dig deep into the big data ecosystem, or spend any time at all talking with its practitioners, and you should quickly start hitting the Vs. Initially Volume, Velocity and Variety, the Vs rapidly bred like rabbits. Now we have a plethora of new V-words, including Value, Veracity, and more. Every new presentation on big data, it seems, feels obligated to add a V to the pile.
But by latching onto the ‘big’ part of the name, and reinforcing that with the ‘volume’ V, we become distracted and run the risk of rapidly missing the point. The implication from a whole industry is that size matters. Bigger is better. If you don’t collect everything, you’re woefully out of touch. And if you’re not counting in petas, exas, zettas or yottas, how on earth do you live with the shame?

read more

Load Balancing 101: Scale versus Fail

One of the phrases you hear associated with cloud computing is “architecting for failure.” Rather than building in a lot of hardware-level redundancy – power, disk, network, and so on – the idea is that you expect failure and simply replace the application (which is what you care about anyway, right?) with a clone running on the same cheap hardware somewhere else in the data center.
Awesome idea, right?
But when it comes down to it, cloud computing environments are architected for scale, not fail.
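
As a concrete illustration of the “fail” side, here is a minimal Python sketch of that replace-don’t-repair pattern: a pool that health-checks its instances, evicts the dead ones, and boots clones in their place. The Instance class and launch_clone() are hypothetical stand-ins for a cloud provider’s API.

    # Minimal sketch of "architecting for failure": evict unhealthy
    # instances and replace them with clones instead of repairing
    # hardware. Instance and launch_clone() are hypothetical.
    import random

    class Instance:
        def __init__(self, name):
            self.name = name

        def healthy(self):
            # Stand-in check; a real one would probe the application.
            return random.random() > 0.2

    def launch_clone(generation):
        # Stand-in for an API call that boots a fresh copy of the
        # application image on whatever cheap hardware is free.
        return Instance(f"app-clone-{generation}")

    pool = [Instance("app-1"), Instance("app-2")]
    generation = 0

    for tick in range(3):                  # a few health-check cycles
        survivors = [i for i in pool if i.healthy()]
        while len(survivors) < len(pool):  # replace, don't repair
            generation += 1
            survivors.append(launch_clone(generation))
        pool = survivors
        print(f"cycle {tick}: {[i.name for i in pool]}")

Scaling, by contrast, would grow the pool under load; the article’s point is that most cloud environments handle that far more readily than they handle failure.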

read more

“Machine data” platform Splunk aims to climb the value chain

By Tony Baer, Principal Analyst, Enterprise Solutions, Ovum

Splunk, which specializes in delivering a data platform for “machine data,” is approaching a turning point. The explosion of sensor data – part of the Big Data phenomenon – is pulling the company in different directions. With its base as the data platform for IT systems management and security programs, Splunk could expand to other forms of machine data, such as smart public infrastructure.

Or, as implied by the recruitment of key product executives from SAP and Oracle, it could venture higher up the value chain, developing more business-focused solutions around this competency. Either way, Splunk must choose its targets carefully. As a $150–$200m company, it can’t be all things. Splunk is already promoting itself as an operational intelligence platform that provides quick visibility of trends from low-level data. However, Ovum believes that the company could get more mileage in the market …

Big Data Swamping My Timeline

Tibco Software is holding its annual user conference this week in Las Vegas, and Big Data is among the big topics of discussion. The company claims it was doing big data before there was Big Data, which reminds me of all the companies who’ve said they’ve been doing cloud computing for, you know, ever. (Disclosure: I worked at Tibco from 2006-09, a time during which it acquired Spotfire, which did bring a layer of analytics and big data-ish capability to the company.)

In any case, Big Data has been dominating my Twitter timeline the last few weeks, with a sharp spike the likes of which I’ve not seen before.

The Big Data landscape looks like utter chaos to me right now. I don’t know how to define Big Data, and not even NIST has defined it, the way it has helpfully if loosely defined cloud computing.

NIST did hold a workshop on the topic back in June. The first thing to jump out at me from that event was the old-line Big Data people – aka scientists – gamely outlining the classic Big Data applications – nuclear-event simulation, epidemiology, meteorology, and other massively computational uses – while noting that NIST had its first seminar on this topic in 1997.

But today, Big Data has been hijacked by real-time data-collection activities, and all the nefarious uses to which it can be put by marketers. There’s still big science involved in developing Big Data frameworks and apps, but the application focus is moving from controlling measles and predicting hurricanes to capturing eyeballs and selling coupons.

In addressing issues such as sensor density, sampling rates, and real-time graphic simulation, I’m reminded of the late Argentine author Jorge Luis Borges’s perfect map of the world – one which would be the same size as the world and feature every detail precisely.

In other words, by creating perfect pictures of the world (or perfect pictures of a customer’s buying activity), are we missing something essential in the human mind’s capacity for abstraction? Would Big Data have helped design the original Ford Mustang and identify its buyers? Did it drive the vision behind the iPod, iPhone, and iPad?

So yes, I’m uncomfortable with the all-out onslaught on the term Big Data. I fear its best uses may be obscured and perhaps lost within a miasma of marketing-driven bloviation. The same thing almost happened to cloud computing, and with today’s conflation of Big Data and cloud computing, it could still happen.

I’ll be at Cloud Expo/Big Data Expo in November, wearing my hairshirt and admonishing anyone who’s abusing the terms. I may not have such a problem there, though, as I expect the discussions to be highly technical and useful. Maybe I just need to stop taking my Twitter timeline so seriously.

read more
