NoSQL and Hadoop environments pose a new challenge for performance engineers. NoSQL solutions push more logic into the application and rely on well-designed access patterns to perform. Ensuring MapReduce performance and scalability is a Big Data problem in itself.
In his session at the 11th International Cloud Expo, Michael Kopp, a technology strategist at Compuware APM, will show how to tackle both challenges with the help of modern Application Performance Management.
Michael Kopp is a technology strategist in the Compuware APM center of excellence and has more than 10 years of experience as an architect and developer in the Java/JEE space. Additionally, Kopp specializes in architecture and performance of large-scale production deployments with a special focus on virtualization and cloud.
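To make the access-pattern point concrete, here is a minimal, hypothetical sketch (not taken from Kopp's session): a Java TreeMap stands in for a sorted key-value store, and a composite row key of the form userId#timestamp keeps one user's events contiguous, so a per-user read becomes a cheap prefix scan instead of a full scan. The key layout, class, and method names are illustrative assumptions only.

import java.util.NavigableMap;
import java.util.TreeMap;

// Hypothetical illustration: in sorted key-value stores, read performance
// depends on designing keys around the dominant access pattern.
public class AccessPatternSketch {
    // TreeMap stands in for a lexicographically sorted store.
    private final TreeMap<String, String> store = new TreeMap<>();

    // Composite key: userId + '#' + zero-padded timestamp keeps one user's
    // events contiguous, so a per-user read never scans other users' data.
    void writeEvent(String userId, long timestamp, String payload) {
        store.put(userId + "#" + String.format("%019d", timestamp), payload);
    }

    // Range scan over a single user's events: a prefix scan, not a full scan.
    NavigableMap<String, String> eventsForUser(String userId) {
        return store.subMap(userId + "#", true, userId + "#\uffff", false);
    }

    public static void main(String[] args) {
        AccessPatternSketch s = new AccessPatternSketch();
        s.writeEvent("alice", 1000L, "login");
        s.writeEvent("bob",   1001L, "login");
        s.writeEvent("alice", 1002L, "purchase");
        System.out.println(s.eventsForUser("alice").values()); // [login, purchase]
    }
}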
The Next Big Thing: WeeData
‘Big Data’ has a problem, and that problem is its name.
Dig deep into the big data ecosystem, or spend any time at all talking with its practitioners, and you should quickly start hitting the Vs. Initially Volume, Velocity and Variety, the Vs rapidly bred like rabbits. Now we have a plethora of new V-words, including Value, Veracity, and more. Every new presentation on big data, it seems, feels obligated to add a V to the pile.
But by latching onto the ‘big’ part of the name, and reinforcing that with the ‘volume’ V, we become distracted and risk missing the point entirely. The implication from a whole industry is that size matters. Bigger is better. If you don’t collect everything, you’re woefully out of touch. And if you’re not counting in petas, exas, zettas or yottas, how on earth do you live with the shame?
Load Balancing 101: Scale versus Fail
One of the phrases you hear associated with cloud computing is “architecting for failure.” Rather than build in a lot of hardware-level redundancy – power, disk, network, etc… – the idea is that you expect it to fail and can simply replace the application (which is what you care about anyway, right?) with a clone running on the same cheap hardware somewhere else in the data center.
Awesome idea, right?
But when it comes down to it, cloud computing environments are architected for scale, not fail.
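As a purely illustrative sketch of that distinction, assuming nothing about any particular cloud provider: the hypothetical round-robin balancer below exists to spread load across identical clones (scale), and surviving a dead clone by skipping it is almost a side effect (fail). Class and method names are assumptions for illustration, not a production design.

import java.util.List;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical round-robin balancer: built to spread load across identical
// clones (scale); tolerating a dead clone is a side effect, not the goal.
public class RoundRobinBalancer {
    private final List<String> backends;
    private final Set<String> unhealthy = ConcurrentHashMap.newKeySet();
    private final AtomicInteger cursor = new AtomicInteger();

    public RoundRobinBalancer(List<String> backends) {
        this.backends = backends;
    }

    // Pick the next healthy backend; if all are down, fail loudly.
    public String next() {
        for (int i = 0; i < backends.size(); i++) {
            String candidate = backends.get(
                Math.floorMod(cursor.getAndIncrement(), backends.size()));
            if (!unhealthy.contains(candidate)) {
                return candidate;
            }
        }
        throw new IllegalStateException("no healthy backends");
    }

    // A failed health check removes the clone; "architecting for failure"
    // assumes something else re-provisions a replacement elsewhere.
    public void markUnhealthy(String backend) {
        unhealthy.add(backend);
    }

    public static void main(String[] args) {
        RoundRobinBalancer lb =
            new RoundRobinBalancer(List.of("10.0.0.1", "10.0.0.2", "10.0.0.3"));
        lb.markUnhealthy("10.0.0.2");
        System.out.println(lb.next()); // 10.0.0.1
        System.out.println(lb.next()); // 10.0.0.3 (skips the unhealthy clone)
    }
}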
“Machine data” platform Splunk aims to climb the value chain
By Tony Baer, Principal Analyst, Enterprise Solutions, Ovum
Splunk, which specializes in delivering a data platform for “machine data,” is approaching a turning point. The explosion of sensory data – part of the Big Data phenomenon – is pulling the company in different directions. With a base as the data platform for IT systems management and security programs, Splunk could expand to other forms of machine data such as smart public infrastructure.
Or, as implied by the recruitment of key product executives from SAP and Oracle, it could venture higher up the value chain, developing more business-focused solutions around this competency. Either way, Splunk must choose its targets carefully. As a $150–$200m company, it can’t be all things. Splunk is already promoting itself as an operational intelligence platform that provides quick visibility of trends from low-level data. However, Ovum believes that the company could get more mileage in the market …
Big Data Swamping My Timeline
Tibco Software is holding its annual user conference this week in Las Vegas, and Big Data is among the big topics of discussion. The company claims it was doing big data before there was Big Data, which reminds me of all the companies who’ve said they’ve been doing cloud computing for, you know, ever. (Disclosure: I worked at Tibco from 2006-09, a time during which it acquired Spotfire, which did bring a layer of analytics and big data-ish capability to the company.)
In any case, Big Data has been dominating my twitter timelines the last few weeks, with a sharp spike the likes of which I’ve not seen before.
The Big Data landscape looks like utter chaos to me right now. I don’t know how to define Big Data, and not even NIST has defined it, the way it has helpfully if loosely defined cloud computing.
NIST did hold a workshop on the topic back in June. The first thing to jump out at me from that event was the old-line Big Data people – aka scientists – gamely outlining the classic Big Data applications – nuclear-event simulation, epidemiology, meteorology, and other massively computational uses – while noting that NIST had its first seminar on this topic in 1997.
But today, Big Data has been hijacked by real-time data-collection activities, and all the nefarious uses to which it can be put by marketers. There’s still big science involved in developing Big Data frameworks and apps, but the application focus is moving from controlling measles and predicting hurricanes to capturing eyeballs and selling coupons.
In addressing issues such as sensor density, sampling rates, and real-time graphic simulation, I’m reminded of the late Argentine author Jorge Luis Borges’ perfect map of the world – one which would be the same size as the world and feature every detail precisely.
In other words, by creating perfect pictures of the world (or perfect pictures of a customer’s buying activity), are we missing something essential in the human mind’s ability for abstraction? Would Big Data have helped design the original Ford Mustang and identify its buyers? Did it drive the vision behind the iPod, iPhone, and iPad?
So yes, I’m uncomfortable with the all-out onslaught on the term Big Data. I fear its best uses may be obscured and perhaps lost within a miasma of marketing-driven bloviation. The same thing almost happened to cloud computing, and with today’s conflation of Big Data and cloud computing, it could still happen.
I’ll be at Cloud Expo/Big Data Expo in November, wearing my hairshirt and admonishing anyone who’s abusing the terms. I may not have such a problem there, though, as I expect discussions to be highly technical and useful. Maybe I just need to stop taking my twitter timeline so seriously.
Cloud Computing: Terracotta Offers Free Real-Time Big Data Access
Terracotta would love to stick its knee in the groin of the relational database and figures its new BigMemory Go, a free version of its BigMemory in-memory caching widgetry for Java apps, will put it in easy striking distance by expanding its installed base.
It’s counting on users not being content with the single-node 32GB production instance Go offers even if they can put it on as many servers as they want.
The company, now an independent subsidiary of Software AG, calculates that Go users will upgrade within a year to the full, more scalable BigMemory that goes for $500 a gigabyte. And the more mainstream BigMemory becomes for speeding up application performance, the less competitive disk-backed databases will be.
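For context, BigMemory builds on the Ehcache API, so application code looks much like the minimal sketch below. The cache name, sizes, and data are made-up examples, and enabling BigMemory's off-heap tier is typically a cache-configuration change rather than a code change; treat the specifics as assumptions, not Terracotta's documented setup.

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;
import net.sf.ehcache.config.CacheConfiguration;

// Minimal Ehcache usage sketch; BigMemory adds an off-heap tier behind the
// same API (off-heap sizing itself lives in the cache configuration).
public class QuoteCacheSketch {
    public static void main(String[] args) {
        CacheManager manager = CacheManager.create();

        // Illustrative cache: up to 10,000 entries on the Java heap.
        // With BigMemory, additional gigabytes could sit off-heap, out of GC's way.
        Cache quotes = new Cache(new CacheConfiguration("quotes", 10_000));
        manager.addCache(quotes);

        quotes.put(new Element("ORCL", 31.42));      // hypothetical data
        Element hit = quotes.get("ORCL");
        System.out.println(hit == null ? "miss" : hit.getObjectValue());

        manager.shutdown();
    }
}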
AppNeta Launches Unified Application, Network Performance Management
AppNeta today announced the launch of its unified solutions for End User Experience Monitoring, including full-stack application performance tracing and deep network performance insight. With the recent acquisition of Tracelytics, AppNeta now provides performance visibility IT teams have never had before, across the complete network and application infrastructure.
As business organizations become completely dependent on Internet and web-based technologies such as unified communications, CRM systems, ecommerce, hosted email, and other custom applications, it is necessary to know how they are running and where problems are occurring. With AppNeta’s integrated performance management solutions, IT teams and developers can easily see into the datacenter applications, down to the specific layer and code, and out to the remote sites where end users are experiencing performance degradation or service failure. With this comprehensive level of visibility and analysis, IT teams can quickly identify whether issues are occurring in the network or the applications, saving hours of chasing down problems and finger-pointing.
“End users have little tolerance for application performance problems,” said Jim Melvin, CEO of AppNeta. “As today’s applications have become more sophisticated and performance-sensitive, the risk of service disruption is at an all-time high. AppNeta solutions provide a level of breadth and depth of performance insight that the industry has never seen before. Our customers solve technology problems before they become business problems.”
“The Application Performance Management market is moving to a SaaS delivery model that allows developers and IT teams to assure performance of distributed applications across datacenters and private/public clouds,” said Bernd Harzog, CEO, APM Experts. “This is leading to innovative, disruptive performance management solutions that are truly transforming the way IT teams assure application performance and end user experience.”
AppNeta’s new TraceView application performance monitoring service delivers immediate time-to-value by providing full-stack application tracing capabilities through application tiers and stacks to get real-time insight into actual web application performance. TraceView offers deep, detailed analysis of performance problems and provides actionable data for quick resolution and optimization.
AppNeta’s award-winning PathView Cloud service provides integrated insight from every element of the network performance stack including active path performance analysis, application-aware traffic flow analysis, automated remote site packet capture, and on-demand device status (SNMP). This complete suite delivered from the cloud brings thousands of global customers the fastest time to resolution in the industry and superior End User Experience monitoring.
Free trials of PathView Cloud network performance management service and the TraceView application performance monitoring service are both available today at www.appneta.com.
AppFirst Launches New Partner Program
AppFirst, the SaaS application management system, today unveiled a comprehensive new Partner Program designed for both Cloud and Solution Providers. The program offers a new, expanded free subscription available only via AppFirst partners and provides ongoing training and product support, enabling partners to deliver innovative solutions to their own customers. The rollout of this Partner Program follows the company’s launch earlier this month of its new DevOps Dashboard, an application performance monitoring solution that delivers a clear, unified status view of infrastructure, applications and business metrics to all stakeholders in an organization.
“Our focus today is on growing our global ecosystem to increase the number of AppFirst experts as well as the overall availability of our product,” said David Roth, AppFirst CEO. “By offering access to our solutions through the primary source customers use to deploy their applications in the cloud, our Partner Program complements our partners’ solutions with AppFirst technology, delivering added value to our respective customers. This expanded universe is designed to provide our partners with the solutions and support they need to succeed and grow within their own markets.”
Under the Cloud Provider Partner Program, qualified cloud service providers, such as Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) companies, can offer the enhanced free version of AppFirst to their customers via their own marketplaces or dashboards, as an extension or add-on to their own services. AppFirst partners deliver value to their end customers by offering a solution that provides the visibility needed across the infrastructure, application and business as an application is running in production, with the added benefit of AppFirst’s extended free program.
“In today’s online world, keeping applications and websites up and running is business critical,” said Mal Knox, Director of Business Development at Engine Yard. “With Engine Yard Cloud, we provide the tools and resources to ensure the optimum performance of our customers’ applications. Partnering with companies like AppFirst enables us to deliver enhanced performance monitoring services, giving customers the visibility they need.”
Solution Provider Partners, which include systems integrators, application development shops and cloud consultancies, will have access to AppFirst products and training, while also receiving implementation tools and assistance, technical resources and implementation support from the company. As these partners become competent with AppFirst’s solutions, they can integrate them in their customer deliverables and even customize and optimize the solution. Solution Providers are also able to offer the free version to their clients, allowing direct visibility into the applications their customers want to actively monitor themselves. Building a global ecosystem of trained Solution Provider Partners allows AppFirst to scale to deliver implementation and industry-specific configuration services at the local level through their partners.
Because AppFirst’s DevOps Dashboard is also easily customizable, any partner company can offer broad custom solutions for internal use, and for their own customers, whether executives, IT ops or DevOps.
“Our customers, with our assistance, are making strategic decisions about if, when and how they migrate to cloud,” said Eileen Boerger, president of CorSource Technology Group. “Partnering with AppFirst provides us another tool in our quiver to assist with those decisions. With the DevOps Dashboard being so customizable, that allows us to provide individualized solutions for our customers, something they demand.”
AppFirst collects millions of infrastructure, application and business metrics that are aggregated and correlated in a single big data repository that eliminates the need for users to look for data in multiple places. Data is collected continuously to provide customers with unprecedented visibility into their entire infrastructure and every application running in production, bringing overall system management to a whole new level.
The DevOps Dashboard’s ability to auto-detect application stacks and configure data collection from the relevant sources delivers a customized dashboard specific to the user’s environment — all automatically, with no extra effort required. With the smart threshold feature, AppFirst learns over time what is “normal” for a user’s business metrics and delivers alerts when metrics shift one standard deviation from normal levels, saving time and money.
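As a rough illustration of that kind of thresholding (a generic sketch, not AppFirst's actual algorithm): the class below keeps a running mean and standard deviation with Welford's online method and flags a metric value that drifts more than one standard deviation from the learned baseline. Only the one-sigma rule comes from the description above; everything else is assumed for illustration.

// Generic sketch of a "learned baseline" alert: maintain a running mean and
// standard deviation (Welford's online algorithm) and flag values that drift
// more than one standard deviation from normal. Not AppFirst's implementation.
public class SmartThresholdSketch {
    private long count;
    private double mean;
    private double m2; // sum of squared deviations from the mean

    // Fold a new observation into the baseline.
    public void observe(double value) {
        count++;
        double delta = value - mean;
        mean += delta / count;
        m2 += delta * (value - mean);
    }

    private double stdDev() {
        return count > 1 ? Math.sqrt(m2 / (count - 1)) : 0.0;
    }

    // Alert when the value is more than one standard deviation off the baseline.
    public boolean isAnomalous(double value) {
        return count > 1 && Math.abs(value - mean) > stdDev();
    }

    public static void main(String[] args) {
        SmartThresholdSketch ordersPerMinute = new SmartThresholdSketch();
        for (double v : new double[] {100, 102, 98, 101, 99, 100}) {
            ordersPerMinute.observe(v);
        }
        System.out.println(ordersPerMinute.isAnomalous(100)); // false
        System.out.println(ordersPerMinute.isAnomalous(130)); // true
    }
}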
For more information on AppFirst’s Partner Programs, please visit the AppFirst partner section or call 1.800.782.2181.
Cloud Computing: Nasdaq Partners with AWS
Nasdaq OMX has gone to Amazon for a new service called FinQloud, where US financial services clients can store the data needed to meet increasingly granular SEC regs or analyze trade data.
FinQloud will host a patent-pending regulatory data retention product called Regulatory Records Retention, or R3, and a fast, on-demand analysis tool called Self-Service Reporting, or SSR. R3 should be out in the coming months.
Brokers will be able to store order and transaction data and maintain records using FinQloud. That means multi-source data, not just Nasdaq-related trading info.
Rand Secure Archive Implements Hosted Archiving Solution for Ampersand Capital Partners
Rand Worldwide announced that its Rand Secure Archive division has implemented its proprietary hosted archiving and retrieval solution for Ampersand Capital Partners.
“As a private equity firm, Ampersand has to deal with a constantly evolving regulatory environment, including reporting requirements by the SEC, the impact of Sarbanes-Oxley and the Dodd-Frank Act,” says Rick Charpie, Ampersand’s founder. “To address those requirements we followed the advice that we always give to the CEOs we mentor – to select the best partners possible. That’s why we chose Rand Secure Archive, a partner who shares our commitment to success by delivering world-class technology and service.”
“In the financial services industry, it’s critical that electronic information and communication is both archived and easily retrievable – regardless of the file type or medium,” says Chris Grossman, vice president, Enterprise Applications at Rand Worldwide. “Our hosted solution not only meets the legal requirements, but also offers robust functionality with simplicity and security.”
Key functional requirements for financial services organizations safeguarding their data include:
- Preserve every electronic business record and store it securely.
- Find and produce electronic communications quickly, and be prepared for SEC and other regulatory examinations or requests with on-demand data production for investment advisor compliance.
- Enable the creation and enforcement of retention policies while maintaining a full audit trail to ensure compliance with those policies.
- Offload the cost and operational burden of managing complex and disparate email and messaging platforms.
Compared to competitors, Rand Secure Archive provides faster, more advanced search capability and integrated eDiscovery tools, as well as more convenient user access to archived data.