Category Archive: Big Data

New AWS Pipeline Tool Aims to Make Effective Use of Your Business Data

AWS Pipeline Diagram

Amazon’s new AWS Data Pipeline product “will help you move, sort, filter, reformat, analyze, and report on data in order to make use of it in a scalable fashion.” You can now automate the movement and processing of any amount of data using data-driven workflows and built-in dependency checking.

A Pipeline is composed of a set of data sources, preconditions, destinations, processing steps, and an operational schedule, all defined in a Pipeline Definition.

The definition specifies where the data comes from, what to do with it, and where to store it. You can create a Pipeline Definition in the AWS Management Console or externally, in text form.
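As an illustration of the “text form” option, here is a minimal sketch of what a Pipeline Definition could contain: a schedule, a data source, a processing step, and a destination. The object types and field names below are simplified for illustration and are not the exact AWS schema.

```python
import json

# Illustrative sketch of a text-form Pipeline Definition. Field names
# are simplified; consult the AWS Data Pipeline docs for the real schema.
pipeline_definition = {
    "objects": [
        {
            "id": "DailySchedule",       # operational schedule
            "type": "Schedule",
            "period": "1 day",
        },
        {
            "id": "InputLogs",           # data source: raw logs in S3
            "type": "S3DataNode",
            "directoryPath": "s3://example-bucket/raw-logs/",
            "schedule": {"ref": "DailySchedule"},
        },
        {
            "id": "FilterAndReformat",   # processing step
            "type": "ShellCommandActivity",
            "command": "grep -v DEBUG | sort",
            "input": {"ref": "InputLogs"},
            "output": {"ref": "CleanLogs"},
        },
        {
            "id": "CleanLogs",           # destination
            "type": "S3DataNode",
            "directoryPath": "s3://example-bucket/clean-logs/",
        },
    ]
}

definition_text = json.dumps(pipeline_definition, indent=2)
```

The references between objects (`{"ref": ...}`) are what give the workflow its data-driven dependency structure: the activity cannot run until its input data node is ready.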

Read more.


Garantia Brings Redis Cloud to Heroku, AppFog, AppHarbor

Garantia Data, a provider of in-memory NoSQL cloud services, today announced the availability of its Redis Cloud database hosting service on the Heroku, AppFog and AppHarbor platforms over AWS. Garantia Data’s new Redis Cloud add-ons will provide the hundreds of thousands of developers who run their applications on these platforms with an infinitely scalable, highly available, high-performance and zero-management Redis solution in just one click.

Used by both enterprise developers and cutting-edge start-ups, Redis is an open source, RAM-based, key-value memory store that provides significant value in a wide range of important use cases. Garantia Data’s Redis Cloud is a fully-automated service for running Redis on the cloud – completely freeing developers from dealing with nodes, clusters, scaling, data-persistence configuration and failure recovery.

“Redis Cloud has been running in a private beta on Amazon EC2 since January and in a free, public beta since June, and we survived several node failures and three AWS outages without losing any customer data,” said Ofer Bengal, CEO of Garantia Data. “After successfully navigating these events, we are now 100 percent confident that our service is fully reliable and ready for PaaS environments. Heroku, AppFog and AppHarbor developers will now be able to enjoy the powerful benefits that our solution can bring to their critical databases, while gaining more time to focus on building the best possible applications.”

Redis Cloud is the only solution that scales seamlessly and infinitely, so a Redis dataset can grow to any size while supporting all Redis commands. It provides true high-availability, including instant failover with no human intervention. In addition, it runs a dataset on multiple CPUs and uses advanced techniques to maximize performance for any dataset size. Redis Cloud add-ons let developers create multiple databases in a single plan, each running in a dedicated process and in a non-blocking manner.
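In practice, “one click” provisioning on these platforms typically surfaces the Redis endpoint to the application as a connection URL in an environment variable. A minimal sketch of reading one, assuming a hypothetical endpoint and the variable name `REDISCLOUD_URL` (both illustrative):

```python
import os
from urllib.parse import urlparse

# The add-on exposes the provisioned endpoint through the app's
# environment; the variable name and URL here are illustrative.
redis_url = os.environ.get(
    "REDISCLOUD_URL", "redis://user:s3cret@pub-redis-12345.example.com:12345"
)

parsed = urlparse(redis_url)
host, port, password = parsed.hostname, parsed.port, parsed.password

# With the redis-py client, the application would then connect with
# something like:
#   r = redis.Redis(host=host, port=port, password=password)
#   r.set("greeting", "hello")
```

Because the credentials live in the environment rather than in code, the same application can move between plans or platforms without a redeploy.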

“We’re very excited to welcome Garantia Data to the AppFog ecosystem,” said Lucas Carlson, CEO and founder of AppFog. “Redis Cloud is exactly the sort of production, workload-ready service that our customers have been demanding. As huge fans of Redis, we feel that Redis Cloud’s robust performance and complete feature set makes it one of the best NoSQL DB-as-a-Service options out there. We can’t wait to see what developers create with Redis Cloud and AppFog!”

“We’re excited to welcome Garantia Data’s Redis Cloud into the AppHarbor add-on catalog,” said Michael Friis, co-founder of AppHarbor. “Redis is becoming a critical component for many .NET developers and is used by prominent .NET-powered web properties like StackOverflow.”

“We’ve seen Redis become an integral part of modern web applications, in part because of its amazing performance and flexibility,” said Glenn Gillen, Engineering Manager for Heroku Add-ons. “We’re excited to include Redis Cloud in the Heroku Add-ons marketplace so our customers can take advantage of its highly available and scalable solution in the quickest and simplest way possible.”

Garantia Data is currently offering the Redis Cloud free of charge to early adopters of its Heroku, AppFog and AppHarbor add-ons. The company will demonstrate the Redis Cloud and its new PaaS add-ons at Booth #332 during AWS re:Invent, November 27-29 in Las Vegas.


FairCom’s Newest c-treeACE Bridges SQL, NoSQL Worlds

FairCom today announced the tenth major edition of its cross-platform database technology, c-treeACE® V10, which introduces the industry’s first Relational Multi-Record Type support for seamless integration between relational and non-relational database worlds.

c-treeACE V10 also delivers features such as new Java interfaces, performance and scalability enhancements, additional platform support, and new replication models. With this latest version come significant performance gains including 30 percent faster transaction throughput, 60 percent faster SQL performance, 200 percent better replication throughput, and 26 percent faster read performance.

“The database market is growing substantially, yet there are many problems plaguing developers today: large data volumes; requirements to reduce data access time; data access requirements from a myriad of new locations, like mobile devices and the cloud; trickier integration; and decreasing budgets,” said Randal Hoff, FairCom’s VP of Engineering. “Engineers tell us they really need technology that enables them to work seamlessly within both the relational and non-relational worlds. In the past, they’ve felt forced to choose one or the other, when, in fact, they realize concrete benefits from both. Our newest c-treeACE gives them the flexibility to enjoy the best of both worlds: high performance data throughput levels that a NoSQL database can provide; and concurrent relational access for ease of data sharing with other parts of the enterprise, including cloud and mobile devices, all at a reasonable price point.”

For more than 30 years, FairCom has provided a unique model to enterprise database developers and ISVs not available from off-the-shelf databases. Its c-treeACE offers the highest levels of tailored configuration and control while simultaneously supporting a variety of non-relational APIs (e.g., ISAM, .NET, and JTDB) along with industry-standard relational APIs (e.g., SQL, JDBC, ODBC, PHP, Python, etc.) within the same application, over the same data. Enterprises such as Federal Express, Microsoft, NASA and Visa have used FairCom technology in mission-critical solutions.

Photos/Multimedia Gallery Available: http://www.businesswire.com/cgi-bin/mmg.cgi?eid=50473561&lang=en


Actuate, VMware to Deliver Faster Insights from Big Data in the Cloud

Actuate Corporation is partnering with VMware to provide a cost-effective way for companies to gain insight from any source of Big Data in hybrid cloud environments, by deploying seamless, dynamic and accessible business information solutions.

ActuateOne is designed to support private, public and SaaS (software as a service) cloud deployments by eliminating the need for hard-coded configuration, thereby enabling fast and easy deployment with VMware solutions. VMware solutions provide multiple benefits to IT administrators and users. VMware virtualization creates a layer of abstraction between the resources required by an application and operating system, and the underlying hardware that provides those resources.

Customers can extend the reliability and agility offered by the new VMware vFabric™ Data Director™ to their virtualized ActuateOne cluster, by provisioning and managing the data sources visualized in an Actuate dashboard as well as the vFabric Postgres database that manages the Encyclopedia Volumes of that ActuateOne deployment. This multi-layered coupling of ActuateOne upon VMware simplifies IT management tasks, while ensuring the highest levels of mission-critical availability, security and scalability for any Actuate / VMware deployment.

“We are pleased to be working with Actuate to provide our customers with a solution for analyzing and visualizing large data sets that reside in hybrid cloud environments,” said Fausto Ibarra, Senior Director, Product Management, VMware. “With VMware vFabric™ Data Director, VMware vFabric Postgres and ActuateOne, organizations can securely and efficiently extract actionable insights from their data to make informative business decisions.”

“Actuate’s technology scales linearly across any number of virtual instances, interfacing well with VMware solutions, and further delivering on the Actuate promise of better insights for better decision making by more people in the organization,” said Wenfeng Li, Vice President of Product Development at Actuate. “ActuateOne sources multiple instances of Big Data, merging it with traditional relational data via our data integration features, and then presents this blended content as interactive, customizable visualizations to large numbers of simultaneous users.”

ActuateOne – Actuate’s BI platform and suite of interactive applications built on open source BIRT – has already earned VMware Ready™ status. The VMware Ready designation signifies that ActuateOne has gone through VMware’s advanced testing and evaluation process to certify product compatibility. As part of the partnership, Actuate and VMware demonstrated linear scalability of ActuateOne in conjunction with VMware vSphere®, vFabric™ Data Director™, and VMware vFabric Postgres.

The combined solution delivers three unique capabilities:

  • An efficient and secure approach to using shared infrastructure for
    servicing highly frequent requests
  • A standardized, portable, and extensible approach to enable workloads
    to be deployed across multiple clouds without manual configuration
  • Agile access to shared infrastructure for provisioning workloads on
    demand



iQor Acquires HardMetrics

iQor, a provider of intelligent customer interaction and outsourcing solutions, today announced that it has acquired HardMetrics, a provider of cloud-based visual business intelligence, business analytics and reporting solutions. HardMetrics enables organizations to integrate disparate data from any part of the enterprise, then structures and translates that data into actionable insight, including user-friendly info-graphics, scorecards, charts, and dashboards.

With its customer and transactional databases, iQor’s industry-leading Big Data analytics engine QuantuMatch® helps clients uncover insights into their customer base. This agreement extends iQor’s analytics and self-service reporting capabilities by enabling on-demand, interactive drill-down reporting for more targeted customer interaction campaigns and faster recognition of, and response to, emerging trends.

“We are excited about the acquisition for our customers,” said Brian Turley, outgoing CEO of HardMetrics, Inc. “iQor brings resources, technology prowess, and market muscle that we simply could not match on our own.”

HardMetrics will continue to serve its existing customer base and operate as a standalone company. In addition, iQor will incorporate and market the HardMetrics Performance Manager product in a new offering called QeyMetricsSM for its broad base of customers.

“With iQor, HardMetrics clients now have the weight and ambition of an acknowledged technology and analytics innovator driving the product and the business,” said Bryce Engelbrecht, iQor Vice President and General Manager of HardMetrics. “We look forward to expanding HardMetrics’ proven, innovative platform for delivering operational metrics to business end users.”

“iQor has led the way in developing the digital network, tools and processes to capture, analyze, and act upon the Big Data generated with every customer interaction,” said Norm Merritt, President and CEO, iQor. “iQor’s analytics engine, coupled with HardMetrics’ leading data visualization tools, provides an unparalleled analytics toolset for both existing and new clients to help them generate value adding insight.”


Quest Software Announces Hadoop-Centric Software Analytics


Quest Software, Inc. (now part of Dell) announced three significant product releases today aimed at helping customers more quickly adopt Hadoop and exploit their Big Data:

  • Kitenga Analytics – Based on the recent acquisition of Kitenga,
    Quest Software now enables customers to analyze structured,
    semi-structured and unstructured data stored in Hadoop. Available
    immediately, Kitenga Analytics delivers sophisticated capabilities,
    including text search, machine learning, and advanced visualizations,
    all from an easy-to-use interface that does not require understanding
    of complex programming or the Hadoop stack itself. With Kitenga
    Analytics and the Quest Toad®
    Business Intelligence Suite, an organization has a complete
    self-service analysis environment that empowers business and systems
    analysts across a variety of backgrounds and job roles.
  • Toad for Hadoop – Quest Software expands support for Hadoop in
    the upcoming release of Toad® for Hadoop. With more than two million
    users, and ranked No. 1 in Database Development and Optimization for
    three consecutive years by IDC [1], Toad has been enhanced to help
    database developers and DBAs bridge the gap between what they already
    know about relational database management systems and the new world of
    Hadoop. Toad will provide query and data management functionality for
    Hadoop, as well as an interface to perform data transfers using the
    Quest Hadoop Connector. Like Toad for any other platform, Toad for
    Hadoop makes the lives of developers, DBAs, and analysts easier and
    more productive.
  • SharePlex with Hadoop Capabilities – Quest Software adds Hadoop
    capabilities to the next release of SharePlex® for Oracle,
    its robust, high-performance Oracle-to-Oracle database replication
    technology. For enterprise mission-critical systems that must always
    be available, the new release will seamlessly create multiple copies
    of Oracle data for movement simultaneously to both another Oracle
    environment and Hadoop, with no downtime. Customers can choose how
    they optimize Oracle and Hadoop environments based on data
    requirements, such as high availability; analytics and reporting;
    image and text processing; and general archiving. The architecture
    allows for scalable data distribution on-premise, in the cloud, and
    across multiple data centers without a single point of failure.
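The dual-target replication idea described above can be illustrated with a toy sketch: each captured change is applied to two independent sinks, an Oracle-style replica and a Hadoop-bound archive. This is an assumption-laden illustration of the concept only, not SharePlex’s actual capture/apply machinery.

```python
# Toy fan-out replication: one change stream, two targets, so the
# replica stays current while every raw change is also archived for
# analytics. All names here are illustrative.

oracle_replica = {}   # stand-in for the secondary Oracle environment
hadoop_batch = []     # stand-in for change records queued for HDFS ingest

def apply_change(change):
    """Apply a single captured change record to both targets."""
    key, value = change["key"], change["value"]
    oracle_replica[key] = value    # keep the relational replica current
    hadoop_batch.append(change)    # retain the raw change for analytics

for change in [
    {"key": "order:1", "value": "shipped"},
    {"key": "order:2", "value": "pending"},
]:
    apply_change(change)
```

In a production system each sink would be applied asynchronously from its own queue, so one slow target cannot stall the other.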


Benchmarking Redis on AWS: Is Amazon PIOPS Really Better than Standard EBS?

The Redis experts at Garantia Data did some benchmarking in the wake of Amazon’s announcement of Provisioned IOPS (PIOPS) for EBS.

Their conclusion:

After 32 intensive tests with Redis on AWS (each run in 3 iterations for a total of 96 test iterations), we found that neither the non-optimized EBS instances nor the optimized-EBS instances worked better with Amazon’s PIOPS EBS for Redis. According to our results, using the right standard EBS configuration can provide equal if not better performance than PIOPS EBS, and should actually save you money.

Read the full post for details and graphs.


Redis/Memcached: Even Modest Datasets Can Enjoy the Speediest Performance

A pretty technical blog post over at Garantia Data’s blog relates the results of a recent benchmark test of the effects of cloud infrastructure on Memcached and Redis datasets:

Redis and Memcached were designed from the ground-up to achieve the highest throughput and the lowest latency for applications, and they are in fact the fastest data store systems available today. They serve data from RAM, and execute all the simple operations (such as SET and GET) with O(1) complexity.

However, when run over cloud infrastructure such as AWS, Redis or Memcached may experience significant performance variations across different instances and platforms, which can dramatically affect the performance of your application.
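The kind of per-operation latency variation the post measures can be sketched with a simple timing harness. Here an in-memory dict stands in for the RAM-based store; a real benchmark would instead time network round trips to a Redis or Memcached endpoint.

```python
import time
import statistics

# Time 10,000 O(1) get operations and summarize the distribution.
# The dict is a stand-in for a RAM-based key-value store.
store = {f"key:{i}": i for i in range(10_000)}

latencies = []
for i in range(10_000):
    start = time.perf_counter()
    _ = store[f"key:{i}"]          # the O(1) operation under test
    latencies.append(time.perf_counter() - start)

latencies.sort()
mean = statistics.mean(latencies)
p99 = latencies[int(len(latencies) * 0.99)]   # 99th-percentile latency
```

Reporting a high percentile alongside the mean is what exposes the instance-to-instance variation the post describes: two platforms with similar averages can differ sharply at p99.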

Read the full post.


Study: Big Data, Cloud will Transform City Government

Around the world, city leaders face the challenge of delivering economic growth while meeting sustainability targets and rising expectations about the quality of municipal services, often in the face of drastic budget reductions. This is forcing many city leaders to improve efficiency and drive further innovation in the creation and delivery of services. According to a recent report from Pike Research, a part of Navigant’s Energy Practice, new platforms for communication, data sharing, and application development – particularly cloud computing and data analytics – will play a key role in this transformation.

Cumulative investment in smart government technology between 2011 and 2017 will be almost $4.8 billion, the report finds. Annual investment in smart government technologies in North America alone will surpass $1 billion in 2017, and annual investment in cloud services for smart cities will reach nearly $1.4 billion worldwide by 2017.

“Cloud-based computing, in particular, offers new options for cities that reduce capital expenditure, provide access to new skills, and reduce time-to-deployment of new solutions,” says research director Eric Woods. “Cloud-based systems also enable cities to take advantage of the huge amounts of operational data they collect to improve efficiency and develop new services.”

City leaders are also looking at investment in technology as a means of spurring economic growth. This includes a range of strategies: making the city a center of cleantech development and innovation (e.g., Denver, Copenhagen, and Amsterdam); creating new types of digital commerce and development (e.g., New York and Manchester); being at the leading edge of technology adoption (e.g., Barcelona and Friedrichshafen); becoming an exporter of technology (e.g., Seoul); or retaining or establishing a position as a regional trading hub (e.g., Singapore and Songdo). Each of these approaches, the study concludes, requires a vision of where the city is heading, an investment in infrastructure, and a commitment to innovation.

Pike Research’s report, “Smart Government Technologies”, analyzes the global market opportunity for smart government technologies. It assesses the business drivers, market forces, and technology trends that are transforming the use of information and communication technology and related technologies in smart cities and communities. The study forecasts the size and growth of the market for smart government technologies through 2017, and it also forecasts the growth in smart government data analytics and cloud-based services between 2011 and 2017. The report includes profiles of major smart government initiatives around the world and also examines the strategies of key players in the smart government market including government agencies, IT companies, telcos, and infrastructure providers. An Executive Summary of the report is available for free download on the firm’s website.