Category Archives: Database

KT, EnterpriseDB to offer OpenStack-based DBaaS

EnterpriseDB and KT are co-developing a Postgres-based DBaaS

Postgres database specialist EnterpriseDB has announced a partnership with KT DS, the technology services subsidiary of incumbent Korean telco KT Corporation, that will see the two jointly develop and deliver a database-as-a-service offering deployed on OpenStack.

The offering will feature EDB’s database technology, which is based on Postgres, an open source object-relational database management system, and will be delivered via KT’s uCloud service.

KT, a longtime user of EDB’s Postgres technology – the same database technology used by eBay and Facebook – said its engineers already contribute heavily to the open source project.

“Our long and successful partnership with EnterpriseDB with on-premises deployments in our own infrastructure made EDB’s Postgres Plus the obvious choice when we selected a cloud partner for our uCloud,” said Seunghye Sohn, senior vice president of KT DS.

Ed Boyajian, chief executive officer of EnterpriseDB said: “KT was a proving ground in Korea for Postgres Plus to power mission-critical, high-volume workloads. They are now leading enterprise and government users to the future with their uCloud and together we’re building on our years of partnership to play a role as cloud computing expands across Korea.”

KT is one of several local incumbents looking to bolster its cloud computing business and get a piece of the growing market, particularly the public sector segment. Earlier this year the Korean government passed the Act on Promotion of Cloud Computing and User Protection (colloquially known as the “Cloud Act”), designed to encourage public sector uptake of cloud services.

Real-time cloud monitoring too challenging for most providers, TFL tech lead says

Reed says TFL wants to encourage greater use of its data

Getting solid data on what’s happening in your application in real time seems to be a fairly big challenge for most cloud service providers, explains Simon Reed, head of bus systems & technology at Transport for London (TFL).

TFL, the executive agency responsible for transport planning and delivery for the city of London, manages a slew of technologies designed to support over 10 million passenger journeys each day. These include back-office ERP, routing and planning systems, mammoth databases feeding line-of-business applications and customer-facing apps (real-time travel planning apps and the journey planner website, for example), as well as all the vehicle telematics, monitoring and tracking technologies.

A few years ago TFL moved its customer-facing platforms – the journey planner, the TFL website, and the travel journey databases – over to a scalable cloud-based platform in a bid to ensure it could deal with massive spikes in demand. The key was to get much of that work completed before the Olympics, including a massive data syndication project so that app developers could more easily tap into all of TFL’s journey data.

“Around the Olympics you have this massive spike in traffic hitting our databases and our website, which required highly scalable front and back-ends,” Reed said. “Typically when we have industrial action or a snowstorm we end up with 10 to 20 times the normal use, often triggered in less than half an hour.”

Simon Reed is speaking at the Cloud World Forum in London, June 24-25.

The organisation processes bus arrival predictions for all 19,000 bus stops in London, which are constantly pushed into the cloud in a “leaky tap” model, and there’s a simple cloud application that allows subscribers to download the data in a number of formats, plus APIs to build access to that data directly into applications. “As long as developers aren’t asking for predictions nanoseconds apart, the service doesn’t really break down – so it’s about designing that out and setting strict parameters on how the data can be accessed and at what frequency.”
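
As a rough illustration of what those parameters mean for a consumer of the feed, here is a hypothetical sketch of a polling client that enforces a minimum interval between requests; the endpoint URL, response fields and stop ID are invented placeholders rather than TFL’s actual API.

```python
# Illustrative only: a polling client that respects a minimum interval between
# requests, in the spirit of the access-frequency limits Reed describes.
# The endpoint URL, response fields and stop ID are hypothetical placeholders.
import time
import requests

FEED_URL = "https://example.org/bus-arrivals/{stop_id}"  # hypothetical endpoint
MIN_POLL_INTERVAL = 30  # seconds; assume the provider disallows tighter polling

def poll_arrivals(stop_id: str, cycles: int = 3) -> None:
    last_request = 0.0
    for _ in range(cycles):
        # Enforce the minimum gap between requests client-side.
        wait = MIN_POLL_INTERVAL - (time.monotonic() - last_request)
        if wait > 0:
            time.sleep(wait)
        last_request = time.monotonic()

        response = requests.get(FEED_URL.format(stop_id=stop_id), timeout=10)
        response.raise_for_status()
        for prediction in response.json().get("predictions", []):
            print(prediction.get("line"), prediction.get("expected_arrival"))

if __name__ == "__main__":
    poll_arrivals("490008660N")  # hypothetical stop ID
```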

But Reed said gaining visibility into the performance of a cloud service out of the box seems to be a surprisingly difficult thing to do.

“I’m always stunned about how little information there is out of the box though when it comes to monitoring in the cloud. You can always add something in, but really, should I have to? Surely everyone else is in the same position where monitoring actual usage in real-time is fairly important. The way you often have to do this is to specify what you want and then script it, which is a difficult approach to scale,” he said. “You can’t help but think surely this was a ‘must-have’ when people had UNIX systems.”
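
By way of illustration, the “specify what you want and then script it” approach Reed describes might look something like the following minimal sketch: a probe that times a request against a service endpoint and appends the result to a metrics file for a dashboard to pick up later. The URL and file path are placeholders, not any particular monitoring product.

```python
# A bare-bones example of the "specify what you want and then script it"
# approach to monitoring: probe an endpoint, record status and latency.
# The URL and output path are placeholders, not a real monitoring stack.
import json
import time
import urllib.request

TARGET = "https://example.org/healthz"   # hypothetical service endpoint
METRICS_FILE = "latency_metrics.jsonl"   # picked up by some dashboard later

def probe_once() -> dict:
    started = time.monotonic()
    with urllib.request.urlopen(TARGET, timeout=5) as response:
        status = response.status
    return {
        "timestamp": time.time(),
        "status": status,
        "latency_ms": round((time.monotonic() - started) * 1000, 1),
    }

if __name__ == "__main__":
    with open(METRICS_FILE, "a") as metrics:
        metrics.write(json.dumps(probe_once()) + "\n")
```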

Monitoring (and analytics) will be important for Reed’s team as they expand their use of the cloud, particularly within the context of the journey data TFL publishes. Reed said those systems, while in a strong position currently, will likely see much more action as TFL pursues a strategy of encouraging use of the data outside the traditional transport or journey planning app context.

“What else can we do to that data? How can we turn it around in other ways? How can other partners do the same? For us it’s a question of exploiting the data capability we have and moving it into new areas,” he said.

“I’m still not convinced of the need to come out of whatever app you’re in – if you’re looking at cinema times you should be able to get the transportation route that gets you to the cinema on time, and not have to come out of the cinema listings app. I shouldn’t have to match the result I get in both apps in order to plan that event – it should all happen in one place. It’s that kind of thinking we’re currently trying to promote, to think more broadly than single purpose apps, which is where the market is currently.”

Data-as-a-service provider Delphix buys data-masking specialist Axis

Delphix has acquired data masking and de-identification specialist Axis Technology Software

Data management provider Delphix has acquired Axis Technology Software, a data masking software specialist, for an undisclosed sum.

Delphix offers software that helps users virtualise and deploy large application databases (e.g. ERP) on private and public cloud infrastructure, while Axis offers data masking and de-identification software, particularly for large financial services firms, healthcare providers and insurers.

Delphix said the move will give it a boost in verticals where Axis is already embedded, and help strengthen its core offering. By adding data masking and de-identification capabilities to its data services suite, the company hopes to improve the appeal of its offerings from a security and privacy perspective.

“We believe that data masking – the ability to scramble private information such as national insurance numbers and credit card information – has become a critical requirement for managing data across development, testing, training and reporting environments,” said Jedidiah Yueh, chief executive of Delphix. “With Axis, Delphix not only accelerates application projects, but also increases data security for our customers.”
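
As a rough illustration of the concept (and not Axis’s or Delphix’s actual algorithm), deterministic, format-preserving masking can be sketched as follows: digits are scrambled consistently using a keyed hash, so masked records keep their shape and stay usable across test, training and reporting environments.

```python
# A toy sketch of deterministic, format-preserving data masking: each digit is
# replaced pseudo-randomly but consistently, so masked datasets keep their
# shape across environments. Illustrative only; not Axis's actual algorithm,
# and not suitable as-is for production de-identification.
import hashlib
import hmac
from itertools import cycle

SECRET_KEY = b"rotate-me"  # placeholder masking key

def mask_digits(value: str, key: bytes = SECRET_KEY) -> str:
    """Replace each digit in `value`, preserving length, separators and
    letters (e.g. the letters in a national insurance number)."""
    digest = hmac.new(key, value.encode(), hashlib.sha256).hexdigest()
    hex_stream = cycle(digest)
    masked = []
    for ch in value:
        if ch.isdigit():
            masked.append(str(int(next(hex_stream), 16) % 10))
        else:
            masked.append(ch)
    return "".join(masked)

if __name__ == "__main__":
    print(mask_digits("4111 1111 1111 1111"))  # credit card style
    print(mask_digits("QQ 12 34 56 C"))        # NI number style
```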

Following the acquisition, Michael Logan, founder and chief executive of Axis Technology Software, will join Delphix as vice president of data masking, where he will be responsible for driving development and adoption of the feature set Axis brings to Delphix.

“We’ve built a sophisticated platform to secure customer data at Axis, proven at many of the world’s biggest banks and enterprises,” Logan said.

“The integrated power of our platforms will provide our customers the ability to protect their data where and when they need it.”

NTT Data, DiData partner on SAP cloud migration

NTT Data and Dimension Data are helping enterprises move their SAP software into the cloud

NTT Data and Dimension Data announced a partnership this week that will see the two firms jointly offer cloud migration services for SAP application users.

As part of the deal Dimension Data will host SAP Cloud and SAP HANA instances in 16 of its cloud datacentres globally, with NTT Data offering up its implementation, migration and application management services.

The companies said the partnership will enable them to offer clients complete SAP lifecycle management for workloads that are increasingly being shifted into the cloud – even bulkier ERP workloads.

“Many of our clients tell us that they’re ready for the next chapter in cloud, which is moving production applications to a consumption based infrastructure. Until recently, clients found that self-service provisioning is complete, while self-service management of applications in the cloud remains challenging,” explained Steve Nola, Dimension Data’s group executive – ITaaS.

“That’s why Dimension Data is providing software and hands-on attention that will provide our enterprises with the confidence to move their SAP licenses to the Dimension Data cloud,” Nola said.

Kaz Nishihata, executive vice president, global business, NTT DATA said: “NTT DATA will manage the implementation of SAP with fully customized configurations, leverage well-established and automated cloud migration processes, and manage the ongoing services using our proprietary tools to improve service quality and manage costs. Combined with Dimension Data’s robust cloud platform, we can offer clients the mission-critical service levels required for today’s SAP environments.”

To its credit, SAP has worked to make more recent versions of its software both more modular and more cloud-friendly in terms of architecture, but moving SAP applications into the cloud isn’t a particularly easy job given the bulkiness of its suites – and enterprises seem ready to take cues from service providers on how to get the job done.

Last year Colt and Virtustream struck a very similar partnership; the two companies are now working together to migrate enterprises to SAP’s cloud platform and SAP HANA, with the former providing service management and cloud infrastructure and the latter providing SAP-optimised micro-virtualisation and cloud management tools.

Google reveals Bigtable, a NoSQL service based on what it uses internally

Google has punted another big data service, a variant of what it uses internally, into the wild

Search giant Google has announced Cloud Bigtable, a fully managed NoSQL database service that the company said combines its own internal database technology with the open source Apache HBase API.

The company that helped give birth to MapReduce – and, by extension, Hadoop – is now making available the same non-relational database technology that drives a number of its services, including Google Search, Gmail and Google Analytics.

Google said Cloud Bigtable is built on the same Bigtable technology that underpins those internal services, and is accessed and extended through the HBase API (which provides real-time read/write access).
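
To give a feel for the read/write model, here is a minimal, hypothetical sketch using the later google-cloud-bigtable Python client; the project, instance, table, column-family and row-key names are placeholders, and the table is assumed to already exist with an “events” column family.

```python
# A minimal sketch of writing and reading a single row in Cloud Bigtable via
# the google-cloud-bigtable Python client. Project, instance, table and
# column-family names are placeholders; the table and its "events" column
# family are assumed to already exist.
from google.cloud import bigtable

client = bigtable.Client(project="my-project")
instance = client.instance("my-instance")
table = instance.table("user-events")

row_key = b"user#1234#2015-05-06T10:00"

# Write: set one cell in the "events" column family.
row = table.direct_row(row_key)
row.set_cell("events", b"page", b"/checkout")
row.commit()

# Read the row back and print the most recent cell value.
fetched = table.read_row(row_key)
cell = fetched.cells["events"][b"page"][0]
print(cell.value.decode())
```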

“Google Cloud Bigtable excels at large ingestion, analytics, and data-heavy serving workloads. It’s ideal for enterprises and data-driven organizations that need to handle huge volumes of data, including businesses in the financial services, AdTech, energy, biomedical, and telecommunications industries,” explained Cory O’Connor, product manager at Google.

O’Connor said the service, which is now in beta, can deliver more than twice the performance of its direct competitors (though this will likely depend on the use case) at less than half their total cost of ownership.

“As businesses become increasingly data-centric, and with the coming age of the Internet of Things, enterprises and data-driven organizations must become adept at efficiently deriving insights from their data. In this environment, any time spent building and managing infrastructure rather than working on applications is a lost opportunity.”

Bigtable is Google’s latest move to bolster its data services, a central pillar of its strategy to attract new customers to its growing platform. Last month the company announced the beta launch of Google Cloud Dataflow, a Java-based service that lets users build, deploy and run data processing pipelines for applications like ETL, analytics, real-time computation and process orchestration, while abstracting away infrastructure chores like cluster management.
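
The Dataflow SDK referenced here was Java-based; purely as an illustration of the pipeline model it describes, the toy example below uses the Apache Beam Python SDK (the open source successor to the Dataflow SDK) to parse records and sum values per key. The data and step names are invented.

```python
# A toy batch pipeline illustrating the Dataflow/Beam programming model:
# read records, parse them, aggregate per key, write results. Runs locally
# with Beam's DirectRunner; on Google Cloud it would run on the Dataflow
# runner instead. Data and step names are invented for illustration.
import apache_beam as beam

def parse(line: str):
    user, amount = line.split(",")
    return user, int(amount)

if __name__ == "__main__":
    with beam.Pipeline() as pipeline:  # DirectRunner by default
        (
            pipeline
            | "Read" >> beam.Create(["alice,3", "bob,5", "alice,2"])
            | "Parse" >> beam.Map(parse)
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Print" >> beam.Map(print)
        )
```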

Percona buys Tokutek to mashup MySQL, NoSQL tech

Percona acquired Tokutek to strengthen its expertise in NoSQL

Relational database services firm Percona announced it has acquired Tokutek, which provides a high-performance MongoDB distribution and NoSQL services. Percona said the move will allow it to improve support for non-relational database technologies.

Tokutek offers a distribution of MongoDB, called TokuMX, which the company pitches as a drop-in replacement for MongoDB – but with up to 20 times the performance and a 90 per cent reduction in database size.

One of the things that makes it so performant is its use of fractal tree indexing, a data structure that optimises I/O, supports both search and sequential access, and offers much faster insertions and deletions than a conventional B-tree (the same indexing technology can also be applied to MariaDB).
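
The essential trick is write buffering: nodes of the index carry a buffer of pending messages, so inserts are absorbed cheaply near the root and flushed towards the leaves in large batches, trading many small random writes for fewer big ones. The toy sketch below illustrates that buffering idea only – it is not TokuMX’s implementation, and real fractal trees buffer at every level of a B-tree-like structure.

```python
# A toy "buffered tree" illustrating the idea behind fractal tree indexes:
# inserts land in a small in-node buffer and are flushed to leaves in batches,
# amortising the expensive leaf writes. Real fractal trees buffer at every
# internal node; this sketch uses a single root buffer over sorted leaf
# buckets purely for illustration.
import bisect

LEAF_KEYS = [100, 200, 300]          # split points for the leaf buckets
BUFFER_FLUSH_THRESHOLD = 4           # flush once this many messages queue up

class ToyBufferedTree:
    def __init__(self):
        self.buffer = []                                  # pending (key, value) messages
        self.leaves = [dict() for _ in range(len(LEAF_KEYS) + 1)]

    def insert(self, key, value):
        # Cheap path: just append to the root buffer.
        self.buffer.append((key, value))
        if len(self.buffer) >= BUFFER_FLUSH_THRESHOLD:
            self._flush()

    def _flush(self):
        # Batch the buffered messages down into their leaf buckets.
        for key, value in self.buffer:
            self.leaves[bisect.bisect_right(LEAF_KEYS, key)][key] = value
        self.buffer.clear()

    def get(self, key):
        # Reads consult the buffer first (newest wins), then the leaf.
        for k, v in reversed(self.buffer):
            if k == key:
                return v
        return self.leaves[bisect.bisect_right(LEAF_KEYS, key)].get(key)

if __name__ == "__main__":
    tree = ToyBufferedTree()
    for i, k in enumerate([42, 250, 7, 310, 150]):
        tree.insert(k, f"value-{i}")
    print(tree.get(250), tree.get(150))
```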

Percona said the move will position the company to offer a full range of consulting and technology services to support MySQL and MongoDB deployments; Percona Server already supports Tokutek’s TokuDB storage engine as an option, but the acquisition will see it further integrated and shipped as standard.

“This acquisition delivers game-changing advantages to our customers,” said Peter Zaitsev, co-founder and chief executive of Percona. “By adding a market-leading, ACID-compliant NoSQL data management option to our product line, customers finally have the opportunity to simplify their database decisions and on-going support relationships by relying on just one proven, expert provider for all their database design, service, management, and support needs.”

John Partridge, president and chief executive of Tokutek said: “Percona has a well-earned reputation for expert database consulting services and support. With the Tokutek acquisition, Percona is uniquely positioned to offer NoSQL and NewSQL software solutions backed by unparalleled services and support. We are excited to know Tokutek customers can look forward to leveraging Percona services and support in their TokuMX and TokuDB deployments.”

NoSQL adoption is growing at a fairly fast rate as applications – especially cloud apps – shift to handle more and more unstructured data, so it’s likely we’ll see more MySQL incumbents pick up non-relational startups in the coming months.

YouTube brings Vitess MySQL scaling magic to Kubernetes

YouTube is working to integrate a beefed up version of MySQL with Kubernetes

YouTube is working to integrate Vitess, which improves the ability of MySQL databases to scale in containerised environments, with Kubernetes, an open source container deployment and management tool.

Vitess, which is available as an open source project and pitched as a high-concurrency alternative to NoSQL and vanilla MySQL databases, uses a BSON-based protocol that creates very lightweight connections (around 32KB each); its pooling feature uses Go’s concurrency support to map these lightweight front-end connections onto a small pool of MySQL connections, allowing Vitess to handle thousands of simultaneous connections.

It also handles horizontal and vertical sharding, and can dynamically rewrite queries that would otherwise hurt database performance.
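
Vitess implements its pooling in Go with goroutines; the toy sketch below (not Vitess code) illustrates the underlying idea in Python: many concurrent front-end requests are multiplexed over a small, fixed pool of real database connections held in a bounded queue.

```python
# A toy illustration of connection multiplexing: many concurrent client
# requests share a small, fixed pool of "real" database connections via a
# bounded queue. Vitess implements this idea in Go with goroutines; this is
# not Vitess code, just the pooling concept.
import queue
import threading
import time

POOL_SIZE = 4          # a handful of real MySQL connections
CLIENT_REQUESTS = 20   # many lightweight front-end requests

class FakeDBConnection:
    def __init__(self, conn_id: int):
        self.conn_id = conn_id

    def query(self, sql: str) -> str:
        time.sleep(0.05)  # pretend the query takes a little while
        return f"conn{self.conn_id}: {sql}"

pool: "queue.Queue[FakeDBConnection]" = queue.Queue()
for i in range(POOL_SIZE):
    pool.put(FakeDBConnection(i))

def handle_client(request_id: int) -> None:
    conn = pool.get()              # blocks until a pooled connection is free
    try:
        print(conn.query(f"SELECT * FROM videos WHERE id = {request_id}"))
    finally:
        pool.put(conn)             # always return the connection to the pool

threads = [threading.Thread(target=handle_client, args=(i,)) for i in range(CLIENT_REQUESTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```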

Anthony Yeh, a software engineer at YouTube, said the company currently uses Vitess to handle metadata for its video service, which serves billions of video views a day and takes in 300 hours of new video uploads every minute.

“Your new website is growing exponentially. After a few rounds of high fives, you start scaling to meet this unexpected demand. While you can always add more front-end servers, eventually your database becomes a bottleneck.”

“Vitess is available as an open source project and runs best in a containerized environment. With Kubernetes and Google Container Engine as your container cluster manager, it’s now a lot easier to get started. We’ve created a single deployment configuration for Vitess that works on any platform that Kubernetes supports,” he explained in a blog post on the Google Cloud Platform website. “In this environment, Vitess provides a MySQL storage layer with improved durability, scalability, and manageability.”

Yeh said the company is just getting started with the Kubernetes integration, but that once it is complete users will be able to deploy Vitess in containers with Kubernetes on any cloud platform it supports.

Cloud-based data management provider Reltio scores $10m

Reltio scored $10m, which will be used to expand its sales and marketing efforts

Reltio, a startup founded by Informatica veterans, has secured $10m in its first round of funding and announced the launch of its cloud-based data management platform.

Much like the integration business Informatica specialises in, Reltio is pitching its services at those who don’t necessarily want to acquire and set up front-end and back-end big data tools in piecemeal, siloed fashion, but instead want an integrated platform that can query, analyse and display multiple data types.

The company said its data management platform is designed for users accustomed to services like Facebook or LinkedIn, but working within traditionally data-intense industries like healthcare and life sciences, oil and gas, and retail and distribution.

“Data is the new natural resource, but it’s truly valuable only when it’s effectively mined, related and transformed into insight with business actions that can be taken within the context of day-to-day operations,” said Manish Sood, founder and chief executive officer of Reltio.

“With Reltio, data is collated and analysed for actionable intelligence with the speed needed to support innovation and spark new revenue streams. IT gets a modern data management platform while business users get easy to use data-driven applications to address their everyday needs,” Sood said.

The company was founded largely by Informatica data management specialists: Sood led product strategy for master data management at Informatica; Anastasia Zamyshlyaeva, chief architect for Reltio, helped design the core components of Informatica’s MDM offering; Curt Pearlman, vice president of solutions, previously held positions in sales consulting with Informatica, as did Bob More, Reltio’s senior vice president of sales.

Reltio is throwing its hat into an increasingly competitive but lucrative ring. Analyst firm IDC estimates spending on big data and analytics will reach $125bn in 2015, with Database-as-a-Service growing in importance as cloud and commercial vendors open up their data sets.

FairCom’s Newest c-treeACE Bridges SQL, NoSQL Worlds

FairCom today announced the tenth major edition of its cross-platform database technology, c-treeACE V10, which introduces the industry’s first Relational Multi-Record Type support for seamless integration between the relational and non-relational database worlds.

c-treeACE V10 also delivers features such as new Java interfaces, performance and scalability enhancements, additional platform support, and new replication models. With this latest version come significant performance gains including 30 percent faster transaction throughput, 60 percent faster SQL performance, 200 percent better replication throughput, and 26 percent faster read performance.

“The database market is growing substantially, yet there are many problems plaguing developers today: large data volumes; requirements to reduce data access time; data access requirements from a myriad of new locations, like mobile devices and the cloud; trickier integration; and decreasing budgets,” said Randal Hoff, FairCom’s VP of Engineering. “Engineers tell us they really need technology that enables them to work seamlessly within both the relational and non-relational worlds. In the past, they’ve felt forced to choose one or the other, when, in fact, they realize concrete benefits from both. Our newest c-treeACE gives them the flexibility to enjoy the best of both worlds: high performance data throughput levels that a NoSQL database can provide; and concurrent relational access for ease of data sharing with other parts of the enterprise, including cloud and mobile devices, all at a reasonable price point.”

For more than 30 years, FairCom has offered enterprise database developers and ISVs a model not available from off-the-shelf databases. Its c-treeACE offers the highest levels of tailored configuration and control while simultaneously supporting a variety of non-relational APIs (e.g. ISAM, .NET, and JTDB) along with industry-standard relational APIs (e.g. SQL, JDBC, ODBC, PHP, Python) within the same application, over the same data. Enterprises such as Federal Express, Microsoft, NASA and Visa have used FairCom technology in mission-critical solutions.
