Progress Introduces DataDirect Connect for ADO.NET 4.0

Progress Software Corporation today introduced Progress DataDirect Connect® for ADO.NET 4.0, the latest version of its suite of ADO.NET (ActiveX Data Objects for .NET) data providers, which eliminates the need for database client libraries through a managed, wire-protocol architecture. The new release adds Microsoft Entity Framework support, Visual Studio LightSwitch support and expanded enterprise capabilities that let developers rapidly create, deploy and manage high-performance, mission-critical desktop and Cloud applications.

Tatsuro Hayashi, Corporate Development Office Manager for MapQuest Corporation in Japan, noted: “Using the DataDirect Connect for ADO.NET 4.0 data provider, we can develop high performing, reliable ADO.NET applications that offer bulk load functionality, application failover, and top of the line security for our customers. Because managed code runs in the Common Language Runtime (CLR) environment, it reduces risks such as memory leaks and closes security holes that unmanaged code leaves exposed. This means our programmers and developers have secure, reliable and versatile deployment options available in either application or client-server environments. Additionally, our software developers and programmers can avoid the versioning headaches associated with other .NET providers as well as multiple versions of client libraries and databases used across our company.”

As more .NET applications move to the Cloud, high performance and efficiency become even more important. The Progress DataDirect Connect for ADO.NET 4.0 data providers are a critical component in delivering on that goal, significantly lowering operating costs for Cloud applications by using less memory and fewer CPU cycles. The DataDirect Connect for ADO.NET 4.0 data providers deliver top performance, as well as significantly higher levels of security and authentication, reliability and failover, connection pool management and multi-database consistency not found in other providers. This enables developers to create higher-performing applications with greater uptime and better security.


Assay Depot, AstraZeneca Launch Virtual Research Laboratory for Drug Researchers

Assay Depot Inc. today announced the launch of a virtual research laboratory. Developed in partnership with AstraZeneca, the virtual laboratory is a vendor relationship management (VRM) system that gives researchers easy access to a distributed network of thousands of research service providers located inside and outside the company.

“We brought together features of today’s favorite consumer websites to create a virtual laboratory that empowers scientists,” stated Kevin Lustig, Assay Depot’s CEO. “A small group of talented scientists can now run an entire drug discovery program, from concept to clinic, from a laptop computer.”

The private virtual laboratory (aka Research Exchange) enables researchers to search for research services and vendors, communicate with experts, purchase services, and rate and review services. In one simple and intuitive interface, scientists identify experts, initiate research collaborations, and track entire projects to completion.

Researchers can access the virtual laboratory from anywhere they access the Internet, including tablets and mobile devices. They can view their colleagues’ ratings and reviews, view past transactions and determine at a glance which vendors have current legal agreements.

“The virtual drug discovery era has arrived,” said Chris Petersen, Assay Depot’s CIO. “Enabling research scientists to access any service and any expert in just a few mouse clicks can dramatically improve productivity, reduce costs and promote innovation.”

A private virtual laboratory enables complete transparency across global research operations while retaining local control over research sourcing decisions. It simplifies legal and compliance verification, standardizes sourcing governance and serves as a versatile all-in-one platform that benefits the entire chain of pharmaceutical stakeholders, including discovery research, supply chain and consumer health.


SYS-CON.tv Interview: Data Center Fabric for Cloud Computing

“We’ve announced the Data Center Fabric product line that allows you to spin up a brand new data center type within your data center,” stated Bruce Fingles, CIO and VP of Product Quality at Xsigo, in this SYS-CON.tv interview with Cloud Expo Conference Chair Jeremy Geelan at the 10th International Cloud Expo, held June 11–14, 2012, at the Javits Center in New York City.
Cloud Expo 2012 Silicon Valley, November 5–8, at the Santa Clara Convention Center in Santa Clara, CA, will feature technical sessions from a rock star conference faculty and the leading Cloud industry players in the world.


Oracle to Acquire Xsigo

Oracle announced on Monday that it has entered into an agreement to acquire Xsigo Systems, a provider of network virtualization technology.
Xsigo’s software-defined networking technology simplifies cloud infrastructure and operations by allowing customers to dynamically and flexibly connect any server to any network and storage, resulting in increased asset utilization and application performance while reducing cost.
The company’s products have been deployed at hundreds of enterprise customers including British Telecom, eBay, Softbank and Verizon.


Enterprise IT: Don’t Bury Your Head in the Sand on Cloud Data Protection

In a lot of ways, changes in enterprise technology have mirrored consumer technology trends over the last few years. Just look at mobility, social apps, and cloud computing as examples. And whether you view the crossover as advantageous, overly risky, or merely inevitable, it's certain that enterprise IT security must adapt quickly to the new challenges presented, including cloud data security, cloud data residency, and cloud compliance issues.
A series of recent articles and industry analyst reports I have read shows a clear trend: business-critical apps like CRM, storage, and collaboration, enterprise IT's mainstays, are now moving toward cloud and mobile despite the inherent risks. In the case of mobile, workers are simply more productive and comfortable using their smartphones, laptops, and tablets to access company information of all types from a variety of locations. Similarly with the cloud, the benefits are simply too compelling for the business organizations inside the enterprise. Rather than burying their heads in the sand, enterprise IT managers need to adapt security practices accordingly or ban the use of personal devices and the cloud altogether, an unlikely choice given recent trends.


Arkessa Chooses Telecloud, the Data Centre Cloud

Arkessa, a major technology provider within the machine-to-machine (M2M) sector, has turned to leading global data centre provider Telehouse, along with its telecom and ICT parent company KDDI, to design and build a virtualised private Cloud platform within their London data centres.

KDDI’s Cloud architects designed an infrastructure-as-a-service (IaaS) solution capable of supporting the growth of the business, allowing Arkessa to focus on developing and building its portfolio of innovative solutions and customer relations.

Headquartered in Cambridge, and with services throughout Europe, the Americas, the Far East and Australasia, Arkessa operates in the growing and highly competitive M2M industry, where it enables remote devices to be operated, monitored, managed and controlled as though they are connected directly to the user’s desktop, tablet or smartphone. The mission-critical application is used by the security, construction, public transport and other key industries which require super-fast data transfer from remote locations to central control hubs.


How a SaaS product shook the foundations of its on-premise rival

What’s an example of on-premise software being threatened by the emergence of a SaaS upstart operating in the cloud? Answer: Mint vs. Quicken.


Quicken had, for years, cultivated a devoted base of users who waited for each new version of the software to appear from Intuit. The nifty little personal finance tool is easier to use than QuickBooks, albeit more limited in scope.

Then along came an amazingly simple and intuitive online offering called Mint, which promised to integrate all your personal finances into one single web interface, with no software to download. And it was free!

In 2009, TechCrunch reported that Mint claimed it was gaining 3,000 users per day, jumping from 600,000 to 850,000 in just a few short months. Intuit disputed the numbers, but it suddenly understood how the cloud could reshape a landscape it had dominated for years.

This led Intuit to …

BISNOW Data Center Event Highlights Cloud

A big thank you to BISNOW and my fellow panel members for an outstanding discussion at last week’s Data Center Investment Conference and Expo, a very informative event. The federal marketplace is certainly being changed by cloud, and the data center industry is certainly willing and able to support this important transition.
Our federal roundtable, moderated by purple tie-wearing King & Spalding partner JC Boggs, explained that agencies are all moving to the cloud at different speeds, with the most mission-critical programs moving the slowest. FAA CIO Steve Cooper said his agency is starting to stand up federal private clouds, with plans to share that environment with other government agencies as well as offer them cloud-based services. But Cooper said he won’t be the first CIO to go into a public cloud and risk the public embarrassment of a security breach.


How to Survive a Cloud Outage

Cloud computing solutions offer organizations tremendous opportunities for efficiency, flexibility, and scalability. They also create some scenarios in which a business can find itself in trouble. A single cloud outage can shut an organization down, for example.

While it may seem unlikely that a provider will experience a severe interruption of service, it’s important to realize that it does happen. Amazon Web Services, for example, experienced a four-day outage for some of its clients in 2010. That kind of outage can grind business to a halt.

Fortunately, there are some things you can do to make sure your mission-critical apps survive a cloud outage. Here are some procedures you need to put in place as soon as possible:

  1. Take advantage of cloud provider options. Some providers will offer you multiple availability zones or multiple availability regions. What this boils down to is that your particular solution is housed in multiple locations, each insulated from a disaster at the other. This usually comes at a premium, but for many critical applications it will be worth it.
  2. Consider multiple providers. Sometimes, especially with mission-critical applications, you can develop a multi-provider architecture. To be sure, you need to do some investigation here; some cloud providers are actually likely to share data center resources with each other. It doesn’t do you any good to have multiple providers if each is relying on the same physical location to provide you services.
  3. Include availability in your service level agreement. Whenever possible, you need to make sure that your SLA outlines specific consequences when there is a disruption in service. Define an acceptable level of availability for the given cloud application. For example, if you’re relying on a cloud provider to give you disaster recovery services, you might require 99.999% availability.
  4. Think twice before putting some applications into the cloud. If you’re not willing to take the risk and do what needs to be done to ensure availability or survive an outage, you may not be ready for public cloud solutions. It’s not that cloud solutions can’t meet your needs; you just need to go in with eyes wide open and be ready to accept the associated risks.
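To put the availability figures from point 3 in perspective, a short sketch (plain Python, with illustrative percentages; the function name is my own, not from any SLA tooling) converts an SLA availability percentage into the maximum downtime it permits per year:

```python
# Convert an SLA availability percentage into the maximum
# downtime it allows over a one-year period.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def max_downtime_minutes(availability_pct: float) -> float:
    """Return the yearly downtime budget, in minutes, for a given
    availability percentage (e.g. 99.999)."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.9, 99.99, 99.999):
    print(f"{pct}% availability -> {max_downtime_minutes(pct):.2f} min/year")
```

By this measure, 99.999% ("five nines") leaves barely five minutes of downtime per year, which is why such a clause carries real teeth in an SLA negotiation.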


Big Data and Cloud – Managing the Explosion of Data

There is a data explosion occurring in the world. As more and more data is collected, the challenge is how to use it effectively and securely. Much of the data being created is neither transaction-oriented nor structured, and it cannot be managed effectively with traditional data management systems, which cannot process it. Big Data requires adequate storage and distributed processing, and it injects heterogeneous data types that need proper integration into the existing infrastructure. Social networks have been generating tremendous amounts of such data. Left unmanaged, Big Data is a nightmare, so it is important to handle it effectively. Big Data tools can manage and analyze both structured and unstructured data and extract information from large data sets.
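To make the structured/unstructured distinction concrete, here is a minimal sketch (plain Python, with made-up sample records; the field names and the parsing heuristic are illustrative assumptions, not taken from any specific Big Data tool) that normalizes a mix of structured rows and free-text lines into one common shape before analysis:

```python
# Normalize a mix of structured records and unstructured text lines
# into a common (user, event) shape so they can be analyzed together.

structured = [
    {"user": "alice", "action": "purchase", "amount": 42.0},
    {"user": "bob", "action": "login", "amount": 0.0},
]
unstructured = [
    "alice posted a status update",
    "bob shared a photo",
]

def normalize(records, raw_lines):
    """Yield (user, event) pairs from both kinds of input."""
    for rec in records:
        yield rec["user"], rec["action"]
    for line in raw_lines:
        words = line.split()
        # Crude heuristic for the free text: first word is the user,
        # second word is the event.
        yield words[0], words[1]

events = list(normalize(structured, unstructured))
print(events)  # [('alice', 'purchase'), ('bob', 'login'), ('alice', 'posted'), ('bob', 'shared')]
```

Real Big Data pipelines do the same normalization at scale and with far more robust parsing, but the principle is identical: heterogeneous inputs must be mapped onto a shared schema before they can be queried together.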

