Archivo de la categoría: COBOL

GFT and CloudFrame help industries say ‘cheerio’ to COBOL

GFT, a digital transformation pioneer, and CloudFrame, a provider of digital transformation pathways for large organisations running mission-critical applications on COBOL, have partnered to help COBOL users move to more efficient platforms and reduce their overall mainframe costs. CloudFrame’s proprietary technology converts COBOL code into more efficient…

Ciber machine will convert Cobol into cloud-ready code

Legacy code is keeping enterprises from migrating to the cloud

Service provider Ciber claims it has solved one of the most expensive problems in business: upgrading legacy systems to make them secure and cloud-friendly.

Its new system, Ciber Momentum, converts code written in languages such as Cobol, Ada and Pascal into a more cloud-ready format. By automating the conversion of legacy source code into a modern format, the Momentum system creates massive time and money savings on projects that can take up to three years when carried out manually, according to Ciber.
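To illustrate the kind of transformation such tools perform, here is a generic, hypothetical example (not Momentum’s actual output): a fixed-point COBOL compute statement rendered as an equivalent Java method, with decimal PIC fields mapped to BigDecimal so rounding behaviour is preserved.

    // Hypothetical illustration of automated COBOL-to-Java conversion.
    // Original COBOL (paraphrased):
    //   01 WS-PRINCIPAL  PIC 9(7)V99.
    //   01 WS-RATE       PIC 9(3)V9999.
    //   01 WS-INTEREST   PIC 9(7)V99.
    //   COMPUTE WS-INTEREST ROUNDED = WS-PRINCIPAL * WS-RATE / 100.
    // Converters typically map fixed-point fields to BigDecimal so that
    // decimal arithmetic and rounding carry over unchanged.
    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class InterestCalculation {

        // Equivalent of: COMPUTE WS-INTEREST ROUNDED = WS-PRINCIPAL * WS-RATE / 100.
        public static BigDecimal computeInterest(BigDecimal principal, BigDecimal rate) {
            return principal.multiply(rate)
                            .divide(new BigDecimal("100"), 2, RoundingMode.HALF_UP);
        }

        public static void main(String[] args) {
            BigDecimal interest = computeInterest(new BigDecimal("12500.00"),
                                                  new BigDecimal("3.2500"));
            System.out.println("WS-INTEREST = " + interest);   // prints WS-INTEREST = 406.25
        }
    }

The value of automating this step is consistency: thousands of such statements are translated the same way, rather than being re-implemented by hand one at a time.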

Citing Gartner research, Ciber says companies spend 70 per cent of their IT budgets on maintaining existing systems, leaving only 30 per cent available for new projects. A large part of the reason is that few programmers familiar with the languages behind these legacy applications are still available for hire.

As a result, Cobol programmers, for example, are three times as expensive to hire as developers in modern languages and, with so few candidates to choose from, companies find it difficult to dictate terms.

Since converting a trading system written in Cobol can take years, this creates a crippling expense and leaves companies vulnerable to competition from cloud-based start-ups that can move much faster, Ciber claims. Legacy apps are not only inflexible, they are also more likely to be a security liability, said Michael Boustridge, president and chief executive of Ciber.

“Most of the time when companies get hacked, the criminals are exploiting vulnerabilities in an old system,” said Boustridge. “Legacy computers are not secure.”

Boustridge said Ciber intends to reverse the formula for the industry, so that CIOs will be able to spend 70 per cent of their budgets on new projects and only 30 per cent on maintenance.

The fast track to the cloud can only be 80 to 85 per cent automated, as some human checking and balancing will still be necessary. Even so, Boustridge claimed that conversion project times will be halved.

The automated system will also uncover anomalies in legacy coding. These logical inconsistencies were often created by programmers who were notorious for over-complicating systems in order to inflate their value to their employers, according to Boustridge. “Anything in the old code that doesn’t add up will be exposed,” said Boustridge.
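As a purely illustrative example of what “doesn’t add up” can mean in practice (written here in Java for readability; the equivalent pattern turns up in legacy Cobol conditionals), consider a guard that is always true, leaving a branch that can never execute; this is the kind of inconsistency automated conversion tooling tends to flag.

    // Illustration only: a self-contradictory condition of the sort automated
    // analysis surfaces when legacy code is examined. The else branch is dead.
    public class DeadBranchExample {

        static String classify(int balance) {
            if (balance >= 0 || balance < 0) {   // always true: every int satisfies it
                return "PROCESSED";
            } else {
                return "SUSPENDED";              // unreachable, yet it looks meaningful
            }
        }

        public static void main(String[] args) {
            System.out.println(classify(-42));   // prints PROCESSED
        }
    }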

The system, now on global release, will be available for partners to white-label and offer as part of their own client services.

BluePhoenix Moves Mainframe COBOL, Batch Processing to the Cloud

BluePhoenix has released its Cloud Transaction Engine and Batch In The Cloud service. The Cloud Transaction Engine (CTE) is a module of the company’s soon-to-be-released ATLAS Platform. CTE is a proprietary codebase that enables mainframe processes to be run from off-mainframe infrastructure. BluePhoenix’s Batch In The Cloud service is the first formal offering leveraging CTE capabilities.

“Batch In The Cloud uses off-mainframe, cloud-based processing power to reduce mainframe MIPS and total cost of ownership,” explains Rick Oppedisano, BluePhoenix’s Vice President of Marketing. “The huge array of virtual machines in the cloud brings greater performance and scalability than the mainframe. Jobs can be processed quicker at a lower cost. It’s a great way for customers to save money immediately and explore options for an eventual mainframe transition.”

The Batch In The Cloud service is supported on private or public clouds, including Microsoft’s Azure and Amazon’s EC2. It is designed to support COBOL, CA GEN and Natural/ADABAS mainframe environments.

“In a typical scenario, workloads continue to grow while the mainframe’s processing power and batch window stay the same,” says BluePhoenix’s VP of Engineering, Florin Sunel. “Our technology acts as a bridge between the mainframe and cloud. With Batch In The Cloud, all business logic is preserved. Customers can reduce usage cost by running jobs like reporting from the cloud platform rather than the mainframe. In that scenario, they can also add business value by using modern business intelligence tools that aren’t compatible with the mainframe to gain insight from their data.”
Adds Oppedisano, “Beyond the immediate cost savings, this technology creates a competitive advantage. Exposing data in an off-mainframe location empowers the customer to become more agile. Not only can they process reports faster, but they can slice and dice their data to get a broader perspective than competitors who keep data on the mainframe.”
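As a concrete, purely hypothetical sketch of the sort of reporting step that could run off-mainframe in such a setup (the file name, record layout and class names are invented for illustration, and nothing here is BluePhoenix’s actual engine), the fragment below aggregates a batch extract that has been staged in cloud storage as a CSV file:

    // Hypothetical off-mainframe batch reporting step, for illustration only.
    // Assumes transaction records have been extracted from the mainframe and
    // staged as a CSV file with the layout: account,amount
    import java.io.IOException;
    import java.math.BigDecimal;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.Map;
    import java.util.TreeMap;

    public class DailyTotalsReport {

        // Sum the amount column per account across the whole extract.
        public static Map<String, BigDecimal> totalsByAccount(Path extract) throws IOException {
            Map<String, BigDecimal> totals = new TreeMap<>();
            for (String line : Files.readAllLines(extract)) {
                String[] fields = line.split(",");             // account,amount
                totals.merge(fields[0], new BigDecimal(fields[1]), BigDecimal::add);
            }
            return totals;
        }

        public static void main(String[] args) throws IOException {
            // e.g. java DailyTotalsReport /staging/transactions-today.csv
            totalsByAccount(Paths.get(args[0]))
                .forEach((account, total) -> System.out.println(account + " " + total));
        }
    }

Once the data sits off the mainframe in this form, it can also be loaded into SQL Server or other BI tooling, which is the additional business value both executives describe.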

“By moving batch workloads to Windows Azure or a Microsoft Private Cloud, companies are able to take advantage of cloud economics,” said Bob Ellsworth, Microsoft Worldwide Director of Platform Modernization. “Combined with the advanced analytics included in SQL Server, the customer not only realizes great savings, scale and flexibility but also gains increased business value through self-service BI.”

BluePhoenix is offering a free Proof of Concept for the Batch In The Cloud service. “To manage the scale and demand, we’re going to start with a complimentary assessment of the customer environment to identify the most appropriate applications for this service,” says Oppedisano. “Once those applications are identified, we will build the roadmap and execute the Proof of Concept on the cloud platform of the customer’s choice.”

COBOL in the Cloud

Heirloom Computing Inc. today announced a new partnership with Java Platform as a Service (PaaS) provider CloudBees to speed the transition of mainframe workloads to the CloudBees PaaS. With the partnership, Heirloom will help IT managers lower costs and modernize their COBOL-based mainframe workloads by deploying them to the cloud, utilizing Heirloom Elastic COBOL and the CloudBees Platform.
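Elastic COBOL’s general approach is to compile COBOL into Java so the resulting programs can run on any JVM-based platform such as the CloudBees PaaS. The sketch below shows the general idea only: CustomerInquiry stands in for a compiled-from-COBOL program, and every name and signature here is invented for illustration rather than taken from Heirloom’s actual generated API.

    // Illustration only. CustomerInquiry stands in for a COBOL program that a
    // COBOL-to-Java compiler has turned into an ordinary Java class; its name
    // and entry point are hypothetical.
    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Stub standing in for the compiled-from-COBOL program.
    class CustomerInquiry {
        String run(String customerId) {
            return "CUSTOMER " + customerId + " STATUS ACTIVE";
        }
    }

    // Once the COBOL workload is plain Java, it packages into a WAR and deploys
    // to a Java PaaS like any other web application component.
    public class CustomerInquiryServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String customerId = req.getParameter("customerId");
            String result = new CustomerInquiry().run(customerId);
            resp.setContentType("text/plain");
            resp.getWriter().println(result);
        }
    }

Deployment then follows the PaaS’s normal workflow rather than anything mainframe-specific.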

For more information about Heirloom Computing and how Heirloom can help you transition mainframe workloads to the cloud, please visit www.heirloomcomputing.com. For more information about CloudBees, please visit www.cloudbees.com.