Category Archives: API

Container Solutions brings production environment to the developer's laptop

London-based Container Solutions has released the latest version of its minimesos project, an open source testing and experimentation tool for Apache Mesos, which it claims brings production orchestration testing to the development environment.

The new offering targets the challenge of moving microservice applications from a developer's laptop to the production environment, which can prove complicated because the target platform differs from the local one. It allows developers to bring up a containerised Apache Mesos cluster on their laptop, creating a production-like environment for building, experimenting and testing.

“When we started building a number of Mesos frameworks, we found it hard to run and test them locally,” said Jamie Dobson, CEO of Container Solutions. “So, we ended up writing a few scripts to solve the problem. Those scripts became minimesos, which lets you do everything on your laptop. We later integrated Scope so that developers could visualise their applications. This made minimesos even more useful for exploratory testing.”

The company claims developers can now start a Mesos cluster through the command line or via the Java API. Each cluster is logically isolated, as its processes run in separate Docker containers. Minimesos also exposes framework, state and task information through its Cluster State API.
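As a rough illustration, a local session with the minimesos command-line tool looks something like the following sketch. Command names follow the project's documentation, but exact flags and output vary between releases, so treat this as indicative rather than definitive:

```
# Spin up a containerised Mesos cluster on the local Docker daemon
minimesos up

# Show cluster information, including the Mesos master endpoint
minimesos info

# Query the Cluster State API for framework, state and task details
minimesos state

# Tear the cluster down again when finished
minimesos destroy
```

The same lifecycle is available programmatically through the Java API, which is what makes the tool usable from automated framework tests.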

Atlassian launches Bitbucket Pipelines

Atlassian has announced a number of new developments within its team collaboration software portfolio, including the launch of its Bitbucket Pipelines platform.

The new platform extends the cloud-based Bitbucket code repository to give teams an entire continuous delivery workflow, from source to deployment, in the cloud. Atlassian claims the new proposition helps developers who struggle to apply on-premises continuous integration and delivery tools as software development and production applications shift into the cloud.
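Pipelines is driven by a YAML file checked into the repository. The sketch below is a hypothetical `bitbucket-pipelines.yml` (the image name and script commands are assumptions for a Node.js project, not Atlassian defaults): on each push, the listed script steps run in sequence inside the named Docker image.

```yaml
# Hypothetical bitbucket-pipelines.yml: build and test on every push
image: node:4            # Docker image the steps run in (assumed)

pipelines:
  default:               # applies to branches without a specific pipeline
    - step:
        script:          # commands executed in order inside the image
          - npm install
          - npm test
```

Keeping the build definition in the repository alongside the code is what lets the whole source-to-deployment workflow live in the cloud.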

“Atlassian is helping teams across all industries do amazing things. We’re helping developers at Cochlear build aural implants to help people hear, The Telegraph to inform millions of readers each day, and Lufthansa Systems to provide IT services for everything from aviation safety to entertainment,” said Sri Viswanath, CTO, Atlassian. “The common thread between these teams and your own is the need to work smarter and faster. We’re seeing more and more of these teams choosing to collaborate in the cloud. In fact, over half of our customers choose to collaborate in the cloud and an even higher number of new customers select our cloud offerings.”

Elsewhere, the team also launched a native mobile platform to increase connectivity between departments using the Confluence and JIRA tools, building on enterprise mobility trends. It has also opened up its JIRA Service Desk product to developers, who can now build add-ons that create and update requests or extend JIRA Service Desk's automation capabilities to react to changes in requests.

The company has also joined the Open API Initiative and replaced its existing API documentation using a custom site generator, RADAR, which it has released as open source for use by any Open API provider.

“Collectively, we have a lot to gain from an open, widely accepted definition language for REST APIs,” said Viswanath. “We’re committed to actively contributing to the standard and are now a member of the Open API Initiative and the Linux Foundation, alongside industry leaders like Google, Microsoft, PayPal and others.”

Linux Foundation wants to extend Swagger in connected buildings

Members of the Linux Foundation have met in San Francisco to push the newly announced Open API Initiative. The collective wants to harmonise efforts in the development of connected building technology.

Founding members of the Open API Initiative, including Google, IBM, Intuit, Microsoft and PayPal, want to extend the range of Swagger, the popular framework for building application programming interfaces (APIs). Their collective ambition is to create a vendor-neutral, portable and open specification for providing metadata for APIs based on the representational state transfer (REST) architecture.

A new open specification could let humans and computers discover and understand the potential of their proposed connected building services with minimal implementation logic. The Initiative will also promote the use of an open API standard.

Swagger, created in 2010 and released under an open source license a year later, is a description format used by developers in a broad range of industries to design and deliver APIs that support a variety of connected applications and services. Downloads of Swagger and Swagger tooling have tripled in the last year, making it the most popular open source framework for defining and creating RESTful APIs.
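To illustrate the kind of metadata Swagger captures, here is a minimal description of a connected-building endpoint in the Swagger 2.0 format. The service, path and fields are invented for the example; only the structural keywords come from the specification:

```yaml
swagger: "2.0"
info:
  title: Building Sensors API      # hypothetical service
  version: "1.0"
paths:
  /rooms/{roomId}/temperature:     # invented endpoint
    get:
      summary: Read the current temperature for a room
      parameters:
        - name: roomId
          in: path
          required: true
          type: string
      responses:
        "200":
          description: Current temperature reading
```

Because the description is machine-readable, both humans and tooling can discover what the API offers without reading its implementation, which is the "minimal implementation logic" goal the Initiative describes.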

SmartBear recently acquired the Swagger API open source project from Reverb Technologies and today is working with its industry peers to ensure the specification and format can continue to evolve. The open governance model for the Open API Initiative includes a Technical Developer Committee (TDC) that will manage the specification and keep users abreast of developments.

“Swagger is considered one of the most popular frameworks for building APIs. When an open source project reaches this level of maturity, it just can’t be managed by one entity,” said Jim Zemlin, executive director at The Linux Foundation. “The Open API Initiative will extend this technology to advance connected application development through open standards.”

IBM targets API Harmony in the cloud

IBM is to use the cloud to deliver IT developers into a state of API Harmony.

The vendor turned service provider has launched an intelligent cloud-based matchmaking technology that helps coders instantly find the right application programming interface (API) for the right occasion. The service, API Harmony, targets a global API economy that IBM's internal research predicts will be worth $2.2 trillion by 2018.

The system uses cognitive technologies to anticipate the needs of developers as they build new apps, pre-empting delays by anticipating interface challenges and researching the answers. It then makes useful time-saving recommendations on which APIs to use, which API relationships matter, and anything that might be missing.

IBM research characterises the API economy as the commercial exchange of business functions and competencies through APIs, and says it is the driving force behind most digital transformation across industries today. By 2018, according to research company Ovum, the number of enterprises with an API program will have grown by 150%.

There are three pillars of harmoniousness in the API economy, according to IBM, and its API Harmony service accordingly has three main components: Strategy, Technologies and Ecosystems. The Strategy element consists of IBM's API Economy Journey Map, in which consultants help clients identify key opportunities and gauge their readiness for the journey. The Technologies aspect of the service is built on the previously described intelligent matchmaking systems and cloud delivery. The Ecosystems component is the fruit of an IBM collaboration with the Linux Foundation to create an open platform for building, managing and integrating open APIs.

IBM’s Watson APIs are managed by IBM API Management on Bluemix, bringing approximately 30 cognitive-based APIs to industries.

“To succeed in the API Economy enterprises need an open ecosystem and IBM is helping guide clients every step of the way,” said Marie Wieck, General Manager for IBM Middleware.

New Mendix system replaces programme writing with system modelling in the cloud

Development system maker Mendix claims its new system can speed application development by replacing programme writing with application modelling.

It claims the newly announced Mendix 6 system makes it possible to build digital applications quickly by importing and exporting a range of previously made models. It has also introduced a mechanism that supports offline functions in mobile applications, so that mobile workers can still use their cloud applications when cut off from a network.

Mendix claims developers can build mobile applications that make use of static resource storage, and use data and data entry caching in order to maintain consistency of user experience and performance when offline.

The Mendix 6 Model API (application programming interface) and open source platform software development kit will help companies avoid vendor lock-in, help them migrate from or modernize legacy systems, automate tasks and – through fault finding analytical systems – create a new level of quality assurance, claims Mendix.

The processes of legacy migration and modernisation are supported by a 'model importing system' which, in effect, allows would-be developers to reuse development models that have worked successfully in similar situations elsewhere. This, claims Mendix, allows organisations to 'accelerate application modernisation at massive scale'.

The model exchange function also aims to save customers time by making it easier to create examples for documentation, to move applications to other platforms and to increase transparency.

Customers running Mendix apps on the open source platform service Cloud Foundry will work with simpler configurations and enjoy more resilience, said Mendix CTO Johann den Haan.

“Application development doesn’t run fast enough for many companies,” said den Haan. “Now you don’t have to programme apps. You model them in the cloud and click run.”

Mendix is available in the Amazon Web Services Marketplace.

Twitter nixes firehose partnership with DataSift

Twitter is consolidating its grip on data analytics and resellers using its data in real-time

Twitter has suspended negotiations over the future use of the social media giant’s data with big data analytics provider DataSift, sparking concerns that the social network plans to shut out others in the ecosystem of data analytics providers it enables.

In a recent blog post penned by DataSift’s chief executive and founder, Nick Halstead, the company aimed to reassure customers that its business model “never relied on access to Twitter data” and that it is extending its reach into “business-owned data.”

But DataSift still attacked the social media giant for damaging the ecosystem it enables.

“Our goal has always been to provide a one-stop shop for our customers to access all the types of data from a variety of networks and be able to consume it in the most efficient way. Less noise, more actionable results. This is what truly matters to companies that deal with social data,” Halstead explained.

“The bottom line: Twitter has seriously damaged the ecosystem this week. 80% of our customers use technology that can’t be replaced by Twitter. At the end of the day, Twitter is providing data licensing, not processing data to enable analysis.”

“Twitter also demonstrated that it doesn’t understand the basic rules of this market: social networks make money from engagement and advertising. Revenue from data should be a secondary concern to distribution and it should occur only in a privacy-safe way. Better understanding of their audiences means more engagement and more ad spend from brands. More noise = less ad spend.”

DataSift was one of three data resellers that enjoyed privileged real-time access to Twitter’s data – Gnip, which is now owned by Twitter, and NTT Data being the other two.

The move to strengthen its grip over the analysis ecosystem seems aimed at bolstering Gnip’s business. A similarly timed post on Gnip’s blog by Twitter’s Zach Hofer-Shall more or less explained that the Gnip acquisition was a “first step” towards developing a more direct relationship with data customers, which suggests other firehose-related negotiations may sour in the coming months if they haven’t already (BCN reached out to NTT Data for comment).

Some have, reasonably, hit out at Twitter for effectively eating its own ecosystem and shutting down third-party innovation. For instance, Steven Willmott, chief executive of API services vendor 3Scale, said shutting down firehose access will result in niche verticals being underserved.

“While it makes sense at some level to want to be closer to the consumers of data (that’s valuable and laudable from a product perspective), removing other channels is an innovation bust. Twitter will no doubt do a great job on a range of use cases but it’s severely damaging not to have a means to enable full firehose access for others. Twitter should really be expanding firehose access, not restricting it.”

Julien Genestoux, founder of data feed service provider Superfeedr, said the recent move to cut off firehose access is not very different from what Twitter did a couple of years ago when it started limiting third-party clients’ API access, and that Facebook often does much the same with partners it claims to give full data access to.

“The problem isn’t the company. The problem is the pattern. When using an API, developers are completely surrendering any kind of bargaining power they have. There’s a reason we talk about slave and master in computer science. APIs are whips for web companies. This is the very tool they use to enforce a strong coupling and dependence to their platform,” he said.

While Twitter seems to be severely restricting the data reseller ecosystem, it is also redoubling its efforts to capture the hearts and minds of the enterprise developer, with coveted access to its data placed front and centre. Twitter is working with IBM to make its data stream available to Big Blue’s clients, and in March this year IBM said it had over 100 pilots in place in which it works with enterprises in a range of verticals to create cloud-based services integrating Twitter data and Watson analytics.

API and cloud app specialist Apigee to go public

Apigee helps enterprises re-architect their apps to make them suitable for cloud, big data and IoT

Apigee, an API software platform provider that helps enterprises build and scale apps, is the latest cloud provider to propose an initial public offering of shares of its common stock.

The firm, backed by notable technology investment firms including BlackRock, SAP Ventures and Norwest Ventures, has enlisted Morgan Stanley and Credit Suisse Securities to help manage the process of going public.

The company hopes to raise a modest $87m through the IPO according to a filing with the US Securities and Exchange Commission.

While Apigee claims some of the most reputable firms in the world as customers (eBay, the BBC, Orange, Equinix) and has secured close to $200m across seven funding rounds since it was founded in 2004, its financials raise some questions about its long-term viability.

Like Box, the pure-play cloud storage and collaboration provider that also recently went public (raising over three times what Apigee is seeking), the company has accrued a notable amount of debt compared with what it intends to raise through the IPO.

The company’s gross billings were $36.7m, $43.1m and $63.8m in fiscal 2012, 2013 and 2014, respectively. But it incurred net losses of $8.3m, $25.9m and $60.8m in 2012, 2013 and 2014, respectively. It racked up losses of $26.8m in the six months ended January 31, 2015.

Nevertheless, it’s clear enterprise app development is becoming API-centric, as an increasing number of IT services are being joined up.

“We believe that application programming interfaces, or APIs, are a critical enabling technology for the shifts in mobile, cloud computing, big data and the IoT and that APIs are a foundational technology on which digital business operates. We believe that a new and expansive market opportunity exists to help enterprises adopt digital strategies and navigate the digitally driven economy,” the company said in its SEC S-1 filing.

“Today, it is difficult for many businesses to fully participate and innovate in the digital world because traditional enterprise software is not designed to interact with and connect to the rapidly evolving digital economy. The IT architectures deployed at most businesses are based on thousands of application servers communicating with databases, other applications and numerous middleware layers, each using thousands of custom integrations and connectors. These legacy architectures generally cannot publish APIs in a way that can be used by application developers.”