Category Archives: SaaS

EverString raises $65 million for predictive marketing push

Predictive marketing SaaS company EverString has raised $65 million in venture funding for market expansion. The Series B round was led by Lightspeed Venture Partners, with additional backing from Sequoia Capital and IDG Ventures, and brought in new investors including Lakestar.

EverString uses data science, artificial intelligence and predictive analytics in a cloud-based system that helps B2B companies identify their best customer prospects. The system intelligently scores new and existing sales prospects and widens the net to find completely new targets. In ‘demand generation’ circles this is called going outside the funnel. EverString’s customers are primarily enterprise and mid-market companies and include Comcast Business, IBM, Hortonworks, Apttus and Zenefits.

It has also unveiled its latest service, EverString Predictive Ad Targeting, which it claims gives ‘demand generators’ more control over the top of their funnel.

While the marketing technology sector has grown significantly in recent years, few companies offer more than just niche, point solutions, said venture capitalist and market watcher Peter Nieh, partner at Lightspeed Venture Partners. “EverString is unique as it offers a comprehensive platform that helps B2B companies find and drive the best customer prospects through the sales funnel,” said Nieh.

Most marketing officers struggle to prioritise their efforts efficiently, according to the Chief Marketing Officer Council, which said 67 per cent of CMOs believe this ‘essential’ function is wasted because they cannot manage multiple systems. EverString impressed stakeholders because it bypasses the heavy integration cycles usually associated with combining several new systems and moves beyond the limitations of old-school demand generation methods.

Predictive marketing works by identifying the ideal audience, then applying customer models to map out the expected prospect-to-customer journey. EverString Audience Selection lets sellers analyse and validate their entire addressable market of prospects, and then proactively target the ideal audience. These predictions are based on a database of 20,000 external signals combined with data from internal CRM and marketing automation systems.
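
As a rough illustration of the general idea only (not EverString’s actual models), the sketch below scores prospects with a simple logistic regression over a handful of made-up signals; the feature names, training data and model choice are all assumptions for illustration.

    # Minimal sketch of a predictive lead-scoring model. This is NOT EverString's
    # implementation; features and data are hypothetical, purely to show how CRM
    # attributes and external signals can be combined into a single fit score.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row is a past prospect; columns might be employee count, number of
    # web technologies detected, funding events, and CRM engagement touches.
    X_train = np.array([
        [250, 12, 1, 30],
        [15,   3, 0,  2],
        [900, 25, 2, 55],
        [40,   5, 0,  8],
        [1200, 30, 3, 70],
        [8,    2, 0,  1],
    ])
    y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = became a customer

    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)

    # Score net-new prospects found "outside the funnel" (no CRM history yet).
    new_prospects = np.array([
        [600, 18, 1, 0],
        [20,   4, 0, 5],
    ])
    for row, score in zip(new_prospects, model.predict_proba(new_prospects)[:, 1]):
        print(f"prospect {row.tolist()} -> fit score {score:.2f}")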

Software and platforms as a service driving our growth, says Oracle

Oracle’s latest quarterly results show the increasing strategic importance of revenue from cloud software and platforms as a service, according to the vendor. Chairman Larry Ellison also claimed the sales figures show Oracle will soon overtake Salesforce as the top-selling cloud operator.

The official figures for Oracle’s fiscal 2016 Q1 period show total revenues of $8.4 billion, which represents a two per cent fall in US dollars but a seven per cent rise in constant currency. Oracle attributed the fall to the current strength of the US dollar.

However, a clearer pattern emerged in the nature of software sales when benchmarking all sales in US dollars. While revenues for on-premise software were down two per cent at $6.5 billion, total cloud revenues were up 29 per cent at $611 million. Revenue from cloud software as a service (SaaS) and platform as a service (PaaS) was $451 million, a 34 per cent increase, while cloud infrastructure as a service (IaaS) revenues rose 16 per cent to $160 million in the same period.

Meanwhile, Oracle’s total hardware revenue for the period, $1.1 billion, represented a decline of three per cent. Using the same US dollar benchmark, Oracle’s services revenues more or less stagnated at $862 million, a rise of one per cent.

Growth is being driven by SaaS and PaaS, according to Oracle CEO Safra Catz. “Cloud subscription contracts almost tripled in the quarter,” said Catz. “As our cloud business scales up, we plan to double our SaaS and PaaS cloud margins over the next two years. Rapidly growing cloud revenue combined with a doubling of cloud margins will have a huge impact on growth going forward.”

Oracle’s cloud revenue growth rate is being driven by a year-over-year bookings rise of over 150 per cent in Q1, reported Oracle’s other joint CEO Mark Hurd. “Our increasing revenue growth rate is in sharp contrast to our primary cloud competitor’s revenue growth rates, which are on their way down.”

Oracle is still on target to book up to $2.0 billion of new SaaS and PaaS business this fiscal year, claimed executive chairman Larry Ellison. “That means Oracle would sell between 50 per cent more and double the amount of new cloud business that Salesforce plans to sell in their current fiscal year. Oracle is the world’s second largest SaaS and PaaS company, but we are rapidly closing in on number one.”

IBM: New Cloud Service For Designing Sophisticated Electronic Systems For Mobile And Wearable Technology

IBM has announced a new cloud service, High Performance Services for Electronic Design Automation (EDA), the industry’s first Software-as-a-Service offering that provides users with the tools to build out electronic systems used for mobile and wearable devices. These tools have been made available through a partnership with SiCAD, a Silicon Design Platform provider with expertise in EDA, design flows, networking, security, platform development, and cloud technologies.

 

The service opens the door for anyone interested in designing the next big tech innovation. IBM says that anyone with a SoftLayer subscription can begin designing electronic technology in the cloud using the same patented technology that IBM’s own internal product designers use. The service is delivered on IBM’s SoftLayer infrastructure on a pay-as-you-go model.

 


 

The first phase of the launch will deliver three key tools: 1) IBM Library Characterization, to create abstract electrical and timing models required by chip design tools and methodologies; 2) IBM Logic Verification, to simulate electronic systems described using the VHDL and Verilog design languages; and 3) IBM Spice, an electronic circuit simulator used to check design integrity and predict circuit behavior, all on an IBM Platform LSF cluster built on the IBM SoftLayer cloud.
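
IBM Platform LSF clusters are normally driven by batch job submission, so purely as a hypothetical sketch of what handing a logic-verification run to such a cluster can look like (the simulator command, design files and queue name are placeholders, and IBM’s hosted service will have its own front end), something along these lines:

    # Hypothetical sketch of submitting a logic-verification run to an LSF
    # cluster with the standard `bsub` command. The simulator command, design
    # files and queue name are placeholders, not IBM's actual service interface.
    import subprocess

    def submit_verification_job(design_files, queue="normal", cores=8):
        """Submit a batch simulation job and return bsub's confirmation output."""
        cmd = [
            "bsub",
            "-q", queue,            # target LSF queue
            "-n", str(cores),       # number of job slots requested
            "-o", "verify.%J.log",  # %J expands to the LSF job ID
            "run_simulator",        # placeholder for a Verilog/VHDL simulator
            *design_files,
        ]
        result = subprocess.run(cmd, capture_output=True, text=True, check=True)
        return result.stdout.strip()

    if __name__ == "__main__":
        print(submit_verification_job(["top_tb.v", "core.v"]))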

 

The cluster will use physical and network isolation for enhanced security, and the cloud service’s single-tenant servers mean that clients do not share servers or firewalls. These tested tools are expected to set new benchmarks in price-performance, which means verification work can be completed faster using fewer resources, or to a higher quality through more verification in the same amount of time.

 

Cloud computing provides the scalability that EDA requires while also making it affordable. Using IBM High Performance Services, clients can scale up or down based on demand, manage peak workloads, increase design productivity, reduce capital expenditure and improve operational efficiency. Another benefit of the cloud service is that clients do not need to purchase any new hardware or technology.


Accenture, Oracle form business unit to accelerate cloud uptake


Oracle and Accenture are teaming up to create a joint business unit that will help mutual customers move more quickly onto (mostly Oracle) cloud platforms.

According to the companies the Accenture Oracle Business Group will bring together technologies and consulting power in order to help customers implement cloud-based services, which includes helping those clients tailor their business processes to those technologies.

Thomas Kurian, president of product development at Oracle, said: “By providing a single process to implement end-to-end mission-critical services, the Accenture Oracle Business Group is ideally positioned to help our customers realize the true benefits of cloud computing.”

The group will offer vertically-integrated solutions built using Oracle’s software-as-a-service and platform-as-a-service offerings, supported by fleets of Accenture consultants skilled in Oracle and Java tech – who will also help implement cloud readiness and data migration strategies for clients.

“Building on our 23-year alliance relationship, the Accenture Oracle Business Group combines Accenture’s deep industry and technology experience with Oracle’s expansive set of cloud solutions to deliver client value not found elsewhere in the market today,” said Stephen Rohleder, group chief executive for North America, Accenture.

“This is part of our strategy to take advantage of Oracle’s leading technologies and build our business together for the future. It is a game-changer for our clients, Oracle, and Accenture,” Rohleder said.

Cloud security vendor Adallom secures $30m in series C led by HP


Cloud security service provider Adallom announced this week it has secured $30m in a series C funding round led by Hewlett Packard Ventures, which the company said it would put towards research and development.

Adallom, which was founded by cybersecurity veterans Assaf Rappaport, Ami Luttwak and Roy Reznik in 2012, offers a security service that integrates with the authentication chain of a range of SaaS applications and lets IT administrators monitor usage for every user on each device.

The software works in conjunction with endpoint and network security solutions and has a built-in, self-learning engine that analyses user activity on SaaS applications and assesses the riskiness of each transaction in real time, alerting administrators when activity becomes too risky for an organisation given its security policies.
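
Adallom does not publish the internals of that engine, but as a loose sketch of the general pattern (baseline a user’s own behaviour, then flag transactions that deviate sharply from it), under entirely hypothetical event data:

    # Loose sketch of baselining per-user SaaS activity and flagging outliers.
    # This is a toy illustration of the general pattern, not Adallom's engine;
    # the events, actions and threshold are hypothetical.
    from collections import defaultdict
    from statistics import mean

    # Historical events: (user, action, volume of records touched)
    history = [
        ("alice", "export_contacts", 20), ("alice", "export_contacts", 25),
        ("bob", "webmail_attachment", 1), ("bob", "webmail_attachment", 2),
    ]

    baseline = defaultdict(list)
    for user, action, volume in history:
        baseline[(user, action)].append(volume)

    def risk_score(user, action, volume, threshold=3.0):
        """Score an event by how far its volume sits above the user's baseline."""
        past = baseline.get((user, action))
        if not past:
            return 1.0  # never-seen behaviour is treated as maximally risky
        return min((volume / mean(past)) / threshold, 1.0)

    # Exporting 5,000 contacts is far outside Alice's normal behaviour.
    print(risk_score("alice", "export_contacts", 5000))   # 1.0 -> alert
    print(risk_score("bob", "webmail_attachment", 2))     # ~0.44 -> near baseline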

The company said the latest funding round, which brings the total amount secured by the firm since its founding three years ago to just under $50m, speaks to the rapid growth of the SaaS market, and the need for more flexible security solutions.

“The market’s embrace of our approach to cloud security and our investors’ continued confidence in our products, team and results to date is a strong endorsement of Adallom. It also serves as encouragement to continue to execute on our mission to deliver the best platform for protecting data in the cloud,” said Rappaport, Adallom’s chief executive. “We’re determined to exceed the expectations of our customers and investors, and continue our innovation in this market.”

The company said the investment will be used to double down on development and improve support for more services; it claims the security service already supports over 13,000 cloud apps.

Adallom’s funding round caps off a successful month for a number of cloud security vendors, with Palerra, ProtectWise and Elastica all securing millions in investment.

The PaaS Market as We Know it Will Not Die Off

I’ve been hearing a lot about Platform as a Service (PaaS) lately as part of the broader discussion of cloud computing from both customers and in articles across the web. In this post, I’ll describe PaaS, discuss a recent article that came out on the subject, and take a shot at sorting out IaaS, PaaS, and SaaS.

What is PaaS?

First a quick trip down memory lane for me. As an intern in college, one of my tours of duty was through the manufacturing systems department at an automaker. I came to work the first day to find a modest desktop computer loaded with all of the applications I needed to look busy, and a nicely printed sheet with logins to various development systems. My supervisor called the play: “I tell you what I want, you code it up, I’ll take a look at it, and move it to test if it smells ok.” I and ten other ambitious interns were more than happy to spend the summer with what the HR guy called “javaweb.” The next three months went something like this:

Part I: Set up the environment…

  1. SSH to abcweb01dev.company.com, head over to /opt/httpd/conf/httpd.conf, and configure AJP to point to abcapp01dev and abcapp02dev.company.com
  2. SSH to abcapp01dev.company.com, reinstall the Java SDK to the right version, install the proper database JARs, and set up the JDBC connection pool in /opt/tomcat/conf/context.xml
  3. SSH to abcdb01dev.company.com, create a user and grant rights so the app server can talk to the database
  4. Write something simple to test everything out
  5. Debug the environment to make sure everything works

Part II: THEN start coding…

  1. SSH to abcweb01dev.company.com, head over to /var/www/html and work on my HTML login page for starters, other things down the road
  2. SSH to abcapp01dev.company.com, head over to /opt/tomcat/webapps/jpdwebapp/servlet, and code up my Java servlet to process my logins
  3. Open another window, login to abcweb01dev and tail -f /var/www/access_log to see new connections being made to the web server
  4. Open another window, login to abcapp01dev and tail -f /opt/tomcat/logs/catalina.out to see debug output from my servlet
  5. Open another window, login to abcapp01dev and just keep /opt/tomcat/conf/context.xml open
  6. Open another window, login to abcapp01dev and /opt/tomcat/bin/shutdown.sh; sleep 5; /opt/tomcat/bin/startup.sh (every time I make a change to the servlet)

(Host names and directory names have been changed to protect the innocent)

Setting up the environment was a little frustrating. And I knew that there was more to the story: some basic work, call it Part 0, to get some equipment in the datacenter, the OS installed, and IP addresses assigned. Part I, setting up the environment, is the work you would do to set up a PaaS platform. As a developer, the work in Part I was there to enable me and my department to do the job in Part II – and we had a job to do – to get information to the guys in the plants who were actually manufacturing product!
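
To make that concrete, here is a rough sketch (using the same fictional host names and paths as above, and assuming key-based SSH access) of scripting away just one slice of that Part II toil, the restart-and-tail loop; automating this whole class of work is essentially what a PaaS platform gives you.

    # Rough sketch of automating the "bounce Tomcat after every servlet change"
    # chore from Part II. Host names and paths are the same fictional ones used
    # above; key-based ssh access is assumed.
    import subprocess

    APP_HOST = "abcapp01dev.company.com"
    TOMCAT = "/opt/tomcat"

    def run_remote(host, command):
        """Run a shell command on a remote host over ssh."""
        return subprocess.run(["ssh", host, command], check=True)

    def redeploy_servlet():
        # The exact restart the intern typed by hand every time.
        run_remote(APP_HOST, f"{TOMCAT}/bin/shutdown.sh; sleep 5; {TOMCAT}/bin/startup.sh")
        # Confirm a clean start by peeking at the end of the container log.
        run_remote(APP_HOST, f"tail -n 20 {TOMCAT}/logs/catalina.out")

    if __name__ == "__main__":
        redeploy_servlet()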

 

So, here’s a rundown:

Part 0: servers, operating systems, patches, IPs… IaaS

Part I: middleware, configuration, basic testing… PaaS

Part II: application development

So, to me, PaaS is all about taking the bits and pieces provided by IaaS, configuring them into a usable platform, and delivering that platform to developers so they can deliver software to the business. And hopefully the business is better off because of our software. In this case, our software helped the assembly plant identify and reduce “in-system damage” to vehicles – damage that happens as a result of the manufacturing process.

Is the PaaS market as we know it dead?

I’ve read articles predicting the demise of PaaS altogether and others just asking the question about its future. There was a recent Networkworld article entitled “Is the PaaS market as we know it dying?” that discussed the subject. The article makes three main points, referring to 451 Research, Gartner, and other sources.

  1. PaaS features are being swallowed up by IaaS providers
  2. The PaaS market has settled down while the IaaS and SaaS markets have exploded
  3. Pure-play PaaS providers may be squeezed from the market by IaaS and SaaS

 

I agree with point #1. The evidence is in Amazon Web Services features like autoscaling, RDS, SQS, etc. These are fantastic features, but interfacing with them locks developers into using AWS as their single IaaS provider. The IaaS market is still very active, and I think there is a lot to come even though AWS is ahead of other providers at this point. IaaS is a commodity, and embedding specialized (read: PaaS) features in an otherwise IaaS system is a tool to get customers to stick around.
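
To make the lock-in point concrete, here is a minimal sketch (the queue URL is a placeholder) of code written directly against AWS’s SQS API via boto3; everything about it assumes AWS, so it cannot be pointed at another IaaS provider without rework.

    # Minimal sketch of the lock-in point: this code talks directly to AWS SQS
    # through boto3, so it is tied to AWS. The queue URL is a placeholder.
    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders"  # placeholder

    def enqueue_order(order_id: str) -> str:
        """Send a message to an AWS-specific queue service."""
        response = sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=order_id)
        return response["MessageId"]

    if __name__ == "__main__":
        print(enqueue_order("order-42"))

A PaaS layer that hides the queue behind its own interface is what would let the same application move between providers.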

I disagree with point #2. The PaaS market has not settled down – it hasn’t even started yet! The spotlight has been on IaaS and SaaS because these things are relatively simple to understand, given the recent boom in server virtualization. SaaS also used to be known as something provided by ASPs (Application Service Providers), so many people are already familiar with the model. I think PaaS and its concepts are still finding their place.

I also disagree with point #3: the time and opportunity for pure-play PaaS providers is now. IaaS is getting sorted out, and it is clearly a commodity item. As we highlighted earlier, solutions from PaaS providers can ride on top of IaaS. I think that PaaS will be the key to application portability amongst different IaaS providers – kind of like Java: write once, run on any JVM (kind of). As you might know, portability is one of the priorities NIST highlights for cloud computing.

Portability is key. I think PaaS will remain its own concept apart from IaaS and SaaS, and that we’ll see some emergence of PaaS in 2014. Why? PaaS is the key to portable applications – once an application is written to a PaaS platform, it can be deployed on different IaaS platforms. It’s also important to note that AWS is almost always associated with IaaS, but it has started to look a lot like a PaaS provider (I touched on this in a blog earlier this month). An application written to use AWS features like AutoScaling is great, but not very portable. Lastly, the PaaS market is ripe for innovation. Barriers to entry are low, as is the required startup capital (there is no need to build a datacenter to build a useful PaaS platform).

This is just my opinion on PaaS – I think the next few years will see a growing interest in PaaS, possibly even over IaaS. I’m interested in hearing what you think about PaaS; feel free to leave me a comment here, find me on Twitter at @dixonjp90, or reach out to us at socialmedia@greenpages.com.

To hear more from John, download his whitepaper on hybrid cloud computing or his ebook on the evolution of the corporate IT department!

 

 

The 2013 Tech Industry – A Year in Review

By Chris Ward, CTO, LogicsOne

As 2013 comes to a close and we begin to look forward to what 2014 will bring, I wanted to take a few minutes to reflect back on the past year. We’ve been talking a lot about that evil word ‘cloud’ for the past 3 to 4 years, but this year put a couple of other terms up in lights, including Software Defined X (Datacenter, Networking, Storage, etc.) and Big Data. Like ‘cloud,’ these two newer terms can easily mean different things to different people, but put in simple terms, in my opinion, there are some generic definitions which apply in almost all cases. Software Defined X is essentially the concept of taking any ties to specific vendor hardware out of the equation and providing a central, vendor-agnostic point for configuration – except, of course, for the vendor providing the Software Defined solution :). I define Big Data simply as the ability to find a very specific and small needle of data in an incredibly large haystack within a reasonably short amount of time. I see both of these technologies becoming more widely adopted in short order, with Big Data technologies already well on the way.

As for our friend ‘the cloud,’ 2013 did see a good amount of growth in consumption of cloud services, specifically in the areas of Software as a Service (SaaS) and Infrastructure as a Service (IaaS). IT has adopted a ‘virtualization first’ strategy over the past 3 to 4 years when it comes to bringing any new workloads into the datacenter, and I anticipate we’ll begin to see a ‘SaaS first’ approach being adopted in short order, if it is not out there already. However, I can’t necessarily say the same on the IaaS side as far as ‘IaaS first’ goes. While IaaS is a great solution for elastic computing, I still see most usage confined to application development or super-large scale-out applications (think Netflix). The mass adoption of IaaS for simply forklifting existing workloads out of the private datacenter and into the public cloud simply hasn’t happened. Why? My opinion is that for traditional applications, neither the cost model nor the operational model makes sense, yet.

In relation to ‘cloud,’ I did see a lot of adoption of advanced automation, orchestration, and management tools, and thus an uptick in ‘private clouds.’ There are some fantastic tools now available, both commercial and open source, and I absolutely expect this adoption trend to continue, especially in the enterprise space. Datacenters, which have a vast amount of change occurring whether in production or test/dev, can greatly benefit from these solutions. However, this comes with a word of caution: just because you can doesn’t mean you should. I say this because I have seen several instances where customers have wanted to automate literally everything in their environments. While that may sound good on the surface, I don’t believe it’s always the right thing to do. There are still times when a human touch remains the best way to go.

As always, there were some big-time announcements from major players in the industry. Here are some posts we did with news and update summaries from VMworld, VMware Partner Exchange, EMC World, Cisco Live and Citrix Synergy. Here’s an additional video from September in which Lou Rossi, our VP, Technical Services, explains some new Cisco product announcements. We also hosted a webinar (which you can download here) about VMware’s Horizon Suite, as well as a webinar on our own Cloud Management as a Service offering.

The past few years have seen various predictions about the unsustainability of Moore’s Law, which states that processors will double in computing power every 18 to 24 months, and 2013 was no exception. The latest prediction is that by 2020 we’ll reach the 7nm mark and the exponential scaling described by Moore’s Law will break down. The interesting part is that this prediction is based not on technical limitations but on economic ones: getting below the 7nm mark will be extremely expensive from a manufacturing perspective and, hey, 640K of RAM is all anyone will ever need, right? :)

Probably the biggest news of 2013 was the revelation that the National Security Agency (NSA) had undertaken a massive surveillance program and seemed to be capturing every packet of data coming in or out of the US across the Internet. I won’t get into any political discussion here, but suffice it to say this is probably the largest example of ‘big data’ that exists currently. It also has large potential ramifications for public cloud adoption: security and data integrity have been two of the major roadblocks to adoption, so it certainly doesn’t help that customers may now be concerned about the NSA eavesdropping on everything going on within public datacenters. It is estimated that public cloud providers may lose as much as $22-35B over the next 3 years as a result of customers slowing adoption because of this. The only good news, at least for now, is that it’s very doubtful the NSA or anyone else on the planet has the means to actually mine anywhere close to 100% of the data they are capturing. However, like anything else, it’s probably only a matter of time.

What do you think the biggest news/advancements of 2013 were?  I would be interested in your thoughts as well.

Register for our upcoming webinar on December 19th to learn how you can free up your IT team to be working on more strategic projects (while cutting costs!).

 

 

Cloud Spending Will Increase 1 Billion% by 2014

By Ben Stephenson, Journey to the Cloud

It seems like every week a new study comes out analyzing cloud computing growth. Whether it’s that Public Cloud Services Spending will reach $47.4B in 2013, Global SaaS spending projected to grow from $13.5B in 2011 to $32.8B in 2016, the public cloud services market is forecast to grow 18.5 percent in 2013, or cloud spending at Dunder Mifflin will increase 200% by 2020, the indication is that cloud adoption and spending are on the rise. But how is that relevant to you?

Does it matter to the everyday CIO that cloud spending at midsized companies west of the Mississippi is going to increase by 15% over the next 3 years? The relevant question isn’t how much will cloud adoption and spending increase, but why will it do so? It’s the “why” that matters to the business. If you understand the why, it becomes easier to put context around the statistics coming out of these studies. It comes down to a shift in the industry – a shift in the economics of how a modern day business operates. This shift revolves around the way IT services are being delivered.

To figure out where the industry is going, and why spending and adoption are increasing, you need to look at where the industry has come from. The shift from on-premise IT to public cloud began with SaaS-based technologies. Companies like Salesforce.com realized that organizations were wasting a lot of time and money buying and deploying hardware for their CRM solutions. Why not use the internet to let organizations pay a subscription fee instead of owning their entire infrastructure? This, however, was not true cloud computing. Next came IaaS with Amazon’s EC2. Essentially, Amazon realized it had excess compute capacity and decided to rent it out to people who needed the extra space. IaaS put an enormous amount of pressure on corporate IT because app dev teams no longer had to wait weeks or months to test and deploy environments. Instead, they could start up right away and become much more efficient. Finally, PaaS came about with initiatives such as Microsoft Azure.

{Free ebook: The Evolution of Your Corporate IT Department}

The old IT paradigm, the private cloud environment, consists of organizations buying hardware and software and keeping it in their datacenter behind their own firewalls. While a private cloud environment doesn’t need to be fully virtualized, it does need to be automated, and very few organizations are actually operating in a true private cloud environment. Ideally, a true private cloud lets internal IT compete with public cloud providers by offering a similar degree of speed and agility. While the industry is starting to shift towards public cloud, the private cloud is not going away; public cloud will not be the only way to operate IT, or even the majority of the way, for a long time. This brings us to the hybrid cloud computing model, the direct result of this shift. Hybrid cloud is the combination of private and public cloud architectures. It’s about the ability to seamlessly transition workloads between private and public – in other words, moving on-premise workloads to rented platforms where you don’t own anything in order to leverage services.

So why are companies shifting towards a hybrid cloud model? It all comes down to velocity, agility, efficiency, and elasticity. IT delivery methodology is no longer a technology discussion; it has become a business discussion. CIOs and CFOs are starting to scratch their heads, wondering why so much money is being put towards purchasing hardware and software when all they are reading about is cloud this and cloud that.

{Free Whitepaper: Revolutionizing the Way Organizations Manage Hybrid Cloud Environments}

The spending and adoption rates of cloud computing are increasing because the shift in the industry is no longer just talk – it’s real and it’s here now. The bottom line? We’re past hypothetical discussions. There is a major shift in the industry that business decision makers need to be taking seriously. If you’re not modernizing your IT operations by moving towards a hybrid cloud model, you’re going to be missing out on the agility and cost savings that can give your organization a substantial competitive advantage.  This is why cloud adoption and spending are on the rise. This is why you’re seeing a new study every month on the topic.

Weidlinger Launches PZFlexCloud 3D Virtual-Prototyping SaaS Using CliQr Technologies CloudCenter, HP Cloud

Weidlinger Associates, Inc., the developer of PZFlex, 3D virtual-prototyping and wave-propagation analysis software, and CliQr Technologies announced the launch of PZFlexCloud on CliQr’s CloudCenter platform. PZFlexCloud extends the market reach and performance of PZFlex’s engineering software by exploiting the power, elasticity, and ubiquity of the cloud. Running on HP Cloud Services, HP’s public cloud, PZFlexCloud is offered as a professional service as well as an additional feature of the full PZFlex solution suite.

“Cloud computing’s almost infinite on-demand resources, with its utility billing model, combined with our PZFlex finite element analysis [FEA] software as a service, is a game changer for the scientific and engineering communities,” said Dr. Robert Banks, PZFlex director and senior associate at the Mountain View, California, office of global engineering firm Weidlinger Associates. “PZFlexCloud represents a step change in the way high-fidelity FEA solutions can be accessed by a broad set of users, from large enterprises to innovative departments and individuals.”

By taking advantage of the power and elasticity of cloud computing, PZFlexCloud will permit experienced users to realize unprecedented performance and flexibility of use. An accurate multi-run 3D simulation for piezoelectric and wave propagation analysis that traditionally took 32 days was recently completed with the CliQr platform and PZFlexCloud in just 14 hours, allowing for more test iterations and shorter analysis times. PZFlexCloud also makes advanced FEA available to a broader market. With CliQr and PZFlexCloud, new users who have had to compromise on functionality can now use the PZFlex suite on a pay-as-you-go basis without the costs and complexities of building and maintaining capital-intensive physical computing resources.

Dr. Banks added, “PZFlexCloud eliminates the longstanding trade-offs between advanced speed, functionality, and approachable economics. Customers can get simplified access and high-performance use of the PZFlex solution without having to design, build, or maintain their own information-technology infrastructure.”

Contributing to PZFlexCloud’s success, the CliQr Technologies CloudCenter platform simplifies the migration and runtime management of the PZFlex software suite without requiring any modification of the leading FEA software. With CliQr’s CloudCenter, PZFlex was able to benchmark the price and performance of their application across all possible cloud environments and determine where their offering could deliver the best value for their customers. Running on HP Cloud Services, PZFlexCloud makes it easy, powerful, and secure to perform complex FEA on the cloud.

“CliQr shares Weidlinger’s value and vision to make the most sophisticated cloud solutions approachable and manageable by the broadest user base,” said Gaurav Manglik, CEO and co-founder of CliQr Technologies. “CliQr understands that software vendors want to take advantage of the cloud while protecting the time and investments they have already made in their core offerings. CliQr provides a complete platform for businesses like Weidlinger and their PZFlex offering, looking for an integrated approach to commercially use the cloud and maintain the ability to flexibly adapt to future changes in the cloud-computing landscape.”

“Scientific and engineering communities are looking for ways to access 3D virtual-prototyping solutions without having to build and maintain their own physical infrastructure,” said Dan Baigent, senior director, Business Development, Cloud Services, HP. “Running on HP Cloud Services, PZFlexCloud provides users with the ability to access PZFlex in the cloud in one click, which leads to much shorter analysis time at much lower cost.”

SaaSID Releases CAM 2.0, Adding Audit Dashboard for Security, Compliance

Web application security provider SaaSID has launched Cloud Application Manager 2.0 (CAM), the latest version of its browser-based authentication, management and auditing solution. CAM 2.0’s comprehensive audit report is now displayed in CAM Analytics, an intuitive dashboard that provides clear visibility of Web application use throughout an organization. The new software simplifies administration of authentication, feature controls and password management to help CIOs comply with data security regulations, standards and internal policies by making it easier to govern, monitor and audit every user interaction with Web applications.

CAM 2.0’s suite of dashboards in CAM Analytics provides at-a-glance graphics showing managers exactly how employees are interacting with Web applications and associated corporate data, regardless of whether employees are working on company workstations or personally owned computing devices. Detailed analytics give managers a complete overview of Web application use and the ability to drill down into reports for additional information. Activities such as exporting customer lists or attaching sensitive files to Webmail are tracked and clearly displayed for compliance. A range of graphic elements shows social media activity and interactions with corporate applications, providing managers with complete visibility of departmental and individual use of Web applications.

CAM 2.0 users can now be authenticated and logged into Web applications from the SaaSID server. This server-side authentication improves security by ensuring that log-in credentials are protected from malware that might be present on an unsecured device. Users do not know their login details, so they cannot write them down, share them, or access managed applications from unprotected devices. Once CAM 2.0 has authenticated a user, the session is handed to the device and the user works with the application as normal.
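
SaaSID has not published the mechanics of that handoff, but a generic sketch of the server-side pattern being described might look like the following (the credential store, form fields and URLs are all hypothetical): the broker performs the login with credentials the user never sees and passes only the resulting session to the device.

    # Generic sketch of server-side authentication brokering; NOT SaaSID's actual
    # implementation. The credential store, form fields and URLs are placeholders.
    import requests

    CREDENTIAL_VAULT = {
        "alice": {"login": "alice@corp.example", "password": "held-server-side"},
    }

    def broker_login(user: str, app_login_url: str) -> dict:
        """Log the user into a managed web app and return only the session cookies."""
        creds = CREDENTIAL_VAULT[user]
        session = requests.Session()
        resp = session.post(app_login_url, data={
            "username": creds["login"],
            "password": creds["password"],
        })
        resp.raise_for_status()
        # The device gets the authenticated session, never the password itself.
        return session.cookies.get_dict()

    if __name__ == "__main__":
        print(broker_login("alice", "https://app.example.com/login"))  # placeholder URL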

Additional new features within CAM 2.0 include:

  • The new Restriction Learning feature which allows in-house IT staff to apply their own restrictions to application features. The simple GUI allows administrators to test the effect of restrictions prior to implementation.
  • Support for more two factor authentication solutions, including offerings from RSA, Vasco and ActivIdentity.
  • The new Password Wizard, which learns the workflow of Web application authentication processes, enabling automated password resets. Organisations can use this feature to change passwords at chosen intervals and to enforce strong password security for all Web applications managed by CAM 2.0, saving administration time and support costs without impeding productivity.

CAM is a browser extension that goes beyond single sign-on (SSO) by enabling IT staff to manage Web application features according to employee roles. CAM assists organisations in maintaining security and compliance when they adopt Web applications and implement bring your own device (BYOD) programmes, by creating a comprehensive audit trail of all employee interactions with these Web applications.

To request a free trial or a demo of SaaSID’s CAM 2.0, see www.saasid.com.