SYS-CON Events announced today that Net Optics, Inc., the leading provider of Intelligent Network Access and Monitoring Architecture, will exhibit at SYS-CON’s 13th International Cloud Expo, which will take place on November 4–7, 2013, at the Santa Clara Convention Center in Santa Clara, CA.
Since 1995, Net Optics has been the leading provider of Intelligent Access and Monitoring Architecture solutions that deliver real-time IT visibility, monitoring and control, helping businesses achieve peak performance in network analytics and security. More than 7,500 enterprises, service providers and government organizations – including 85 percent of the Fortune 100 – trust Net Optics’ comprehensive smart access hardware and software solutions to plan, scale and future-proof their networks through an easy-to-use interface. Net Optics maintains a global presence through leading OEM partner and reseller networks.
Security Is Not the Only Barrier to Cloud Adoption
Migrating from VLANs to the public cloud is not trivial.
So public cloud adoption should be a no-brainer, right? Oh wait, but Andy omitted security – how can I trust that my customers’ sensitive data is secure in the public cloud?
I agree, and the message wears thin: enterprise businesses are apprehensive about storing sensitive customer data in the public cloud, and thus hesitant to adopt the cloud at all. (By the way, it was also very refreshing to see NASA JPL’s use of the public cloud in Jassy’s keynote, as the Netflix story also gets old.)
Maybe it’s the network?
Enterprises might not be able to migrate their applications to the public cloud because of how their datacenter servers are connected and secured. A customer came over to our booth at AWS re:Invent and, while very excited about the AWS public cloud and our announcement, wanted to know: “What do I do about my VLANs? My physical and virtual servers are isolated at layer 2, and my servers have two networks: one network for data and applications, the second network only for administrators. How do I architect this in AWS?”
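That question does have a common, if partial, answer. As a hedged sketch rather than a prescription: the VLAN-style separation this customer describes is usually approximated in AWS with a Virtual Private Cloud, subnets and security groups. The snippet below uses the boto 2.x library of that era; the region, CIDR blocks and group names are illustrative assumptions, not anything the customer or AWS specified.

```python
# Sketch: approximating two layer-2-isolated networks (data vs. admin)
# with a VPC, two subnets and security groups. Region, CIDR blocks and
# names are illustrative assumptions.
import boto.vpc

conn = boto.vpc.connect_to_region("us-east-1")

vpc = conn.create_vpc("10.0.0.0/16")                      # one VPC holds both "VLANs"
data_subnet  = conn.create_subnet(vpc.id, "10.0.1.0/24")  # apps and data
admin_subnet = conn.create_subnet(vpc.id, "10.0.2.0/24")  # administrators only

# Security groups take over the isolation role the VLANs used to play.
data_sg  = conn.create_security_group("data",  "application traffic", vpc_id=vpc.id)
admin_sg = conn.create_security_group("admin", "management traffic",  vpc_id=vpc.id)

# Servers in the data subnet accept SSH only from the admin subnet.
data_sg.authorize(ip_protocol="tcp", from_port=22, to_port=22,
                  cidr_ip="10.0.2.0/24")
```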
AWS Might Do $1.5 Billion This Year
RW Baird analyst Colin Sebastian calls Amazon Web Services a “potentially under-appreciated asset” and thinks it’ll do $1.5 billion in revenue this year.
Supporting it takes a lot of capital outlay, roughly $500 million or about 50 percent of Amazon’s total 2010 capex, Sebastian estimates, and he says its profitability is unproven.
Amazon doesn’t break out AWS numbers, but it has put its IaaS market share at about 60 percent, and AWS boss Andy Jassy said this week that the cloud unit has the potential to be Amazon’s biggest business, outgrowing its online retail parent.
Using Cloud for Disaster Recovery
Use of the cloud for DR solutions is becoming more common; even organizations that are not using the cloud for mission-critical production applications are moving toward using it for application DR.
Faster Recovery Time Objective (RTO): Typically, DR requires lengthy manual processes to fully restore business applications at the DR site. Having backup data and servers at the DR site is easy; restoring the entire application or service takes time. For example, full application restoration requires starting services in a specified order, performing DNS and other configuration updates, and so on. In the cloud, IaaS APIs make it possible to use automation solutions like Kaavo IMOD to fully restore business applications without manual intervention. As a result, organizations get predictable recovery and a reduced RTO: automating service or application recovery can cut RTO from hours or days to minutes.
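To make the automation point concrete, here is a minimal sketch of the idea. It is a generic illustration, not Kaavo IMOD’s actual API: start_server() and update_dns_record() are hypothetical wrappers around an IaaS provider’s calls, named only for this example.

```python
# Minimal sketch of automated DR recovery in dependency order.
# start_server() and update_dns_record() are hypothetical stand-ins
# for real IaaS API calls.
import time

RECOVERY_ORDER = ["database", "app-server", "web-frontend"]  # start order matters

def start_server(role):
    """Hypothetical: launch the DR instance for this role and wait for it."""
    print("starting {} from its DR image...".format(role))
    time.sleep(1)  # stand-in for polling the IaaS API until the instance is running

def update_dns_record(role):
    """Hypothetical: point the service's DNS name at the recovered instance."""
    print("updating DNS for {}".format(role))

for role in RECOVERY_ORDER:
    start_server(role)       # each tier is up before the next one starts
    update_dns_record(role)

print("application restored automatically; RTO drops from hours to minutes")
```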
The Cure for the Common Cloud-Based Big Data Initiative
There is no doubt that Big Data holds enormous promise for a range of industries. Better visibility into data across various sources enables everything from insights into saving electricity to improving agricultural yield to placing ads on Google. But when it comes to deriving value from data, no industry has been doing it as long or with as much rigor as clinical researchers.
Unlike other markets that are delving into Big Data for the first time and don’t know where to begin, drug and device developers have spent years refining complex processes for asking very specific questions with clear purposes and goals. Whether designing an effective and safe treatment for high cholesterol or collecting and mining data to understand the proper dosage of cancer drugs, life sciences has had to dot every “i” and cross every “t” to keep people safe and to get new therapies past the FDA. Other industries are now marveling at a new ability to uncover information about efficiencies and cost savings, but, with less-than-rigorous processes in place, they are often shooting in the dark or only scratching the surface of what Big Data offers.
New AWS Pipeline Tool Aims to Make Effective Use of Your Business Data
Amazon’s new AWS Data Pipeline product “will help you move, sort, filter, reformat, analyze, and report on data in order to make use of it in a scalable fashion.” You can now automate the movement and processing of any amount of data using data-driven workflows and built-in dependency checking.
A Pipeline is composed of a set of data sources, preconditions, destinations, processing steps, and an operational schedule, all defined in a Pipeline Definition.
The definition specifies where the data comes from, what to do with it, and where to store it. You can create a Pipeline Definition in the AWS Management Console or externally, in text form.
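As a rough sketch of what that text form might look like, here is a definition rendered as a Python dict mirroring the JSON structure the console produces. The object types, field names and S3 paths are assumptions for illustration, not copied from Amazon’s materials.

```python
# Illustrative Pipeline Definition: a daily schedule, an input data node,
# a copy step, and an output data node. Field names and paths are assumed.
pipeline_definition = {
    "objects": [
        {   # when to run
            "id": "DailySchedule", "type": "Schedule",
            "period": "1 day", "startDateTime": "2012-11-01T00:00:00",
        },
        {   # where the data comes from (hypothetical bucket)
            "id": "InputLogs", "type": "S3DataNode",
            "directoryPath": "s3://example-bucket/raw-logs/",
            "schedule": {"ref": "DailySchedule"},
        },
        {   # what to do with it
            "id": "CopyToWarehouse", "type": "CopyActivity",
            "input": {"ref": "InputLogs"},
            "output": {"ref": "ProcessedLogs"},
            "schedule": {"ref": "DailySchedule"},
        },
        {   # where to store the result
            "id": "ProcessedLogs", "type": "S3DataNode",
            "directoryPath": "s3://example-bucket/processed/",
            "schedule": {"ref": "DailySchedule"},
        },
    ]
}
```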
Licensed to Print Money (In the Cloud)
One of the major issues facing cloud service providers is the expense of building out infrastructure without knowing how or when revenues will follow. As a result, cloud providers are reevaluating their approach to hardware and software investments and engaging with technology and networking vendors to develop creative pricing models that are aligned with cloud business principles and engineered to reduce risks.
In a perfect world, cloud service providers would pay for infrastructure only after a customer has made a purchase – in order to maintain a tight correlation between revenues and expenses. In the real world, however, implementing this type of model is easier said than done. This was especially true during the ‘iron’ age, when hardware and software were highly coupled and there were very few alternatives to big vendors that focused on selling high-dollar, high-performance networking gear.
SiSense Out to Democratize Big Data Analytics
Tel Aviv-based start-up SiSense Ltd raised $8 million and bought a ticket to California, where it’s set up shop across the street from Oracle, a definite competitor.
See, SiSense, which obviously has a sense of humor, has a business intelligence tool dubbed Prism that it calls the “world’s smallest Big Data analytics solution.” It can crunch a terabyte of data on a sub-$750 laptop with 8GB of RAM.
The company’s Elasticube technology, with its in-memory columnar data store, strong data compression, parallel processing and advanced query optimization, is supposed to offer analytical processing power previously available only in high-end solutions.
It claims non-technical users can analyze 100 times more data, at least 10 times faster than current in-memory analytics solutions.
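For readers new to the columnar idea, the toy sketch below, which is in no way SiSense’s code, shows why a column-oriented layout helps: an aggregate scans one packed column instead of every field of every row, and repetitive columns compress well.

```python
# Toy illustration of columnar storage (not SiSense's implementation).
import array

# Row store: a query over one field still drags every field along.
rows = [("2012-11-26", "widget", 19.99),
        ("2012-11-26", "gadget", 24.50)]

# Column store: each field lives in its own tightly packed array.
dates    = ["2012-11-26", "2012-11-26"]        # repetitive, so it compresses well
products = ["widget", "gadget"]
prices   = array.array("d", [19.99, 24.50])    # packed doubles, cache-friendly

# Summing revenue touches only the price column.
print(sum(prices))
```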
ClickFuel Gets $4 Million for Fuel Station Expansion
ClickFuel recently closed a $4 million Series B Round with Baird Venture Partners among the participants. The investment will be used to develop new products and services to expand the use of Fuel Station in data and business intelligence applications.
Fuel Station is a SaaS-based marketing analytics and performance management solution tailored for small to medium-sized businesses (SMBs), providing more than 200,000 activated dashboards for SMBs to access, track and monitor marketing initiatives and results. Through more than two dozen partners, Fuel Station addresses the challenge SMBs face in maximizing online marketing spending, tracking campaign effectiveness and reducing the time needed to analyze campaign data.
With Fuel Station, ClickFuel partners such as The E.W. Scripps Company, DudaMobile and Propel Marketing, a GateHouse Media company, provide their SMB customers with comprehensive marketing dashboards that address the complexity of marketing and advertising campaign performance. This dashboard delivers data on a single platform to empower decision-making and improve campaign productivity and presence. ClickFuel partners also leverage Fuel Station’s back-end business intelligence platform to improve account managers’ efficiency, differentiate their businesses in the crowded media industry, increase customer lifetime value and reduce customer churn.
“The Fuel Station dashboard has quickly become an invaluable tool for our account executives from coast to coast,” said Adam Symson, chief digital officer of Scripps, a leading media enterprise. “Our television stations and newspapers are focused on providing the market’s most-effective solutions for our advertisers, and Fuel Station makes it possible to measure and track our success so customers have actionable evidence that we’re helping them build their businesses.”
“Providing our customers with transparency into their marketing investments across all mediums helps us build trust and lasting relationships with them. As a convergent resource that fully integrates all components of a marketing campaign from pay-per-click to behavioral analytics to call tracking, Fuel Station makes it easy for our clients to visualize their campaigns and returns on investment, and helps our account managers better serve our customers,” said Dave Myer, director of advertising operations at DudaMobile, an online platform that converts websites into mobile-friendly websites.
Research and Markets: Potential of Cloud Computing
Research and Markets has announced the addition of the “Potential of Cloud Computing” report to their offering.
First came the Internet, which changed the way we do business forever. Now, with the advent of cloud computing, the world is ready to undergo another major technological shift.
Cloud computing is an Internet-based model that makes it possible to share information, software and even computing resources across devices. The concept brings forth a new delivery model for IT services conducted over the Internet, generally involving the provision of scalable, virtualized resources. Not only does it provide ease of access; the speed and overall reliability of cloud computing are changing the IT industry rapidly.
Taiyou Research presents an analysis of the Potential of Cloud Computing.
Key Topics Covered:
1. Executive Summary
2. Overview of Cloud Computing
3. Market Profile
4. Benefits of Deploying the Cloud
5. Cost Benefits to Organizations from Cloud Systems
6. Cloud Computing Delivery Modes
7. Cloud Computing Deployment Models
8. Understanding the Concept behind Cloud Computing
9. Application Programming Interfaces
10. Cloud Computing Taxonomy
11. Deployment Process of the Cloud System
12. Technical Features of Cloud Systems
13. Understanding Cloud Clients
14. Regulatory Landscape & Investment
15. Commercializing of Cloud Computing
16. Concepts Related to Cloud Computing
17. Cloud Computing versus Other Computing Paradigms
18. Cloud Exchanges and Markets Worldwide
19. Research Projects on Cloud Computing
20. Cloud Computing Case Studies
21. Future of Cloud Computing
22. Market Leaders
23. Appendix
24. Glossary