We’ve just published our first major ‘Solution Guide’, a complete snapshot of how cloud is applied in one specific area, in this case the Microsoft portfolio. MicrosoftCloud.biz is intended as a channel marketing tool covering all aspects of the Microsoft Cloud supply chain, from in-house private cloud through Azure hosting and Office 365. […]
Monthly Archive: May 2012
Changing World of End User Devices
Let me start out by saying that I am not an Apple fanboy, a Microsoft zealot, or a Linux aficionado. I use them all daily; it is all about usability to me.
I wanted to talk about the shift I have seen in the technology we use in business every day. This transformation has been just as large and disruptive as virtualization. Ten years ago, end-user hardware and software were fixed, with very little diversity or customization. It was Wintel (Windows running on Intel processors) all the way, with Microsoft Office. There was little or no working from home; you had to be in the office, or have a VPN to the office, to do your work.
Fast forward to today and the end user client and software environment has a lot more options. The other architects and I have …
Survey Reveals Cloud Adoption Should Quadruple
Cloud adoption among enterprises is expected to grow significantly throughout the rest of 2012, though some serious obstacles must still be overcome, according to a customer survey carried out by Cisco Systems.
The 2012 Cisco Global Cloud Networking Survey, which polled 100 IT executives in each of 13 countries, found that while just 5% of the IT executives currently use cloud computing to deliver the majority of the software applications used within their business, that figure is expected to rise to around 20% by the end of the year, quadrupling the number of enterprises using cloud hosting as a solution.
Inbar Lasser-Raab, Senior Marketing Director of the Cisco Services Routing Technology Group (SRTG), explained: “The reason so many are moving the majority of their apps to the cloud is because there are more …
A Job for Man or Machine
The Chief Technology Officer of a Midwest banking holding company made a very interesting observation. Commenting on the needed increase in fraud-fighting resources, he warned about the perils of overemphasizing technology while neglecting to train staff in manual fraud-detection processes.
Most of what he says is spot on regarding proper prioritization, risk analysis, and the danger of blind reliance on technology to identify and neutralize threats and breaches. In fact, as an officer in a technology company, I agree with him on almost everything he said.
He also noted that to prevent fraud, financial institutions need to go beyond adopting the latest technologies and ensure they have staff trained to identify fraud, such as by reviewing reports or spotting unusual activity.
Now the key is how to cost-effectively apply those resources, train those departments in the latest detection and remediation protocols, and implement new layers of detection and correlation. Even for the largest corporation, this has the earmarks of an expensive (but obviously important) initiative. The answer can be found in the cloud, as part of a security-as-a-service deployment.
SAP Releases “Top 50 Big Data Twitter Influencers” List
Yesterday, SAP released a list of “Top 50 #BigData Twitter Influencers” on their blog and included CTOvision editor Bob Gourley. The list was aimed at identifying who to follow for “the latest trends, news and opinions” on Big Data, as well as at identifying the figures shaping the discussion and the evolution of the field. […]
Securing Cloud-Based Communications at Cloud Expo New York
While cloud-based communications include a complex web of ports and protocols, typically 85 percent of the traffic flowing in and out of an organization through the cloud is email and web, including identity services. Taking more of your business and business processes to the cloud begins with knowing your data and ensuring that your existing IT security controls on these major traffic channels extend well into cloud models.
In his session at the 10th International Cloud Expo, Ramon Peypoch, VP of Global Business Development at McAfee, will discuss the key security controls that need attention when considering the cloud for your initiatives: preventing data leakage, protecting against exposure to service interruption, minimizing the risk surface as identity services are exposed beyond the physical network boundary, and maintaining your resilience to malware and web-borne threats.
C12G Launches Enterprise Edition of OpenNebula 3.4 Cloud Manager
C12G Labs has just announced an updated release of OpenNebulaPro, the enterprise edition of OpenNebula. OpenNebula 3.4, released one month ago, features enhancements in several cloud subsystems, such as support for multiple datastores, resource pools, elastic IPs in the Amazon API, improved web GUIs, and better support for hybrid clouds with Amazon EC2. Some of these improvements were contributed by members of the OpenNebula community, including Research in Motion, Logica, Terradue 2.0, CloudWeavers, Clemson University, and Vilnius University.
OpenNebulaPro is used by corporations, research centers and governments looking for a hardened, certified, long-term supported cloud platform. OpenNebulaPro combines the rapid innovation of open-source with the stability and long-term production support of commercial software. Compared to OpenNebula, the expert production and integration support of OpenNebulaPro and its higher stability increase IT productivity, speed time to deployment, and reduce business and technical risks. Compared to other commercial alternatives, OpenNebulaPro is an adaptable and interoperable cloud management solution that delivers enterprise-class functionality, stability and scalability at significantly lower costs.
Study: Cloud Computing Cuts $5.5 Billion Annually from Federal Budget
The federal government saved nearly $5.5 billion a year by moving to cloud services. But it might have saved up to $12 billion if cloud strategies were more aggressive, a survey of federal IT managers found.
The study, drawn from interviews with 108 federal CIOs and IT managers, was published by MeriTalk Cloud Computing Exchange, a community of federal government leaders focused on public-private collaboration in Washington, D.C.
The IT managers surveyed also reported spending 11 percent of their current fiscal year 2013 budgets, or $8.7 billion, on cloud computing.
The chief impediment to implementing cloud services was security, listed by 85 percent of federal IT managers. Also of concern were agency culture, named by 38 percent of managers, and service levels, listed by 32 percent of managers.
Lucid Imagination Combines Search, Analytics and Big Data to Tackle the Problem of Dark Data
Organizations today have little to no idea how much lost opportunity is hidden in the vast amounts of data they’ve collected and stored. They have entered the age of total data overload driven by the sheer amount of unstructured information, also called “dark” data, which is contained in their stored audio files, text messages, e-mail repositories, log files, transaction applications, and various other content stores. And this dark data is continuing to grow, far outpacing the ability of the organization to track, manage and make sense of it.
Lucid Imagination, a developer of search, discovery and analytics software based on Apache Lucene and Apache Solr technology, today unveiled LucidWorks Big Data. LucidWorks Big Data is the industry’s first fully integrated development stack that combines the power of multiple open source projects including Hadoop, Mahout, R and Lucene/Solr to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one complete solution available in the cloud.
With LucidWorks Big Data, Lucid Imagination equips technologists and business users with the ability to initially pilot Big Data projects utilizing technologies such as Apache Lucene/Solr, Mahout and Hadoop, in a cloud sandbox. Once satisfied, the project can remain in the cloud, be moved on premise or executed within a hybrid configuration. This means they can avoid the staggering overhead costs and long lead times associated with infrastructure and application development lifecycles prior to placing their Big Data solution into production.
The product is now available in beta. To sign up for inclusion in the beta program, visit http://www.lucidimagination.com/products/lucidworks-search-platform/lucidworks-big-data.
How big is the problem of dark data? The total amount of digital data in the world will reach 2.7 zettabytes in 2012, a 48 percent increase from 2011, and 90 percent of this data will be unstructured, or “dark,” data. Worldwide, 7.5 quintillion bytes of data, enough to fill over 100,000 Libraries of Congress, are generated every day. Yet that deep volume of data can help predict the weather, uncover consumer buying patterns, or even ease traffic problems, if discovered and analyzed proactively.
“We see a strong opportunity for search to play a key role in the future of data management and analytics,” said Matthew Aslett, research manager, data management and analytics, 451 Research. “Lucid’s Big Data offering, and its combination of large-scale data storage in Hadoop with Lucene/Solr-based indexing and machine-learning capabilities, provides a platform for developing new applications to tackle emerging data management challenges.”
Data analytics has traditionally been the domain of business intelligence technologies. Most of these tools, however, have been designed to handle structured data, such as the contents of SQL databases, and cannot easily tap into the broad range of data types that can be used in a Big Data application. With the announcement of LucidWorks Big Data, organizations will be able to utilize a single platform for their Big Data search, discovery and analytics needs. LucidWorks Big Data is the only complete platform that:
- Combines the real-time, ad hoc data accessibility of LucidWorks (Lucene/Solr) with the compute and storage capabilities of Hadoop
- Delivers commonly used analytic capabilities along with Mahout’s proven, scalable machine learning algorithms for deeper insight into both content and users
- Tackles data both big and small with ease, seamlessly scaling while minimizing the impact of provisioning Hadoop, LucidWorks and other components
- Supplies a single, coherent, secure and well documented REST API for both application integration and administration
- Offers fault tolerance with data safety baked in
- Provides choice and flexibility, via on premise, cloud hosted or hybrid deployment solutions
- Is tested, integrated and fully supported by the world’s leading experts in open source search
- Includes powerful tools for configuration, deployment, content acquisition, security, and search experience, packaged in a convenient, well-organized application
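The REST API mentioned above suggests what interacting with such a platform looks like in practice. The sketch below builds a query URL in the generic Solr style; the host, port, collection name, and path layout are illustrative assumptions, not documented LucidWorks Big Data endpoints:

```python
from urllib.parse import urlencode

def build_search_url(base_url, collection, query, rows=10):
    """Build a query URL for a Solr-style REST search endpoint.

    The /solr/<collection>/select path is the common Solr convention;
    an integrated platform would expose something similar behind its
    own API surface.
    """
    params = urlencode({"q": query, "rows": rows, "wt": "json"})
    return f"{base_url}/solr/{collection}/select?{params}"

# A hypothetical query against a local sandbox instance.
url = build_search_url("http://localhost:8888", "logs", "error AND timeout")
print(url)
```

The point of a single coherent REST API is exactly this: whether the data lives in Hadoop or a Lucene/Solr index, the application issues one style of request and gets JSON back.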
Lucid Imagination’s Open Search Platform uncovers real-time insights from any enterprise data, whether structured in databases, unstructured in formats such as emails or social channels, or semi-structured from sources such as websites. The company’s rich portfolio of enterprise-grade solutions is based on the same proven open source Apache Lucene/Solr technology that powers many of the world’s largest e-commerce sites. Lucid Imagination’s on-premise and cloud platforms are quicker to deploy, cost less than competing products and are more easily tailored to specific needs than business intelligence solutions because they leverage innovation from the open source community.
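The reason search engines can span structured and unstructured content where SQL-oriented BI tools cannot is the inverted index, the core data structure behind Lucene. A toy sketch (the tokenizer and documents here are simplified illustrations, not Lucene's actual implementation):

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each token to the set of document ids containing it --
    a miniature version of the structure Lucene builds at scale."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token].add(doc_id)
    return index

# Unstructured content from different stores, keyed by document id.
docs = {
    1: "server outage reported in logs",
    2: "customer email about outage refund",
    3: "quarterly revenue report",
}
index = build_inverted_index(docs)
print(sorted(index["outage"]))  # documents 1 and 2 mention the outage
```

Because lookup is by term rather than by schema, the same index answers ad hoc queries over emails, logs, and transaction records alike.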
“We’re allowing a broad set of enterprises to test and implement data discovery and analysis projects that have historically been the province of large multinationals with large data centers. Cloud computing and LucidWorks Big Data finally level the field,” said Paul Doscher, CEO of Lucid Imagination. “Large companies, meanwhile, can use our Big Data stack to reduce the time and cost associated with evaluating and ultimately implementing big data search, discovery and analysis. It’s their data – now they can actually benefit from it.”
TeleCommunication Systems Receives 35 U.S. Patents Advancing Mapping/Mobile Location, Public Safety, Messaging, Wireless Data
TeleCommunication Systems, Inc. today announced that the U.S. Patent and Trademark Office has issued TCS 35 patents related to mobile location/mapping, public safety, messaging and wireless data.
Patents from this new group include 13 mapping and mobile location patents, covering navigation and route determination, location and points of interest, and techniques for locating a user:
Navigation and Route Determination:
- “Method and System for Dynamic Estimation and Predictive Route Generation” (U.S. Patent No. 8,095,152)
- “Stateful, Double-Buffered Dynamic Voice Prompting” (U.S. Patent No. 8,099,238)
- “System and Method for Providing Routing, Mapping, and Relative Position Information to Users of a Communication Network” (U.S. Patent No. 8,107,608)
- “Method and System for Enabling an Off-Board Navigation Solution” (U.S. Patent No. 8,090,534)
- “Method and System for Saving and Retrieving Spatial Related Information” (U.S. Patent No. 8,169,343)
The location and points of interest patents provide important benefits for computing and mapping devices that receive discrete position updates for the purpose of monitoring, planning and analysis of mobile devices’ positional information used for mapping, routing and direction finding. Key aspects also include requesting and receiving audio navigation voice prompts and new methods for storing and retrieving locations and/or metadata, such as stop points and images.
Location and Points of Interest:
- “Device-Based Trigger for Location Push Event” (U.S. Patent No. 8,099,105)
- “Predictive Ephemeral Points of Interest (PEPOI)” (U.S. Patent No. 8,156,068)
- “Position Identification Method and System” (U.S. Patent No. 8,090,796)
- “Wireless Network Location-Based Reference Information” (U.S. Patent No. 8,032,166)
Many navigation solutions utilize a database of points of interest (POI) to help the user route to and from locations efficiently and effectively. POIs can be either fixed, such as restaurants and gas stations, or movable, such as employees and vehicles. The data associated with most POIs can become outdated, as it is stored when the system is created. A conventional navigation system, for example, would return a list of all possible gas stations to the user, regardless of operating hours or whether a POI had moved. These patents provide a means for the owners of POIs to securely update and post their information in real time, while also providing a means for the navigation system to predict the future locations of POIs. The patents also cover the use of location triggers for points of interest based upon previous user patterns, stored preferences, etc.
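The operating-hours example above can be made concrete. In the sketch below, POI records carry owner-posted hours, and the navigation layer filters by the current time instead of returning every station; the record fields and sample data are hypothetical, not drawn from the patents:

```python
from datetime import time

# Hypothetical POI records; in the patented approach, owners would
# push updates like these in real time rather than relying on a
# static database frozen at system-creation time.
pois = [
    {"name": "Gas-N-Go", "type": "gas", "open": time(6), "close": time(22)},
    {"name": "AllNite Fuel", "type": "gas", "open": time(0), "close": time(23, 59)},
    {"name": "Main St Gas", "type": "gas", "open": time(8), "close": time(18)},
]

def open_pois(pois, now):
    """Return only the POIs whose posted hours cover the current time."""
    return [p["name"] for p in pois if p["open"] <= now <= p["close"]]

print(open_pois(pois, time(23)))  # late at night, only AllNite Fuel qualifies
```

A real system would extend the same filtering idea to predicted positions of movable POIs, which is where the predictive claims of these patents come in.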
Techniques for Locating a User:
- “Personal Location Code” (U.S. Patent No. 8,165,603)
- “Location Fidelity Adjustment Based on Mobile Subscriber Privacy Profile (“Location and Privacy”)” (U.S. Patent No. 8,126,889)
- “Location Derived Presence Information” (U.S. Patent No. 8,032,112)
- “User Plane Location-Based Service Using Message Tunneling to Support Roaming” (U.S. Patent No. 8,126,458)
These inventions allow both a caller and receiver to provide each other with position information related to the caller and/or receiver’s physical location, including address information, GPS coordinates and nearby fixed locations (such as a parking structure, etc.) while the users are stationary or roaming. The user controls who accesses this information.
“TCS has long been a pioneer and leading innovator in messaging, mobile location/mapping, public safety and wireless technology,” said Maurice B. Tose, TCS chairman, CEO and president. “These 35 new patents – including the 13 mapping & location patents – demonstrate our continued commitment to apply innovative engineering solutions to improve mobile device communications.”
“There are significant opportunities to leverage our company’s 200+ patents and other intellectual property around the world, through both licensing and partnership arrangements in core and non-core business areas,” said Bob Held, TCS senior director of intellectual asset management.
With clear strengths in mobile location/mapping, public safety, messaging, and wireless communication fields, TCS has created an impressive intellectual property portfolio. Meaningful partnerships with other industry-leading companies can be developed through direct licensing, cross licensing and joint venture agreements. In 2011, TCS was issued 44 U.S. patents; for the year to date, 29 U.S. patents have been issued, bringing the total number of patents issued worldwide to TCS to date to 218, with 309 patent applications pending worldwide.
The following is a complete list of these new patents: Mobile Location/Mapping (13): 8,095,152; 8,099,238; 8,099,105; 8,166,603; 8,169,343; 8,107,608; 8,032,166; 8,032,112; 8,126,458; 8,126,889; 8,156,068; 8,090,534; 8,090,796. Public Safety (11): 8,103,242; 8,068,587; 8,059,789; 8,102,972; 8,102,252; 8,116,722; 8,155,109; 8,149,997; 8,150,363; 8,150,364; 8,175,570. Wireless Data (5): 8,090,856; 8,089,401; 8,095,663; 8,127,025; 8,161,553. Messaging (4): 8,073,477; 8,090,341; 8,175,953; 8,060,429. Semiconductor Drives (1): 8,154,881. Secure Communications (1): 8,090,941.