Challenges in Data Access for New Age Data Sources

The Big Data and Cloud “movements” have acted as catalysts for tremendous growth in fit-for-purpose databases. Along with this growth, we see a new set of challenges in how we access the data through our business-critical applications. Let’s take a brief look at the evolution of these data access methods (and why we are in the mess we are in today).
Back in the ’80s, the development of relational databases brought with it a standardized SQL protocol that could easily be implemented within mainframe applications to query and manipulate data. These relational database systems supported transactions in a very reliable fashion through what was called “ACID” compliance (Atomicity, Consistency, Isolation, and Durability). They provided a highly structured way of dealing with data and were very dependable, but ACID compliance also carried a lot of overhead processing. Hence a downfall: they were not optimized for large transaction requests, nor could they handle huge volumes of transactions. To counteract this, we made significant performance and throughput enhancements within data connectivity drivers that lit a fire under SQL speeds and connectivity efficiencies.
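The “A” in ACID is what guarantees that a multi-step change either fully applies or fully rolls back. A minimal sketch of that behavior, using Python’s built-in sqlite3 module (the accounts table and balances are illustrative, not from any system mentioned here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

# Transfer funds atomically: either both updates apply, or neither does.
try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute(
            "UPDATE accounts SET balance = balance - 150 WHERE name = 'alice'")
        # Enforce a business rule mid-transaction; failing it aborts everything
        (bal,) = conn.execute(
            "SELECT balance FROM accounts WHERE name = 'alice'").fetchone()
        if bal < 0:
            raise ValueError("insufficient funds")
        conn.execute(
            "UPDATE accounts SET balance = balance + 150 WHERE name = 'bob'")
except ValueError:
    pass  # the partial debit was rolled back along with the rest

(bal,) = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'").fetchone()
print(bal)  # alice's balance is unchanged: 100
```

The overhead the article describes lives in exactly this machinery: logging, locking, and isolation bookkeeping so that the rollback above is always possible.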


Nirvanix Taps Zynga Exec as Its CEO

Zynga CIO Debra Chrapaty has been named CEO of cloud, on-premise and hybrid storage house Nirvanix.
Starting next month she’ll replace Dru Borden, who joined the company last year as strategy chief and was recently promoted. Borden will remain with Nirvanix as SVP of planning and development as well as a board member.
Chrapaty has been chairman of Nirvanix since last November, when Khosla Ventures became its lead investor. The company is also backed by Intel and wants to take on Amazon and the analytics market.


SSD Comes to the Cloud

We’re now entering a world where cloud-managed big data is not just a topic of conversation in tech circles but a necessity for companies and service providers alike. Shifting data needs drive technological innovation. The truth is that as data gets bigger, older technology can’t keep up with data production and storage. On the server side, more cloud-based companies are ditching archaic spinning-disk technology for powerful storage arrays built on solid-state storage.
This isn’t to say there won’t be significant challenges for early adopters of solid-state storage in the cloud. In fact, this is completely new technology that few organizations have attempted to pioneer.


Telefónica Launches Cloud Storage Services for Businesses

Telefónica, one of the world’s largest telecom providers, announced today that it has launched cloud storage services using the Cloud Storage Enablement Suite from CTERA. “We identified a need in the market for Spain-based, secure cloud storage services,” said Andres López Hedoire, marketing manager for cloud security and management at Telefónica.


Cloud Expo NY: Environmental Pressures Drive an Evolution in File Storage

Stagnant budgets, overwhelming data growth, and new user and application demands are just a few of the many challenges that are putting IT organizations under more pressure today than ever before. As a result, a new approach is required.
In his session at the 12th International Cloud Expo, Jeff Lundberg, senior product marketing manager for file, content and cloud at HDS, will discuss why object storage-based private cloud is necessary for evolving into a next-generation of IT that supports a new world of applications and storage service delivery models. Attendees will learn to:
Embrace cloud while retaining proper stewardship and governance of data
Deliver better, more cost-effective protection of your data
Bring structure to unstructured data so you can better understand it
Enjoy more efficient storage and network utilization


LiquidPlanner 4.3 Bridges Gap Between Task, Project Management

LiquidPlanner has released new features for its priority-based, predictive project management solution to improve team performance and collaboration. Checklists can now be added to any task in LiquidPlanner, so that all individual steps can easily be listed and crossed off as they’re completed. Checklist items can be reordered by dragging and dropping, copied between tasks, or assigned to other team members who need to review or approve the work.

“Checklists are the ultimate Swiss Army Knife for project teams,” said Liz Pearce, LiquidPlanner CEO. “They can be used for quality control, new hire training, tracking individual to-do lists, managing repeatable processes, and much more. By capturing the steps that go into tasks in simple checklists, teams can simplify their project plans and—at the same time—better ensure that the work is being done right.”

The new release also includes a complete overhaul of key collaboration features. Comments are now threaded (like Facebook) instead of streamed in date order (like Twitter), so conversations can be followed more easily. Conversation threads can be filtered by client, project, or team. Customers can also choose which types of activities or events (such as adding documents or marking items done) trigger email notification, which drastically improves the signal-to-noise ratio of email alerts.

While many PPM solutions brush off simple task management in favor of more robust scheduling and resource management features, LiquidPlanner is committed to serving the needs of both individual contributors and managers with its dynamic solution.

“To help companies be successful, Social Task Management vendors must boost employee productivity with simple task tracking and allow for more comprehensive resource planning,” said Alan Lepofsky, VP and Principal Analyst at Constellation Research. “By using predictive scheduling and time tracking in combination with lightweight checklists, teams get the best of both worlds.”

BluePhoenix Moves Mainframe COBOL, Batch Processing to the Cloud

BluePhoenix has released its Cloud Transaction Engine and Batch In The Cloud service. The Cloud Transaction Engine (CTE) is a module of the company’s soon-to-be-released ATLAS Platform. CTE is a proprietary codebase that enables mainframe processes to be run from off-mainframe infrastructure. BluePhoenix’s Batch In The Cloud service is the first formal offering leveraging CTE capabilities.

“Batch In The Cloud uses off-mainframe, cloud-based processing power to reduce mainframe MIPS and total cost of ownership,” explains Rick Oppedisano, BluePhoenix’s Vice President of Marketing. “The huge array of virtual machines in the cloud brings greater performance and scalability than the mainframe. Jobs can be processed quicker at a lower cost. It’s a great way for customers to save money immediately and explore options for an eventual mainframe transition.”

The Batch In The Cloud service is supported on private or public clouds, including Microsoft’s Azure and Amazon’s EC2, and is designed to support COBOL, CA GEN, and Natural/ADABAS mainframe environments.

“In a typical scenario, workloads continue to grow while the mainframe’s processing power and batch window stays the same,” says BluePhoenix’s VP of Engineering, Florin Sunel. “Our technology acts as a bridge between the mainframe and cloud. With Batch In The Cloud, all business logic is preserved. Customers can reduce usage cost by running jobs like reporting from the cloud platform rather than the mainframe. In that scenario, they can also add business value by using modern business intelligence tools that aren’t compatible with the mainframe to gain insight from their data.”
Adds Oppedisano, “Beyond the immediate cost savings, this technology creates a competitive advantage. Exposing data in an off-mainframe location empowers the customer to become more agile. Not only can they process reports faster, but they can slice and dice their data to get a broader perspective than competitors who keep data on the mainframe.”

“By moving batch workloads to Windows Azure or a Microsoft Private Cloud, companies are able to take advantage of cloud economics,” said Bob Ellsworth, Microsoft Worldwide Director of Platform Modernization. “Combined with the advanced analytics included in SQL Server, the customer not only realizes great savings, scale and flexibility but increased business value through self-service BI.”

BluePhoenix is offering a free Proof of Concept for the Batch In The Cloud service. “To manage the scale and demand, we’re going to start with a complimentary assessment of the customer environment to identify the most appropriate applications for this service,” says Oppedisano. “Once those applications are identified, we will build the roadmap and execute the Proof of Concept on the cloud platform of the customer’s choice.”

Additional details on the Batch In The Cloud service and Proof of Concept can be found here.

The Mac Chronicles – A CTO Perspective

By Chris Ward, CTO, LogicsOne

It was early February and I was quite excited because it was finally time for me to get a new laptop.  We had recently enacted our Bring Your Own Device (“BYOD”) policy, so I had a decision to make regarding what type of machine I wanted to carry around for the next three years of my life.  I’ve been in consulting my entire professional career and always had a laptop given to me by the internal IT group of whatever company I worked for, albeit with a little personal input on the matter.  So, for the past 16 years, I had carried a Compaq or HP laptop of some flavor in my bag.  Normally, I would try to get the most badass machine I could, which in the land of HP meant a mobile workstation, and they were always great: fast multi-core processors, lots of memory, lots of disk space, a great video card, and a great high-resolution screen.  The downside for me, however, was constant neck pain after lugging 8-10 lb. laptops over my shoulder for a decade and a half.  So, I decided this time was going to be different.

In my job roles over the past two to three years, I have not been as hands-on in the field doing actual implementations, so I no longer truly need the horsepower to run multiple virtual machines, serial cables to connect to routers/switches, or a myriad of tools at my beck and call.  No, now that I am a ‘suit’ I need something lightweight and very portable, as I tend to find myself on planes, trains, and automobiles quite often.  So, I decided to go with the sexy choice and started looking at MacBooks.  I was very skeptical of moving to the Mac platform from an application and productivity perspective but, at the same time, I wanted to learn more about OS X and its BSD/Unix underpinnings, so I decided to take the plunge.  The following is an editorial on my personal experience making this transition.

I picked out a nifty new 13″ Retina display MacBook Pro over an Air, due to the faster i7 processor and the Retina display (yes, I am still a nerd at heart, so I do still care).  I was disappointed to discover that with the 13″ Pro you cannot get more than 8GB of RAM, and that the memory is literally soldered to the system board, so there is no upgrading.  OK, well, as I stated earlier, I no longer need to run five virtual machines at once, so I’ll live.  I really wanted a lot of drive capacity and performance because I am an impatient guy who still travels with every OS service pack dating back to NetWare 4.11, Windows 2000, and ESX 2.5 (because hey, you never know when you’ll need that stuff, right?) and a lot of ripped DVDs to make those 6-hour flights between coasts a little more bearable.  Well, the 512GB SSD option for said MacBook Pro was a pretty penny, but I found a 3rd-party one online for a few hundred bucks less and figured, no problem, I’ll upgrade it myself.  So, a few days later, the shiny new Mac and separate SSD showed up.  Now, here is where the fun really begins…

So, I know what you’re thinking… Is this CTO guy really a big enough dumbass to buy a standard SSD to put into a MacBook?  Well, no.  I was fully aware of the proprietary form factor of the SSD drives in the Retina MacBooks and did get the correct one, and yes, I know the legacy of Mr. Jobs still remains and he doesn’t want me jacking around with the inside of his precious work of art.  So, anyone ever heard of a Pentalobe screwdriver bit?  No?  Well, me neither.  This is what you need to get the bottom cover off the MacBook in order to swap the SSD.  I went to my local Home Depot, Lowe’s, etc. looking for such a bit, but no luck.  I then went to my trusty local Mac retail store (not an Apple Store, but the local mom-and-pop joint), and while they did have one, it was with their technician and they were not willing to let me borrow it for an hour.  At this point I was starting to become a bit agitated (again, impatient), but I sucked it up, found what I needed online, and ordered the magical Pentalobe screwdriver set ($15) plus overnight shipping ($10 – again, impatient).  It arrived the next day and I was off to the races.

If you have not personally seen the inside of one of these MacBooks, the area where this special SSD goes is EXACTLY the same form factor as a standard 2.5″ laptop drive.  However, this special SSD that is just a circuit board has to go into a special case with a special internal connector which connects to a standard SATA cable, but the cable connects to the side of the enclosure vs. the back as a standard SATA SSD.  Wow, someone went through a crap-load of trouble to design a very proprietary solution which was absolutely unnecessary.  Note to Apple, I hope you are enjoying the margin you are making on this stuff!  In any case, I digress, so I got the new drive installed and was now ready to rock.

I got all of the key software I would need ready to go (Office 2011 for Mac, Firefox, Adobe stuff, VIEW and Citrix clients, VLC (gotta watch those movies), Skype, etc.) and got it all set up.  And, just in case, I did install VMware Fusion with a Windows 7 VM on the off chance I would need it for something.  Now, keep in mind that my ultimate goal with the Mac was to go native.  If I had to constantly be in a Windows VM to do my job, then what the F would be the point of using a Mac in the first place, right?  Well, the first thing I quickly discovered is that Outlook 2011 is a piece of crap compared to Outlook 2010 or 2013 for Windows.  There is no home-style screen where I can see my mail, tasks, and calendar in a single place.  There is no native ActiveSync, but rather some ancient sync engine that has more conflicts than a schizophrenic sociopath.  Trying to use group calendaring to see where my team was and what they were up to caused issues because I had to have so many calendars open at the same time (mind you, I did this with zero issue in Outlook 2010/2013).  Basically, I was back to using Outlook XP.  So, I thought, well, if I want to go native I’ll go native and use the built-in Mail and Calendar apps from Apple.  While some things got a little better, the experience still paled in comparison with full Outlook on a PC.  Then, I got to looking at some of the key reports I use regularly via Excel.

OK, so there is no ODBC driver that comes out of the box with Excel/Office for Mac.  Oh, but you can buy one from a couple of 3rd parties, and Microsoft is happy to point you in the right direction.  Personally, I wouldn’t care if they sold it for a penny; I still wouldn’t buy it.  Are you kidding me?  I can’t update a spreadsheet via an ODBC connection to a backend database?  I’m pretty sure I could do that with Lotus 1-2-3 on Windows 3.1 – give me a break!  So, it was off to the Windows 7 VM for Excel tasks.  Unfortunately, this was only the beginning of my headaches…
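For what it’s worth, the refresh being described is mundane on any platform with a working database driver: query the backend, dump the rows somewhere a spreadsheet can consume them. A rough stand-in sketch of that pattern, using Python’s built-in sqlite3 in place of a real ODBC source (the sales table is hypothetical; with an actual ODBC driver you’d swap the connection for something like pyodbc’s):

```python
import csv
import io
import sqlite3

# sqlite3 stands in for the ODBC data source the author wanted to hit;
# the "sales" table and its rows are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, total INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 120), ("West", 95)])

rows = conn.execute(
    "SELECT region, total FROM sales ORDER BY region").fetchall()

# Write a spreadsheet-friendly CSV: header row, then the query results.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["region", "total"])
writer.writerows(rows)
print(buf.getvalue().strip())
```

Swapping `buf` for a real file handle yields something Excel opens directly, which is exactly the round trip that was missing out of the box on the Mac.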

I immediately found problems with certain web sites I use on a regular basis due to the Retina display and the way it scales resolution.  What I didn’t understand about Retina up front (and should have researched more) was that while the advertised resolution is pretty stunning, the way it actually works is to show you a lower-resolution desktop but cram a lot more pixels into a smaller screen area.  The result is admittedly incredibly readable text and super-sharp images.  The downside is that applications and web apps not written to be Retina-aware can have issues with this scaling process.  I also have an issue with the way Apple just assumes the driver of the machine is an idiot.  For example, in the display properties you cannot really select a true desired resolution for the built-in display.  You have four options, such as ‘Larger Text’, ‘Best for Retina’, and ‘More Space’.  Really??  Just please give me the damn list of supported resolutions so I can choose what I want.  I think by this point you can probably tell where this story is going, and, given this is a blog entry vs. a novel, I won’t go deeper into my issues except to regurgitate something I once heard from a friend that certainly rings true in my opinion…

“Using a Mac is like driving a tricycle, whereas using a PC is like driving a Ducati.  The tricycle is extremely low risk and will most likely get you where you want to go, eventually.  The Ducati, in the hands of an inexperienced driver (Mac user), can be quite deadly; however, in the hands of a trained professional it can do very amazing things.”

Admittedly, I do believe Microsoft is as much at fault here as Apple, as it was the core Microsoft apps that were the bane of my existence throughout this experience.  So, I now have an HP 9470m business-class ultrabook on order.  It is the same weight and size as the MacBook, has the same or better battery life, requires zero dongles since VGA and a gigabit copper port are built in, has a solid screen resolution of 1400×900, can be upgraded to 16GB of RAM, can hold both a standards-based mSATA SSD plus a traditional 2.5″ SSD or magnetic drive (no F’d-up screwdriver required), and has docking capability.  Oh, and did I mention it’s half the price of the Mac?

 

Obsidian Strategics to Exhibit at Cloud Expo New York

SYS-CON Events announced today that Obsidian Strategics, the pioneer of “InfiniBand over the WAN,” will exhibit at SYS-CON’s 12th International Cloud Expo, which will take place June 10–13, 2013, at the Javits Center in New York City.
The Obsidian Longbow™ family of products extends InfiniBand clusters over Campus, Metro, Regional, or Global Area optical networks, enabling unparalleled DR applications, long-distance high-bandwidth video transmission, and efficient movement of Big Data sets to remote compute or storage resources. Integrated features include routing, encryption, authentication, and dark fiber optimization.


Google Outages: Did the Latest Hit You?

This time it was Postini:

March 25, 2013 1:38:00 PM PDT

We’re investigating reports of an issue with Postini Services.

March 25, 2013 2:38:00 PM PDT

Postini Services service has already been restored for some users, and we expect a resolution for all users within the next 1 hours. Please note this time frame is an estimate and may change. (editor’s note: resolution took over six more hours).

March 25, 2013 9:05:00 PM PDT

The problem with Postini Services should be resolved. We apologize for the inconvenience and thank you for your patience and continued support. Please rest assured that system reliability is a top priority at Google, and we are making continuous improvements to make our systems better.