Monthly Archives: February 2014
Innodisk Begins Full-Rate Production of Its Industrial-Embedded nanoSSD
Innodisk has announced a full-rate production of a nanoSSD SATA device that conforms to JEDEC’s standard (MO-276).
Innodisk integrates a flash controller, NAND flash, and a ball grid array (BGA) package to deliver a nanoSSD that, at 16mm x 20mm x 2mm, is approximately 1% of the volume of a 2.5” SSD. The product weighs only 1.5 grams, supports the SATA III interface, and is fully compatible with both x86 and ARM systems. Innodisk’s nanoSSDs are being incorporated into a wide variety of applications where small form factors and high transfer rates are critical, including industrial mobile devices, point-of-sale systems, embedded products, tablets, Ultrabooks, and high-end smartphones. In addition to its ultra-slim form factor, the nanoSSD offers fast data transfers, reading at up to 500MB/s and writing at up to 170MB/s. Whether the task is data storage, system boot, data cache backup, or all three, Innodisk’s nanoSSD answers those challenges with efficiency and performance. The drives are built for a wide operating temperature range (-40˚C to +85˚C), high shock resistance, quick-erase features, and ATA security.
BMC Software Delivers ‘The New IT’ with Three Pioneering Products
BMC Software on Tuesday introduced an array of product innovations that take full advantage of advancements in user experience and crowdsourcing to give employees complete control of their IT experience via an elegant and intuitive mobile interface.
The new products – BMC MyIT 2.0, BMC AppZone 2.0 and BMC Remedyforce Winter ’14 – showcase the company’s commitment to using mobile, social, and cloud technologies. BMC believes these technologies, along with automated and industrialized IT service delivery, are the defining characteristics of the new IT. As businesses increasingly replace physical products and services with those delivered digitally – such as banks enabling customers to deposit a check with a smartphone versus going to a branch – expectations for improved experiences across the technology landscape have skyrocketed, as have the pressures put on IT to deliver them. BMC addresses these challenges in the customer-focused products announced today.
Why Big Trust is Big Data’s missing DNA
Mark Little, Principal Analyst, Consumer
In the rush to monetize customer data, companies risk diminishing the trust people have in services and brands. Sustaining and growing people’s trust in services is not just about “doing the right thing,” but also makes commercial sense. Telcos and OTT players have worked to establish a satisfactory level of trust with their customers, but as Big Data creates new opportunities for monetizing customer data, even a little more aggression in its exploitation risks driving mistrust among users.
Customers who are aware of this exploitation will become more concerned with their privacy, and with the transparency and control of their data. To exploit customer data more comprehensively, businesses must develop a much greater level of trust with their customers. Ovum calls this approach “Big Trust,” and outlines it in detail in the report Personal Data and the Big Trust Opportunity. Big Trust creates new …
Analysing the maturation of standards in European cloud computing
In September 2012, the European Union released its “Unleashing the Potential of Cloud Computing in Europe” document, aiming for a yearly €160bn (£127.6bn) boost to European GDP by 2020 and a gain of 2.5m jobs from the rollout of cloud.
A full 15 months later, the response from the European Telecommunications Standards Institute (ETSI) was published. The ‘Cloud Standards Coordination’ report, requested by the European Commission, aimed to assess Commission VP Neelie Kroes’ claim that there was a “jungle of technical standards.”
ETSI asserted that cloud standardisation was “much more focused tha[n] anticipated” and added the landscape was “complex but not chaotic.”
The executive summary outlined the state of play in the key areas: important gaps in standards were identified and new standards encouraged, while the legal environment for cloud computing remains “highly challenging.”
ETSI believes cloud standardisation will mature in the next 18 months.
“Though …
Federated Clouds for Emergency Response
Recent high-profile events (the 2010 Haitian earthquake, the 2011 Tōhoku earthquake and tsunami, and 2013’s Typhoon Haiyan/Yolanda) have highlighted the growing role played by the international community in successful humanitarian assistance and disaster response. These events also showcased the critical importance of quickly providing robust information technology resources to response-effort participants. In June 2010, as part of its continuing effort to foster international collaboration, the National Geospatial-Intelligence Agency (NGA) initiated a dialog with the Network Centric Operations Industry Consortium (NCOIC) to discuss this and other aspects of geospatial information-sharing across the international community. In response, the NCOIC, using a cloud services brokerage paradigm, built and demonstrated a federated cloud computing infrastructure capable of managing the electronic exchange of geospatial data. The effort also led to the development of the more generalized NCOIC Rapid Response Capability Pattern (NRRC), a process that could improve the effectiveness and reduce the cost of international joint civilian/military emergency responses.
Breaking Down Enterprise Silos in the Cloud
Are you re-creating existing technology silos in the cloud? If so, your entire enterprise investment in the cloud is at risk.
From the perspective of IT, organizational silos seem to be the root of all problems. Every line of business, every department, every functional area has its own requirements, its own technology preferences, and its own way of doing things. They have historically invested in specialized components for narrow purposes, which IT must then conventionally integrate via application middleware – increasing the cost, complexity, and brittleness of the overall architecture.
Now those same stakeholders want to move to the cloud. Save money with SaaS apps! Reduce data center costs with IaaS! Build a single private cloud we can all share! But breaking down the technical silos is easier said than done. There are endless problems: Static interfaces. Legacy technology. Inconsistent policies, rules, and processes. Crusty old middleware that predates the cloud. And everybody still has their own data model and their own version of the truth.
Weighing the Options for Onboarding Data into the Cloud
One of the questions we hear most frequently is “How do I get my data into the cloud?” For many organizations, the benefits of expanding on-premises data storage to include hybrid cloud storage have begun to resonate, but they struggle to get started as they determine how to move data into the cloud. The decision on how to onboard initial data to the cloud, or what we call the initial ingest, is one that cannot be overlooked.
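As a rough guide to that decision, a quick transfer-time estimate shows why large initial ingests are sometimes shipped on physical media rather than uploaded. This is an illustrative sketch; the data volume, link speed, and utilization figure below are assumptions, not vendor numbers.

```python
# Rough estimate of initial-ingest transfer time over the network,
# to help decide between online upload and physical shipment.

def ingest_days(data_tb: float, bandwidth_mbps: float, utilization: float = 0.7) -> float:
    """Days needed to upload `data_tb` terabytes over a link of
    `bandwidth_mbps` megabits/s at a sustained `utilization` fraction."""
    bits = data_tb * 8 * 10**12                       # TB -> bits (decimal units)
    seconds = bits / (bandwidth_mbps * 10**6 * utilization)
    return seconds / 86400                            # seconds -> days

# Hypothetical example: 50 TB over a 100 Mbit/s link at 70% utilization
print(round(ingest_days(50, 100), 1))                 # prints 66.1 (about two months)
```

An estimate on that scale usually tips the decision toward seeding the cloud from shipped drives and using the network only for incremental updates.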
While there is more than one way to perform the initial ingest, it shouldn’t be a surprise that the best solution varies from case to case. Relevant factors influencing the decision include the amount of data intended for ingestion, the available bandwidth, and the timeframe in which you want to load the data. Typically, most organizations will decide on one of the following three methods for the initial ingest:
Cloud Computing: Impact of Comcast’s Acquisition
Comcast’s recent $45 billion acquisition of Time Warner Cable was big news in the cable TV world, but it should also be big news in the areas of cloud computing and network diversity.
Instead of looking at Comcast as a cable TV provider, it is time to look at it as another alternative to traditional network carriers like AT&T and Verizon for our data needs, both at home and at the business location.
Do you have network redundancy in place for your cloud computing applications? They cannot be considered “mission critical” if they run over a single connection to the central office.
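The redundancy check above can be automated in a few lines. A minimal sketch, assuming each carrier path is reachable through its own gateway host; the hostnames here are hypothetical placeholders, not real endpoints.

```python
# Minimal sketch: verify that a cloud application has more than one
# working network path before calling it "mission critical".
import socket

def link_is_up(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:          # covers DNS failure, refusal, and timeout
        return False

# Probe the cloud service through two independently routed gateways
# (hypothetical hosts); redundancy holds only if both paths are alive.
paths = ["gw1.example.net", "gw2.example.net"]
alive = [h for h in paths if link_is_up(h)]
print(f"redundant: {len(alive) >= 2}")
```

A check like this run from a scheduler gives early warning that a “redundant” setup has quietly degraded to a single live connection.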