SYS-CON Events announced today that Super Micro Computer, Inc., a global leader in Embedded and IoT solutions, will exhibit at SYS-CON’s 18th International Cloud Expo®, which will take place on June 7-9, 2016, at the Javits Center in New York City, NY.
Supermicro (NASDAQ: SMCI), the leading innovator in high-performance, high-efficiency server technology, is a premier provider of advanced server Building Block Solutions® for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, HPC and Embedded Systems worldwide. Supermicro is committed to protecting the environment through its “We Keep IT Green®” initiative and provides customers with the most energy-efficient, environmentally friendly solutions available on the market.
Monthly Archives: May 2016
NetApp takes Q4 hit but predicts turnaround in 2018
NetApp has reported a year-on-year decline in its Q4 earnings call, though CEO George Kurian remains confident the company will return to moderate revenue growth in 2018.
Revenue for Q4 was $1.38 billion, down from the mid-year estimate of $1.425 billion, as the company reported an $8 million loss. Q3 saw the company report a profit of $135 million, while Q4 2015 had delivered a $153 million profit. Annual figures were also down 9% year-on-year to $5.6 billion, with profit decreasing to $229 million from $560 million in 2015, a fall of 59%. Although investors have not reacted positively to the news, the $842 million acquisition of SolidFire hit the books during the period, accounting for some of the weakness in the company’s performance.
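As a quick sanity check on the percentages quoted above (using only the dollar figures reported in the article):

```python
# NetApp FY2016 figures as reported above, in USD millions.
profit_fy2016 = 229
profit_fy2015 = 560
revenue_fy2016 = 5600  # $5.6 billion

# Year-on-year profit decline: (560 - 229) / 560, which matches the reported 59%.
profit_decline = (profit_fy2015 - profit_fy2016) / profit_fy2015
print(f"Profit decline: {profit_decline:.0%}")  # → 59%

# A 9% revenue decline to $5.6bn implies roughly $6.15bn of revenue a year earlier.
implied_revenue_fy2015 = revenue_fy2016 / (1 - 0.09)
print(f"Implied FY2015 revenue: ${implied_revenue_fy2015 / 1000:.2f}bn")  # → $6.15bn
```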
“As we discussed last quarter, we’re making fundamental changes to return the Company to revenue growth with improved profitability, cash flow, and shareholder returns,” said Kurian. “To deliver on this commitment, we’re executing a comprehensive and sustained transformation.”
The company has been undertaking a wide-ranging transformation project in recent months, moving the focus away from the traditional portfolio, which declined roughly 40% year-on-year, and towards the strategic growth initiatives, which grew 21% year-on-year. The strategic initiatives now account for 53% of net product revenue.
Priorities for the business will continue to focus on the delivery of the strategic initiatives, including clustered Data ONTAP, branded E-Series and All-Flash Arrays, as well as the introduction of the next generation of ONTAP in the coming weeks. The team claims the new offering will simplify customers’ IT transformations to modern data centres and hybrid cloud environments, while also giving customers greater flexibility when it comes to engineered systems, software-defined storage, or the cloud.
Kurian stated during the call he expects another tough year for fiscal 2017, though the company should be in a position to turn the corner in fiscal 2018, prioritizing the hybrid cloud market, as well as streamlining costs within the organization.
“When I took over as CEO, NetApp was dealing with several internal challenges,” said Kurian. “We were late to the All-Flash-Array market. We were not prepared to assist our installed base of customers in migrating to clustered ONTAP, and we had limited traction in the hybrid cloud.
“Heading into fiscal year ’17, our momentum with customers is accelerating. Data is at the heart of our customers’ IT transformation efforts and this is where NetApp has a profoundly important role to play. Our strategic relevance to customers’ digital transformation roadmaps is evidenced by the growth of our strategic solutions. We’re making meaningful progress, but still have work ahead of us and remain focused on execution. I remain highly confident in NetApp’s potential.”
Under Kurian’s leadership, NetApp has seemingly been forced into a wide-ranging transformation project to remain relevant in current and future market conditions. By its own admission, NetApp was not ready for the cloud-oriented world and was too focused on legacy products; however, the team has outlined the roadmap it believes will drive the company back into growth.
Firstly, the team is positioning the clustered ONTAP offerings, as well as SolidFire, to reinforce the company’s position as a supplier of technology to the cloud, both to service providers and to enterprise organizations managing private cloud environments. Kurian claims the company leads the way in enterprise storage and data management technology for OpenStack cloud deployments.
Within the hyperscale segment, the company reported healthy growth for product offerings which combine hyperscale and cloud computing environments. Kurian noted that, while it is early days, the company is making progress in carving out market share in the segment.
Finally, Kurian highlighted that there have been a number of examples throughout the year of companies transitioning data back to on-premise platforms from public cloud environments. The team believes its current offering positions it well to capture those workloads as they transition.
Although Kurian has put a positive spin on the year-on-year declines, outlining in depth the company’s strategy to return to prominence within the newly defined market, the market has reacted slightly differently, with shares falling almost 7% in after-hours trading.
Dell targets SMBs in China with launch of new company
Dell has prioritized growing its presence within the Chinese market, targeting SMBs and public sector organizations, according to China Daily.
Speaking at the China Big Data Industry Summit in Guiyang, Dell CEO Michael Dell announced the launch of a new company, alongside its local partner, to gain traction within the lucrative market. Guizhou YottaCloud Technology will now act as a means for Dell to access the local market, prioritizing small and medium-sized enterprises and local governments in the first instance.
“China will play an increasingly important role in the big data era and the United States-based tech giant will speed up efforts to develop new products for the market,” said Dell at the conference.
Dell is one of a number of organizations which have prioritized local partnerships in the Chinese market, as locals tend to favour Chinese businesses and technologies over foreign counterparts, citing security as the main driver. The country itself is a big draw for Dell as a business, representing its second-largest market worldwide, behind only the US. The company also highlighted in September that it plans to invest $125 billion in the Chinese market over the next five years, with cloud computing as the focal point.
Last year Dell launched its ‘In China for China’ strategy, which not only included the above investments, but also a drive from its venture capital arm in China to encourage entrepreneurialism, an expansion of its R&D function in the country, and the establishment of an artificial intelligence and advanced computing joint lab with the Chinese Academy of Sciences. The AI research will focus on the areas of cognitive function simulation, deep learning and brain-computer simulation.
“The Internet is the new engine for China’s future economic growth and has unlimited potential,” said Dell in September. “Being an innovative and efficient technology company, Dell will embrace the principle of ‘In China, for China’ and closely integrate Dell China strategies with national policies in order to support Chinese technological innovation, economic development and industrial transformation.”
451 Research argues ‘race to the bottom’ in cloud is a red herring
The cloud services sector is still a long way off from being a commodity market, according to the latest note from analyst house 451 Research.
The results, which have been published in the firm’s latest Cloud Price Index (CPI), show the ‘race to the bottom’ in cloud pricing, as exemplified by continued price cuts from Amazon Web Services (AWS), Microsoft, Google and others, is something of a red herring, and that the supply of higher-value services will be key to long-term growth for vendors. That said, the researchers note VM prices fell 12% on average over the past 18 months, while prices for NoSQL, load balancing, and bandwidth, among others, remained stable.
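For context, the 12% figure is an aggregate over 18 months; compounding it down to an annual rate (a back-of-the-envelope calculation, not part of 451’s CPI methodology) gives roughly 8% per year:

```python
# A 12% drop over 18 months, treated as a compound decline,
# annualises to roughly 8.2% per year: 1 - (1 - 0.12) ** (12 / 18).
decline_over_period = 0.12
period_months = 18
annualised_decline = 1 - (1 - decline_over_period) ** (12 / period_months)
print(f"Annualised VM price decline: {annualised_decline:.1%}")  # → 8.2%
```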
As a result, 451 has come up with the ‘cloud commodity score’ (CCS) metric, which measures customers’ sensitivity to price by region. The researchers found that the US was the region most likely to have market share driven by cheaper prices, yet argued Europe and APAC present more opportunities to vendors because the markets are ‘fractured’.
“Despite all the noise about cloud becoming a commodity, our research demonstrates a very limited relationship between price and market share,” said Owen Rogers, 451 Research digital economics unit research director. “Cloud is a long way from being a commodity. In fact, the real drama is the race to the top rather than the race to the bottom.”
This publication has examined the supposed disparity between the major cloud players and how their pricing stacks up. Research from Cloud Spectator back in March found 1&1 to be the best ranked European cloud provider, based on performance as well as price, well ahead of Google, Azure, and AWS. Kenny Li, CEO of Cloud Spectator, told this publication that smaller players have an advantage by offering high-performance infrastructure at more competitive prices.
CIOs prioritize collaboration to increase security – Intel
Intel Security has released new findings which claim CIOs are targeting collaboration as a means to shore up defences against cyber threats.
Respondents to the survey believe their own organizations could be between 38% and 100% more secure if threat management and incident response personnel and systems could simply collaborate better. The team believes collaboration is one area which is often overlooked, with decision makers often favouring new threat detection or prevention tools, though security operations’ effectiveness can be increased through better collaboration between silos within the organization.
“Threat management contributions are almost evenly spread among different roles, but there are some notable areas of specialization,” the company stated in its “How Collaboration Can Optimize Security Operations” report. “Every handoff or transition can add significant operational overhead—along with the potential for confusion and chaos and delays in responding. But, on the upside, there is also huge potential for collaboration and increased efficiencies.”
The report states CIOs are still prioritizing new tools as a means to shore up their own perimeters, though collaboration technologies were not far behind in the rankings: 40% of respondents said their spend would be prioritized on better detection tools, 33% pointed towards preventative tools, and 32% towards improved collaboration between SOC analysts, incident responders and endpoint administrators.
One of the main challenges for these organizations is the process, accuracy and trust involved in communication. For a number of organizations, data is shared manually and potentially reprocessed several times, increasing the possibility of inaccuracy. Automated collaboration tools ensure data is shared quickly and accurately across an array of different functions and responsibilities. “Trust arises from good communication, transparency, and accountability, all of which engender confidence in the outcome,” the report states.
The number of tools being used within these organizations is also a challenge, as data is often transferred between them, or collected centrally, by hand. The average number of tools companies use to investigate and close an incident is four, though 20% of respondents said they can use up to 20 different products to achieve the same aims, further increasing the challenge. Though larger and more geographically diverse organizations will by definition use more tools, the same principles of collaboration and automation apply, and in theory could increase the security of an organization’s perimeter.
“Tougher new EU data privacy regulations, which are currently in the process of being modernized, will be implemented in 2017,” said Raj Samani, EMEA CTO for Intel Security, in the report. “Organizations will be legally required to implement a security architecture that ensures a secure and trustworthy digital exchange of data throughout the EU. Data privacy needs to be assured at every level and across the entire infrastructure. In light of that, improved incident investigation and response processes that bring together collaborative tools and teams are imperative.”
While most organizations are answering increasingly advanced cyber threats with the implementation of more advanced defence solutions, collaboration could be seen as a complementary measure. Collaboration can contribute to real-time visibility for various teams, improve execution capabilities, and speed up response.
Here’s why private and hybrid cloud are here to stay
When private and hybrid cloud technology first appeared, some pundits predicted that they wouldn’t last. Wasn’t everything going to the public cloud? But last they did, and there are several good reasons why private and hybrid cloud are here to stay.
For one thing, some companies are balking at the cost of maintaining public cloud deployments once their workloads and storage grow into the tens of petabytes. In addition, some vertical markets (financial services, for example) mandate tight internal security controls, so the public cloud is not an option for many aspects of their business. Finally, enterprise customers want to be able to choose the cloud solution that’s best for them, and they don’t want to be mandated to use public cloud if their circumstances dictate otherwise.
Public cloud providers rake in billions of dollars in service fees because using a public cloud is an easy thing to do. With a few mouse clicks, users can activate cloud resources and scale them indefinitely without having to worry about housing or maintaining hardware or developing the in-house expertise to build a cloud on premises. Cloud computing was born as public cloud, and many cloud computing advocates felt that on-premises or co-located clouds were just a fancy name for the same old data center resources.
But rather than declining, private and hybrid cloud deployments are growing because these approaches have valid roles within enterprises. In most cases, it’s not an either/or decision between one type of cloud and another. A more likely scenario is that most enterprises will use a mix of public, private and hybrid clouds as IT departments try to balance security, costs, and scalability.
Leveraging a private or hybrid cloud computing model has three advantages. First, it provides a clear framework for deciding which workloads belong in the public cloud. Specific aspects of existing IT infrastructure (storage and compute, for example) can be placed in public cloud environments while the remainder stays on premises. With business intelligence applications, for example, it may be better to keep the data and its analytical processing local rather than migrate gigabytes of operational data to the public cloud.
Second, using a private or hybrid model delivers more flexibility in getting maximum leverage from the computing architecture, since you can mix and match local infrastructure (typically a sunk cost, but difficult to scale) with infrastructure that is scalable and provisioned on demand. IT departments can choose where best to place applications depending on their needs and cost structure.
Finally, the use of hybrid or private cloud computing validates the idea that not all IT resources should exist in public clouds today, and some may never be moved there. Given compliance issues, performance requirements, and security restrictions, the need for local resources is a fact of life. Experience with the private or hybrid model helps users better understand which compute cycles and data have to be kept local and which can be processed remotely.
The argument against private or hybrid cloud points to large hardware and software investments required and the depth of in-house expertise needed to make it happen. However, newer cloud platforms address these pain points.
In some cases, cloud infrastructure vendors are offering converged (compute/storage/networking) appliances that can be set up and running in minutes. These appliances provide scalable building blocks to support private and hybrid clouds with ample resources, and they deliver a better ROI than traditional data centre hardware.
Another approach is to use standard OpenStack APIs with a self-healing architecture that reduces the management burden. A few vendors use SaaS-based portals that handle management and operations with complete health monitoring and predictive analytics to prevent problems before they occur. By enabling curated updates through automated patching and upgrades, the end user receives the best possible service experience with minimal effort required of the administrator.
So while private and hybrid cloud once required plenty of in-house cloud-building expertise, it’s not true today. Modern cloud platforms address many of the objections to using private or hybrid cloud, and these architectures can work together with public cloud to give enterprises the performance and cost-effectiveness they seek. Today’s private and hybrid cloud platforms provide the same ease-of-use as public cloud infrastructure while delivering the flexibility, security, performance and control many enterprises demand.
Salesforce to run some core services on AWS
Salesforce has announced it will run some of its core services on AWS in various international markets, as well as continuing investments into its own data centres.
The announcement comes two weeks after the company experienced a database failure on its NA14 instance, which caused a service outage lasting some 12 hours for a number of customers in North America.
“With today’s announcement, Salesforce will use AWS to help bring new infrastructure online more quickly and efficiently. The company will also continue to invest in its own data centres,” said Parker Harris, on the company’s blog. “Customers can expect that Salesforce will continue to deliver the same secure, trusted, reliable and available cloud computing services to customers, regardless of the underlying infrastructure.”
While Salesforce does not appear to have suffered any serious negative impact from the outage in recent weeks, the move could be seen as a means to rebuild trust in its robustness, leaning on AWS’ brand credibility to provide assurances. It would also give the Salesforce team options should another outage occur within its own data centres. The geographies this announcement will apply to had not been announced at the time of writing.
Sales Cloud, Service Cloud, App Cloud, Community Cloud and Analytics Cloud (amongst others) will now be available on AWS, though the move does not mean Salesforce is moving away from its own data centres; investment there will continue, as the AWS deployment appears to be a failsafe for the business. In fact, Heroku, Marketing Cloud Social Studio, SalesforceIQ and IoT Cloud already run on AWS.
“We are excited to expand our strategic relationship with Amazon as our preferred public cloud infrastructure provider,” said Salesforce CEO Marc Benioff. “There is no public cloud infrastructure provider that is more sophisticated or has more robust enterprise capabilities for supporting the needs of our growing global customer base.”
MangoApps to Exhibit at @CloudExpo New York | @MangoSpring #Agile #IoT #DevOps
SYS-CON Events announced today that MangoApps will exhibit at SYS-CON’s 18th International Cloud Expo®, which will take place on June 7-9, 2016, at the Javits Center in New York City, NY.
MangoApps provides modern company intranets and team collaboration software, allowing workers to stay connected and productive from anywhere in the world and from any device.
For more information, please visit https://www.mangoapps.com/.
IoT-Blueprint for a Connected Economy | @ThingsExpo #IoT #M2M #DigitalTransformation
The IoT is changing the way enterprises conduct business.
In his session at @ThingsExpo, Eric Hoffman, Vice President at EastBanc Technologies, will discuss how businesses can gain an edge over competitors by empowering consumers to take control through IoT. He’ll cite examples such as a Washington, D.C.-based sports club that leveraged IoT and the cloud to develop a comprehensive booking system. He’ll also highlight how IoT can revitalize and restore outdated business models, making them profitable again. Lastly, he’ll provide insight on the importance of properly establishing IoT in the cloud and the four steps you need to take to make your IoT program improve your service quality.
No-Code/Low-Code Apps | @CloudExpo @Kintone_Global #IoT #DataCenter
Cloud-based NCLC (No-code/low code) application builder platforms empower everyone in the organization to quickly build applications and executable processes that broaden access, deepen collaboration, and enhance transparency for all team members.
Line-of-business owners (LOBOs) and operations managers know their part of the business and their processes best. IT departments are beginning to leverage NCLC platforms to empower and enable LOBOs to lead innovation, transform the organization, and build the infrastructure for lasting productivity and agility.