VoIP Implementation – Why Won’t the Audio Work?

I recently worked on a project that ended up a success, though at first it looked like it might be a failure. We were doing a Voice over IP implementation and putting in a new switch network for a client with eight sites. When the time came for implementation, we ran into some difficulty getting audio working out of the local branches. At first we were stumped, but it turned out to be an issue with a third-party provider. In the video, I discuss what the issue was, how we found it, and how we remedied it. I also provide some tips on how to avoid similar challenges. Hope you enjoy!

 


Watch the video on GreenPages’ YouTube Channel.

 

If you have any questions around unified communications, please reach out!

 

 

By Ralph Kindred, Practice Director, Unified Communications

Lenovo and Nutanix combo to run private cloud over hyperconverged datacentres

Datacentre hardware maker Lenovo is to install Nutanix software in a bid to speed up the process of building the infrastructure that supports private clouds.

The new family of hyperconverged appliances will be sold by Lenovo’s sales teams and its global network of partners.

Nutanix makes its own units that converge storage, server and virtualisation services into an integrated ‘scale-out’ appliance, but in this partnership Lenovo will use its own hardware to run the Nutanix software. The objective is to simplify data centre building by pre-engineering most of the integration tasks, and to make data centre management easier. This, say the manufacturers, will cut down both the cost and the construction time of creating the foundation for a private cloud. It also, claims Nutanix, lowers the cost of ownership by creating modules in which moves and changes are easier to conduct and management is simpler.

By running the jointly created convergence appliances on Lenovo hardware, the partners can take full advantage of Lenovo’s close ties with Intel and run its latest processors. Lenovo said it is making ‘sizeable investments’ in a dedicated global sales team to support the new converged appliances for datacentre builders. Lenovo and Nutanix say they are jointly planning more co-development in platform engineering and coding, as well as joint marketing initiatives.

“Lenovo can bring a new perspective to the global enterprise space,” said Lenovo CEO Yang Yuanqing. “Nutanix’s well recognised technology leadership can dramatically reduce complexity in data centres of all sizes.”

The Lenovo OEM partnership with Nutanix goes well beyond typical alliances, said analyst Matt Eastwood, Senior VP for IDC’s Enterprise Infrastructure and Datacenter Group. “This partnership will accelerate the reach of hyperconverged infrastructure,” he said.

How to Play Fallout New Vegas on Mac

Are you (not so) patiently waiting for Fallout 4 to come out next week? Are you frustrated when you go to install Fallout New Vegas on your Mac—for some Fallout fun while you wait, of course—only to find it’s PC-only? Save your Darth Vader scream (James Earl Jones, not Hayden Christensen), there is (a new) hope! You […]

The post How to Play Fallout New Vegas on Mac appeared first on Parallels Blog.

Accenture Acquires Cloud Sherpas

Cloud Sherpas, an Atlanta-based global technology service provider, has been acquired by Accenture. Cloud Sherpas provides cloud-based services to companies that want to manage their customer relations and make their IT processes more efficient. Since its founding in 2008, Cloud Sherpas has grown rapidly to around 1,100 employees worldwide, in countries including Australia, India, Japan, New Zealand, the Philippines, Singapore, the United Kingdom, and the United States.

Accenture acquired Cloud Sherpas to strengthen its position as a leading provider of cloud-based services to clients that want cloud-based solutions. The purchase combines the two companies’ skill sets: Accenture’s cloud strategy and technology consulting and Cloud Sherpas’ cloud implementation, integration, and management services.


Cloud Sherpas’ employees will join the newly formed Accenture Cloud First Applications team. This team will continue to provide Cloud Sherpas’ services around cloud platforms such as Google, NetSuite, Salesforce, ServiceNow, and Workday. Paul Daugherty, Accenture’s Chief Technology Officer, commented: “The addition of Cloud Sherpas to the Accenture Cloud First Applications team reinforces our position as a leading global provider of enterprise cloud services. We welcome the extremely talented Cloud Sherpas specialists to Accenture, where together we are even better positioned to help clients move their businesses to the cloud and achieve significant business results faster than ever before.”

Details pertaining to the amount paid for Cloud Sherpas were not disclosed.

About Accenture:

Accenture plc is a multinational management consulting, technology services, and outsourcing company. Founded in 1989, Accenture invests heavily in areas such as training, acquisitions, emerging technology, and new offerings and assets. The company is incorporated in Dublin, Ireland.

The post Accenture Acquires Cloud Sherpas appeared first on Cloud News Daily.

Meet the data centre technicians – the people who keep the online economy working

(c)iStock.com/4x-image

According to IDC figures, there will be 8.6 million data centres worldwide by 2017, all powering the Internet economy. The technology analyst firm believes that, over the next five years, the majority of organisations will cease to manage their own IT infrastructure and turn to dedicated and shared cloud offerings in service provider data centres.

Smaller data centres and in-house server rooms are on the wane, being replaced by ‘mega data centres’ that house the servers that power the Internet, our cities, and the economy. IDC estimates that these will account for almost three quarters (72%) of all service provider data centre construction in terms of space, while also accounting for 44% of all new high-end data centre space around the world (up from 19% in 2013).

While large companies run their own data centres – Facebook, for example, recently announced it is spending at least $500 million (£324m) on a new facility in Fort Worth, Texas – most organisations either rent space, known as colocation, or buy cloud computing capacity.

Data centres are the engine rooms enabling the internet economy to run – and it’s an area where the UK is a leader. Research from Tariff Consulting shows that the UK data centre market is the largest in Europe by space and power. Tariff also suggests that the growth in demand has led to a surge in construction of data centres outside London, in places such as Slough and Manchester.

But these data centres need to be staffed – and it’s the people who run them who have the responsibility of keeping the internet economy running.

Around a dozen people will be responsible for running a typical data centre at any point in time, working around the clock on shifts to constantly monitor vital signals to ensure that power is available and the server rooms are kept cool. They’re a mixture of technicians, engineers, and facilities staff.

My job, as data centre services manager for CenturyLink, is to ensure CenturyLink’s data centre in Slough keeps running and is managed as efficiently as possible. This involves ensuring that equipment is installed and maintained properly, and customers that rent space in the data centre follow best practice guidelines.

An example of this is ensuring that cabling is installed properly. When it isn’t, it can interfere with airflow, which has implications for temperature control. Poor temperature control affects how much power is used, and potentially causes problems with the servers, if they overheat.

The first thing you notice when walking around a data centre is the quiet, with just the faint hum of hundreds of servers permeating the space. CenturyLink’s Slough data centre is roughly the size of an average supermarket, divided into dozens of server rooms, run by a small team of engineers, technicians, and facilities staff at any point in time.

Colocation customers bring their own servers in; our job is to ensure they have the power and cooling to run their systems in an optimum environment. They are responsible only for what happens in their own cage and have no interaction with any other company – it’s their space. As a result, security is paramount and data centre workers are thoroughly background-checked before getting a job at CenturyLink. These are CenturyLink employees, not outsourced staff, which we – and our customers – feel is more secure and gives greater peace of mind. Even visitors entering the data centre are subject to numerous security checks.

An important part of the role is handling outages or safety issues such as fire or power failure. While these are rare in any data centre, preparation is key to minimising the risk. That preparation primarily involves monitoring systems, put in place so we know the ‘health’ of the data centre at any point in time. Should anything happen that’s beyond our control, we have an escalation process to deal with it – such as standby generators and an uninterruptible power supply in the case of power failure.

Data centre technicians come from a variety of backgrounds, but usually with a strong interest in IT, electronics, or electrical engineering. The technician role mixes desktop support and knowledge of routers and networking equipment with practical skills such as cabling or repairing power equipment. It’s literally a hands-on job, and there’s more variety than you might think.

Technicians tend to stay in the industry once they join it, and as Slough and other areas outside London become bigger data centre hubs thanks to falling connectivity costs, there are plenty of job opportunities for those who want to enter the profession. Working with people who share my passion for a job well done is important – but it’s also rewarding to know that we’re an essential cog in the wheel of the internet economy.

Bitcasa CEO Brian Taptich: Competing with Microsoft, Amazon, Google a “suicide mission”

(c)iStock.com/Filograph

Brian Taptich, the CEO of developer-centric cloud storage provider Bitcasa, has told this publication that Microsoft shuttering its unlimited OneDrive storage policy is “definitely not a failure” and other players in the space are risking a “suicide mission” by competing against the hypervendors on their own terms.

Earlier this week, the Redmond giant announced it was to cap each Office 365 subscriber’s OneDrive account at 1 TB because some users interpreted ‘unlimited’ to its fullest extent. Some users had entire movie collections in Microsoft’s cloud, with the scales topping 75 TB – or 14,000 times the average customer’s data usage – in extreme cases.
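For a quick sense of scale, those figures imply the average customer stores only a few gigabytes. A back-of-the-envelope check, sketched here in Python and assuming decimal units (1 TB = 1,000 GB):

# Back-of-the-envelope check of the figures quoted above (decimal units assumed).
extreme_usage_tb = 75          # storage used by the heaviest accounts, in TB
multiple_of_average = 14_000   # "14,000 times the average customer's data usage"

average_usage_gb = extreme_usage_tb * 1_000 / multiple_of_average  # TB -> GB, then divide
print(f"Implied average usage: {average_usage_gb:.1f} GB per customer")  # roughly 5.4 GB

In other words, the heaviest users were storing orders of magnitude more than the typical customer the pricing was built around.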

For Bitcasa, this represents a coming full circle. This time last year, as Microsoft was opening up its OneDrive service for users to store the entire history of recorded cinema in, Bitcasa was shutting down its own $10 all-you-can-eat plan for a similar reason: there was more than a suspicion of businesses using and abusing individual unlimited accounts.

Taptich said he could “empathise” with Microsoft after its decision to stop unlimited accounts, but rejected the idea that it was akin to waving the white flag. “This is definitely not a failure or an admission of failure,” he tells CloudTech. “I suspect Microsoft just learned that, however theoretical models may have supported the efficacy of offering unlimited storage for a fixed – and low – fee, in the almost entirely frictionless world of data transfer, the first users who show up to the all-you-can-eat buffet break the model with their unimaginable volumes of data.”

Bitcasa’s move away from unlimited, Taptich argues, was not so much a case of trying to recoup lost money as of trying to pull engineering resources out of a black hole. But it wasn’t just a case of making the decision, clicking their fingers and switching over.

The company suspected the unlimited plan wasn’t working out as early as 2013, but it took until late 2014 to implement the change. As Taptich explains: “There were two important reasons to continue offering the service. Because we had grown to over 30 PB of data under management from users across 120 countries, the additional time provided an amazing sandbox to test and refine the scalability and performance of the underlying infrastructure that is now the backbone of our developer-focused platform.”

He adds: “We [also] had a small but passionate user base, and we made the decision to invest in building systems that would provide our users relatively seamless ways to transition to our new tiers or, in some cases, to retrieve their data.”

Taptich argues that, while the past year has not been without its challenges, the company’s biggest struggle has been to stay patient. “Companies like Apple and Google have understood for a number of years that whoever owns the customer data owns the customer, but it’s only in the past 12 months that the balance of the connected – and increasingly mobile – ecosystem has woken up,” he explains. “They are playing catch up, and we have a platform which solves their desire to maintain customer ownership without having to custom build solutions.”

Despite arguing that fighting “trench warfare” with Google, Microsoft, and Amazon is not a smart plan, as the battle to provide public cloud infrastructure is being fought alongside companies with trillions of dollars in market capital, Taptich insists there is an “enormous amount of opportunity” for smaller players long term.

Take enterprise-centric file sync and share vendor Egnyte. While its funding pales in comparison to the likes of Dropbox and Box, its laser focus on enterprise customers – and in particular on picking up second-generation enterprise customers – gives it a viable target.

“You have to realise that we are in the middle of an extraordinary transformation from local, device based resources to remote, cloud based resources,” Taptich explains. “All data will eventually reside remotely, and the speed with which this happens is purely a function of connection speed and ubiquity, and security.

“Whether it takes two years or 20 years, this is a fundamental shift that represents multi-hundreds of billions of dollars,” he adds. “And in the short term, successful companies will be laser focused on providing services which enhance the 11 nines reliability of the public cloud, will be targeting customers with a user base every bit as scaled, and will be building a financial model that benefits from the plummeting costs of underlying public cloud infrastructure.”

At the very least, this is the theory which Bitcasa expects to come true.

Kintone Named “Bronze Sponsor” of @CloudExpo 2016 NY | @Kintone_Global #Cloud

SYS-CON Events announced today that Kintone has been named “Bronze Sponsor” of SYS-CON’s 18th International Cloud Expo®, which will take place on June 7-9, 2016, at the Javits Center in New York City, NY.
Kintone promotes cloud-based workgroup productivity, transparency and profitability with a seamless collaboration space, a build-your-own-business-application (BYOA) platform, and a workflow automation system.


Join @Sensorberg at @CloudExpo Silicon Valley | #IoT #Cloud #BigData #Microservices

SYS-CON Events announced today that Sensorberg will exhibit at 17th Cloud Expo, which will take place on November 3–5, 2015, at the Santa Clara Convention Center in Santa Clara, CA.
Since 2013, Sensorberg GmbH, based in Berlin, has provided a beacon-based, platform-independent, all-in-one proximity campaign solution. The Sensorberg cloud-based management platform is the company’s core product and enables proximity campaigns to be planned, designed and managed. Sensorberg provides open source SDKs (software development kits) that can be incorporated into any app, rendering it beacon-compatible, and supports all beacon standards: Apple’s iBeacon, Google’s Eddystone and Microsoft’s Windows 10 beacon standard.


IBM targets API Harmony in the cloud

IBM is to use the cloud to deliver IT developers into a state of API Harmony.

The vendor-turned-service-provider has launched an intelligent, cloud-based matchmaking technology that helps coders instantly find the right application programming interface (API) for the right occasion. The service, API Harmony, aims to save the world’s developers time and money in an API economy that IBM’s internal research predicts will be worth $2.2 trillion by 2018.

The system uses cognitive technologies to anticipate the needs of developers as they build new apps, pre-empting some of the delays they face by anticipating their interface challenges and researching the answers. It then makes time-saving recommendations on which APIs to use, how those APIs relate to one another, and anything that might be missing.
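IBM has not published API Harmony’s interface in this announcement, so the following is purely an illustrative sketch of the kind of recommendation the article describes – input describing the app being built, output suggesting APIs, related APIs and gaps. All names and the keyword rule below are invented for illustration; the real system, as described above, uses cognitive techniques rather than keyword matching.

from dataclasses import dataclass

# Hypothetical data shapes and logic only -- not IBM's actual API Harmony interface.
@dataclass
class ApiRecommendation:
    api_name: str          # API suggested for the task at hand
    related_apis: list     # APIs commonly used alongside it
    missing_pieces: list   # capabilities the current design appears to lack

def recommend_apis(app_description, apis_in_use):
    """Toy matchmaking rule: suggest a payments API when the app mentions checkout."""
    recommendations = []
    if "checkout" in app_description.lower() and "payments" not in apis_in_use:
        recommendations.append(ApiRecommendation(
            api_name="payments",
            related_apis=["currency-conversion", "fraud-screening"],
            missing_pieces=["webhook endpoint for payment confirmation"],
        ))
    return recommendations

print(recommend_apis("mobile checkout flow for a retail app", apis_in_use=["catalogue"]))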

The API economy is characterised by IBM research as a commercial exchange of business functions and competencies via APIs. IBM says it is the driving force behind most digital transformation across industries today. By 2018, according to research company Ovum, the number of enterprises with an API programme will have grown by 150%.

There are three pillars of harmoniousness in the API economy, according to IBM, and accordingly its API Harmony service has three main components: Strategy, Technologies and Ecosystems. The Strategy element consists of IBM’s API Economy Journey Map, in which consultants help clients identify key opportunities and gauge their readiness for the journey. The Technologies aspect of the service is built on the intelligent matchmaking system described above and cloud delivery. The Ecosystems component is the fruit of an IBM collaboration with the Linux Foundation to create an open platform for building, managing and integrating open APIs.

IBM’s Watson APIs are managed by IBM API Management on Bluemix, bringing approximately 30 cognitive-based APIs to industries.

“To succeed in the API Economy enterprises need an open ecosystem and IBM is helping guide clients every step of the way,” said Marie Wieck, General Manager for IBM Middleware.

Cloud migrations driven by bosses, business leaders and board – report

The majority of cloud migrations are driven by the three Bs – bosses, board members and business leaders – as technology experts become marginalised, says a new report. However, the report also indicates that most projects end up being led by a technology-savvy third party.

Hosting vendor Rackspace’s new ‘Anatomy of a Cloud Migration’ study found that CEOs, directors and other business leaders – rather than IT experts – are behind 61% of cloud migrations. Perhaps surprisingly, 37% of these laymen and laywomen see their cloud migration projects right through to completion, according to the study.

The report, which compiled feedback from a survey of 500 UK IT and business decision-makers, also revealed what’s in the cloud, why it’s there and how much IT has been moved to the cloud already. There was some good news for technology experts: the report also indicates that one of the biggest lessons learned was that cloud migration is not a good experience, and that the majority of companies end up consulting a third-party supplier. In the end, however, nine out of ten organisations reported that their business goals were met, albeit only ‘to some extent’. The report was compiled for Rackspace by Vanson Bourne.

Among the 500 companies quizzed, an average of 43% of the IT estate is now in the cloud. Cost cutting was the main motive in 61% of cases.

Surprisingly, 29% of respondents said they migrated their business-critical applications first, rather than embarking on a painful learning curve with a less important application. The report did not cross-reference this figure with the figures for migrations led by CIOs. However, 69% of respondents said they learned lessons from their migration that will affect future projects, which almost matches the 71% who didn’t make a mission-critical application their pilot migration project.

Other hoped-for outcomes cited by the survey group were improvements in resilience (in 50% of cases), security (38%), agility (38%) and stabilising platforms and applications (37%).

A move to the cloud is no longer an exclusive function of the IT department, concluded Darren Norfolk, UK MD of Rackspace. “Whether business leaders understand the practicalities of a cloud migration project or not, there appears to be broad acceptance that they can do it,” he said.