
Why the HeartBleed bug highlights problems with computer crime laws

18 Apr 2014
by
Tom Cross

One of the most unusual aspects of the HeartBleed vulnerability is the impact that it has had on consumers. Usually, when a computer security vulnerability is disclosed, the only thing that people have to worry about is making sure that their own software is up to date. In this case, consumers have been asked to pay attention to the software behind the websites that they use instead. They’ve been told that it’s not safe to log in to vulnerable websites, and that they have to change their passwords on sites that have been patched.


Naturally, these instructions have prompted people to look for ways to test websites in order to see if they are vulnerable. A number of services have cropped up that make it easy to run these tests, and there are even browser plugins that automatically test every website you visit as you surf the web. These tests are the best way for consumers to determine whether or not a site is safe to log into.


But could these tests have put consumers on the wrong side of the law?


Running a HeartBleed test on a website involves sending traffic to the web server that exploits the vulnerability, and reveals information in the server’s memory that you may not be authorized to see. Even if you are running a tool that doesn’t show you the data that was returned, the tool still ran the attack and retrieved that data. Running one of these tests could be construed as unauthorized access to a computer system and could put you on the wrong side of the federal Computer Fraud and Abuse Act.
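
To make the mechanics concrete, here is a minimal, illustrative Python sketch of the malformed heartbeat record at the centre of the bug. It is not a working test tool (it omits the TLS handshake entirely); it simply shows how the attacker-controlled length field can claim far more payload than is actually sent, which is exactly the mismatch a vulnerable OpenSSL server failed to check.

```python
import struct

def heartbeat_request(claimed_length: int, payload: bytes) -> bytes:
    """Build a TLS heartbeat message whose length field lies about the payload size.

    A patched server compares claimed_length against the bytes actually
    received and ignores the request; a vulnerable server echoes back
    claimed_length bytes from its own memory.
    """
    hb_type = 0x01                      # 1 = heartbeat request
    body = struct.pack(">BH", hb_type, claimed_length) + payload
    # Wrap in a TLS record: content type 24 (heartbeat), TLS 1.1, body length.
    record = struct.pack(">BHH", 0x18, 0x0302, len(body)) + body
    return record

# Claim a 16 KB payload while sending none -- the Heartbleed mismatch.
malformed = heartbeat_request(claimed_length=0x4000, payload=b"")
print(malformed.hex())
```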


I’m not seriously suggesting that the FBI might launch a criminal investigation into the many thousands of people who’ve used these tests over the past few days. On the contrary, the problem is not with the behavior that people are engaged in; the problem is with the law.


Currently, our computer crime laws don’t do a good job of drawing a distinction between behavior that is intended to investigate a technical security issue and behavior that has truly malicious intent. Security researchers have often found themselves facing criminal charges for poking at systems that are connected to the open Internet. Furthermore, our laws don’t do a good job of distinguishing misdemeanor behavior from felonies, so those criminal charges can sometimes be unreasonably severe.


In general, website operators will tell you that they don’t want anybody to be able to test the security of their systems without authorization, but the HeartBleed vulnerability provides a perfect example of a situation where the security concerns of consumers outweigh those of website operators. The law should anticipate and accommodate this sort of scenario.


It is important to recognize that the Internet is a public place. When everyone has a computer connected to the Internet and anyone can use their computer to send packets to any other computer, the law must take a realistic approach to distinguishing actions with computers that involve different degrees of malice. In general, sending a request to a server on the open Internet shouldn’t be a crime unless there is real intent to cause harm or violate privacy. Felony prosecutions should be reserved for the most serious offenses. As the Internet now plays a central role in the lives of ordinary people, it is more important than ever that our actions online be governed by a set of criminal laws that are practical and fair.


Tom Cross is the director of security research at Lancope

Is Mark Zuckerberg changing telecoms as we know it?

17 Apr 2014
by
John Hayduk

In his keynote address at MWC 2014, Mark Zuckerberg talked about his goal of connecting everyone on the globe, centred on his involvement in the Internet.org consortium. In his vision, hundreds of millions of new users living in industrialising nations access the Internet on their handsets for free or after paying a nominal charge. That sounds like great news for app developers and for the Facebooks of the world, especially given Zuckerberg’s recent WhatsApp acquisition and his plans for it to connect one billion people worldwide through data and voice messaging. But what does this vision mean, in the short term and the long term, for the mobile operators of today?

The goals of internet.org are entirely laudable and I wholeheartedly agree that connecting people is essential for the planet on every level. Just one third of the world’s population has access to the Internet, and the pace of growth is slowing. However, the plan is not without its challenges. The approach outlined by Zuckerberg is undoubtedly Facebook-focused, with Facebook and the OTT content providers as the beneficiaries. The opportunities for telecommunications companies are slim as things stand.

The goal of internet.org is to level the online playing field and give people access to services in an almost democratic way. Should it have the impact it is clearly aiming for, operators will need to look for new means of generating revenues. In the short term the SMS revenue stream will be at risk as more users adopt WhatsApp in its place. In the long term, consumers are unlikely to want to pay a premium for a newly redefined set of core services (messaging, social and voice) when such services are being given away to achieve a new level of global connectivity.

Why does this matter? Well, for a start, where will the bandwidth come from to support an additional six billion internet users? It is unlikely that service providers will be willing to invest the very large sums needed to build out the infrastructure or to purchase spectrum if they are unable to monetize the investment. The raw user numbers may look appealing but at some level, someone has to pay for the services provided. Operators will be forced to focus on high ARPU generating services and solutions just to make ends meet.

Furthermore, with revenue under pressure, operators will be forced to squeeze their costs. This potentially impacts manageability and security just as six billion more Internet users come online, and history shows that more Internet users means more security and privacy risks, and more pressure on bandwidth.

It’s interesting that there are no mobile operators in the consortium at this point, and yet the support and engagement of the telco industry will be critical to the success of a wholly connected globe…

 John Hayduk is the President of product management and service development at Tata Communications.

Why immersive training is the only way to stop phishing attacks

16 Apr 2014
by
Scott Greaux

Phishing is a very effective, low-cost attack vector that bypasses most traditional detection methods, and it has been widely identified as one of the biggest security threats organisations face today. Cybercriminals now target a specific organisation and develop sophisticated phishing emails in a bid to trick employees into opening malicious attachments.

This causes great concern for organisations because not only can phishing have disastrous consequences for a company, but the emails are so cleverly crafted that they are extremely difficult to spot, even for the most well-trained eye.

As a result of the rise in targeted phishing scams, organisations must train their staff to spot these malicious emails. However, because of the ever-changing nature of phishing, training cannot be carried out through simple paper handouts or employee handbooks.

The security awareness training needs to be an experience that staff will actually remember and retain. Immersing a human in an experience triggers the brain in a way that traditional training doesn’t, by drawing an emotional response. In complex vertebrates (contrary to what some in security might say, your users do fit into this category), the amygdala is the area of the brain associated with both memories and emotions.

An emotional experience sticks in our memory, making training techniques that elicit emotions more powerful. This is why posters and conventional computer based training fall short.

One method of immersive training to help employees spot phishing attacks is to send staff mock phishing emails. Staff members who correctly identify the phishing email are commended, while those who do not receive training to help them identify future attacks.
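
As a sketch of what such an exercise might look like in practice, the snippet below sends a mock phishing email with a tracking link. All of the names here (the SMTP host, the tracking URL, the addresses) are placeholders for illustration, not part of any particular product.

```python
import smtplib
from email.message import EmailMessage

# Placeholder values -- substitute your own mail relay and campaign URL.
SMTP_HOST = "mail.example.internal"
TRACKING_URL = "https://awareness.example.internal/t"

def send_mock_phish(recipient: str, campaign_id: str) -> None:
    """Send a benign simulated phishing email for an awareness exercise.

    The link points at an internal training page that records the click,
    so staff who take the bait get immediate feedback rather than malware.
    """
    msg = EmailMessage()
    msg["From"] = "it-support@example.internal"
    msg["To"] = recipient
    msg["Subject"] = "Action required: password expiry notice"
    msg.set_content(
        "Your password expires today. Review your account here:\n"
        f"{TRACKING_URL}?campaign={campaign_id}&user={recipient}\n"
    )
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

send_mock_phish("alice@example.internal", campaign_id="q2-awareness")
```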

Repeating immersive training exercises capitalizes on a neurological process called long-term potentiation, which is how the human brain forms and retains memories. Memories form through repeated signalling across synapses between neurons, and repetition of those synaptic processes causes us to learn and retain information. Conducting annual training will not lead to retention, even if the training itself is compelling, because it won’t be frequent enough to stick in employees’ minds.

Whenever we are learning something new, whether it’s playing a sport or an instrument or speaking a new language, repetition is crucial. It’s the same with teaching email users safe behaviour: repeated security awareness exercises will allow them to make safe email use a habit.

Ultimately, immersing your employees in an experience will improve their behaviour. With that said, here are some ways to make your immersive security awareness training engaging:

Start simple

For the average user, security concepts are difficult to grasp, so start simple! Sending a beginner down a black diamond trail is a good way to turn them off of skiing forever (or worse, get them injured). It’s the same with security. Don’t trip up your users by starting them off with complicated concepts – get them on the beginner slope.

Be specific

Hollow platitudes will undoubtedly get your users to tune out. Avoid vague messages like “keep company resources safe”, instead give users specific, actionable information that will help them change behaviour.

Mix it up

How many of you pay attention to the airline safety demonstration prior to takeoff? That demonstration never changes, so ultimately people lose interest. Don’t make the same mistake with security awareness: vary both the content and delivery method of your security awareness training to continually engage recipients.

Keep it going

Why is it so easy to forget what you learned in a boring class? After the final exam, you don’t need the information, so there’s no need to retain it. Security, however, is a constant and changing threat; therefore, security awareness needs to be continuously reinforced. By training users at different times throughout the year, safe security behaviour becomes a habit, not something forgotten as soon as training is over.

Be positive

It might be tempting to expose the users who are security risks, but in our experience the negative backlash this generates will quickly undermine your security awareness program. Keep things positive by measuring the results of your program and recognizing people and departments who have done well. Educate and support those that need additional help.

Scott Greaux is a vice president at PhishMe

Will the CIO survive the cloud?

15 Apr 2014
by
Calum MacLeod

It’s a few years from now and the last known member of the species finally succumbs to “The Cloud”. The breed appeared from nowhere about twenty years earlier, around the time of another cataclysmic global event that became affectionately known as Y2K. From nowhere the breed became dominant, not only in its own environment; it very quickly adapted to feel comfortable mixing with other species, especially those higher up the food chain.

And for a while it looked like they would thrive in the corporate boardroom, but very soon their ravenous appetite for food, and the inability to contribute any lasting value, meant that they began to be regarded with suspicion by those around them.

Then, a number of years ago, a predator arrived on the scene that they simply could not cope with. They sought sanctuary in the land of Outsourcing, hoping against hope that this would be their salvation from the daily inquisition of the board, and that the Outsourcers would protect them and help them demonstrate value, but this too soon became a forlorn hope.

To try and stave off the threats, they used their power base to block any and every attempt by lesser mortals to improve the way the business worked. After all, who else but the CIO knew anything about IT? And along with the henchmen (CSO, CTO and Audit), every possible obstacle was set up to ensure that all power stayed within the IT department. Joseph Eger’s statement about administrative problems at Lincoln Center back in the mid-70s became an apt description of the situation in most IT departments: “Administrators are running around straightening out deck chairs while the Titanic goes down.”

So finally they fell victim to “The Cloud”, and apart from the rare anomaly found in public sector organizations, the CIO had ceased to exist.

Fact or fiction?

Nicholas Carr, in the excellent read “The Big Switch: Rewiring the World, from Edison to Google”, makes the following statement: “Today, we’re in the midst of another epochal transformation, and it’s following a similar course. What happened to the generation of power a century ago is now happening to the processing of information. Private computer systems, built and operated by individual companies, are being supplanted by services provided over a common grid—the Internet—by centralized data-processing plants. Computing is turning into a utility, and once again the economic equations that determine the way we work and live are being rewritten.”

I believe it is fair to say that IT has failed to live up to the hype, particularly corporate IT departments and outsourcers: “…in the end, outsourcing was not really a new business model or approach – just a shift in how internal IT was delivered and paid for.” (Charles Araujo, The Quantum Age of IT)

Regardless of where you get your analytical data, the conclusions are horrendous. Capital expenditure on IT has risen in the past 50 years from less than 3% of corporate CAPEX, to over 50% in many organizations. And yet when you look at the Return on Investment, it is extremely difficult to find many organizations where the investment has provided a significant business advantage.

Compounding the problem has been the monotonous repetition of failed projects and budget overruns. A study by the Standish Group produced the almost unbelievable result that only 9% of projects in large companies succeeded, and only 16% overall were considered a success; in other words, completed on time and on budget. Compare that with the statistic that over 90% of projects had to be restarted, many of them several times.

A similar study by KPMG was slightly more pessimistic. Over three quarters of companies surveyed said that their projects substantially exceeded budget, in many cases by over 50%. And what doesn’t help is the pervasive “lemming mentality” within IT. How many failed implementations of a technology are required before enough is enough? CIOs have a lot of interaction with each other, and you would think that a topic of conversation would be “what to avoid”. Or maybe not.

Certainly much of this can be put down to failures resulting from companies trying to gain a competitive advantage by adopting new technologies. One only has to follow the insanity of BYOD and mobile since the arrival of the tablet.

How many executives received an iPad as a Christmas present, only to show up at work the very next day demanding to have their corporate email on the device? And why not? It was a seemingly reasonable expectation, given that you could do everything else with the device.

This was followed by a knee-jerk reaction from IT to try and discover a method of doing this securely and, lo and behold, the next thing was an avalanche of projects to do with Mobile Device Management. Today many of these projects have foundered, usually because IT focused on finding El Dorado without really looking at the business objectives. In other words, you end up with BYOD in the organization with pretty much every useful feature disabled because of a real or perceived security risk. You may have right on your side, but it doesn’t carry much weight with the business.

The vultures are gathering

A day doesn’t go by without some new risk being identified in the world of IT, and it only takes a few weeks for technology vendors to claim to have solved the problem. But there have been so many false dawns; whether for BYOD, MDM, AV or APT, whatever the acronym, these solutions often fail to deliver on their lavish claims. And of course everything on offer is “Enterprise Ready”, yet frequently these are little more than point solutions that cost three to four times as much to implement as the technology itself, and rarely deliver on the promises.

Today every CIO is on the back foot, and looking for help. And those offering a panacea are lined up at the door, whether they be vendors, consultants, analysts, whoever. But in general the focus is on IT and not on the business, and CIOs are being asked to provide business value from IT when all their business competitors have access to the same technology. It’s an impossible ask if they continue to try and protect the IT territory, and continue to follow outdated traditions.

Is cloud the saviour or the grim reaper?

Ultimately it will depend on how the CIO responds. But without a doubt, ignore cloud at your peril, and avoid specific solutions that are not available as Software as a Service (SaaS). Your customers are not expecting you to reinvent the wheel; they’re looking to you to provide them with the services they need.

So will the CIO survive “The Cloud”? That depends on whether they can evolve.

Calum MacLeod is VP of EMEA at Lieberman Software

Security an after-thought for mobile shoppers

11 Apr 2014
by
Michael McLaughlin

Despite the recent surge in mobile shopping, shoppers do not have security in place to protect the identity and credit card data stored on their devices, but they’re not going to lose any sleep over it, according to a new survey from Tripwire.

The survey, which looked at the attitudes of 1,000 consumers, revealed that 32 per cent of respondents have their mobile device linked to a bank account or credit card, yet almost half of those do not have any security on their mobile, and 39 per cent of respondents believe the convenience of mobile shopping overrides security concerns. The study also revealed that 36 per cent of consumers have their mobile phone linked to a corporate network, yet 18 per cent of these people do not have any security installed on their device.

In response to the findings, Gavin Millard, Tripwire’s EMEA technical director, said: “The results from our survey highlight the fact that consumers are still not recognising that cybercriminals are targeting mobile devices to collect personal information and for financial gain.”

“Because mobile devices are linked to corporate networks and credit cards whilst storing a huge amount of private data, there is a much greater need to keep them secure and protected. Adding a PIN to your lock screen, checking the validity of apps before installing them, and ensuring the SSL certificates on shopping sites are correct is easy to do and will help keep the data stored on mobile devices safe from cybercriminals.”
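Millard’s last suggestion, checking that a shopping site presents a valid SSL certificate, is something even a short script can do. Below is a minimal sketch using only Python’s standard library; the hostname is a placeholder and this is one way to perform the check, not a Tripwire tool.

```python
import socket
import ssl

def check_certificate(hostname: str, port: int = 443) -> dict:
    """Connect to a site and return its validated certificate details.

    ssl.create_default_context() verifies the certificate chain and
    hostname; an invalid or mismatched certificate raises an SSL error.
    """
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

cert = check_certificate("www.example.com")
print("Issued to:", dict(x[0] for x in cert["subject"]))
print("Expires:", cert["notAfter"])
```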

UK office workers faced with daily onslaught of phishing emails

10 Apr 2014
by
Michael McLaughlin

UK office workers are bombarded with phishing emails, with the majority seeing as many as ten attacks hitting their inboxes every day, according to a new survey from PhishMe.

The survey was conducted by OnePoll and looked at the attitudes of 1,000 UK office workers towards phishing attacks. The findings revealed that over a third are seeing more phishing emails today than they were a year ago, and that 16 per cent of office workers claim to have already fallen victim to a phishing attack.

In response to the findings, Rohyt Belani, the CEO of PhishMe, said: “These figures highlight exactly how big a problem phishing and malware attacks are to UK organisations. Spear-phishing emails are contextual, focus on triggering an emotional response, and target specific groups, which makes them very difficult to spot.”

“Today, threat actors will undertake extensive research into their targets to make their emails appear genuine and increase the chance of a recipient taking action. Over the last year we have seen a significant increase in the sophistication of phishing emails; attackers even emulate conversations via email to build confidence with the potential victim before launching the attack. Organisations must enhance their security defences with a continuous programme where they train their staff how to recognise and report phishing emails in a timely manner,” continued Rohyt.

Combating the insider security threat

09 Apr 2014
by
Kevin Waugh

Insider security breaches are increasingly part of the day-to-day reality for businesses and, as a result, forensics investigation has moved from the world of TV dramas into the boardroom. Internal security risks (such as the theft of IP-protected documents by employees moving to a rival company) are a growing issue, while cyber espionage (exploiting holes in organisations’ security) is estimated to cost the UK economy billions each year.

The Verizon 2014 Data Breach Investigations Report found that 14 per cent of all data breaches in businesses were linked to insiders. Even more worrying, information management firm Iron Mountain last year reported that 8 per cent of UK employees claimed that, if they felt they had been poorly treated by an employer, they would have few qualms about taking revenge by stealing confidential or sensitive information.

It’s a growing trend, and one that has led my colleagues and me at the Open University to work closely with industry to incorporate brand new, business-relevant forensics and information security modules into our recently launched postgraduate qualifications in computing.

Despite the impression given by popular US TV crime dramas, there’s far more to forensics in the business world than technical geniuses working with slick computers and gadgets. This misunderstood discipline is fast becoming a requirement for the ongoing integrity of any organisation, and therefore requires understanding and engagement from a number of areas of business operations.

While the latest digital toolkits can highlight weaknesses and identify current leaks, just as much can be achieved by implementing robust, well-understood internal procedures, and only with the right legal understanding can businesses properly investigate, prosecute and claim against acts of insider theft. However, the reality is that many organisations lack even a basic understanding of the legal system in this area, and can initiate actions that end up putting them on the wrong side of the law.

Proving an employee had access to a certain file can be straightforward, but gathering and maintaining evidence that it was stolen without infringing on their employee rights is a difficult process. For example, untrained investigators will often assume that they are entitled to access and search employee emails for information, despite this being an illegal activity.

Add to this the complications thrown up by the introduction of new technological systems and processes like cloud storage and BYOD policies, and there’s a clear need to provide staff with comprehensive and up-to-date information on the limits of their investigative powers. Businesses must be in a position to readily call upon in-house employees with up-to-date skill sets who understand the legal landscape, particularly if they want to avoid finding themselves in front of an employment tribunal to defend their actions.

Historically, it’s been the technical experts who controlled investigations. Today, these skilled employees should no longer be expected to reside purely within the technical parts of the business. The myriad legal ramifications and employee rights in this area mean that the responsibility to understand, direct and control internal investigations lies as much with HR as it does with IT. With specific situations requiring individual interpretation, and a great deal of context to be understood in these investigations, there is a need for a balance of skills from different sectors of an organisation. In developing our forensics module, we have kept both of these audiences in mind.

The postgraduate programme has been specifically designed to develop relevant and recognised skills that employers want throughout their organisation. This means not just providing technical expertise around how to access and analyse employees’ digital trails, but also working within a legal context to ensure organisations remain within the law when addressing internal vulnerabilities.

Employees can often be the source of many of the vulnerabilities identified by digital forensics and security professionals. With the right knowledge and skill sets, however, they can also be part of the solution. Businesses in almost every industry can no longer consider themselves detached from cyber crime, which affects all organisations. Those that have the right policies in place, with skilled employees across the organisation ready to examine and act on new evidence with clear and informed judgement, will find themselves in the strongest position.

Dr Kevin Waugh is the programme director for Postgraduate Technologies & Computing at the Open University. He has just overseen the incorporation of the University’s first business-focused digital forensics modules into its brand new postgraduate computing qualifications.

Welcome to the broadband transformation, whatever your post code

02 Apr 2014
by
Nick Denker

Say “rural broadband” to a group of people and it is likely to elicit all manner of responses. The subject has been a topic of heated discussion for a while, but it is only recently that the Government finally announced it will provide the £10 million in funding needed to get the remaining five per cent of Britons in remote locations connected to the Internet.

The announcement came around the same time as the new Broadband Chief of BDUK was revealed as Chris Townsend, who, as commercial director, played a crucial part in the successful delivery of the London 2012 Olympics. He will be driving this new broadband delivery programme, including the ambitious project of delivering superfast speeds to the already-connected 95% of the UK by 2017.

When looking at installing fibre in rural areas, the expense has previously been the biggest deterrent for broadband operators. However, the £10m grant will enable alternative technology providers to demonstrate, through testing and trials, how their methods could be a cost-effective and reliable solution. Possible candidate technologies that could be piloted include 4G, satellite technology, and extending fibre direct to premises, or to a distribution point further down the network.

Local authorities will support these pilot projects, which will be undertaken across the country by March 2015, to aid the Government in identifying solutions for including remote locations in the broadband transformation.

The Role of Satellite

Satellite technology is among the alternative technologies that will be considered to help connect the remaining five per cent of Britain not included in BDUK’s main superfast infrastructure scheme.

Satellite Internet is already successfully providing residents and businesses in rural areas across the country with high-speed, reliable broadband. The connection is provided by a specialist satellite Internet Service Provider (ISP) and uses the same satellites used for TV – no phone line or mobile network is needed.

Broadband Everywhere, a satellite ISP based in Birmingham, UK, has already seen success with a similar Government scheme in Wales, Access Broadband Cymru. The initiative offers funding to premises in Wales where the broadband speed is below 2Mbps. Since satellite broadband is a low-cost alternative, Broadband Everywhere has helped many participants using the scheme to connect to the Internet with free-of-charge equipment and installation.

Using SES, Europe’s leading provider of satellite services, to deliver SES Broadband Service via SES’s ASTRA satellites, Broadband Everywhere offers download speeds of up to 20Mbps and upload speeds of up to 2Mbps.

As a result hundreds of Broadband Everywhere customers are no longer living in the digital dark ages, and instead are driving at full speed in the digital fast lane. Businesses can do their online invoicing, submit orders and manage their website with ease, while families need never miss their favourite television shows again, as websites that require fast download speeds, such as BBC iPlayer and Netflix, can be accessed using the UK IP address provided with the service.

More users are embracing satellite broadband each day.

A Digital Savvy Nation

The grant, which will allow alternative technology providers to step up and show how they can be part of the broadband revolution, opened for applications on 17 March 2014. Successful candidates will help the Government decide how to spend the final £250m slice of funding, which has been put aside especially for rural communities. It seems the nation may finally be about to say goodbye to its rural broadband affliction, with the new BDUK broadband chief commenting:

“Ensuring that broadband can reach businesses and consumers across the country is one of the most important policies in Government. Faster connections will improve the way people live, work and spend their leisure time. I look forward to starting my new role as chief executive of BDUK and building on the good work being done to get superfast broadband to people all over the UK.”

The scheme is part of the Government’s long-term economic plan, with every £1 invested in broadband returning an estimated £20 of benefit to the UK economy, making broadband instrumental to the country’s innovation and growth.

In the past, broadband rollout schemes have been “mismanaged” and have not met their targets. However, now that the Government is beginning to think about the alternatives, this may mark the beginning of closing the gap between the digitally deprived and the digitally savvy.


Nick Denker is technical director at Broadband Everywhere.

Identity and Access Management seeks its real identity

02 Apr 2014
by
Colin Miles

The Gartner Identity and Access Management (IAM) summit took place in London on 17 and 18 March 2014, with this year’s event bringing together a large community of analysts, technology partners and customers to consider the state of the industry and the challenges ahead. Topics ranged from traditional demands, such as IAM programme management and best-practice routes to ROI, through to a meeting of minds around the dynamic ‘Nexus of Forces’ (cloud, mobile, big data and business socialisation) and how IAM solutions need to adapt to meet changing demand.

The Gartner IAM summit isn’t just another industry event for us but a real landmark in our calendar. It’s here that we get the chance to take stock of the industry around us, validate some of our thinking and/or set our minds to how we had better adapt our approach to meet the moving targets our customers set.

For those who didn’t get the chance to go, here are a few observations from this year’s IAM summit:


IAM seeks its real identity

Ironically, the IAM industry has often lacked a sense of identity. Just what problem is IAM trying to solve? Security? Compliance? Cost saving and business efficiency? All of the above? And just what defines an IAM solution? Do customers need enterprise-grade IAM suites for cradle-to-grave user lifecycle management, or point solutions to deliver control or keep out the bad guys?

A quick tour of the sponsor showcase floor is as good a method as any for taking a quick snapshot of how vendors are lining themselves up to market solution portfolios against the most pressing concerns of the enterprise budget holder. This year’s event wasn’t lacking in quality or quantity of offerings, but it remains difficult to isolate a common clarity of vision across the field of suppliers, from strong authentication providers and privileged identity management vendors to GRC tool and major suite providers.

What is clear however is that customers are going to need help identifying and integrating best-of-breed technology into their delivery models be they cloud or on-premise, and they are going to have to find a way to do this without becoming locked in to dependencies on individual components in what is a still maturing IAM market.

There is more effort needed for IAM to win over the business

We haven’t been alone in confidently sending out the message that, with IAM, IT security can get closer to the holy grail: becoming a business enabler rather than continuing its life as the unloved inhibitor of user efficiency that it sometimes seems.

This has been well borne out by the year-on-year increase we have seen in customer projects which leverage IAM to deliver new portals and services to customers and employees alike. Some realism is still needed here, however.

Were the sessions on IAM stakeholder management at this year’s summit more sparsely attended, or had the topics discussed in them moved on significantly from previous years? Possibly not, but there are few misconceptions in IAM as to the shifting focus of CISOs and business leaders, and most of the developments in the industry are lining up to be fully on track with enabling that full IT vision.

Standards are sticking

IAM certainly isn’t the only corner of the IT industry where the battle between the best intentioned open and standards based frameworks versus the necessary ‘evil’ of proprietary based solutions has been fought. It’s been an attritional battle at times, and looking back over the journey it’s interesting to compare the hype of the past versus the reality of today.

So where are we now?

SAML adoption is well underway, OAuth and OpenID Connect bring together a few loose ends and, though it is early days (in standards-adoption terms at least), SCIM is showing strong potential. How close are we really to a point of standards maturity and convergence, though? A look at almost any real-world IAM implementation today will tell you that we are not there yet. Legacy dinosaurs and ‘interesting’ sticky-tape workarounds may lurk around any corner, of course, but the presence of readily exploitable interfaces cannot be guaranteed in all SaaS offerings.
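
To give a flavour of where this is heading, here is a hedged sketch of user provisioning over SCIM 2.0 using Python’s requests library. The endpoint URL and bearer token are placeholders; real SaaS providers vary in how faithfully they implement the spec, which is precisely the maturity gap described above.

```python
import requests

# Placeholder endpoint and credentials for illustration only.
SCIM_BASE = "https://idp.example.com/scim/v2"
TOKEN = "example-bearer-token"

def provision_user(user_name: str, given: str, family: str) -> str:
    """Create a user via a SCIM 2.0 /Users endpoint and return its id."""
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given, "familyName": family},
        "active": True,
    }
    resp = requests.post(
        f"{SCIM_BASE}/Users",
        json=payload,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/scim+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["id"]

print(provision_user("c.miles", "Colin", "Miles"))
```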

Trends are being observed however and identity focussed standards are becoming more productised. This gives us good reason to be optimistic that future IAM challenges will continue to be more around the what than the how.

Keeping the bad guys out shouldn’t impact the good guys too

For some time analysts have also been promoting the idea of “People Centric Security” (PCS). Here the notion of ‘least privilege’ security is turned on its head to follow the thinking that “everything that isn’t forbidden is allowed”. This may be enough to strike fear into the heart of any security administrator or product owner who has accountability for a high value asset, but the advice of course is to take a pragmatic approach.

Analysts stress that PCS presents an opportunity to cut bureaucracy and costs while increasing staff morale and agility. It’s all about finding the right balance between cost, value and security. Applying PCS principles in the right areas, and against the right assets, brings an opportunity for real business benefit. Note, however, that more and more, user-centric solution design at every level will be mandated as the norm rather than the exception.

IAM is an opportunity to enhance your brand

In an environment where “every user is a consumer”, it bears highlighting that a user’s first interaction with a service will typically be through the IAM layer. This may be for (self-)registration, logon, or perhaps for account management or password reset requests.

What this means is that IAM is uniquely positioned to enhance a user’s perception of a service. IAM user journeys need to focus on delivering a first-class user experience, making these operations clear, simple and easy to complete quickly. Furthermore, your branding shouldn’t be left behind the front door. With user-centric solutions coming more to the fore, we can exploit IAM to build a strong user experience right from the start.

The Identity of things

The Internet of Things leads us to the Identity of Things. Big data, SIEM and IAM lead us to identity and access intelligence (“identity intelligence gets a brain”). There is widespread acceptance that these areas are a given rather than predictions, but the IAM industry has some collective head-scratching to get through to deliver business value from the promise.

What’s more, IAM needs to keep pace with the demands for new delivery and pricing models that are shaping IT generally. Bringing IAM to SaaS leads us to IDaaS (Identity and Access Management as a Service), a model that needs to embrace all of the opportunities and challenges discussed here. The IDaaS solutions of tomorrow need to be clear in vision, articulated to the business, adaptable for integration with both cloud and on-premise solutions and, perhaps above all, focused on meeting the high expectations of the end user in order for the services they enable to deliver the business value needed.

Colin Miles is CTO at Pirean

How vendors can adopt the right approach to the data integration market

01 Apr 2014
by
Yves de Montcheuil
Yves is the vice president of marketing at Talend.

2014 will see opportunities for data integration vendors increase as businesses ramp up their focus on data and as big data technologies mature. To take advantage, vendors need to adopt the right approach to data integration.

IT has a key role to play in driving the success of most businesses today but during 2014 we expect to see companies move beyond pure IT to focus on using data to transform their business. Big data will start to fulfil its potential as projects move out of the sandbox and into real commercial applications. Businesses will increasingly use big data to monetise their information assets, create new business models or take advantage of unexplored market segments.

With the big data market maturing rapidly, data integration will become increasingly important as businesses access a greater variety of data from a greater range of sources across the organisation and use it to drive analytics, business intelligence and strategic decision-making.


Businesses looking to make purchasing decisions in this market will, however, face a choice between different vendor approaches. The main distinction is between legacy proprietary vendors and those with an open, community-centric approach.

What the market expects

Buyers of data integration solutions are looking for scalability and business agility. Above all, they want a single platform that meets the needs of their business model, supports speed of deployment and ultimately rapidly leverages the data for business advantage. It is important also that the solutions and the approach used are intuitive, that they effectively make big data integration challenges easier to overcome and that the business can ‘upskill’ staff easily and quickly, making them ready for big data challenges today and in the future.

Businesses are increasingly crying out for an inclusive approach based around collaboration and partnership. In addition to their technology needs, they require experts to provide them with guidance and consultancy.

Vendors can play to this need by developing community editions that can be seeded across the account. This open, inclusive approach needs to extend to the technology platform also. Customers expect vendors to meet their data management or integration technology needs, whatever they may be. They want them to make their big data integration challenges easier to overcome.

To support requirements around scalability, ease-of-use and business agility, vendors must offer a truly unified platform that delivers comprehensive integration across the enterprise.

Buyers in this market need pricing and licensing models that make it easy to work out total cost of ownership from day one. And they want to be in control of the buying process. That typically means an incremental adoption path, delivering scalable integration at a predictable cost and driving short-term return on investment (RoI).

What doesn’t work

Unfortunately, these customer demands are not always being met. Some vendors continue to offer solutions that may deliver rich functionality but are not easy to use, work with or maintain, and offer poor integration. Vendors may pursue aggressive acquisition strategies and ‘buy in’ technical capability, but often fail to properly integrate the different systems.

We are hearing of instances in which solutions, sometimes known as crippleware, do not deliver what has been promised of them. We are also seeing solution providers getting their pricing models wrong and levying a so-called ‘data tax’ on systems, effectively delivering complex, unwieldy solutions that require more money to be spent on upgrades as data volumes grow.


In particular, we are seeing the gradual decline of the vendor-centric proprietary approach to the marketplace and its replacement with a more collaborative, open model. In my view, there are deep systemic reasons for the decline of proprietary vendors. The principal ones are as follows:

  • Their business is steadily eroding: The new business vs recurring business mix is slowly but surely degrading. Over time, proprietary vendors have become increasingly dependent on selling more runtime licences, more services, and hopefully new projects to existing customers.
  • Their positioning is not in sync with the market: While proprietary vendors in this space often have great products and reliable technology, they also typically carry many years of legacy, and often unwieldy and cumbersome processes. As a result, their systems are unlikely to be backwards compatible and may struggle to take advantage of the latest technological advances.
  • The perpetual licence approach does not align with market expectations: The sky-high prices typical of this model date back to the mainframe era, and the approach is becoming increasingly unpopular with customers. Nor is it sustainable in the long term, as it is neither scalable nor ‘future proof’. Also, with this model, spending is typically CAPEX in nature. Not only will this require a much higher level of sign-off, it will also require a totally different accounting approach and have a bigger impact on the bottom line.

Finding a better path

So what do vendors need to do in order to build an approach that delivers business benefits for their data integration customers? The scale of the task is ramping up as the big data market starts to grow more quickly, with 2014 looking set to be the year of practical implementations as early adopters go live and start to reap the rewards.

The data integration solutions that support this new big data world deliver scalability both up and down. Today, change is happening continuously in the enterprise – and integration needs to help make this change happen. In particular, it needs to provide more value more quickly than before to all types of businesses large and small.

Scalability in this context could mean being able to scale the development team’s skillsets across all levels of integration – data, application, and business process. It could also mean being able to leverage the distributed power of today’s big data processing technologies like Apache Hadoop to deliver infinite scalability at a finite (and predictable) cost.
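
To make the Hadoop point concrete, here is a sketch of that scale-out model in Python using Hadoop Streaming, where a job is just two small functions reading stdin and writing stdout; the same code runs unchanged on one node or a thousand. The file paths and invocation in the comments are illustrative, not from the article.

```python
#!/usr/bin/env python3
"""Word count as a Hadoop Streaming job: run with "map" or "reduce".

Illustrative invocation (paths and jar name are placeholders):
  hadoop jar hadoop-streaming.jar \
      -input /data/in -output /data/out \
      -mapper "wordcount.py map" -reducer "wordcount.py reduce"
"""
import sys

def mapper() -> None:
    # Emit a (word, 1) pair for every word read from stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer() -> None:
    # Sum the counts per word; Hadoop delivers keys already sorted.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current and current is not None:
            print(f"{current}\t{total}")
            total = 0
        current = word
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```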

Going forward, there also needs to be an acknowledgement of the value of total data management. Big data is about all the data that organisations hold. It should incorporate all data sources and types – large and small, ‘slow’ and ‘fast’, traditional and emerging, in-house or external. It is not limited to specific nucleuses of data or confined simply to emerging data sets. It is about the totality of data in the enterprise.

As a result, solutions in this space should enable organisations to unlock all of their data, including historical, live and emerging data, whenever they need it, allowing them to realise the vision of achieving instant value from all their data.

Making the right choice

Data integration is becoming increasingly critical to businesses, driving benefits that range from rapid deployment of core operational strategies and enhanced data quality to streamlined, more accurate decision-making.

Big data is becoming increasingly important to businesses. And to leverage big data effectively, businesses will need new technologies and new approaches, including innovative forms of information processing for enhanced insight and decision-making.

In meeting these requirements in 2014 and beyond, it is vital that organisations choose the right vendor to help them. Today, the proprietary approach to delivering data integration solutions is increasingly seen as outmoded, and even as a hindrance to the achievement of key business goals. It is being replaced by a more customer-centric model, in which flexibility and transparency are key attributes, as is the ability to deliver a truly unified platform and, through the latest open source technology, drive competitive edge.

Why IT Needs Social Business

28 Mar 2014
by
Mike Westlund

We in IT spend so much time trying to deliver productivity to the enterprise that it’s easy to overlook one area of critical importance: the productivity of IT itself. What other function has as broad a mandate and requires excellence in such varied competencies? IT needs to provide outstanding customer service, keep operational systems running at 99.99 per cent availability and secure the enterprise from hordes of hackers, all while delivering strategic solutions to the business. Oh, and don’t forget about innovation!

Unfortunately, and not surprisingly, in the face of such disparate demands IT commonly delivers lukewarm results and develops a reputation for being… unproductive. But what if there were a way to change that, to turbocharge effectiveness and increase the productivity of IT dramatically?

Before I started working for a company that deployed social business technology, I searched for productivity in the usual places. Collaboration was done primarily via email. Corporate communication was handled by the much-maligned company intranet. Important documents went largely unused, and far too much time was spent managing the “ERP” system of IT, the ITSM. All along I would regularly extol the virtues of collaboration to my team and exhort them to make the best use of existing technology. But I always felt like Sisyphus from Greek mythology, dutifully pushing my boulder uphill, and I couldn’t escape the feeling that something was missing.

Today, social business collaboration systems address this need. Social business is at the heart of my company culture and IT infrastructure. It’s our intranet, our departmental portals, our document system and the central hub that fosters all of our collaboration. And in IT, we use it for every process across the ITIL spectrum, from strategic alignment to release management to service desk. We couldn’t run IT without it. Here are specific examples of how social business has turned our lives around.

Email

I’ll start by describing the wonderful thing that happened with email: I get less of it, in fact about a fifth of the email of my peers. The remaining messages go into the collaboration system in the form of group discussions, which don’t clog my inbox and which I browse at my leisure. It turns out that much of the email I used to get didn’t actually need my immediate attention, or even my input. Who knew?

Corporate Communication and Alignment

With so much spare time not reading emails, my team and I can focus on effective communication with the enterprise. Since the whole company uses the social system, we have a holistic perspective, with visibility and reach into every part of the organisation. My team writes blogs to share our point of view, we keep up to date on the strategic priorities of other business units, we announce new product releases and, importantly, we keep an eye on the sentiment of the employee base to see how people react to IT issues, policies and changes. I cannot stress enough the importance of this feedback, and of the opportunity for dialogue that it provides to calibrate and manage IT delivery and reputation.

Customer Service

Having everyone on the same social business system has other tremendous benefits. Behaviours that people wouldn’t exhibit when they had to jump between different systems suddenly come to life when they live in a single, elegantly designed user interface. A good example of this is customer service. We use our social system as a front end for IT support. Employees can ask questions and have easy access to knowledge base articles, training documents and other important IT support information. Because this information is so accessible, end users will often help themselves before submitting IT helpdesk tickets. And better yet, employees from other departments will regularly jump in to answer questions and write official solutions, deflecting work from the helpdesk team!

Operations

I’ve long believed that qualities like creativity, problem solving and continuous improvement form the core of any successful operational culture (and for you Lean practitioners, you know what I mean). But in order to create an IT culture based on these qualities, you need to enable, or better yet, unleash collaboration. Today, using our social business system, collaboration between the applications, security, infrastructure, and helpdesk teams is easy, dare I say, enjoyable. Team members brainstorm, create, argue over and ultimately utilise living documents like standards, policies, and procedures in a way that I have never seen before. They OWN their work, which has created a culture of excellence.

In summary, the siloed systems of old have not delivered on the promise of productivity, and in fact have resulted in its opposite: a workforce that has simply given up and resorted to email as the lowest common denominator for getting work done. Social business systems are a new way for IT departments to fundamentally change the way they work, and an opportunity to finally deliver on the latent productivity that you always knew was there, locked away in your organisation.

Mike Westlund is the head of IT at Jive Software.

How is the Internet addressing educational inequality?

27 Mar 2014
by
Axel Pawlik
Axel is the managing director of the RIPE NCC.

Access to the Internet opens up resources and knowledge to new audiences. As such, it is helping to tackle a range of important issues, including the problem of delivering education to those in need. UNICEF estimates that 61 million children are unable to receive the education they deserve. But the Internet can help change that.

Schools are out, MOOCS are in

In 2012, India witnessed a big change in Internet use. For the first time, local Internet traffic from mobile devices, such as smartphones, exceeded that from traditional desktop PCs. This demonstrates how the Internet is becoming more democratic, as mobile devices are much cheaper to purchase and run than desktop computers.

With a staggering population of more than a billion citizens spread across more than 3.2 million km², India has formalised plans to build out and future-proof its Internet infrastructure in order to facilitate social equality. Widespread Internet access for everyone, with a focus on education and healthcare, will change hundreds of millions of lives.


Similarly, Pakistan is using technology to improve its educational facilities. Across the country, there are currently 183 higher education institutions with broadband Internet connectivity. As the benefits of this have been felt throughout these institutions, they intend to grow this figure to 600 in the coming years.

The Internet will also change education in the developed world. Earlier this year, US President Barack Obama announced an initiative with the Federal Communications Commission designed to bring high-speed Internet access to public schools all across the US. His initiative is supported by the success of so-called MOOCs (massive open online courses). At Harvard, more people signed up for MOOCs in a single year than have attended the institution in its entire 377-year history.

Meanwhile, in the UK, FutureLearn has been launched, giving the public access to free online courses provided by British and international institutions.

These examples show how Internet access can open doors to education in cultures and regions where previously they might have been shut. And as Internet access improves, the possibilities for education are expanding. Students can gain access to specialist educators, while teachers already in schools can be supported with more training options. It also means that students can pursue subjects of interest and watch educational videos outside of school hours.

Breaking down the barriers

However, there are several challenges to these lofty ambitions. Internet infrastructure is the first, but this is a work in progress: the Internet is already available to half of the world’s population, and companies such as Google are devising ways to reach even those in remote areas, with ideas like Internet-connected hot air balloons. The cost of hardware is another major issue, but this is becoming less of a barrier as costs drop.

Another less obvious piece of the puzzle is ensuring the world has a large enough pool of IP addresses. Every device that connects to the Internet requires an IP address, a unique identifier that is used to communicate with others on the Internet. The old standard, IPv4, was devised during the 1970s when the Internet was born. Back then, nobody could have anticipated how popular and useful the Internet would become, so the 4.3 billion addresses in IPv4 were thought to be plenty. Now we have started to run out of them across the world, and this is putting pressure on network operators, who have more and more customers and devices looking to connect to the Internet.

Thankfully, there is a new protocol called IPv6, which allows for 340 trillion trillion trillion IP addresses and would ensure that no one is denied access to such an incredible resource. It’s critical that Internet service providers all over the world start to deploy IPv6, because this is the only way to safeguard the future growth of the Internet, and that growth is integral to supporting IT-based education across the world.
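
Those numbers are easy to check for yourself: IPv4 addresses are 32 bits wide and IPv6 addresses are 128 bits, and a few lines of Python make the difference in scale explicit.

```python
import ipaddress

ipv4_space = 2 ** 32    # 4,294,967,296 addresses
ipv6_space = 2 ** 128   # roughly 3.4 x 10**38 addresses

print(f"IPv4: {ipv4_space:,} addresses")
print(f"IPv6: {ipv6_space:,} addresses")
print(f"IPv6 is {ipv6_space // ipv4_space:,} times larger")

# The same 128-bit width shows up in the address format itself:
addr = ipaddress.ip_address("2001:db8::1")
print(addr.version, addr.max_prefixlen)   # prints: 6 128
```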

Education is a basic human right, which should be made as accessible as possible to as many people as possible. With so many examples of society utilising the Internet to support education right now (and many more likely just around the corner), we need to encourage continued investment in the Internet’s infrastructure. By doing this we will ensure that one day, in the not too distant future, everyone will have quick and easy access to the collective knowledge of the entire world.

Can tech innovation boost uninspired manufacturers?

26 Mar 2014
by
Steve Winder

Despite repeated quarters of growth and output suggesting that things may be on the up for UK manufacturers, new research shows that, without technology-led change, the sector’s long-term prospects could be less rosy. In fact, technology innovation might just be the number one factor in helping manufacturers bridge an ‘inspiration gap’ and improve their future fortunes.

In Epicor's "Inspired to Make It" study, senior manufacturing executives scored their company's Inspiration Rating – a measure of encouragement about their future – at just 5.7 out of 10. Moreover, 62 per cent think their industry has to adapt significantly or completely to survive. In terms of where that change must come from, technology innovation is top, alongside pay and conditions, while the pace of technology development is also considered the biggest external influence likely to affect a manufacturing company's performance.

Conversely, the study showed a possible technology adoption lag, with only 17 per cent of manufacturers having adopted cloud. While this is set to increase, the study revealed a tendency to stick with the old rather than move to the new in terms of IT – a general preference for desktop PCs over tablets, for example.

Considering the economic pressures of the last few years, innovations like the adoption of cloud-based solutions have been difficult for manufacturers to justify against competing demands for capital and resources. However, as the study suggests, cloud growth will come as manufacturing organisations seek to extend business processes into their value chain with customers and suppliers, improving customer service and driving out cost.

This point on cloud adoption hints at how technology innovation may play out in the sector. Generally speaking, the study showed that product quality and price competition are manufacturers' critical success factors. With this in mind, manufacturers should be prioritising technology investments that enable continuous improvement in business processes, both driving down costs and driving up product quality.

Technology that allows UK manufacturers to compete on other fronts, such as service quality, is also going to be vital – a point echoed by the major drive we have seen to integrate CRM and after-sales service systems with core ERP platforms, aimed at improving management of customer relationships throughout the product lifecycle.

It is positive that UK manufacturing has seen growth, but the sector must address its attitudinal inspiration gap if it is to be motivated for long-term success. This is where technology innovation comes in, and we need to see more of it from this point forward.

Steve Winder is the Regional VP of the UK and Ireland at Epicor, an Enterprise Resource Planning (ERP) software and retail solutions provider.

How to ensure your BYOD policy doesn’t fail

25 Mar 2014
by
Rick DelGado
Rick is a freelance writer with an interest in new technologies and what they can do for us and our planet.

Bring your own device (BYOD) seems to be everywhere these days. Businesses are turning to BYOD policies as a way to cut down on costs and improve employee efficiency. There are certainly many advantages associated with BYOD, but many companies are finding the policies difficult to implement. In fact, a recent report says about one in five BYOD programs will fail by 2016. So if there are so many upsides to BYOD, why are so many programs expected to fail? A number of issues seem to be fairly commonplace, but there are solutions available.

A quick definition

BYOD is pretty straightforward at first glance. It’s a program or series of policies that a company adopts, allowing its employees to bring their personal devices to work in order to use them for business-related tasks and duties. Most businesses will put in place certain rules that allow for BYOD to work smoothly, but the intent is to cut down on IT costs while giving employees the opportunity to use devices they’re already familiar with.

Problem: Privacy concerns

There are a number of big reasons why experts believe BYOD programs will fail at so many companies. Many employees may choose not to bring their personal devices to work because of significant concerns about the overall program. For one thing, they may worry about privacy. BYOD policies usually require IT staff to have some kind of access to the device, and even though this access is usually fairly limited, having even a little personal information visible to someone else can make workers uncomfortable.

Solution: Separation of professional and personal lives

Establishing a good "fence" between work applications and personal applications can help ease many of these concerns. As long as employees use work applications for work, IT staff will not need any additional access. Clear communication about which files can and cannot be accessed by others also helps employees understand what will be monitored.

Problem: Personal expenses

Employees may also be concerned about increased personal expenses. By using their own devices, workers have to deal not only with the cost of the devices but with their upkeep as well. They are also responsible for data charges on mobile devices and for the cost of any applications they may need for work. These extra expenses may deter them from participating in a BYOD program.

Solution: Reimbursements

Companies can set up special reimbursement plans to compensate workers for app costs or data surcharges. As long as employees are using the devices specifically for work purposes, this can be figured into the budget.

Problem: Poor planning

Picking the right policies for a BYOD program requires careful consideration. Often, overlooking certain issues can lead to significant problems. Companies have to approve which apps, platforms, and services are compatible with their policies. If done haphazardly, there’s a possibility of harmful apps being downloaded to devices intended for work use, which could lead to security issues.
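To make the approval idea concrete, here is a toy sketch of the kind of check an app-approval policy implies. The app identifiers and the approved list are entirely made up for the example; real BYOD management tools implement this in their own ways:

```python
# Hypothetical allowlist check: flag installed apps that a BYOD policy
# has not explicitly approved. All app IDs here are invented.
APPROVED_APPS = {"com.example.mail", "com.example.calendar", "com.example.vpn"}

def unapproved(installed_apps: set[str]) -> set[str]:
    """Return the subset of installed apps missing from the approved list."""
    return installed_apps - APPROVED_APPS

device_apps = {"com.example.mail", "com.games.freecoins", "com.example.vpn"}
print(unapproved(device_apps))  # {'com.games.freecoins'} -> flag for review
```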

Solution: Update the program

Technology advances rapidly, and every day there are new applications available for download. If a company is still relying on BYOD policies it established four or five years ago, chances are it won’t be flexible enough to deal with today’s newest offerings and challenges. A good BYOD program needs to be maintained and updated on a regular basis, or else it won’t be able to address the needs of the business.

Problem: Restrictive policies

One of the main reasons for a BYOD program failure is the restrictive nature of the policies adopted by the company. If they are too limiting, employees will have little incentive to bring their devices to work. Some overbearing limits may include restricting access to too many features on the device. If there are enough restrictions, the device may not feel like a personal one anymore.

Solution: Clearly define policies

It all comes down to clear communication between management and employees. While business leaders will want employees to operate at maximum efficiency, they must also remember that the employee owns the device. Finding a middle ground that both sides will accept is essential to establishing policies that still allow employees to work without giving up too much freedom over their devices.

Never forget that learning from past mistakes and successes is an important part of establishing what will work for a particular company. Business leaders who study the details of other businesses' successful BYOD policies can adapt identical or similar policies for their own companies. By heeding these guidelines, a BYOD program can thrive, evolve, and remain part of a company for many years.

Virtual disaster recovery: A modern solution for an old problem

24 Mar 2014
by
Paul Evans
Paul is the managing director of Redstor.

There has been a huge shift over the last year towards virtual disaster recovery, with the industry reporting 15 per cent growth in the past six months. Businesses are embracing the cloud and seeing the benefits it brings over outdated methods of physical disaster recovery. Yet many businesses still rely on traditional recovery solutions to protect their most important asset – data – in the event of a disaster.

Traditionally, contingency plans for business managers have consisted of many thousands of pounds spent on physical disaster recovery – storing data on tapes and shipping them to remote locations to ensure that critical information is protected from any eventuality. In the event of a disaster, accessing the stored data off-site can be expensive, stressful and time-consuming. Since this situation can easily be averted with a virtual disaster recovery service, it raises the question: shouldn't all business managers consider having such a system in place?

Virtual disaster recovery replicates critical applications and operating systems and the most recent working data into a secure cloud environment. These can be brought online almost instantly in the event of a disaster, keeping the wheels in motion in an organisation and reducing downtime to an absolute minimum.

The virtual DR service follows a simple model: a predetermined recovery procedure is agreed with a third party, which manages and frequently checks all of a business's systems and data via its cloud service. The service works by automatically taking a snapshot of the servers at least once every 24 hours. These snapshots serve as a record of the state of the servers. Should a problem occur, the last pre-recovered and successfully tested snapshot of the servers, including file data, is converted into live working images of the system and replicated with all prior network topology and security. The working images from pre-recovered snapshots are then tested and compared with the live, problematic snapshots in order to diagnose and investigate the problem that made disaster recovery necessary.
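As a rough sketch of that cycle – the function names, data structure and checks below are invented for illustration, not any vendor's actual API – the flow might look something like this:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Snapshot:
    """A point-in-time record of a server's state (hypothetical structure)."""
    taken_at: datetime
    verified: bool   # has this snapshot already passed a test recovery?
    state: dict      # applications, OS configuration and file data

SNAPSHOT_INTERVAL = timedelta(hours=24)  # "at least once every 24 hours"

def needs_snapshot(last: Snapshot, now: datetime) -> bool:
    """True if the last snapshot is older than the agreed interval."""
    return now - last.taken_at >= SNAPSHOT_INTERVAL

def recover(snapshots: list[Snapshot]) -> dict:
    """On disaster, bring the newest verified snapshot online as live images."""
    verified = [s for s in snapshots if s.verified]
    if not verified:
        raise RuntimeError("no successfully tested snapshot to recover from")
    latest = max(verified, key=lambda s: s.taken_at)
    # A real service would also replicate the prior network topology and
    # security settings around the restored images, then run diagnostics.
    return latest.state
```

The key design point the article describes is that recovery starts from the most recent snapshot that has already passed a test restore, not simply the newest one.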

After testing and diagnostics, the systems are made available through a secure online connection within a matter of minutes. This removes the pressure and worry that come with physical disaster recovery solutions. It also means that your business is protected whatever the eventuality, whether the cause is human error, an electrical surge or even a malicious cyber-attack.

One of the key problems faced by businesses that rely on physical disaster recovery is that there is more hardware that can go wrong, increasing the risk of mismatches between devices and the chance that data will become corrupted. If physical storage hardware malfunctions, it can cause the primary location to malfunction too, as the two are interlinked. Even with precautions in place, a total reboot of systems to recover the data may therefore not actually work. This approach also forces IT staff to keep both sets of hardware perfectly aligned, driving up costs and time. Furthermore, when disaster strikes at the primary location, not only are there lengthy steps to recover the data, investigate and resolve the problem and build safeguards against a recurrence, there is also huge pressure on the IT manager to get systems back up and running as fast as possible so that the business can function normally again.

Physical disaster recovery is becoming a relic because of its slow speed and the inherent difficulties involved in its use. Virtual disaster recovery, on the other hand, is not only dramatically quicker at getting a business back to normal, it is often cheaper too. One of the main reasons is that companies do not have to invest in their own physical space and hardware to store data – they simply rent space at low cost in a third party's cloud. Speed and falling cost are two of the key reasons for such strong growth over the last six months.

According to CIO Insights, 68 per cent of IT managers cite data loss as their main fear this year. To alleviate these worries, IT managers must equip their businesses with the dual protection of data backup and disaster recovery, especially when dealing with complex systems or confidential and sensitive data. While data backup means that information is available for restores or auditing requirements, it does not allow whole systems, let alone entire environments, to be brought back online at very short notice – which is why virtual disaster recovery is essential to get everything back to the way it was as quickly as possible.

Research from Neverfail shows that one in five businesses loses up to £10,000 an hour during IT outages, and that 92.8 per cent of businesses report having suffered problems in the past year. This reinforces the question of why more companies are not moving to virtual disaster recovery, especially when 92 per cent of IT managers are in the process of deciding how to protect their businesses.

Data and systems are everything to a business, and in a world where being offline for too long can cost not just productivity but the company itself, virtual disaster recovery is the best way to protect your business from the old worry of losing everything when disaster does eventually strike.