The ultimate technology resource.

Technology.info will inform, educate and inspire. Immerse yourself in the very latest IT research and resources to give your business the edge.


Microsoft unveils Windows 10 operating system

01 Oct 2014
by
Mike England
Mike is the Content Director at Imago Techmedia which runs IP EXPO Europe, Cyber Security EXPO and Data Centre EXPO

Late yesterday, Microsoft unveiled the next generation of Windows, previously codenamed Threshold; the new OS will be called Windows 10. Technology.info scoured Twitter to present a social round-up of the launch, with the key features and commentary from the live event. The early technical preview of Windows 10 included:

  • Expanded Start menu.

  • Apps that run in a window.

  • Snap enhancements.

  • New Task view button.

  • Multiple desktops.


 

Finally, check out the full video of the launch event, including the live demo from Windows VP Joe Belfiore, who covers some of the main features in Windows 10, such as the new Start menu, multiple desktops and improved multi-tasking. He also discusses how you can be part of creating the best Windows yet with the Tech Preview and the Windows Insider Program.

 


Future-proofing the Corporate Data Centre

01 Oct 2014
by
Marcus Jewell
As Vice President Europe, Middle East and Africa (EMEA), Marcus Jewell is responsible for sales operations throughout the geography, helping Brocade expand its footprint in the EMEA market. Jewell has more than 15 years of experience in the networking business, having started his career in technical sales at Xerox, focused on network attached solutions. He then joined MiTech Europe, where he rose through the ranks from lead generation and sales to become Managing Director of MiSpace Ltd, a managed ICT services company jointly owned by MiTech and Jewell. He joined Mitel Networks in 2003, heading up the Enterprise Sales and Services for the UK and Ireland, where he was responsible for significantly growing market share and revenue. He graduated from Glamorgan University with a Bachelor of Engineering Honours Degree in Civil Engineering.

Technology trends such as big data, cloud and pervasive mobility are transforming the way we live and work, and having a dramatic impact on businesses and consumers.

The benefits from these trends are varied and wide-ranging, but they all have a common element underpinning them: the data centre. Our reliance on this infrastructure is tied to our increasing consumption of data, a dependence which has never been greater and which shows no signs of slowing down.

It’s therefore essential that businesses ensure their data centres are ready for the future. It is already critical for businesses to have scalable and flexible data centre infrastructure, with any downtime likely to prove hugely costly. Long after technical faults are resolved, such issues can impact a business’ profitability and cause irreparable damage to brand and reputation. For example, just consider the impact if your bank or mobile operator’s online services suffered prolonged outages.

As a result, it’s no longer viable for data centres to delay the inevitable and just try to squeeze the most out of legacy infrastructure. Fundamental changes are needed to prepare for the future. Now is the time for data centre owners and business leaders to embrace innovation.

To gauge future needs, businesses and their IT decision makers must acknowledge the four core elements that accelerate the evolution of data centres:

  1. The prevalence of virtualization

Virtualization is now prominent in the majority of data centres and requires a much more elegant and robust network topology to provide the raw performance and management flexibility needed.

  2. The demand for network reliability

The advent of faster networks, more (virtual) data centre capacity and higher security requirements means network resilience is critical. As companies empower mobile working, the need for a secure and reliable network is paramount.

  3. The drive for ‘always-on’ accessibility

The adoption of cloud-based services, either on an application-level (such as CRM tools like salesforce.com) or by outsourcing entire IT requirements to hosting providers, has meant the need for 24/7 network accessibility and resilience has never been greater.

  4. The need for scalability

Modern data centres have to deal with more volume – service more users, applications and data. This is a trend that will only increase and will require a highly flexible and scalable data centre infrastructure.

So what does the data centre of the future look like?
The question remains: how should businesses respond to these trends and developments and provide a data centre that can cope with requirements, now and in the future? There are three key steps businesses should consider:

A strong foundation
The physical networking infrastructure provides the basis for any data centre, one that provides the connectivity between applications, servers and storage. A fabric-based networking topology is required for businesses that want to embrace a highly flexible and agile on-demand model as it provides the ability to build scalable cloud infrastructures that also reduce cost. A fabric-based network, both at the IP and storage layers, can address the growing complexity in IT and data centres today by simplifying network design and management.

One example is London-based cloud managed service provider Oncore IT; a company that has already upgraded to Ethernet fabric. Oncore IT didn’t want to just improve performance – it wanted to go a step further and guarantee it, in order to deliver a truly exceptional cloud experience. While the market is still buzzing with interest around the cloud, some concerns do remain and Oncore IT recognises that service providers must take total responsibility for their cloud solutions to ensure the highest possible service levels. A solid foundation, based on Ethernet fabric, has helped them to achieve this.

Virtual Infrastructure
On top of the physical infrastructure will be a virtual, or logical, layer. This is well-established in the server domain with hypervisor technology. The same concepts are now being applied to both storage and IP networks, with technologies such as overlay networks enabled through a variety of tunnelling techniques. These allow the provision of IPv6 services over existing IPv4 or multiprotocol network infrastructure.

In the future we will see network services virtualized, as a result of the introduction of virtual switches and routers. “NFV”, or Network Function Virtualization, represents an industry movement towards software or VM-based form factors for common data centre services. Customers want to realize the cost and flexibility advantages of software rather than continuing to deploy specialized, purpose-built devices for services such as application delivery controllers. This is especially the case in cloud architectures, where these services need to be commissioned and decommissioned with mouse clicks rather than physical hardware installations and moves. We are already seeing a shift towards open, more flexible, efficient, highly programmable and elastic network infrastructure solutions, with key initiatives such as OpenStack and the OpenDaylight Project making a big impact.

Frameworks for orchestration
Lastly, it’s essential that the entire data centre environment be managed by orchestration frameworks that allow for the rapid and end-to-end provisioning of virtual data centres. OpenStack, for example, allows customers to deploy network capacity and services in their cloud-based data centres far quicker than with legacy network architectures and provisioning tools.
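
A minimal sketch of what such programmatic provisioning can look like, using the openstacksdk Python library (the cloud name, network names and address range below are illustrative placeholders, not taken from the article):

    import openstack

    # Connect using a "mycloud" entry assumed to exist in clouds.yaml.
    conn = openstack.connect(cloud="mycloud")

    # Provision a tenant network and subnet through an API call, rather than
    # raising a change request against physical switch configuration.
    network = conn.network.create_network(name="app-tier-net")
    subnet = conn.network.create_subnet(
        name="app-tier-subnet",
        network_id=network.id,
        ip_version=4,
        cidr="10.20.0.0/24",
    )

    print("provisioned", network.name, subnet.cidr)
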
In the years to come, technologies such as Software-Defined Networking (SDN) look set to radically transform the data centre. SDN refers to the separation of the part of the network that is responsible for routing and directing traffic (known as the control plane) from the part that carries the traffic itself (known as the data plane). The goal is to allow organisations to respond rapidly to changing business requirements. By simplifying how network resources are deployed and managed, SDN gives businesses far greater control of their data and applications and makes network management simpler and faster.
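
To make the control-plane/data-plane split concrete, here is a minimal controller-side sketch using the open-source Ryu SDN framework (our choice for illustration; the article does not name a specific controller). The switch, which is the data plane, forwards traffic and punts anything it cannot handle up to this application, which is the control plane:

    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class PacketLogger(app_manager.RyuApp):
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
        def packet_in_handler(self, ev):
            msg = ev.msg
            dpid = msg.datapath.id
            in_port = msg.match['in_port']
            # A real application would decide on a forwarding policy here
            # and install flow entries back down on the switch.
            self.logger.info("packet-in from switch %s on port %s", dpid, in_port)

Run with ryu-manager against OpenFlow 1.3 switches; the point is simply that forwarding decisions live in software, outside the boxes that carry the traffic.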

SDN is still in its infancy but its potential is vast; IDC has predicted it will be a $3.7 billion market by 2016. By making networks smarter and simpler to manage, it will facilitate innovation throughout the enterprise, helping businesses to develop and deploy new applications and respond to changing market forces faster than ever.

In order to adequately prepare for the data centre of the future and take advantage of SDN and other emerging technologies in the years to come, businesses need to combine the most valuable aspects of the physical and virtual layers. Adopting the steps outlined above will give organisations the ability to flexibly deploy data centre capacity – compute, networking, storage and services – in real-time, whenever and wherever they need it, delivering much improved ROI and helping businesses to turn their data centre into a real competitive advantage.

Marcus Jewell is Vice President EMEA at Brocade

 


Clicking Clean: how renewable energy is powering data centres

01 Oct 2014
by
Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014

If the internet were a country, it would rank sixth in the world in terms of demand for electricity. That’s set to grow even more over the next few years, as more of the world’s population joins the online community. And at this level of demand, the source of the electricity used in data centres really matters in terms of climate change and environmental impact. The campaigning organisation Greenpeace has published a report, Clicking Clean, on this very subject.

Why does it matter?

Surely, you may be thinking, the internet is saving energy around the world by reducing transport costs, making us all more energy efficient? Unfortunately, the gains in energy efficiency from moving business to an online model are more than outweighed by the increase in demand for electricity. Further, the internet’s energy footprint has largely been concentrated in areas where electricity is made from fossil fuels, not from renewable sources. Our ongoing ability to feed our growing internet habit depends on using cleaner, greener, and more sustainable sources of energy.

In 2012, Greenpeace published How Clean is Your Cloud? Just two years on, its update shows that leading data centre companies have begun to take action, although the picture is a mixed one. For every data centre operator with impeccably green credentials and a commitment to achieving 100% renewable energy use, there are others who are less committed. Some pay lip service, for example, offering carbon offsetting or renewable energy credits, instead of changing to renewables. Others provide no information about their energy sources or simply choose the cheapest option.

Detailed findings

Greenpeace looked at 19 major players, including co-location providers. These 19 companies cover over 300 data centres, up from 80 in the previous report. Key findings include:

  • Six major cloud brands, Apple, Box, Facebook, Google, Rackspace and Salesforce, are committed to powering their data centres wholly from renewable energy.
  • Several leading brands, particularly Apple and Facebook, have become much more transparent about their energy use, although transparency still remains weak elsewhere.
  • Amazon Web Services provides the infrastructure for a large amount of internet use, but lags behind its competitors in terms of both use of renewables and transparency. It does not release any information about its energy use or environmental footprint. Twitter is also an offender in these areas.
  • Three of the major brands, Apple, Facebook and Google, successfully put pressure on the US’ largest utility, Duke Energy, to open the market to renewable energy purchases for large customers in North Carolina.
  • Google is expanding its renewable energy purchasing and investment, both independently and through collaboration with utility companies.
  • Facebook also continues to demonstrate its commitment to a green internet. Its decision to locate a data centre in Iowa has driven the largest purchase of wind turbines in the world.
  • Apple has shown the most improvement over the last two years, and is both aggressive and innovative in trying to achieve its commitment to become 100% powered by renewables.

Making a difference

The companies committed to building a future using 100% renewable energy are starting to have an impact. In several US states, including Iowa, Nevada, and North Carolina, the pressure from data centre operators has resulted in solar and wind power displacing coal, natural gas and nuclear plants, with wind or solar farms being built instead of new fossil fuel plants. This has a real effect on the local environment.

But the economic arguments are also turning in favour of renewables. Iceland looks like becoming a data centre hub because of its ready access to renewable energy, and the cost of renewables is falling all the time. At the same time, as fossil fuels become harder to extract, their cost is rising. Customers are applying pressure on companies to move to renewables, voting with their feet if necessary. These messages are ones that even those companies who currently remain unconvinced may have to heed.

More data centre energy management discussions can be found in the Extending the platform keynote session on 8th October and Software, energy and skills; the next axes of convergence keynote session on 9th October.


A closed room event with Microsoft CEO – Satya Nadella

30 Sep 2014
by
Iain Mobberley
As Technology Director at OCSL, Iain leads an ever-growing and evolving Services Practice, shapes company strategy and advises the business accordingly. He also spends his time advising and working with CxO stakeholders on strategic decisions and potential investment in business and IT-related systems. Iain serves on a number of advisory councils with various vendors, bridging the gap between vendor and reseller.

Iain Mobberley, Technology Director at OCSL shares some thoughts and insights from attending a closed room event with Microsoft CEO, Satya Nadella.

On the 16th July 2014 I was invited, along with nine of my peers, to a closed-door session at Microsoft’s Worldwide Partner Conference with Satya Nadella. For those who don’t know, he is only the third CEO of Microsoft in its 39 years.

In my career I have had the pleasure of meeting all three leaders. Bill Gates always excelled with his desire to change the world and drive his company’s technology to the forefront of people’s minds. With Steve Ballmer, it was his presence and enthusiasm, as well as his deeply felt passion for Microsoft to be successful, that were most noticeable.


Satya Nadella was 164 days into his role when he shared his views with this select gathering of partners. He is a career “Microsofter”. Almost, some would say, a safe bet. However, at this point he had already changed a huge amount.

Microsoft has moved to a mobile-first, cloud-first business, away from being a devices and services company. Office has been released on iPad, after spending over 14 months sitting on the development shelf despite being ready for release. Bill Gates is now employed as a full-time advisor to Mr Nadella. And to top it off, Microsoft’s share price was higher than it had been for a couple of years ($27.44 two years ago and $44.90 at the time of writing).

We had 40 very carefully constructed minutes with Satya.

The first thing that was hugely different from my previous experiences of Microsoft CEOs was Satya’s desire to drive Microsoft in his direction. He is a remarkable man – well informed, as you would imagine. And his vision of where the future lies, and what Microsoft needs to do to continue to be successful and achieve its goals, is highly admirable. He is sharp and incisive and, I believe, he will continue to drive further changes that will seem both remarkable and really common sense.

My personal reflections on the meeting are that it was great to be one of the ten people invited to attend, and that I left feeling Microsoft is in a very safe pair of hands, with a leader who will really challenge his teams to do more and to strive to be innovators and leaders again.

Only time will tell whether this is correct, but I suspect great things from Microsoft in the coming months.

Iain Mobberley, Technology Director, OCSL

www.ocsl.co.uk


Seed Forum Launches UK Pitch Training and Gives Free Fundraising Education to British Tech Start-Ups

30 Sep 2014
by
Olivia Shannon
Olivia is an award-winning writer and technology PR and social media expert. She is the co-director of Shannon Communications, a technology public relations firm.

Seed Forum International Foundation is celebrating its first UK pitch-training event by giving free investor pitch training to up to 10 UK-based technology start-ups. That’s not all—based on Seed Forum’s assessment at pitch training, up to four start-ups will go on to pitch to investors at the next Seed Forum in London on November 13th.

Steinar Korsmo, President of Seed Forum International Foundation, will conduct Seed Forum’s first UK pitch-training session on October 17th at the London offices of Gowlings law practitioners. Over the last 14 years, Korsmo has personally conducted more than 100 Seed Forum pitch-training sessions in more than 20 countries. He helped establish the first local Seed Forum in Norway in 2001 at the Campus Kjeller business incubator. Since then, Seed Forum has grown into an international network composed of thousands of investors, nominators, partners and alumni in more than 40 countries.

The free pitch-training spots are open to early-stage, high-growth, UK-based businesses working in information technology. The first ten eligible start-ups that apply and are selected by Seed Forum stakeholders will be offered free pitch training in London on October 17th.

To register, start-ups should nominate themselves for the Seed Forum process at http://www.seedforum.org/register/company by 12:00 British Summer Time on Tuesday 14th October 2014. For more information, see the announcement here.

The Seed Forum process is designed to reduce the risks associated with investing in start-ups. It also helps ensure the start-ups are “investor-ready” before they pitch in the investor-matchmaking forums.

The Seed Forum process consists of four stages: nomination, fundraising education, selection and investor matchmaking. First, start-ups are nominated for the Seed Forum process. Next, Seed Forum invites eligible start-ups to participate in pitch training. Seed Forum then assesses pitch-training participants and selects a number of start-ups to pitch to investors at the region’s next investor-matchmaking forum. These start-ups automatically join Seed Forum’s extensive alumni network of entrepreneurs from all over the world.

Seed Forum pitch training is obligatory for all companies that want to pitch at an investor-matchmaking forum. Topics that may be covered in pitch training include investment documentation, due diligence, working with corporate finance partners, negotiating, exit strategies and so on. Pitch trainers also coach participants on presentation techniques and business cultures in different countries.

Seed Forum’s goal is “to facilitate a global market for seed and venture capital without regional or national borders.” To that end, Seed Forum brings together start-ups and investors from different countries to match with each other in the forums, which are organised on local, national and international levels. Seed Forum connects high-growth, early-stage businesses with private investors, seed funds, venture funds, private equity funds, corporate venture investors, family offices and financial intermediaries in born-global organisations.

Investors who are interested in attending the 26th UK Seed Forum in London on November 13th should register at http://www.seedforum.org/register/investor.

A collective voice for European data centres

29 Sep 2014
by
Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014

With more and more players on the data centre stage, including a number of umbrella bodies at national level, such as the Data Centre Association, France’s CESIT and Germany’s ECO, the question that increasingly arises is: who speaks for the industry at regional level? For Europe, it’s the European Data Centre Association, or EUDCA, whose role is to ensure that, in a global industry, European interests are raised and heard. We caught up with board member Stefan Norberg to understand the organisation’s priorities, initiatives and progress to date.

Mandate and membership

EUDCA is based in Brussels, and its objective is to act as an umbrella group for national data centre associations. With members from 17 countries, and a Board drawn from seven, it is positioned to provide a unified voice at European level on matters affecting the data centre industry. It works across four interest segments, which are Policy and Regulation, Standards, Data Protection and Energy Policy.

EUDCA aims to capture ideas and work from around Europe on the cutting edge of data centre development, including from its affiliate national umbrella groups for the data centre industry. Examples include work in progress at CESIT in France on governance, and at ECO in Germany on certification processes. EUDCA draws on and shares this work to ensure that good practice spreads quickly, and that lessons learned in one country don’t have to be revisited elsewhere.

EUDCA is a membership body, which currently has 35 members with an international or regional focus. Many join in order to ensure that their interests will be heard at a European level, or to increase their influence. They recognise that policy bodies at EU level can have a significant impact on the way that data centres operate, and they want to be able to influence those discussions.

Adding value at regional level

So what value does the EUDCA bring that others cannot? Many EUDCA members are looking for ways in which they can work more effectively, for example, by standardising evaluations or audits, so as not to be blocked by larger companies. EUDCA is able to influence standards, to ensure that adopted standards are regional, and not national.

Other members join to get more international exposure. They want to know what is happening elsewhere, and learn from others, as well as getting their name known internationally. These companies often want to avoid being ‘just’ a national company, and they are using EUDCA as a way of doing so.

EUDCA’s other chief area of value is to bring a regional voice to a global industry, and make sure that decisions made nationally are also in European interests. The organisation tries to bring together work across Europe and ensure that it is coherent and congruent.

Facilitating contact and awareness

EUDCA sees much work ahead in promoting the industry and its community, and the more channels and platforms there are to do this, the better.

EUDCA is partnering with Data Centre EXPO to promote industry-wide collaboration.  Contact us if you wish to arrange meetings with EUDCA members at the event.


Is the Shellshock Vulnerability the new Heartbleed or something worse?

29 Sep 2014
by
Paul Fisher
Paul Fisher is the founder of pfanda - the only content agency for the information security industry. He has worked in the technology media and communications business for the last 22 years. In that time he has worked for some of the world’s best technology media companies, including Dennis Publishing, IDG and VNU. He edited two of the biggest-selling PC magazines during the PC boom of the 1990s, Personal Computer World and PC Advisor. He has also acted as a communications adviser to IBM in Paris and was the Editor-in-chief of DirectGov.co.uk (now Gov.uk) and technology editor at AOL UK. In 2006 he became the editor of SC Magazine in the UK and successfully repositioned its focus on information security as a business enabler. In June 2012 he founded pfanda as a dedicated marketing agency for the information security industry, with a focus on content creation, customer relationship management and social media. Paul is the Editorial Programmer for Cyber Security EXPO, which runs alongside IP EXPO Europe, 8-9 October, ExCel, London.

Since its discovery last week the Shellshock vulnerability in Bash has been hitting the headlines, with some observers predicting a digital armageddon. Bash is open source software that runs on millions of Linux and UNIX based systems including web servers, and every Apple OS X device.

Even defence systems are vulnerable, we are warned, with cyber attackers already planning devastating attacks, according to the FT, which called it “one of the most acute and pervasive online security loopholes ever identified.”

So concerned was the National Cyber Security Division of the US Department of Homeland Security that they rated it 10 out of 10 for both exploitability and impact, which not surprisingly gave it an overall score of 10 for severity.

However, some may argue that we have been here before, when Heartbleed made the rounds earlier this year and was then quietly contained. But this may well be different.

“Shellshock is a serious bug. It affects a large portion of the servers out there and, in addition, it is incredibly easy to exploit, since it’s a simple command injection. This vulnerability has all the characteristics that make it a perfect tool for writing a self-spreading worm. Fortunately, we haven’t seen one – yet. I think we are in for some interesting times.” said Giovanni Vigna, CTO at anti-malware specialist Lastline and keynote speaker at Cyber Security EXPO.

Just like Heartbleed, the Bash bug is easy to fix, but the real difficulty is identifying how many systems may already be affected, and whether they have already been compromised.

But unlike Heartbleed, the Shellshock bug is far more active: it can enable hackers to run programs in the shell and to write, copy and delete files.
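
To see why the bug is described as a simple command injection, consider the widely circulated check administrators used at the time: define an environment variable that looks like a bash function definition and carries a trailing command. A vulnerable bash executes that trailing command when it imports the variable. Below is a small Python wrapper around that test (the /bin/bash path is an assumption; adjust it for your system):

    import subprocess

    # CVE-2014-6271: a vulnerable bash runs the commands that follow a
    # function definition imported from the environment.
    crafted_env = {"x": "() { :; }; echo SHELLSHOCK-VULNERABLE"}

    result = subprocess.run(
        ["/bin/bash", "-c", "echo harmless test"],
        env=crafted_env,
        capture_output=True,
        text=True,
    )

    if "SHELLSHOCK-VULNERABLE" in result.stdout:
        print("This bash executes injected commands -- patch immediately.")
    else:
        print("The original injection did not fire (later variants may still apply).")
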

Uri Rivner, head of cyber strategy at cognitive biometrics firm Biocatch and a keynote speaker at Cyber Security Expo, believes this is serious and the repercussions will last some time.

“The internet will be a notch more dangerous in the coming days as web site administrators begin to apply patches for the massive new vulnerability. Bash is popularly used in web servers based on Linux and Unix, which are considered relatively safer than other operating systems, so the impact is considerable. While the big websites will quickly find a remedy, there will be a long tail of smaller sites that will not respond fast enough and will succumb to the host of cyber criminals already rubbing their hands and launching attacks based on the new loophole.” he said.

According to Rivner, the main objective for these hackers will be digging for user names, passwords, email addresses and credit card numbers stored on the site, as well as using compromised sites for drive-by download attacks to install trojans on visitors’ machines.

However, another Cyber Security EXPO keynote speaker, Jon Callas, CTO and co-founder of Silent Circle, has no doubt it’s a serious problem, but believes it may have been over-hyped. “It’s a huge impact because bash is the default shell on many operating systems. Thus, any problem in bash has a huge impact; a minor problem would be a minor problem everywhere. But it isn’t, however, as serious as some news organisations would like it to be. It isn’t worse than Heartbleed.” he said.

As for Apple, a company not used to major vulnerability issues, its typically upbeat statement read: “With OS X, systems are safe by default and not exposed to remote exploits of bash unless users configure advanced Unix services. We are working to quickly provide a software update for our advanced Unix users.”

Given that advanced Unix users are likely to be using OS X in the enterprise, Apple should perhaps hurry to deliver on that promise.




Defining and developing an enterprise mobility strategy

25 Sep 2014
by
Tim Edwards
Tim Edwards is a Director of Dootrix, former Head of Technology at Eurostar and Head of IT at Salmon. Dootrix provide Agile software development, technology consultancy and project delivery. Their team are experts in emerging mobile technologies and how they interface with the enterprise environment.

The modern agile business demands an enterprise mobility strategy, so many organisations already have some kind of framework in place. But how do you develop a strategy that meets your mobile computing needs now and into the future?

What information do your employees need?

An efficient and effective mobile computing strategy relies on more than simply replicating the desktop experience externally – displaying information based on context is of greater importance.

The cornerstone question of enterprise mobility is not “what information do mobile employees need?”, but “what information do employees need right now?”. Your business will need to consider the data demands of the various roles and responsibilities and begin to design apps and solutions to fulfil them.

Information access and delivery based on context will help raise efficiency and productivity by ensuring employees have the right information at the right time.

How do we best present that data?

Making the mistake of replicating the desktop experience for mobile users simply amplifies inefficiencies by adding reduced access speeds and hardware limitations (think small smartphone screens) into the mix. Delivering an acceptable user experience is key to ensuring that mobile employees continue to use the systems provided.

With the question of “what” information resolved, the strategy must then consider the how. Notebook computing requires very little alteration to existing remote access solutions, but is also unlikely to deliver the context-aware experience that makes employees truly efficient.

The rise of smartphones and tablets has helped improve the ability to receive context-based data, but existing backend systems will need to be enhanced to provide delivery. These always-on mobile devices allow for near-instant data access and retrieval based on location, timing or virtually any other variable required for context. Such devices also integrate perfectly with a unified communications strategy, allowing employees to be contacted via any channel, any place, any time.

The advent of wearable technologies like Google Glass and the Apple Watch also offers new ways to deliver context-sensitive data to mobile employees. The use cases for such technologies remain limited at present, but in instances where small amounts of data need to be supplied quickly, they could even trump smartphones and tablets.

Whichever mobile devices are chosen, they will also require custom apps designed to best present information to the user. These apps will be properly-optimised to deal with processing and display limitations of the chosen mobile devices.

What needs to change at the back end?

Most mobile devices have limited capacity for displaying and processing data. This means that much of the computation and data shaping will need to be handled by backend systems like enterprise content management. These systems provide integrated security and auditing along with a way to deliver relevant information to a variety of devices, including mobile.

The enterprise mobility strategy will also need to consider how devices are deployed and managed. Distributing apps, enforcing security rules, and connecting employee devices to corporate resources safely can all be simplified through the use of an effective mobile device management solution. Without such a system in place, the mobility strategy will be unable to properly police data access by connected devices, including personal smartphones and tablets used by employees.

Putting the strategy together

A successful mobile enterprise strategy requires a holistic approach that includes changes inside and outside the corporate network. But beyond app development and back-end modifications, the key lies with context – delivering what your employees need, when they need it, wherever they are. Any strategy that fails to account for this will be less than effective, or even doomed to complete failure.

Looking to develop your mobility strategy? Speak with Tim and Dootrix in the Futures Den at IP EXPO Europe, Stand V4 Pod 6




The Web in 2050: anticipation and alarm bells

25 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

How do you feel the Web is developing – and what concerns do you have? That was the question put to five industry panelists at a recent event, ‘A Vision for the Web in 2050’, organised in the run-up to this year’s IP EXPO Europe.

The question – answered by senior executives from Brocade, HP, McAfee and VMware, along with futurologist Tom Cheesewright – was prompted by remarks made by World Wide Web inventor Sir Tim Berners-Lee, who will present the opening keynote  at this year’s show.

In a recent interview with Wired magazine, Sir Tim said that, over the next few years, millions of sensors, appliances and other devices will be coming online, building out the Internet of Things and taking the web to new places.

“The potential excites and concerns me at the same time – and that makes the web worth our ongoing stewardship,” he told Wired. “We must build and defend it now, so that those who come to it later will be able to create things that we cannot ourselves imagine.”

It’s an exciting prospect for futurologist Tom Cheesewright, author of the ‘Book of the Future’ blog. “In the future,” he told attendees, “there’ll be no clear boundary between us and the technology we use – or at the very least, the lines will be blurred.”

“I’ve already outsourced my sense of direction and memory to a smartphone and the Cloud,” he joked, explaining how he used his phone to reach the event. “The barrier between me and those digital tools is still quite explicit, in that I have to reach into my pocket and take out my smartphone, but with wearables and haptic technologies, that barrier will slowly disappear.”

Across the board, however, our #web2050 participants agreed with Sir Tim’s reservations about how the web is developing. The idea of an open web doesn’t mean that rules regarding privacy and freedom of expression don’t apply, and our panellists see challenges ahead, particularly around privacy.

“I see real generational differences at play here,” said David Chalmers, chief technologist within the EMEA enterprise group at IT giant HP. “My first question when I look at the digital world is, ‘Why would I want to do that?’. For my children, it’s ‘Why wouldn’t I want to do that?’ As a generation, they’re conditioned to share. This is my personal concern with the way the web is developing: where do the boundaries of personal privacy lie, if you’ve already given your privacy away?”

Not only that, but we’re increasingly prepared to give our privacy away for “practically next to nothing”, at the same time that the companies to whom we hand it over attribute it with more and more value, said Raj Samani, chief technology officer at security company McAfee.

If you look at recent M&A deals between social networking companies through the lens of cost-per-user, he pointed out, Google paid around $20 per user for YouTube back in 2006, while Facebook paid $30 per user for Instagram in 2012 and $42 per user for WhatsApp earlier this year. “There’s a very profound disconnect emerging between how we value our own data and how companies value it – and where that might go in future is alarming to me,” he said.

For Marcus Jewell, vice president of EMEA at networking specialist Brocade, meanwhile, there’s a concern over segmentation: as the scale of the Internet becomes even more vast, he foresees it becoming a more divided world, segmented largely along socio-economic class lines. “The ‘dark net’ represents a first segmentation of the Internet, but I think that trend will continue in a more noticeable way,” he told attendees. “As human beings, we like to know who we’re dealing with – what ‘type’ of person they are – and I think that trait could start to shape the Internet in the years ahead.”

Finally, when it comes to IT security, new approaches will be needed, said Brian Gammage, chief market technologist at VMware. “Boundaries are useful – we need boundaries, because we can audit what travels across them,” he said. “In future, auditing will matter more than security, because it’s better to have insight into what’s crossing than it is to build high walls around everything. In any case, that high walls approach will be impossible in a heavily interconnected world.”

Sir Tim Berners-Lee will share his vision for successful business on the web – from predicted challenges and the technologies that businesses will use to overcome them, through to the key innovations that will drive future success, improve customer experience and create new markets – at 10am on 8 October 2014 at IP EXPO Europe.


Starbucks IT head on serving up the best digital experiences

24 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

Robert Teagle, head of IT in EMEA at coffee shop chain Starbucks, takes a pretty straightforward view of the role of technology at one of the world’s most recognisable retail brands.

“It’s all about innovation – managing innovation and how it relates to us in the retail world,” he says. “Really thinking about how we at Starbucks think about innovation, how we think about it internally, how we think about it in terms of our customers, bringing innovation to everything we do.”

If his approach is clear, however, the role he performs brings its own complexities: he’s responsible for technology in over 2,000 Starbucks stores in the EMEA region (with some exceptions due to the franchising model), plus all the back-end infrastructure and applications used by the business. And while Starbucks relies heavily on its bricks-and-mortar presence as its main channel, there are a number of digital challenges for him to tackle.

Ahead of his presentation at IP EXPO Europe’s Digital Transformation Summit, Teagle took time to discuss some of the key issues he faces.

For example, how does he balance the demands of managing IT infrastructure with the innovation imperative that he feels so keenly?

Starbucks EMEA currently has approximately 60 percent of its technology managed in-house, with 40 percent handed over to external suppliers, he explains: he and the rest of his technology team don’t want to spend their time and efforts managing technology that doesn’t prove to be a differentiator in the eyes of the customer.

“The general rule that we have is that, if it is a commodity-type service, then we at Starbucks don’t need to become experts. Good examples of things that are outsourced include our help desk, our data centre and our managed network across the region. The stuff that doesn’t give us a leading edge in our world,” he says.

Instead, they prefer to take charge of things that “add to the Starbucks experience”, he says: “Adding innovation, making sure we can do things around digital – that’s much more valuable to me than having a team of people that are running a help desk or a data centre.”

At the same time, Teagle and his team can never forget the importance of the company’s bricks-and-mortar outlets, so in-store technologies that drive a better customer experience must be a top priority.

Starbucks was one of the first retailers, for example, to offer its customers free Wi-Fi, way back in 2010. It’s something that continues to draw customers to its stores today, he says, and thus exactly the kind of investment he’s interested in. Another recent example has seen Starbucks stores introduce charging mats for customers to charge their mobile phones.

“Anything that we can use to enhance the experience for our customers –  you most likely have to go to the store to experience it. You can’t do much with our business at home. It’s more about going to the store, so it’s more about in-the-store technology and, when you walk in the door, what you can do there,” Teagle says.

“We are not like other retailers, where they can have a digital first, incredible omni-channel experience – we don’t really have that because you have to go to the store, it’s not something we can send to the house.”

However, he and his team are also investing in mobile and social developments that the customers can then take into the digital world, “so they can take their experience with them once they have left the store.”

That’s important in a world where a mobile app can disrupt an entire industry in a few months. Teagle says his biggest challenge is keeping up with the evolving digital needs – and demands – of customers.

“If people are starting to do less bricks-and-mortar type shopping, what does that then mean for our business? Do we need to change the way we operate our stores? Do we need to change the offerings in stores? What can we do in the digital space when you are not in the store that makes it easier for you?”

“That’s the unique challenge. We could just take a broad brush and say we will open as many stores as we can, everywhere we can, and just flood the market. But I think we have found that that just isn’t necessarily the right thing to do either.”

“I think that’s where innovation comes in. I think you have to try some things out and you learn what the customers like, what they don’t like and you see how things work. Many of those things we are trying, many of those things we don’t know yet, but we are figuring out what makes it interesting for the customers.”

Robert Teagle will present ‘Managing Innovation and Disruptive Technologies in Retail’, as part of the Digital Transformation Summit at IP EXPO Europe.

Barriers to data centre knowledge transfer

22 Sep 2014
by
David Cameron
David has 30 years’ experience in the design, installation and commissioning of large scale engineering systems and prior to joining OI he spent the previous 15 years in a senior management role within a multidisciplinary, international consultancy. His specialism is high availability engineering systems and he has an extensive track record in delivering complex engineering systems on time and on budget. He has extensive experience of operational risk analysis having undertaken many studies utilizing Failure Mode Effect and Criticality Analysis (FMECA), Fault Tree Analysis (FTA), Certification, IST management and Risk Based Maintenance. He has provided independent advice to clients on energy efficiency of data centres and has been used as an independent specialist on a number of significant failure events. As a project manager he has an extensive track record in delivering projects within a team environment and has the ability to present complex issues in a non-technical way.

The parties involved in the life of a data centre can be broadly split into four distinct roles – client/business, design consultant, installation and commissioning contractor, and operator. Communication between these parties can be complex and is usually influenced by a contractual and commercial arrangement. As a result opportunities for effective transfer of knowledge are limited. Due to the contractual and commercial considerations the transfer of information between these parties is structured, and generally comprises an auditable set of documents that satisfy these contractual requirements as opposed to the actual needs of the facility.

[Figure: barriers to knowledge transfer, by OI]

It is generally accepted that 70-80% of failures in data centres are due to human error or ‘unawareness’. In our experience a similar percentage of energy saving opportunities are also missed due to human unawareness of the possibilities. Both risk and energy fall into the domain of the operations team, and their ‘unawareness’ can only be improved by the effective transfer of information to them. The Kolb learning cycle demonstrates how the effective transfer of knowledge between quadrants is limited by the documentation used to transfer this knowledge, which is focused on contractual requirements as opposed to the needs of the operator.

It is common for barriers to exist between each quadrant (shown by the yellow lines above), restricting the transfer of knowledge and creating a silo-effect. This increases the risk of failure and reduces the ability of facilities to optimise their operation and, in particular, minimise energy consumption. Continuous site-specific training, run at the point in the project lifecycle that traditionally coincides with a barrier, bridges the gap between the two segments by providing a common language and opportunity for knowledge and information transfer.

  • Barrier 1 – This opportunity for knowledge transfer occurs between the client and design consultant. In order to get what the client wants, the client should not only identify the key performance requirements – such as watts/cabinet, number of cabinets, PUE, and tier rating – but also the handover criteria such as testing, training, and handover documentation. These are important elements in reducing risk and energy consumption and should be as important in the design brief as any performance factor. At this stage a well thought through design brief is required which has lessons learned from all stages (design, installation/commissioning and operation) of previous projects incorporated into it.

 

  • Barrier 2 – At the end of the design stage, the specifications and tender documents are passed onto the installation and commissioning contractors who will implement the design. The transfer of information (as opposed to knowledge) at this time has been developed over many years, and the process between consultant and installation contractor is well understood by both parties. However, it is fair to say that consultant specifications generally transfer the risk of missing information onto the installation contractor, who in turn transfers the risk to the specialist suppliers and sub-contractors. The background to the development of the design and a focus on handover processes is rarely a major consideration at this stage as the primary purpose is to achieve a competitive cost for the project. In other words deliver what is specified for the lowest cost.
  • Barrier 3 – At the point of handover from the installation/commissioning team to the operations team, much of the embedded project knowledge is often lost and the operations team is left to look after a live and critical facility with the benefit of only a few hours of training and a set of record documents to support them. Integrated systems testing (IST) is now common on data centre projects, but it is still very much the domain of the project delivery team, with generally only limited involvement of the operations team. The testing is used to satisfy a contractual requirement as opposed to imparting knowledge from the construction phase to the operations phase. Operators therefore don’t feel sufficiently informed to make changes that may improve energy performance, and lack engagement and awareness, which induces risk. This lack of awareness may not be an issue during stable operation, but at times of reduced resilience due to maintenance or failure events, operational errors may arise.
  • Barrier 4 – Clients with multiple facilities often design and build data centres using a similar design specification to previous builds. If problems/good points with the design and operation of past facilities have not been fed back to the client from the three previous quadrants, weaknesses in the delivery of new facilities will be repeated. Although barriers exist between two quadrants, each stage of a construction project needs feedback from the preceding quadrants (not just those adjacent to it) to ensure weaknesses in a design are eliminated and don’t propagate. Without processes to enable this, it’s easy to see how high risk and excessive energy consumption get compounded.

Operational Intelligence was founded on the understanding that significant risk and energy reduction within the data centre could only be achieved through an active engagement with operations teams across all disciplines throughout the project lifecycle. Risk and energy reduction may be the responsibility of an individual, but it can only be delivered if there is commitment from all stakeholders.

To learn more about effective transfer of knowledge between data centre delivery and operations teams, attend Dave Cameron’s workshop taster at the DCA virtual track program at Data Centre EXPO entitled 80% of Catastrophic Failures are Due to the Human Element.


Accenture: Unlocking the intelligent infrastructure

18 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

In the race to achieve digital transformation, many organisations are straining their IT infrastructures to breaking point. The business results, meanwhile, of IT systems under intolerable pressure are only too clear: customer-service failures, supply-chain delays, stalled innovation and compromised security.

It doesn’t have to be this way, says Steve Paxman, managing director in the Infrastructure Services Practice at consultancy firm Accenture. By creating what he refers to as an ‘intelligent infrastructure’, organisations can ensure that they’re better prepared to react to market and technology changes – even staying one step ahead of them, if they’re lucky.

“And, as a result, they can serve customers better, collaborate more effectively and innovate faster,” says Paxman.


This intelligent infrastructure will be the subject of Paxman’s presentation at IP EXPO Europe’s Digital Transformation Summit. There, he’ll provide attendees with more detail on Accenture’s vision of IT infrastructure that can predict, learn, self-provision, optimise, protect and self-heal.

“At Accenture, we talk to a lot of large enterprises – and many smaller ones – that are now making a big push in order to go from being followers to being leaders in digital,” he says. “But increasingly, we see them recognise that their ability to move quickly and respond flexibly to changes in the market is compromised by constraints imposed by their IT infrastructure.”

That doesn’t mean chucking the lot and starting fresh with shiny new kit, however. It means taking a more considered approach to integrating existing IT assets with new ones and weaving together a blend of on-premise and third-party, often cloud-based, resources.

“The components that make up the intelligent architecture are all broadly known and familiar to IT heads,” says Paxman. “The challenge lies in integrating and managing them in ways that produce the best business results.”

So what can the intelligent architecture do that previous approaches to IT infrastructure struggle to achieve? By Accenture’s definition, an intelligent architecture is one that: knows when extra capacity is needed – and when it might be required again; optimises services by moving applications and processes to different providers, based on cost effectiveness; senses when a problem arises and takes steps to fix the problem itself; and automatically configures unified communications for employees and secure connectivity to the core enterprise.

That sounds like a tall order – but Paxman insists it’s not a pipe-dream, just a careful blending of automation, orchestration and advanced analytics technologies.

“This is a journey. The vision I’ll be describing isn’t something any company can create overnight. For most, it will take a phased approach over two to three years – but they’ll need a very clear idea of where they want to be at the end of the journey. So, in other words, they need to understand the destination they’re trying to reach.”

And that destination, he adds, won’t be defined by new technological capabilities, but by better business outcomes: productive employees, innovative new products and services and satisfied customers. But make no mistake, he says: “The race is already on.”

Steve Paxman will present ‘Enabling the Digital Business’ as part of the Digital Transformation Summit at IP EXPO Europe.

 

 

 


Can relayr bring IoT to the data centre?

17 Sep 2014
by
Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014

Despite the venture capital funding rush and the ample column inches dedicated to the subject, in many ways the Internet of Things is a long time coming. For data centre engineers who can visualise the many possible applications of IoT inside the data centre, the wait could be over. Self-service app development on top of pre-configured connection options is how relayr believes we will deliver on this promise, and its WunderBar is positioned for those who cannot wait for suppliers to catch up.

App-building made simple

The WunderBar is designed to allow easy building of innovative apps that connect smart devices together. It provides a platform that will connect different sensors with different smart devices.

The idea for the WunderBar arose from co-founder Jackson Bond’s family situation, and the need for his grandmother to move into a care home. From that came conversations about the needs of older people for care and support. It is, it seems, almost possible to maintain your independence at home with the support of smart sensors, wearables and the like. But there’s still nothing that brings the information from all the sensors together, and enables someone to manage their life. No platform. That’s the gap that the WunderBar fills.


Simple in concept, and mind-blowing in terms of its potential, the WunderBar has six Beacons (BLE), sensors, and Wifi. The sensors include light, colour, distance, temperature, humidity, remote control (infra-red), accelerometer, gyroscope, a sound/noise sensor voted for by the public and a Grove connector. This last allows it to connect with all the sensors in the Grove system. The WunderBar connects all these sensors together, and to other smart devices, via the relayr OpenSensor Cloud, allowing a joined-up approach to app development and the IoT.
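
As an illustration of the kind of glue code this enables, the sketch below subscribes to a stream of sensor readings over MQTT using the paho-mqtt Python library. The broker address, topic and payload format are hypothetical placeholders rather than relayr’s actual endpoints, which are documented in its SDKs:

    import json
    import paho.mqtt.client as mqtt

    # Hypothetical broker and topic, standing in for a real WunderBar setup.
    BROKER = "mqtt.example-sensor-cloud.com"
    TOPIC = "wunderbar/demo-device/temperature"

    # paho-mqtt 1.x style callbacks.
    def on_connect(client, userdata, flags, rc):
        # Subscribe once the connection to the sensor cloud is established.
        client.subscribe(TOPIC)

    def on_message(client, userdata, msg):
        # Assume each message carries one reading as JSON, e.g. {"temp": 21.5}.
        reading = json.loads(msg.payload)
        print("temperature:", reading.get("temp"), "C")

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect(BROKER, 1883)
    client.loop_forever()
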

The WunderBar is also, and possibly most importantly, easy to program. If you doubt this, relayr has run its own team hackathon to test it out. It is happy to report that even the least technically-minded in the team were quickly able to get to grips with the system and what it offered. From a device to protect your property from being stolen, through a gas and alcohol detector, to a proximity-tracking system usable for children, dogs or luggage, as well as a robotic spider, the team’s ingenuity in terms of ideas was impressive. But it’s almost equally impressive that they delivered too, demonstrating the ease of use of their product.

Secure and open

The WunderBar is also secure. You can input your local WiFi credentials and create a secure connection between the WunderBar sensor modules and the relayr OpenSensor Cloud. You can use a secure SSL connection to the cloud, encrypting the data using the ‘Secure-It’ option. Although relayr admits that building security into low-energy devices over domestic networks is a challenge, it’s doing its best to rise to it.
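As a rough illustration of what an SSL/TLS-secured link from a sensor gateway to a cloud endpoint involves (this is not relayr’s actual API; the hostname and payload below are placeholders), a few lines of Python are enough to wrap a socket in TLS and verify the server certificate:

```python
# Illustrative only: a TLS-secured connection from a gateway to a cloud
# endpoint. The hostname and payload are placeholders, not relayr's API.
import json
import socket
import ssl

HOST = "cloud.example.com"   # placeholder for the sensor-cloud endpoint
PORT = 443

context = ssl.create_default_context()          # verifies the server certificate
with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        reading = {"sensor": "temperature", "value": 21.5}
        # Send one JSON-encoded sensor reading over the encrypted channel.
        tls_sock.sendall(json.dumps(reading).encode("utf-8") + b"\n")
```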

relayr is also committed to keeping the IoT open and accessible, by releasing its SDKs and API implementations as open-source projects. It also plans to open-source its hardware as far as possible.

Crowd-funded confidence in an experienced team

Team credentials explain its positioning. relayr CEO and founder, Harald Zapp, has over 25 years’ experience in IT, and was recently an executive at Cisco Systems Europe. Co-founder and Chief Product Officer Jackson Bond has plenty of experience in start-ups, having founded and sold a speech recognition apps company. He is also co-founder of MONOQI. The CTO, Cedric Deniau, has built global-scale transactional platforms for Jesta Digital and eMusic, and Chief Engineer Paul Hopton has experience of building ambitious, secure, high-transaction cloud platforms for Bosch and Number4. There’s a lot of expertise, and a lot of experience there, both technical and commercial.

It’s an impressive team, and it has certainly caught the eye of investors. The team has spent time at Startup Boot Camp in Amsterdam. relayr also launched a crowd-funding bid back in January and had reached its investment target by March. In fact, it raised 22% more funding than it was seeking, and has a delivery date of this summer, so it’s very much a case of watch this space as developers start to use the WunderBar for real.

Hear more from Harald Zapp when he joins a keynote entitled “Extending the platform: using APIs to integrate from bricks to bytes” at Data Centre EXPO on Wednesday 8th October in the Data Centre Keynote Theatre.

Internet of Everything and business – opportunity knocks

15 Sep 2014
by
Rohini Srihari
Rohini Srihari
Rohini Srihari is Chief Scientific Officer at SmartFocus. Rohini Srihari is an educator, scientist and entrepreneur. She has founded several language technology companies, including Content Savvy, a leading provider of content analytics solutions and services to enterprise customers across multiple industries, including the US DoD. Content Savvy was recently acquired by SmartFocus, a company offering context-aware marketing services, and where she now serves as Chief Scientist. Dr. Srihari is also a Professor in the Dept. of Computer Science and Engineering at the University at Buffalo. She has given numerous talks and published extensively in the areas of text mining, social media mining, and multilingual text analysis. She received a B. Math. from the University of Waterloo, Canada and a PhD in Computer Science from the University at Buffalo.

Rohini Srihari, Professor of Computer Science and Engineering at the University at Buffalo and Chief Scientific Officer at SmartFocus, discusses the opportunities the Internet of Everything offers businesses.

We have all inevitably heard the buzz around the Internet of Things (IoT), a future world in which traditionally non-digital devices, like refrigerators and toasters, communicate with each other. In the world of IoT, appliances are networked together via household robots, which in turn communicate with other devices in order to carry out daily tasks.

But what is the Internet of Everything (IoE)?

More importantly, what is the opportunity it presents for businesses? In short, the IoE is what is achievable now. It is where previously disconnected digital objects, systems and data come together in a meaningful way. IoE is an all-encompassing interconnected existence that brings together four main elements:

  1. people
  2. processes whereby people, data, and things interact
  3. data
  4. inanimate objects and devices.


For instance, IoE would allow us to observe the internet ‘trail’ that we create based on our movements, online conversations and actions. When this is achieved, it allows us to model and understand human behavior in a variety of applications. For example, we can take advantage of both mobile and behavioral context to deliver personalized and timely messages to customers.


The Guardian, the leading UK daily newspaper, recently referred to the IoE as “an opportunity to measure, collect and analyze an ever-increasing variety of behavioral statistics. In theory, the mash-up and cross-correlation of this data could revolutionize the targeted marketing of products and services.” (June 20, 2014)

With beacon technology, machine learning and natural language processing, businesses have the capability to deliver personalized, hyper-aware content to prospective customers and partners, engaging customers more effectively and opportunistically. Traditional recommendation systems focus solely on “which” products or services to suggest to customers, based on the habits of people with similar transactional histories. With IoE, businesses have the opportunity to go a step further, and learn “how”, “when” and “where” to make suggestions – what we call “context-aware communication”.

Furthermore, IoE can be used in many other ways – from providing the infrastructure for an application to alert people about events in their neighborhood, to health and wellness applications that help keep people fit and healthy. Mobile platforms like the new iOS and personal robots such as the JIBO robot developed at MIT enable more sophisticated applications, in which behavioral models can be developed to create lifelike interactions.

The IoE opportunity from a technology perspective

From a technology perspective, the fundamental opportunity that the IoE offers businesses is the ability to model customer behaviour and context by connecting a diverse range of digital sources. By understanding how consumers engage with brands through geospatial tracking, beacon sensors, social media, and the customer data we have via digital marketing campaigns, purchasing history and call centre transcripts, it is possible to communicate with customers in the moment, irrespective of whether it’s in store, on their mobile, online or via email.

Machine learning models are now capable of operating on very large and diverse sets of observed data, a development that we are taking advantage of. We have access to data which tells us what contexts (i.e. the combination of customer profile, marketing message, delivery method, etc.) have resulted in successful marketing outcomes. It is then possible to leverage machine learning to understand the correlation between user profiles and the types of recommendations they are likely to be responsive to.
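A minimal sketch of what that modelling step can look like, assuming scikit-learn and a toy feature set (the features, data and conversion labels below are invented purely for illustration, not SmartFocus’s actual pipeline):

```python
# Toy illustration: learn which contexts correlate with successful outcomes.
# Features and data are invented; a real pipeline would use campaign logs.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each observation: customer/context features plus whether the message converted.
contexts = [
    {"segment": "frequent_buyer", "channel": "email",  "time": "evening"},
    {"segment": "new_visitor",    "channel": "mobile", "time": "morning"},
    {"segment": "frequent_buyer", "channel": "mobile", "time": "evening"},
    {"segment": "lapsed",         "channel": "email",  "time": "morning"},
]
converted = [1, 0, 1, 0]

model = make_pipeline(DictVectorizer(sparse=False), LogisticRegression())
model.fit(contexts, converted)

# Score a new context: how likely is this profile to respond to this message?
new_context = {"segment": "new_visitor", "channel": "email", "time": "evening"}
print(model.predict_proba([new_context])[0][1])
```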

With all of this data analysis, however, it is important to understand how brands can use what they know to predict what consumers will want, while still respecting data privacy. The good news is that when creating a model based on behavior, it is not necessary to identify each unique customer. Instead, it is possible to distill the key characteristics needed for a customer profile – data such as demographics, interests and purchasing history. By clustering these anonymous customer profiles, we can create “pseudo profiles” that represent different and distinct customer classes. A new customer can then be mapped to the most similar cluster based on the observed data that is gathered, and that mapping is used to further personalize communications.
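A simple sketch of that clustering step, assuming scikit-learn and an invented set of anonymised profile features (the feature names and numbers are illustrative, not real customer data):

```python
# Illustrative sketch: cluster anonymised profiles into "pseudo profiles"
# and map a new customer to the most similar cluster. Data is invented.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: age, visits per month, average basket value (no identities needed).
profiles = np.array([
    [22,  8,  35.0],
    [25, 10,  40.0],
    [41,  2, 120.0],
    [45,  1, 150.0],
    [33,  5,  60.0],
    [36,  4,  70.0],
])

scaler = StandardScaler().fit(profiles)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaler.transform(profiles))

# Each cluster centre is a "pseudo profile" representing one customer class.
pseudo_profiles = scaler.inverse_transform(kmeans.cluster_centers_)
print(pseudo_profiles.round(1))

# A new customer is mapped to the nearest pseudo profile, and the
# communications personalised for that cluster are used.
new_customer = np.array([[28, 7, 45.0]])
print(kmeans.predict(scaler.transform(new_customer)))
```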

It’s an exciting time and the first businesses to pursue insightful, context-aware communication will no doubt prosper. Businesses now have access to the expertise and ability to analyze content and social media conversations, leverage machine learning to understand what the large sets of data are telling them, use recommendation systems to dynamically create engaging, real-time conversations, as well as deploy mobile technologies to keep in touch with their on-the-go consumers. These are fundamental components of a scalable solution that is able to continually provide consumers with engaging marketing, which is all driven by what we know about customers.

 

Rohini Srihari is Chief Scientific Officer at SmartFocus. At SmartFocus, we have the technology to help you truly know your customer and engage them with the right message, at the perfect moment, wherever they are – in store and online across all of their preferred channels. SmartFocus is the Internet of Everything solution for brands.  It includes beacon technology that knows when your customer visits a physical store; real-time predictive recommendations generated at the precise moment they open your email; and powerful insights that use smart analysis of Big Data to effortlessly generate a truly personal experience. Every year, we execute more than five million marketing campaigns, manage more than five billion customer records, and deliver more than 55 billion personalized interactions. Global brands such as French Connection, Speedo and Mercedes-Benz rely on SmartFocus to deliver business-critical, joined-up marketing campaigns that make a difference to their customers and to the growth of their business.




CIO Survey 2014: Optimism, importance and skills

11 Sep 2014
by
Puni Rajah
Puni
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014
CIO Survey 2014 Harvey Nash

The results are in from this year’s CIO Survey, published by Harvey Nash, the global recruitment and offshoring group. The survey represents the views of 3,211 senior IT leaders across more than 30 countries, and is one of the largest studies of its kind. It makes positive reading.

CIO Survey 2014 – A watershed year

Jonathan Mitchell, Non-Executive Chairman of Harvey Nash’s Global CIO Practice, believes that the CIO Survey 2014 results suggest we’ll look back on 2014 as the watershed year of the global financial crisis. Optimism is creeping into the views of CIOs. Budgets seem to be rising, and for the first time in a decade, CIOs are worrying about skills shortages. So what are the main findings from the survey?

First of all, CIOs are becoming increasingly important to their organisations. More than half of CIOs hold executive posts, with a seat at the ‘top table’, and 44% of organisations now report that they have a CIO, up from just 11% in 2005. This seems to be part of an ongoing trend since the financial crisis. But perhaps what’s even more crucial is the strategic focus of CIOs. A few years ago, it was all cost savings, or at best operational efficiencies. Now, most CIOs report that their focus is on growth, whether building revenue or better engagement with customers. It’s an interesting trend that matches Ian Cox’s work on the changing role of the CIO and IT departments.


Shortage of skills at the root of many challenges

All is not totally positive, of course. Project delivery remains an issue for major business transformation projects, very few of which have been considered highly successful this year. CIOs are also increasingly concerned about skills shortages, and the impact on the organisation’s ability to keep up with the market. The skills shortage is exacerbating problems with project delivery. Nearly 60% of CIOs responding to the survey felt that this was a significant problem, 20% more than last year and the highest level since the start of the recession in 2008.

The biggest shortage is in project management skills, and change management skills have seen the biggest change in demand. Neither of these should come as a surprise given the issues with delivery of major transformation projects, and the challenge of transforming IT departments from supporting players into strategic ones. CIOs report that the skill most valued in candidates interviewing for leadership positions is ‘business understanding’.

The expanding scope of relationship management

Relationships with other functions also remain a challenge for many IT departments. Most have good relationships with operational departments, but struggle with marketing and HR colleagues. And this is a big issue, because in many organisations, it is the marketing function that ‘owns’ the role of Chief Digital Officer, leaving the CIO struggling to make strategic headway. Several companies reported that they have opted for a perhaps more mature, and more successful, approach to the CDO role: co-ownership between IT and marketing. This pragmatic solution allows each to play to their strengths, and should strengthen the relationships between the two functions.

In financial terms, the situation is also positive. IT departments are generally reporting budget increases. Interestingly, many more are planning to outsource infrastructure projects, up 14% on last year, and also service desks and helpdesks. It’s perhaps a sign that, as Ian Cox has predicted, IT departments are becoming more strategic, and moving towards adding customer value to the organisation.

You can request a copy of the CIO Survey 2014 directly from Harvey Nash.