The ultimate technology resource.

Technology.info will inform, educate and inspire. Immerse yourself in the very latest IT research and resources to give your business the edge.


Disrupting IT

27 Aug 2014
by Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO, which takes place alongside IP EXPO Europe at ExCeL London on 8-9 October 2014.

Disruption. It’s what Southwest Airlines and easyJet have done to the staid, formal airline industry. It’s what Formule 1 has done to the travel-lodge/hotel trade in Europe. It’s basically become a synonym for ‘changed beyond recognition, and how come it took us so long to work out that we would like this?’. And now it’s happening in IT. The rise of cloud, mobile internet and machine-to-machine connectivity, to name but a few, has changed the way that we use and view technology on a day-to-day basis.

A brave new world

The last five to ten years have seen a huge change in the way that we use technology. We all have smartphones and tablets at home. Those leaving university now are the first generation of ‘digital natives’: those who cannot remember a world without the internet. If we want an app to do something, we go and get it. And the same applies to the way in which we work. If we want an app, we want to be able to go and get it. We don’t expect to have to wait six months while the IT department thinks about it, and then decides that it doesn’t approve. In fact, there’s evidence that the majority of IT spend now happens outside the IT department.

This is massively disruptive. Many of those working in IT, including CIOs, don’t know how to respond. And that has huge implications for IT departments.

Ian Cox, a former CIO and now a consultant, has been watching and writing about these changes for several years, increasingly focusing on the changes that IT departments and CIOs must make to fit into this ‘brave new world’. He has concluded that many are no longer fit for purpose. His book, Disrupt IT, sets out a new model for a radical transformation of the way in which IT is delivered.

At the heart of the matter is the way in which the IT department, and the CIO, need to change to meet the needs of digital businesses, and to stay relevant. He discusses a new role for the IT department as a collaborative and supportive technology broker. And much of what he says shows why data centres, and especially colocation providers, are a key part of the new model of IT.

IT departments used to function as providers of technology and services. However, there are now many external vendors who can provide the same type of service and, given the lifecycle of the technology involved, they can probably do it much more efficiently and cost-effectively. In-house IT departments need to move from being a service provider to being a service broker, and that means a new set of core competencies, which are much more business-focused: architecture and design, delivery management, data management and vendor management, as well as managing internal relationships and developing an understanding of the business. Instead of being about building and maintaining infrastructure and applications, these competencies are about adding value to the business. And in order to provide value in the new world, IT departments need to focus on them, so that they drive changes in structure, recruitment and the way that the department operates.

Working in a different way

Having focused on the core competencies, the IT department then needs to make sure that it’s really focused time and resources on things that add value to the business. And this means outsourcing the low-value, non-core areas and anything that is not a differentiator.

Outsourcing is routine in business. We all know the mantra: focus on your core competencies, the things that distinguish you from your competition, and outsource anything that requires specialist knowledge or skills, or does not differentiate you. And much of the old business of the IT department can be outsourced, so that the IT specialists within the company can focus on enhancing the customer experience and adding value to the business. Colocation providers can offer the data centre capacity and space that’s needed for this outsourcing, in newer and more up-to-date premises than most companies could afford to equip for themselves, and the newly refocused IT department is well placed to advise on which offers best value.

It sounds simple, but such changes take time and money, and are not easy to achieve in practice. However, they are essential if IT departments and CIOs are to remain useful and functional, and not be bypassed by the rest of the business.

Hear more from Ian at his Data Centre EXPO keynote presentation, It’s time to disrupt IT, on Wednesday, 8th October at 11am.

Cloudera’s Mike Olson: “This isn’t about knocking over old guys and stealing their wallets”

26 Aug 2014
by Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16-year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

Hadoop distributor Cloudera’s co-founder and chief strategy officer, Mike Olson, talks candidly to Technology.info about the open source framework’s growing maturity, that investment from Intel and whether rumours of the death of data warehousing have been greatly exaggerated.

Q: Forrester analyst Mike Gualtieri recently described Hadoop’s momentum as “unstoppable”. Where, in your opinion, does the technology sit today in the minds of enterprise customers?

A: Hadoop’s come a long, long way. Cloudera’s been in business for just about six years now. We were incorporated in June 2008 and, at that time, ‘Big Data’ was not a meme. The idea just hadn’t occurred to most people and Hadoop was pretty rudimentary back then. In the ensuing six years, Cloudera has led the community in driving new capabilities: we’ve added NoSQL engines to our offering, like HBase and Accumulo; we’ve invented, and given away as open source, high-performance SQL query capabilities, in the form of Impala; we’ve integrated other open source projects, like Solr, to deliver search capability; we’ve embraced third-party tools, like the Spark engine from Databricks, for complex analytic and stream processing. We give our customers lots of ways to get at data, as well as capabilities around security, data lineage, governance, back-up, disaster recovery – enterprise-grade features that were missing from the platform that Yahoo and Google first used. That’s a long way of saying that, while the market is still emerging, it’s no longer fair to see Hadoop as an early-stage technology. We’re seeing it adopted across every single vertical, we’re seeing customers roll out large-scale deployments, and pay seven, even eight-figure sums, on an annual basis, to do just that. Broad adoption may be only just now accelerating but the momentum is tremendous.

Q: Back in April, Cloudera announced a huge $900 million round of investment, $740 million of which came from Intel. What can you tell us about that?

A: Well first, I’d say that it’s not just Intel making big investments. Look at the considerable investment that IBM’s making in big data. Look at the investment in Pivotal, the VMware/EMC spin-off, made by General Electric. Large enterprise infrastructure vendors everywhere are ploughing enormous amounts of money and staff into the big data opportunity built around Hadoop.
Talking about our relationship with Intel specifically, I understand why people hear a sum like $900 million and an imputed market capitalisation of $4.1 billion and stop listening after that, because those are just breathtaking numbers. What I want them to hear is that we did not do this deal for the money.
The relationship with Intel does three important things for Cloudera. First, it aligns our two companies behind a single platform that we can bring to market together, rather than battling for customers with our own Hadoop distributions. Intel had its own distro and it was pretty good. It had a lot of features that we, frankly, coveted. By aligning behind a single distribution, we can marry the best of both.
Second thing is, we’re a platform company that relies on channel relationships and partners to help us deliver a full stack of value to our customers. Intel has decades-long partnerships across every sector of hardware and software. It’s deeply respected and genuinely liked by all the big players in the market. So by working with Intel, we create a much larger indirect channel and amplify our selling effort significantly.
The third and most significant point for me, is that the silicon we run on is going to change dramatically over the next few years – the balance between memory, disk, compute and networking will be radically redrawn. Intel knows what the future looks like. And we’re in a position to ensure that the open-source ecosystem takes full advantage of all that silicon goodness, that it adapts to these ratio changes in data centre infrastructure. We’ll be able to deliver a much better, faster, more secure platform to the market broadly, before anyone else will.
Now, Intel believes the opportunity is enormous and wanted to participate meaningfully, so, as a condition of the large commercial relationship, it wanted to take a substantial equity stake in Cloudera. We weren’t looking to dilute our shareholders to that extent. We weren’t looking for $900 million in cash – it’s a crazy amount of money for a software company to raise at this stage. But what we did want was that commercial relationship with Intel, and so we negotiated an investment that we think makes sense for everybody.

Q: Over at your competitor Hortonworks, executives tell us that Cloudera’s message is that the traditional data warehouse is dead. Can you clarify for us where you actually stand on this issue?

A: Everybody wants to know if Cloudera is after Teradata. I say to them, “Look: the opportunity here isn’t to knock over old guys and steal their wallets.” The opportunity is to monetise vast amounts of data using tools that weren’t previously available. Now, Hortonworks and Teradata have a joint go-to-market strategy and Hortonworks CEO [Rob Bearden] is on record saying that Hadoop cannot be real-time, that it will not do interactive queries in the way data warehouses do. Cloudera’s point of view is that that’s ridiculous: not only will it, but right now, today, with our platform, it does. The data centre of the future will be different. The enterprise data hub will absolutely be a core component of every large enterprise’s data management strategy. Data warehousing will evolve to do more things, better – but an enormous amount of what happens in the data warehouse today is moving out. It’s moving out for flexibility, for cost and because most of the innovation is going to happen in open source, not in single-vendor proprietary technologies. We’re not interested in preserving existing markets. We’re interested in innovating so that customers can do more with their data. So our mission really isn’t to replace the data warehouse. It’s to make data management and data infrastructure better.

See Cloudera’s Chief Architect, Doug Cutting, deliver a keynote as part of the Big Data Evolution Summit at IP EXPO Europe, 8–9 October 2014, ExCeL London.

Big Data in action: smart cities meets energy

20 Aug 2014
by Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16-year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

In Austria and the US, two very different big data projects are underway, both with the same goal: optimising the way people consume energy by analysing the data generated by smart meters.

More efficient use of energy and water, fewer traffic jams, better public safety: these are just a few of the promises that the ‘smart city’ concept makes. It’s vital to human health and security that these goals are achieved: according to figures from the UN, more than two-thirds (70 percent) of the world’s population will live in urban areas by 2050, stretching resources, space and, possibly, residents’ patience to their limits.

But in order to be smarter, a city needs data – a lot of it. Sensors and meters embedded in utilities networks, in particular, are already capable of delivering data in huge volumes, but if city authorities and utilities firms are to be able to understand energy usage patterns and potential supply issues, they’ll need a hefty dose of big data technologies to collect, manage and analyse that information.

This idea is central to a giant construction project currently underway on an abandoned airfield on the north-eastern outskirts of Vienna. It’s the site of a new smart city, Aspern, and by next year, if work goes to plan, there will be around 3,240 apartments there. By 2028, Aspern will be home to around 8,500 apartments, along with shops, schools, health centres, offices and a subway station that will transport passengers to or from central Vienna in just 25 minutes.

It’s also the site of one of Europe’s most ambitious smart energy projects – a “living laboratory” in the words of its creators – where researchers hope to establish how renewable power sources, smart buildings and smart-grid technologies might best be combined to power a thriving, eco-friendly community.

That research will be led by Monika Sturm, head of R&D at Aspern Smart City Research (ASCR), a €40 million joint venture formed in 2013 between the City of Vienna, utility provider Wien Energie and industrial giant Siemens.

The plan, she told attendees at data warehousing company Teradata’s recent customer conference in Prague, is to kit out individual buildings at Aspern with different combinations of smart-energy technologies and analyse the results using a range of big data technologies: traditional data warehouses, MPP [massively parallel processing] appliances, and the open-source data analysis framework Hadoop. As we’ll see in the next article [xxxx], the big-data trend is, in some cases, setting these approaches in direct competition with each other – but many organisations continue to embrace whatever tools they need to get analytics work done.

“By analysing the most efficient mixes of technologies and their influence on end-user behaviour, we expect data analytics will lead to new paths for energy optimisation in smart cities everywhere, for the benefit of all,” says Sturm.

In California, meanwhile, utility company PG&E is somewhat further down the line: it’s already the largest US utility to have installed smart meters right across its service territory, which covers 70,000 square miles and 9.4 million residential and commercial properties. Now, the focus is firmly on extracting real business value from that roll-out effort, says Jim Meadows, PG&E’s director of smart grid technologies.

The smart meters that PG&E has installed measure energy use in hourly or quarter-hourly increments, allowing customers to track energy usage throughout the billing month and giving them greater control over the costs of heating, cooling and lighting their homes. They also give PG&E more visibility into its own operations, as well as lowering the costs associated with meter reading and management.

But these 9.4 million meters generate a mountain of data – around 2 terabytes per month, or 100 billion readings per year. This is collected by the company and stored for analysis in PG&E’s Interval Data Analytics (IDA) platform, based on a data warehouse from Teradata. Analytics tools from SAS Institute and Tableau Software, meanwhile, are used to interrogate the data.
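
To make that concrete, here is a minimal sketch, in Python with pandas, of the kind of roll-up an interval-data platform performs. The meter IDs, readings and 15-minute schedule are invented for illustration; this is not PG&E’s actual schema or tooling.

import pandas as pd

# Toy interval data: one meter reporting every 15 minutes (invented values).
readings = pd.DataFrame({
    "meter_id": ["M-001"] * 8,
    "ts": pd.date_range("2014-08-01 00:00", periods=8, freq="15min"),
    "kwh": [0.31, 0.29, 0.33, 0.35, 0.30, 0.28, 0.27, 0.32],
})

# Roll quarter-hourly readings up to daily usage per meter -- the sort of
# view that lets a customer track consumption through the billing month.
daily = (readings
         .set_index("ts")
         .groupby("meter_id")["kwh"]
         .resample("D")
         .sum())
print(daily)

# At 24 to 96 readings per meter per day, 9.4 million meters produce on the
# order of the 100 billion readings a year cited above -- hence a dedicated
# warehouse platform rather than ad hoc scripts.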

“We’re doing our best to focus the company, and all of its different lines of business, on a single data platform for this interval data,” says Meadows. “We made a conscious decision early on to build a platform where data could be cleansed and perfected in a single place and made ready for presentation to business users in a wide range of different ways.”

Beyond Aspern and PG&E, countless other public servants and utilities executives are planning big data projects – and a huge cast of hardware, software and services vendors will be more than happy to assist them.

The main drivers of smart-grid analysis investments, according to a recent report from research company GTM Research, will be to improve asset management for grid components, bring more granularity to demand-side management, speed up outage response times – and achieve a better return on investment for smart meters. GTM Research expects cumulative spending to total around $20.6 billion between 2012 and 2020, with an annual spend of $3.8 billion worldwide in 2020.

Newcomers swell the SQL-on-Hadoop ranks

19 Aug 2014
by Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16-year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

It’s one of the hottest areas in big data science and the choice of tools for running SQL queries on Hadoop platforms is getting bigger all the time.

Steve Shine, CEO at business intelligence company Actian, is convinced his company is onto a winner. The company recently announced the Hadoop Edition of its analytics software, combining high-performance SQL with the company’s visual dataflow framework, running entirely natively in Hadoop.

According to Shine, this addresses a huge need among would-be Hadoop users, who have so far held back on investing in the open-source big data framework because of the relatively high cost of hiring skilled MapReduce engineers. (For more on how the Hadoop market is developing, see our interview with Mike Olson of Cloudera, which will be published shortly.)

“SQL skills are more abundant and more affordable. Most companies with IT teams have them. So for them, it’s a chance to get value from big data with the skills they already have, rather than going out to market for a skillset that is in short supply and tends to get quickly snapped up by technology vendors and consultancy firms.”
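
As a minimal sketch of what that means in practice: the query below is ordinary SQL that could run unchanged against a relational database, submitted here to a Hadoop cluster through one common Python client for Hive (PyHive). The host name, table and columns are invented for illustration, and a reachable HiveServer2 endpoint is assumed.

from pyhive import hive  # assumes a HiveServer2 endpoint is available

conn = hive.connect(host="hadoop-gateway.example.com", port=10000)
cursor = conn.cursor()

# Plain SQL; Hive (or an engine such as Impala) compiles it into
# distributed work across the cluster -- no MapReduce code required.
cursor.execute("""
    SELECT region,
           COUNT(*)         AS orders,
           SUM(order_value) AS revenue
    FROM   sales_events
    WHERE  order_date >= '2014-01-01'
    GROUP  BY region
    ORDER  BY revenue DESC
""")

for region, orders, revenue in cursor.fetchall():
    print(region, orders, revenue)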

But Actian is far from alone in spotting the market opportunity here. The company’s launch of its Hadoop Edition came just one week after IT giant HP upgraded the SQL-on-Hadoop functionality it introduced in late 2013 with its Vertica database. VMware/EMC spin-off Pivotal and MPP (massively parallel processing) database company InfiniDB have also introduced SQL-on-Hadoop products. And among the Hadoop distribution companies, Cloudera is pushing its Impala offering, while Hortonworks is arguably the most active contributor to the open-source Hive effort.

In fact, despite the recent proliferation of tools for running SQL queries against big data stores on Hadoop, Hive remains the most widely used query tool with Hadoop – and executives at Hortonworks claim that the company’s recent Stinger project has done much to improve its overall performance.

That said, there is still a glaring gap in the market: no SQL-on-Hadoop tool yet offers the full range of SQL functionality that data scientists already enjoy with traditional relational databases. In other words, there is still much work to do, according to Mike Gualtieri, an analyst with Forrester Research.

But levels of interest in SQL-on-Hadoop are high, he adds. While Hadoop is generally positioned as an environment in which unstructured data can be analysed, many companies have begun their Hadoop experiments with structured data.

“Hadoop can handle both,” he says. “That’s what’s so interesting about the platform. And, over time, most organisations will do both, but for now, I advise firms to start with structured data and then move onto unstructured.”

After all, he adds, plenty of organisations have vast treasure troves of structured data at their disposal, much of which goes unanalysed today. And as we’ll see in the next article, the Internet of Things trend seems set to fill those stores even further. Gualtieri believes that most companies only ever analyse around 12% of the data they hold, leaving the rest – much of it potentially valuable – “on the cutting room floor”.

Often, that’s because the vast databases and data warehouses needed to collect – and use – the remaining 88% would be prohibitively expensive to buy and maintain. Hadoop, by contrast, provides a low-cost way to gather vast volumes of data from different data sources on commodity hardware. In other words, Hadoop presents an opportunity to bring it all together in one place – but in order to analyse structured data, most companies are still more comfortable using tools and approaches with which they are already familiar.

This is where Actian, Vertica, Pivotal and others could help – by supporting the queries that companies already run against their structured data, but doing it in a more scalable, less pricey environment. Or, as Shine puts it, “We’re making Hadoop more accessible to a wider range of companies – and, frankly, that’s long overdue. We’re making Hadoop industrial-strength, to tackle more of the analysis needs that customers have today.”

Can software bridge diverging data centre demand and supply investment cycles?

18 Aug 2014
by Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO, which takes place alongside IP EXPO Europe at ExCeL London on 8-9 October 2014.

Stuart Sutton is Chief Executive Officer at co-location services company Infinity. His Data Centre EXPO education session explores how data centre design and build practices are changing as part of broader IT solution decisions.

The impact of software-defined architectures is reaching across to data centre design and build considerations. Co-location provider Infinity cites location as only one of 12 reasons customers choose its services. We caught up with Stuart Sutton to explore why this is so, and how this position is helping Infinity grow its business faster than its competitors.

How have your customers’ priorities changed?

The digital economy has been evident in the changing shape of demand for data centres. An increasing number of customers, whether they are part of an IT department or not, are no longer looking simply for rack space for their data. Rather, they are thinking about what they want to achieve by buying data centre space. For example, they want the capacity to manage transactions, including the flexibility to manage peaks and troughs in demand.

This shift in customer priorities has shaped our approach, which has been very successful with the new breed of customers who want to have conversations about how to support their IT strategy, rather than about rack space.

What’s holding back a faster transition to this new approach?

Part of the issue for many data centre providers is that they started out as building suppliers. Buildings, of course, are designed to last for decades, and so is the infrastructure within them, such as air conditioning and heating. While buildings and facilities need to be excellent to attract data centre customers, technology moves on much more quickly. Users and consumers of technology want to move with the times, and update their technology every few months. This has led to a disconnect in thinking between traditional data centre suppliers and their potential customers.

How have you tackled this challenge?

One of the key ways in which we have made our offering more flexible is to move from being a company that built and ran data centres for customers who knew exactly what they wanted, effectively a bespoke data centre supplier, to providing a platform. The advantage is that customers can choose whether they want a high-specification, high-reliability platform, or something that is much more ‘cheap and cheerful’, but still fit for purpose. Effectively, we have created the data centre as a service, rather than as a building in which you put your IT.

As a consequence, we are able to serve a wide range of customers. We can talk directly to end customers as well as managed service providers (MSPs), and offer more flexibility in terms of managing peaks and troughs in demand. In the new world of nimble, cloud-based services, our customers have the flexibility to move between platforms and vendors, but still retain control of their systems and data within a coherent and logical digital strategy, and in a much more secure way.

What’s different about serving MSPs and software developers?

In July 2014, Advanced Computer Software Group (ACS) migrated operations from its five data centres to Infinity. This migration is part of ACS’s portfolio expansion to include cloud-based delivery of its software. Our platform approach has been particularly attractive for companies providing software as a service (SaaS) and managed cloud services, as they need both mission-critical reliability for end users and a more cost-effective, service-based environment for development cycles. These SaaS companies also need the flexibility to take more space when they move customers from old legacy systems to new private or hybrid cloud-based systems. We see this approach as a disruption to traditional data centre design and build processes.

Join Stuart on Wednesday 8th October at 12:20 in the Data Centre Design & Build theatre.

 

Interview: Pure Storage looks to help companies ‘dump the complexity’

15 Aug 2014
by Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16-year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

With his insistence on all-flash everywhere, CEO Scott Dietzen is sending out a radical message to IT decision-makers contemplating the future of their data storage infrastructures.

Flash memory already defines the consumer technology experience. It powers the smartphones and tablets we carry everywhere we go, as well as the social networking sites that we use daily to share with family, friends and colleagues.

So why, asks Pure Storage CEO Scott Dietzen, should the corporate technology experience be any different? After all, he argues, when compared to traditional disk-centric storage arrays, an all-flash array is ten times faster and ten times more space- and power-efficient – and all this at a lower per-gigabyte price point.

Technology.Info met up with Dietzen at Oracle OpenWorld to learn more about the company he heads and the eye-opening claims it makes for its all-flash arrays.

It’s a great time to be heading up a hot new storage company, according to Dietzen, and it’s not hard to see why. Pure Storage may be facing a whole army of similarly young and hot storage vendors, looking to grab the attention and budgets of corporate IT decision-makers, but the company recently got a hefty vote of confidence in its approach, in the form of $150 million in funding for global expansion from venture capitalists. Dietzen claims it’s the largest private funding round ever in the history of enterprise storage and, either way, it brings the company’s total funding to an eye-opening $245 million.

So why the excitement around Pure Storage? The company (alongside a number of rivals) advocates the use of speedy solid-state memory pretty much everywhere in the corporate data centre, as opposed to the hybrid, tiered approach, endorsed by larger enterprise storage companies such as EMC and NetApp, that mixes a bit of flash with a lot of spinning disk. As the cost of storing data on silicon continues to drop, says Dietzen, Pure Storage’s all-flash vision becomes increasingly feasible – and affordable – for mainstream businesses to pursue.

At early-adopter organisations, he adds, the shift is already underway. At Facebook, for example, data is stored on flash for several weeks before it’s shifted to mechanical disk, he says. And at payroll processing company Paylocity, another Pure Storage customer, storage admins have recently unplugged the last of their production disk arrays, replacing it with an all-flash array from his company.

“Mechanical disk is out of gas,” he says. “It may take some companies some time to realise that, but other companies already know it and something remarkable is going on here, with flash [becoming] the driver of a wholesale sea-change in enterprise storage.”

As for his claim that flash might actually cost companies less than spinning disk, he argues that deduplication and compression need to be taken into account when the sums are calculated, as well as the difference between raw storage capacity and actual addressable storage.
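
As a back-of-envelope sketch of that argument, with all figures invented purely to show the shape of the calculation (these are not Pure Storage’s, or anyone’s, real prices):

# Hypothetical $/GB figures, chosen only to illustrate the calculation.
raw_flash_cost = 0.50    # $ per raw GB of flash (assumed)
raw_disk_cost = 0.10     # $ per raw GB of performance disk (assumed)

flash_reduction = 6.0    # assumed inline dedupe + compression ratio on flash
flash_usable = 0.70      # assumed addressable fraction of raw flash
disk_usable = 0.50       # assumed addressable fraction of raw disk
                         # (RAID, spares, formatting, short-stroking)

# Compare on $ per GB of data actually stored, not $ per raw GB:
flash_per_gb_stored = raw_flash_cost / (flash_reduction * flash_usable)  # ~$0.12
disk_per_gb_stored = raw_disk_cost / disk_usable                         # ~$0.20

print(f"flash: ${flash_per_gb_stored:.2f}/GB, disk: ${disk_per_gb_stored:.2f}/GB")

On these assumptions, the effective cost per gigabyte stored favours flash even though the raw price does not; whether it does so in a real deployment depends entirely on the data-reduction ratio the workload actually achieves.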

So this is Pure Storage’s goal: to do for silicon-based storage what EMC and NetApp did for spinning disk. Along the way, Dietzen claims, the four-year-old company has ignored a bunch of approaches from would-be acquirers (he declines to name them) on its mission to build the storage industry’s next-generation heavy-hitter.

Pure Storage is not for sale, he insists, and in any case, the most recent round of funding values the company at an “insane premium” that would halt most prospective buyers in their tracks.

“Long-term independence is the best way to preserve our ability to create value for customers,” he says. An IPO isn’t even on the cards before 2015, because global expansion and R&D are much higher priorities at a time when the company is growing revenues at up to 300 percent year-on-year. “When you’re seeing growth like that, believe me: you’re more inclined to let your bets run.”

The company now has the funds, he says, to replicate the success it has experienced in the US elsewhere in the world, particularly Europe and Asia. “Our experience has been that our customers initially buy us for improved performance, sure, but then they repeat-buy because they’re dumping the complexity that they’ve had to tolerate for a long time in their storage environments,” he says. “That’s something that translates well internationally. We now just need to get that message out there.”

See Pure Storage discuss the Rise of the Flash Fuelled Data Centre at IP EXPO Europe, 8th–9th October 2014.


Overview: Getting value from Big Data

12 Aug 2014
by Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16-year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

Big data is far too important these days to be left entirely to data scientists. In a world awash with data, a working knowledge of analytical methods, technologies and terminologies is an essential skill for business managers.

After all, they’ll be the ones who’ll be signing big cheques for big data – organisations will pour, on average, around $8 million each into big data initiatives in 2014, according to IT market research firm IDC. The company’s survey of over 750 technology decision-makers worldwide, released earlier this year, found that in many cases the spending spree has already begun. Over two-thirds of respondents from large enterprises said their organisations had already deployed or planned to deploy big data projects. Over half (56%) of respondents from SMEs said the same.

In this Technology.info chapter, we take a look at some of the ways that organisations are getting business value from big data and the tools they are using to extract new insights. For example, we examine the burgeoning market for SQL-on-Hadoop tools and chat with Mike Olson, co-founder and chief strategy officer at leading Hadoop distribution specialist Cloudera, about how the market is developing. We also explore the ‘smart city’ concept – a technology intersection where big data and the Internet of Things collide – with a look at how two organisations are using the data generated by smart meters to encourage consumers to reduce their energy consumption.

Register now for the Big Data Evolution Summit at IP EXPO Europe

Given most organisations’ spending intentions in these areas, it’s vital for bosses to have a clear understanding, upfront, of the value they hope to get from their investments – but they will be entering unknown territory.

The good news about big data is that it makes it possible for executives to ask questions they’ve always wondered about – and get rich, multi-dimensional answers back in return. The bad news about big data is that traditional methods and tools for data analysis, with which bosses might enjoy a passing familiarity, have a nasty tendency to fall short in environments where the pressure is on to explore vast volumes of both structured and unstructured data, from a wide variety of both internal and external sources.

In other words, companies looking to get business value from big data must familiarise themselves with a world populated by new technologies with strange, unfamiliar names: Pig, Hive, Flume and Sqoop, to name but a few.

On the plus side, cloud technologies will help: there are many ways now to access big data tools and expertise on an on-demand basis, and the IT infrastructure needed to store and process big data can be supplied by a host of cloud providers, such as Amazon Web Services and Microsoft Azure.

With smart choices, big data can mean big insight – if organisations have the ability to turn insight into actions that create competitive advantage. Delivering on that final step, in fact, will likely be the biggest challenge of all.


Advanced Persistent Threat: a new cyber attack ecosystem emerging

11 Aug 2014
by Paul Fisher
Paul Fisher is the founder of pfanda - the only content agency for the information security industry. He has worked in the technology media and communications business for the last 22 years. In that time he has worked for some of the world’s best technology media companies, including Dennis Publishing, IDG and VNU. He edited two of the biggest-selling PC magazines during the PC boom of the 1990s, Personal Computer World and PC Advisor. He has also acted as a communications adviser to IBM in Paris and was the Editor-in-chief of DirectGov.co.uk (now Gov.uk) and technology editor at AOL UK. In 2006 he became the editor of SC Magazine in the UK and successfully repositioned its focus on information security as a business enabler. In June 2012 he founded pfanda as a dedicated marketing agency for the information security industry, with a focus on content creation, customer relationship management and social media. Paul is the Editorial Programmer for Cyber Security EXPO, which runs alongside IP EXPO Europe, 8-9 October, ExCeL London.

Uri Rivner is VP Business Development & Cyber Strategy at Israeli start-up BioCatch. His keynote at Cyber Security Expo focuses on Advanced Persistent Threats (APTs).

1 How worried should CISOs and CEOs be about Advanced Persistent Threats (APTs)?
Obviously you should ask the CEOs, but advanced persistent threats have become mainstream. Three or four years ago, they were a novelty – not any more. If you care about your intellectual property, you’ve got to take this threat seriously.

Then there’s the serious impact on the bottom line, the cost of these attacks. And the damage to the reputation of your business, the loss of trust among customers and partners. And of course, you may not even be the final target, merely a stepping stone to a bigger one, with all the legal implications of that. This is the reality, and the starting point of this discussion.

2 By focusing on Advanced Persistent Threats, are we in danger of missing more threats from more conventional sources?

We are still spending around £70bn every year on something that doesn’t work: paying for IPS, IDS, web filtering and antivirus – the traditional basics of security. This cannot continue; security is not working. I don’t want to pay that much for a commodity. So there’s no danger of missing old threats, because you have those covered – but the new threats are much more serious and insidious, and conventional defences cannot cope.

These are Advanced Persistent Threats, hacktivism and cyber crime, with criminals now working right inside corporations. Three years ago, criminals were just after employees’ money, hijacking online banking sessions. Now they’re after the business itself, with APTs used to steal data and IP.

We need security that will perform detection, investigation and resilience. In other words, cyber-intelligence.

3 In your experience, is the APT problem getting worse? Without giving away too much of your talk, what new techniques are the bad guys using?
There is evidence of a blurring of lines between cyber criminals, hacktivists and state actors – all assisting each other towards their respective goals. And many more nations are now involved; it’s not just China anymore. The actors have become very adaptive and cunning and, as I mentioned, they are attacking new vectors such as the supply chain. We are seeing almost the emergence of an attack ecosystem, where it is difficult to identify who is who. A kind of merging of attackers.

4 Tell us a little bit about your new company BioCatch.
Recall the famous scene in Blade Runner in which a replicant is interrogated to see whether he is human. Our technology does something similar with cognitive science. At the core of the technology lies a unique mechanism we call Invisible Challenges. This mechanism governs the interaction of the user with the application: whenever a user interacts with an application, a subtle, dynamic cognitive challenge is injected, and the user responds without being aware that it was there.

Each user reacts differently and has a unique Cognitive Signature. If a deviation from the regular behavioral profile is spotted at any point during a session, BioCatch immediately senses foul play and sends out an alert of a possible threat. The best thing is, it works!
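
BioCatch’s actual models are proprietary, so purely as an illustration of the general idea – not the company’s implementation – a deviation check against a stored behavioural profile might look something like this, with the feature (response time to an injected challenge) and the alert threshold invented for the example:

from statistics import mean, stdev

def deviation_score(profile_samples, session_value):
    """How many standard deviations this session sits from the stored profile."""
    mu, sigma = mean(profile_samples), stdev(profile_samples)
    return abs(session_value - mu) / sigma if sigma else 0.0

# Past response times (ms) to the invisible challenge for this user.
profile = [142, 150, 138, 145, 151, 147]

score = deviation_score(profile, 245)  # this session's response time
if score > 3.0:                        # flag anything beyond ~3 sigma
    print(f"possible foul play (score {score:.1f}): send alert")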

We are in talks with top banks in the UK, Spain and Italy and a major public cloud provider.

5 What are you looking to get out of Cyber Security Expo 2014?
In the banking sector, it’s not just about stopping fraud anymore. It’s about the fraud, friction and functionality conundrum. To defeat fraud, you add more security, which adds friction and a loss of functionality. But ideally, we need to reduce friction to boost functionality.

So I’d like to talk to visitors from other industries at Cyber Security Expo to see if they have the same problem in their organisations. For example, are they under pressure to increase functionality through BYOD and Instant Access etc, but are they managing this without increasing friction? I’d like to find out.

6 What are you hoping for in the year ahead in terms of security?
Look, there’s always going to be more fraud. I want to see how people move beyond that: are you stopping business or enabling business? How do we make it so that the business still operates? You have to find clever ways of increasing security without hindering the business.

Join Uri on Thursday 9th October, 13:40 – 14:10, in the Cyber Security EXPO Keynote Theatre.

 

Balancing security and flexibility in the cloud

08 Aug 2014
by James Carnie
James Carnie is head of solutions architecture at UK cloud company, Adapt.

Security concerns still make businesses wary of the cloud. Whilst companies are aware of its benefits – speed of deployment, scalability, capital expenditure savings, flexibility and technological innovation offered by service providers – reservations and scepticism still remain.

The fear is understandable – businesses typically like to be able to see and touch their own servers and know exactly where their data is being held, whereas cloud services are by definition intangible. Businesses are unsure how safe the cloud is – so much so that a past LinkedIn survey revealed that 54 per cent of IT decision-makers cite data security as the key inhibitor to cloud adoption.

Essentially, an organisation’s cloud security strategy should not be fundamentally different from that of its physical infrastructure. An effective security management policy requires a holistic approach, and by following best-practice guidelines, businesses can distil their own cloud security concerns and make sure their chosen cloud provider has the right approach to data protection. This will enable organisations to enjoy the savings and flexibility offered by the cloud without compromising their own security requirements.


Cloud security best practices

Any company planning on migrating to the cloud should ask some searching questions beforehand to make sure their provider offers watertight security as part of their cloud service. However, the first question should be whether the cloud is even right for them. Businesses should thoroughly assess their needs and current IT situation before outsourcing to the cloud: not all apps and data are ready to be moved there, and moving them may actually increase costs rather than reduce them.

It should be noted that the cloud is dependent on people, processes, technology and location. Conducting a thorough assessment of these key areas can give organisations a level of confidence that can help them overcome the fears associated with migrating to the cloud.

People

The greatest threat to any organisation is users with privileged access doing something they shouldn’t, whether it is done accidentally or with malicious intent. However, the risks can be lessened if the service provider vets employees using simple background-checking processes and ensures they have the relevant level of competency to do their job.

Processes

Data security is not just about implementing the latest firewall, robust threat management systems or having the most secure site. The effective management and administration of a data centre is just as vital. Additionally, most outages are not caused maliciously, but are due to mismanagement of the systems. A simple misconfiguration during an upgrade can cause an entire system to shut down. The continuity of business-critical information must therefore be effectively managed. Any analysis of a potential vendor requires full understanding of its internal processes and procedures and how change is handled.

Organisations need to adopt an information security management system (ISMS), such as one certified to ISO 27001, and a set of robust change management processes, such as those laid out by ITIL.


Technology

One advantage of the cloud is that it can be accessed from anywhere at any time. Yet this constant availability also means that it will always be subject to attack. If a data centre is (or appears to be) inadequately protected, the infrastructure held there will be vulnerable to hackers.

Single-tier firewall solutions will not deter hackers targeting confidential information – a multi-tiered, defence-in-depth approach will offer much greater protection. Does the service provider offer leading-edge technology to deter attacks? Does it offer basic perimeter firewalls or advanced application and host-based anomaly detection? Public-facing companies, which often attract the highest number of attacks, should consider whether a provider’s security coverage includes real-time threat monitoring, log management and, importantly, distributed denial-of-service (DDoS) attack mitigation to protect their customer data.

Location

Many firms do not know where their provider’s data centres are located but this should in fact be one of the most important questions in the selection process – their facilities will house an organisation’s private data and mission critical systems.

With varying degrees of data protection legislation around the world, businesses should also consider the implications of placing their operations in a location where laws are not as stringent as in their domestic market. Larger organisations are increasingly concerned with where their data is being held, not only because of data protection issues but also because of surveillance legislation such as the Patriot Act, which allows the US government access to data held on US soil.

Additionally, it is not just the location of your compute estate – you need to consider where the administrative staff are based and ask if information is leaking outside the geographic jurisdiction of your choice.

Finally, it is also important to assess whether the site is in a secure location or is susceptible to adverse environmental conditions that may cause outages. If an outage does occur, are there resilient failover capabilities to ensure seamless continuity of service?

Flexibility

Organisations place a significant level of trust in their chosen cloud provider, which makes recommendation one of the most powerful tools in deciding whether to adopt cloud services and from whom.

While a provider may tick all the right boxes, there is still a level of risk when it comes to migrating to the cloud. These potential risks need to be weighed against the flexibility the cloud provides. Businesses that regularly work with highly sensitive data, in industries such as the public sector, healthcare and finance, may want to opt for their own private cloud environment. However, this approach limits an organisation’s flexibility to expand beyond its constrained hardware investment: when it reaches capacity, it has to invest quickly in additional hardware to support increased demand. With public cloud services, businesses can order more capacity instantly, scaling up or down as required.

Ultimately, businesses need to assess their own needs carefully and be mindful of the risks. Utilising cloud services can provide significant, well-documented and measurable benefits – however there is no silver bullet to reduce the risks. Working collaboratively with chosen cloud providers and ensuring the necessary processes are in place can give organisations the level of confidence that overcomes fears associated with the cloud.

Global retail brands keynoting at IP EXPO Europe 2014

08 Aug 2014
by Mike England
Mike is the Content Director at Imago Techmedia, which runs IP EXPO Europe, Cyber Security EXPO and Data Centre EXPO.

Europe’s leading IT infrastructure and cloud event, IP EXPO Europe 2014, has announced retail as one of its key areas of focus for this October’s event. Insights will be shared by payments giant Visa Europe, multinational consumer goods giant Unilever, global coffee corporation Starbucks and world-leading travel group TUI Travel. All four global brands will explain how technology enhances customer experience in the multi-channel retail sector.

Mark Steel, CEO of IP EXPO Europe, stated: “In the increasingly competitive retail sector, it is imperative that brands keep ahead of technology developments to drive business success. Few have done this more effectively than those we are announcing today. Whether it be the challenges of managing Big Data, how to enhance overall customer experience or ensuring online retail security, this year’s event has everything retail IT professionals need to put technology at the heart of business growth and efficiency.”

Highlights at IP EXPO Europe 2014 will include:


Robert Teagle, EMEA IT Director for Starbucks, will take the stage in the Digital Transformation Summit, a new one-day programme that takes place within the main IP EXPO Europe keynote theatre on Thursday 9th October. Robert will be sharing his recipe for managing innovation in the retail sector – what processes Starbucks seeks to employ and how success is measured. He will also demonstrate how the company’s enthusiasm and dedication within the drinks industry is reflected in its innovative approach to IT.

Yas Poptani, Group Chief eCommerce Architect for TUI Travel, will present in another new programme being introduced for 2014, the Big Data Evolution Summit. Run in association with Cloudera, this dedicated event will consist of a Hadoop Business Use Case Theatre and a Hadoop Technology Takeaways Theatre, as well as a bustling partner zone that gives attendees the opportunity to engage with the experts and see solutions in action. Yas will be presenting in the Hadoop Business Use Case Theatre on 9th October. Poptani will discuss how the company approaches Big Data and its integration into a complex multi-solution environment. As one of the world’s leading travel groups, TUI is on a mission to drive better customer experience, enabled by data. Poptani will touch on the risks associated with data-driven technologies in the travel sector, and how approaching Big Data management in a phased manner will ultimately provide a more streamlined customer experience.

Steve Wright, Global Privacy Officer for Unilever, will be speaking in the Cyber Security EXPO keynote theatre on 9th October. Wright will be discussing how to solve the privacy-versus-marketing conundrum. At a global brand like Unilever, this means maintaining a balance between the requirement to protect personal data and meet global legislative demands, and the use of that data to provide consumers with the brands and products they want. In a recent exclusive Technology.Info interview, Wright touched upon the area of Big Data: “It’s how we manipulate, how we analyse and interrogate data which has come on leaps and bounds in the last few years. It should be called the ‘big ability to analyse data’, perhaps.”

Darron Gibbard, Head of Enterprise Risk Services for Visa Europe, will speak in the Cyber Security theatre on 9th October. Gibbard will pose the question “What constitutes your network anyway?” and discuss how network protection is more complex than ever before, thanks to cloud and BYOD, examining where network protection starts and ends. Prepare to unlearn everything you know about network security and walk away with some new thinking.

 

Registration is now open for IP EXPO Europe and both of the brand new events co-located for 2014, Cyber Security EXPO and Data Centre EXPO.




DCAM versus DCIM

06 Aug 2014
by Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO, which takes place alongside IP EXPO Europe at ExCeL London on 8-9 October 2014.

In our chapter introduction, we pondered the possible reasons DCIM has been slow to take off. Steven Beber, Managing Director of Trackit Solutions, thinks he may know the answer. He suggests that it may be because DCIM does not offer immediate returns on investment, especially for smaller companies with simpler needs. In fact, it may be better for some companies to opt for a Data Centre Asset Management (DCAM) solution, which will be quicker and more straightforward to implement, and often more cost-effective.

Using DCIM

Companies tend to turn to DCIM for two related reasons. The first is to enable them to control and manage their data centre’s energy consumption, especially in view of governmental interest in energy efficiency in data centres. The second reason is that DCIM solutions enable the integrated management of IT and facilities, two areas which have not commonly been managed side-by-side, and whose systems are generally incompatible. An overarching DCIM solution can save months of work on incompatible legacy systems.

However, there are real issues with getting immediate value from DCIM solutions. You need to put in an awful lot of data to get anything of value out, and many companies just do not have that sort of data, especially not of the quality necessary to rely on. As the old adage goes: garbage in, garbage out. It is therefore possible that what companies need first is some kind of asset management tool to tell them just what they’ve got and what it’s doing, and that’s where Data Centre Asset Management (DCAM) comes in.

DCAM as an alternative

Steven Beber launched Trackit Solutions back in 2008 to fill a gap in the DCIM/DCAM market. The company delivers data centre inventory and validation services, and has audited millions of assets across six continents since 2008. In the process, company staff recognised a gap in the market: the most efficient way to track IT was surely to use IT, but there was no really good tool to do so. They set about creating one, in the form of Trackit.

Trackit, as the name suggests, tracks, validates and updates information on the assets in data centres. Customers can rely on the data provided by Trackit to make strategic decisions about the use of their assets. It sounds simple, but a good many data centre managers don’t have this kind of reliable information at their fingertips. And certainly, if you are to get value out of a DCIM solution, you need this basic data.
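
To make the idea concrete, here is a minimal sketch of what ‘validating’ an asset record can mean in practice. It is purely illustrative – the fields, ranges and sample record are invented, and this is not Trackit’s implementation:

from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    model: str
    data_hall: str
    rack: str
    rack_unit: int        # position in the rack
    last_verified: str    # date of last physical audit, ISO format

REQUIRED_FIELDS = ("asset_id", "model", "data_hall", "rack")

def validation_issues(asset):
    """Return a list of reasons this record cannot yet be trusted."""
    issues = [f"missing {f}" for f in REQUIRED_FIELDS if not getattr(asset, f)]
    if not 1 <= asset.rack_unit <= 48:   # invented plausible rack height
        issues.append("rack unit outside plausible range")
    return issues

server = Asset("SRV-0042", "Dell R720", "Hall 2", "", 12, "2014-06-30")
print(validation_issues(server))   # ['missing rack']

Only once every record passes checks like these does it make sense to feed the inventory into a DCIM tool for energy and capacity analysis.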

The 80-20 rule

Trackit is also efficient. Because the company was familiar with data centre audits, staff knew that they could improve processes: for 23 typical individual processes, using Trackit improved efficiency by over 50%. It also integrates with DCIM tools, with the added confidence that the data is accurate. And if you don’t have an existing DCIM tool, the company suggests that Trackit gives you 80% of DCIM functionality at 20% of the cost.

Hear more from Steven Beber at Data Centre EXPO.

 


The challenge of protecting the world’s biggest telco

06 Aug 2014
by Paul Fisher
Paul Fisher is the founder of pfanda - the only content agency for the information security industry. He has worked in the technology media and communications business for the last 22 years. In that time he has worked for some of the world’s best technology media companies, including Dennis Publishing, IDG and VNU. He edited two of the biggest-selling PC magazines during the PC boom of the 1990s, Personal Computer World and PC Advisor. He has also acted as a communications adviser to IBM in Paris and was the Editor-in-chief of DirectGov.co.uk (now Gov.uk) and technology editor at AOL UK. In 2006 he became the editor of SC Magazine in the UK and successfully repositioned its focus on information security as a business enabler. In June 2012 he founded pfanda as a dedicated marketing agency for the information security industry, with a focus on content creation, customer relationship management and social media. Paul is the Editorial Programmer for Cyber Security EXPO, which runs alongside IP EXPO Europe, 8-9 October, ExCeL London.

Richard Knowlton is Group Security Director at Vodafone. He is appearing at Cyber Security Expo in a keynote presentation on Wednesday 8th October.

His global role with the world’s biggest telco brings substantial risks and pressures. Dialling in from his base in Italy, he focused on some of them in a recent interview with us.

Since we last spoke, the world seems to have got more dangerous – not less. How has this impacted on the risks that businesses like Vodafone face?
In general, the broad security threats that multinationals face don’t tend to change much over time. State actors, organised crime, hacktivists and extremists are very much the background noise to my life.

However, a specific concern for Vodafone is data attacks. Our crown jewels are customer details: call content, location data, who our customers are calling. Stock-price-sensitive information such as M&A plans also makes us a juicy target.

Then there are attacks launched across our networks aimed at our major enterprise clients. We are also part of the critical infrastructure, and all of these factors play into the complex risk management framework that Vodafone operates.

Are huge multinationals like Vodafone increasingly a target for hackers, criminals and hacktivists?
The simple answer is yes. Every month we face 70bn cyber events and block about 250,000 attempts to hack into our networks. Out of those huge numbers, about 60 or 70 events warrant really serious attention.

As a business, Vodafone is expanding fast into new markets, which makes us an even bigger target. More generally, attackers are developing in sophistication at an extraordinary rate. To combat this, close cooperation with industry peers and government is essential.

But a criticism of many large enterprises is that they won’t share intelligence on cyber attacks as they feel it would be bad PR?
The sharing of intelligence on cyber attacks and defence best practice is essential, and we all need to get better at it as fast as possible. We all need each other.

The UK’s Cyber Security Information Sharing Partnership (CISP) is a good model, but really valuable exchanges need a mutual level of trust. The telecoms, finance and defence sectors tend to work well together because they have been the subject of extreme attacks for much longer. The intelligence exchange carries on behind the scenes and largely carries no PR risk.

There are of course tricky issues around notifying customers when a breach has been identified. Our approach is based on a firm principle: the customer is king. Transparency is the only way we will retain the confidence of our customers.

Smartphones and social media are becoming a third front in wars and conflicts. How hard is it to resist calls to shut down networks in troubled operating regions?
The Arab Spring was a wake-up call that we were now in the front line, and not just us but the whole telecoms sector. So we took a cross-sector approach with eight industry partners and published our guiding principles, for anyone to read.

These set out how we manage such situations. If there is a deteriorating situation in country X and the government asks us to switch off our network, the principles state exactly what we do: we will make that request public, questioning whether this is really what they want to do.

We were the first global telecoms company to publish full information on what national governments demand from us and how far we go along with them. This was published in June in our Annual Sustainability Report, and it has had a big impact around the world. Many other telecoms companies are now putting the issue at the top of their agendas.

It’s an area in which we are absolutely determined to strike the right balance between our customers’ rights and our legal obligations.

What is your hope for the world of cyber security in 2015?
First, that all of us do everything possible to encourage greater co-operation to protect ourselves against cyber attacks. Intelligence exchange is crucial, but we also need to focus on the knotty public policy issues.

For example, on EU law we need to ensure there is a public-private partnership approach to legislation. We don’t want extra layers of legislation, which are often irrelevant by the time they reach the statute book.

I’m the Vodafone representative on the board of the Internet Security Alliance, and I am working very hard to establish a European version (ISAFE) to help thrash out such matters.

My second hope is for more security awareness at board level. Too many still see cyber security as a technical issue: they throw money at the CISO and CIO and hope the problem will go away. We have to make it an enterprise-wide risk issue. The ISA has just published a paper on principles for managing cyber risk for boards of directors; it has been downloaded 9,000 times already.

Finally, what’s the best thing about being based in Italy?
This question is more difficult than it seems! Lots of people would say the people, the food and the wine, but I think the greatest thing about Italy, and Sardinia in particular, is that the work-life balance is geared much more towards the life bit. In an international job like mine, it’s really important to get that right.

Richard Knowlton’s keynote presentation titled “Keeping the world’s biggest phone business safe and its shareholders happy” will take place at Cyber Security EXPO in London ExCel this October.



Cyber Security EXPO Panel: BYOD – why we can’t just say yes?

05 Aug 2014
by
Mike England
Mike is the Content Director at Imago Techmedia which runs IP EXPO Europe, Cyber Security EXPO and Data Centre EXPO

As the post-PC revolution gains momentum, and dependence on tablets grows, IT departments are receiving repeated requests from employees to bring their own device to work.

BYOD is said to increase productivity, thanks to employees’ familiarity with their own software, and it removes the initial hardware investment – so why isn’t the answer a simple yes?

Finding answers to this and other questions, and understanding the risks associated with modern digital working, will be popular themes at Cyber Security EXPO 2014.

Join us in this exclusive panel debate, in which Paul Fisher, Conference Producer, Cyber Security EXPO, will be joined by Quentyn Taylor, Director of Information Security, Governance and Risk at Canon, giving real-time insight from a leading global organisation. Also joining the panel is Stratos Komotoglou, EMEA Product Marketing Manager at MobileIron. Stratos has vast experience of mobile security and networking, and a great ability to connect technical implications to strategic business decisions.

Attendees will learn:

  • How to successfully implement a BYOD strategy and factor in security, risk and compliance
  • The importance of data classification in any BYOD roll out
  • The advantages and disadvantages associated with BYOD
  • The difference between BYOD and CYOD – is there one?
  • Can you phase in a BYOD roll out?
  • Why young people will demand to bring their own device
  • The implications of the Apple/IBM enterprise deal for BYOD and the post-PC business environment

Register below and join this year’s Cyber Security EXPO conversation early



2014: a candidates’ market for skilled, experienced IT professionals

05 Aug 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

Employers can’t afford to hang around when it comes to hiring the top talent with the most sought-after skills, warns recruitment firm Robert Half

For IT professionals with the right skills, 2014 is looking like a candidates’ market when it comes to job opportunities: salaries are rising and those candidates with the most sought-after skills are receiving multiple job offers, according to a new report from recruitment firm Robert Half.

In fact, the report warns, employers must be careful not to spend too long weighing up the pros and cons of a potential hire: those looking to secure the industry’s top talent “are finding that lengthy interview rounds are prompting their top choices to accept competing offers.”

The findings of the Robert Half Technology 2014 Salary Guide echo another recent report from the UK Commission for Employment and Skills (UKCES), which estimates that employment in the UK IT industry grew 5.5 percent between 2009 and 2012, more than three times the UK average.

The biggest salary increases for IT professionals, meanwhile, are for roles that demand specific experience in big data analytics; ecommerce and mobile; and IT security. Salaries for all those roles will rise year-on-year by 4 percent or more in 2014, outpacing the average UK salary rise rate of 2.5 percent, says Robert Half’s report.

This buoyancy in the UK IT jobs market, says the firm, is being driven by two key areas of business need. First, organisations are looking to enhance employee productivity by updating desktop and business applications. Second, they’re investing in customer-facing initiatives to provide a more dynamic online experience. According to e-skills, the UK’s Sector Skills Council for Business and Information Technology, around half of firms blame IT skills gaps for delays in developing new products or services, thereby hindering plans for further growth.

“Technology is a key element in the recovery of the UK economy, whether it’s supporting new propositions or helping to improve business operations,” said Phil Sheridan, senior managing director at Robert Half Technology. “Companies will use technology to grow their businesses as we emerge from the economic downturn: candidates with a strong sense of commerciality and business acumen as well as technical skills are increasingly called on to manage product development lifecycles and deliver business solutions.”

When it comes to specific job roles, the positions offering salaries that will rise 4 percent or more between 2013 and 2014 are: database/business intelligence developer; web designer; information security manager; information security officer; mobile applications developer.

The best-paid role overall, naturally, is chief information officer, offering average remuneration of between £120,500 and £230,000 across the UK as a whole, and ranging from £155,500 to over £297,000 in London. CIO salaries will rise 2.3 percent year-on-year in 2014, says Robert Half.

Chief information security officers (CISOs) in the UK can expect to command a salary between £75,000 and £134,500 (up 3.5 percent year-on-year) and chief architects will be paid between £78,750 and £143,500 (up 2.4 percent).

The Robert Half Technology 2014 Salary Guide covers a wide range of roles, geographies and industry sectors, and is now available for download.


Chasing and detecting evasive malware

05 Aug 2014
by
Paul Fisher
Paul Fisher is the founder of pfanda - the only content agency for the information security industry. He has worked in the technology media and communications business for the last 22 years. In that time he has worked for some of the world’s best technology media companies, including Dennis Publishing, IDG and VNU. He edited two of the biggest-selling PC magazines of the PC boom of the 1990s, Personal Computer World and PC Advisor. He has also acted as a communications adviser to IBM in Paris, and was Editor-in-chief of DirectGov.co.uk (now Gov.uk) and technology editor at AOL UK. In 2006 he became the editor of SC Magazine in the UK and successfully repositioned its focus on information security as a business enabler. In June 2012 he founded pfanda as a dedicated marketing agency for the information security industry, with a focus on content creation, customer relationship management and social media. Paul is the Editorial Programmer for Cyber Security EXPO, which runs alongside IP EXPO Europe, 8-9 October, ExCel, London.

Giovanni Vigna is CTO at Lastline. He is appearing at Cyber Security Expo in a keynote presentation that analyses evasive malware techniques. We caught up with him for a quick chat about the current state of anti-malware techniques and the inspiration behind his upcoming talk. Here’s what he told us.

1 Are our existing tools for combating cybercrime becoming redundant?
I wouldn’t say redundant; we still need antivirus and other conventional tools. But the rapidly changing nature of malware requires a multi-faceted solution. Malware is a fast-moving target. It has also become personalised, and targeted against individuals in the enterprise, so we need increased customisation of anti-malware, and a mix of anti-malware solutions. By the same token, the bad guys are evolving malware to avoid detection in increasingly sophisticated ways. And there are very different sets of targets: home users and CEOs get attacked by very different types of malware, and targeted attacks can now bypass traditional barriers to reach very high levels in an organisation. Then we have opportunistic attacks. Even if just 10% of the 10% of ransomware attacks that hit their target result in payment, it’s still lucrative for the cyber criminals.

2 Your talk sounds fascinating, particularly this concept of evasiveness. Can you explain briefly what you mean?
Evasiveness has become an important side of the game. Malware is becoming evasive in many different ways, using stalling techniques to prevent detection or even altering its fingerprint. So we need to detect this evasiveness and identify whether a website or a network is being attacked. If we can identify that the malware is looking around, checking whether it is being analysed, then we can see that it is malware. But of course it is hard to do this without sophisticated analysis techniques – which is what I will be expanding upon at Cyber Security Expo!
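To illustrate one of the techniques Vigna mentions (this is our own sketch, not Lastline’s detection engine), consider stalling: malware that sleeps away the sandbox’s analysis window so its payload never runs under observation. A toy heuristic over a recorded API trace might look like this, where the API names, the 30-second window and the 90% threshold are all illustrative assumptions:

```python
# Toy model of one evasiveness heuristic: a sample that spends almost all
# of its early runtime inside timing/delay APIs may be stalling to outlast
# the sandbox's analysis window. The API names, window and threshold are
# illustrative assumptions, not any vendor's real detection logic.

STALLING_APIS = {"Sleep", "NtDelayExecution", "WaitForSingleObject"}

def looks_like_staller(trace, window_s=30.0, threshold=0.9):
    """trace: list of (api_name, seconds_spent) pairs from a sandbox run."""
    elapsed = stalled = 0.0
    for api, seconds in trace:
        elapsed += seconds
        if api in STALLING_APIS:
            stalled += seconds
        if elapsed >= window_s:
            break  # only judge the analysis window, like a real sandbox
    return elapsed > 0 and stalled / elapsed >= threshold

busy_trace  = [("CreateFileW", 0.3), ("ReadFile", 0.6), ("Sleep", 2.0)]
stall_trace = [("Sleep", 9.5)] * 4 + [("CreateFileW", 0.1)]

print(looks_like_staller(busy_trace))   # False: real work dominates
print(looks_like_staller(stall_trace))  # True: delay calls dominate
```

Real systems are far more sophisticated, not least because, as Vigna notes, malware that detects it is being watched will simply behave differently; but the sketch shows why behavioural traces, rather than static signatures, are where evasiveness shows up.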

3 Is there still a place for antivirus in the modern enterprise?
Yes. I don’t think that antivirus is dead; it works for certain types of threats, but the antivirus community is having a hard time keeping up with modern malware. Antivirus uses signature recognition and static analysis, which provide a very useful signal in many cases. Unfortunately, we now see threats that easily bypass traditional antivirus. Any organisation that uses AV as its sole protection is making a huge mistake.

4 Is academic research underused in the battle against cybercrime?
I cover both the research and commercial worlds. It is of course very important that we take academic cyber security research and try to make it work in the commercial world, but many academic results are very difficult to transfer. For example, the evasive behaviour that was studied in the academic world a few years ago was considered cutting edge and rare in the real world; now it has become commonplace. So it’s important that we take notice of the research being done right now on new and rare threats, before they become common and substantial ones.

5 Should security vendors share more information with each other?
Of course they should, yes; it’s a great way to improve the state of the art in cyber security. The problem is that information and intelligence are a major differentiator between vendors, so sharing is not a feasible strategy from a commercial point of view: if everybody shares the intelligence, the competitive advantage disappears. Take APTs – a company like Mandiant, which made it its business to track APTs, would not exist. So for political and strategic reasons such cooperation is unlikely. And a forum for shared intelligence would be very difficult to regulate; someone could simply suck out the intelligence and use it against you competitively. Microsoft, by contrast, has become very good at sharing information about vulnerabilities with security vendors. If every software company did the same, everybody would win – and it would be commercially viable.

6 What are you most looking forward to at Cyber Security Expo this year?
I’m looking forward to seeing innovation: new and radical approaches to security. There is so much going on that we need to be aware of in mobile, BYOD, APT and other areas. I am sure Cyber Security Expo will have much to offer in this respect.