
Accenture: Unlocking the intelligent infrastructure

18 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

In the race to achieve digital transformation, many organisations are straining their IT infrastructures to breaking point. The business consequences of IT systems under intolerable pressure are only too clear: customer-service failures, supply-chain delays, stalled innovation and compromised security.

It doesn’t have to be this way, says Steve Paxman, managing director in the Infrastructure Services Practice at consultancy firm Accenture. By creating what he refers to as an ‘intelligent infrastructure’, organisations can ensure that they’re better prepared to react to market and technology changes – even staying one step ahead of them, if they’re lucky.

“And, as a result, they can serve customers better, collaborate more effectively and innovate faster,” says Paxman.


This intelligent infrastructure will be the subject of Paxman’s presentation at IP EXPO Europe’s Digital Transformation Summit. There, he’ll provide attendees with more detail on Accenture’s vision of IT infrastructure that can predict, learn, self-provision, optimise, protect and self-heal.

“At Accenture, we talk to a lot of large enterprises – and many smaller ones – that are now making a big push in order to go from being followers to being leaders in digital,” he says. “But increasingly, we see them recognise that their ability to move quickly and respond flexibly to changes in the market is compromised by constraints imposed by their IT infrastructure.”

That doesn’t mean chucking the lot and starting fresh with shiny new kit, however. It means taking a more considered approach to integrating existing IT assets with new ones and weaving together a blend of on-premise and third-party, often cloud-based, resources.

“The components that make up the intelligent architecture are all broadly known and familiar to IT heads,” says Paxman. “The challenge lies in integrating and managing them in ways that produce the best business results.”

So what can the intelligent architecture do that previous approaches to IT infrastructure struggle to achieve? By Accenture’s definition, an intelligent architecture is one that: knows when extra capacity is needed – and when it might be required again; optimises services by moving applications and processes to different providers, based on cost effectiveness; senses when a problem arises and takes steps to fix the problem itself; and automatically configures unified communications for employees and secure connectivity to the core enterprise.

That sounds like a tall order – but Paxman insists it’s not a pipe-dream, just a careful blending of automation, orchestration and advanced analytics technologies.
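As a purely illustrative sketch of that blend, the toy decision loop below watches a utilisation feed and a health-check counter and decides when to provision capacity or restart a service. The thresholds, telemetry and action names are invented for this example and are not drawn from Accenture's platform.

```python
# Illustrative only: a toy auto-scaling / self-healing decision loop. The telemetry
# feed and the provision/restart hooks are hypothetical.
from statistics import mean

SCALE_UP_THRESHOLD = 0.75    # add capacity when sustained utilisation exceeds 75%
HEALTH_FAILURE_LIMIT = 3     # restart a service after three failed health checks


def plan_actions(cpu_samples, failed_health_checks):
    """Return a list of infrastructure actions based on recent telemetry."""
    actions = []
    if mean(cpu_samples[-12:]) > SCALE_UP_THRESHOLD:   # roughly the last hour at 5-minute samples
        actions.append("provision_extra_node")
    if failed_health_checks >= HEALTH_FAILURE_LIMIT:
        actions.append("restart_unhealthy_service")
    return actions


recent_cpu = [0.6, 0.8, 0.9, 0.85, 0.9, 0.88, 0.91, 0.8, 0.82, 0.86, 0.9, 0.93]
print(plan_actions(recent_cpu, 1))   # ['provision_extra_node']
```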

“This is a journey. The vision I’ll be describing isn’t something any company can create overnight. For most, it will take a phased approach over two to three years – but they’ll need a very clear idea of where they want to be at the end of the journey. So, in other words, they need to understand the destination they’re trying to reach.”

And that destination, he adds, won’t be defined by new technological capabilities, but by better business outcomes: productive employees, innovative new products and services, and satisfied customers. But make no mistake, he says: “The race is already on.”

Steve Paxman will present ‘Enabling the Digital Business’ as part of the Digital Transformation Summit at IP EXPO Europe.



Can relayr bring IoT to the data centre?

17 Sep 2014
by
Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014

Despite the venture capital funding rush and the ample column inches dedicated to the subject, the Internet of Things has, in many ways, been a long time coming. For data centre engineers who can visualise the many possible applications of IoT inside the data centre, the wait could be over. Self-service app development on top of pre-configured connection options is how relayr believes the promise will be delivered, and its WunderBar is positioned for those who cannot wait for suppliers to catch up.

App-building made simple

The WunderBar is designed to allow easy building of innovative apps that connect smart devices together. It provides a platform that connects different sensors with different smart devices.

The idea for the WunderBar arose from co-founder Jackson Bond’s family situation, and the need for his grandmother to move into a care home. From that came conversations about the needs of older people for care and support. It is, it seems, almost possible to maintain your independence at home with the support of smart sensors, wearables and the like. But there’s still nothing that brings the information from all the sensors together, and enables someone to manage their life. No platform. That’s the gap that the WunderBar fills.


Simple in concept, and mind-blowing in terms of its potential, the WunderBar has six Beacons (BLE), sensors, and Wifi. The sensors include light, colour, distance, temperature, humidity, remote control (infra-red), accelerometer, gyroscope, a sound/noise sensor voted for by the public and a Grove connector. This last allows it to connect with all the sensors in the Grove system. The WunderBar connects all these sensors together, and to other smart devices, via the relayr OpenSensor Cloud, allowing a joined-up approach to app development and the IoT.
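As a rough illustration of what a joined-up sensor feed can look like in code, the sketch below subscribes to temperature readings over MQTT using the paho-mqtt library (the 1.x client API is assumed). The broker address, topic layout and JSON payload are invented for the example; relayr's actual OpenSensor Cloud endpoints and message format may differ.

```python
# A minimal sketch of consuming sensor readings over MQTT (paho-mqtt 1.x API assumed).
# Broker, topic and payload format below are hypothetical, not relayr's real interface.
import json
import paho.mqtt.client as mqtt

BROKER = "mqtt.example-sensor-cloud.com"   # hypothetical broker
TOPIC = "wunderbar/+/temperature"          # hypothetical topic, one branch per sensor module


def on_connect(client, userdata, flags, rc):
    print("connected, subscribing to", TOPIC)
    client.subscribe(TOPIC)


def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    print(msg.topic, reading.get("value"), "degrees C")


client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()
```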

The WunderBar is also, and possibly most importantly, easy to program. If you doubt this, relayr has run its own team hackathon to test it out. It is happy to report that even the least technically-minded in the team were quickly able to get to grips with the system and what it offered. From a device to protect your property from being stolen, through a gas and alcohol detector, to a proximity-tracking system usable for children, dogs or luggage, as well as a robotic spider, the team’s ingenuity in terms of ideas was impressive. But it’s almost equally impressive that they delivered too, demonstrating the ease of use of their product.

Secure and open

The WunderBar is also secure. You can input your local Wi-Fi credentials and create a secure connection between the WunderBar sensor modules and the relayr OpenSensor Cloud. You can use a secure SSL connection to the cloud, encrypting the data using the ‘Secure-It’ option. Although relayr admits that building security into low-energy devices over domestic networks is a challenge, it’s doing its best to rise to it.
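For a flavour of what the encrypted option involves, the fragment below points the same kind of MQTT client at a TLS-protected port. Again, the hostname, port and CA bundle path are placeholders, and the mechanics of relayr's 'Secure-It' option are not shown.

```python
# Sketch only: an MQTT client configured for TLS (paho-mqtt 1.x API assumed).
# Host and CA bundle path are illustrative; 8883 is the conventional MQTT-over-TLS port.
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.tls_set(ca_certs="/etc/ssl/certs/ca-certificates.crt")   # verify the broker's certificate
client.connect("mqtt.example-sensor-cloud.com", 8883, keepalive=60)
```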

relayr is also committed to keeping the IoT open and accessible, by releasing its SDKs and API implementations as Open Source projects. It also plans to open source its hardware as much as possible too.

Crowd-funded confidence in an experienced team

Team credentials explain its positioning. relayr CEO and founder, Harald Zapp, has over 25 years’ experience in IT, and was recently an executive at Cisco Systems Europe. Co-founder, and Chief Product Officer Jackson Bond has plenty of experience in start-ups, having founded and sold a Speech Recognition Apps company. He is also co-founder of MONOQI. The CTO, Cedric Deniau, has built global-scale transactional platforms for Jesta Digital and eMusic, and Chief Engineer Paul Hopton has experience of building ambitious, secure, high-transaction cloud platforms, for Bosch and Number4. There’s a lot of expertise, and a lot of experience there, both technical and commercial.

It’s an impressive team, and it certainly seems to have caught the eye of investors. The team has spent time at Startup Boot Camp in Amsterdam. relayr also launched a crowd-funding bid back in January and had reached its investment target by March. In fact, it raised 22% more funding than was actually required, and has a delivery date of this summer, so it’s very much a matter of watch this space as developers start to use the WunderBar for real.

Hear more from Harald Zapp  when he joins a keynote entitled “Extending the platform: using APIs to integrate from bricks to bytes” at Data Centre EXPO on Wednesday 8th October in the Data Centre Keynote Theatre.

Internet of Everything and business – opportunity knocks

15 Sep 2014
by
Rohini Srihari
Rohini Srihari is Chief Scientific Officer at SmartFocus. Rohini Srihari is an educator, scientist and entrepreneur. She has founded several language technology companies, including Content Savvy, a leading provider of content analytics solutions and services to enterprise customers across multiple industries, including the US DoD. Content Savvy was recently acquired by SmartFocus, a company offering context-aware marketing services, and where she now serves as Chief Scientist. Dr. Srihari is also a Professor in the Dept. of Computer Science and Engineering at the University at Buffalo. She has given numerous talks and published extensively in the areas of text mining, social media mining, and multilingual text analysis. She received a B. Math. from the University of Waterloo, Canada and a PhD in Computer Science from the University at Buffalo.

Rohini Srihari, Professor of Computer Science and Engineering at the University at Buffalo and Chief Scientific Officer at SmartFocus, discusses the opportunities the Internet of Everything offers businesses.

We have all inevitably heard the buzz around the Internet of Things (IoT), a future world in which traditionally non-digital devices, like refrigerators and toasters, communicate with each other. In the world of IoT, appliances are networked together via household robots, which in turn communicate with other devices in order to carry out daily tasks.

But what is the Internet of Everything (IoE)?

More importantly, what is the opportunity it presents for businesses? In short, the IoE is what is achievable now. It is where previously disconnected digital objects, systems and data come together in a meaningful way. IoE is an all-encompassing interconnected existence that brings together four main elements:

  1. people
  2. processes whereby people, data, and things interact
  3. data
  4. inanimate objects and devices.


For instance, IoE would allow us to observe the internet ‘trail’ that we create based on our movement, online conversations and actions. When this is achieved, it allows us to model and understand human behavior in a variety of applications. For example, we can take advantage of both mobile and behavioral context to deliver personalized and timely messages to customers.


The Guardian, the leading UK daily newspaper, recently referred to the IoE as “an opportunity to measure, collect and analyze an ever-increasing variety of behavioral statistics. In theory, the mash-up and cross-correlation of this data could revolutionize the targeted marketing of products and services.” (June 20, 2014)

With beacons, machine learning and natural language processing, businesses have the capability to deliver personalized, hyper-aware content to prospective customers and partners, engaging customers more effectively and opportunistically. Traditional recommendation systems focus solely on “which” products or services to suggest to customers, based on the habits of people with similar transactional histories. With IoE, businesses have the opportunity to go a step further and learn “how”, “when” and “where” to make suggestions – what we call “context-aware communication”.

Furthermore, IoE can be used in many other ways – from providing the infrastructure for an application to alert people about events in their neighborhood, to health and wellness applications that help keep people fit and healthy. Mobile platforms like the new iOS and personal robots such as the JIBO robot developed at MIT enable more sophisticated applications, in which behavioral models can be developed to create lifelike interactions.

The IoE opportunity from a technology perspective

From a technology perspective, the fundamental opportunity that the IoE offers businesses is the ability to model customer behaviour and context by connecting a diverse range of digital sources. By understanding how consumers engage with brands through geospatial tracking, beacon sensors, social media, and the customer data we have via digital marketing campaigns, purchasing history and call centre transcripts, it is possible to communicate with customers in the moment, irrespective of whether it’s in store, on their mobile, online or via email.

Machine learning models are now capable of operating on very large and diverse sets of observed data, a development that we are taking advantage of. We have access to data which informs us as to what contexts (i.e. combination of customer profiles, marketing message, delivery method etc.) have resulted in successful marketing outcomes. It is then possible to leverage machine learning technology to understand the correlation between user profiles and the types of recommendations they are likely to be responsive to.
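A minimal sketch of that idea, assuming a table of historical (context, outcome) records, is shown below using scikit-learn. The feature names and data are invented for illustration, and this is not SmartFocus's pipeline; it simply shows a model learning which contexts tend to produce successful outcomes.

```python
# Sketch: learn the correlation between marketing context and outcome from invented history.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

history = [
    ({"segment": "frequent_buyer", "channel": "email", "hour": 9},  1),
    ({"segment": "frequent_buyer", "channel": "push",  "hour": 22}, 0),
    ({"segment": "new_visitor",    "channel": "push",  "hour": 18}, 1),
    ({"segment": "new_visitor",    "channel": "email", "hour": 9},  0),
]

vectoriser = DictVectorizer(sparse=False)
X = vectoriser.fit_transform([context for context, _ in history])
y = [outcome for _, outcome in history]

model = LogisticRegression().fit(X, y)

# Estimated probability that a given context responds to a given delivery method.
candidate = {"segment": "new_visitor", "channel": "push", "hour": 19}
print(model.predict_proba(vectoriser.transform([candidate]))[0, 1])
```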

With all of this data analysis, however, it is important to understand how brands can use what they know to predict what consumers will want, while still respecting data privacy. The good news is that when creating a model based on behavior, it is not necessary to identify each unique customer. Instead, it is possible to distill the key characteristics needed for a customer profile – data such as demographics, interests, purchasing history and so on. By clustering these anonymous customer profiles, we can create “pseudo profiles” that represent different and distinct customer classes. A new customer can then be mapped to the most similar cluster based on observed data, which is gathered and further used to personalize communications.
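The clustering step can be sketched just as briefly. The example below, again with invented numbers, groups anonymised profiles into a handful of 'pseudo profiles' with k-means and then assigns a new customer to the nearest cluster.

```python
# Sketch: build 'pseudo profiles' by clustering anonymised customers, then map a newcomer.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row: [age, orders_per_year, avg_basket_value] (invented data)
profiles = np.array([
    [22,  2,  18.0],
    [24,  3,  22.0],
    [41, 12,  75.0],
    [39, 10,  80.0],
    [63,  5, 140.0],
    [58,  6, 150.0],
])

scaler = StandardScaler()
X = scaler.fit_transform(profiles)

# Three pseudo profiles representing distinct customer classes.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

new_customer = scaler.transform([[40, 11, 70.0]])
print("closest pseudo profile:", kmeans.predict(new_customer)[0])
```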

It’s an exciting time and the first businesses to pursue insightful, context-aware communication will no doubt prosper. Businesses now have access to the expertise and ability to analyze content and social media conversations, leverage machine learning to understand what the large sets of data are telling them, use recommendation systems to dynamically create engaging, real-time conversations, as well as deploy mobile technologies to keep in touch with their on-the-go consumers. These are fundamental components of a scalable solution that is able to continually provide consumers with engaging marketing, which is all driven by what we know about customers.


Rohini Srihari is Chief Scientific Officer at SmartFocus. At SmartFocus, we have the technology to help you truly know your customer and engage them with the right message, at the perfect moment, wherever they are – in store and online across all of their preferred channels. SmartFocus is the Internet of Everything solution for brands.  It includes beacon technology that knows when your customer visits a physical store; real-time predictive recommendations generated at the precise moment they open your email; and powerful insights that use smart analysis of Big Data to effortlessly generate a truly personal experience. Every year, we execute more than five million marketing campaigns, manage more than five billion customer records, and deliver more than 55 billion personalized interactions. Global brands such as French Connection, Speedo and Mercedes-Benz rely on SmartFocus to deliver business-critical, joined-up marketing campaigns that make a difference to their customers and to the growth of their business.




CIO Survey 2014: Optimism, importance and skills

11 Sep 2014
by
Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014

The results are in from this year’s CIO Survey, published by Harvey Nash, the global recruitment and offshoring group. The survey represents the views of 3,211 senior IT leaders across more than 30 countries, and is one of the largest studies of its kind. It makes positive reading.

CIO Survey 2014 – A watershed year

Jonathan Mitchell, Non-Executive Chairman of Harvey Nash’s Global CIO Practice, believes that the CIO Survey 2014 results suggest we’ll look back on 2014 as the watershed year of the global financial crisis. Optimism is creeping into the views of CIOs. Budgets seem to be rising, and for the first time in a decade, CIOs are worrying about skills shortages. So what are the main findings from the survey?

First of all, CIOs are becoming increasingly important to their organisations. More than half of CIOs hold executive posts, with a seat at the ‘top table’, and 44% of organisations now report that they have a CIO, up from just 11% in 2005. This seems to be part of an ongoing trend since the financial crisis. But perhaps what’s even more crucial is the strategic focus of CIOs. A few years ago, it was all cost savings, or at best operational efficiencies. Now, most CIOs report that their focus is on growth, whether building revenue, or better engagement with customers. It’s an interesting trend that matches with Ian Cox’s work on the changing role of the CIO and IT departments.


Shortage of skills at the root of many challenges

All is not totally positive, of course. Project delivery remains an issue for major business transformation projects, very few of which have been considered highly successful this year. CIOs are also increasingly concerned about skills shortages, and the impact on the organisation’s ability to keep up with the market. The skills shortage is exacerbating problems with project delivery. Nearly 60% of CIOs responding to the survey felt that this was a significant problem, 20% more than last year, and the highest level since the start of recession in 2008.

The biggest skills shortage is in project management skills, and change management skills have seen the biggest change in demand. Neither of these should come as a surprise given the issues with delivery of major transformation projects, and the challenge of transforming IT departments from supportive into strategic players. CIOs report that the skill most valued in interview candidates for leadership positions is ‘business understanding’.

The expanding scope of relationship management

Relationships with other functions also remain a challenge for many IT departments. Most have good relationships with operational departments, but struggle with marketing and HR colleagues. And this is a big issue, because in many organisations, it is the marketing function that ‘owns’ the role of Chief Digital Officer, leaving the CIO struggling to make strategic headway. Several companies reported that they have opted for a perhaps more mature, and more successful, approach to the CDO role: co-ownership between IT and marketing. This pragmatic solution allows each to play to their strengths, and should strengthen the relationships between the two functions.

In financial terms, the situation is also positive. IT departments are generally reporting budget increases. Interestingly, many more are planning to outsource infrastructure projects, up 14% on last year, and also service desks and helpdesks. It’s perhaps a sign that, as Ian Cox has predicted, IT departments are becoming more strategic, and moving towards adding customer value to the organisation.

You can request a copy of the CIO Survey 2014 directly from Harvey Nash.



Microsoft IT head’s advice for the smart CIO: Think big

10 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

Tim Hynes, head of IT for Microsoft EMEA, has a question for his fellow CIOs: “What type of conversations do you imagine you’d need to be having with your chief executive, before they’d invite you to take a regular place at the boardroom table?”

It’s a rhetorical question, he says, because for many IT heads, the opportunity to get a hearing at that level simply won’t arise. The issues that occupy the time and attention of most IT heads aren’t ones that interest the senior executive team: maintaining and extending core infrastructure; rolling out line-of-business applications; the systems integration work that stitches everything together.

“Here’s the point,” says Hynes. “Your CEO doesn’t want to talk about those things. Neither does your CFO. What they want to do is have business-value conversations. All they care about is the difference you can make to their business.”

“I was recently talking to the CEO of a huge multinational company, a household name, and do you know what he told me? He said: ‘IT people are among the dumbest smart people I’ve ever met. They make things too hard for themselves.’ And he’s right, he’s absolutely right. There are far too many IT leaders struggling for relevance in their own organisation, just because they’re focusing on the wrong things.”


Hynes isn’t dismissing the pressures that most IT heads face. He knows them only too well: in a 25-year career, he says, “I’ve done most of the jobs you can do in IT.” His previous role at Microsoft was a global infrastructure role, running all IT systems outside the company’s global headquarters in Redmond, Washington. These systems supported around 90,000 employees worldwide and Hynes headed up a team of 400 people across 42 countries. He took up his current role in January 2013 because it gives him, he says, the opportunity “to go deeper in my conversations with customers and to understand the challenges they face.”

The biggest of these challenges, he says, is IT’s battle to be considered ‘relevant’, even at a time when digitisation of business processes means that all organisations are using more technology than ever before. It’s an anomaly, he says – but it’s also a direct result of IT’s fixation on systems, apps and integration, rather than strategic innovation.

They need to think big, Hynes urges. “When CIOs talk about transforming their organisation to meet the new challenges they face, they’re talking about the challenges they’re already facing today. The smart CIO is trying to figure out the challenges their organisation will face in two or three years’ time and organising today for that. That’s the key: you need to try and get a few steps ahead.”

“And the only way to do that is to free your mindshare from those three traditional concerns of IT,” he continues. “You’ve got to free your time up from all that, so you can focus on transformation from an organisational perspective, getting more from your data, establishing faster project turnaround times.”

“You need to think differently – but in order to be able to do that, you’ve got to let go of the past and free up the time of your smartest people, too. They need to be looking, for example, at how cloud infrastructure could be used to relieve some of the burden of systems management. They need to be developing an IT architecture that allows people in the business to self-provision the IT that they need, where appropriate.”

The trouble is that most CIOs have their noses so permanently up against the grindstone that they’ve no opportunity to think about big data, or consider introducing dev/ops practices or even extending their use of cloud infrastructures.

“You need to back up and take a look around,” he says. “You need to have the confidence to say, ‘I’m first and foremost a business leader. IT is the way I enable the business.’”

Tim Hynes will present ‘The Digital Business Pyramid: Creating a Digital Enterprise – Implications and Opportunities’ as part of the Digital Transformation Summit at IP EXPO Europe.



A ‘nuclear’ approach to digital transformation at Amnesty International

04 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

When an organisation is teetering on the brink of a digital transformation, its leaders must sometimes be prepared to “press the nuclear button”, says Owen Pringle, former director of digital communications at human rights charity Amnesty International. Some collateral damage, he warns, may be unavoidable.

In Amnesty’s case, the nuclear option meant dismantling the digital communications department over which Pringle presided, in order to move to a new approach where everyone at Amnesty uses digital as part of their day-to-day work. In other words, over the course of a two-year digital transformation project, Pringle made himself redundant.

He’s pretty upbeat about that, considering: “What we achieved was a long-term understanding at Amnesty that digital isn’t something that should be limited to a siloed function within the organisation. It’s something that needs to be owned by everyone, from the management team down.”

That rule applies to other organisations too: Pringle’s philosophy is that digital transformation needs to be about fostering a culture of “digital ownership” across all business units – but he acknowledges that it can be a tough transition to make.

With that in mind, the former-journalist-turned-digital-expert, who has also worked at ITN, BSkyB and the Southbank Centre, has recently set up his own digital consultancy, Therein, with the aim of helping others navigate these choppy waters.

Resistance to change, he says, can be a funny thing, coming from areas of an organisation where you don’t expect to find it. Sometimes, it comes from people who are reluctant to learn new skills or are overly attached to their existing job descriptions. At Amnesty, there was some concern that, by getting rid of the digital department, the charity as a whole was disinvesting in digital – although that couldn’t have been further from the truth.

“I had to go to great lengths to explain that this was a move that would demand greater investment, in fact – but it was also one that made sense, because there are some aspects of digital communications that are best left to subject-matter experts,” he says. A good example is the researchers that Amnesty posts to conflict hotspots such as Gaza and Syria: “We don’t want people on the ground in areas like that to be deferring to a London-based digital department just to get their findings out there and in front of the right audiences. That’s crazy.”

But pressing the nuclear button isn’t always necessary, he concedes. Sometimes, there are less scary routes to take on the digital transformation journey. Most organisations could be doing more to establish digital governance models, information architecture guidelines and rules of engagement for social networking sites.

The danger, he says, is that they stop once they’ve got those pieces in place. In fact, digital transformation is an ongoing process: once an organisation has its internal rules established, then it needs to think externally: What is the customer experience of our digital communications – are we giving them what they need? How well do we engage with these audiences on an ongoing basis? What role could digital play in the innovation of new products and services for them?

The IT function has a significant role, says Pringle: “It’s about the ‘demystification of digital’, through the provision of a capability layer and the development of the skills employees need to use those capabilities with confidence.”

That may prove to be an easier task than it seems at first glance. “We should be celebrating and exploiting the fact that people know this stuff already. They’re all at home using Facebook and Twitter and so on. They just don’t always see how it fits into a workplace environment. They pass through the company’s doors at 9am each day and suddenly disown it – so they need to be given the confidence to transfer these skills into their working lives.”

Pringle’s a strong believer in the idea proposed by US academic and management expert Peter Senge that “today’s problems come from yesterday’s solutions.” Those organisations that cling to the notion that their digital expertise should be embedded in a specialist team will, over time, run into problems, he warns.

“In a few years’ time, all successful organisations and all successful employees will use digital without thinking about it. They’ll be able to say ‘This is just what we do’,” he says. “After all, we all use electricity at work every day, but most organisations don’t have a head of electricity or an electricity department. It’s just a tool for getting work done.”

Owen Pringle will be speaking at IP EXPO Europe on Thursday 9 October, 13:05-13:30, giving a presentation entitled ‘Digital Transformation – What is it? And how to do it’.


What role for IT in digital transformation, asks LSE professor

02 Sep 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

“I have a sense that, when it comes to cloud computing, we’ve overestimated the short-term impacts, but we’re seriously underestimating its long-term effects,” says Leslie Willcocks, professor of technology, work and globalisation at the London School of Economics.

Professor Willcocks feels so strongly about this imbalance, he tells Technology.info, that he made it the central theme of his latest book, Moving to the Cloud Corporation: How to face the challenges and harness the potential of cloud computing.

“While the subject of the cloud is being discussed everywhere,” he laments in the book, “there is a lack of substantive, objective evidence not just about the technological trajectories but also about the potentially more far-reaching business implications of the cloud.” Given the role that cloud computing is set to play in companies’ digital transformation efforts, this lack of foresight could be perilous.

In Moving to the Cloud Corporation, Willcocks and his co-authors Will Venters and Edgar Whitley set out to address this shortfall. The book is based, in part, on a survey of over 1,000 business executives, giving it a strong management perspective. What the co-authors discovered by talking to these respondents was a strong sense of disappointment in the gap between cloud promises and cloud realities.

“If you listen to vendors, every year since 1999 has been the year of the Cloud,” says Willcocks. “But many organisations have found that the Cloud is not as simple as they thought. It’s not the straightforward, plug-and-play solution they thought they’d be buying.”

But, at the same time, once the many challenges of cloud adoption have been overcome, he adds, cloud technologies will have a massive impact on the ability of so-called ‘digital businesses’ to tap into trends such as the Internet of Things (IoT), the automation of knowledge work and robotics. In short, the Cloud lies at the heart of most digital transformation efforts.

CIOs and their IT teams must help their organisations to cross this chasm, Willcocks says. He sees as “problematic” the general perception amongst non-IT executives that the emergence of Cloud sounds the death knell for IT departments. For a start, he argues, it’s based on a fundamental misinterpretation of the role of the modern IT team, overlooking its contribution to the fulfilment of wider business goals.

In a cloud corporation, he says, the IT team may be smaller than before, but it will have a far wider range of skills. “Above all, these [IT professionals] are the holders of the blueprint of the technology platform that their organisation needs. That requires technical and systems architecture skills, certainly, but it also makes them sourcing specialists, business strategists, innovators and supplier-management experts. The modern IT professional must be able to perform a wide range of jobs to make the whole thing work, to manage the strategic direction of IT across the whole organisation.”

These jobs boil down to four key tasks, he says: governance, to ensure that the technology team’s activities align with wider company activities and goals; requirements capture, so that the systems and services delivered by IT closely fit business needs; blueprint definition, so that the technology platform at an organisation’s disposal evolves to support new systems and processes; and supplier management, so that contractual obligations and service levels are met by third-party technology partners at a time when so much IT is delivered ‘as a service’.

However, Willcocks warns, technology can’t support digital transformation without two major organisational changes. First, he says, IT teams need to achieve a “step-change” in outsourcing maturity, so that they can handle collaborative innovation with third-party suppliers. Second, senior executives from outside the IT department need to be fully engaged in funding and helping to design and deploy technologies.

“Everyone in an organisation needs to adjust, everyone needs to adapt. But the rewards for those that adjust and adapt successfully will be huge,” says Willcocks.

Interested in learning more? Don’t miss Professor Leslie Willcocks’ presentation at IP EXPO Europe, Moving to the Cloud Corporation: The CIO Mission. This will take place in the Digital Transformation Summit (IP EXPO Europe Keynote Theatre) on Thursday 9 October between 12:40 and 13:05.


Disrupting IT

27 Aug 2014
by
Puni Rajah
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014

Disruption. It’s what Southwest Airlines and easyJet have done to the staid, formal airline industry. It’s what Formule 1 has done to the travel-lodge/hotel trade in Europe. It’s basically become a synonym for ‘changed beyond recognition, and how come it took us so long to work out that we would like this?’. And now it’s happening in IT. The rise of cloud, mobile internet and machine-to-machine connectivity, to name but a few, has changed the way that we use and view technology on a day-to-day basis.

A brave new world

The last five to ten years have seen a huge change in the way that we use technology. We all have smartphones and tablets at home. Those leaving university now are the first generation of ‘digital natives’: those who cannot remember a world without the internet. If we want an app to do something, we go and get it. And the same applies to the way in which we work. If we want an app, we want to be able to go and get it. We don’t expect to have to wait six months while the IT department thinks about it, and then decides that it doesn’t approve. In fact, there’s evidence that the majority of IT spend now happens outside the IT department.

This is massively disruptive. Many of those working in IT, including CIOs, don’t know how to respond. And that has huge implications for IT departments.

Ian Cox, a former CIO and now a consultant, has been watching and writing about these changes for several years, increasingly focusing on the changes that IT departments and CIOs must make to fit into this ‘brave new world’, having realised that many are no longer fit for purpose. His book, Disrupt IT, sets out a new model for a radical transformation of the way in which IT is delivered.

At the heart of the matter is the way in which the IT department, and the CIO, need to change to meet the needs of digital businesses, and to stay relevant. He discusses a new role for the IT department as a collaborative and supportive technology broker. And much of what he says shows why data centres, and especially colocation providers, are a key part of the new model of IT.

IT departments used to function as providers of technology and of a service. However, there are now many external vendors who can provide the same type of service. In fact, given the lifetime of technology, they can probably do it much more efficiently and cost-effectively. In-house IT departments need to move from being a service provider to being a service broker, and that means a new set of core competencies, which are much more business-focused. They are architecture and design, delivery management, data management and vendor management, as well as managing internal relationships and developing an understanding of the business. Instead of being about building and maintaining infrastructure and applications, they’re about adding value to the business. And in order to provide value in the new world, IT departments need to focus on these new core competencies, so that they drive changes in structure, recruitment, and the way that the department operates.

Working in a different way

Having focused on the core competencies, the IT department then needs to make sure that it’s really focused time and resources on things that add value to the business. And this means outsourcing the low-value, non-core areas and anything that is not a differentiator.

Outsourcing is routine in business. We all know the mantra: focus on your core competencies, the things that distinguish you from your competition, and outsource anything that requires specialist knowledge or skills, or does not differentiate you. And much of the old business of the IT department can be outsourced, so that the IT specialists within the company can focus on enhancing the customer experience and adding value to the business. Colocation providers can offer the data centre capacity and space that’s needed for this outsourcing, in newer and better-equipped premises than any single company could afford for itself, and the newly refocused IT department is well placed to advise on which offers best value.

It sounds simple, but such changes take time and money, and are not easy to achieve in practice. However, they are essential if IT departments and CIOs are to remain useful and functional, and not be bypassed by the rest of the business.

Hear more from Ian at his Data Centre EXPO keynote presentation, It’s time to disrupt IT, on Wednesday, 8th October at 11am.

Cloudera’s Mike Olson: “This isn’t about knocking over old guys and stealing their wallets”

26 Aug 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

Hadoop distributor Cloudera’s co-founder and chief strategy officer, Mike Olson, talks candidly to Technology.info about the open source framework’s growing maturity, that investment from Intel and whether rumours of the death of data warehousing have been greatly exaggerated.

Q: Forrester analyst Mike Gualtieri recently described Hadoop’s momentum as “unstoppable”. Where, in your opinion, does the technology sit today in the minds of enterprise customers?

A: Hadoop’s come a long, long way. Cloudera’s been in business for just about six years now. We were incorporated in June 2008 and, at that time, ‘Big Data’ was not a meme. The idea just hadn’t occurred to most people and Hadoop was pretty rudimentary back then. In the ensuing six years, Cloudera has led the community in driving new capabilities: we’ve added NoSQL engines to our offering, like HBase and Accumulo; we’ve invented, and given away as open source, high-performance SQL query capabilities, in the form of Impala; we’ve integrated other open source projects, like Solr, to deliver search capability; we’ve embraced third-party tools, like the Spark engine from Databricks, for complex analytic and stream processing. We give our customers lots of ways to get at data, as well as capabilities around security, data lineage, governance, back-up, disaster recovery – enterprise-grade features that were missing from the platform that Yahoo and Google first used. That’s a long way for me to say that, while the market is still emerging, it’s no longer fair to see Hadoop as an early-stage technology. We’re seeing it adopted across every single vertical, we’re seeing customers roll out large-scale deployments, and pay seven, even eight-figure sums, on an annual basis, to do just that. Broad adoption may be only just now accelerating but the momentum is tremendous.

Q: Back in April, Cloudera announced a huge $900 million round of investment, $740 million of which came from Intel. What can you tell us about that?

A: Well first, I’d say that it’s not just Intel making big investments. Look at the considerable investment that IBM’s making in big data. Look at the investment in Pivotal, the VMware/EMC spin-off, made by General Electric. Large enterprise infrastructure vendors everywhere are ploughing enormous money and staff into the big data opportunity built around Hadoop.
Talking about our relationship with Intel specifically, I understand why people hear a sum like $900 million and an imputed market capitalisation of $4.1 billion and they stop listening after that, because those are just breathtaking numbers. What I want them to hear is that we did not do this deal for the money.
The relationship with Intel does three important things for Cloudera. First, it aligns our two companies behind a single platform that we can bring to market together, rather than battling for customers with our own Hadoop distributions. Intel had its own distro and it was pretty good. It has a lot of features that we, frankly, coveted. By aligning behind a single distribution, we can marry the best of both.
Second thing is, we’re a platform company that relies on channel relationships and partners to help us deliver a full stack of value to our customers. Intel has decades-long partnerships across every sector of hardware and software. It’s deeply respected and genuinely liked by all the big players in the market. So by working with Intel, we create a much larger indirect channel and amplify our selling effort significantly.
The third, and most significant, point for me is that the silicon we run on is going to change dramatically over the next few years – the balance between memory, disk, compute and networking will be radically redrawn. Intel knows what the future looks like. And we’re in a position to ensure that the open-source ecosystem takes full advantage of all that silicon goodness, that it adapts to these ratio changes in data centre infrastructure. We’ll be able to deliver a much better, faster, more secure platform to the market broadly, before anyone else will.
Now, Intel believes the opportunity is enormous and wanted to participate meaningfully, so, as a condition of the large commercial relationship, it wanted to take a substantial equity stake in Cloudera. We weren’t looking to dilute our shareholders to that extent. We weren’t looking for $900 million in cash – it’s a crazy amount of money for a software company to raise at this stage. But what we did want was that commercial relationship with Intel and so we negotiated an investment that we think makes sense to everybody.

Q: Over at your competitor Hortonworks, executives tell us that Cloudera’s message is that the traditional data warehouse is dead. Can you clarify for us where you actually stand on this issue?

A: Everybody wants to know if Cloudera is after Teradata. I say to them, “Look: the opportunity here isn’t to knock over old guys and steal their wallets.” The opportunity is to monetise vast amounts of data using tools that weren’t previously available. Now Hortonworks and Teradata have a joint go-to-market strategy and Hortonworks CEO [Rob Bearden] is on record saying that Hadoop cannot be real-time, that it will not do interactive queries in the way data warehouses do. Cloudera’s point of view is that that’s ridiculous: not only will it, but right now, today, with our platform, it does. The data centre of the future will be different. The enterprise data hub will absolutely be a core component of every large enterprise’s data management strategy. Data warehousing will evolve to do more things, better – but an enormous amount of what happens in the data warehouse today is moving out. It’s moving out for flexibility, for cost and because most of the innovation is going to happen in open source, not in single-vendor proprietary technologies. We’re not interested in preserving existing markets. We’re interested in innovating so that customers can do more with their data. So our mission really isn’t to replace the data warehouse. It’s to make data management and data infrastructure better.

See Cloudera Chief Architect Doug Cutting’s keynote as part of the Big Data Evolution Summit at IP EXPO Europe, 8 – 9 October 2014, London ExCeL.

Big Data in action: smart cities meets energy

20 Aug 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

In Austria and the US, two very different big data projects are underway, both with the same goal: optimising the way people consume energy by analysing the data generated by smart meters.

More efficient use of energy and water, fewer traffic jams, better public safety: these are just a few of the promises that the ‘smart city’ concept makes. It’s vital to human health and security that these goals are achieved: according to figures from the UN, more than two-thirds (70 percent) of the world’s population will live in urban areas by 2050, stretching resources, space and possibly, residents’ patience, to their limits.

But in order to be smarter, a city needs data – a lot of it. Sensors and meters embedded in utilities networks, in particular, are already capable of delivering data in huge volumes, but if city authorities and utilities firms are to be able to understand energy usage patterns and potential supply issues, they’ll need a hefty dose of big data technologies to collect, manage and analyse that information.

This idea is central to a giant construction project currently underway on an abandoned airfield on the north-eastern outskirts of Vienna. It’s the site of a new smart city, Aspern, and by next year, if work goes to plan, there will be around 3,240 apartments there. By 2028, Aspern will be home to around 8,500 apartments, along with shops, schools, health centres, offices and a subway station that will transport passengers to or from central Vienna in just 25 minutes.

It’s also the site of one of Europe’s most ambitious smart energy projects – a “living laboratory” in the words of its creators – where researchers hope to establish how renewable power sources, smart buildings and smart-grid technologies might best be combined to power a thriving, eco-friendly community.

That research will be led by Monika Sturm, head of R&D at Aspern Smart City Research (ASCR), a €40 million joint venture formed in 2013 between the City of Vienna, utility provider Wien Energie and industrial giant Siemens.

The plan, she told attendees at data warehousing company Teradata’s recent customer conference in Prague, is to kit out individual buildings at Aspern with different combinations of smart-energy technologies and analyse the results using a range of big data technologies: traditional data warehouses, MPP [massively parallel processing] appliances, and the open-source data analysis framework Hadoop. As we’ll see in the next article [xxxx], the big-data trend is, in some cases, setting these approaches in direct competition with each other – but many organisations continue to embrace whatever tools they need to get analytics work done.

“By analysing the most efficient mixes of technologies and their influence on end-user behaviour, we expect data analytics will lead to new paths for energy optimisation in smart cities everywhere, for the benefit of all.”

In California, meanwhile, utility company PG&E is somewhat further down the line: it’s already the largest US utility to have installed smart meters right across its service territory, which covers 70,000 square miles and 9.4 million residential and commercial properties. Now, the focus is firmly on extracting real business value from that roll-out effort, says Jim Meadows, PG&E’s director of smart grid technologies.

The smart meters that PG&E has installed measure energy use in hourly or quarter-hourly increments, allowing customers to track energy usage throughout the billing month and giving them greater control over the costs of heating, cooling and lighting their homes. They also give PG&E more visibility into its own operations, as well as lowering the costs associated with meter reading and management.

But these 9.4 million meters generate a mountain of data – around 2 terabytes per month, or 100 billion readings per year. This is collected by the company and stored for analysis in PG&E’s Interval Data Analytics (IDA) platform, based on a data warehouse from Teradata. Analytics tools from SAS Institute and Tableau Software, meanwhile, are used to interrogate the data.
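As a small, purely illustrative example of the kind of roll-up such interval data supports, the pandas sketch below aggregates quarter-hourly readings into daily usage per meter. The file and column names are assumptions; PG&E's IDA platform actually runs on a Teradata warehouse with SAS and Tableau on top, not on pandas.

```python
# Sketch: roll quarter-hourly smart meter readings up to daily totals per meter.
# 'interval_readings.csv' and its columns (meter_id, timestamp, kwh) are hypothetical.
import pandas as pd

readings = pd.read_csv("interval_readings.csv", parse_dates=["timestamp"])

daily_usage = (
    readings
    .set_index("timestamp")
    .groupby("meter_id")["kwh"]
    .resample("D")          # 15-minute intervals rolled up to days
    .sum()
    .reset_index()
)

print(daily_usage.head())
```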

“We’re doing our best to focus the company, and all of its different lines of business, on a single data platform for this interval data,” says Meadows. “We made a conscious decision early on to build a platform where data could be cleansed and perfected in a single place and made ready for presentation to business users in a wide range of different ways.”

As at Aspern and PG&E, countless other public servants and utilities executives are planning big data projects – and a huge cast of hardware, software and services vendors will be more than happy to assist them.

The main drivers of smart-grid analysis investments, according to a recent report from research company GTM Research, will be to improve asset management for grid components, bring more granularity to demand-side management, speed up outage response times – and achieve a better return on investment for smart meters. GTM Research expects cumulative spending to total around $20.6 billion between 2012 and 2020, with an annual spend of $3.8 billion worldwide in 2020.

Newcomers swell the SQL-on-Hadoop ranks

19 Aug 2014
by
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.

It’s one of the hottest areas in big data science and the choice of tools for running SQL queries on Hadoop platforms is getting bigger all the time.

Steve Shine, CEO at business intelligence company Actian, is convinced his company is onto a winner. The company recently announced the Hadoop Edition of its analytic software, combining high-performance SQL with the company’s visual dataflow framework, running entirely natively in Hadoop.
According to Shine, this addresses a huge need among would-be Hadoop users, who have so far held back on investing in the open-source big data framework because of the relatively high costs of skilled MapReduce engineers. (For more on how the Hadoop market is developing, see our interview with Mike Olson of Cloudera which will be published shortly)

“SQL skills are more abundant and more affordable. Most companies with IT teams have them. So for them, it’s a chance to get value from big data with the skills they already have, rather than going out to market for a skillset that is in short supply and tends to get quickly snapped up by technology vendors and consultancy firms.”

But Actian is far from alone in spotting the market opportunity here. The company’s launch of its Hadoop Edition came just one week after IT giant HP upgraded the SQL-on-Hadoop functionality it introduced in late 2013 with its Vertica database. VMware/EMC spin-off Pivotal and MPP (massively parallel processing) database company InfiniDB have also introduced SQL-on-Hadoop products. And among the Hadoop distribution companies, Cloudera is pushing its Impala offering, while Hortonworks is arguably the most active contributor to the open-source Hive effort.

In fact, despite the recent proliferation of tools for running SQL queries against big data stores on Hadoop, Hive remains the most widely used query tool with Hadoop – and executives at Hortonworks claim that the company’s recent Stinger project has done much to improve its overall performance.

That said, there is still a glaring gap in the market for tools that can offer the full range of SQL functionality on Hadoop that data scientists can already achieve using traditional relational databases. In other words, there is still much work to do, according to Mike Gualtieri, an analyst with Forrester Research.

But levels of interest in SQL-on-Hadoop are high, he adds. While Hadoop is generally positioned as an environment in which unstructured data can be analysed, many companies have begun their Hadoop experiments with structured data.

“Hadoop can handle both,” he says. “That’s what’s so interesting about the platform. And, over time, most organisations will do both, but for now, I advise firms to start with structured data and then move onto unstructured.”

After all, he adds, plenty of organisations have vast treasure troves of structured data at their disposal, much of which goes unanalysed today. And as we’ll see in the next article, the Internet of Things trend seems set to fill those stores even further. Gualtieri believes that most companies only ever analyse around 12% of the data they hold, leaving the rest (which is potentially valuable), “on the cutting room floor.”

Often, that’s because the vast databases and data warehouses needed to collect – and use – the remaining 88% would be prohibitively expensive to buy and maintain. Hadoop, by contrast, provides a low-cost way to gather vast volumes of data from different data sources on commodity hardware. In other words, Hadoop presents an opportunity to bring it all together in one place – but in order to analyse structured data, most companies are still more comfortable using tools and approaches with which they are already familiar.
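To make that concrete, here is a generic, hedged sketch of running a familiar SQL aggregate over files stored in Hadoop, using Spark SQL simply because it is widely available; it stands in for the Actian, Vertica, Impala or Hive engines discussed here, whose syntax and setup differ. The HDFS path, schema and column names are invented.

```python
# Sketch: a conventional SQL aggregate run over raw files in HDFS via Spark SQL.
# The path and columns (customer_id, order_value) are invented for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-on-hadoop-sketch").getOrCreate()

orders = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("hdfs:///data/orders/2014/*.csv"))

orders.createOrReplaceTempView("orders")

top_customers = spark.sql("""
    SELECT customer_id, SUM(order_value) AS total_spend
    FROM orders
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
""")

top_customers.show()
```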

This is where Actian, Vertica, Pivotal and others could help – by supporting the queries that companies already run against their structured data, but doing it in a more scalable, less pricey environment. Or, as Shine puts it, “We’re making Hadoop more accessible to a wider range of companies – and, frankly, that’s long overdue. We’re making Hadoop industrial-strength, to tackle more of the analysis needs that customers have today.”

Can software bridge diverging data centre demand and supply investment cycles?

18 Aug 2014
by
Puni Rajah
Puni
Puni built her reputation for fact-based decision support as a business analyst at Deloitte. Since then, she has conducted IT consumption, management and sales research in the European and Asia/Pacific markets. She is expert at understanding enterprise software and services use and buying behaviour, and has coached marketing and sales executives. Puni is the Content Analyst for Data Centre EXPO which takes place alongside IP EXPO Europe in London ExCel on 8 - 9 October 2014
Data Centre Supply & Demand

Stuart Sutton is Chief Executive Officer at co-location services company Infinity. His Data Centre EXPO education session explores how data centre design and build practices are changing as part of broader IT solution decisions.

The impact of software-defined architectures now extends to data centre design and build decisions. Co-location provider Infinity cites location as only one of 12 reasons customers choose its services. We caught up with Stuart Sutton to explore why this is so, and how this position is helping Infinity grow its business faster than its competitors.

How have your customers’ priorities changed?

The digital economy is evident in the changing shape of demand for data centres. An increasing number of customers, whether they are part of an IT department or not, are no longer looking simply for rack space for their data. Rather, they are thinking about what they want to achieve by buying data centre space: for example, the capacity to manage transactions, including the flexibility to handle peaks and troughs in demand.

This shift in customer priorities has shaped our approach, which has proved very successful with a new breed of customers who want conversations about how to support their IT strategy, rather than about rack space.

What’s holding back a faster transition to this new approach?

Part of the issue for many data centre providers is that they started out as building suppliers. Buildings, of course, are designed to last for decades, and so is the infrastructure within them, such as air conditioning and heating. While buildings and facilities need to be excellent to attract data centre customers, technology moves on much more quickly. Users and consumers of technology want to move with the times, updating their technology every few months. This has led to a disconnect in thinking between traditional data centre suppliers and their potential customers.

How have you tackled this challenge?

One of the key ways in which we have made our offering more flexible is to move from being a company that built and ran data centres for customers who knew exactly what they wanted – effectively a bespoke data centre supplier – to providing a platform. The advantage is that customers can choose whether they want a high-specification, high-reliability platform, or something that is much more ‘cheap and cheerful’ but still fit for purpose. Effectively, we have created the data centre as a service, rather than as a building in which you put your IT.

As a consequence, we are able to serve a wide range of customers. We can talk directly to end customers as well as managed service providers (MSPs), and offer more flexibility in terms of managing peaks and troughs in demand. In the new world of nimble, cloud-based services, our customers have the flexibility to move between platforms and vendors, but still retain control of their systems and data within a coherent and logical digital strategy, and in a much more secure way.

What’s different about serving MSPs and software developers?

In July 2014, Advanced Computer Software Group (ACS) migrated operations from its five data centres to Infinity. This migration is part of ACS’s portfolio expansion to include cloud-based delivery of its software. Our platform approach has been particularly attractive for companies providing software as a service (SaaS) and managed cloud, as they need both mission-critical reliability for end users and a more cost-effective, service-based environment for development cycles. These SaaS companies also need the flexibility to take more space when they move customers from old legacy systems to new private or hybrid cloud-based systems. We see this approach as a disruption to traditional data centre design and build processes.

Join Stuart on Wednesday 08th October at 12:20 in the Data Centre Design & Build Theatre.

 

Interview: Pure Storage looks to help companies ‘dump the complexity’

15 Aug 2014
by
Jessica Twentyman
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.
Pure Storage - All Flash for All

With his insistence on all-flash everywhere, CEO Scott Dietzen is sending out a radical message to IT decision-makers contemplating the future of their data storage infrastructures.

Flash memory already defines the consumer technology experience. It powers the smartphones and tablets we carry everywhere we go, as well as the social networking sites that we use daily to share with family, friends and colleagues.

So why should the corporate technology experience be any different, asks Pure Storage CEO Scott Dietzen? After all, he argues, when compared to traditional disk-centric storage arrays, an all-flash array is ten times faster and ten times more space- and power-efficient – and all this at a lower per-gigabyte price point.

Technology.info met up with Dietzen at Oracle OpenWorld to learn more about the company he heads and the eye-opening claims it makes for its all-flash arrays.

It’s a great time to be heading up a hot new storage company, according to Dietzen, and it’s not hard to see why. Pure Storage may be facing a whole army of similarly young and hot storage vendors looking to grab the attention and budgets of corporate IT decision-makers, but the company recently got a hefty vote of confidence in its approach, in the form of $150 million in funding for global expansion from venture capitalists. Dietzen claims it’s the largest private funding round in the history of enterprise storage and, whether or not that’s true, it brings the company’s total funding to an eye-opening $245 million.

So why the excitement around Pure Storage? The company (alongside a number of rivals) advocates the use of speedy solid-state memory pretty much everywhere in the corporate data centre, as opposed to the hybrid, tiered approach – mixing a bit of flash with a lot of spinning disk – endorsed by larger enterprise storage companies such as EMC and NetApp. As the cost of storing data on silicon continues to drop, says Dietzen, Pure Storage’s all-flash vision becomes increasingly feasible – and affordable – for mainstream businesses to pursue.

At early-adopter organisations, he adds, the shift is already underway. At Facebook, for example, data is stored on flash for several weeks before it’s shifted to mechanical disk, he says. And at payroll processing company Paylocity, another Pure Storage customer, storage admins have recently unplugged the last of their production disk arrays, replacing it with an all-flash array from his company.

“Mechanical disk is out of gas,” he says. “It may take some companies some time to realise that, but other companies already know it and something remarkable is going on here, with flash [becoming] the driver of a wholesale sea-change in enterprise storage.”

As for his claim that flash might actually cost companies less than spinning disk, he argues that deduplication and compression need to be taken into account when the sums are calculated, as well as the difference between raw storage capacity and actual addressable storage.
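A rough, back-of-the-envelope sketch of that arithmetic is below; the prices, addressable fractions and data-reduction ratios are illustrative assumptions, not Pure Storage’s (or anyone else’s) published figures.

# Rough, illustrative arithmetic only: all figures below are assumptions.
# The point is that the effective cost per usable GB depends on data
# reduction (deduplication plus compression) and on how much of the raw
# capacity is actually addressable after formatting and protection overheads.

def cost_per_usable_gb(price_per_raw_gb, addressable_fraction, data_reduction_ratio):
    """Effective price per gigabyte of data actually stored."""
    usable_gb_per_raw_gb = addressable_fraction * data_reduction_ratio
    return price_per_raw_gb / usable_gb_per_raw_gb

# Hypothetical example figures:
flash = cost_per_usable_gb(price_per_raw_gb=3.00,    # flash costs more per raw GB...
                           addressable_fraction=0.8,
                           data_reduction_ratio=5.0)  # ...but deduplicates and compresses well
disk = cost_per_usable_gb(price_per_raw_gb=0.50,
                          addressable_fraction=0.6,   # RAID and formatting overheads
                          data_reduction_ratio=1.0)   # little or no inline reduction

print("flash: $%.2f per usable GB, disk: $%.2f per usable GB" % (flash, disk))
# With these assumed figures, flash works out at $0.75 against $0.83 for disk.

Under these assumptions, the higher raw price of flash is offset by data reduction; change the ratios and the conclusion changes too, which is exactly why Dietzen argues the comparison has to be made on usable rather than raw capacity.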

So this is Pure Storage’s goal: to do for silicon-based storage what EMC and NetApp did for spinning disks. Along the way, Dietzen claims, the four-year-old company has ignored a string of approaches from would-be acquirers (he declines to name them) on its mission to build the storage industry’s next-generation heavy-hitter.

Pure Storage is not for sale, he insists, and in any case, the most recent round of funding values the company at an “insane premium” that would halt most prospective buyers in their tracks.

“Long-term independence is the best way to preserve our ability to create value for customers,” he says. An IPO isn’t even on the cards before 2015, because global expansion and R&D are much higher priorities at a time when the company is growing revenues at up to 300 percent year-on-year. “When you’re seeing growth like that, believe me: you’re more inclined to let your bets run.”

The company now has the funds, he says, to replicate the success it has experienced in the US elsewhere in the world, particularly Europe and Asia. “Our experience has been that our customers initially buy us for improved performance, sure, but then they repeat-buy because they’re dumping the complexity that they’ve had to tolerate for a long time in their storage environments,” he says. “That’s something that translates well internationally. We now just need to get that message out there.”

See Pure Storage discuss the Rise of the Flash Fuelled Data Centre at IP EXPO Europe 8th – 9th October 2014


Overview: Getting value from Big Data

12 Aug 2014
by
Jessica Twentyman
Jessica Twentyman
Jessica Twentyman is an experienced journalist with a 16 year track record as both a writer and editor for some of the UK's major business and trade titles, including the Financial Times, Sunday Telegraph, Director, Computer Weekly and Personnel Today.
Big Data Analytics

Big data is far too important these days to be left entirely to data scientists. In a world awash with data, a working knowledge of analytical methods, technologies and terminologies is an essential skill for business managers.

After all, they’ll be the ones signing the big cheques for big data – and organisations will spend, on average, around $8 million each on big data initiatives in 2014, according to IT market research firm IDC. The company’s survey of over 750 technology decision-makers worldwide, released earlier this year, found that in many cases the spending spree has already begun. Over two-thirds of respondents from large enterprises said their organisations had already deployed or planned to deploy big data projects. Over half (56%) of respondents from SMEs said the same.

In this Technology.info chapter, we take a look at some of the ways that organisations are getting business value from big data and the tools they are using to extract new insights. For example, we take a look at the burgeoning market for SQL-on-Hadoop tools and chat with Mike Olson, co-founder and chief strategy officer at leading Hadoop distribution specialist Cloudera, about how the market is developing. We also explore the ‘smart city’ concept – a technology intersection where big data and the Internet of Things collide – with a look at how two organisations are using the data generated by smart meters to encourage consumers to reduce their energy consumption.

Register now for the Big Data Evolution Summit at IP EXPO Europe

Given most organisations’ spending intentions in these areas, it’s vital for bosses to have a clear understanding, upfront, of the value they hope to get from their investments, but they will be entering unknown territory.

The good news about big data is that it makes it possible for executives to ask questions they’ve always wondered about – and get rich, multi-dimensional answers back in return. The bad news about big data is that traditional methods and tools for data analysis, with which bosses might enjoy a passing familiarity, have a nasty tendency to fall short in environments where the pressure is on to explore vast volumes of both structured and unstructured data, from a wide variety of both internal and external sources.

In other words, companies looking to get business value from big data must familiarise themselves with a world populated by new technologies with strange, unfamiliar names: Pig, Hive, Flume and Sqoop, to name but a few.

On the plus side, cloud technologies will help: there are many ways now to access big data tools and expertise on an on-demand basis, and the IT infrastructure needed to store and process big data can be supplied by a host of cloud providers, such as Amazon Web Services and Microsoft Azure.

With smart choices, big data can mean big insight – if organisations have the ability to turn insight into actions that create competitive advantage. Delivering on that final step, in fact, will likely be the biggest challenge of all.


Advanced Persistent Threat: a new cyber attack ecosystem emerging

11 Aug 2014
by
Paul Fisher
Paul Fisher is the founder of pfanda - the only content agency for the information security industry. He has worked in the technology media and communications business for the last 22 years. In that time he has worked for some of the world’s best technology media companies, including Dennis Publishing, IDG and VNU. He edited two of the biggest-selling PC magazines during the PC boom of the 1990s: Personal Computer World and PC Advisor. He has also acted as a communications adviser to IBM in Paris and was the Editor-in-chief of DirectGov.co.uk (now Gov.uk) and technology editor at AOL UK. In 2006 he became the editor of SC Magazine in the UK and successfully repositioned its focus on information security as a business enabler. In June 2012 he founded pfanda as a dedicated marketing agency for the information security industry, with a focus on content creation, customer relationship management and social media. Paul is the Editorial Programmer for Cyber Security EXPO, which runs alongside IP EXPO Europe, 8-9 October, ExCel, London.
Uri Rivner - BioCatch

Uri Rivner is VP Business Development & Cyber Strategy at Israeli start-up BioCatch. His Keynote at Cyber Security Expo focuses on Advanced Persistent Threats (APT). 

1 How worried should CISOs and CEOs be about Advanced Persistent Threats (APTs)?
Obviously you should ask the CEOs, but advanced persistent threats have become mainstream. Three to four years ago, it was a novelty – not any more. If you care about your intellectual property, you’ve got to take this threat seriously.

Then there’s the serious impact on the bottom line, the cost of these attacks. And the damage to the reputation of your business, the loss of trust among customers and partners. And of course, you may not even be the final target, merely a stepping stone to a bigger target, with all the legal implications of that. This is the reality, and the starting point of this discussion.

2 By focusing on Advanced Persistent Threats, are we in danger of missing more threats from more conventional sources?

We are still spending around £70bn every year on something that doesn’t work: paying for IPS, IDS, web filtering and antivirus – the traditional basics of security. This cannot continue; security is not working. I don’t want to pay that much for a commodity. So there’s no danger of missing old threats, because you have that covered – but the new threats are much more serious and insidious, and conventional defences cannot cope.

These are Advanced Persistent Threats, hacktivism and cyber crime, with criminals now working right inside corporations. Three years ago, criminals were just after employees’ money, hijacking online banking sessions. Now they’re after the business itself, with APTs used to steal data and IP.

We need security that will perform detection, investigation and resilience. In other words, cyber-intelligence.

3 In your experience, is the APT problem getting worse? Without giving away too much of your talk, what new techniques are the bad guys using?
There is evidence of a blurring of lines between cyber criminals, hacktivists and state actors – all assisting each other towards their respective goals. And many more nations are now involved; it’s not just China anymore. The actors have become very adaptive and cunning and, as I mentioned, are attacking new vectors such as the supply chain. We are seeing almost the emergence of an attack ecosystem, where it is difficult to identify who is who. A kind of merging of attackers.

4 Tell us a little bit about your new company BioCatch.
If you recall the famous scene in Blade Runner where a replicant is interrogated to see whether or not he is human – our technology does something similar with cognitive science. At the core of the technology lies a unique mechanism we call Invisible Challenges. This mechanism is responsible for the interaction of the user with the application: whenever a user interacts with an application, a subtle, dynamic cognitive challenge is injected, and the user responds without being aware that it was there.

Each user reacts differently and has a unique Cognitive Signature. If a deviation from the regular behavioral profile is spotted at any point during a session, BioCatch immediately senses foul play and sends out an alert of a possible threat. The best thing is, it works!
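BioCatch does not publish its algorithms, but the general shape of the idea – building a per-user behavioural baseline and flagging sessions that deviate from it – can be sketched roughly as follows. The features, threshold and scoring method here are illustrative assumptions, not the company’s actual method.

# Purely illustrative sketch of behavioural anomaly detection; NOT BioCatch's
# actual method. Each user has a baseline (mean and standard deviation) for a
# few interaction features; a live session is flagged if it deviates too far.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Baseline:
    feature_means: dict
    feature_stdevs: dict

def build_baseline(history):
    """history: list of dicts mapping feature name to value, one per past session."""
    features = history[0].keys()
    return Baseline(
        feature_means={f: mean(s[f] for s in history) for f in features},
        feature_stdevs={f: stdev(s[f] for s in history) for f in features},
    )

def anomaly_score(baseline, session):
    """Average absolute z-score across features: higher means less like this user."""
    scores = []
    for feature, value in session.items():
        sd = baseline.feature_stdevs[feature] or 1e-9  # guard against zero variance
        scores.append(abs(value - baseline.feature_means[feature]) / sd)
    return sum(scores) / len(scores)

# Hypothetical features: response time to an injected challenge (ms),
# mouse-path curvature, and typing inter-key interval (ms).
history = [
    {"challenge_response_ms": 420, "mouse_curvature": 0.31, "inter_key_ms": 115},
    {"challenge_response_ms": 450, "mouse_curvature": 0.29, "inter_key_ms": 120},
    {"challenge_response_ms": 430, "mouse_curvature": 0.33, "inter_key_ms": 118},
]
baseline = build_baseline(history)

live_session = {"challenge_response_ms": 900, "mouse_curvature": 0.05, "inter_key_ms": 60}
if anomaly_score(baseline, live_session) > 3.0:  # threshold is an assumption
    print("Possible account takeover: alert the fraud team")

A production system would use far richer features and a trained model rather than a simple z-score, but the flow – profile the user, compare each session, alert on deviation – is the one Rivner describes.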

We are in talks with top banks in the UK, Spain and Italy and a major public cloud provider.

5 What are you looking to get out of Cyber Security Expo 2014?
In the banking sector, it’s not just about stopping fraud anymore. It’s about the fraud, friction and functionality conundrum. To defeat fraud you add more security, which adds friction and a loss of functionality. But ideally, we need to reduce friction to boost functionality.

So I’d like to talk to visitors from other industries at Cyber Security Expo to see if they have the same problem in their organisations. For example, are they under pressure to increase functionality through BYOD, instant access and so on – and are they managing this without increasing friction? I’d like to find out.

6 What are you hoping for in the year ahead in terms of security?
Look, there’s always going to be more fraud. I want to see how people move beyond that: are you stopping business or enabling business? How do we make it so that the business still operates? You have to find clever ways of increasing security without hindering the business.

Join Uri on Wednesday 09th October, 13:40 – 14:10, in the Cyber Security EXPO Keynote Theatre.