Guest blog post by Sean Farrington, UK MD and RVP Northern Europe, QlikTech
Financial markets are experiencing a ‘data fog’. We are forever hearing about the growing volume of data being generated, whether it is IBM’s observation that data volume doubles every two years for the average organisation, with that data then needing to be managed for years, or Cisco’s recent prediction that there will be 15 billion connected devices by 2015, with a staggering amount of data traversing networks as a result.
In my opinion, too little thought is given to the amount of data, much of it sensitive, generated by financial services. We know just how important access to data is for the industry: we work with over 250 financial services firms, including over 50 per cent of the Fortune Global top 25. Building on recent innovations in IT theory, we are continually working closely with these firms to improve the algorithms and tools that give them access to relevant data, so they can harness insights for business growth while also providing increased transparency.
All too often, the focus is on big data and on ensuring that all information is downloaded, stored and managed in case of future regulatory requests, and to mitigate political and legal risks. In our view, and judging by our customers’ needs in the financial sector, the focus should instead be on empowering employees to harness data for insight and for business discovery. Backing data up and making it available for the long term should happen regardless, but it is the here and now that counts in the financial sector, and using technology cleverly gives financial organisations the opportunity to collate and analyse data, both to avoid repeating past mistakes and to identify future growth opportunities.
