How to avoid the yearly €20 billion cost of duplicate research
In my last blog post, from December 2, I addressed one of the major challenges in managing and analysing patent big data, one that no one so far has managed to come to terms with: why 70 % of European companies don't know that they can use patent information to make better business decisions*. In today's post, I want to address another major challenge that we're about to solve.
– The yearly cost of duplicate research is €20 billion, in Europe alone.
National and international patent databases have been available for decades as a source of information, and today there are all kinds of tools available to support the management and analysis of patent data. But there is a problem: the data is much too difficult to understand if you're not a patent expert. And if you are a patent expert, you seldom have the experience or expertise to analyse and understand the associated business opportunities.
Indisputable proof that patent data is not being used to any great extent was presented some time ago by the European Patent Office, EPO (http://www.epo.org/searching/essentials/business.html). The EPO estimates the cost of duplicate research at the enormous sum of €20 billion per year in Europe alone! Knowing this, why would business management accept continuous and unnecessary duplication of work by their skilled employees? Probably because this information is a "well hidden" secret.
All of us who help companies manage and analyse patent information must do a better job of addressing the actual needs of corporate decision makers, investors, managers and innovators engaged in research and development. Not to forget the needs of academia, where duplicate research is often carried out with the support of public or private funds.
I am convinced that we have more and better insight than we have managed to share so far. We can contribute to strengthening entire industries, and even societies.