Today the German publisher Vogel Verlag released the new issue of Next Industry, dedicated to Big Data. The issue includes several articles explaining how big data algorithms are applied in different industry scenarios. One of them – written by IPlytics CEO Tim Pohlmann – explains how big data will shape the future of Intellectual Property analytics.

Patents have developed into one of the most valuable assets of innovative companies. Just a few years ago Google acquired Motorola for USD 12.5 billion, only to sell it to Lenovo a few years later for about USD 3 billion. However, Google did not sell the patents, which in this deal were worth roughly three times the value of the business itself. In fast-evolving industries especially, patents are a means to secure ownership of the technologies that will shape future markets. Patents also increasingly generate direct revenue through the licensing and sale of patent portfolios. Understanding where to patent and how to commercialize patents globally is, however, one of the most challenging tasks. One reason is the sheer amount of patent data: each month over 500,000 patents are filed worldwide, and even large corporations cannot read each and every patent document. New big-data-driven solutions such as IPlytics Platform help to

  • value patents – to focus on the gold nuggets,
  • identify patented technologies – to understand technology clusters without reading patents,
  • extrapolate information from patented technologies – to identify patents relevant to a given portfolio or licensing program,
  • monitor each and every patent-related activity – to identify patent trades, patent litigation, patent licensing deals, and patent legal statuses.

The Next Industry article further explains in more detail how big data algorithms make it possible to connect additional information to patent data, such as:

  • worldwide research articles
  • worldwide standards documents, standard contributions and standard essential patents
  • worldwide company profiles, startup profiles and business news

Connecting millions of data points in real time – so that the customer can search, handle, and analyze the data – is a Big Data problem. IPlytics has developed an algorithm that makes use of a scalable, distributed big data framework to answer large data queries in just a few seconds.
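The general pattern behind such frameworks is query fan-out: the data is split into shards, each shard is searched in parallel, and the partial results are merged. The sketch below illustrates this idea on toy data; all names and structures are illustrative assumptions, not IPlytics' actual implementation.

```python
# Minimal sketch of distributed query fan-out over sharded data.
# The shard layout and record format are illustrative only.
from concurrent.futures import ThreadPoolExecutor

# Toy "shards": each holds a slice of patent records (id, title).
SHARDS = [
    [(1, "battery cell cooling"), (2, "antenna array design")],
    [(3, "5g scheduling method"), (4, "battery charging circuit")],
    [(5, "video codec prediction")],
]

def search_shard(shard, term):
    """Scan one shard for records whose title contains the term."""
    return [rec for rec in shard if term in rec[1]]

def distributed_search(term):
    """Fan the query out to all shards in parallel and merge the results."""
    with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
        partials = pool.map(lambda s: search_shard(s, term), SHARDS)
    # Merge and sort the partial hit lists into one result set.
    return sorted(r for part in partials for r in part)

print(distributed_search("battery"))  # records 1 and 4
```

In a production system each shard would live on a separate machine and the merge step would also handle ranking and deduplication, but the fan-out/merge structure stays the same.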

Please contact IPlytics if you would like to learn more or read the full article.
