Posted: June 21, 2016
Contributing Authors: Mike Spence and Martin Bijman
It feels like the intellectual property (IP) industry is being inundated with new patent tools. These tools generally sit on top of the same corpus of data – patent and trademark filings from patent and trademark offices (PTOs) around the world. The USPTO corpus has arguably been the most profitable for IP stakeholders, and it therefore figures prominently in the value proposition of most patent tools.
Key differentiators
In the face of both increased competition and corporate interest, tool vendors now differentiate themselves on the basis of:
- Their data (e.g., by providing the earliest possible access to corpus changes or by further curating the data set)
- How they augment the data with other data sources, for example, company mergers and patent sales
- The IP workflows they automate/improve
- The analytics they offer that reveal hidden trends and yield insights
- Their ease of use – better graphical user interfaces (GUIs)
Focusing on analytics
Most vendors are now focused on analytics – from new visualizations, like force-directed maps, to lexical and statistical techniques, like latent semantic analysis (LSA). Patents are fundamentally about language, and these new techniques are relegating purely manual patent searches to lower-cost labour markets. As global PTO data grows, such techniques are of increasing importance and are (rightfully) dominating patent tool innovation and new features.
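To make the LSA idea concrete, here is a minimal sketch of how such an analysis might look over a handful of patent abstracts, using scikit-learn's TF-IDF vectorizer and truncated SVD. The sample abstracts are invented placeholders rather than real USPTO records, and a production pipeline would of course use far more documents and components.

```python
# Minimal LSA sketch: TF-IDF term weighting followed by a rank reduction.
# The abstracts below are invented examples, not actual patent text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "A battery management system for electric vehicles with thermal control.",
    "Method for wireless charging of an electric vehicle battery pack.",
    "Image compression using a discrete cosine transform and entropy coding.",
]

# Build a TF-IDF weighted term-document matrix.
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(abstracts)

# Project documents into a low-dimensional "concept" space; a real corpus
# would typically use a few hundred components rather than two.
lsa = TruncatedSVD(n_components=2, random_state=0)
concepts = lsa.fit_transform(X)

# Documents that share related vocabulary (e.g., "battery" and "charging")
# end up close together in the concept space, even without exact keyword overlap.
print(cosine_similarity(concepts))
```

This is the basic reason lexical techniques outperform keyword search alone: similarity is measured in the reduced concept space, so related patents surface even when they use different terminology.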
Anticipated impacts
The USPTO’s planned application programming interface (API) release will likely have the following impacts:
- Encourage new entrants and further innovation in the patent tool market by lowering a barrier to entry
- Reduce the value proposition of purely data-focused tool vendors
- Further shift the focus of tool vendors towards GUIs, workflows, and analytics
- Set a precedent for other international PTOs to follow suit
It should be noted that the USPTO corpus has been freely available for some time – it was hosted publicly by Google for years and more recently by others. That public data was, however, inconsistently formatted and unwieldy, making it difficult to leverage.
If “done right,” a formal public API will make it easier and faster for tool vendors to spin up new applications. Existing big data platforms (including open-source ones) could likely be hooked up directly to the new APIs. This will put low-cost IP analytics at many companies’ fingertips and do for patent analytics what the release of the first software development kits (SDKs) did for smartphone development.
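As a rough illustration of how simple that integration could become, the sketch below shows a tool pulling patent records from a public REST endpoint. Since the API discussed here had not yet been released, the URL, query parameters, and response fields are all placeholders, not a real USPTO interface.

```python
# Hypothetical sketch of querying a public patent API.
# Endpoint, parameters, and response shape are assumptions for illustration only.
import requests

API_URL = "https://api.example-uspto.gov/patents"  # placeholder endpoint

params = {
    "query": "latent semantic analysis",  # hypothetical full-text search parameter
    "per_page": 25,
}

response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()

# Assumed JSON structure: a "results" list of records with number and title fields.
for record in response.json().get("results", []):
    print(record.get("patent_number"), record.get("title"))
```

The point is less the specific calls than the workflow: a few lines of glue code replace the bulk downloads and custom parsers that the old, inconsistently formatted public dumps required.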