By Lior Shvo, Managing Director at Sellers.guide
When the IAB launched ads.txt in 2017, adoption was slow until Google embraced it and announced that DoubleClick Bid Manager, AdSense and DoubleClick Ad Exchange would filter for ads.txt files. At that point, ads.txt began to take off, and its promise of bringing transparency to digital advertising started to take shape.
By early 2018, ads.txt adoption exceeded 50% of the top 5,000 programmatic sites. This was a sign not only that ads.txt was gaining momentum, but that it was doing so among the top sites in the world, the ones used by big brands and significant programmatic media buyers.
By this point, ads.txt was on its way to becoming the solution the industry had been waiting for: a silver bullet that could align buyers and sellers and create a sense of balance within the ecosystem.
Ads.txt has done great things for the industry. It HAS brought greater transparency. It HAS helped keep bad actors out of the programmatic scene and prevent fraudulent selling of a publisher’s inventory.
What it hasn’t done is deliver on the notion that ads.txt would be the silver bullet that cleaned up the industry once and for all. The concept was good, the architecture was solid, but it was never a single-source solution.
The problem with the silver-bullet theory is that ads.txt requires legwork from publishers. It is not a set-it-and-forget-it solution, but a tool that needs periodic attention and maintenance.
I believe that part of the reason ads.txt has not lived up to its full potential is because it has been mischaracterized. It has been elevated to this almost mythical idea that would be the guiding light for transparency in adtech.
In reality, ads.txt is just a tool for publishers, not a cure-all solution.
Think of ads.txt like the engine of a car. Cars need regular maintenance; over time, they build up unwanted material in their engines that degrades performance.
Ads.txt is the same. Once publishers set up their crawlable file and start adding sellers, the file needs to be maintained. It can’t be left without proper attention and care, or it will harbor a build-up of unwanted material.
In practice, every time a publisher signs a new seller, they must add that seller to their ads.txt file. This will establish a protocol for who is authorized to sell inventory on behalf of that publisher. As sellers come and go, publishers must review their ads.txt files, ensure the relationships (direct vs. reseller) are correct and remove unwanted sellers.
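Under the IAB specification, each line of an ads.txt file lists the ad system’s domain, the publisher’s account ID on that system, the relationship type, and optionally a certification authority ID. A hypothetical file might look like this (every domain and ID below is a placeholder, not a real seller):

```
# ads.txt for publisher.example (all values are placeholders)
adexchange.example, 12345, DIRECT, abc123     # signed directly with this exchange
resellerssp.example, 67890, RESELLER          # authorized to resell via a partner
contact=adops@publisher.example
```

Keeping this file accurate is the "protocol" in question: a buyer’s crawler reads it to decide whether a given seller is actually authorized.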
If they don’t, every outdated seller leaves behind lines in the publisher’s ads.txt file that still grant it access to the publisher’s inventory, creating a loophole for possible fraud. Every misrepresentation of inventory cannibalizes the publisher’s direct budgets and hurts revenue. As those old lines accumulate, the ads.txt file grows larger and larger and no longer accurately represents who is authorized to sell that publisher’s inventory.
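The stale-entry audit described above can be sketched in a few lines of Python. This is a minimal illustration, not a production tool: the seller domains, IDs, and the in-memory `current_sellers` set are all hypothetical, and a real audit would pull the current partner list from the publisher’s own records.

```python
def parse_ads_txt(text):
    """Parse ads.txt content into (domain, account_id, relationship) tuples."""
    entries = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and surrounding whitespace
        if not line or "=" in line:           # skip blanks and variable lines (e.g. contact=)
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:
            entries.append((fields[0].lower(), fields[1], fields[2].upper()))
    return entries

def find_stale(entries, current_sellers):
    """Return entries whose (domain, account_id) is no longer an active partner."""
    return [e for e in entries if (e[0], e[1]) not in current_sellers]

# Hypothetical file contents and partner list, for illustration only
ads_txt = """
adexchange.example, 12345, DIRECT, abc123  # active partner
oldssp.example, 99999, RESELLER            # partnership ended last year
contact=ops@publisher.example
"""
current_sellers = {("adexchange.example", "12345")}
stale = find_stale(parse_ads_txt(ads_txt), current_sellers)
# each entry in `stale` is a line that should be removed from the file
```

Running an audit like this whenever a seller relationship ends is exactly the periodic maintenance the article argues for.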
Sadly, too many publishers have ignored the responsibility of properly maintaining their ads.txt files. Some are daunted by the IAB guidelines, which run 20+ pages; some lack the knowledge of how to maintain the file; others don’t understand the importance of doing so.
Regardless of the reason, too many publishers have let their ads.txt files fall into disarray, and some have grown into files containing thousands of lines. As a result, these publishers have opened the door for fraudulent sellers to enter the mix, costing them revenue and potentially putting them out of favor with DSPs that are keen on maintaining a robust SPO (supply-path optimization) protocol.
The time has come for publishers to hold themselves accountable and play their part in bringing transparency to the industry. It starts by understanding that ads.txt is a great tool that should be utilized to its fullest extent.
Ads.txt was never meant to be a silver bullet, and it’s time we stop saying it failed. Everything publishers need to make ads.txt succeed as intended is already at their fingertips.