How your website’s terms and conditions can prevent content scraping


In an ideal world, Internet users with malicious intent would visit your site intending to scrape it, but would back off after reading your updated Terms and Conditions


Online businesses have been coping with the menace of content scraping for decades. As the bot threat landscape widens, intelligent bots try to beat existing security solutions with renewed vigour. A well-defined set of Terms and Conditions is the first step an online business can take to deter content scraping. A recent update to StackExchange’s Terms of Service, and the comments posted during and after the update, reflect the seriousness of the scraping issue. Are you trying to prevent content theft and protect data privacy on your websites? Read on…

Scraping and Lawsuits

In January 2015, the Court of Justice of the European Union (CJEU) ruled that if the database (or sensitive content) of an online business is accessible to the public but is not protected under the Database Directive (96/9/EC, the “Directive”), the owners of the business can rely on their website’s proprietary Terms and Conditions to prevent misuse of their content.

The value of Terms and Conditions in suing content infringers is borne out by quite a few famous lawsuits:


Ryanair brought legal proceedings against PR Aviation for screen-scraping its price information with automated software. This lawsuit highlights the importance of bringing terms and conditions properly to a user’s attention through a simple clause.

In 2009, NewRiver Inc, a software provider, sued MorningStar for screen-scraping its prospectuses and engaging in unfair competition. The website’s terms and conditions protected its mutual fund and annuity compliance documents from being exploited.

In July 2014, LinkedIn brought proceedings against HiringSolved, an HR technology company that scraped LinkedIn users’ information to aggregate data on its own website. LinkedIn is well known for its up-to-date profile information, and its terms and conditions helped it sue the miscreants under multiple copyright infringement statutes.

Listings giant Craigslist sued PadMapper for scraping (“mass harvesting”) its listings and sharing the data with the third-party API creator service 3Taps. According to Craigslist, it had issued a demand that PadMapper cease scraping its data once it found that its terms and conditions had been breached.

T&C Quick Tips

Now that we have established the importance of having terms and conditions, it is equally important to present them in a comprehensible manner. Here are two no-brainer steps to make your Terms and Conditions section more customer-friendly:

Use simple language with little or no jargon so that your terms and conditions are clear to every single customer.

Use a clickwrap agreement (also known as a click-through agreement or clickwrap license): an online contract that confirms user consent to a company’s terms and conditions. When this technique is incorporated, users can proceed with an online service only after they have consented to the Terms and Conditions by clicking an option such as “I agree”. That said, whether clickwrap is appropriate depends on the type of online portal the user accesses.
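As a rough illustration of how a clickwrap gate might be wired up server-side, here is a minimal sketch in Python. All names here (`ConsentRegistry`, `serve_content`, the version strings) are hypothetical, not from any particular framework; the point is simply that protected content is withheld until the user’s click on “I agree” has been recorded for the current version of the terms:

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRegistry:
    """Records which users clicked 'I agree', and for which T&C version."""
    accepted: dict = field(default_factory=dict)  # user_id -> accepted version

    def record_agreement(self, user_id: str, tc_version: str) -> None:
        # Called when the user clicks the "I agree" button.
        self.accepted[user_id] = tc_version

    def has_consented(self, user_id: str, current_version: str) -> bool:
        # Consent only counts for the current T&C version, so an update
        # to the terms forces users to click through again.
        return self.accepted.get(user_id) == current_version


def serve_content(registry: ConsentRegistry, user_id: str, tc_version: str) -> str:
    """Gate protected content behind recorded clickwrap consent."""
    if not registry.has_consented(user_id, tc_version):
        raise PermissionError("User must accept the Terms and Conditions first")
    return "protected content"
```

A design point worth noting: keying consent to a terms *version* (rather than a boolean flag) is what lets a site like StackExchange re-prompt users after updating its Terms of Service.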

I have great Terms and Conditions. Now what?

In an ideal world, Internet users with malicious intent would visit your site intending to scrape it, but would back off after reading your updated Terms and Conditions. The reality is far from that. The bot threat landscape is vast, and countless intelligent bots scrape your content while conveniently skipping your Terms and Conditions section. So, how do we stop them?

Stringent terms and conditions accompanied by a reliable bot prevention solution will not only impart a certain level of legitimacy to your business, but also help identify bad bots and take the necessary action against them with greater might and insight.

Bad bots come in many types. Detecting and categorising them with an anti-bot solution that ensures zero false positives will help online businesses shield themselves from scraping and spamming attacks without hampering genuine users.

The views expressed here are of the author’s, and e27 may not necessarily subscribe to them. e27 invites members from Asia’s tech industry and startup community to share their honest opinions and expert knowledge with our readers. If you are interested in sharing your point of view, submit your article here.

The post How your website’s terms and conditions can prevent content scraping appeared first on e27.


This content was published in #Asia by Startup365.
