ActiveComply website archival occurs through the automated scheduling of a bot to view all of the content on a website. When the bot looks at the homepage, it takes note of all of the navigational links on the page, builds a map of every page on the site, and then visits those pages as well. This process is called "crawling," and it creates a comprehensive list of pages on a website. That list is then re-crawled on a scheduled basis to look for new pages or for changes to existing pages.
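The discovery process described above can be sketched as a simple breadth-first crawl. This is an illustrative example only, not ActiveComply's actual implementation: the in-memory `SITE` map (page path → links found on that page) stands in for fetching and parsing real pages.

```python
from collections import deque

# Hypothetical site map: each page path maps to the links found on it.
SITE = {
    "/": ["/about", "/rates"],
    "/about": ["/", "/team"],
    "/rates": ["/"],
    "/team": ["/about"],
}

def crawl(site, start="/"):
    """Breadth-first crawl: start at the homepage, follow every
    navigational link, and return the full list of discovered pages."""
    seen = {start}
    queue = deque([start])
    pages = []
    while queue:
        page = queue.popleft()
        pages.append(page)
        for link in site.get(page, []):
            if link not in seen:  # skip pages already discovered
                seen.add(link)
                queue.append(link)
    return pages

print(crawl(SITE))  # ['/', '/about', '/rates', '/team']
```

Starting from the homepage, the crawl discovers `/team` even though it is only linked from `/about`, which is why clicking through links from the homepage is usually enough to map a whole site.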

This process takes place several times per week. When a change is found on a page, a screenshot is taken and added as a new version of that page in the site's archive history. This allows us to create a historical archive of how every page on a website changes over time.
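The "only archive when something changed" step can be sketched as follows. Again, this is a hypothetical illustration, not the production system: it compares a hash of the page content against the most recent stored version and appends a new version only on a mismatch (the real system stores screenshots rather than hashes).

```python
import hashlib
from datetime import datetime, timezone

# Hypothetical archive: URL -> list of (timestamp, content hash) versions.
archive = {}

def record_if_changed(url, content):
    """Append a new version for the page only when its content differs
    from the most recently archived version. Returns True if a new
    version was recorded."""
    digest = hashlib.sha256(content.encode()).hexdigest()
    versions = archive.setdefault(url, [])
    if versions and versions[-1][1] == digest:
        return False  # unchanged since the last crawl
    versions.append((datetime.now(timezone.utc).isoformat(), digest))
    return True
```

For example, crawling the same unchanged page twice records only one version; a later edit to the page adds a second version, preserving the page's history over time.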

Not every page on a website can be found by clicking through its links. An example could be a marketing landing page that is only used for sharing to social media. Other links might only exist in the "About" section of someone's Facebook Page. When our system sees these links, they are added to the crawling schedule and archived with the rest of the website.
