Using Axiom for scraping and automation

How to scrape different sites and automate data collection tasks with Axiom.

Written by Veronica Fletcher
Updated over 6 months ago

Axiom.ai is a great no-code web automation tool that can help you collect data from different websites that would otherwise take a long time to gather manually. It won't be able to do everything, and for more complicated tasks you might need some code, but it can help with a lot.

The examples below don't cover every website you can scrape, but they cover the basics of using Axiom and a few of the different things you can do with it.

You should also be aware that some websites have more protection against scraping than others. On any website where you need to be logged in to collect the information, be extra careful or you could end up getting your account blocked (LinkedIn, for example, is very strict about scrapes).

Here are a few things you can do with Axiom to avoid being detected.

  • Using Axiom's built-in bot-detection bypassing methods (I highly recommend enabling this - learn more here).

  • Going slowly and adding sufficient wait times so the bot 'acts' more like a human.

  • Using proxies (an intermediary between you and the website that hides your IP address and makes it look like your requests are coming from different people).

If you are coding your own web scraper, there is more you can do to avoid bot detection.
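For example, here is a minimal Python sketch of those same ideas applied to a DIY scraper - random waits, a rotating User-Agent, and an optional proxy. The URLs, proxy address, and timings are placeholders, not a definitive setup.

```python
import random
import time

import requests

# A small pool of User-Agent strings so requests don't all look identical.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
]

# Placeholder proxy address - swap in a real proxy service if you use one.
PROXIES = {
    "http": "http://myproxy.example.com:8080",
    "https": "http://myproxy.example.com:8080",
}


def fetch(url, use_proxy=False):
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    response = requests.get(
        url,
        headers=headers,
        proxies=PROXIES if use_proxy else None,
        timeout=30,
    )
    response.raise_for_status()
    # Wait a random few seconds between requests so the traffic looks
    # more like a person browsing than a bot hammering the site.
    time.sleep(random.uniform(3, 8))
    return response.text


if __name__ == "__main__":
    # Placeholder target - swap in the pages you actually want to scrape.
    html = fetch("https://example.com/")
    print(len(html))
```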

Now onto the examples.

Example 1: Scraping Coinopsy with Axiom.

This is a very simple example showing the basic scraping capabilities of Axiom. It will be helpful if you have a single page you want to collect information from but can't or don't want to do it manually.

Example 2: How to scrape IMDB with Axiom (collecting data on how many horror films were shot in different locations)

This is a good example of using Axiom on a website where the URL structure is predictable and you are able to construct the URLs of the pages you want to scrape yourself.
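When the URLs are predictable like this, you can build the full list of pages up front and feed it to Axiom (or any scraper) as data. A rough Python sketch of that idea - the URL pattern below is made up for illustration, not IMDB's real search syntax - and the same approach applies to Examples 3 and 4:

```python
# Hypothetical URL pattern - find the real one by running a search on the
# site manually and copying what appears in the address bar.
BASE = "https://www.example.com/search?genre=horror&location={location}&page={page}"

locations = ["London", "Vancouver", "Atlanta"]

# Build one URL per location per results page (first three pages here).
urls = [
    BASE.format(location=loc, page=page)
    for loc in locations
    for page in range(1, 4)
]

for url in urls:
    print(url)
```

You can paste a list like this straight into the spreadsheet that drives your Axiom bot.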

Example 3: Automating travel time data collection with Axiom (scraping Rome2Rio for the travel time between different destinations)

This is another example of using Axiom on a website where the URL structure is predictable and you are able to construct the URLs of the pages you want to scrape yourself.

Example 4: Scraping Trust Radius with Axiom (scraping reviews from a seed list of companies from Trust Radius).

This is a third example of using Axiom on a website where the URL structure is predictable and you are able to construct the URLs of the pages you want to scrape yourself.

Example 5: Scraping CrimeMapper with Axiom (scraping the number of crimes from different locations).

This is an example of using Axiom when the URLs of the website you're trying to scrape aren't easy to guess or aren't uniform. This example focuses more on how you can interact with the website via Axiom to navigate to the information you want to scrape.

Note: In this case it's better to use a new sheet for the results and not write them on the same sheet as the input (as I suggested in the video).

So the setup would be the location data in a 'data sheet', and then you would tell Axiom to write the results to a new sheet (making sure 'add to existing data' is selected). Then you can merge the two sheets together at the end.
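If you'd rather do that final merge outside the spreadsheet, one way is sketched below in Python with pandas. The file and column names are placeholders - use whatever headers your sheets actually have, and export both sheets as CSV first.

```python
import pandas as pd

inputs = pd.read_csv("locations.csv")        # the 'data sheet' of locations
results = pd.read_csv("axiom_results.csv")   # the sheet Axiom wrote to

# Join on whichever column both sheets share - assumed here to be 'location'.
merged = inputs.merge(results, on="location", how="left")
merged.to_csv("merged.csv", index=False)
```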

Example 6: Scraping Glassdoor with Axiom (scraping reviews from interns).

Part 1: Scraping URLs of companies which fit a certain criteria from Glassdoor.

This is an example of using Axiom to collect links from a page that you will then scrape later.
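For comparison, this is roughly what that link-collection step looks like if you code it yourself in Python with requests and BeautifulSoup. The URL and CSS selector are placeholders - Glassdoor's real markup will differ, and it is aggressive about blocking bots.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder listing page - not Glassdoor's real URL structure.
listing_url = "https://example.com/companies?industry=tech"

html = requests.get(listing_url, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Placeholder selector: grab the link from every company 'card' on the page.
company_urls = [a["href"] for a in soup.select("a.company-card-link")]

print(company_urls)
```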

Part 2: Using those URLs to scrape reviews (specifically from interns) from Glassdoor

This is another example of using Axiom when the URLs of the website you're trying to scrape aren't easy to guess or aren't uniform. This example focuses more on how you can interact with the website via Axiom to navigate to the information you want to scrape.

This example also touches on preventing bot detection.

Example 7: Scraping Airline Reviews with Axiom (collecting reviews about different airlines)

This example is slightly more complicated than the other examples because it uses a bit of custom code in the Axiom workflow to append the URL to the data. This is useful when there is no identifying information in what you're scraping.
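The idea of that custom code step is just to stamp the page's URL onto every scraped row, so you can later tell which airline each review came from. I believe Axiom's custom code steps take JavaScript, but the logic is simple either way; here is the same transformation sketched in Python with made-up field names.

```python
# Hypothetical scraped rows - in Axiom these would come from the scrape step.
rows = [
    {"review": "Great legroom", "rating": "5"},
    {"review": "Flight delayed twice", "rating": "2"},
]
source_url = "https://example.com/airline-reviews/acme-air"

# Append the page URL to every row so each review can be traced back
# to the airline page it was scraped from.
rows_with_source = [{**row, "source_url": source_url} for row in rows]

for row in rows_with_source:
    print(row)
```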
