Browse.AI Review – Turn Any Website Into An API

Have you ever wished there were an easy way to scrape blog post titles or images from a website? Or maybe you want to monitor your competitors’ websites to see whenever they make changes. This is where Browse.AI comes in: it can do precisely that, scraping text and images and helping you monitor websites for changes. So let’s put it to the test.

Now, here we are in the Browse.AI platform, where you can see an overview of the robots I’ve been playing around with and testing. Using Browse.AI, you can set up two different types of robots: you can either extract structured data and put it into a CSV file or Google Sheets, or you can monitor site changes. We will go through both, but to begin with, we will start by extracting structured data.


    1. Features of Browse.AI
    2. Pricing
    3. Alternatives to Browse.AI
    4. Future of Browse.AI
    5. Pros & Cons of using Browse.AI
    6. Last thoughts about Browse.AI

    Features of Browse.AI

    So here you can see that you can watch a demo of how to use this robot, but it is pretty straightforward. The first step is to enter the URL from which you want to pull data. You can then choose whether you need to sign in to extract this data; maybe it’s behind a login wall, a paywall or something else.

    That is also supported here, and you can choose to log in either via session cookies or with a password. In this case, I will not use this feature; I will just start recording the task. Now we are in a window where everything we do is being recorded.

    Up here, you can see that we have a robot, and it is telling us to show it what we need to do to extract the data. Be aware that everything you click on right now, the robot will imitate later on. So I will click ‘Okay, understood’. I do not recommend accepting cookies when you do this, because the robots sometimes fail if you do.

    Scrolling down, you can see we have four different blog posts here. I want to scrape each of these blog post titles because I want to save them in a sheet, so I have an overview of all my blog post titles. I do that by clicking on the image of the robot up here. I then say ‘Capture list’ and choose that I want to capture this list right here. Now I need to choose which text I want.

    Do I only want the title? Do I also want the description, or do I want both? I only want the title. So I’ll click on this here. I will then choose whether I want to capture the link or the text.

    And I want the text. So right here we have the visible text, and I will now press Enter to save this. Next, we need to tell the robot what type of text this is; in this case, it is a blog post title, so I’ll write ‘blog post title’ and save it. Here, you can also choose to capture more than what is currently showing, which you do by using pagination.

    You can tell the robot that there is pagination by loading more, by clicking on page numbers, or something else. For now, there are no more items I want to load, and I want a maximum of 10 items. I will call this list ‘blog post titles’. Now we are ready to capture the list, so I’ll say ‘Okay, understood’ up here. We have now told the robot everything it needs to do to fetch all the blog post titles.
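    The ‘capture list’ step the robot performs is conceptually just ‘find every element matching a pattern and keep its visible text’. As a rough illustration only (the HTML structure and class name below are made up, not taken from any real page), here is the same idea in plain Python:

```python
from html.parser import HTMLParser

# Made-up markup standing in for a blog index page.
SAMPLE_HTML = """
<article><h2 class="post-title">Why time management is important for students</h2></article>
<article><h2 class="post-title">A second example post</h2></article>
"""

class TitleScraper(HTMLParser):
    """Collects the visible text of every <h2 class="post-title">."""

    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs.
        if tag == "h2" and ("class", "post-title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())

scraper = TitleScraper()
scraper.feed(SAMPLE_HTML)
print(scraper.titles)
```

    The point of Browse.AI is, of course, that you never write this selector logic yourself; the robot infers it from your clicks.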

    So I will then again click on the robot and press finish recording. It is now uploading our recording to Browse.AI. It has given it a robot name, which you can always change. I will then press save. And now it is running the robot in the background.

    You can see it’s opening Chrome, navigating to the URL and then pulling all the blog post titles. But right here, you can see that it pulled one empty row and then only the title ‘Why time management is important for students’, even though there were four posts. So what you can do is go down here and retrain the robot.

    When you retrain the robot, you get the same view, where we go up and say ‘Capture list’, and this time let’s try clicking on the object itself. Now we have the object, and when we click on this text, you can see all of them are highlighted. I will capture the visible text, press Enter again, and write ‘blog post title’ (you can see it’s suggesting it), then save it and give the list the same name.

    And then I’ll press ‘Capture list’. Now we are back in Browse.AI, and it is running the robot again. This time, you can see that we got all four blog post titles. So sometimes, you need to tweak it a little to ensure you get the desired result. And here is one thing that I will discuss later on.

    These four rows have now cost me four credits. I will dive deeper into why that is important in the pricing section. But here we can see that we also get a screenshot along with the text itself. Now let’s go back and make a different type of robot. I will say ‘Build a new robot’, choose ‘Extract structured data’, and then let’s see if we can extract the images themselves.

    Now you can see we are on the blog again. If I wanted to do the same as before, I would say ‘Capture list’ and click on the images. But what I want to show you is capturing a screenshot. Here, we can choose to capture a selection, the entire page, or only the visible part we can see in our viewport right now. I want to capture the entire page and call it ‘blog’.

    I will save it. And now we have this one ready. So again, I will say finish recording. We are now back in Browse.AI. I will save this one.

    And now we should receive a screenshot of the entire page. This is great because you can schedule it, and I’ll show you with a different type of robot how to do that in just a moment. Until then, you can see the final screenshot it captured down here. It didn’t fetch all the images, so maybe lazy-loaded images aren’t fully supported.

    It fetched the first image, but the page also loaded a little bit funny. That is how it works, though. I can now confirm that this looks good and save it. I can then choose to run this task right now, and it will create a new screenshot.

    Or I can bulk-run tasks. I can also go to the history to see when it has been run, and go to the monitor tab, where I can add a new monitor. Here I can set up a schedule, so I can say that it needs to take a screenshot of this blog every day.

    I’ll keep the default monitor name and save the monitor. This way, I can monitor this website to see whenever they make changes. But for that task, there is an even more exciting approach: for the robot, we can also integrate Zapier, webhooks or something else to push our content into a third-party app, which is very easy to set up.
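    If you point the robot at a webhook you host yourself, your endpoint just has to parse the JSON body it receives. The payload shape below is an assumption for illustration (field names like `capturedLists` are not taken from the official docs), but the parsing pattern is generic:

```python
import json

def extract_titles(body: str) -> list:
    """Parse a hypothetical Browse.AI-style webhook body and return the
    captured blog post titles. The key names are assumptions, so adjust
    them to whatever the real payload contains."""
    payload = json.loads(body)
    rows = payload.get("capturedLists", {}).get("blog post titles", [])
    return [row.get("blog post title", "") for row in rows]

# A fabricated example payload in the assumed shape:
example_body = json.dumps({
    "capturedLists": {
        "blog post titles": [
            {"blog post title": "Why time management is important for students"},
            {"blog post title": "A second example post"},
        ]
    }
})
print(extract_titles(example_body))
```

    From there, you could append the titles to a sheet or database, which is essentially what the Zapier integration automates for you.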

    And here you can see that we get an integration with Zapier, and then you can start setting it up. You can also adjust the robot’s settings, such as the URL, the name and a few other elements. But now, let’s go back and check out the other type of robot, which monitors site changes. In here, you see the same screen again.

    Now it’s showing a different demo video. And again, we can enter a URL. This time I’ll just enter the front page. We can now start recording the task. So now we are again back on the website.

    Here, we can again choose to take the robot and capture a list, text or a screenshot. If you do a screenshot, you capture the entire page, and it then checks whenever there are changes. You can also capture just text. In this case, I want to track this H1 up here, so I will say ‘Capture text’ and capture the H1. I want to know every time it changes.

    I’ll just call it ‘headline’, keeping it very simple. Now we have this robot ready, so I will finish the recording again. You can see we’re back here with the same setup, and I will save it.

    Now it is asking me how I want to be notified, in this case per email, and how often it needs to run. So I will save the monitor, set to notify me per email and run daily. You can see it has captured the text here, and we have a final screenshot of how it looks. Again, there are some elements it doesn’t load in these final screenshots, but everything looks correct.
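    Under the hood, a text-change monitor only needs to remember what it saw last time and compare it with the latest scrape. A minimal sketch of that idea (my own illustration, not Browse.AI’s actual implementation):

```python
import hashlib

def fingerprint(text: str) -> str:
    """Hash the monitored text so only a short digest has to be stored."""
    return hashlib.sha256(text.strip().encode("utf-8")).hexdigest()

def has_changed(stored_digest: str, current_text: str) -> bool:
    """Compare the stored digest against the freshly scraped text."""
    return fingerprint(current_text) != stored_digest

# First run: store a digest of the H1 we captured.
baseline = fingerprint("Turn Any Website Into An API")

# Later runs: compare and decide whether to send a notification.
print(has_changed(baseline, "Turn Any Website Into An API"))  # False: no change
print(has_changed(baseline, "A brand new headline"))          # True: notify
```

    Storing a digest instead of the full text keeps the monitor cheap to run on a daily schedule.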

    So I will now confirm that this looks good. Otherwise, as you saw before, you can run it again, retrain it differently, and try clicking on other elements to make sure it works for you. We have now set up three different types of robots, and that’s how easy it is to get Browse.AI to monitor the right thing and capture the right text. The possibilities are endless. For example, you can go to a competitor’s website and scrape all their blog post titles to get ideas for what to write about. Then you can take those blog post titles, maybe with their URLs, and send them to a third-party API using Zapier to see which of those blog posts get the most traffic.

    That is just one example; there are endless ways to use Browse.AI. Of course, only our imagination and the pricing limit us.

    Pricing

    Now, taking a look at the pricing, there is one thing you need to be aware of. They do have a free plan, and they have three paid plans as well.

    All of these plans differ in their limits, and the credits especially are what you need to be aware of, because they define how much you can use Browse.AI. One credit is usually one scrape of a piece of text or an image. So let’s say you’re scraping a website with 15 blog posts; that will be 15 credits if you’re only scraping the titles. If you also want to scrape the description and maybe the content, you must multiply by three, so just be aware of this. When we look at the paid plans, I feel the number of credits is very low: you can hardly scrape anything before you hit the limit.
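    To make the credit arithmetic concrete, here it is as a tiny helper (assuming the simple one-credit-per-captured-field model described above):

```python
def credits_needed(posts: int, fields_per_post: int) -> int:
    """Estimate credits as one credit per captured field per post."""
    return posts * fields_per_post

print(credits_needed(15, 1))  # title only: 15 credits
print(credits_needed(15, 3))  # title + description + content: 45 credits
```

    On a plan with a few hundred credits a month, a daily scrape of a modest blog eats the quota quickly, which is exactly the problem.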

    So I hope they will increase those.

    Alternatives to Browse.AI

    When we compare the alternatives, there are two types. First, we have companies like Zyte, where you need to book a demo and go through it with one of their sales consultants; you can’t just sign up yourself.


    But the other alternative, which reminds me more of Browse.AI, is Apify. With Apify, you can get started by signing up, again completely free. Apify also has higher limits: you get many more credits to use, and it works in more or less the same way, where the more you scrape, the more credits you pay from your account.


    But with Apify, those limits are just a lot higher.

    Future of Browse.AI

    Now, when we dive into the future of Browse.AI, they’re working on many different elements, but I have picked out four things I find very interesting. The first is that they’re adding more than 50 pre-built templates for scraping text. This will help you and me onboard more easily and get an idea of what Browse.AI can do, because I think it can do much more than we realise. They are also working on making themselves independent of the Chrome extension, because right now, you can only use Browse.AI through Chrome and their Chrome extension. As you saw in the walkthrough, the Chrome extension is used to train the robot and tell it what it needs to do.

    They are also expanding their API, and with this expanded API, we will be able to do a lot more with the data we get through Zapier. And last but not least, they’re adding an Integrately integration; Integrately is similar to Zapier.

    We just get more options to choose whether we want to use Zapier, Integrately or maybe even something else in the future.
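    Once the expanded API lands, calling a robot programmatically will presumably look something like the sketch below. Everything here (base URL, route, payload and header names) is a guess for illustration, not the documented API, so check the real reference first:

```python
import json
from urllib import request

API_BASE = "https://api.browse.ai/v2"  # hypothetical base URL

def build_run_task_request(robot_id: str, api_key: str, params: dict):
    """Build (but do not send) a hypothetical 'run robot task' request."""
    url = "{}/robots/{}/tasks".format(API_BASE, robot_id)
    body = json.dumps({"inputParameters": params}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={
            "Authorization": "Bearer {}".format(api_key),
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_run_task_request("my-robot-id", "MY_API_KEY",
                             {"originUrl": "https://example.com/blog"})
print(req.get_method(), req.full_url)
```

    Sending the request (for example with `request.urlopen(req)`) would then trigger a run and let you poll for the captured data, cutting Zapier out of the loop entirely.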

    Pros & Cons of using Browse.AI

    Now, after using Browse.AI for some time, I like that it works well on JavaScript-heavy websites, and I like their Zapier integration and the Chrome extension, which is very easy to use. I wish they would improve the pricing and reduce their heavy dependence on the Chrome extension.

    Last thoughts about Browse.AI

    Browse.AI is a really powerful tool that can save a lot of manual hours. There is a ‘but’, though: their credit limits are just too low, which restricts what we can use Browse.AI for. Therefore, I want to give Browse.AI three and a half stars.

    Their core functionality works great, but the limits on the credits are a huge setback. And right now, we can only use it through Chrome; when that changes, it will be a game changer. That’s my review.

    Posted by
    Phillip Stemann

    I have been in the software industry for 10+ years, and I’ve gathered a ton of experience that I’m sharing with you. I test out tools each week and share my findings, so you can easily choose the right software for your needs. So far I have reviewed many types of software and even built software myself; it’s a huge passion of mine.