1.4.0 Major Release – Google Pagespeed Scores!

Download Windows, Mac or Linux

Remember, you need Java installed for Mac & Linux, and you should uninstall any previous version first.

Installing on Mac Instructions

See this guide: https://github.com/garethjax/beamusup-osx-installer

Need Support?

Leave a comment on this post with your real email address and real URL, and I’ll get back to you.

What’s in the update?

New Feature – Google Pagespeed Check

You can now check the Google Pagespeed score, free and in bulk, on all your links! Just follow this easy tutorial to get an API key and you’re good to go. Then, in the app, right-click the link you want:
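The bulk check relies on Google’s public PageSpeed Insights v5 API, which is what the API key from the tutorial is for. If you’d rather script the same check yourself, here is a minimal sketch; the endpoint and response shape come from the public API, while the helper names are made up for illustration and are not BeamUsUp code:

```python
from urllib.parse import urlencode

# Public PageSpeed Insights v5 endpoint; the API key from the
# tutorial above is passed as the `key` query parameter.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(url, api_key, strategy="mobile"):
    """Build the GET URL for a single Pagespeed check."""
    query = urlencode({"url": url, "key": api_key, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

def extract_score(response_json):
    """Turn the 0.0-1.0 Lighthouse performance score into the familiar 0-100 value."""
    score = response_json["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)
```

Fetch the URL from `build_psi_request` with any HTTP client and pass the decoded JSON to `extract_score`; a raw score of 0.93 comes back as 93.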


Bug Fixes

  • Can now crawl websites that specify a port (e.g. example.com:8000)
  • Fixed the crawl breaking when robots.txt is missing


33 comments

  1. Ralf
    04 . Feb . 2025

    I’ve downloaded the JAR version, but when running it I get “A Java Exception has occurred” from the Java VM Launcher.

    Java 1.8.0_441-b07 is installed and it’s a Win11 Pro machine.

    • G
      04 . Feb . 2025

      Hey Ralf, can you try installing the Windows version instead? Let me know if it works.

      Thanks.

  2. Braulio
    06 . Feb . 2025

    I’ve found around 300 not found (404) pages. I want to quickly export all the “incoming links” for all of these, so I can edit the source, as sometimes the URLs are “hardcoded” in the content rather than being called dynamically. Is there any way to do this?

    • G
      07 . Apr . 2025

      No but good idea. Thanks

  3. Gene
    06 . Feb . 2025

    Thank you for this brilliant resource! One question: is there something in the settings that would lessen the load on my server? I noticed that while crawling, or even while loading a previous crawl, my server CPU, RAM, and I/O all peak at 100%.

    • G
      07 . Apr . 2025

      If your website can’t take it, there’s a problem with your site, which I would check. But it’s a good idea and I’ll implement it in the future. Thanks

  4. John A
    08 . Feb . 2025

    Awesome tool, thanks for doing this!

    Small issue: it seems to be parsing .svg files, which are text but not HTML, so it complains about missing components.
    It doesn’t fail, but it is a little distracting.

    Url Address: https://10.0.0.10/img/pycro-nano-arduino.svg
    Filters Matched: Title Is Missing, Meta Description Missing, No H1 Header

  5. John A
    14 . Feb . 2025

    Running v1.4.0. Worked fine several runs across a local server and a remote one. Got an exception today for some reason:
    Exception in thread "AWT-EventQueue-0" java.lang.ArrayIndexOutOfBoundsException: Index 24 out of bounds for length 24
    at java.desktop/javax.swing.DefaultRowSorter.setModelToViewFromViewToModel(DefaultRowSorter.java:745)
    at java.desktop/javax.swing.DefaultRowSorter.rowsInserted0(DefaultRowSorter.java:1078)
    at java.desktop/javax.swing.DefaultRowSorter.rowsInserted(DefaultRowSorter.java:879)
    at org.jdesktop.swingx.sort.DefaultSortController.rowsInserted(DefaultSortController.java:403)
    at java.desktop/javax.swing.JTable.notifySorter(JTable.java:4338)
    at java.desktop/javax.swing.JTable.sortedTableChanged(JTable.java:4186)
    at java.desktop/javax.swing.JTable.tableChanged(JTable.java:4463)
    at org.jdesktop.swingx.JXTable.tableChanged(JXTable.java:1529)
    at java.desktop/javax.swing.table.AbstractTableModel.fireTableChanged(AbstractTableModel.java:302)
    at java.desktop/javax.swing.table.AbstractTableModel.fireTableRowsInserted(AbstractTableModel.java:237)
    at com.beamusup.webcrawler.swing.model.ResultTableModel.onModelChange(ResultTableModel.java:68)
    at com.beamusup.webcrawler.swing.model.ResultTableModel.onModelChange(ResultTableModel.java:47)
    at com.beamusup.webcrawler.swing.MainFrame$ModelListener.lambda$crawlTaskCompleted$4(MainFrame.java:294)
    at java.desktop/java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:318)
    at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:773)
    at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:720)
    at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:714)
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:400)
    at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:87)
    at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:742)
    at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:203)
    at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:124)
    at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:113)
    at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:109)
    at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
    at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:90)

    (try on arrizza.com)

  6. Cecelia
    20 . Feb . 2025

    Thanks for the great tool!

    One thing I’m experiencing is that when I tell Google Pagespeed to check all URLs, the queue window pops up over and over again. I’m not clicking on the “Google Pagespeed” filter. Any way to prevent this or am I doing something wrong?

    • G
      07 . Apr . 2025

      It is a bug. Thanks

  7. Flavia Donadio
    21 . Feb . 2025

    I’ve been using BUU for some time and it works really well. One feature I really need, but that BUU doesn’t support (yet), is finding <img> tags missing the “alt” attribute. That expensive alternative supports this, but it’s hard to beat free, right? 😉

    If you can implement that, I’ll be happy to buy you a beer (or a 6-pack).

    • G
      07 . Apr . 2025

      We’ll see 🙂
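Until the feature lands, the check Flavia describes is simple enough to script outside the crawler. A minimal sketch using Python’s standard-library html.parser (the class name and output format are made up for illustration, not part of BUU):

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> tag that lacks an alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "(no src)"))

finder = MissingAltFinder()
finder.feed('<p><img src="logo.png" alt="Logo"> <img src="hero.jpg"></p>')
print(finder.missing)  # hero.jpg has no alt attribute
```

Feed it the HTML of each crawled page and collect the results per URL.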

  8. AK
    21 . Feb . 2025

    I’m trying to download the Windows version and it wants to open this webpage (https://beamusup.com/downloads/buu.msi) but it just keeps spinning no matter the browser I use. I’ll try back in a few days but wanted to give you a heads up.

    • G
      07 . Apr . 2025

      Wasn’t aware of that, but I guess it’s working for most people. Thanks

  9. Lukas
    24 . Feb . 2025

    Hi, is there any way to export Page Speed data to Excel?

    • G
      07 . Apr . 2025

      Yes, just use the export button at the top right of the app.

  10. Ivan
    27 . Feb . 2025

    Love this awesome tool, thank you so much for building this!

    Was wondering if it’s possible to extract images/videos links from the site as well or not. It would be great if that’s possible too!

    • G
      07 . Apr . 2025

      Thanks, appreciated. No, that isn’t possible.

  11. Brodey
    18 . Mar . 2025

    The column “Total Internal Links” is being sorted as a string, not an integer. So it’ll take you through the 80s, 70s, 60s, then suddenly 5000, even though 5,000 is clearly more than 80. A string sort compares the first character, while an integer sort would compare the whole number.

    Simple fix. Other columns that should be integers might be strings as well, I’m not sure.

    • G
      07 . Apr . 2025

      Thanks!
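The ordering Brodey describes, lexicographic sorting of numeric strings, is easy to reproduce. This is a generic Python sketch of the behaviour, not BeamUsUp’s actual table code:

```python
# Crawl results with internal-link counts in the 60-80 range plus one outlier.
counts = [82, 74, 5000, 61]

# Sorting the values as strings compares character by character,
# so "82" outranks "5000" because '8' > '5'.
as_strings = sorted((str(n) for n in counts), reverse=True)

# Sorting as integers compares whole numbers, putting 5000 first.
as_numbers = sorted(counts, reverse=True)

print(as_strings)  # ['82', '74', '61', '5000']
print(as_numbers)  # [5000, 82, 74, 61]
```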

  12. Dirk
    20 . Mar . 2025

    Very cool! Works for me!

    I run it on a MacBook, which I’m not very familiar with, so as a noob I had to figure out how to do it. If you would like all the steps needed, let me know! I’ve had GPT type them out.

  13. Tore
    24 . Mar . 2025

    I can’t seem to get it started on my MacBook. I have allowed it in the Settings; however, the program won’t open.

  14. Tahir
    27 . Mar . 2025

    Would it be possible to add a wait time between URL checks due to a Cloudflare server block?

    • G
      07 . Apr . 2025

      Whitelist your IP on Cloudflare 😀

  15. Ward
    01 . Apr . 2025

    Nice tool! Any chance basic authentication is supported?

    • G
      07 . Apr . 2025

      No, not yet.

  16. Monica
    03 . Apr . 2025

    Hello
    I am trying to analyze a website with more than 100K links, but once the crawl is almost finished, the app freezes and does not let me do anything. If I export to Excel, the file is empty.

    I have let it run without touching anything so it can process the whole site, but it reaches a point where nothing responds.

    I managed to export part of the site, but I need the complete data.

    Any ideas?

    • G
      07 . Apr . 2025

      This is likely due to limitations of your computer. However, you should consider splitting your crawls to target specific sections of the website. In most cases, once you’ve reached around 50,000 links, you’ve already captured the main site architecture and key pages. At that point, it’s usually better to stop crawling and start analyzing the data.


  17. Andrea Scarpetta
    06 . Apr . 2025

    Hey there! I’ve made a short guide to installing BUU on OS X.
    https://github.com/garethjax/beamusup-osx-installer

    • G
      07 . Apr . 2025

      Thanks!

  18. Monica
    09 . Apr . 2025

    Thanks, G!

    I finally crawled the site, 118K links. I stopped it so I could export to XLS. Just out of curiosity, how long does it take to process everything?

    Thanks!
