How to do Simple Web Scraping Using jQuery?

To do simple web scraping with jQuery, we can use jQuery's $.get method to make GET requests to the web pages we want to scrape.

Then, in the success callback, we can parse the HTML string returned by the GET request and select the elements we want with jQuery.

For instance, we can write:

$.get('https://jsfiddle.net/', (html) => {
  // Parse the returned HTML string into a jQuery object and
  // spread the matched div elements into an array
  [...$(html).find("div")].forEach((el) => {
    // Log the text content of each div
    const text = $(el).text();
    console.log(text);
  });
});

We call $.get with the URL we want to get data from and a callback that runs when the request succeeds.

In the callback, we parse the HTML result into a DOM object with $(html).

Then we spread the div elements returned by the find method into an array.

Finally, we call forEach on the array and access each element through the el parameter.

We then get the text content of each element with $(el).text().
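
The same pattern works with any selector. Below is a minimal sketch that pulls the text and href of each link instead of the div text; the "a" selector (and jsfiddle.net as the target) are just assumptions for illustration:

// Sketch: log each link's text and href from the fetched page.
// The "a" selector and the target URL are assumptions for illustration.
$.get('https://jsfiddle.net/', (html) => {
  [...$(html).find("a")].forEach((el) => {
    const $el = $(el);
    console.log($el.text().trim(), $el.attr("href"));
  });
});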

We will likely run into CORS issues if we try to use $.get to scrape a page that isn't on the same origin as the page hosting our code.
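
One common workaround is to route the request through a CORS proxy that returns the target page with permissive headers. The sketch below assumes a public proxy such as allorigins.win; the proxy endpoint is an assumption, and any service that returns the raw HTML would work the same way:

// Sketch: request the page through a CORS proxy so the browser accepts the response.
// The proxy endpoint is an assumption; substitute any proxy that returns raw HTML.
const target = 'https://jsfiddle.net/';
const proxied = 'https://api.allorigins.win/raw?url=' + encodeURIComponent(target);

$.get(proxied, (html) => {
  [...$(html).find("div")].forEach((el) => {
    console.log($(el).text());
  });
});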

