Scraping 500+ links from 100 emails
Real-world web scraping examples without using AI
Hello people,
Welcome to this week’s new blog.
Well, I have been working extensively with APIs lately, and over the past few weeks I have shared a couple of articles on this platform about AI agents for web scraping and about web scraping APIs, both paid and free.
But today we will be talking about something else.
For the past 2 years we have been writing a weekly newsletter to our subscriber community, and at least 90% of those newsletters, if not all of them, contain web resource links: domains, tools, agents, GitHub repositories, YouTube videos, and so on.
In today’s blog, we will go through the process of collecting the 200+ links from the newsletters we have previously sent to our subscribers, so they can be shared on our website.
Introduction
Let’s start with a quick introduction to the problem statement.
Problem: Collect all the useful and unique links, or more specifically the unique domains, from all the emails stored in the database.
Database: Firebase Firestore
Backend: Express / Node.js
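To make the setup concrete, here is a minimal sketch of what a first pass could look like with this stack. It is not the exact implementation from our codebase: it assumes a Firestore collection named newsletters whose documents keep the raw newsletter content in a body field, and both of those names are placeholders for illustration.

```javascript
// Minimal sketch: pull newsletter documents from Firestore and
// extract the unique domains from any links found in their bodies.
// Assumptions: collection "newsletters" and field "body" (placeholders).
const admin = require('firebase-admin');

admin.initializeApp(); // uses GOOGLE_APPLICATION_CREDENTIALS by default
const db = admin.firestore();

// Naive URL matcher; good enough for a first pass over newsletter text.
const URL_REGEX = /https?:\/\/[^\s"'<>)]+/g;

async function collectUniqueDomains() {
  const snapshot = await db.collection('newsletters').get();
  const domains = new Set();

  snapshot.forEach((doc) => {
    const body = doc.get('body') || '';
    const links = body.match(URL_REGEX) || [];
    for (const link of links) {
      try {
        // new URL() normalizes the link and exposes its hostname (domain).
        domains.add(new URL(link).hostname);
      } catch {
        // Skip malformed URLs instead of failing the whole run.
      }
    }
  });

  return [...domains];
}

collectUniqueDomains().then((domains) => {
  console.log(`Found ${domains.length} unique domains`);
  console.log(domains);
});
```

From here, the resulting list can be filtered down to the domains worth keeping and exposed through an Express endpoint, which is roughly the shape of the process we will walk through in the rest of this post.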