If you’re searching for a Google Maps scraper, you probably want one thing.
A simple way to turn Google Maps searches (like “dentist Leeds” or “plumber Birmingham”) into a clean list of leads you can contact.
Here’s the truth. Scraping is the easy bit. The hard bit is what comes after: deduping, finding emails, writing outreach that doesn’t sound robotic, and tracking replies properly.
So this guide is a bit different. It’s a proper tutorial you can follow.
Why Google Maps is a goldmine for lead generation
Google Maps is not just for finding coffee. It’s a live directory of real businesses that:
- exist in the real world
- actively want customers
- usually have a website, phone number, and category
- are grouped by location, which is perfect for local sales
You’re not just seeing pins, you’re seeing structured profiles with:
- business name
- address and postcode
- category
- website
- phone number
- reviews and rating
- photos, booking links, menus, and more
What people really mean by “Google Maps scraper”
When someone searches “Google Maps scraper”, they’re usually thinking:
“I want to search my niche and city, click a button, and get a neat list of businesses with contact details so I can start outreach.”
A typical workflow looks like this:
- search Google Maps
- collect the businesses that show up
- export the basics to CSV
- visit each website to find emails and social links
- clean the list, dedupe it, prioritise it
- send outreach and track replies
Most tools stop at the third step, the CSV export. That’s why so many people end up with duplicates, missing emails, and thousands of rows they never contact.
Instead of obsessing over the scrape, build a pipeline:
Discover → Store → Enrich → Clean → Segment → Outreach → Track → Follow up
That’s what turns data into revenue.
A quick word about terms and risk
Google’s platform terms restrict scraping or bulk exporting Maps content for use outside Google services. If you go heavy on browser automation, you are taking on risk, both technically and commercially.
If you want something more stable, the safer route is to use the official Places API.
If you want to read the source material, Google’s Maps Platform Terms of Service and the Places API documentation are the two references most people should skim. This tutorial uses the API approach because it is more reliable and easier to maintain.
Tutorial: Build a Google Maps scraper-style lead list using the Places API
You’ll build a lead collector that:
- searches for businesses by niche and location
- pulls key details (name, address, rating, website, phone)
- exports to CSV
- keeps a unique ID so deduping is easy
Then we’ll add a basic enrichment step to look for emails on websites.
Step 1: Create your API key
In Google Cloud:
- Create a project
- Enable Places API
- Create an API key
- Restrict it (IP restriction if server-side)
- Enable billing
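The tutorial below hardcodes a key placeholder to keep the code readable, but in anything you actually run on a schedule it’s worth reading the key from the environment instead. A minimal sketch, assuming you’ve exported it as GOOGLE_MAPS_API_KEY (the variable name is your choice, not something Google requires):

import os

# Assumed variable name; match whatever you export in your shell or CI
API_KEY = os.environ.get("GOOGLE_MAPS_API_KEY")
if not API_KEY:
    raise SystemExit("Set GOOGLE_MAPS_API_KEY before running this script.")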
Step 2: Install dependencies
pip install requests pandas beautifulsoup4
Step 3: Search for businesses (Text Search)
This matches how people actually use Maps: “dentist Leeds”, “accountant Manchester”, and so on.
import time
import requests
import pandas as pd

API_KEY = "*****YOUR_API_KEY*****"

def places_text_search(query, region="uk"):
    url = "https://maps.googleapis.com/maps/api/place/textsearch/json"
    params = {"query": query, "key": API_KEY, "region": region}
    all_results = []
    while True:
        res = requests.get(url, params=params, timeout=30).json()
        status = res.get("status")
        if status not in ("OK", "ZERO_RESULTS"):
            raise RuntimeError(f"Places error: {status} - {res.get('error_message')}")
        all_results.extend(res.get("results", []))
        token = res.get("next_page_token")
        if not token:
            break
        # the token needs a short delay before it becomes valid
        time.sleep(2)
        params["pagetoken"] = token
    return all_results

results = places_text_search("dentist Leeds")
print(len(results))  # Text Search returns at most 60 results (3 pages of 20)
You’ll notice a field called place_id.
That is your unique identifier. It’s how you dedupe properly and avoid a messy list.
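In practice you’ll rarely stop at one query; you’ll want a few niches or a few areas in one list. Here’s a small sketch of how that might look, reusing places_text_search from above and relying on place_id to keep the combined list duplicate-free. The query list is just an example:

# Combine several searches into one deduplicated result set.
# The queries below are illustrative; swap in your own niche and areas.
queries = ["dentist Leeds", "dentist Bradford", "dentist Wakefield"]

combined = {}
for q in queries:
    for r in places_text_search(q):
        pid = r.get("place_id")
        if pid:
            combined[pid] = r  # duplicates across searches collapse onto one key

results = list(combined.values())
print(f"{len(results)} unique businesses across {len(queries)} searches")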
Step 4: Pull richer details per business (website + phone)
Text Search results don’t include fields like website or phone number, so you call Place Details to fetch them.
def place_details(place_id):
    url = "https://maps.googleapis.com/maps/api/place/details/json"
    params = {
        "place_id": place_id,
        "key": API_KEY,
        "fields": ",".join([
            "place_id",
            "name",
            "formatted_address",
            "international_phone_number",
            "website",
            "rating",
            "user_ratings_total",
            "url",
            "types"
        ])
    }
    res = requests.get(url, params=params, timeout=30).json()
    if res.get("status") != "OK":
        return None
    return res.get("result", {})

def build_rows(text_results):
    rows = []
    seen = set()
    for r in text_results:
        pid = r.get("place_id")
        if not pid or pid in seen:
            continue
        seen.add(pid)
        d = place_details(pid)
        if not d:
            continue
        rows.append({
            "place_id": d.get("place_id"),
            "name": d.get("name"),
            "address": d.get("formatted_address"),
            "phone": d.get("international_phone_number"),
            "website": d.get("website"),
            "rating": d.get("rating"),
            "reviews": d.get("user_ratings_total"),
            "maps_url": d.get("url"),
            "types": ",".join(d.get("types", []))
        })
        time.sleep(0.2)
    return rows

rows = build_rows(results)
df = pd.DataFrame(rows)
df.to_csv("google_maps_leads.csv", index=False)
print(df.head())
At this point you’ve built the core output most people expect from a “Google Maps scraper”.
Now comes the bit that decides whether this becomes money: enrichment.
Step 5: Enrich websites to find emails
Google Maps generally won’t hand you emails.
So you do what most lead gen tools do:
- visit the website
- find the contact page
- extract any public email addresses
import re
from bs4 import BeautifulSoup
from urllib.parse import urljoin

EMAIL_RE = re.compile(r"[a-zA-Z0-9._%+\-]+@[a-zA-Z0-9.\-]+\.[a-zA-Z]{2,}")

def fetch_html(url):
    try:
        r = requests.get(url, timeout=15, headers={"User-Agent": "Mozilla/5.0"})
        if r.status_code >= 400:
            return None
        return r.text
    except Exception:
        return None

def extract_emails(html):
    return sorted(set(EMAIL_RE.findall(html or "")))

def find_contact_link(base_url, html):
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.select("a[href]"):
        href = a.get("href", "")
        text = (a.get_text() or "").lower()
        if "contact" in href.lower() or "contact" in text:
            return urljoin(base_url, href)
    return None

def get_emails_for_site(site_url):
    # rows with no website come through from pandas as NaN, so check the type too
    if not isinstance(site_url, str) or not site_url:
        return []
    home = fetch_html(site_url)
    if not home:
        return []
    emails = set(extract_emails(home))
    contact_url = find_contact_link(site_url, home)
    if contact_url:
        contact_html = fetch_html(contact_url)
        emails.update(extract_emails(contact_html))
    return sorted(emails)

df["emails_found"] = df["website"].apply(lambda u: ", ".join(get_emails_for_site(u)))
df.to_csv("google_maps_leads_enriched.csv", index=False)
You now have a working lead list that includes:
- business details from Maps
- website and phone
- any public emails found on the site
That's enough to run a real campaign.
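Before calling the list campaign-ready, it’s worth a quick check of how many rows you can actually reach. A tiny sketch, continuing from the dataframe above:

# How many businesses have at least one usable contact method?
has_email = df["emails_found"].str.len() > 0
has_phone = df["phone"].notna()
contactable = df[has_email | has_phone]
print(f"{len(contactable)} of {len(df)} rows have an email or phone number")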
Step 6: Clean, prioritise, and stop yourself from drowning
This is where most people stall. They scrape 5,000 leads and then do nothing.
Do this instead:
Start with 50 to 150 leads
Small batch & experiment.
Dedupe using place_id
It’s the simplest way to avoid repeated outreach and wasted time.
Segment into 2 to 3 buckets
Examples:
- high rating, lots of reviews
- low rating, weak photos
- no website or an outdated website
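Both the dedupe and the segmentation take only a few lines of pandas on the CSV from the tutorial. A rough sketch; the rating and review thresholds below are arbitrary examples rather than recommendations, and the bucket names only approximate the examples above (photo quality isn’t in the CSV).

import pandas as pd

df = pd.read_csv("google_maps_leads_enriched.csv")

# Dedupe on the stable Google identifier
df = df.drop_duplicates(subset="place_id")

# Rough example buckets; tweak the thresholds to fit your niche
def bucket(row):
    if pd.isna(row["website"]) or not str(row["website"]).strip():
        return "no_website"
    if row["rating"] >= 4.5 and row["reviews"] >= 100:
        return "strong_reputation"
    return "needs_improvement"

df["bucket"] = df.apply(bucket, axis=1)
print(df["bucket"].value_counts())

# Start small: a first batch of around 100, not the whole list
df.head(100).to_csv("first_batch.csv", index=False)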
Turning Maps data into outreach that gets replies
A list isn’t revenue; conversations are.
The biggest mistake people make after using a Google Maps scraper is sending generic outreach that could have been written to anyone.
Here’s the difference.
Generic (ignored)
“Hi, we help businesses get more customers. Fancy a quick call?”
Maps-informed (gets replies)
“Hi Sarah,
I was looking at Italian restaurants around Leeds and yours stood out. 400+ Google reviews and a 4.7 rating is seriously impressive.
One thing I noticed is that your Maps listing sends people to a site that doesn’t make booking a table easy on mobile. That’s usually where most Maps clicks land.
I run a small studio that helps restaurants turn Google Maps traffic into bookings. Want me to send over a quick 2 minute teardown of what I’d change?”
Short, specific & human. Easy to reply to.
Recommended workflow
- Pick one niche + one area
- Collect 50 to 150 leads
- Enrich emails where possible
- Segment into 2 to 3 buckets
- Write one message per bucket (there’s a rough sketch of this below)
- Send, track, and follow up properly
- Double down on what replies
Do that for 4 weeks and you’ll have a predictable system.
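If you’re wondering how “one message per bucket” stays personal at any volume, a simple template fill over the segmented CSV gets you most of the way. A rough sketch, assuming the bucket column and first_batch.csv from the earlier step; the wording is only an example to adapt, and real messages still deserve a human pass.

TEMPLATES = {
    "strong_reputation": (
        "Hi {name}, I was looking at businesses around your area and yours stood out - "
        "{reviews} Google reviews at a {rating} rating is seriously impressive."
    ),
    "no_website": (
        "Hi {name}, I found you on Google Maps but couldn't find a website on your listing - "
        "that's usually where most Maps clicks want to land."
    ),
}

df = pd.read_csv("first_batch.csv")
for _, row in df.iterrows():
    template = TEMPLATES.get(row["bucket"])
    if template:
        message = template.format(name=row["name"], reviews=row["reviews"], rating=row["rating"])
        print(message)  # in practice you'd queue this in your outreach tool, not print it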
Where OdjoAI fits (without juggling five tools)
If you build the DIY version above, you’ll hit the same friction points:
- collecting and storing leads consistently
- enrichment (emails + phones)
- writing personalised messages at scale
- tracking who opened, replied, clicked
- logging calls and outcomes
- avoiding duplicate outreach next month
OdjoAI is built to cover that whole pipeline in one place:
- Maps-style business search
- email and phone enrichment
- email writing in your tone
- sending and tracking
- lightweight CRM with interaction history
- analytics so you can see what’s working
Mistakes to avoid
1) Collecting data for the sake of it
A huge list feels productive. It isn’t.
2) Waiting for perfect data
It never arrives. Start with what you’ve got.
3) Ignoring follow-ups
Most replies come after the second or third touch. If you don’t track, you’ll forget.
4) Sounding like everyone else
People don’t hate cold outreach. They hate irrelevant outreach.
Quick checklist
Before you run your first campaign, ask:
- Have we written a basic ICP (niche, area, deal breakers)?
- Do we have a way to capture leads without losing them in tabs?
- Can we get at least one usable contact method per business?
- Are we segmenting before we send?
- Can we track replies and follow-ups in one place?
If you’re missing more than a couple, you don’t have a “Google Maps scraper” problem; you have a workflow problem.
Fix the workflow, and Maps becomes a reliable source of leads.
If you want the short version
If you’re searching “Google Maps scraper”, what you really want is customers.
You can absolutely build a clean lead extractor using the Places API. That’s what the tutorial above gives you.
If you want the full pipeline in one place, from discovery to enrichment to outreach and tracking, try OdjoAI below!
The hard part isn’t finding businesses. It’s contacting them properly.
OdjoAI takes you from maps searches to real conversations in one system.
