Why scrape from inside a spreadsheet?
Traditional web scraping requires Python scripts, browser extensions, or expensive SaaS tools. The data always ends up in the same place: a spreadsheet. You export a CSV, import it into Google Sheets, clean the columns, and start working.
SheetMagic cuts out every step between "I need this data" and "it is in my spreadsheet." You write a formula, point it at a URL or keyword, and the extracted data lands directly in your cells. No code. No CSV imports. No context switching.
And with nine Apify integrations built into the AI Chat sidebar, you can also pull structured data from platforms that are notoriously difficult to scrape -- Google Maps, YouTube, Amazon, LinkedIn, and more -- using natural language.
This guide covers every scraping formula SheetMagic offers, all nine Apify integrations, how credits work, and real workflows you can build today.
Part 1: Formula-based scraping
SheetMagic includes a family of scraping formulas. They all follow the same pattern: point them at a URL or keyword, tell them what to extract, get structured data back.
=SERP() -- Search engine results
=SERP("best crm software for startups")
Returns up to 20 Google search results with titles, URLs, snippets, and ranking positions.
=SERP() is the starting point for keyword research, competitive analysis, and prospecting. Put your keywords in column A, put =SERP(A2) in column B, drag down, and you have structured SERP data for your entire keyword list.
Real use case -- finding prospects:
=SERP("saas companies series A funding 2026")
Pull a list of recently funded companies from search results. Use the URLs in the next step for deeper research.
=BULKSERP() -- Horizontal SERP layout
=BULKSERP(A2)
Same data as =SERP() but laid out horizontally -- one row per keyword, results spread across columns. Better for comparing multiple keywords.
Use =BULKSERP() when you are comparing SERP landscapes across keywords. Each keyword stays on one row, making it easy to scan across your keyword list.
=VISIT() -- Full page content
=VISIT(A2)
Retrieves up to 50,000 characters of text content from the URL in A2. Supports JavaScript-rendered pages.
=VISIT() pulls the raw text content of a webpage. This is the formula to use when you need the full page -- for summarization, content analysis, or feeding into an =AITEXT() formula for processing.
Real use case -- competitor content analysis:
Pull competitor URLs from SERP data
Run =SERP("your target keyword") and extract the top 10 URLs.
Scrape each page's full content
Put the URLs in column A and run =VISIT(A2) in column B. You now have the full text of every competitor page.
Analyze with AI
Use =AITEXT("Summarize this page in 3 sentences. What is its main argument and what audience is it targeting?", B2) in column C to process each page.
=PAGEDATA() -- Bulk page metadata
=PAGEDATA("https://example.com/about")
Returns structured metadata: URL, Title, Description, H1, and HTTP Status.
=PAGEDATA() extracts page metadata (title tag, meta description, H1 heading, HTTP status) from one or more URLs at once. Pass multiple URLs to get a table of metadata in a single call:
=PAGEDATA(A2, A3, A4, A5)
Returns one row per URL with Title, Description, H1, and Status columns. Great for bulk SEO audits.
For targeted data extraction, combine =VISIT() with =AITEXT():
=AITEXT("Extract: company description in one sentence, main product or service, pricing page URL if visible, contact email if visible. Format as pipe-separated values.", VISIT(A2))
Scrapes the page with =VISIT() and uses AI to extract specific fields. Use =SPLIT() afterward to break the result into columns.
Format your extraction instructions with pipe-separated output (using the | character), then use =SPLIT(B2, "|") in adjacent columns to break the result into clean, separate cells. This turns unstructured web pages into structured spreadsheet data.
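Conceptually, the pipe-split step is plain string parsing -- the same operation =SPLIT(B2, "|") performs across cells. A minimal Python sketch of the idea (the sample string and field names are illustrative, not actual SheetMagic output):

```python
# Simulate splitting a pipe-separated AI extraction result into columns,
# mirroring what =SPLIT(B2, "|") does in the sheet.
raw = "Acme Corp builds CRM tools for startups|CRM software|https://acme.example/pricing|hello@acme.example"

fields = [part.strip() for part in raw.split("|")]
columns = dict(zip(["description", "product", "pricing_url", "email"], fields))

print(columns["product"])  # CRM software
```

Each pipe-delimited segment lands in its own column, which is why asking the AI for a consistent field order matters: the split is positional.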
=GETSELECTOR() -- CSS selector scraping
=GETSELECTOR(A2, "div.product-price span.amount")
Extracts content matching a specific CSS selector. For developers or anyone who inspects page elements.
If you know the exact HTML element you need, =GETSELECTOR() targets it directly with a CSS selector. This is the most precise scraping method -- useful when you need a specific price, rating, or data point that lives in a consistent HTML structure.
=GETIMG() -- Extract image URLs
=GETIMG(A2)
Returns image URLs found on the page at A2.
Use this for cataloging product images, building visual mood boards, or auditing image assets across a website.
=DOMAINCHECK() -- Domain availability
=DOMAINCHECK("mybrandname.com")
Returns whether the domain is available for registration.
Useful for brand naming projects. Generate name ideas with =AILIST(), then check domain availability for each one:
=DOMAINCHECK(A2&".com")
Checks .com availability for the brand name in A2. Combine with =AILIST() to generate and check names in bulk.
Page metadata formulas
These are single-purpose formulas optimized for speed. They extract one specific piece of metadata per call:
| Formula | What it extracts | Example use |
|---|---|---|
| =GETMETATITLE(A2) | Title tag | SEO audit, competitor analysis |
| =GETMETADESCRIPTION(A2) | Meta description | SERP preview, content gap analysis |
| =GETH1(A2) | H1 heading | Content structure analysis |
| =GETH2(A2) | All H2 headings | Outline extraction, topic mapping |
| =GETHEADINGS(A2) | All headings (H1-H6) | Full content structure audit |
| =GETP(A2) | Paragraph text | Content extraction, word count |
Real use case -- SEO site audit:
Set up your sheet with a list of your own page URLs in column A, then:
- Column B: =GETMETATITLE(A2) -- check for missing or duplicate titles
- Column C: =GETMETADESCRIPTION(A2) -- find pages without meta descriptions
- Column D: =GETH1(A2) -- verify every page has an H1
- Column E: =AITEXT("Is this meta description under 160 characters? Does it include a call to action? Return Yes/No for each.", C2) -- automated quality check
For more SEO workflows, see the complete SEO guide.
Part 2: Apify integrations -- structured data from major platforms
Some platforms are hard to scrape with generic formulas. Google Maps, YouTube, Amazon, and LinkedIn all have complex page structures, anti-scraping measures, and dynamic content. SheetMagic solves this with nine curated Apify integrations, accessible through the AI Chat sidebar.
Apify integrations are available through the AI Chat sidebar (paid subscribers on Solo, Team, and Business plans). You describe what data you need in natural language, and the AI selects the right Apify actor, configures the parameters, and writes results directly into your sheet. Each Apify operation uses integration credits.
Google Maps scraper
Pull business listings for any location and category. Useful for lead generation, market research, and competitive analysis.
What you can extract:
- Business name, address, phone number
- Rating, review count
- Website URL
- Business category
- Hours of operation
Example conversation with AI Chat:
You: "Find all Italian restaurants in Austin, TX with at least 4 stars. Put the results in a new sheet."
AI: Runs the Google Maps scraper, returns a structured table with business names, addresses, ratings, phone numbers, and websites.
Google Search SERP scraper
A more powerful alternative to the =SERP() formula for large-scale keyword research. Handles complex queries, geo-targeted results, and returns richer data.
YouTube channel and video scraper
Extract channel statistics, video metadata, view counts, and engagement metrics.
What you can extract:
- Channel subscriber count, total views
- Video titles, descriptions, view counts
- Upload dates, engagement rates
- Tags and categories
Example use: Building a creator outreach list by scraping channels in your niche, filtering by subscriber count, and generating personalized outreach emails with =AITEXT().
Amazon product scraper
Pull product data including prices, ratings, review counts, and product descriptions. Essential for e-commerce competitive intelligence.
What you can extract:
- Product title, price, rating
- Number of reviews
- Best Seller Rank
- Product description and bullet points
- ASIN
LinkedIn company scraper
Extract company profile data for sales prospecting and market research.
What you can extract:
- Company name, industry, size
- Headquarters location
- Description, specialties
- Website URL
Additional Apify integrations
SheetMagic includes four more specialized scrapers covering additional platforms and data sources. All work through the same AI Chat interface -- describe what you need and the AI handles the configuration.
Part 3: How credits work
Web scraping formulas and Apify integrations use integration credits, which are separate from the AI tokens used by formulas like =AITEXT().
Credit costs
| Operation | Credits used |
|---|---|
| =SERP() query | 1 credit per keyword |
| =BULKSERP() query | 1 credit per keyword |
| =VISIT() page | 1 credit per URL |
| =PAGEDATA() extraction | 1 credit per URL |
| =GETSELECTOR() | 1 credit per URL |
| =GETMETATITLE(), =GETMETADESCRIPTION(), etc. | 1 credit per URL |
| =GETIMG() | 1 credit per URL |
| =DOMAINCHECK() | 1 credit per domain |
| Apify integrations | Varies by actor and results returned |
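Because formula costs are flat (1 credit per keyword, URL, or domain), you can estimate a job's cost before running it. A rough back-of-envelope sketch in Python (the job mix below is hypothetical; the 1,000-credit comparison uses the Solo plan quota from the table below):

```python
# Rough credit estimate for a planned scraping job, using the flat
# per-call cost from the table above (1 credit per keyword/URL/domain).
COST_PER_CALL = 1

job = {
    "serp_keywords": 50,   # =SERP() calls
    "visit_urls": 200,     # =VISIT() calls
    "pagedata_urls": 100,  # =PAGEDATA() calls
}

total_credits = sum(count * COST_PER_CALL for count in job.values())
print(total_credits)          # 350
print(total_credits <= 1000)  # True: fits within a 1,000-credit monthly quota
```

Apify integration costs vary by actor, so budget those separately on top of the formula total.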
Credit limits by plan
| Plan | Integration credits/month | Price |
|---|---|---|
| Free | 50 | Free |
| Solo | 1,000 | $19/mo |
| Team | 5,000 | $79/mo |
| Business | 20,000 | $149/mo |
BYOK (Bring Your Own Key) users on lifetime deals can also purchase additional integration credit packs from the dashboard. See the pricing page for details.
Check your remaining credits anytime: open the SheetMagic sidebar under Extensions > SheetMagic, or visit the dashboard. Credits reset monthly on your billing cycle.
Smart credit usage
SheetMagic includes 30-minute intelligent caching. If you scrape the same URL twice within 30 minutes, the second request returns cached data without using additional credits. This means you can iterate on your extraction instructions without burning through your quota.
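The cache behaves like a simple time-to-live (TTL) lookup keyed by URL. This Python sketch illustrates the concept (it is a conceptual model of TTL caching, not SheetMagic's actual implementation):

```python
import time

TTL_SECONDS = 30 * 60  # 30-minute cache window

_cache = {}  # url -> (fetched_at, content)

def scrape(url, fetch):
    """Return cached content if the URL was fetched within the TTL,
    otherwise call fetch(url) (which would spend a credit) and cache it."""
    now = time.time()
    if url in _cache:
        fetched_at, content = _cache[url]
        if now - fetched_at < TTL_SECONDS:
            return content  # cache hit: no additional credit used
    content = fetch(url)
    _cache[url] = (now, content)
    return content

# Two requests for the same URL within 30 minutes: only the first "fetches".
calls = []
result1 = scrape("https://example.com", lambda u: calls.append(u) or "page text")
result2 = scrape("https://example.com", lambda u: calls.append(u) or "page text")
print(len(calls))  # 1
```

The practical upshot: rerunning a formula against the same URL inside the window is free, so iterate on your =AITEXT() extraction instructions freely.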
Part 4: Complete workflows
Here are three end-to-end workflows that combine scraping and AI formulas.
Workflow 1: Lead generation from Google Maps
Scrape business listings
Open the AI Chat sidebar and ask: "Find all marketing agencies in Denver, CO. Include name, website, phone, and rating."
The Apify Google Maps scraper returns results directly into your sheet.
Enrich with website data
With website URLs in column D, run =AITEXT("What does this company specialize in? One sentence.", VISIT(D2)) in column E.
Qualify leads
Use =AITEXT("Based on this company description, rate the likelihood they need a CRM tool on a scale of 1-5. Return only the number.", E2) in column F.
Generate personalized outreach
For qualified leads (rating 4-5), generate email openers: =AITEXT("Write a 2-sentence cold email opener for a marketing agency that specializes in "&E2&". Reference something specific about their work.", "") in column G.
Workflow 2: E-commerce price monitoring
List competitor product URLs
Put competitor product page URLs in column A.
Extract pricing data
=AITEXT("Extract: product name, current price, original price if on sale, availability status. Pipe-separated.", VISIT(A2)) in column B.
Split into columns
=SPLIT(B2, "|") in columns C through F to get structured data.
Set up price comparison
Add your own prices in column G. Use =IF(C2<G2, "Competitor is cheaper by "&TEXT(G2-C2, "$0.00"), "You are competitive") in column H.
Generate price analysis
=AITEXT("Based on these prices, suggest a competitive pricing strategy: My price: "&G2&", Competitor price: "&C2&". Keep it to 2 sentences.", "") in column I.
Workflow 3: Content research and planning
Research top-ranking content
=SERP("your target keyword") to get the top-ranking URLs.
Extract content structure
=GETHEADINGS(A2) to pull the heading structure of each competitor page.
Analyze content gaps
=AITEXT("These are the heading structures from 5 articles ranking for the keyword 'your keyword'. Identify 3 subtopics that none of them cover well: "&B2&" | "&B3&" | "&B4&" | "&B5&" | "&B6, "") in a separate cell.
Generate a content brief
Feed the gap analysis into a brief: =AITEXT("Create a content brief for an article that covers these underserved subtopics: "&C2&". Include 8 suggested headings, target audience, and recommended word count.", "")
For the full version of this SEO workflow, see the SEO guide.
Best practices for reliable scraping
1. Be specific about what you want. When using =VISIT() + =AITEXT() for targeted extraction, "Extract the price" returns better results than "Get all data." Name the exact fields you need.
2. Use pipe-separated output. Ask for results formatted with | separators, then use =SPLIT() to break them into columns. This is the fastest way to turn unstructured web data into structured spreadsheet data.
3. Process in batches. When scraping hundreds of URLs, do not run all formulas at once. Work in batches of 50 to 100 rows. This avoids timeouts and makes it easier to spot issues early.
4. Take advantage of caching. SheetMagic caches scraping results for 30 minutes. If you need to adjust your extraction instructions, you can rerun formulas against the same URLs without using additional credits.
5. Validate your results. Spot-check extracted data against the source page, especially for pricing and contact information. If results look inconsistent, refine your extraction instructions or try =GETSELECTOR() with a specific CSS selector.
6. Combine scraping with AI. The most powerful workflows use scraping to gather data and AI to process it. =VISIT() to get page content, =AITEXT() to analyze, classify, or extract specific fields. Use =PAGEDATA() when you just need metadata (titles, descriptions, H1s) in bulk. See the workflows above for examples.
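The batching advice in practice 3 amounts to slicing a long URL list into fixed-size chunks and finishing one chunk before starting the next. A minimal Python sketch of the pattern:

```python
# Split a long URL list into batches of 50, the size range recommended
# in practice 3. In the sheet, you would fill formulas down for one
# batch, verify the results, then move to the next.
def batches(items, size=50):
    for start in range(0, len(items), size):
        yield items[start:start + size]

urls = [f"https://example.com/page-{i}" for i in range(230)]
chunks = list(batches(urls))

print(len(chunks))      # 5 batches
print(len(chunks[-1]))  # 30 URLs in the final partial batch
```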
Web scraping results depend on the structure and accessibility of the target website. Some sites block automated access or load content dynamically. If a formula returns incomplete data, try =VISIT() (which supports JavaScript-rendered pages) or use an Apify integration through the AI Chat sidebar for platforms with complex structures.
Start scraping today
Web scraping in Google Sheets is not a futuristic idea. It works right now, with formulas you can write in seconds. The free tier gives you 50 integration credits per month -- enough to scrape 50 pages and see how it fits into your workflow.
Install SheetMagic, open a sheet, and try =SERP("your keyword") or =PAGEDATA("any-url.com") to pull page metadata instantly. If you want structured data from Google Maps, YouTube, Amazon, or LinkedIn, open the AI Chat sidebar and describe what you need.
For more on combining scraping with AI: How to use AI formulas | SEO workflows | Getting started guide
Questions about web scraping, credits, or specific integrations? Visit the FAQ or check the documentation.

