
Amazon Product Page Content Scraper API

API: Scraper

US$15 (discounted from US$20)

  • The Amazon PDP info scraper crawls product page data from Amazon.com. Users can crawl a specific Amazon PDP using its ASIN code and a country domain. The country domain covers all Amazon marketplace domains worldwide, such as .de, .co.jp, etc. The scraped dataset includes 10+ product-related metrics.

  • For more details regarding API usage obligations and liability, please read the Legal Terms of Service & Conditions

Features

  • The Amazon PDP info scraper crawls product page data from Amazon.com
  • Users can crawl a specific Amazon PDP using its ASIN code and a country domain
  • The country domain covers all Amazon marketplace domains worldwide, such as .de, .co.jp, etc.
  • The scraped dataset includes 10+ product-related metrics

API Endpoint Specifications

  • Endpoint Path: /api/1/amazonpdp
  • Type of Data: JSON
  • Data Refresh: every 20 minutes
  • Data Source: Amazon
  • Request Limit: 500 requests/month
  • Script & Integration: Code samples to integrate with cURL, JS, Python, Ruby, PHP, Node.js, Java, .NET, Rust, Go, TypeScript
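Given the 500 requests/month cap, a client can keep a local usage counter so it does not burn through the quota mid-job. The sketch below is illustrative only (the class and method names are not part of the API, and the server-side count is always authoritative):

```python
# Minimal client-side quota guard for the 500 requests/month limit.
# The class and method names are illustrative, not part of the API;
# the server-side request count is always authoritative.
class QuotaGuard:
    def __init__(self, monthly_limit: int = 500):
        self.monthly_limit = monthly_limit
        self.used = 0

    def allow(self) -> bool:
        """Return True and count the call if quota remains, else False."""
        if self.used >= self.monthly_limit:
            return False
        self.used += 1
        return True

guard = QuotaGuard(monthly_limit=2)
print(guard.allow(), guard.allow(), guard.allow())  # True True False
```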

Amazon Product Page Content Scraper API Endpoint Basic Info

  • API Endpoint Path (required): api/1/amazonpdp (Amazon PDP content scraper API)
  • Call Method (required): GET
  • Type of Data Return: JSON (outputs structured JSON data on the Amazon PDP)


Available API Arguments & Parameters

  • token (required): Your BUYFROMLO API token. The paid-subscription API is available at /api/1/amazonpdp and is also accessible as an on-site app at /app/1/amazonpdp
  • keyword (required): Input a keyword related to the product data to be scraped
  • country (optional): Enter the marketplace country. The default is amazon.com. The API supports the full list of marketplace countries available on Amazon



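To see how the parameters above combine into a request URL, here is a small sketch using only the Python standard library. The `country` value format ("de") and the sample keyword are assumptions for illustration; check the parameter docs for accepted values:

```python
from urllib.parse import urlencode

# Sketch: composing the query string for /api/1/amazonpdp.
# The token travels in the Authorization header, not in the query string.
# The "country" value format ("de") is an assumption for illustration.
API_ENDPOINT = "https://api.buyfromlo.com/api/1/amazonpdp"

token = "your buyfromlo token"
params = {"keyword": "wireless earbuds", "country": "de"}

url = API_ENDPOINT + "?" + urlencode(params)
headers = {"Authorization": "Bearer " + token}

print(url)  # .../api/1/amazonpdp?keyword=wireless+earbuds&country=de
```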

Code Integration and Response

Python Code Sample


import requests

apiendpoint = "https://api.buyfromlo.com/api/1/amazonpdp?"

## Required Arguments & Parameters ##

token = "your buyfromlo token"
keyword = "Input a keyword"

data = "keyword=" + keyword
headers = {"Authorization": "Bearer " + token}

## Call the API ##
response = requests.get(apiendpoint + data, headers=headers)
print(response.status_code)
print(response.json())

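After the call, it helps to branch on the status code before touching the body. The specific error codes below (401, 429) follow common REST conventions and are assumptions; the BUYFROMLO docs define the authoritative codes:

```python
def summarize_response(status_code, payload):
    """Map an API response to a short status message.

    The 401/429 meanings follow common REST conventions and are
    assumptions; consult the BUYFROMLO docs for the actual codes.
    """
    if status_code == 200:
        return "ok: " + payload.get("Product Name", "(no name)")
    if status_code == 401:
        return "error: invalid or missing API token"
    if status_code == 429:
        return "error: request limit reached"
    return "error: unexpected status {}".format(status_code)

print(summarize_response(200, {"Product Name": "Example Widget"}))
# ok: Example Widget
```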
JSON Response Sample


{
    "Product Name": " " (string),
    "Brand Name": " " (string),
    "Brand Store Page": " " (string),
    "Product Full Description": " " (string),
    "Product Brief Description": " " (string),
    "Product Pricing": " " (float),
    "Product Offer Pricing": " " (float),
    "Shipping Price": " " (float),
    "Availability": " " (string),
    "Product Category": " " (string),
    "Average Rating": " " (float),
    "Total Reviews": " " (integer),
    "Seller ID": " " (string),
    "Seller Name": " " (string),
    "FBA": " " (string),
    "Images URL": " " (string),
    "Featured Bullet Points": " " (string)
}

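Once decoded, numeric fields are best coerced defensively, since scraped values sometimes arrive as strings rather than numbers. A sketch with an invented sample payload (field names follow the response schema above; the values are not real API output):

```python
import json

# Invented sample payload for illustration; field names follow the
# response schema above, but the values are not real API output.
raw = """
{
    "Product Name": "Example Widget",
    "Product Pricing": "19.99",
    "Average Rating": 4.5,
    "Total Reviews": 52
}
"""

data = json.loads(raw)

# Coerce defensively: scraped numeric fields may arrive as strings.
name = data.get("Product Name", "")
price = float(data.get("Product Pricing", 0.0))
rating = float(data.get("Average Rating", 0.0))
reviews = int(data.get("Total Reviews", 0))

print(f"{name}: ${price:.2f}, {rating} stars, {reviews} reviews")
# Example Widget: $19.99, 4.5 stars, 52 reviews
```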

FAQ

Q: What does the Amazon Product Page Content Scraper API do?
A: The Amazon Product Page Content Scraper API enables you to gather specified product page information from Amazon.com by utilizing the ASIN code and the relevant country domain.

Q: What metrics does the API retrieve?
A: The API retrieves a range of product-related metrics, including title, description, price, reviews, ratings, images, variations, and more.

Q: In what format is the data delivered?
A: The API delivers the scraped data in JSON format.

Q: How often is the data updated?
A: The data is updated and refreshed every 20 minutes.

Q: Where does the data come from?
A: The data is directly obtained from Amazon.

Q: Is there a usage limit?
A: The API has a usage limit of 500 requests per month per token.

Q: Which programming languages are supported for integration?
A: The API offers integration with a variety of programming languages, including cURL, JS, Python, Ruby, PHP, Node.js, Java, .NET, Rust, Go, and TypeScript.

Q: Do I need any special software to use the API?
A: No, the API is designed to be easily accessible and doesn't require any specific software or complex setup.

Q: How do I get an API token?
A: To access the API, you'll need a BUYFROMLO API token. You can obtain an API token by signing up for a paid subscription on the BUYFROMLO website.

Q: Are there paid subscription options?
A: Yes, there are paid subscription options available that offer additional features and increased request limits.