How To Extract Store Location Data from Walmart With Python 3?

In this tutorial, we show how to scrape store location information from Walmart, the biggest retailer in the United States.

We will search Walmart.com for stores near a given zip code and scrape the following fields:

  • Name of Store
  • Distance
  • Store ID
  • Zip Code
  • City
  • Address
  • Phone Number

In total, Walmart has around 4,674 stores in the USA.

You can scrape further information from each store's page, such as open days, timings, services, and departments. For now, we will stick with these fields and keep it simple.

Extracting The Information

Open any browser and go to https://www.walmart.com/store/finder?location=20005&distance=50. This page searches for Walmart stores registered around a given zip code (20005 here) within a 50-mile radius.

Right-click anywhere on the page and choose Inspect (or Inspect Element). The browser will open its developer tools and display the HTML content of the rendered page. Click on the Network panel to see the requests the page makes.

Enter the required zip code and click the search option. Let's use zip code 20005 for now. In the Network panel, select the XHR filter and look for the request named:

stores?singleLineAddr=20005&serviceTypes=pharmacy&distance=50

The full request URL is:

https://www.walmart.com/store/finder/electrode/api/stores?singleLineAddr=20005&serviceTypes=pharmacy&distance=50

Open this link in a new tab and you will see the details in raw form. The data is in JSON format; you can install a JSON viewer extension in your browser to read it more easily.
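
As a quick sanity check, you can fetch the same endpoint directly from Python. The snippet below is a minimal sketch, assuming the endpoint and the shape of its JSON response have not changed (Walmart may modify or block this API at any time):

import json
import requests

url = ("https://www.walmart.com/store/finder/electrode/api/stores"
       "?singleLineAddr=20005&serviceTypes=pharmacy&distance=50")
# A browser-like user agent makes the request look like ordinary traffic
headers = {"user-agent": "Mozilla/5.0 (X11; Linux x86_64)"}

response = requests.get(url, headers=headers)
data = response.json()
# Print the first part of the pretty-printed JSON to inspect its structure
print(json.dumps(data, indent=2)[:2000])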

At LocationsCloud, we provide Walmart store location scraping and other data scraping using Python web scraping services.

Build The Extractor

We use Python 3 in this blog. The code will not run if you use Python 2.7. You need Python 3 and PIP installed on your computer.

Most UNIX operating systems like Mac OS and Linux come with Python pre-installed. However, not every Linux distribution ships with Python 3 by default.

To check your Python version, open a terminal (on Mac OS and Linux) or Command Prompt (on Windows), type the version command, and press Enter.
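
For example (the exact command may differ depending on how Python was installed; on some systems it is python --version):

python3 --version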

Installing PIP & Python 3

The links below are guides to installing Python 3:

For Linux – http://docs.python-guide.org/en/latest/starting/install3/linux/

For Mac users – http://docs.python-guide.org/en/latest/starting/install3/osx/

For Windows Users – https://docs.python-guide.org/starting/install3/win/#setuptools-pip

Or, if you are looking for more details, you can contact us at https://www.locationscloud.com/contact-us/.

Installing Packages

Python Requests: used to download and make requests for the HTML content of different pages. Installation guide: http://docs.python-requests.org/en/master/user/install/
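
If PIP is available, Requests can usually be installed with a single command (assuming pip3 points at your Python 3 installation; on some systems the command is just pip):

pip3 install requests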

The Coding

import csv
import requests
import json
import argparse
import traceback

def locate_stores(zip_code):
    """
    Function to locate walmart stores
    """
    url = "https://www.walmart.com/store/finder/electrode/api/stores?singleLineAddr=%s&serviceTypes=pharmacy&distance=50"%(zip_code)
    headers = { 'accept':'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
                'accept-encoding':'gzip, deflate, br',
                'accept-language':'en-GB,en;q=0.9,en-US;q=0.8,ml;q=0.7',
                'cache-control':'max-age=0',
                'upgrade-insecure-requests':'1',
                'user-agent':'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.186 Safari/537.36'
    }
    stores = []
    print("retrieving stores")
    for retry in range(10):
        try:
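            # verify=False skips SSL certificate checks; requests will emit an InsecureRequestWarning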
            get_store = requests.get(url, headers=headers, verify=False)
            store_response = get_store.json()
            stores_data = store_response.get('payload',{}).get("storesData",{}).get("stores",[])
            
            if not stores_data:
                print("no stores found near %s"%(zip_code))
                return []
            print("processing store details")
            #iterating through all stores
            for store in stores_data:
                store_id = store.get('id')
                display_name = store.get("displayName")
                address = store.get("address").get("address")
                postal_code = store.get("address").get("postalCode")
                city = store.get("address").get("city")
                phone = store.get("phone")
                distance = store.get("distance")

                data = {
                        "name":display_name,
                        "distance":distance,
                        "address":address,
                        "zip_code":postal_code,
                        "city":city,
                        "store_id":store_id,
                        "phone":phone,
                }
                stores.append(data)
            return stores
        except:
            print(traceback.format_exc())
    
    return []   

if __name__=="__main__":
    
    argparser = argparse.ArgumentParser()
    argparser.add_argument('zip_code',help = 'zip code to search')
    args = argparser.parse_args()
    zip_code = args.zip_code
    scraped_data = locate_stores(zip_code)
    
    if scraped_data:
        print ("Writing scraped data to %s_stores.csv"%(zip_code))
        # open in text mode with newline='' for the csv module under Python 3
        with open('%s_stores.csv'%(zip_code), 'w', newline='') as csvfile:
            fieldnames = ["name","store_id","distance","address","zip_code","city","phone"]
            writer = csv.DictWriter(csvfile,fieldnames = fieldnames,quoting=csv.QUOTE_ALL)
            writer.writeheader()
            for data in scraped_data:
                writer.writerow(data)

Run the whole script by giving the script name followed by a zip code, like this:

python3 walmart_store_locator.py zipcode

For example, to find Walmart stores near zip code 20005 (Washington, DC), we would run:

python3 walmart_store_locator.py 20005

The script will save the results to a file named 20005_stores.csv in the same directory as the script, with one row for each store and the fields listed above.

If you need experts who can assist you with extracting data from difficult websites, contact us for any queries!