Job Methods

Job methods let you create scrape jobs, check their status, and retrieve their results. See the MeterClient reference for complete method signatures.

Quick reference

Method                  Description
create_job()            Create a new scrape job
get_job()               Get job status and results
wait_for_job()          Wait for job completion with polling
list_jobs()             List jobs with filtering
compare_jobs()          Compare two jobs for changes
get_strategy_history()  Get timeline of jobs for a strategy

Common workflows

Create and wait

from meter_sdk import MeterClient

client = MeterClient(api_key="sk_live_...")

# Create job
job = client.create_job(
    strategy_id="550e8400-e29b-41d4-a716-446655440000",
    url="https://example.com/products"
)

# Wait for completion
completed = client.wait_for_job(job['job_id'], timeout=300)

# Process results
for item in completed['results']:
    print(item)
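
If you want to keep the results around (for example, to diff them against a later run), dumping them to JSON is enough. A minimal sketch using only the standard library; the filename is just an example:

import json

# Persist the scraped items so a later run can be compared against them.
with open("products.json", "w") as f:
    json.dump(completed['results'], f, indent=2)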

Poll manually

import time

job = client.create_job(strategy_id=strategy_id, url=url)

while True:
    status = client.get_job(job['job_id'])

    if status['status'] == 'completed':
        print(f"Done! {status['item_count']} items")
        break
    elif status['status'] == 'failed':
        print(f"Failed: {status['error']}")
        break

    print(f"Status: {status['status']}")
    time.sleep(2)
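
For long-running jobs, the fixed two-second sleep above polls more often than necessary. The same loop can back off exponentially; this sketch uses nothing beyond the get_job() call already shown:

import time

delay = 2
while True:
    status = client.get_job(job['job_id'])

    if status['status'] in ('completed', 'failed'):
        break

    time.sleep(delay)
    delay = min(delay * 2, 60)  # cap the wait at 60 seconds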

Compare for changes

# Get last two jobs
jobs = client.list_jobs(strategy_id=strategy_id, limit=2)

if len(jobs) >= 2:
    comparison = client.compare_jobs(jobs[0]['id'], jobs[1]['id'])

    if not comparison['content_hash_match']:
        print("Content has changed!")
    else:
        print("No changes detected")

Monitor failures

# Check for recent failures
failed = client.list_jobs(
    strategy_id=strategy_id,
    status='failed',
    limit=5
)

if failed:
    print(f"Warning: {len(failed)} recent failures")
    for job in failed:
        print(f"  - {job['id']}: {job['error']}")

Need help?

Email me at [email protected]