Working with PagerDuty :: How to fetch audit records from PagerDuty for a month using the REST API and Node.js

 

PagerDuty is a powerful incident management and response platform widely used for managing IT alerts and orchestrating on-call workflows. As an IT infrastructure engineer or developer, you might need to access and analyze audit records to track system changes, ensure compliance, or troubleshoot issues. Fortunately, PagerDuty provides a REST API that makes it straightforward to programmatically fetch audit records.

In this guide, we'll explore how to retrieve a month's worth of audit records from PagerDuty using the REST API and Node.js. We'll cover everything from setting up your API access keys to implementing efficient pagination for large datasets, ensuring you can effortlessly pull the data you need for analysis or reporting. Whether you're a seasoned developer or just getting started with PagerDuty integrations, this walkthrough will equip you with the skills to work effectively with its API.


Steps to Create a Read-Only API Key for Fetching Audit Records from PagerDuty

  1. Log in to PagerDuty

    • Access your PagerDuty account by logging in with your credentials.
    • Ensure you have the necessary administrative privileges to manage API keys.
  2. Navigate to API Access Keys

    • In the PagerDuty dashboard, click on your profile avatar (usually in the top-right corner).
    • Select Account Settings from the dropdown menu.
    • In the settings menu, find and click API Access Keys under the Developers section.
  3. Create a New API Key

    • Click the Create New API Key button.
    • In the dialog box, provide a descriptive name for the API key (e.g., "Read-Only Key for Audit Records").
  4. Set Permissions

    • Ensure the key has Read-Only permissions.
    • This restricts the API key to fetching data without allowing it to modify any PagerDuty resources.
  5. Save the API Key

    • Once you’ve set the permissions, click Create Key to generate the API key.
    • Copy the API key to a secure location as it will only be displayed once.
  6. Store the API Key Securely

    • Use a secure method to store the key, such as environment variables, a secrets manager, or an encrypted file.
    • Avoid hardcoding the key directly in your source code.
  7. Verify API Access

    • Test the API key by making a simple GET request to the PagerDuty REST API (e.g., the /users or /audit/records endpoint) using a tool like Postman or a basic Node.js script (see the sketch after these steps).
    • Confirm that the key retrieves data successfully and cannot modify any resources.
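
If you want to sanity-check the key before writing the full export script, a minimal Node.js check might look like the sketch below. It assumes the key is exported in your shell as an environment variable named PAGERDUTY_API_TOKEN (the variable name is just a convention used in this post):

const axios = require('axios');

// Minimal check: list a single user with the read-only key.
// Assumes the key is available as the PAGERDUTY_API_TOKEN environment variable.
const verifyKey = async () => {
    try {
        const response = await axios.get('https://api.pagerduty.com/users', {
            headers: {
                Authorization: `Token token=${process.env.PAGERDUTY_API_TOKEN}`,
                Accept: 'application/vnd.pagerduty+json;version=2',
            },
            params: { limit: 1 },
        });
        console.log(`API key works. Fetched ${response.data.users.length} user(s).`);
    } catch (error) {
        console.error(`API key check failed: ${error.response?.status || error.message}`);
    }
};

verifyKey();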

Once your API key is ready, you can use it to fetch audit records programmatically via the REST API in your Node.js application.
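
The full script in the next section relies on the axios and json2csv npm packages; if you don't already have them, install them first with npm install axios json2csv.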


Node.js script for fetching a month of audit records and saving them to a CSV file:


const axios = require('axios');
const fs = require('fs');
const { Parser } = require('json2csv');

// Function to fetch audit records
const fetchAuditRecords = async (token, startDate, endDate) => {
    let allRecords = [];
    let nextCursor = null;

    try {
        do {
            const response = await axios.get('https://api.pagerduty.com/audit/records', {
                headers: {
                    Authorization: `Token token=${token}`,
                    Accept: 'application/vnd.pagerduty+json;version=2',
                },
                params: {
                    since: startDate,
                    until: endDate,
                    limit: 100,
                    cursor: nextCursor,
                },
            });

            console.log(`Fetched ${response.data.records?.length || 0} records from API.`);
            allRecords = allRecords.concat(response.data.records || []);
            // The cursor for the next page may be at the top level or nested under response_metadata
            nextCursor = response.data.next_cursor || response.data.response_metadata?.next_cursor || null;
        } while (nextCursor);

        console.log(`Total records fetched: ${allRecords.length}`);
        return allRecords;
    } catch (error) {
        console.error(`Error fetching records: ${error.response?.data?.error || error.message}`);
        return [];
    }
};

// Function to process records dynamically
const processRecords = (allRecords) => {
    return allRecords.map((record) => {
        let changes = 'No changes found';

        if (record.details) {
            const fieldChanges = record.details.fields?.map((field) => {
                const before = field.before_value || 'N/A';
                const current = field.value || 'N/A';
                return `${field.name}: ${before} -> ${current}`;
            });

            const referenceChanges = record.details.references?.map((ref) => {
                const added = ref.added?.map((item) => item.summary).join(', ') || 'N/A';
                const removed = ref.removed?.map((item) => item.summary).join(', ') || 'N/A';
                return `${ref.name}: Added -> [${added}], Removed -> [${removed}]`;
            });

            const allChanges = [...(fieldChanges || []), ...(referenceChanges || [])];
            changes = allChanges.length > 0 ? allChanges.join('; ') : changes;
        }

        return {
            id: record.id,
            action: record.action,
            execution_time: record.execution_time,
            resource_summary: record.root_resource?.summary || 'N/A',
            resource_type: record.root_resource?.type || 'N/A',
            actor_name: record.actors?.map((actor) => actor.summary).join(', ') || 'N/A',
            method_type: record.method?.type || 'N/A',
            changes: changes,
        };
    });
};

// Function to save records to CSV
const saveToCSV = (formattedRecords, outputPath) => {
    if (formattedRecords.length === 0) {
        console.log('No records to save in CSV format.');
        return;
    }

    try {
        const parser = new Parser();
        const csv = parser.parse(formattedRecords);
        fs.writeFileSync(outputPath, csv);
        console.log(`Records saved to CSV at: ${outputPath}`);
    } catch (error) {
        console.error(`Error saving to CSV: ${error.message}`);
    }
};

// Main function
const main = async () => {
    const token = process.env.PAGERDUTY_API_TOKEN; // Read the API key from an environment variable (see step 6 above)
    const startDate = '2024-11-01T00:00:00Z'; // Start of the month to export
    const endDate = '2024-11-30T23:59:59Z'; // End of the month to export
    const outputPath = 'audit_records.csv'; // Output CSV file path

    const allRecords = await fetchAuditRecords(token, startDate, endDate);
    const formattedRecords = processRecords(allRecords);

    console.log('Sample processed records:', formattedRecords.slice(0, 5));
    saveToCSV(formattedRecords, outputPath);
};

// Run the script
main();



This code retrieves audit records from PagerDuty for a specified date range using the REST API and processes them for reporting. Here's a breakdown:

  1. Fetch Audit Records: The fetchAuditRecords function uses axios to send paginated requests to the PagerDuty API and retrieve audit records based on the provided start and end dates.

  2. Process Records: The processRecords function formats the fetched records into a structured format, including details like changes made, actor information, and resource summaries.

  3. Save Records to CSV: The saveToCSV function converts the processed records into CSV format using json2csv and saves the file locally.

  4. Main Execution: The main function coordinates the process by fetching records, formatting them, and saving the results to a CSV file. It also logs sample records for verification.

This script is ideal for exporting and analyzing PagerDuty audit logs programmatically.
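
The example above hard-codes a specific month. If you would rather have the script always export the previous calendar month, you can compute the since and until values at runtime instead of editing the dates by hand. A minimal sketch (the helper name getLastMonthRange is just illustrative):

// Compute an ISO 8601 date range covering the previous calendar month (UTC).
const getLastMonthRange = () => {
    const now = new Date();
    // First day of the current month at 00:00:00 UTC
    const startOfThisMonth = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth(), 1));
    // First day of the previous month at 00:00:00 UTC (Date.UTC handles the January rollover)
    const startOfLastMonth = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth() - 1, 1));
    return {
        since: startOfLastMonth.toISOString(),
        // One millisecond before the current month starts, i.e. the end of last month
        until: new Date(startOfThisMonth.getTime() - 1).toISOString(),
    };
};

// Usage inside main():
// const { since, until } = getLastMonthRange();
// const allRecords = await fetchAuditRecords(token, since, until);

The since and until parameters accept ISO 8601 timestamps, so the values returned by toISOString() can be passed straight through to fetchAuditRecords.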