acountyable/recorder


This repository organizes our Chrome DevTools (Recorder) extension in the same GitHub repository as the recordings it produces, which are executed by GitHub Actions workflows.

Consolidating these components into a single repository can streamline development, simplify version control, and enhance collaboration. However, it’s essential to structure the repository thoughtfully to maintain clarity and manageability as the project scales.

Table of Contents

  1. Benefits of a Single Repository
  2. Repository Structure
  3. Integrating the Chrome Extension
  4. Setting Up GitHub Actions Workflows
  5. Handling Secrets and Configuration
  6. Publishing the Chrome Extension
  7. Best Practices
  8. Example Repository Structure
  9. Conclusion

1. Benefits of a Single Repository

Centralized Management:

  • Unified Version Control: Track changes to both the extension and scraper recordings in one place.
  • Simplified Collaboration: Team members can access all project components without navigating multiple repositories.

Consistent Configuration:

  • Shared Configuration Files: Utilize shared configurations (e.g., ESLint, Prettier) across the extension and scraper scripts.
  • Simplified Dependency Management: Manage dependencies for both components within a single package.json.

Streamlined CI/CD:

  • Integrated Workflows: Create GitHub Actions workflows that can operate on both the extension and scraper recordings.
  • Consistent Environment Setup: Ensure consistent development and build environments across all project parts.

2. Repository Structure

Organizing your repository logically is crucial for maintainability and scalability. Here's a recommended structure:

recorder/
├── .github/
│   └── workflows/
│       ├── extension-build.yml
│       └── scraper-workflow.yml
├── extension/
│   ├── manifest.json
│   ├── devtools.html
│   ├── background.js
│   ├── scripts/
│   │   ├── recorderPlugin.js
│   │   ├── githubSync.js
│   │   └── ui.js
│   ├── popup.html
│   ├── options.html
│   ├── icons/
│   │   ├── icon16.png
│   │   ├── icon48.png
│   │   └── icon128.png
│   └── README.md
├── recordings/
│   ├── recording1.json
│   ├── recording2.json
│   └── ...
├── scripts/
│   ├── run-scraper.js
│   └── helpers.js
├── results/
│   ├── result1.json
│   ├── result2.json
│   └── ...
├── package.json
├── package-lock.json
├── README.md
└── ... (other project files)

Explanation of Key Directories:

  • .github/workflows/: Contains GitHub Actions workflow YAML files for CI/CD processes related to both the extension and scraper.

  • extension/: Houses all Chrome extension-related files, including scripts, HTML pages, and assets.

  • recordings/: Stores scraper recordings generated by the Chrome Recorder extension.

  • scripts/: Contains utility scripts, such as the scraper execution script (run-scraper.js) and helper functions.

  • results/: Stores output from the scraper, such as JSON data, screenshots, or logs.


3. Integrating the Chrome Extension

a. Directory Placement

Place the Chrome extension within its own directory (extension/) to encapsulate all related files. This separation ensures that extension-specific code doesn't clutter other parts of the repository and facilitates easier maintenance.

b. Managing Dependencies

If your extension or scraper scripts require Node.js dependencies (e.g., puppeteer), manage them centrally via the root package.json. Alternatively, you can create separate package.json files within subdirectories for more granular dependency management.

Option 1: Single package.json

{
  "name": "chrome-recorder-sync",
  "version": "1.0.0",
  "scripts": {
    "build:extension": "webpack --config extension/webpack.config.js",
    "run:scraper": "node scripts/run-scraper.js"
  },
  "dependencies": {
    "puppeteer": "^19.0.0"
  },
  "devDependencies": {
    "webpack": "^5.0.0",
    "webpack-cli": "^4.0.0"
  }
}

Option 2: Multiple package.json Files

recorder/
├── extension/
│   ├── package.json
│   └── ...
├── scripts/
│   ├── package.json
│   └── ...
└── package.json (optional)

Recommendation: For simplicity, especially in smaller projects, use a single package.json at the root. For larger projects with distinct dependencies, consider multiple package.json files.

c. Development and Build Processes

1. Building the Extension:

Use build tools like Webpack or Parcel to bundle your extension scripts, especially if you're using modern JavaScript or TypeScript.

Example webpack.config.js for Extension:

// extension/webpack.config.js

const path = require("path");

module.exports = {
  // The build is invoked from the repository root
  // (webpack --config extension/webpack.config.js), so anchor
  // entry-path resolution to this directory.
  context: __dirname,
  entry: {
    background: "./background.js",
    recorderPlugin: "./scripts/recorderPlugin.js",
    githubSync: "./scripts/githubSync.js",
    ui: "./scripts/ui.js",
  },
  output: {
    path: path.resolve(__dirname, "dist"),
    filename: "[name].bundle.js",
  },
  mode: "production",
};

2. Running Scripts:

Use npm scripts to handle tasks like running the scraper.

Example package.json Scripts Section:

"scripts": {
  "build:extension": "webpack --config extension/webpack.config.js",
  "run:scraper": "node scripts/run-scraper.js"
}

3. Live Reloading and Development:

Consider using webpack-dev-server or similar tools for a better development experience with live reloading.


4. Setting Up GitHub Actions Workflows

With both the Chrome extension and scraper in the same repository, you can define multiple workflows targeting different tasks.

a. Workflow for Extension Development

Create a workflow to build, test, and possibly publish your Chrome extension.

Example extension-build.yml:

name: Extension Build and Publish

on:
  push:
    paths:
      - 'extension/**'
  pull_request:
    paths:
      - 'extension/**'

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install Dependencies
        run: npm install

      - name: Build Extension
        run: npm run build:extension

      - name: Upload Extension Build
        uses: actions/upload-artifact@v4
        with:
          name: extension-build
          path: extension/dist/

Optional: Publishing the Extension

If you intend to automate the publishing process to the Chrome Web Store, you'll need to integrate with Chrome's API using the Chrome Web Store Publish Action or similar.

b. Workflow for Scraper Recordings

Create a workflow that triggers on changes to the recordings/ directory to run the scraper.

Example scraper-workflow.yml:

name: Runner

on:
  push:
    paths:
      - 'recordings/**'
  workflow_dispatch:

jobs:
  run-scraper:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'

      - name: Install Dependencies
        run: npm install

      - name: Run Scraper
        run: npm run run:scraper

      - name: Upload Scraper Results
        uses: actions/upload-artifact@v4
        with:
          name: scraper-results
          path: results/

Explanation:

  • Triggers: Runs on pushes to the recordings/ directory or can be manually triggered.
  • Jobs:
    • Checkout: Clones the repository.
    • Setup: Configures Node.js environment.
    • Install: Installs necessary dependencies.
    • Run Scraper: Executes the scraper script.
    • Upload Results: Saves the scraper output as artifacts for later inspection.

5. Handling Secrets and Configuration

Managing sensitive information securely is paramount, especially when interacting with the GitHub API.

a. GitHub Secrets

Use GitHub Secrets to store sensitive data like GitHub Personal Access Tokens (PATs), which your extension and workflows will use to authenticate with the GitHub API.

  1. Add Secrets to GitHub Repository:

    • Navigate to your repository on GitHub.
    • Go to Settings > Secrets and variables > Actions.
    • Click "New repository secret" and add the following:
      • EXTENSION_GITHUB_TOKEN (a PAT for extension-specific operations; note that secret names beginning with GITHUB_ are reserved, so the automatically provided GITHUB_TOKEN cannot be overridden here)
      • Any other necessary tokens or keys.
  2. Access Secrets in Workflows:

    Reference secrets in your workflow YAML files using ${{ secrets.SECRET_NAME }}.

    Example:

    - name: Run Scraper
      run: node scripts/run-scraper.js
      env:
        GITHUB_TOKEN: ${{ secrets.EXTENSION_GITHUB_TOKEN }}
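
On the script side, the scraper reads the injected token from process.env. A small guard (illustrative, not part of the repository) makes the workflow fail fast with a clear message when a secret was never configured:

```javascript
// Hypothetical excerpt from scripts/run-scraper.js: fail fast when a
// required secret was not injected via the workflow's `env:` block.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage inside the scraper:
// const token = requireEnv("GITHUB_TOKEN");
```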

b. Environment Variables in Extension

To securely pass tokens from the extension to GitHub, consider the following:

  • User Input: Prompt users to input their GitHub PAT via the extension's UI and store it using chrome.storage.sync.

  • Avoid Hardcoding: Do not embed secrets directly into the extension code. Always retrieve them securely at runtime.

Example in githubSync.js:

// scripts/githubSync.js

class GitHubSync {
  // chrome.storage.sync.get is asynchronous, so a constructor alone cannot
  // guarantee the credentials are loaded before other methods run. Expose an
  // init() promise and await it before calling methods that need the token.
  init() {
    return new Promise((resolve) => {
      chrome.storage.sync.get(
        ["githubToken", "repoOwner", "repoName"],
        (result) => {
          this.githubToken = result.githubToken;
          this.repoOwner = result.repoOwner;
          this.repoName = result.repoName;
          resolve();
        },
      );
    });
  }

  // ... existing methods
}

User Interface to Set Credentials (popup.html and ui.js):

Ensure users can input and save their credentials securely.
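
One possible shape for that save handler, with the storage object injected as a parameter so the logic stays testable outside Chrome (in the extension, pass chrome.storage.sync). The function names and the token-prefix check are assumptions, not the extension's actual code:

```javascript
// Hypothetical sketch of the credential-saving logic in scripts/ui.js.
function isPlausibleToken(token) {
  // Classic PATs start with "ghp_", fine-grained ones with "github_pat_".
  return (
    typeof token === "string" &&
    (token.startsWith("ghp_") || token.startsWith("github_pat_"))
  );
}

function saveCredentials(storage, creds, done) {
  if (!isPlausibleToken(creds.githubToken)) {
    done(new Error("Value does not look like a GitHub personal access token"));
    return;
  }
  // In the extension, `storage` is chrome.storage.sync.
  storage.set(
    {
      githubToken: creds.githubToken,
      repoOwner: creds.repoOwner,
      repoName: creds.repoName,
    },
    () => done(null),
  );
}
```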


6. Publishing the Chrome Extension

After development and testing, you may want to publish your Chrome extension to the Chrome Web Store for broader use.

a. Packaging the Extension

  1. Build the Extension:

    Run your build script to generate the final extension files.

    npm run build:extension
  2. Compress the Extension:

    Navigate to the extension/dist/ directory and zip the contents.

    cd extension/dist/
    zip -r ../extension.zip .

b. Publishing to Chrome Web Store

  1. Create a Developer Account:

    • Register at the Chrome Web Store Developer Dashboard and pay the one-time registration fee.

  2. Submit the Extension:

    • Click "Add new item" and upload the extension.zip file.
    • Fill in the required details like description, screenshots, and icons.
    • Set up pricing and distribution options.
  3. Automate Publishing (Optional):

    For advanced setups, use GitHub Actions to automate the publishing process using tools like Chrome Web Store Publish Action.

Example Workflow Step:

- name: Publish to Chrome Web Store
  # Illustrative only: ncipollo/release-action creates GitHub releases and
  # cannot upload to the Chrome Web Store. mnao305/chrome-extension-upload is
  # one action that can; verify its inputs against its documentation.
  uses: mnao305/chrome-extension-upload@v5
  with:
    file-path: extension.zip
    extension-id: your-extension-id
    client-id: ${{ secrets.CHROME_WEBSTORE_CLIENT_ID }}
    client-secret: ${{ secrets.CHROME_WEBSTORE_CLIENT_SECRET }}
    refresh-token: ${{ secrets.CHROME_WEBSTORE_REFRESH_TOKEN }}

Note: Refer to the specific action's documentation for accurate implementation.


7. Best Practices

a. Modular Code Design

  • Separation of Concerns: Keep extension logic separate from scraper scripts.
  • Reusable Components: Develop utility functions that can be shared across different parts of the project.

b. Comprehensive Documentation

  • README.md: Provide clear instructions on setting up, configuring, and using the extension and scraper.
  • Code Comments: Add meaningful comments to explain complex logic or workflows.
  • Contribution Guidelines: If collaborating, outline how others can contribute to the project.

c. Version Control Discipline

  • Branching Strategy: Use branches like main, development, and feature-specific branches.
  • Commit Messages: Write descriptive commit messages to track changes effectively.
  • Pull Requests: Use PRs for code reviews and to maintain code quality.

d. Testing

  • Unit Tests: Implement tests for critical functions, especially those interacting with APIs.
  • Integration Tests: Test the end-to-end flow from recording to synchronization and scraper execution.
  • Manual Testing: Regularly test the extension in Chrome to ensure functionality.

e. Error Handling and Logging

  • Graceful Failures: Handle API errors, network issues, and unexpected inputs gracefully.
  • User Feedback: Inform users of successes, failures, and required actions via the extension's UI.
  • Logging: Maintain logs for both the extension and scraper scripts to facilitate debugging.
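
As one way to implement the "Graceful Failures" point above, a small retry wrapper (illustrative; not part of the repository) can shield GitHub API calls in both the extension and the scraper from transient network errors:

```javascript
// Illustrative helper: retry a flaky async operation (e.g. a GitHub API
// request) with exponential backoff before giving up.
async function withRetries(fn, attempts = 3, baseDelayMs = 100) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Wait baseDelayMs, 2*baseDelayMs, 4*baseDelayMs, ... between tries.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

The final error should still be surfaced: to the extension's UI on the client side, or to the workflow log (non-zero exit) on the scraper side.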

f. Security Measures

  • Secure Storage: Store tokens and sensitive data securely using Chrome's storage APIs.
  • Input Validation: Validate and sanitize all inputs to prevent injection attacks.
  • Least Privilege: Grant only necessary permissions to the extension and workflows.

8. Example Repository Structure

Here’s an expanded view of how your repository might look with the Chrome extension integrated alongside scraper recordings and GitHub Actions workflows:

chrome-recorder-sync/
├── .github/
│   └── workflows/
│       ├── extension-build.yml
│       └── scraper-workflow.yml
├── extension/
│   ├── dist/
│   │   ├── background.bundle.js
│   │   ├── recorderPlugin.bundle.js
│   │   ├── githubSync.bundle.js
│   │   ├── ui.bundle.js
│   │   ├── popup.html
│   │   ├── options.html
│   │   └── icons/
│   │       ├── icon16.png
│   │       ├── icon48.png
│   │       └── icon128.png
│   ├── src/
│   │   ├── background.js
│   │   ├── scripts/
│   │   │   ├── recorderPlugin.js
│   │   │   ├── githubSync.js
│   │   │   └── ui.js
│   │   ├── popup.html
│   │   ├── options.html
│   │   └── icons/
│   │       ├── icon16.png
│   │       ├── icon48.png
│   │       └── icon128.png
│   ├── manifest.json
│   ├── webpack.config.js
│   └── README.md
├── recordings/
│   ├── recording1.json
│   ├── recording2.json
│   └── ...
├── scripts/
│   ├── run-scraper.js
│   ├── helpers.js
│   └── package.json (optional)
├── results/
│   ├── result1.json
│   ├── result2.json
│   └── ...
├── package.json
├── package-lock.json
├── README.md
└── ... (other project files)

Highlights:

  • extension/src/: Contains source files for the extension, which are bundled into dist/ using build tools.

  • recordings/: Centralized location for all scraper recordings.

  • scripts/: Houses scraper execution scripts and utilities.

  • results/: Stores outputs from the scraper runs.

  • package.json: Manages dependencies and scripts for both the extension and scraper.

  • .github/workflows/: Contains separate workflows for building the extension and running the scraper based on different triggers.


9. Conclusion

Integrating your Chrome extension with scraper recordings and GitHub Actions within a single repository is not only feasible but also beneficial for maintaining a cohesive development environment. By following the structured approach outlined above, you can ensure that your project remains organized, scalable, and secure.

Key Takeaways:

  • Organize Logically: Separate different project components into distinct directories to maintain clarity.

  • Manage Dependencies Effectively: Use centralized or modular dependency management based on project size and complexity.

  • Automate with GitHub Actions: Leverage GitHub Actions to build, test, and deploy both your extension and scraper workflows seamlessly.

  • Prioritize Security: Handle all sensitive data with care, using secure storage and minimizing exposure.

  • Document Thoroughly: Maintain comprehensive documentation to facilitate collaboration and future maintenance.

By consolidating your extension and scraper functionalities within a single repository, you create an efficient workflow that simplifies development, testing, and deployment processes. This setup enhances productivity and ensures that all project components evolve in harmony, leveraging the full potential of GitHub’s ecosystem.

About

Centralized Recorder Things like Schemas
