Fetch: performing API requests or fetching data (TypeScript only)

The Inngest TypeScript SDK provides a step.fetch() API and a fetch() utility, enabling you to make requests to third-party APIs or fetch data in a durable way by offloading them to the Inngest Platform.

For more information on how Fetch works, see the Fetch documentation.

Getting started with step.fetch()

The step.fetch() API enables you to make durable HTTP requests while offloading them to the Inngest Platform, saving you compute and improving reliability:

src/inngest/functions.ts

import { inngest } from "./client";

export const retrieveTextFile = inngest.createFunction(
  { id: "retrieveTextFile" },
  { event: "textFile/retrieve" },
  async ({ step }) => {
    // The fetching of the text file is offloaded to the Inngest Platform
    const response = await step.fetch(
      "https://example-files.online-convert.com/document/txt/example.txt"
    );

    // The Inngest function run is resumed when the HTTP request is complete
    await step.run("extract-text", async () => {
      const text = await response.text();
      const exampleOccurrences = text.match(/example/g);
      return exampleOccurrences?.length;
    });
  }
);

step.fetch() takes the same arguments as the native fetch API.

Parallelize HTTP requests with step.fetch()

step.fetch() shares all the benefits of step.run(), including the ability to parallelize requests using Promise.all():

const processFiles = inngest.createFunction(
  { id: "process-files", concurrency: 10 },
  { event: "files/process" },
  async ({ step, event }) => {
    // All requests are offloaded and processed in parallel while respecting the concurrency limit
    const responses = await Promise.all(
      event.data.files.map((file) =>
        step.fetch(`https://api.example.com/files/${file.id}`)
      )
    );

    // Your Inngest function is resumed here with the responses
    await step.run("process-files", async () => {
      const bodies = await Promise.all(
        responses.map((response) => response.json())
      );
      // ...process the parsed file contents
    });
  }
)

Note that step.fetch(), like all other step APIs, respects your function's configuration, such as concurrency or throttling.

Caching expensive operations like LLM calls

One of the most powerful features of step.fetch() is automatic result caching. This makes it perfect for expensive operations like calling LLMs, where you want to avoid re-running the same costly operation on retries:

import { inngest } from "./client";

export const summarizeWithAI = inngest.createFunction(
  { id: "summarize-with-ai" },
  { event: "article/summarize" },
  async ({ step, event }) => {
    // This LLM call is expensive, but step.fetch() caches the result
    const response = await step.fetch("https://api.anthropic.com/v1/messages", {
      method: "POST",
      headers: { 
        "x-api-key": process.env.ANTHROPIC_API_KEY,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json"
      },
      body: JSON.stringify({
        model: "claude-3-5-sonnet-20241022",
        max_tokens: 1024,
        messages: [{ 
          role: "user", 
          content: `Summarize this article: ${event.data.content}` 
        }]
      })
    });
    
    const summary = await response.json();

    // If this database write fails and triggers a retry,
    // the LLM call above will NOT be re-executed - the cached result is used
    await step.run("save-to-database", async () => {
      await db.articles.update({
        id: event.data.id,
        summary: summary.content[0].text
      });
    });
    
    // Same here - if sending the email fails, the LLM is not called again
    await step.run("send-notification", async () => {
      await sendEmail({
        to: event.data.author,
        subject: "Your article summary is ready",
        body: summary.content[0].text
      });
    });
  }
);

Important: Order-based caching

step.fetch() uses order-based caching, not URL-based caching like Redis or traditional HTTP caches.

  • The 1st step.fetch() call returns the 1st cached result
  • The 2nd step.fetch() call returns the 2nd cached result
  • And so on...

This means the cache is matched by the position in your code, not by the URL or parameters. Keep this in mind when calling the same endpoint multiple times or using dynamic URLs.

Make 3rd party library HTTP requests durable with the fetch() utility

Inngest's fetch() utility can be passed as a custom fetch handler to make all requests made by a third-party library durable.

For example, you can pass the fetch() utility to the AI SDK or the OpenAI libraries:

import { fetch as inngestFetch } from 'inngest';
import { generateText } from 'ai';
import { createAnthropic } from '@ai-sdk/anthropic';
import { inngest } from './client';

// Pass the Inngest fetch utility to the AI SDK's model constructor:
const anthropic = createAnthropic({
  fetch: inngestFetch,
});

const weatherFunction = inngest.createFunction(
  { id: "weather-function" },
  { event: "weather/get" },
  async ({ step }) => {
    // This request is offloaded to the Inngest platform
    // and it also retries automatically if it fails!
    const { text } = await generateText({
      model: anthropic('claude-3-5-sonnet-20240620'),
      prompt: `What's the weather in London?`,
    });

    return text;
  }
);