
Automating Development Pipelines with n8n and Claude Code Action


Pipeline-izing Development

I've created a flow using n8n and Claude Code Action where AI handles everything from planning an issue to implementation and review.
(I also built this as practice for using n8n.)

Incidentally, after some tweaking, Claude was able to build a working n8n workflow quite well; I can really feel the difference compared to Dify.

Repository

https://github.com/tesla0225/n8n_claude_code_automation

*Note: This is configured to run minimally in a local environment and is not production-ready. If you intend to use it in production, please review files like init-data.sh accordingly.
If you like it, please give it a star!

How it Works

For how to run it, see the README.md; here I'll just give a rough overview of the mechanism.

  • A GitHub Webhook is set up, and a workflow is built on the n8n side that uses it as a trigger.
  • The workflow is set to comment on issues or PRs according to the labels.
  • It is split into sub-workflows to make it easy to modify.
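As a rough sketch of what the trigger step boils down to — this is illustrative Python, not the actual n8n node configuration, and the planning label name is my assumption (the other two labels appear in the prompts quoted later):

```python
from typing import Optional

# Sketch of the label-based routing a Switch/IF node in n8n performs.
# "workflow:planning" is an assumed label name; the ready-for-impl /
# ready-for-review labels are the ones used in the actual prompts.
ROUTES = {
    "workflow:planning": "planning-subworkflow",
    "workflow:ready-for-impl": "implementation-subworkflow",
    "workflow:ready-for-review": "review-subworkflow",
}

def route(event: dict) -> Optional[str]:
    """Map a GitHub 'labeled' webhook payload to a sub-workflow name."""
    if event.get("action") != "labeled":
        return None  # ignore pushes, comments, and other webhook events
    return ROUTES.get(event.get("label", {}).get("name"))
```

Splitting the routing out like this is also why sub-workflows stay easy to swap: the trigger only decides *which* flow runs, not *what* it does.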

Below is a brief summary of the flow I built.
n8n operates with the following three flows.

Planning

(Webhook) => Detects that the planning label was attached => Hands off to a sub-workflow => n8n posts a comment in a specified format asking @claude to analyze the issue, draw up an execution plan, and attach the implementation label when done.

Actual comment content being set:

"@claude Analyze this issue against the codebase without implementing it and draw up an execution plan. When you're done, use the gh command to attach the existing "workflow:ready-for-impl" label."
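On the n8n side, posting that comment boils down to one POST to GitHub's issue-comments REST endpoint. A minimal sketch, assuming a plain HTTP node rather than the repository's actual node configuration (the helper name and token handling are mine):

```python
import json
import urllib.request

def build_comment_request(repo: str, issue_number: int, body: str,
                          token: str) -> urllib.request.Request:
    """Build the POST an n8n HTTP node would send to comment on an issue.

    Sketch only: in the real workflow this is an n8n node, and the token
    would come from an n8n credential, not a function argument."""
    url = f"https://api.github.com/repos/{repo}/issues/{issue_number}/comments"
    return urllib.request.Request(
        url,
        data=json.dumps({"body": body}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )
```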

Implementation

(Webhook) => Detects that the implementation label was attached => Hands off to a sub-workflow => n8n posts a comment in a specified format asking @claude to implement the issue, create a PR, and attach the review label when done.

"@claude Look at the entire issue, including the comments, then implement it and create a PR; use the gh command to create the PR. When you're done, use the gh command to attach the existing "workflow:ready-for-review" label to the PR you created."
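The gh invocations that comment asks Claude to run would look roughly like this, composed here as argument lists; the title, body, and PR number are placeholders, not values from the actual workflow:

```python
# Rough shape of the gh commands the prompt asks Claude to run.
# Values passed in are placeholders for illustration.

def gh_create_pr(title: str, body: str) -> list:
    """gh pr create for the implementation branch."""
    return ["gh", "pr", "create", "--title", title, "--body", body]

def gh_add_review_label(pr_number: int) -> list:
    """Attach the existing review label to the PR just created."""
    return ["gh", "pr", "edit", str(pr_number),
            "--add-label", "workflow:ready-for-review"]
```

Attaching the label with gh is what re-enters the pipeline: the "labeled" webhook fires again and the review flow below takes over.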

Review

(Webhook) => Detects that the review label was attached => Hands off to a sub-workflow => n8n posts a comment in a specified format asking Claude to review.

"@claude Please review"
  • Since Claude cannot be requested as a reviewer on a PR it created itself, the review request is posted as a comment instead.

By slightly modifying these flows, you can customize the automation, such as:

  • Instructing Claude to attach a human-required label when it is not confident in the plan.
  • Setting it up not to attach implementation labels (i.e., only the planning is automatic).
  • Attaching only the review label (only the review is automatic).

Also, by combining this with local tools like Claude Code or Cursor, you can delegate tasks asynchronously: open an issue from your local Claude Code and attach the implementation label.

The idea is similar to this article (that one puts AI inside the flow itself, which is useful if, for example, you want to distribute tasks across multiple labels):
https://note.com/gura105/n/n81096d8d9970

Why I didn't complete it using only GitHub Actions

  • To leave room for whatever actions a given organization wants to perform when a label is attached.
  • To keep the option of swapping sub-workflows out for, say, a local LLM (in the future), Claude Code running in a container, or a GitHub Copilot Agent.

Also, integrations like Slack can be easily swapped out.
For example:

  • Connecting a webhook that creates an issue from Slack (or Teams) and assigns a label.
  • Sending the planning content to Slack (or Teams).
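As a sketch of the second bullet, sending the planning content to a Slack incoming webhook (the webhook URL is a placeholder you would create in Slack) might look like:

```python
import json
import urllib.request

def build_slack_notification(webhook_url: str,
                             plan_text: str) -> urllib.request.Request:
    """Build the POST an n8n HTTP node would send to a Slack incoming webhook.

    Sketch only; swap the plain-text payload for Block Kit if you
    want richer formatting."""
    payload = {"text": f"New execution plan from Claude:\n{plan_text}"}
    return urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```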

Referring to the flow above, you could probably build the same thing with GitHub Actions alone.

About "Pipeline-izing Development"

I have been interested in the pipeline-ization of development for a long time.
https://x.com/tesla0225/status/1907953100062953892

The performance wasn't good enough before, but now that Claude Code offers a certain level of reliability, I want to believe this kind of mechanism will become genuinely useful.

Also, something I didn't do this time: the approach of running Claude Code by executing commands in your own container can be built on top of this as well, so those on the Max plan can likely assign issues without worry.
(I didn't make the self-hosted-runner approach the default because I consider it a bit of a gray area.)

Possible Future Feature Additions

  • Adding a sub-workflow that starts in your own container
  • Adding an admin dashboard

I'll think about these while looking at other tools...

Cons

  • There's nothing to be done when an "Overloaded Error" occurs (you have to manually change the label and retry).

Side Note

It feels like the pain points I wrote about around here have finally been resolved.
I used to feel that git worktree carried a high cognitive load, so I didn't want to use it much.
https://zenn.dev/tesla/articles/3768e558b73ad8
https://zenn.dev/tesla/articles/1c0698a1cb9742

I think I've managed to achieve this (though it still needs a bit more tuning).
It feels like just glancing at GitHub is enough, without having to push myself.
https://x.com/ueeeeniki/status/1929747987921875068

For now, I'm hoping this turns out to be the fastest, definitive version.

Please follow me if you'd like

I write articles like this about things I've come up with, and I tweet things that don't make it into articles.
https://x.com/tesla0225

Recent popular tweet:
https://x.com/tesla0225/status/1934806508317110334

Opinionated tweet:
https://x.com/tesla0225/status/1933485696989450356

Bonus

A little deeper dive:
https://zenn.dev/tesla/articles/57b9ebc3117d5f
