Translated by AI
[Completely Free] Building a Blog System with Remix and Cloudflare Pages
My own blog runs on the same system described here.
- Source code
https://github.com/SoraKumo001/cloud-blog
Environment Configuration
Infrastructure
| Service | Content |
|---|---|
| Supabase | Database |
| Firebase | Storage / Auth |
| Cloudflare Pages | Frontend (Remix) & Backend (GraphQL Yoga) |
| Cloudflare Workers | Image optimization / OGP generation / Prisma QueryEngine |
Everything runs for free. Each service has a generous free tier, and since the final output is delivered through the Cloudflare CDN cache (which has no usage limits), the site can handle tens of thousands of requests per day at no cost.
Main Packages Used
| Package | Description |
|---|---|
| Prisma | ORM for the database |
| GraphQL Yoga | GraphQL server |
| Pothos GraphQL | GraphQL framework |
| Remix | React framework that works well with Cloudflare |
| Urql | GraphQL client |
The system is built with the configuration above.
Since I wanted to integrate the backend and frontend builds, the GraphQL server is hosted inside Remix. What's great about this setup is that it takes only about 17 seconds from running a single local command to a finished build deployed on Cloudflare Pages.
Reasons for Migrating from Next.js to Remix
I was originally using Next.js as the React framework. However, since I wanted to use Cloudflare Pages—which allows commercial use and has generous limits—as my infrastructure, I decided to migrate to Remix, which is highly compatible with it.
Next.js does run on Cloudflare Pages, but when I enabled the Edge runtime for my blog system, a mysterious build error occurred. Strangely, the build would pass whenever I reduced the number of files, regardless of their content. It wasn't a size issue, and since it ultimately proved unsolvable, I gave up on Next.js.
Cloudflare Pages/Workers Limitations to Know in Advance
| Item | Limit |
|---|---|
| Requests | 100,000/day, 1,000/min |
| Memory | 128 MB |
| CPU | 10 ms (in my experience there is some leeway as long as you don't exceed it continuously) |
| External fetch | Max concurrency: 6; up to 50 calls per request |
| Internal services | 1,000 calls per request |
| Body size | 100 MB |
| Worker size | 1 MB (compressed size) |
For image optimization, converting images around 1600×1600 pixels has a high probability of hitting the CPU limit. Also, when writing code that calls external services, be careful to avoid deadlocks under concurrent access.
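Since the platform allows at most six concurrent external connections, one defensive pattern is to funnel outbound requests through a small semaphore so a burst of subrequests cannot stall each other. The following is an illustrative sketch, not code from this blog system; `limitedFetch` is a hypothetical wrapper:

```typescript
// Minimal counting semaphore: keeps the number of in-flight tasks at or
// below a fixed limit, queueing the rest. Illustrative sketch only.
export class Semaphore {
  private queue: (() => void)[] = [];
  private active = 0;
  constructor(private readonly limit: number) {}

  async run<T>(task: () => Promise<T>): Promise<T> {
    // Wait for a free slot before starting the task.
    while (this.active >= this.limit) {
      await new Promise<void>((resolve) => this.queue.push(resolve));
    }
    this.active++;
    try {
      return await task();
    } finally {
      this.active--;
      // Wake one queued waiter, if any.
      this.queue.shift()?.();
    }
  }
}

// Hypothetical usage: cap external fetches at Cloudflare's limit of 6.
const gate = new Semaphore(6);
export const limitedFetch = (url: string) => gate.run(() => fetch(url));
```

The same gate can be shared across a request so that every helper going through `limitedFetch` stays under the platform cap.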
Also, the Worker size is the compressed size of the JavaScript and wasm running on the backend. Files distributed as Pages assets are not counted toward it. If you can't stay within this size, your only option is to split the code into separate Workers.
Required Features
Image Optimization Feature
It's not strictly necessary, but it's recommended because it makes pages lighter. Cloudflare offers a paid image optimization service, but since the goal here is to stay completely free, I ruled it out.
You don't have to build image conversion from scratch; existing libraries are readily available. Most of them are written in C, and the well-known trick is to compile them to wasm and call them from JavaScript. All you need to write yourself is the glue that wires them together.
OGP Image Generation Feature
For Next.js, Vercel provides a library, so this is easy to implement. To do the same on Cloudflare, you have to re-implement similar functionality yourself, owing to differences in specifications such as how wasm is loaded.
Prisma QueryEngine
When using Prisma on an Edge runtime like Cloudflare, the QueryEngine, which is implemented as a native binary, cannot be used. You must use the wasm build instead, but it is around 900 KB even when compressed. Since the free Cloudflare limit is 1 MB compressed, including the engine alone leaves almost no room for anything else. The solution is to separate the QueryEngine from the code that calls it.
Prisma provides a service called Prisma Accelerate, which has built-in functionality to offload the QueryEngine to an external service. While Prisma Accelerate can be used for free, the free tier is limited, so this part can become a bottleneck as traffic increases.
In other words, I had to implement this functionality myself as well. So I built it. Run locally, it connects to the local database, which makes debugging easier; deployed on Cloudflare Workers, it acts as a replacement for Prisma Accelerate.
Migration to Remix
The preparation took some effort, but now I can finally migrate to Remix.
Modifying fetch
The first step is to patch fetch in server.ts, which runs when Remix starts. When Prisma's data-proxy tries to reach the local proxy (127.0.0.1), the patch rewrites https to http; the data-proxy hardcodes https and offers no external setting to change it, so this workaround is unavoidable. Additionally, when an access to DATABASE_URL is detected in production, fetch is switched to the Service Bindings version, which routes the traffic between Workers over Cloudflare's internal network and is somewhat faster.
- server.ts
import { logDevReady } from "@remix-run/cloudflare";
import { createPagesFunctionHandler } from "@remix-run/cloudflare-pages";
import * as build from "@remix-run/dev/server-build";
if (process.env.NODE_ENV === "development") {
logDevReady(build);
}
type Env = {
prisma: Fetcher;
DATABASE_URL: string;
};
const initFetch = (env: Env) => {
const that = globalThis as typeof globalThis & { originFetch?: typeof fetch };
if (that.originFetch) return;
const originFetch = globalThis.fetch;
that.originFetch = originFetch;
globalThis.fetch = (async (input: RequestInfo, init?: RequestInit) => {
const url = new URL(input.toString());
if (["127.0.0.1", "localhost"].includes(url.hostname)) {
url.protocol = "http:";
return originFetch(url.toString(), init);
}
const databaseURL = new URL(env.DATABASE_URL as string);
if (url.hostname === databaseURL.hostname && env.prisma) {
return (env.prisma as Fetcher).fetch(input, init);
}
return originFetch(input, init);
}) as typeof fetch;
};
export const onRequest = createPagesFunctionHandler({
build,
getLoadContext: (context) => {
initFetch(context.env);
return context;
},
mode: build.mode,
});
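The `env.prisma` Fetcher used above comes from a Cloudflare Service Binding. For reference, such a binding is declared in `wrangler.toml` roughly as follows (the bound service name here is an assumption, not taken from the repository):

```toml
# Expose another Worker (here, a hypothetical "prisma-engine" Worker
# hosting the Prisma QueryEngine) as env.prisma, so calls between the
# two travel over Cloudflare's internal network.
[[services]]
binding = "prisma"
service = "prisma-engine"
```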
Backend
Creating a GraphQL Server
This is a combination of GraphQL Yoga + Pothos GraphQL + pothos-query-generator.
pothos-query-generator generates a GraphQL schema automatically by referencing the Prisma schema information. I also created this tool previously. Using it eliminates the need to write resolvers manually, except for some system-dependent logic such as login.
- app/routes/api.graphql.ts
import { ActionFunctionArgs, LoaderFunctionArgs } from "@remix-run/cloudflare";
import { parse, serialize } from "cookie";
import { createYoga } from "graphql-yoga";
import { getUserFromToken } from "@/libs/client/getUserFromToken";
import { Context, getPrisma } from "../libs/server/context";
import { schema } from "../libs/server/schema";
const yoga = createYoga<
{
request: Request;
env: { [key: string]: string };
responseCookies: string[];
},
Context
>({
schema: schema(),
fetchAPI: { Response },
context: async ({ request: req, env, responseCookies }) => {
const cookies = parse(req.headers.get("Cookie") || "");
const token = cookies["auth-token"];
const user = await getUserFromToken({ token, secret: env.SECRET_KEY });
const setCookie: typeof serialize = (name, value, options) => {
const result = serialize(name, value, options);
responseCookies.push(result);
return result;
};
return {
req,
env,
prisma: getPrisma(env.DATABASE_URL),
user,
cookies,
setCookie,
};
},
});
export async function action({ request, context }: ActionFunctionArgs) {
const env = context.env as { [key: string]: string };
const responseCookies: string[] = [];
const response = await yoga.handleRequest(request, {
request,
env,
responseCookies,
});
responseCookies.forEach((v) => {
response.headers.append("set-cookie", v);
});
return new Response(response.body, response);
}
export async function loader({ request, context }: LoaderFunctionArgs) {
const env = context.env as { [key: string]: string };
const responseCookies: string[] = [];
const response = await yoga.handleRequest(request, {
request,
env,
responseCookies,
});
responseCookies.forEach((v) => {
response.headers.append("set-cookie", v);
});
return new Response(response.body, response);
}
- app/libs/server/builder.ts
This sets up the Pothos schema builder.
import SchemaBuilder from "@pothos/core";
import PrismaPlugin from "@pothos/plugin-prisma";
import PrismaUtils from "@pothos/plugin-prisma-utils";
import { PrismaClient } from "@prisma/client/edge";
import PothosPrismaGeneratorPlugin from "pothos-prisma-generator";
import PrismaTypes from "@/generated/pothos-types";
import { Context } from "./context";
/**
* Create a new schema builder instance
*/
type BuilderType = {
PrismaTypes: PrismaTypes;
Scalars: {
Upload: {
Input: File;
Output: File;
};
};
Context: Context;
};
export const createBuilder = (datasourceUrl: string) => {
const builder = new SchemaBuilder<BuilderType>({
plugins: [PrismaPlugin, PrismaUtils, PothosPrismaGeneratorPlugin],
prisma: {
client: new PrismaClient({
datasourceUrl,
}),
},
pothosPrismaGenerator: {
authority: ({ context }) => (context.user ? ["USER"] : []),
replace: { "%%USER%%": ({ context }) => context.user?.id },
},
});
return builder;
};
- app/libs/server/schema.ts
Here, additional resolvers for login and for file uploads to Firebase are added to the schema.
import { GraphQLScalarType, GraphQLSchema } from "graphql";
import { SignJWT } from "jose";
import { createBuilder } from "./builder";
import { prisma } from "./context";
import { getUser } from "./getUser";
import { getUserInfo } from "./getUserInfo";
import { importFile } from "./importFile";
import { normalizationPostFiles } from "./normalizationPostFiles";
import { isolatedFiles, uploadFile } from "./uploadFile";
export const schema = () => {
let schema: GraphQLSchema;
return ({ env }: { env: { [key: string]: string | undefined } }) => {
if (!schema) {
const builder = createBuilder(env.DATABASE_URL ?? "");
builder.mutationType({
fields: (t) => ({
signIn: t.prismaField({
args: { token: t.arg({ type: "String" }) },
type: "User",
nullable: true,
resolve: async (_query, _root, { token }, { setCookie }) => {
const userInfo =
typeof token === "string"
? await getUserInfo(env.NEXT_PUBLIC_projectId, token)
: undefined;
if (!userInfo) {
setCookie("auth-token", "", {
httpOnly: true,
secure: env.NODE_ENV !== "development",
sameSite: "strict",
path: "/",
maxAge: 0,
domain: undefined,
});
return null;
}
const user = await getUser(prisma, userInfo.name, userInfo.email);
if (user) {
const secret = env.SECRET_KEY;
if (!secret) throw new Error("SECRET_KEY is not defined");
const token = await new SignJWT({ payload: { user: user } })
.setProtectedHeader({ alg: "HS256" })
.sign(new TextEncoder().encode(secret));
setCookie("auth-token", token, {
httpOnly: true,
secure: env.NODE_ENV !== "development",
maxAge: 1000 * 60 * 60 * 24 * 7,
sameSite: "strict",
path: "/",
domain: undefined,
});
}
return user;
},
}),
uploadSystemIcon: t.prismaField({
type: "FireStore",
args: {
file: t.arg({ type: "Upload", required: true }),
},
resolve: async (_query, _root, { file }, { prisma, user }) => {
if (!user) throw new Error("Unauthorized");
const firestore = await uploadFile({
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
binary: file,
});
const system = await prisma.system.update({
select: { icon: true },
data: {
iconId: firestore.id,
},
where: { id: "system" },
});
await isolatedFiles({
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
});
if (!system.icon) throw new Error("icon is not found");
return system.icon;
},
}),
uploadPostIcon: t.prismaField({
type: "FireStore",
args: {
postId: t.arg({ type: "String", required: true }),
file: t.arg({ type: "Upload" }),
},
resolve: async (
_query,
_root,
{ postId, file },
{ prisma, user }
) => {
if (!user) throw new Error("Unauthorized");
if (!file) {
const firestore = await prisma.post
.findUniqueOrThrow({
select: { card: true },
where: { id: postId },
})
.card();
if (!firestore) throw new Error("firestore is not found");
await prisma.fireStore.delete({
where: { id: firestore.id },
});
return firestore;
}
const firestore = await uploadFile({
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
binary: file,
});
const post = await prisma.post.update({
select: { card: true },
data: {
cardId: firestore.id,
},
where: { id: postId },
});
await isolatedFiles({
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
});
if (!post.card) throw new Error("card is not found");
return post.card;
},
}),
uploadPostImage: t.prismaField({
type: "FireStore",
args: {
postId: t.arg({ type: "String", required: true }),
file: t.arg({ type: "Upload", required: true }),
},
resolve: async (
_query,
_root,
{ postId, file },
{ prisma, user }
) => {
if (!user) throw new Error("Unauthorized");
const firestore = await uploadFile({
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
binary: file,
});
await prisma.post.update({
data: {
postFiles: { connect: { id: firestore.id } },
},
where: { id: postId },
});
return firestore;
},
}),
normalizationPostFiles: t.boolean({
args: {
postId: t.arg({ type: "String", required: true }),
removeAll: t.arg({ type: "Boolean" }),
},
resolve: async (_root, { postId, removeAll }, { prisma, user }) => {
if (!user) throw new Error("Unauthorized");
await normalizationPostFiles(prisma, postId, removeAll === true, {
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
});
await isolatedFiles({
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
});
return true;
},
}),
restore: t.boolean({
args: {
file: t.arg({ type: "Upload", required: true }),
},
resolve: async (_root, { file }, { user }) => {
if (!user) throw new Error("Unauthorized");
importFile({
file: await file.text(),
projectId: env.GOOGLE_PROJECT_ID ?? "",
clientEmail: env.GOOGLE_CLIENT_EMAIL ?? "",
privateKey: env.GOOGLE_PRIVATE_KEY ?? "",
});
return true;
},
}),
}),
});
const Upload = new GraphQLScalarType({
name: "Upload",
});
builder.addScalarType("Upload", Upload, {});
schema = builder.toSchema({ sortSchema: false });
}
return schema;
};
};
Frontend
Handling Environment Variables and Session Processing
On Cloudflare Pages, environment variables are handed to the application only when a request arrives from a client, so they are not available before that. The Remix documentation suggests reading them in each route's loader, but handling them in handleRequest lets you process them all in one place.
Here, I implemented behavior similar to getInitialProps in Next.js, distributing session data and the familiar NEXT_PUBLIC_* environment variables to components through Context. Note that the values distributed this way are only valid during server-side rendering; they disappear in client-side components, so additional logic is required elsewhere.
- app/entry.server.tsx
/**
* By default, Remix will handle generating the HTTP Response for you.
* You are free to delete this file if you'd like to, but if you ever want it revealed again, you can run `npx remix reveal` ✨
* For more information, see https://remix.run/file-conventions/entry.server
*/
import { RemixServer } from "@remix-run/react";
import { renderToReadableStream } from "react-dom/server";
import { getUserFromToken } from "./libs/client/getUserFromToken";
import { getHost } from "./libs/server/getHost";
import { RootProvider } from "./libs/server/RootContext";
import type { AppLoadContext, EntryContext } from "@remix-run/cloudflare";
export default async function handleRequest(
request: Request,
responseStatusCode: number,
responseHeaders: Headers,
remixContext: EntryContext,
// This is ignored so we can keep it in the template for visibility. Feel
// free to delete this parameter in your app if you're not using it!
// eslint-disable-next-line @typescript-eslint/no-unused-vars
loadContext: AppLoadContext
) {
const rootValue = await getInitialProps(request, loadContext);
const body = await renderToReadableStream(
<RootProvider value={rootValue}>
<RemixServer context={remixContext} url={request.url} />
</RootProvider>,
{
signal: request.signal,
onError(error: unknown) {
// Log streaming rendering errors from inside the shell
console.error(error);
responseStatusCode = 500;
},
}
);
await body.allReady;
responseHeaders.set("Content-Type", "text/html");
return new Response(body, {
headers: responseHeaders,
status: responseStatusCode,
});
}
const getInitialProps = async (
request: Request,
loadContext: AppLoadContext
) => {
const env = loadContext.env as Record<string, string>;
const cookie = request.headers.get("cookie");
const cookies = Object.fromEntries(
cookie?.split(";").map((v) => v.trim().split("=")) ?? []
);
const token = cookies["auth-token"];
const session = await getUserFromToken({ token, secret: env.SECRET_KEY });
const host = getHost(request);
return {
cookie: String(cookie),
host,
session: session && { name: session.name, email: session.email },
env: Object.fromEntries(
Object.entries(env).filter(([v]) => v.startsWith("NEXT_PUBLIC_"))
),
};
};
Common Server/Client Processing
Data passed in entry.server.tsx is received and embedded during the initial HTML rendering. The client side then receives and processes this embedded data. This allows session information and environment variables generated on the server to be handled on the client as well.
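The embed-and-restore step itself boils down to serializing state into a JSON script tag, escaping `<` so the payload cannot terminate the tag early, and parsing it back on the client. A minimal self-contained sketch of that pattern (the function names are illustrative, not from the source):

```typescript
// Serialize server-side state into an inline JSON <script> tag.
// Escaping "<" prevents a value containing "</script>" from breaking
// out of the tag (a classic HTML-injection pitfall).
export const embedState = (id: string, state: unknown): string => {
  const json = JSON.stringify(state).replace(/</g, "\\u003c");
  return `<script id="${id}" type="application/json">${json}</script>`;
};

// On the client the payload would be read back via
// document.getElementById(id)?.textContent; the raw text is passed in
// here so the sketch stays runnable outside a browser.
export const restoreState = <T>(textContent: string): T =>
  JSON.parse(textContent) as T;
```

JSON.parse understands the `\u003c` escapes, so the roundtrip is lossless while the emitted HTML stays safe.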
I use Urql as the GraphQL client, together with https://www.npmjs.com/package/@react-libraries/next-exchange-ssr for SSR. With this exchange integrated, Urql queries in components are automatically server-side rendered just by using them normally; there is no need to write a loader and useLoaderData for every page.
import { NextSSRWait } from "@react-libraries/next-exchange-ssr";
import { cssBundleHref } from "@remix-run/css-bundle";
import {
Links,
LiveReload,
Meta,
Outlet,
Scripts,
ScrollRestoration,
} from "@remix-run/react";
import stylesheet from "@/tailwind.css";
import { GoogleAnalytics } from "./components/Commons/GoogleAnalytics";
import { HeadProvider, HeadRoot } from "./components/Commons/Head";
import { EnvProvider } from "./components/Provider/EnvProvider";
import { UrqlProvider } from "./components/Provider/UrqlProvider";
import { Header } from "./components/System/Header";
import { LoadingContainer } from "./components/System/LoadingContainer";
import { NotificationContainer } from "./components/System/Notification/NotificationContainer";
import { StoreProvider } from "./libs/client/context";
import { RootValue, useRootContext } from "./libs/server/RootContext";
import type { LinksFunction } from "@remix-run/cloudflare";
export const links: LinksFunction = () => [
{ rel: "stylesheet", href: stylesheet },
...(cssBundleHref ? [{ rel: "stylesheet", href: cssBundleHref }] : []),
];
export default function App() {
const value = useRootContext();
const { host, session, cookie, env } = value;
return (
<html lang="ja">
<EnvProvider value={env}>
<StoreProvider initState={() => ({ host, user: session })}>
<UrqlProvider host={host} cookie={cookie}>
<HeadProvider>
<head>
<meta charSet="utf-8" />
<meta
name="viewport"
content="width=device-width, initial-scale=1"
/>
<link rel="preconnect" href="https://fonts.googleapis.com" />
<link
rel="preconnect"
href="https://fonts.gstatic.com"
crossOrigin="anonymous"
/>
<Meta />
<Links />
<GoogleAnalytics />
<NextSSRWait>
<HeadRoot />
</NextSSRWait>
<RootValue value={{ session, env }} />
</head>
<body>
<div className={"flex h-screen flex-col"}>
<Header />
<main className="relative flex-1 overflow-hidden">
<Outlet />
</main>
<LoadingContainer />
<NotificationContainer />
</div>
<ScrollRestoration />
<Scripts />
<LiveReload />
</body>
</HeadProvider>
</UrqlProvider>
</StoreProvider>
</EnvProvider>
</html>
);
}
Head Information Insertion
To embed information such as titles and OGP data, you need to insert content into the head. In Remix you are supposed to use the meta export, but what I want is the equivalent of next/head from Next.js's Pages Router, where head content can be set from inside any component.
So I built one quickly.
- app/components/Commons/Head/index.tsx
import React from "react";
import {
FC,
ReactNode,
createContext,
useContext,
useEffect,
useRef,
useSyncExternalStore,
} from "react";
const DATA_NAME = "__HEAD_VALUE__";
export type ContextType<T = ReactNode[]> = {
state: T;
storeChanges: Set<() => void>;
dispatch: (callback: (state: T) => T) => void;
subscribe: (onStoreChange: () => void) => () => void;
};
export const useCreateHeadContext = <T,>(initState: () => T) => {
const context = useRef<ContextType<T>>({
state: initState(),
storeChanges: new Set(),
dispatch: (callback) => {
context.state = callback(context.state);
context.storeChanges.forEach((storeChange) => storeChange());
},
subscribe: (onStoreChange) => {
context.storeChanges.add(onStoreChange);
return () => {
context.storeChanges.delete(onStoreChange);
};
},
}).current;
return context;
};
const HeadContext = createContext<
ContextType<{ type: string; props: Record<string, unknown> }[][]>
>(undefined as never);
export const HeadProvider = ({ children }: { children: ReactNode }) => {
const context = useCreateHeadContext<
{ type: string; props: Record<string, unknown> }[][]
>(() => {
if (typeof window !== "undefined") {
return [
JSON.parse(
document.querySelector(`script#${DATA_NAME}`)?.textContent ?? "{}"
),
];
}
return [[]];
});
return (
<HeadContext.Provider value={context}>{children}</HeadContext.Provider>
);
};
export const HeadRoot: FC = () => {
const context = useContext(HeadContext);
const state = useSyncExternalStore(
context.subscribe,
() => context.state,
() => context.state
);
useEffect(() => {
context.dispatch(() => {
return [];
});
}, [context]);
const heads = state.flat();
return (
<>
<script
id={DATA_NAME}
type="application/json"
dangerouslySetInnerHTML={{
__html: JSON.stringify(heads).replace(/</g, "\\u003c"),
}}
/>
{heads.map(({ type: Tag, props }, index) => (
<Tag key={`HEAD${Tag}${index}`} {...props} />
))}
</>
);
};
export const Head: FC<{ children: ReactNode }> = ({ children }) => {
const context = useContext(HeadContext);
useEffect(() => {
const value = extractInfoFromChildren(children);
context.dispatch((heads) => [...heads, value]);
return () => {
context.dispatch((heads) => heads.filter((head) => head !== value));
};
}, [children, context]);
if (typeof window === "undefined") {
context.dispatch((heads) => [...heads, extractInfoFromChildren(children)]);
}
return null;
};
const extractInfoFromChildren = (
children: ReactNode
): { type: string; props: Record<string, unknown> }[] =>
React.Children.toArray(children).flatMap((child) => {
if (React.isValidElement(child)) {
if (child.type === React.Fragment) {
return extractInfoFromChildren(child.props.children);
}
if (typeof child.type === "string") {
return [{ type: child.type, props: child.props }];
}
}
return [];
});
HeadRoot is placed in root.tsx. With this in place, setting a Head tag in any component, just like in Next.js, collects that information and inserts it inside the <head> tag.
Component Example
Data is fetched and rendered through Urql hooks generated by graphql-codegen. What makes this different from the typical Remix approach is that it uses no loader or useLoaderData: simply adding next-exchange-ssr to Urql makes query results automatically server-side rendered (SSR).
I also built this previously.
- app/components/Pages/TopPage/index.tsx
import { FC, useMemo } from "react";
import { PostsQuery, usePostsQuery, useSystemQuery } from "@/generated/graphql";
import { useLoading } from "@/hooks/useLoading";
import { PostList } from "../../PostList";
import { Title } from "../../System/Title";
interface Props {}
/**
* TopPage
*
* @param {Props} { }
*/
export const TopPage: FC<Props> = ({}) => {
const [{ data: dataSystem }] = useSystemQuery();
const [{ fetching, data }] = usePostsQuery();
const posts = useMemo(() => {
if (!data?.findManyPost) return undefined;
return [...data.findManyPost].sort(
(a, b) =>
new Date(b.publishedAt).getTime() - new Date(a.publishedAt).getTime()
);
}, [data?.findManyPost]);
const categories = useMemo(() => {
if (!data?.findManyPost) return undefined;
const categoryPosts: {
[key: string]: { name: string; posts: PostsQuery["findManyPost"] };
} = {};
data.findManyPost.forEach((post) => [
post.categories.forEach((c) => {
const value =
categoryPosts[c.id] ??
(categoryPosts[c.id] = { name: c.name, posts: [] });
value.posts.push(post);
}),
]);
return Object.entries(categoryPosts).sort(([, a], [, b]) =>
a.name < b.name ? -1 : 1
);
}, [data?.findManyPost]);
const system = dataSystem?.findUniqueSystem;
useLoading(fetching);
if (!posts || !categories || !system) return null;
return (
<>
<Title>{system.description || "Article List"}</Title>
<div className="flex h-full w-full flex-col gap-16 overflow-auto p-8">
<PostList id="news" title="Newest" posts={posts} limit={10} />
{categories.map(([id, { name, posts }]) => (
<PostList key={id} id={id} title={name} posts={posts} limit={10} />
))}
</div>
</>
);
};
Others
Blurhash
This is the component for the image optimization feature. It converts a given URL into an address served by the image optimizer, and adds Blurhash support. Blurhash displays a blurred placeholder, generated from the original image, until the real image finishes loading. In this blog system, a short Blurhash string is embedded in the filename at upload time, and the placeholder is rendered from that filename. I implemented it, but since images are delivered extremely fast by the optimizer and the Cloudflare CDN, the placeholder is rarely visible in practice.

Note that the Base83 string generated by Blurhash contains characters that cause problems if used as-is in URLs and filenames, so it is first converted back to binary and then re-encoded.
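As an illustration of that conversion, the sketch below maps each Base83 character to its numeric value and re-encodes the bytes as base64url, which is filename- and URL-safe. The function names mirror `base83toFileName` / `fileNameToBase83` from the source, but the bodies are an assumption, not the project's actual implementation (Node's Buffer is used for brevity):

```typescript
// Blurhash's Base83 alphabet; characters like "#", "%", "[", "]" are
// awkward in URLs and filenames if used verbatim.
const BASE83 =
  "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz" +
  "#$%*+,-.:;=?@[]^_{|}~";

// Sketch: Base83 string -> numeric values -> base64url filename.
export const base83toFileName = (hash: string): string => {
  const bytes = Uint8Array.from([...hash], (c) => {
    const i = BASE83.indexOf(c);
    if (i < 0) throw new Error(`invalid Base83 character: ${c}`);
    return i;
  });
  return Buffer.from(bytes).toString("base64url");
};

// Inverse: base64url filename -> original Base83 string.
export const fileNameToBase83 = (name: string): string =>
  [...Buffer.from(name, "base64url")].map((v) => BASE83[v]).join("");
```

Because base64url uses only `A-Z a-z 0-9 - _`, the resulting string can be dropped into a filename or query string without further escaping.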
- app/components/Commons/Image/index.tsx
import { decode } from "blurhash";
import { useEffect, useRef, useState } from "react";
import { useEnv } from "@/components/Provider/EnvProvider";
import { fileNameToBase83 } from "@/libs/client/blurhash";
import { classNames } from "@/libs/client/classNames";
type Props = {
src: string;
width?: number;
height?: number;
alt?: string;
className?: string;
};
const useBluerHash = ({
src,
width,
height,
}: {
src: string;
width: number;
height: number;
}) => {
const [value, setValue] = useState<string>();
useEffect(() => {
const hash = src.match(/-\[(.*?)\]$/)?.[1];
if (!hash || !width || !height) return;
try {
const canvas = document.createElement("canvas");
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext("2d")!;
const imageData = ctx.createImageData(width, height);
const pixels = decode(fileNameToBase83(hash), width, height);
imageData.data.set(pixels);
ctx.putImageData(imageData, 0, 0);
setValue(canvas.toDataURL("image/png"));
} catch (e) {}
}, [height, src, width]);
return value;
};
export const Image = ({ src, width, height, alt, className }: Props) => {
const env = useEnv();
const optimizer = env.NEXT_PUBLIC_IMAGE_URL;
const url = new URL(optimizer ?? src);
if (optimizer) {
url.searchParams.set("url", encodeURI(src));
width && url.searchParams.set("w", String(width));
url.searchParams.set("q", "90");
}
const [, setLoad] = useState(false);
const hashUrl = useBluerHash({
src,
width: width ?? 0,
height: height ?? 0,
});
const ref = useRef<HTMLImageElement>(null);
const isBlur = hashUrl && !ref.current?.complete;
return (
<>
<img
className={classNames(isBlur ? className : "hidden")}
src={hashUrl}
alt={alt}
width={width}
height={height}
/>
<img
className={isBlur ? "invisible fixed" : className}
ref={ref}
src={url.toString()}
width={width}
height={height}
alt={alt}
loading="lazy"
onLoad={() => setLoad(true)}
/>
</>
);
};
AVIF Conversion of Images
When images are uploaded, they are converted to AVIF to reduce their size. The conversion runs an AVIF-encoding wasm module in the browser. That wasm is about 3 MB (still over 1 MB compressed), but because it is served as a browser asset, it doesn't count against the 1 MB Worker limit.
I built this quickly as well.
import { encode } from "@node-libraries/wasm-avif-encoder";
import { encode as encodeHash } from "blurhash";
import { arrayBufferToBase64 } from "@/libs/server/buffer";
import { base83toFileName } from "./blurhash";
const type = "avif";
export const convertImage = async (
blob: Blob,
width?: number,
height?: number
): Promise<File | Blob | null> => {
if (!blob.type.match(/^image\/(png|jpeg|webp|avif)/)) return blob;
const src = await blob
.arrayBuffer()
.then((v) => `data:${blob.type};base64,` + arrayBufferToBase64(v));
const img = document.createElement("img");
img.src = src;
await new Promise((resolve) => (img.onload = resolve));
let outWidth = width ? width : img.width;
let outHeight = height ? height : img.height;
const aspectSrc = img.width / img.height;
const aspectDest = outWidth / outHeight;
if (aspectSrc > aspectDest) {
outHeight = outWidth / aspectSrc;
} else {
outWidth = outHeight * aspectSrc;
}
const canvas = document.createElement("canvas");
[canvas.width, canvas.height] = [outWidth, outHeight];
const ctx = canvas.getContext("2d");
if (!ctx) return null;
ctx.drawImage(img, 0, 0, img.width, img.height, 0, 0, outWidth, outHeight);
const data = ctx.getImageData(0, 0, outWidth, outHeight);
const value = await encode({
data,
worker: `/${type}/worker.js`,
quality: 90,
});
if (!value) return null;
const hash = encodeHash(data.data, outWidth, outHeight, 4, 4);
const filename = base83toFileName(hash);
return new File([value], filename, { type: `image/${type}` });
};
export const getImageSize = async (blob: Blob) => {
const src = await blob
.arrayBuffer()
.then((v) => `data:${blob.type};base64,` + arrayBufferToBase64(v));
const img = document.createElement("img");
img.src = src;
await new Promise((resolve) => (img.onload = resolve));
return { width: img.naturalWidth, height: img.naturalHeight };
};
Using Vite
Midway through development, I switched to Remix's Vite-based setup. I have summarized the issues I ran into at that time here.
Summary
If something doesn't exist, build it. That is the whole point.