Phaser World Issue 219

Welcome to another issue of Phaser World! We kept this one back a few days so we could include the latest release of Phaser v4 within it. Plus we’ve got new games, 3 tutorials and part 1 of a massive guide on how rendering works in Phaser 4.
This week:
⭐️ Phaser v4 Release Candidate 2

It’s here! Thank you to everyone who has been helping test Phaser 4. As we've worked through both your findings and our own tests, we identified some further issues that needed resolving, to ensure it's as stable as it can possibly be. On that front, we bring you Release Candidate 2. This version resolves all reported issues so far and will likely form the final build.
🎦 Phaser Showcase
» Six-Sided Streets

Discover Six-Sided Streets, a stylish turn-based strategy game inspired by tabletop hex-map mechanics, with elegant visuals as players create a charming town using tri-hex tiles. Each turn presents a set of three interconnected hexes featuring various elements like roads, parks, and buildings.
Thoughtfully place streets, wind turbines and parks to optimize your town's layout and maximize points.
» Ethgard Legends

An innovative blockchain-based game that seamlessly combines elements of auto battlers and collectible card games. It is made for strategy enthusiasts, TCG fans, and Web3 gamers seeking a competitive, skill-based experience backed by true asset ownership. Buy, position, and upgrade your cards to battle enemy heroes while collecting, trading, and upgrading your own.
👨‍🏫 Tutorials
» Phaser tile based platformer handling slopes

Learn how to implement smooth player movement over angled tile slopes in a Phaser platformer without relying on Phaser's Arcade physics. Use Tiled Map Editor to design levels with slopes and flat terrain and gain a deeper understanding of game mechanics by building it from scratch.
» How to Create a Stage in Phaser.js

Naosim shares his experience building action game stages with Phaser.js. He notes that Phaser.js requires precise implementation, as even small deviations can cause functionality issues.
The article explains the initial setup using the arcade physics engine and player controls with arrow keys. It then presents two correct implementation methods for creating platforms: using Group and StaticGroup.
» Basic Enemy in PhaserJS

Creating a basic enemy character with movement, health, damage response, and temporary invincibility in a PhaserJS platformer game requires several implementation steps.
This tutorial is for beginner to intermediate game developers who want to add interactive and animated enemies to their mobile platformer games.
🪵 Phaser Studio Developer Logs
This is what the Phaser Studio team was up to last week…
» 👨‍💻 Rich - CTO & Founder
Oh, it’s been a while since I wrote one of these! Things have just been so crazy, with constant meetings and project sprints that, quite frankly, even sleep has been a luxury, never mind taking some time out to pen my thoughts. Running a start-up was always going to be intense. I knew that from the beginning. But morphing that start-up into its next stage? Wow, that is on a whole other level of crazy! I’m very thankful to be working with some great people to make this happen, and we’ll have more to report on this soon.
Back in more normal territory, Phaser 4 continues to evolve. Right now, while Ben takes a well-earned holiday, the team is going through thousands of examples and test cases to check to see if we’ve overlooked anything, or if RC2 is the one. So far, so good, but it’s always the little things that catch you out. Plus, we’ve a stack of side dishes to prepare to go with the main course: Updating the templates, the examples site, Phaser Editor, Phaser Launcher, the CLI tool, the documentation, checking through things we’ve published like the Phaser by Example book, and so on. Long gone are the days when we can simply whack the ‘release’ button on GitHub and be done with it. But that’s fine, it is just what’s required now.
Anyway, I’m going to keep this short - please play with RC2 and share your feedback. Hopefully, in a few weeks, we can celebrate the release together.
» 🧑‍💻 Arian - Phaser Editor
Hello friends
What a week! If you're following the news in the world of game development and AI, you've most likely seen some videos where a user asks an AI (LLM) to take control of a game editor and start creating levels. Well, Phaser Editor is joining the party!
This past week, we've been working on integrating Claude Desktop with Phaser Editor. Basically, you can ask Claude to generate the objects for a level or scene. You provide the details, and Claude will start working on the scene.
Implementing this feature in the editor has been a real treat. I confess that when I was given this task, I thought it would be very complicated, but the truth is, it isn't. Several AI products already have an API that makes it relatively easy to connect AI with other local applications. Specifically, we're working with the Model Context Protocol (MCP) (https://modelcontextprotocol.io/) and Claude Desktop.
This is very new to me, and I still have a lot to learn, but the general idea is that you implement a server where you define operations (tools) that the AI can execute to interact with your product.
I've only implemented a few operations, but you can already ask Claude to build a level using images and text. In the following video, I show you how Claude instructs the editor after I ask him to analyze the project's assets and generate a Super Mario-style platformer level:
Note: Claude isn't generating the game assets, but rather using the assets you already have in your project. These are the commands (tools, in MCP terminology) that we've implemented so far:
`project-get-available-textures` - Scans all the project's asset packs and returns a list of available textures. A texture can be either an image or a frame in a texture atlas.
`project-get-texture-binary` - Captures the texture contents in PNG format.
`scene-get-scene-dimension` - Returns the scene dimensions.
`scene-get-screenshot` - Returns a screenshot of the scene. The LLM decides which part of the scene to capture.
`scene-get-scene-data` - Returns all scene information. Since the `.scene` format used by the editor is very clear, this operation simply returns the contents of the `.scene` file as text.
`scene-move-object-in-render-list` - Moves an object in the scene's render list. This is important because the order of objects in the render list affects the order in which they are drawn on the screen.
`scene-delete-objects` - Deletes objects from the scene.
`scene-move-objects-to-parent` - Moves objects to a container, layer, or the scene's display list.
`scene-pack-objects-in-container` - Packs objects into the smallest container possible, maintaining the objects' global position.
`project-get-texture-content-bounding-box` - Returns the bounding box of a texture. This is important because the LLM needs to know the "true" size of the object containing the texture. In other words, it excludes the transparent space surrounding the texture.
`scene-add-image` - Adds an Image game object to the scene. The LLM can choose from the textures available in the project.
`scene-update-image` - Updates the properties (such as position, scale, texture, etc.) of an existing Image game object in the scene.
`scene-add-text` - Adds a Text game object to the scene.
`scene-update-text` - Updates the properties (such as position, text, color, shadow, size, etc.) of an existing Text game object in the scene.
`scene-add-layer` - Adds a Layer game object to the scene.
`scene-update-layer` - Updates the properties of an existing Layer game object in the scene.
`scene-add-container` - Adds a Container game object to the scene.
`scene-update-container` - Updates the properties of an existing Container game object in the scene.
As you can see, we've implemented just a few operations. There are many more yet to be implemented, but even so, Claude can already do some very interesting things. The philosophy of Phaser Editor is to assist both novice and experienced programmers with visual tools. The idea is the same with AI. The AI could (we hope, at some point) do most of the work of setting up scenes and writing game logic, but it must also be able to assist an advanced programmer in performing repetitive or tedious tasks.

Last night, I ran some fun (for me) tests where I asked Claude to split the scene into layers, following the game's style, and to move each of the scene's objects to the corresponding layer:

I asked him to group the objects that were part of the same platform into a container:

I also asked him to change the name of each object in the scene from English to Spanish. This is something that can be tedious to do by hand, but Claude did it himself. And he did it very well.

Of course, it's important to be precise in the tasks you ask the AI to do, and knowing the tools the AI has access to is very helpful. At first, I had difficulty joining the platform's edges. I implemented the "screenshot" operations in the hope that it could analyze a global image of the scene and correct the mismatches, but it wasn't working. Finally, I was able to solve this by building a platform myself and asking Claude to learn how to do it. He also had to take into account the "origin" values of each object and their transparent padding. Something similar happens with the items that rest on the platforms: sometimes it places them in the air or too close to the ground. But after "training" Claude, he was able to work much better.

The "screenshot" operation, however, is very useful for Claude to understand the colors in the scene. With this information, he was able to create the game title text with the appropriate style on the first try. No changes were necessary:

Something that excites me is the fact that the LLM doesn't make changes that are incompatible with the scene format, as each modification is made through the operations we've implemented, which are practically high-level wrappers for the operations the user can perform in the editor.

There's still a lot to do. I'm looking forward to seeing how the AI can create levels made of tilemaps, how it can create sprite animations and particle animations, and how it can inject code into the scene without deviating from the editor's conventions. I'd also love to incorporate an asset generator and, eventually, for the AI to be able to generate entire projects. But if there's one thing that distinguishes the editor, it's that every step we take is very well thought out, and we always try to find what helps users the most. Best regards!
» 💎Tales from the Pixel Mines by Ben Richards

Phaser 4 Rendering Concepts
Phaser 4 is a total overhaul of the WebGL rendering engine. In this article we'll go through why we needed an overhaul, what we added in the process, and how the renderer does what it does, with a focus on performance. Due to the size of this article we’ll split the content over two issues of the newsletter!
State of Phaser 3: Why We Did It
Phaser 3 has been evolving for the last 7 years. This made it a highly capable game engine for the Web. It established rendering solutions for performant gaming situations, powerful effects, and flexible rendering targets.
However, these solutions all had to share the WebGL state. WebGL works by setting various internal parameters, the "state", which take effect when commands run. For example, to create a texture with premultiplication, we run `gl.pixelStorei(gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL, true)` to set the state, then run `gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image)` to upload `image` with premultiplication. The `texImage2D` command has no way to set premultiplication; it is provided solely by the WebGL state.
There are a lot of internal parameters in the WebGL state, so there are a lot of possible settings that could apply to a command. We have to make sure that every relevant parameter is correct before running that command, or something could go wrong: a texture could appear upside-down, or be cut off at the wrong place, or try to draw to itself (this is an error).
The problem was, Phaser 3 generally trusted each rendering solution to handle WebGL itself. So if a single solution forgot to set or unset a parameter, the WebGL state would start to do unexpected things in other solutions. For example, some kinds of FX (but not all) would mess up a Mask.
It was no longer tenable to align all the disparate rendering solutions. We had to rebuild the render system, such that rendering didn't require alignment at all.
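The core idea behind a managed state can be sketched in a few lines of plain JavaScript. This is purely illustrative and not Phaser's actual implementation: a tiny tracker that caches every parameter it sets, skips redundant GL calls, and can replay the whole known state after a context loss (the `gl` object here is a stand-in, not a real WebGL context):

```javascript
// Minimal sketch of a managed WebGL state (not Phaser's real code): cache
// every parameter we set, only touch the GL context when a value changes,
// and replay the cache to restore state after a context loss.
function createStateManager(gl) {
  const cache = new Map();
  return {
    set(param, value) {
      if (cache.get(param) === value) return false; // already in this state
      cache.set(param, value);
      gl.pixelStorei(param, value); // apply the real GL call
      return true;
    },
    // On context restoration, replay everything we know about the state.
    restore() {
      for (const [param, value] of cache) gl.pixelStorei(param, value);
    }
  };
}

// Stand-in GL context that just counts how often it is actually called.
let calls = 0;
const fakeGL = { pixelStorei: () => { calls++; } };
const state = createStateManager(fakeGL);

state.set('UNPACK_PREMULTIPLY_ALPHA_WEBGL', true);  // applied
state.set('UNPACK_PREMULTIPLY_ALPHA_WEBGL', true);  // skipped, no GL call
state.set('UNPACK_PREMULTIPLY_ALPHA_WEBGL', false); // applied
console.log(calls); // 2
state.restore();    // replay known state after a context loss
console.log(calls); // 3
```

This illustrates two of the goals below at once: we always know what the state is, and restoring a revoked context is just a replay of the cache.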
Goals of the New Renderer
If it ain't broke, don't fix it. The Phaser 3 API should remain intact everywhere, except the new renderer and renderer-related systems.
Manage WebGL state. We should always know what the state is.
Facilitate WebGL context restoration. Sometimes, the browser can revoke WebGL. With a managed state, we know all the settings to re-apply.
Prioritize context-agnostic rendering. Objects should render the same way whatever context they're in, and require minimal knowledge of context. This makes behavior easy to predict for developers.
Externalize drawing. If Phaser itself can handle a standard drawing procedure, objects can remain agnostic about these standards and handle less data themselves.
Follow standards. WebGL has its own ideas about coordinate orientation, GLSL shader code, etc. Wherever possible, we should adhere to WebGL's standards, and not try to fight them.
Make major changes. Phaser 4 is a major release, so it's the perfect time to fix API decisions that were holding us back in Phaser 3.
Centralize quads. Our early development mantra was "Phaser draws quads". A lot of WebGL practices are best for 3D applications, with models consisting of large collections of linked triangles. But Phaser draws sprites, which are four-sided ("quad") shapes, so that's what we should do best.
Prioritize performance. Everything should go fast.
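The "Phaser draws quads" mantra has a concrete shape: each sprite is four vertices drawn as two triangles via an index list, and a batch of N quads is just one buffer with 4 vertices and 6 indices per quad. A minimal sketch (illustrative, not Phaser's internal buffer layout):

```javascript
// A quad is four vertices rendered as two triangles through an index list.
// For a batch of quadCount quads, emit 6 indices per quad into one buffer.
function quadIndices(quadCount) {
  const indices = new Uint16Array(quadCount * 6);
  for (let q = 0; q < quadCount; q++) {
    const v = q * 4; // first vertex of this quad
    // Two triangles: (v, v+1, v+2) and (v+2, v+3, v)
    indices.set([v, v + 1, v + 2, v + 2, v + 3, v], q * 6);
  }
  return indices;
}

console.log(Array.from(quadIndices(2)));
// [0, 1, 2, 2, 3, 0, 4, 5, 6, 6, 7, 4]
```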
What's New in Phaser 4
As Phaser 4 developed, and went through a great public beta process, we found opportunities to add and update features. Let's look at significant removals, additions, and changes.
This is not an exhaustive list. As always, check the API documentation for the most accurate details.
Significant Removals
Pipelines
Derived FX
Mesh and Plane
BitmapMask
Point
Pipelines
A `Pipeline` was a Phaser 3 rendering system. Frequently, pipelines had several responsibilities, such as the Util pipeline, which handled various different rendering tasks. This contributed to some of the alignment difficulties.
The Pipeline was replaced by the `RenderNode` in Phaser 4. A render node is intended to handle a single rendering task. This makes maintenance straightforward. All render nodes have a `run` method to execute that task. Some render nodes have a `batch` method, so they can assemble state from several sources before invoking `run`.
It's generally not necessary to handle render nodes, but game objects do maintain `defaultRenderNodes` and `customRenderNodes` maps to support configuration. Common tasks like enabling lighting, which originally involved assigning a new pipeline, can now be performed via `gameObject.setLighting(true)`.
Derived FX
The FX `Bloom`, `Gradient`, `Shine`, `Vignette`, and `Wipe` were removed.
These FX are all "derived" from other, more basic processes. Many involve a gradient function of some sort, and there are a whole lot of possible gradient definitions. This means a lot of possible complexity.
We removed these FX, with the intention that they can be built up from the Filters included in Phaser 4, and perhaps some custom texture shaders to provide gradients. For example, the Bloom filter can be recreated using ParallelFilters with a top pass of Threshold and Blur to select bright tones and spread them out, blended onto the bottom pass with ADD.
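The per-channel math behind that Bloom recreation is simple to sketch. This is an illustration of the idea only (the blur step is omitted, channel values are in the 0-1 range, and the function names are not Phaser API):

```javascript
// Sketch of the Bloom recreation described above: a Threshold pass selects
// bright tones, and the result is ADD-blended onto the unmodified bottom
// pass. Illustrative math only; not Phaser's Filter API.
function threshold(value, edge) {
  return value >= edge ? value : 0; // hard threshold: keep only bright tones
}

function addBlend(bottom, top) {
  return Math.min(1, bottom + top); // ADD blend, clamped to 1
}

const bright = 0.8; // a bright source value survives the threshold and blooms
console.log(addBlend(bright, threshold(bright, 0.7))); // 1

const dim = 0.3;    // a dim value contributes nothing to the bloom pass
console.log(addBlend(dim, threshold(dim, 0.7)));       // 0.3
```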
See below for how FX have become Filters.
Mesh and Plane
We intend to handle 3D properly in the future, so we removed these limited 3D implementations.
BitmapMask
The `BitmapMask` class was removed, because it was only ever used in WebGL, and WebGL now has the Mask filter for more powerful masking operations.
`GeometryMask` remains available in the canvas renderer, but not in WebGL.
Point
The `Point` class was removed because it overlapped with `Vector2`. We only need one way to represent x,y coordinates.
Significant Additions
The following game objects were added. They are all only available in WebGL, except Stamp which is also available in Canvas.
CaptureFrame
SpriteGPULayer
Stamp
TilemapGPULayer
CaptureFrame
The `CaptureFrame` game object does not render. Instead, it copies whatever has already been rendered and saves it to a texture for later use. This is useful for applying filters to part of the scene's depth, without messing about with layers, containers, or dynamic textures. As it is a simple object, it can be moved around in the display list to change what it captures.
Technically, CaptureFrame copies the WebGL framebuffer that is currently bound. Mostly this means the main game canvas. Filters and DynamicTextures use their own framebuffers, so a CaptureFrame inside a Container with filters will capture just the contents of the Container.
SpriteGPULayer
The `SpriteGPULayer` game object renders static objects with very high efficiency. It is intended to render millions of background objects. By comparison, Phaser 3 and 4 can typically render tens of thousands of sprites with good performance; SpriteGPULayer is a hundred times faster.
SpriteGPULayer is very powerful, but requires special handling to reach its full potential. It works because, unlike other WebGL rendering systems, it doesn't update its data every frame. It renders a large buffer of static data. You can update its data after it is created, but this update costs, just like regular sprites.
Why not just render a flat background image? First, SpriteGPULayer supports scroll factor on its member objects, allowing background parallax. Second, although it is static, it can still be animated. Member objects can use frame animation, and also support an extensive range of customizable animations on their properties. They can fade in and out, bounce around or wave in the wind, grow and shrink, fall past the camera repeatedly, adjust their colors, etc. They're perfectly suited to bring life to a non-interactive background.
SpriteGPULayer is the reflection of other Phaser game objects, which are interactive and get updated every frame. We were checking for bottlenecks in the regular rendering process, and we realized that the GPU update bottleneck simply went away if the data was static. Thus SpriteGPULayer was born.
Stamp
The `Stamp` game object renders a quad without any reference to the camera. This is mostly used for `DynamicTexture` operations, but it might be useful elsewhere. Note that, because it doesn't use the camera at all, it may occasionally create unexpected results.
TilemapGPULayer
The `TilemapGPULayer` object is an option within `Tilemap`. When you create a `TilemapLayer`, you can instead create a `TilemapGPULayer`.
TilemapGPULayer renders the layer as a single quad. This has quality and performance advantages. Because the shader has knowledge of the full layer, it can accurately blend across tile boundaries, resulting in perfect texture filtering when antialiasing is enabled. And because the shader cost is per-pixel, not per-tile, it can render very quickly.
For technical reasons, mobile GPUs may not render TilemapGPULayer as efficiently as desktop GPUs. Bear this in mind before choosing it for your project.
However, because it renders per-pixel, TilemapGPULayer suffers no performance loss when rendering large numbers of tiles. It can render a tile layer up to 4096x4096 tiles, and will happily render the entire layer just as quickly if the camera zooms out to see all 16 million tiles. If you need extremely large numbers of tiles on screen at once, TilemapGPULayer may still be a superior choice for mobile platforms.
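The reason the cost is per-pixel rather than per-tile is that, for any pixel in layer space, the tile index and the position within that tile fall out of a couple of divisions. A sketch of that lookup in plain JavaScript (illustrative only; the real work happens in the shader):

```javascript
// Per-pixel tile lookup, as a TilemapGPULayer-style shader would do it:
// given a pixel position in layer space, find which tile it falls in and
// where inside that tile it sits (as 0-1 coordinates).
function tileAtPixel(px, py, tileWidth, tileHeight) {
  return {
    tileX: Math.floor(px / tileWidth),   // which tile column
    tileY: Math.floor(py / tileHeight),  // which tile row
    u: (px % tileWidth) / tileWidth,     // 0-1 position inside the tile
    v: (py % tileHeight) / tileHeight
  };
}

console.log(tileAtPixel(100, 40, 32, 32));
// { tileX: 3, tileY: 1, u: 0.125, v: 0.25 }
```

Because this runs once per visible pixel, zooming out to show millions of tiles costs no more than showing a handful.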
Significant Changes
We changed the way we handle some systems. We want to preserve the API wherever possible, but in some cases changes were necessary.
GL Orientation
FX and Masks are now Filters
Lighting
DynamicTexture and RenderTexture API
Graphics and Shape API
Shader API
TileSprite API
GL Orientation
GL and WebGL use different coordinate systems than other computer systems. In most systems, the coordinate 0,0 refers to the top left of the screen. In GL, it refers to the bottom left of textures, and the middle of vertex clip space.
Phaser 3 chose to represent things using top-left orientation. This led to mismatches between different parts of the system: it would draw framebuffers upside-down, then flip them over to draw to the screen, where GL conventions must be followed.
Phaser 4 has switched to using GL orientation. This is largely invisible to the user, as we take care of texture coordinate handling. Some shader code may need to be revised, as the top and bottom might have switched. Mostly it just simplifies the codebase.
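The flip itself is a one-line conversion. A sketch (illustrative, not Phaser code) of mapping a top-left pixel coordinate into GL's bottom-left texture space:

```javascript
// Convert a top-left-origin pixel coordinate into GL texture space:
// 0-1 UVs with Y pointing up, so the top of the image maps to v = 1.
function topLeftToGL(x, y, width, height) {
  return {
    u: x / width,
    v: (height - y) / height // flip the Y axis
  };
}

console.log(topLeftToGL(0, 0, 256, 256));     // { u: 0, v: 1 }   top-left
console.log(topLeftToGL(128, 256, 256, 256)); // { u: 0.5, v: 0 } bottom-centre
```

If your shader's top and bottom appear swapped after upgrading, this flip is usually the reason.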
If you are using compressed textures, note that Phaser 4 requires them to be encoded with the Y axis pointing "up". This is usually available as a "flip Y" option in your texture compression software. We cannot re-encode these textures at runtime, due to the way compression achieves efficiency.
FX and Masks are now Filters
We unified FX and Masks under the new title Filters. Filters are handled in a standard way, ensuring that they are all compatible even if they have no knowledge of one another.
A Filter is a simple process: it takes an input image, and creates an output image. In most cases this can be done with a single shader program, so custom filters are easy to author.
Filters can be applied to any game object or scene camera. Phaser 3 had restrictions on which objects supported FX, and whether preFX and postFX were available. Phaser 4 does not have this restriction. You can even apply filters to `Extern` objects.
Filters are divided into "internal" and "external" lists. Internal filters affect just the object. External filters affect the object in its rendering context, usually the full screen. Internal filters are a good way to have filters match the position of the object.
Note that some objects cannot define the internal space, so they use the external space instead. Objects without width or height are affected. `Shape` objects are also affected, as they may have a stroke which would be cut off by the reported width and height; they can turn this off by setting `shape.filtersFocusContext = false`.
As noted above, we removed the filters Bloom, Gradient, Shine, Vignette and Wipe.
The existing filter ColorMatrix shifted its color management methods onto a property called `colorMatrix`, so you would now call `colorMatrix.colorMatrix.sepia()`.
We added the new filters Blend, Mask, Parallel Filters, Sampler, and Threshold.
Blend combines the input with a texture. This is similar to blend modes, but whereas WebGL blend modes are fairly restricted, this filter supports all the blend modes available in the canvas renderer. The Blend filter also supports overdriving, mixing outside the usual 0-1 range, which can create useful color effects.
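"Overdriving" is easiest to see as a linear mix whose amount is allowed outside 0-1, pushing the result past either input before the final clamp. A sketch of the math (illustrative only; not the Blend filter's API):

```javascript
// Overdriven blend amount: a plain lerp where the amount may exceed 1
// (or go negative), clamped at the end as a display would.
function overdriveMix(a, b, amount) {
  const mixed = a + (b - a) * amount;
  return Math.min(1, Math.max(0, mixed));
}

console.log(overdriveMix(0, 0.5, 1));  // 0.5 (a normal full mix stops at b)
console.log(overdriveMix(0, 0.5, 2));  // 1   (overdriven past b, clamped)
console.log(overdriveMix(0, 0.5, -1)); // 0   (driven below a, clamped)
```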
Mask takes the place of masks in Phaser 3. It can take a texture, or a game object, which it draws to a DynamicTexture. Note that a Container with other objects, even objects with their own filters and masks, is a valid mask source.
Parallel Filters passes the input image through two different filter lists, and combines them at the end. It is useful when you want some memory in a complex stack of filters.
Sampler extracts data from the WebGL texture and sends it back to the CPU for use in a callback. It is similar to the snapshot functions available on DynamicTexture.
Threshold applies a soft or hard threshold to the colors in the image.
Lighting
In Phaser 3, lighting was added to objects by adding a new pipeline.
In Phaser 4, simply call `gameObject.setLighting(true)`. You don't need to worry about how lighting is applied, just that we're taking care of it.
Lighting is available on many game objects, including BitmapText, Blitter, Graphics and Shape, Image and Sprite, Particles, SpriteGPULayer, Stamp, Text, TileSprite, Video, and TilemapLayer and TilemapGPULayer.
Objects can now cast "self-shadows", using a more realistic shader to simulate the shadows cast by features on their own surface. This uses the brightness of the texture to guess how concave or convex the surface is at a given point. Self shadows can be enabled as a game-wide setting, or per-object.
In Phaser 3, lights had an implicit height, based on the game resolution. In Phaser 4, lights have a Z value to set this explicitly.
Note that lighting changes the shader, which breaks batches (see below).
DynamicTexture and RenderTexture API
`DynamicTexture` allows you to draw to a texture at runtime, then use it on other objects.
In Phaser 3, DynamicTexture allowed you to define batches and perform other intricate drawing operations. While efficient, this was too technical for most uses. In addition, it used its own drawing logic, which made for compatibility issues.
In Phaser 4, we removed many of these complex methods. Instead, we used the basic rendering system, which supports batching automatically. As a concession to the change, you must now call `dynamicTexture.render()` to execute all buffered drawing commands.
The new `capture()` method is similar to `draw()`, but supports more configuration, and captures the current camera view of a game object.
The new `preserve()` method interacts with the drawing command buffer. DynamicTexture now stores its commands in a buffer, waiting for execution via `render()`. Normally, it clears the buffer after it renders. You can choose instead to preserve a series of drawing commands, allowing you to render them many times. This is useful if you're drawing game objects which change over time.
The new `callback()` method inserts a callback to run as the drawing command buffer executes.
The `repeat()` method uses a `TileSprite` behind the scenes, so its capabilities are extended to match.
RenderTexture
The `RenderTexture` game object wraps a DynamicTexture. It has all the same drawing methods, mapped to its DynamicTexture.
RenderTexture has a new property: `renderMode`. When set to "render", this draws the RenderTexture like an ordinary Image. When set to "redraw", the RenderTexture instead runs `render()`, updating its texture, but does not draw itself. When set to "all", it does both.
The "redraw" renderMode allows RenderTexture to update a texture during the render loop. This was not possible before, and it allows you to draw things that have only just updated, such as same-frame shader outputs or other RenderTextures.
Graphics and Shape API
The `Graphics` game object is largely unchanged, as is `Shape`, which uses the same render systems, but there are a couple of improvements.
Graphics has a new `pathDetailThreshold` property. This can also be set as a global game config option. This option skips vertices within a certain distance of one another, greatly improving performance on complex curves displayed in small areas.
Shape has updates to `Rectangle` and `Grid`. Rectangle now supports rounded corners (this was introduced in Phaser v3.89). Grid has changed some property names to follow the conventions of other Shapes: it has a stroke instead of an outline. Grid also has controls for how to render the gutters between grid cells, and whether to draw outlines on the outside of the grid or just between cells.
Shader API
The `Shader` game object allows you to use custom shader code instead of a texture for a quad. Shader objects will need to be rewritten for Phaser 4.
The game object construction signature has changed. It now takes a config object (`ShaderQuadConfig`) which allows you to configure the way the shader executes. Consult the examples or the Shader Guide at phaser.io for more details on shader setup.
Shaders formerly set a number of shader uniforms in line with websites like Shadertoy. These uniforms are no longer set automatically. You can encode them into your configuration if you need to use them.
Note that the texture coordinates of your shader will now use GL conventions, where Y=0 is at the bottom of the image.
GLSL Changes
The way Phaser loads GLSL code has changed.
GLSL code is now loaded without regard as to how it will be used. It is not classified as fragment or vertex code, because under the new system it could be either, or both. You load fragment and vertex shaders separately, and combine them when creating a Shader.
We have removed custom templates from the shader code. We now use `#pragma` preprocessor directives, which are valid GLSL. This means our shader code works with automated syntax checkers. (The pragmas are removed before compilation. They serve merely as identifiers for our custom templates.)
The Shader Guide explains more about how shaders are composed, including Shader Additions, which add configurable functionality to shaders. This is an advanced topic, and a pre-written shader doesn't need any Addition code.
TileSprite API
The `TileSprite` game object displays a repeating texture across a quad. In Phaser 4 this has become more powerful, with frame support and tile rotation.
In Phaser 3, TileSprite used WebGL texture wrapping parameters to repeat the texture. These parameters control what happens when you sample outside the texture: do you get the edge of the texture, or a reflection of the texture, or a repeat of the texture?
However, this approach was limited. It would only repeat the entire texture file. In addition, there were problems with compressed textures, textures with a size that was not a power of two, and DynamicTextures.
So in Phaser 4 we switched to a different shader, manually controlling the texture coordinate wrapping instead of using texture wrapping parameters. This now supports any texture, and can use frames within that texture, thus enabling texture atlas/spritesheet support. It works like a regular game object.
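The manual wrapping idea can be sketched as plain math: wrap the coordinate in tile space with a `fract()`-style operation, then remap it into the frame's sub-rectangle of the atlas. The frame values below are made up for illustration; this is not the actual shader:

```javascript
// Manually wrap a tiling coordinate and remap it into an atlas frame,
// instead of relying on GL texture-wrap parameters (which can only repeat
// the entire texture file). Frame values here are hypothetical.
function wrapIntoFrame(t, frameStart, frameSize) {
  const wrapped = t - Math.floor(t);       // fract(): repeat in 0-1
  return frameStart + wrapped * frameSize; // remap into the frame's UV range
}

// A frame occupying u = 0.5 to 0.75 of its atlas:
console.log(wrapIntoFrame(0.25, 0.5, 0.25));  // 0.5625
console.log(wrapIntoFrame(1.25, 0.5, 0.25));  // 0.5625 (repeats every 1.0)
console.log(wrapIntoFrame(-0.75, 0.5, 0.25)); // 0.5625 (negatives wrap too)
```

Because the wrapping happens in the shader's arithmetic rather than in sampler state, it works with atlas frames, non-power-of-two textures, compressed textures, and DynamicTextures alike.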
The `tileRotation` property allows you to rotate the texture, so now it can be transformed any way you like.
Next issue we’ll cover the new rendering system in Phaser 4 and how it works in detail.
Share your content with 18,000+ readers
Have you created a game, tutorial, code snippet, video, or anything you feel our readers would like?
Please send it to us!

Until the next issue, happy coding!